
Shyam (RID : 14f6gl5c9mdhl)

Designation : GCP Cloud Data Engineer

Location : Hyderabad, India

Experience : 12 Years

Rate : $30 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
GCP Cloud, Microsoft Azure, MySQL, Oracle, C, Core Java, SQL, Python, Unix Shell Scripting, Hadoop, Control-M, Teradata
Description

• Having 8+ years of IT experience on cloud platforms (GCP & Azure); in Python, the Spark ecosystem, and Scala for big data analytics and streaming; and in data warehousing tools, including IBM InfoSphere DataStage and QualityStage and Ascential DataStage Parallel Extender versions.

• Excellent experience in creating Data Fusion pipelines and writing complex queries in BigQuery on the GCP platform (a minimal query sketch follows this list).

• Excellent experience with Cloud Composer, creating DAGs and scheduling them in Airflow.

• Excellent experience in Airflow, scheduling and monitoring pipelines and creating the necessary variables and connections as needed.

• Excellent experience with Cloud Storage, creating permanent and temporary storage and handling the storage mechanisms.

• Good experience with Looker for generating reports.

• Excellent experience programming in core languages such as Python, Spark (for big data analytics and streaming), Scala, and Sqoop, and writing SQL to meet requirements.

• Excellent experience in designing, developing, documenting, and testing ETL jobs and mappings in Server and Parallel jobs using DataStage to populate tables in data warehouses and data marts.

• Proficient in developing strategies for Extraction, Transformation, and Loading (ETL) mechanisms.

• Expert in designing Parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.

• Experience in analyzing data generated by business processes, defining granularity, mapping data elements from source to target, and creating indexes and aggregate tables for data warehouse design and development.
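
As an illustration of the BigQuery work mentioned above, here is a minimal sketch of running a query from Python with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders, not details of any client engagement.

# Minimal sketch: running a BigQuery query from Python.
# Assumes the google-cloud-bigquery package and default GCP credentials;
# project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project ID

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`  -- placeholder table
    GROUP BY order_date
    ORDER BY order_date
"""

# query() submits the job; result() blocks until the rows are ready.
for row in client.query(query).result():
    print(row["order_date"], row["total_amount"])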

DOMAIN EXPERIENCE

1)    E-commerce

2)    Insurance

3)    Banking

4)    Telecom

EDUCATION

• Master of Science from Kakatiya University

CERTIFICATIONS

• Google Cloud Certified Professional Data Engineer

SKILLS

Cloud Technologies : GCP and Microsoft Azure

Reporting Tools : Looker

Scheduling Tools : Airflow, Control-M, AutoSys

Big Data Platforms : Spark, Hadoop, HDFS, Hive, Python, Scala, Sqoop, and Hortonworks HDP

ETL Tools : DataStage V11.3.x, IIS V8.7, IIS V8.5.x & 7.5.x

Databases : MySQL, DB2, Oracle, Teradata, Netezza, and HP Vertica

High-Level Languages : C, Core Java & SQL

Operating Systems : OpenSUSE Linux, AIX 5.3, AIX 6.1 & Red Hat Linux

Scripting Languages : SQL, Python, Unix Shell Scripting, HiveQL, Scala


EXPERIENCE

Current Title: Working as a Senior Technical Lead at Delta Cubes Technologies from May 2015 till date.

Previous Title: Worked as a Senior Software Engineer at Outworks Solutions Pvt Ltd from May 2013 to Apr 2015.

PROJECTS

Project : Backcountry Data Engineering and Reporting Services

Client : Backcountry

Role : Senior Technical Lead

Duration : March 2021 till date

Technologies and Tools : GCP, Cloud Composer, Virtual Machine, BigQuery, MySQL, Python, Spark, Airflow, Git, Looker

Responsibilities

• Participated in regular meetings to understand the business requirements and to provide business solutions.

• Prepared business solution documents and shared them with the client for approval.

• Created DAGs using Cloud Composer and scheduled them in Airflow, monitoring the jobs for successful runs (a minimal DAG sketch follows these responsibilities).
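
A minimal sketch of how such a Cloud Composer DAG might look, assuming Airflow 2.x with the Google provider package installed; the DAG ID, schedule, and table names are hypothetical placeholders.

# Minimal Cloud Composer (Airflow) DAG sketch: schedule a daily BigQuery job.
# Assumes apache-airflow-providers-google is installed; the DAG ID,
# schedule, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_reporting_rollup",  # placeholder DAG ID
    start_date=datetime(2021, 3, 1),
    schedule_interval="0 6 * * *",    # run daily at 06:00 UTC
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM `example-project.sales.orders` "  # placeholder table
                    "GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )

In Cloud Composer, a file like this is uploaded to the environment's dags/ folder in Cloud Storage, from which the Airflow scheduler picks it up and runs it on the given schedule.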

Project : EDW Migration and Data Modernization

Client : Rodan and Fields

Role : Senior Technical Lead

Duration : March 2020 to March 2021

Technologies and Tools : GCP, Data Fusion, BigQuery, MySQL, Python, Spark, Airflow, Git, Jenkins, Terraform

Description: Rodan & Fields, LLC, known as Rodan + Fields or R+F, is an American manufacturer and multi-level marketing company specializing in skincare products. The main intention of this project is to migrate the existing system to the GCP cloud platform in order to generate reports quickly.

Responsibilities

• Involved in understanding the business requirements, analysis, and discussions of functional and technical feasibility with the client to get approval.

• Interpreted mapping documentation and translated it into detailed design specifications.

Project : Omni-Xref (Leading Cloud Platform solution for our retail industry client)

Client : Kohl’s

Role : Technical Lead

Duration : July 2019 to Feb 2020

Technologies : GCP, BigQuery, SQL, Python, Spark, and Scala



 

 