OnBenchMark

Venu Gopal

Azure Data Engineer
Location: Hyderabad
Member since: 30+ days ago
Candidate Information
  • Experience: 6 Years
  • Hourly Rate: $8
  • Availability: Immediate
  • Work From: Offsite
  • Category: Engineering & Design
  • Last Active On: June 14, 2024
Key Skills
Azure Storage, Microsoft Azure
Summary

PROJECTS PROFILE

 

Project#1      : ESG – Environment, Social and Governance


Environment    : Microsoft Azure Storage, Azure Data Factory, ADLS Gen2, Azure SQL DB


Role           : Azure Data Engineer


Duration       : June 2022 – till date


Description    : Data Hub is a single Azure-based environment that hosts the Data and MI solutions and delivers the core foundational aspects that allow other business-led projects to be delivered successfully.

 

Roles & Responsibilities:

 

·        Worked on an HP project to provide metric details in an automated way for developing the Sustainability Impact report.

·        Worked as an ETL developer building code and pipelines in Azure Data Factory that moved data to and from on-premises systems.

·        Responsible for incremental loading of the data.

·        Worked as a key developer in Azure SQL Database, creating and maintaining stored procedures.

·        Involved in debugging and fixing bugs.

·        Started working on Azure Synapse Analytics.
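The incremental loading mentioned above is commonly implemented with a high-water-mark pattern: each run picks up only the rows modified since the previous run's watermark, then persists a new watermark for the next run. A minimal plain-Python sketch of that pattern (the row shape and the `modified` column are illustrative assumptions, not details from the project):

```python
from datetime import datetime

# Hypothetical source rows, each stamped with a last-modified time.
SOURCE_ROWS = [
    {"id": 1, "value": "a", "modified": datetime(2022, 6, 1)},
    {"id": 2, "value": "b", "modified": datetime(2022, 6, 10)},
    {"id": 3, "value": "c", "modified": datetime(2022, 6, 20)},
]

def incremental_load(rows, last_watermark):
    """Return only rows changed since the previous run, plus the new watermark.

    Mirrors the ADF pattern of filtering on a modified-date column and
    storing the high-water mark for the next pipeline run.
    """
    delta = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in delta), default=last_watermark)
    return delta, new_watermark

delta, wm = incremental_load(SOURCE_ROWS, datetime(2022, 6, 5))
print(len(delta), wm)  # rows 2 and 3 are newer than the watermark
```

In Azure Data Factory the same idea is usually expressed as a lookup of the stored watermark, a copy activity filtered on the modified-date column, and an update of the watermark at the end of the run.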

 

Project#2      : QBE Insurance DataHub


Client         : QBE Insurance Group Pvt Ltd


Environment    : Microsoft Azure Storage, Azure Data Factory, ADLS Gen2, Azure Databricks, Spark and Scala, Azure SQL DB


Role           : Azure Data Engineer


Duration       : May 2021 – May 2022

 

Roles & Responsibilities:

 

• Gathered information and analyzed the sources.

• Created new ADF pipelines to ingest data into the DataHub from various sources.

• Responsible for creating and configuring triggers.

• Created notebooks using PySpark in Azure Databricks and wrote logic per the business requirements.

• Loaded the data into Synapse according to the target data model.

• Reconciled the data at each layer of the DataHub.

• Used ADF to orchestrate the data across layers.

• Followed agile methodology in delivery and used Azure Boards for project management and bug tracking.

• Performed unit and integration testing before code merges.
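Reconciling data across DataHub layers typically comes down to comparing row counts (or checksums) between consecutive layers and flagging any drop or growth. A minimal sketch in plain Python; the layer names are assumed for illustration, since the actual DataHub layer names are not stated above:

```python
def reconcile(layer_counts):
    """Compare row counts between consecutive layers; report mismatches.

    `layer_counts` maps layer name -> row count, in pipeline order
    (dicts preserve insertion order in Python 3.7+).
    """
    issues = []
    layers = list(layer_counts.items())
    for (src_name, src_n), (dst_name, dst_n) in zip(layers, layers[1:]):
        if src_n != dst_n:
            issues.append(f"{src_name} -> {dst_name}: {src_n} vs {dst_n}")
    return issues

# Hypothetical counts pulled from each layer after a load.
counts = {"raw": 1000, "curated": 1000, "serving": 998}
print(reconcile(counts))
```

In practice the counts would come from queries against each layer (e.g. `COUNT(*)` per table per load date), with the comparison run as a post-load validation step.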

 

Project#3


Title          : MAS (Monetary Authority of Singapore)


Client         : Prudential (Singapore)


Environment    : Microsoft Azure Storage, Azure Data Factory, Azure Databricks, Spark and Scala, Azure SQL DB


Role           : Data Engineer


Duration       : April 2021 – May 2021

 

Roles & Responsibilities:


·        Developed HQL queries per business requirements in Databricks notebooks.

·        Implemented data transformation and processing logic using the Azure Databricks service with Spark and Scala.

·        Created mount points for Azure services with the help of Databricks notebooks.

·        Used Azure Databricks notebooks to write the Spark code for the master notebook.

·        Loaded the transformed datasets into Azure SQL DB using Azure Data Factory pipelines.

·        Invoked the email pipeline to get the pipeline status, and wrote stored procedures on SQL DB to handle upserts.

·        Developed data pipelines in Azure Data Factory to run on a daily basis.
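The upsert-handling stored procedures mentioned above follow MERGE semantics: rows whose keys already exist in the target are updated, and the rest are inserted. A plain-Python stand-in for that logic (the key and column names are hypothetical, chosen only to illustrate the pattern):

```python
def upsert(target, incoming, key="id"):
    """Apply MERGE-style upsert semantics to a list of row dicts.

    Matching keys are updated with the incoming values; unmatched
    incoming rows are inserted. A stand-in for a T-SQL MERGE-based
    stored procedure in Azure SQL DB.
    """
    index = {row[key]: row for row in target}
    for row in incoming:
        # Merge incoming columns over any existing row with the same key.
        index[row[key]] = {**index.get(row[key], {}), **row}
    return list(index.values())

target = [{"id": 1, "status": "old"}, {"id": 2, "status": "old"}]
incoming = [{"id": 2, "status": "new"}, {"id": 3, "status": "new"}]
print(upsert(target, incoming))
```

The SQL equivalent would match source and target on the key column inside a `MERGE` statement, with `WHEN MATCHED THEN UPDATE` and `WHEN NOT MATCHED THEN INSERT` branches.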


Copyright © Cosette Network Private Limited. All Rights Reserved.