
Madhusudhan Reddy

Azure Data Engineer
Location: Hyderabad
Member since: 30+ days ago
Candidate Information
  • Experience: 5 Years
  • Hourly Rate: $30
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: June 01, 2024
Key Skills
Azure, SQL, Python, Microsoft Azure, PySpark, Databricks, Synapse, Azure Storage, ETL, Data Extraction, Data Warehousing

1.    Project Summary: XM

●      Technology used: Azure Logic Apps, Data Factory, Data Flows, ADLS Gen2.

●      Role: Developer

●      Description:

●      The main objective of this project is to move IRED data from the Azure Maxerience subscription to the EDL subscription, and from there to AWS with transformations applied. A Lambda function then loads the data into Amazon Redshift.
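The final load step above can be sketched as a small Lambda handler that issues a Redshift COPY. This is an illustrative sketch only: the table name, S3 landing path, and IAM role ARN are placeholders, not the project's actual resources, and the real function would execute the SQL (for example via the Redshift Data API) rather than return it.

```python
def build_copy_statement(table: str, s3_uri: str, iam_role: str) -> str:
    """Build a Redshift COPY statement for Parquet files landed in S3.

    Table, S3 path, and IAM role are placeholders -- the real names
    belong to the project's AWS account.
    """
    return (
        f"COPY {table} "
        f"FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS PARQUET;"
    )


def lambda_handler(event, context):
    # A typical trigger payload would carry the S3 object key of the
    # transformed IRED file dropped by the Azure-to-AWS transfer step.
    sql = build_copy_statement(
        table="analytics.ired_data",     # hypothetical target table
        s3_uri="s3://edl-landing/ired/", # hypothetical landing path
        iam_role="arn:aws:iam::123456789012:role/redshift-copy",  # placeholder
    )
    # In the real function this SQL would be executed against Redshift;
    # here we just return it for inspection.
    return {"sql": sql}
```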

●      Project Summary: SCHUB

●      Technology used: Synapse Analytics, PySpark, SQL Server, ADF, ADLS Gen2, SQL.

●      Role: Developer

●      Description:

●      The main objective of this project is to load data from Blob Storage into SQL Server. Files are placed in Blob Storage; whenever a file arrives, it triggers an ADF pipeline that reads the file, applies transformations in PySpark, and loads the results into SQL Server.
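The transformation step in that pipeline can be sketched as a plain-Python stand-in for the PySpark code: parse an incoming CSV file, trim whitespace, and cast a numeric column before the rows are written to SQL Server. The column names here are purely illustrative, not the project's real schema.

```python
import csv
import io


def transform_file(raw: str) -> list:
    """Stand-in for the PySpark transformation: parse a CSV file pulled
    from blob storage, trim whitespace, and cast the amount column.

    Column names ("id", "name", "amount") are illustrative assumptions,
    not the project's actual schema.
    """
    rows = []
    for row in csv.DictReader(io.StringIO(raw)):
        rows.append({
            "id": row["id"].strip(),
            "name": row["name"].strip(),
            "amount": float(row["amount"]),
        })
    return rows
```

In the real pipeline the same logic would be a PySpark DataFrame transformation running on the cluster, with the result written to SQL Server via a JDBC sink.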


2.    Project Summary: Azure Automation As a Service (A3S)

Environment: Azure Synapse Analytics, ADLS, PySpark, Azure Purview, Delta Lake, Azure REST APIs.

Role: Developer


A3S is a software product in which we implemented data migration from different sources into a Synapse SQL pool, from which the client can analyse the data. Migration is performed by PySpark code saved in ADLS; this code is submitted to Spark using Azure REST APIs. The client can then migrate data from a source of their choice to the destination directly, with a single click in the UI.
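The "submit the code to Spark via Azure REST APIs" step can be sketched as building a Synapse Livy batch-job request. This is a sketch under assumptions: the workspace, pool, and script names are placeholders, the API version shown may differ from the one the project used, and the real call would also attach an Azure AD bearer token.

```python
import json
import urllib.request


def build_spark_batch_request(workspace: str, pool: str, file_uri: str):
    """Build (but do not send) a Synapse Livy batch-job submission.

    Workspace, pool, and job names are placeholder assumptions; the
    endpoint shape follows the Synapse Livy batch API.
    """
    url = (
        f"https://{workspace}.dev.azuresynapse.net/livyApi/versions/"
        f"2019-11-01-preview/sparkPools/{pool}/batches"
    )
    payload = {
        "name": "a3s-migration-job",  # hypothetical job name
        "file": file_uri,             # PySpark script saved in ADLS
        "driverCores": 4,
        "driverMemory": "8g",
        "executorCores": 4,
        "executorMemory": "8g",
        "numExecutors": 2,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        # A real request would also carry "Authorization: Bearer <token>".
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```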


3.    Project Summary: Zoom / CVENT Integration

Environment: Logic Apps, SQL Server, Blob Storage, Key Vaults, Function Apps, Zoom APIs, CVENT APIs.

Role: Developer


This is an integration project. Previously, daily sheets were received manually from both the Zoom and CVENT teams; the goal was to automate this process without human intervention, using the APIs exposed by Zoom and CVENT, orchestrated through Logic Apps.
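APIs like Zoom's return results page by page with a `next_page_token`, so the automation has to loop until the token is empty. A minimal sketch of that pagination loop, with the HTTP call injected as a callable so the logic is self-contained (in the Logic Apps version the same loop is an Until action wrapped around an HTTP step):

```python
def fetch_all_pages(fetch_page, key: str) -> list:
    """Collect all records from a token-paginated API (Zoom-style).

    `fetch_page(token)` returns one page as a dict containing the
    record list under `key` and, until the last page, a
    "next_page_token" string. The callable is an assumption made so
    the sketch runs without network access.
    """
    records, token = [], ""
    while True:
        page = fetch_page(token)
        records.extend(page.get(key, []))
        token = page.get("next_page_token", "")
        if not token:
            return records
```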


Leading Torch India Pvt Ltd, Application Developer

Hyderabad, Telangana

11/2019 – 05/2021


●      Developed PySpark code to transform the given files and generate reports.

●      Built Azure Logic Apps and Function Apps workflows.

●      Wrote unit tests for both PySpark notebooks and Logic Apps.

●      Communicated with different teams to exchange inputs.

●      Created base resources in the Azure portal (resource groups, key vaults, Databricks clusters, blob storage, etc.).


4.    Project Summary: Margins


Environment: Logic Apps, PySpark, Mainframes, shell script, Spring Boot, SQL Server, HTML5, JS, CSS.

Tools: Azure Cloud, Databricks, Spring Tool Suite.

Role: Developer



Margins is an Incentive Claim Administration tool: an existing application responsible for paying incentives to dealers after their RDRs are verified. The Margins system is now being re-imagined with newer, more advanced technologies to make the application more manageable.


Margins is a data migration project from mainframe COBOL to PySpark, through an ETL process. The database team loads data from Oracle to SQL Server; PySpark code then retrieves the data, formats the report data as required by the client, and stores the reports in Blob Storage. Scheduled Logic Apps run the reports and insert the report data into SQL Server; from the user interface, the respective dealer takes the report data and sends files to SARS (a mainframe application).
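The report step above, aggregating verified incentives per dealer, can be sketched in plain Python as a stand-in for the PySpark code; field names ("dealer_id", "amount", "rdr_verified") are illustrative assumptions, not the project's real schema.

```python
from collections import defaultdict


def dealer_report(rows: list) -> dict:
    """Stand-in for the PySpark report step: total incentive amounts
    per dealer, counting only rows whose RDR has been verified.

    Field names are illustrative, not the actual Margins schema.
    """
    totals = defaultdict(float)
    for row in rows:
        if row.get("rdr_verified"):
            totals[row["dealer_id"]] += row["amount"]
    return dict(totals)
```

In PySpark the same aggregation would be a filter on the verification flag followed by a groupBy on the dealer key with a sum over the amount column.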
