1. Project Summary: XM
● Technology used: Azure Logic Apps, Data Factory, Data Flows, ADLS Gen2.
● Role: Developer
● The main objective of this project is to move IRED data from the Azure Maxerience subscription to the EDL subscription, and from there to AWS with transformations applied along the way. An AWS Lambda function then loads the data into Amazon Redshift.
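The final loading step described above can be sketched as a small Lambda handler. This is a hedged illustration, not the project's actual code: the table name, IAM role ARN, and event shape are assumptions, and the database cursor is injected so the COPY-building logic stays testable without a live Redshift cluster.

```python
# Hypothetical sketch of the Lambda step that loads transformed S3 files
# into Amazon Redshift. Bucket, table, and IAM role names are illustrative;
# the cursor is injected so the loading logic can be tested with a stub.

def build_copy_statement(table: str, s3_path: str, iam_role: str) -> str:
    """Build a Redshift COPY command for a CSV file staged in S3."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

def lambda_handler(event, context, cursor=None):
    """Read the S3 object key from the trigger event and COPY it into Redshift."""
    record = event["Records"][0]["s3"]
    s3_path = f"s3://{record['bucket']['name']}/{record['object']['key']}"
    sql = build_copy_statement(
        table="ired.staging_data",  # assumed target table
        iam_role="arn:aws:iam::123456789012:role/redshift-copy",  # placeholder
        s3_path=s3_path,
    )
    if cursor is not None:  # in production, e.g. a redshift_connector cursor
        cursor.execute(sql)
    return sql
```

In practice the cursor would come from a Redshift connection opened inside the handler; keeping SQL construction separate from execution is what makes the unit testable.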
● Project Summary: SCHUB
● Technology used: Synapse Analytics, PySpark, SQL Server, ADF, ADLS Gen2, SQL.
● Role: Developer
● The main objective of this project is to load data from Blob Storage into SQL Server. Files are placed in Blob Storage; whenever a file arrives, a trigger starts an ADF pipeline that reads the file, performs the transformations in PySpark, and loads the results into SQL Server.
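The transformation step above can be illustrated with a small sketch. The columns and cleanup rules here are hypothetical, and the logic is shown in plain Python for brevity; in the real pipeline it would run as PySpark DataFrame transformations inside the ADF-triggered job.

```python
# Illustrative row-level cleanup of the kind a PySpark job might apply
# before loading into SQL Server. Column names and rules are assumptions.

from datetime import datetime
from typing import Optional

def transform_row(row: dict) -> Optional[dict]:
    """Normalise one record from the incoming file; drop rows missing an id."""
    if not row.get("id"):
        return None  # incomplete records are filtered out
    return {
        "id": int(row["id"]),
        "name": row.get("name", "").strip().title(),
        # reformat dates into the ISO form SQL Server accepts directly
        "loaded_on": datetime.strptime(row["date"], "%d/%m/%Y").date().isoformat(),
    }

def transform_file(rows: list) -> list:
    """Apply the row transform to a whole file, skipping rejected rows."""
    return [t for r in rows if (t := transform_row(r)) is not None]
```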
2. Project Summary:
Azure Automation As a Service (A3S)
Environment: Azure Synapse Analytics, ADLS, PySpark, Azure Purview, Delta Lake, Azure REST APIs.
A3S is a software product that implements data migration from different sources to a Synapse SQL pool, where the client can then analyse the data. The migration runs as PySpark code: the code is saved in ADLS and then submitted to Spark through Azure REST APIs. From the UI, the client simply picks a source and clicks a button to migrate data to the destination.
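The submission step described above can be sketched as building a Livy-style batch request against a Synapse Spark pool. This is a hedged sketch, not A3S code: the API version, workspace and pool names, and file path are illustrative assumptions, and the bearer token would come from Azure AD in the real service.

```python
# Minimal sketch of submitting PySpark code saved in ADLS to a Synapse Spark
# pool via its Livy-style batch REST API. Endpoint details are assumptions.

import json
import urllib.request

def build_batch_payload(job_name: str, abfss_path: str) -> dict:
    """Describe one Spark batch job: the PySpark file in ADLS plus sizing."""
    return {
        "name": job_name,
        "file": abfss_path,  # the .py file the service saved to ADLS
        "driverCores": 4,
        "driverMemory": "8g",
        "executorCores": 4,
        "executorMemory": "8g",
        "numExecutors": 2,
    }

def build_submit_request(workspace: str, pool: str, token: str, payload: dict):
    """Construct (but do not send) the HTTP request for the batch submission."""
    url = (f"https://{workspace}.dev.azuresynapse.net"
           f"/livyApi/versions/2019-11-01-preview/sparkPools/{pool}/batches")
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```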
3. Project Summary:
Zoom and Cvent Integration
Environment: Logic Apps, SQL Server, Blob Storage, Key Vault, Function Apps, Zoom APIs, Cvent APIs.
This is an integration project. Previously we received daily sheets manually from both the Zoom and Cvent teams; the goal was to remove that human intervention by automating the process with Logic Apps built around the APIs exposed by Zoom and Cvent.
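The core of such an automation is paging through the vendor API instead of waiting for a manual sheet. The sketch below assumes Zoom-style `next_page_token` pagination; the fetch function is injected so the loop can be exercised without the real Zoom or Cvent endpoints.

```python
# Simplified sketch of the polling step a Logic Apps / Function Apps workflow
# automates: collecting all pages of a day's records from a paginated API.
# The pagination shape (next_page_token) is an assumption modelled on Zoom.

def collect_all_pages(fetch_page, item_key: str) -> list:
    """Follow next_page_token links until the API reports no more pages."""
    items, token = [], ""
    while True:
        page = fetch_page(token)          # fetch one page, given a page token
        items.extend(page.get(item_key, []))
        token = page.get("next_page_token", "")
        if not token:                     # empty token means the last page
            return items
```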
Leading Torch India Pvt Ltd – Application Developer
11/2019 – 05/2021
● Developed PySpark code to transform the given files and data and generate reports.
● Built Azure Logic Apps and Function Apps workflows.
● Unit tested both PySpark notebooks and Logic Apps.
● Communicated with different teams to exchange inputs.
● Created base resources in the Azure portal (resource groups, key vaults, Databricks clusters, Blob Storage, etc.).
4. Project Summary: Margins
Environment: Logic Apps, PySpark, Mainframes, shell scripts, Spring Boot, SQL Server, HTML5, JS, CSS.
Tools: Azure Cloud, Databricks, Spring Tool Suite.
Margins is an Incentive Claim Administration tool: an existing application responsible for paying incentives to dealers after RDRs are verified. The Margins system is being re-imagined with newer, more advanced technologies to make the application more manageable.
Margins is a data migration project from mainframe COBOL to PySpark, built as an ETL process. The database team loads data from Oracle into SQL Server; we then write PySpark code to retrieve the data, format the report data as the client requires, and store the reports in Blob Storage. Logic Apps are scheduled to run the reports and insert the report data into SQL Server; from the user interface, the respective dealer takes the report data and sends the files to SARS (a mainframe application).
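The report-formatting step above can be sketched as follows. This is an illustration only: in the real pipeline the logic is PySpark reading dealer incentive rows and writing to Blob Storage, and the column names and CSV layout here are hypothetical.

```python
# Hypothetical sketch of the Margins report-formatting step: shape verified
# incentive rows into the CSV report layout stored in Blob Storage.
# Column names and formatting rules are illustrative assumptions.

import csv
import io

def format_dealer_report(rows: list) -> str:
    """Render verified incentive rows as a CSV report string."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["dealer_id", "claim_id", "amount"])
    writer.writeheader()
    for row in rows:
        writer.writerow({
            "dealer_id": row["dealer_id"],
            "claim_id": row["claim_id"],
            "amount": f"{row['amount']:.2f}",  # fixed two-decimal currency format
        })
    return buf.getvalue()
```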