Azure Cloud Engineer
Location: Jaipur, India
Experience: 8 Years
Rate: $26/hour
Availability: Immediate
Work From: Offsite
Category: Information Technology & Services
Work History
Sr Azure Data Engineer
Description: Created multiple ETL and Spark transformation pipelines for
billing, invoice, and finance data using Azure ADF, Databricks, PySpark, and
Spark SQL, writing the results to Delta Lake and Cosmos DB.
⦁ Worked on various telecom billing use cases: imported data from different sources using ADF, wrote it out in different file formats, and transformed it with Spark.
Technology used: Spark, PySpark, Spark SQL, Databricks, Event Hub, TSI, Cosmos DB.
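For illustration, a minimal stdlib sketch of the kind of per-account billing aggregation such a pipeline performs (the production jobs used PySpark on Databricks; the column names and values here are illustrative, not from the actual project):

```python
import csv
import io
from collections import defaultdict

# Illustrative billing rows of the kind ADF might land before transformation.
raw = io.StringIO(
    "account,invoice_id,amount\n"
    "A1,INV-1,100.50\n"
    "A1,INV-2,49.50\n"
    "B2,INV-3,200.00\n"
)

def aggregate_billing(fh):
    """Sum invoice amounts per account (the Spark job does this with groupBy/agg)."""
    totals = defaultdict(float)
    for row in csv.DictReader(fh):
        totals[row["account"]] += float(row["amount"])
    return dict(totals)

print(aggregate_billing(raw))  # {'A1': 150.0, 'B2': 200.0}
```

In the real pipeline the same shape of transformation runs distributed over Spark DataFrames, with the aggregated result written to Delta Lake or Cosmos DB instead of printed.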
Data Engineer
Description: • Created a streaming pipeline for process-health data with PySpark, predicting process health with multiple algorithms using Event Hub, TSI, and Cosmos DB.
⦁ The pipeline picks up data from Event Hub and, after transformation, sends the tags (data) to an ML endpoint.
⦁ The results from the ML endpoint are stored back to the destination Event Hub, TSI, and Cosmos DB.
2006-01
⦁ Handled other CI/CD pipelines for the Mogalakwena site.
Algorithms used: Linear Regression, Decision Tree, Random Forest, Naive Bayes.
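A minimal sketch of the streaming flow described above (event, transform, ML scoring, fan-out to sinks). Here `score_health` stands in for the Azure ML endpoint call, and plain lists stand in for the destination Event Hub, TSI, and Cosmos DB; the tag names and healthy band are illustrative assumptions:

```python
def transform(event):
    """Extract the tag readings the model expects from a raw event."""
    return {"tag": event["tag"], "value": float(event["value"])}

def score_health(features):
    """Stand-in for the ML endpoint: flag values outside an assumed healthy band."""
    return "healthy" if 0.0 <= features["value"] <= 100.0 else "unhealthy"

def run_pipeline(events, sinks):
    """Consume events, score each one, and fan the result out to every sink."""
    for event in events:
        features = transform(event)
        result = {**features, "health": score_health(features)}
        for sink in sinks:
            sink.append(result)

eventhub_out, tsi, cosmos = [], [], []
run_pipeline(
    [{"tag": "pump_1", "value": "42"}, {"tag": "pump_2", "value": "250"}],
    [eventhub_out, tsi, cosmos],
)
print(cosmos[1]["health"])  # unhealthy
```

The production version consumed Event Hub via Spark Structured Streaming rather than a Python loop, but the transform-score-sink shape is the same.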
Project Engineer
Description: Indian State Government. • Text classification and sentiment analysis on the free text of citizen grievances, to identify the exact department responsible for each grievance and the user's sentiment from feedback. Classification used n-gram, TF-IDF, and bag-of-words techniques with Spark Python (PySpark) and NLP. It helped identify the exact department for each grievance logged by a citizen and resolve problems 90% faster. Technology used: Hadoop, Sqoop, Hive, Python, Tableau.
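To show the bag-of-words/TF-IDF weighting this classification relies on, a minimal stdlib sketch (the grievance snippets are invented; the real system ran on PySpark with a trained classifier on top of these features):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Bag-of-words term frequencies weighted by inverse document frequency."""
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(term for doc in docs for term in set(doc.lower().split()))
    vectors = []
    for doc in docs:
        counts = Counter(doc.lower().split())
        total = sum(counts.values())
        vectors.append(
            {t: (c / total) * math.log(n / df[t]) for t, c in counts.items()}
        )
    return vectors

# Illustrative grievance snippets: department-specific words ("supply",
# "pothole") score higher than common ones ("water") because they are rarer.
docs = ["water supply broken", "road pothole complaint", "water bill wrong"]
vecs = tf_idf(docs)
```

A classifier then routes each grievance to a department based on these weighted features; rare, department-specific terms dominate the decision.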
Associate Technology L1
Created a solution to improve IT service requests and agent performance with the help of Hive and Tableau dashboards.
⦁ Text analysis on Box and Vox (internal social networking sites) using n-gram, TF-IDF, and bag-of-words techniques with Python NLP.
⦁ Gathered employee data from different sources and created interactive, decision-support dashboards in Tableau to improve employee policies.
⦁ Loaded data from different data sources (Oracle and SQL Server) into Hive tables using Sqoop for incremental loads.
Technology used: Hive, Sqoop, Python, CDH3, Tableau 9.3, Linux.
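An illustrative shape of the Sqoop incremental load into Hive; the connection string, credentials, table, and column names are placeholders, not the project's actual values:

```shell
# Incremental import from Oracle into a Hive staging table,
# picking up only rows changed since the last recorded value.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table EMPLOYEE_DATA \
  --incremental lastmodified \
  --check-column LAST_MODIFIED \
  --last-value '2016-01-01 00:00:00' \
  --hive-import \
  --hive-table staging.employee_data
```

Sqoop records the new high-water mark after each run, so scheduling this job repeats the load with only the delta.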