
Himanshu (RID : 4kv2lps5elmg)

Designation: Azure Data Engineer

Location: Noida

Experience: 10 Years

Rate: $16 / Hourly

Availability: Immediate

Work From: Offsite

Category: Information Technology & Services

Shortlisted: 2
Total Views: 55
Key Skills
Azure, Python, ETL, Azure SQL, PySpark, AWS, Synapse
Description

Himanshu

Professional Summary:
● Around 10 years of extensive professional software development experience.
● 7 years of experience with Python and Pandas DataFrames.
● 5 years of experience in ETL projects.
● 4 years of experience with Azure services such as Azure Databricks, Azure Data Factory, Linked Services, Blob Storage, Delta Lake, Synapse, Azure SQL, Azure Functions, and web services.
● 4 years of experience in PySpark.
● 3 years of experience with AWS services such as S3, Athena, Redshift, AWS Lambda, SNS, and SQS.
● Worked on GitLab for creating CI/CD pipelines, code repositories, and YAML files.
● 7 years of experience in team handling (largest dev team handled: 18 members).
● 4 years of experience independently handling clients and their requirements, including requirement review and scope-of-work analysis.
● Extensive experience in MS SQL Server, MySQL, and Oracle.
● Good understanding of Python components: Pandas DataFrames, Pandas with lambda functions, pyodbc, NumPy, Matplotlib, Pysftp, and SQLAlchemy (see the short sketch after this list).
● Team player with good interpersonal skills.
● Excellent verbal and written communication skills.
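
As a brief illustration of the Pandas + pyodbc + SQLAlchemy stack listed above, a minimal sketch; the connection string, table, and column names are hypothetical:

```python
# Minimal sketch of the Pandas + SQLAlchemy workflow referenced above.
# The server, database, table, and column names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Build a SQLAlchemy engine over pyodbc for MS SQL Server.
engine = create_engine(
    "mssql+pyodbc://user:password@server/db?driver=ODBC+Driver+17+for+SQL+Server"
)

# Pull a table into a DataFrame, derive a column with a lambda, write back.
df = pd.read_sql("SELECT * FROM sales", engine)
df["net_amount"] = df.apply(lambda row: row["gross"] - row["tax"], axis=1)
df.to_sql("sales_clean", engine, if_exists="replace", index=False)
```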


Technical Skills:

Technology: Python, PySpark

RDBMS: MS SQL Server (SSMS), MySQL

Tools: Jupyter, PyCharm, VS Code, GitLab, Azure, and Synapse

Big Data Technologies: Azure Databricks, Azure Data Factory, PySpark, Synapse, Pandas DataFrame, NumPy, Matplotlib, Pysftp, and SQLAlchemy

Skills
Python, Pandas, Azure Databricks, Azure Data Factory, PySpark, Synapse, Linked Services, Git, AWS, RDS, Athena, Secrets Manager, Amazon WorkSpaces, S3 buckets, GitLab, CI/CD pipelines, CI/CD jobs, Jenkins, Docker, YAML, Citrix, Amazon Simple Notification Service (SNS), Amazon Simple Queue Service (SQS), data migration, Selenium, Chrome WebDriver, urllib2, shutil, functools, paramiko, logging, pyodbc, os, os.path, base64, create_engine, credentials, accounts, JIRA, SQLAlchemy, Matplotlib, Flask, NumPy, Pysftp, alert notifications (Grafana Labs).

Awards
* State-level topper in the Computer subject in the High School Board exams.
* Appreciation letter from the DDGIT of the Indian Army.
* Developed web-based applications for DGIS, VCOAS, and ASDC.
* First prize in the Carrom Board Competition at SDEC.

Hobbies
Bike Riding
Playing Carrom Board
Playing Video Games


Domain Knowledge: Insurance / Finance

Projects Undertaken:

1. Confidential — BPO calling data (ETL Development). Duration: Dec 2022 - June 2023

● Role: Big Data / Azure Python Developer.
● The pipeline has three stages: Source-to-Raw, Raw-to-Refined, and Refined-to-Certified.
● We pull the client's data from the source and land it in the Raw layer in CSV format.
● We perform data cleaning and quality checks on top of the raw data (a sketch of this step follows the environment list below).
● Based on the client's business logic, we transform the refined data and store it in the form of dimension and fact tables.

Environment: Azure Data Factory, Azure Databricks, Azure Synapse, Azure Linked Services, SSMS, Blob Storage, file formats such as CSV and Parquet, Pipelines, Activities, Datasets, and Linked Services.
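
A minimal sketch of the Raw-to-Refined step described above, assuming an Azure Databricks environment with Delta Lake available; the mount paths, column names, and date format are hypothetical:

```python
# Hypothetical sketch of the Raw-to-Refined step described above,
# written for Azure Databricks; paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw_to_refined").getOrCreate()

# Read the CSV dump landed in the Raw layer.
raw = spark.read.option("header", True).csv("/mnt/raw/bpo_calls/")

# Basic cleaning and quality checks: drop duplicates, require key columns.
refined = (
    raw.dropDuplicates()
       .filter(F.col("call_id").isNotNull())
       .withColumn("call_date", F.to_date("call_date", "yyyy-MM-dd"))
)

# Persist the Refined layer as Delta for the downstream Certified step.
refined.write.format("delta").mode("overwrite").save("/mnt/refined/bpo_calls/")
```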

2. Dentsply Sirona — ERP system Data Warehousing (ETL Development).
Duration: April 2022 - Nov 2022

● Role: Big Data / Azure Python Developer.
● The pipeline has three stages: Source-to-Raw, Raw-to-Refined, and Refined-to-Certified.
● The client's data is received into the Raw layer from Qlik.
● Quality checks are performed on the raw data, which is then stored in the Refined layer.
● Transformations are applied to the refined data, which is stored in the form of dimension and fact tables (see the sketch after the environment list below).

Environment: Azure Data Factory, Azure Databricks, Azure Synapse, Azure Linked Services, SSMS, Blob Storage, file formats such as CSV and Parquet, Pipelines, Activities, Datasets, and Linked Services.
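
A minimal sketch of the Refined-to-Certified step described above, splitting refined ERP data into a dimension and a fact table; the paths, table grain, and column names are hypothetical:

```python
# Hypothetical sketch of the Refined-to-Certified step: splitting refined
# ERP order data into a dimension and a fact table. Column names assumed.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("refined_to_certified").getOrCreate()

refined = spark.read.format("delta").load("/mnt/refined/erp_orders/")

# Dimension: one row per customer.
dim_customer = (
    refined.select("customer_id", "customer_name", "country")
           .dropDuplicates(["customer_id"])
)

# Fact: order measures keyed by the customer dimension.
fact_orders = refined.select(
    "order_id", "customer_id", "order_date", "quantity", "amount"
)

dim_customer.write.format("delta").mode("overwrite").save("/mnt/certified/dim_customer/")
fact_orders.write.format("delta").mode("overwrite").save("/mnt/certified/fact_orders/")
```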


3. Otsuka — File DQ Checks.
Duration: Jan 2022 - Mar 2022

● Quality checks are performed on received files, covering file name, file type, file size, file header, etc.
● Files are first validated against the expected file-name pattern along with the specified file format.
● The file size is then checked (a sketch of these checks follows below).
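
A minimal sketch of these file DQ checks in plain Python; the name pattern, expected header, and size limit are assumed purely for illustration:

```python
# Hypothetical sketch of the file DQ checks described above: the name
# pattern, expected header, and size limit are assumed for illustration.
import os
import re
import csv

NAME_PATTERN = re.compile(r"^sales_\d{8}\.csv$")   # e.g. sales_20220131.csv
EXPECTED_HEADER = ["id", "region", "amount"]
MAX_SIZE_BYTES = 50 * 1024 * 1024                  # assumed 50 MB cap

def validate_file(path: str) -> list[str]:
    """Return a list of DQ failures for one received file."""
    errors = []
    name = os.path.basename(path)
    if not NAME_PATTERN.match(name):
        errors.append(f"bad file name: {name}")
    if os.path.getsize(path) > MAX_SIZE_BYTES:
        errors.append("file exceeds size limit")
    with open(path, newline="") as fh:
        header = next(csv.reader(fh), [])
        if header != EXPECTED_HEADER:
            errors.append(f"unexpected header: {header}")
    return errors
```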

 