
Saurabh (RID : pp3lo5lsw37)

Designation : AWS Data Engineer

Location : Pune, India

Experience : 4 Years

Rate : $14 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
AWS, Python, SQL, Linux, GitHub, Power BI
Description

Saurabh Hinge

Professional Summary

 

Data Engineer


With 4.1 years of experience as a Data Engineer, I specialize in building ETL pipelines using PySpark and AWS Glue, as well as creating BI reports with Power BI Desktop. My skill set also includes ad hoc SQL query writing and proficiency in Python, SQL, and AWS services such as S3, Redshift, DMS, Athena, and Step Functions, along with Snowflake.

 

Technical Expertise

 

  • Programming Languages : Python, PySpark
  • Databases : SQL Server, MySQL
  • AWS : Glue, S3, Lambda, Step Functions, Redshift, DMS
  • Reporting Tools : Power BI Desktop
  • Operating Systems : Windows, Linux
  • Other : GitHub, Spark, Hadoop, MS Office

 

Professional Experience

 

  • Data Engineer – September 2019 to Present

 

    • Developed and maintained data pipelines and ETL processes using Python, Spark, and AWS services (a sketch of such a Glue job follows this list).
    • Provided end-to-end data solutions to the business and analytics teams, ensuring end-to-end encryption by leveraging AWS cloud services and native Python scripting.
    • Engineered data pipelines using internal ETL tools and native Python scripts with Spark SQL and AWS services such as Glue (PySpark) and Lambda, ensuring data encryption at rest and in transit.
    • Worked on QA support activities, test data creation, and unit testing.
    • Actively participated in daily stand-ups, biweekly Scrum meetings, and project meetings.
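
A minimal sketch of the kind of Glue (PySpark) ETL job described above, assuming hypothetical bucket names, column names, and job parameters; the actual client pipelines are not shown here.

    import sys
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    # Standard Glue job bootstrap
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Extract: raw CSV files landed in S3 (hypothetical path)
    raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/")

    # Transform: basic typing and filtering of bad rows
    orders = (
        raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
    )

    # Load: partitioned Parquet written back to S3 for Redshift/Athena consumers
    orders.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/orders/"
    )

    job.commit()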

 

Projects

 

  1. Cloud Data Migration, Banking Domain, 19 months

 

    • Tools: Python, PySpark, SQL, AWS (Glue, S3, Lambda, Redshift, Step Functions, DMS), Excel
    • Roles:
      • Led the migration of the Proc workflow into PySpark, applying optimization techniques to reduce processing time and generate the expected reports.
      • Engaged with the BA to understand requirements clearly, developed missing scenarios, and clarified them with the BA.
      • Worked on Spark SQL code as an alternative approach for faster data processing and better performance (see the sketch after this list).
      • Ingested and processed large volumes of data from various structured and semi-structured sources into S3 (AWS Cloud).
      • Deployed the development code after the client's approval.
      • Collaborated on QA support activities, test data creation, and unit testing.
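
A minimal sketch of the Spark SQL approach mentioned above, assuming hypothetical S3 paths and column names: semi-structured JSON is ingested from a landing zone, registered as a temporary view, and the procedural report logic is expressed as one set-based query.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("proc-to-spark-sql").getOrCreate()

    # Ingest semi-structured JSON events from the landing zone (hypothetical path)
    events = spark.read.json("s3://example-landing-bucket/events/")
    events.createOrReplaceTempView("events")

    # Procedural report logic rewritten as a single set-based Spark SQL query
    daily_summary = spark.sql("""
        SELECT account_id,
               to_date(event_ts) AS event_date,
               COUNT(*)          AS txn_count,
               SUM(amount)       AS total_amount
        FROM events
        WHERE status = 'POSTED'
        GROUP BY account_id, to_date(event_ts)
    """)

    # Curated output for downstream reporting
    daily_summary.write.mode("overwrite").parquet("s3://example-curated-bucket/daily_summary/")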

 

  2. Build ETL Pipeline and Analysis Report, Banking Domain, 15 months

 

    • Tools: Python, PySpark, SQL, AWS (Glue, S3, Lambda, Redshift), Excel
    • Roles:
      • Applied data cleaning techniques to correct and remove inaccurate and corrupted data, improving data quality and productivity (see the sketch after this list).
      • Applied multiple performance tuning techniques to achieve faster processing.
      • Developed and maintained ETL flows, data models, reports, and dashboards for Supply Chain departments: inventory data modeling, complexity model, and product analysis.
      • Solved performance issues in scripts through an understanding of joins, grouping, and aggregation.
      • Wrote calculated columns and measure queries in Power BI Desktop to present sound data analysis.
      • Filtered data and performed multiple operations to derive meaningful insights that help the client with decision-making and analysis.
      • Worked in parallel with QA to resolve bugs raised on modules.
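
A minimal sketch of the data-cleaning step described above, assuming a hypothetical customer file and column names; it shows deduplication, null handling, and type fixes in PySpark.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("cleaning-demo").getOrCreate()

    raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/customers/")

    clean = (
        raw.dropDuplicates(["customer_id"])                      # drop duplicate records
           .filter(F.col("customer_id").isNotNull())             # remove rows missing the key
           .withColumn("email", F.lower(F.trim("email")))        # normalize text fields
           .withColumn("signup_date", F.to_date("signup_date"))  # invalid dates become null
    )

    clean.write.mode("overwrite").parquet("s3://example-curated-bucket/customers/")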

 

 

Educational Qualification

 

 

  • Post Graduate Diploma in Banking Services – 8.83 CGPA (June 2019), Manipal University, Bangalore
  • Bachelor of Science (Agriculture) – 79.30% (June 2018), Mahatma Phule Krishi Vidyapeeth, Rahuri

 

Achievements

 

  1. Certificate of Appreciation –
    • Received a 5 rating in financial year 2020-21
    • Received appreciation from the MD in financial year 2022-23

 

 

Certifications/Trainings

 

  1. Certifications
    • Google Data Analytics
    • SQL: Data Reporting and Analysis
    • Business Analysis Fundamentals
    • Python Crash Course
    • Apache Spark

 

  2. Trainings
    • AWS (currently pursuing via a Udemy course)
    • Share Data Through the Art of Visualization
    • Introduction to Data Engineering, IBM
 