
Gunveer (RID: 210ulpi6v3xp)

Designation: AWS Data Engineer

Location: Noida

Experience: 8 Years

Rate: $16 / Hour

Availability: Immediate

Work From: Offsite

Category: Information Technology & Services

Key Skills
AWS Data Engineer, Lambda, Python, Lambda Functions, SLA, ETL
Description

Professional Experience:

 

Project Name: Medtronic (July 2022 – September 2023)

Project Role: AWS Data Engineer.

 

Responsibilities:

 

  • Create and maintain ETL processes to extract, transform, and load data from various sources into target data stores such as data lakes.
  • Collaborate with data architects and stakeholders to design scalable and cost-effective data processing architectures on AWS, considering factors like performance, security, and compliance.
  • Design and build the ETL pipeline, covering data extraction from various sources, transformation, and loading into the target data warehouse.
  • Develop and maintain the ETL code itself, including scripts, workflows, and data integration processes, using programming languages such as Python and ETL tools such as AWS Glue, Apache Spark, and PySpark (a minimal sketch follows this list).
  • Monitor and maintain the ETL and dbt pipelines in production; troubleshoot and resolve any issues that arise so the pipelines run smoothly.
  • Document architecture, data pipelines, and processes.
  • Create and manage data catalogs using the AWS Glue Data Catalog to maintain metadata about datasets, tables, and partitions.
  • Help clients transform their IT infrastructure, operations, and applications to make the most of the scalability, innovation, and cost efficiency of AWS.
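To illustrate the kind of Glue/PySpark ETL job described above, the following is a minimal sketch. The catalog database (sales_db), table (raw_orders), column mappings, and the target S3 path are illustrative placeholders, not details from the actual project.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap; JOB_NAME is supplied by the Glue service at run time.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read the source table registered in the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db",
    table_name="raw_orders",
)

# Transform: keep and rename only the columns the target model needs.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_dt", "string", "order_date", "string"),
        ("amount", "double", "order_amount", "double"),
    ],
)

# Load: write partitioned Parquet into the curated zone of the data lake.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/orders/",
        "partitionKeys": ["order_date"],
    },
    format="parquet",
)

job.commit()

Reading the source through the Glue Data Catalog keeps dataset, table, and partition metadata in the same catalog referred to in the bullets above.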


Project Name: AstraZeneca (December 2021 – June 2022)

Project Role: AWS Python Engineer.


SERVERLESS ON AWS

 

  • Developed jobs using AWS Lambda and Step Functions.
  • Developed Python code in Lambda as per requirements.
  • Transferred files/objects from one S3 bucket to another.
  • Loaded data into Oracle tables using Lambda.
  • Tested the code in the dev/pre-prod environments before promoting it to the prod/QA environments.
  • Worked with AWS Lambda functions in Python to automate many AWS tasks.

Some of them are:

  • Automated scheduled jobs using Step Functions and Amazon EventBridge (a minimal sketch follows this list).
  • Automated the creation of alarms and notified the specified distribution groups using Amazon SNS notifications.
  • Detected and auto-remediated unintended permissions in S3 bucket and object ACLs.
  • Processed large volumes of data in columnar storage and performed data analytics on it.
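As an illustration of the Lambda-based automation described above, here is a minimal sketch of a handler that copies an object from one bucket to another and notifies a distribution group through SNS; it is written to be invoked on a schedule by an EventBridge rule or as a task in a Step Functions state machine. The bucket names, topic ARN, and default object key are assumed placeholders, not values from the project.

import os

import boto3

# Placeholder configuration, normally supplied via Lambda environment variables.
SOURCE_BUCKET = os.environ.get("SOURCE_BUCKET", "example-source-bucket")
TARGET_BUCKET = os.environ.get("TARGET_BUCKET", "example-target-bucket")
TOPIC_ARN = os.environ.get(
    "TOPIC_ARN", "arn:aws:sns:us-east-1:123456789012:example-transfer-topic"
)

s3 = boto3.client("s3")
sns = boto3.client("sns")


def handler(event, context):
    """Copy an S3 object between buckets and notify subscribers via SNS."""
    # The EventBridge rule (or Step Functions state) passes the key in the event.
    key = event.get("object_key", "data/latest.csv")

    # Server-side copy; the object bytes never pass through the Lambda function.
    s3.copy_object(
        Bucket=TARGET_BUCKET,
        Key=key,
        CopySource={"Bucket": SOURCE_BUCKET, "Key": key},
    )

    # Notify the distribution group subscribed to the SNS topic.
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="S3 transfer completed",
        Message=f"Copied s3://{SOURCE_BUCKET}/{key} to s3://{TARGET_BUCKET}/{key}",
    )

    return {"status": "ok", "key": key}

Because copy_object performs the copy inside S3 (for objects within the single-request copy size limit), the function stays well within Lambda's memory and execution-time limits.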
