
Venkatesh (RID : 14f6gl5c9cei3)

Designation : AWS Data Engineer

Location : Hyderabad, India

Experience : 7 Years

Rate : $25 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
AWS, Snowflake, SSIS, SQL Server, AWS Glue, Lambda, SQL, Informatica PowerCenter
Description
  • 7 years of total professional experience.
  • Good knowledge of and work experience with Agile methodology.
  • Good knowledge of data warehousing.
  • Extensive hands-on experience with Informatica PowerCenter.
  • As an AWS Certified Solutions Architect - Associate and AWS Certified Data Analytics - Specialty, has deep knowledge of AWS compute services such as EC2 and Lambda, storage services such as S3 and EFS, and database services such as relational databases (RDS), Redshift, and DynamoDB. Also has good knowledge of Glue, Snowball, SNS, SQS, KMS, CloudWatch, Kinesis, and EMR.
  • Good knowledge of Python: created a tool that generates a mapping specification document from an exported Informatica workflow XML (see the sketch after this list).
  • Good knowledge of and hands-on experience with Snowflake.
  • Hands-on experience using boto3.
  • Experience creating data sources, mappings with various transformations, and workflows.
  • Good understanding of relational database management systems such as MS SQL Server; extensive data integration work using Informatica for the extraction, transformation, and loading of data from various source systems; expertise in UNIX shell scripting.
  • Experience in Informatica, creating mappings and workflows, with proficiency in using Informatica Workflow Manager to create and schedule workflows.
  • Experience with scheduling applications such as Autosys and TWS.
  • Excellent communication and interpersonal skills; quick to grasp new technical and business concepts and apply them as needed.
  • Good problem-solving and analytical skills (received client appreciation multiple times for analysis work).
  • Good team member; able to work both independently and in a team environment.
  • Independently performs complex troubleshooting, root-cause analysis, and solution development.
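
For illustration, a minimal sketch of the kind of Informatica-export parser mentioned above, in Python. It assumes the standard POWERMART export layout (FOLDER, MAPPING, and CONNECTOR elements); the file names and the CSV columns of the generated specification are placeholders, not the actual tool's output.

# mapping_spec.py -- minimal sketch: turn an exported Informatica workflow XML
# into a CSV "mapping specification" (source field -> target field lineage).
# Assumes the standard POWERMART export layout; output columns are illustrative.
import csv
import sys
import xml.etree.ElementTree as ET

def extract_mapping_spec(xml_path, csv_path):
    tree = ET.parse(xml_path)
    root = tree.getroot()                      # <POWERMART> in a standard export
    rows = []
    for folder in root.iter("FOLDER"):
        for mapping in folder.iter("MAPPING"):
            # CONNECTOR elements describe field-level links between instances
            for conn in mapping.iter("CONNECTOR"):
                rows.append({
                    "mapping": mapping.get("NAME"),
                    "from_instance": conn.get("FROMINSTANCE"),
                    "from_field": conn.get("FROMFIELD"),
                    "to_instance": conn.get("TOINSTANCE"),
                    "to_field": conn.get("TOFIELD"),
                })
    with open(csv_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["mapping", "from_instance",
                                                "from_field", "to_instance",
                                                "to_field"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    # e.g. python mapping_spec.py wf_export.xml mapping_spec.csv
    extract_mapping_spec(sys.argv[1], sys.argv[2])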

Technical skills:

Languages/Technologies : Informatica PowerCenter, SQL, Unix Shell Scripting, Python, AWS

Databases : SQL Server, AWS database services, Snowflake, Redshift

Tools : Rally, Jira, Notepad++, Git, Jupyter

Operating Systems : Windows (98/2000/XP/2007)

CORE COMPETENCIES:

·       Working as an ETL/BI Developer in an Agile methodology.

·       Played a vital role as a developer involved in the analysis, design, coding, and testing of the application.

·       Major work on ETL/BI using Informatica PowerCenter, AWS ETL services, and Snowflake procedures.

·       Created scripts using shell scripting and Python.

Responsibilities:

As an ETL/BI Developer working in an Agile methodology, sprint planning, estimation, development, execution, reviews, testing, and retrospectives are carried out individually for the allocated responsibility under the supervision of the Team Lead and Scrum Master.

Key Achievements

·       Awarded the Applause Award for efforts on a regulatory project for one of the leading asset management companies in the U.S.

·       Awarded a Spot Award for hard work and commitment.

·       Received appreciation from the manager for exhibiting team spirit.

·       Received an award for excellent performance from Infosys, India.

·       Received the Insta Award for zero-defect code delivery to production during 2016 at a previous organization.

·       Received appreciation from the manager of a previous organization for leading the offshore team in effective code delivery.


Professional projects:

Working with Delta cubes Technologies as a Data Engineer from January 2020 to Present

Project title: Hastings IFRS17    Team Size: 10

Technologies and tools: AWS S3, Snowflake, SSIS, SQL Server

Role: Data Engineer

Responsibilities:

·       Created an external stage to read data from S3.

·       Created Snowflake procedures to perform validations, transform data, and load it to target tables (see the sketch below).
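
For illustration, a minimal sketch of the external-stage and procedure pattern above, using the snowflake-connector-python client. The account details, bucket path, storage integration, stage, table, and procedure names are hypothetical, and the validation and transformation logic inside the procedure is not reproduced.

# snowflake_load.py -- minimal sketch of the "external stage + procedure" pattern.
# Connection parameters, bucket path, and all object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="***",
    warehouse="LOAD_WH", database="IFRS17_DB", schema="STAGING",
)
cur = conn.cursor()

# External stage pointing at the S3 landing zone (assumes a pre-built storage integration)
cur.execute("""
    CREATE STAGE IF NOT EXISTS ifrs17_stage
      URL = 's3://example-landing-bucket/ifrs17/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Bulk-load the staged files into a raw table
cur.execute("COPY INTO RAW_POLICY FROM @ifrs17_stage PATTERN = '.*policy.*[.]csv'")

# Call a stored procedure that validates, transforms, and loads to the final tables
# (the procedure body itself is project-specific and not shown here)
cur.execute("CALL LOAD_POLICY_FACTS()")

cur.close()
conn.close()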

Project title: HR Analytics    Team Size: 10

Technologies and tools: AWS Glue, S3, RDS, Redshift, SQL Server, Lambda, DMS, Lake Formation, Snowflake

Role: Data Engineer

Responsibilities:

·       Created workflows and jobs using AWS Glue to transform and load data.

·       Built data pipelines from on-prem data sources to the landing zone (an AWS S3 bucket) and on to cloud databases such as Amazon Redshift and Snowflake (see the sketch below).
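
For illustration, a minimal sketch of a Glue ETL job of the kind described above: it reads a cataloged S3 source, remaps columns, and loads the result into Amazon Redshift. The catalog database, table, Glue connection, and bucket names are assumptions made for the example.

# glue_job.py -- minimal sketch of a Glue ETL job (runs inside the Glue job runtime).
# Catalog database, table, connection, and bucket names are hypothetical placeholders.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source: table crawled into the Glue Data Catalog from the S3 landing zone
src = glue_context.create_dynamic_frame.from_catalog(
    database="hr_analytics", table_name="employees_raw")

# Transform: rename and retype columns to match the warehouse schema
mapped = ApplyMapping.apply(frame=src, mappings=[
    ("emp_id", "string", "employee_id", "int"),
    ("doj", "string", "date_of_joining", "date"),
])

# Target: Redshift via a pre-defined Glue connection, using S3 as temporary space
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=mapped,
    catalog_connection="redshift-conn",
    connection_options={"dbtable": "hr.employees", "database": "analytics"},
    redshift_tmp_dir="s3://example-temp-bucket/glue/",
)

job.commit()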

Project Title: Asset Management    Team Size: 45

Technologies and tools: Informatica PowerCenter, Unix shell scripting, Tivoli Workload Scheduler, Python, AWS, Snowflake

Role: ETL Developer

Responsibilities:

·       Created mappings and workflows where the sources and targets are XML and flat files.

·       Created Python scripts to perform preprocessing on S3 files using AWS Lambda (see the sketch below).
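
For illustration, a minimal sketch of an S3-triggered Lambda preprocessing step using boto3, as referenced above. The event wiring follows the standard S3 put-event shape; the key prefixes and the cleanup rules are placeholders.

# preprocess_lambda.py -- minimal sketch of an S3-triggered Lambda that
# preprocesses a landed file with boto3. Prefixes and cleanup rules are illustrative.
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # The S3 put event carries the bucket and object key that triggered the function
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Read the raw file and apply a simple cleanup (drop blank lines, normalise delimiters)
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    cleaned = "\n".join(
        line.strip().replace("|", ",") for line in body.splitlines() if line.strip()
    )

    # Write the preprocessed copy to a separate prefix for the downstream ETL
    out_key = key.replace("incoming/", "processed/", 1)
    s3.put_object(Bucket=bucket, Key=out_key, Body=cleaned.encode("utf-8"))
    return {"status": "ok", "output_key": out_key}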

Project Title: BI2020    Team Size: 30

Technologies and tools: Azure SQL Data Warehouse, Hive

Role: ETL Developer

Responsibilities:

·       Analyzed user story requirements to meet stakeholders' needs.

·       Analyzed existing ETL processes and converted them into stored procedures and mapping specifications.








 