No. of Positions: 1
Location: Pune
Tentative Start Date: March 27, 2023
Work From: Onsite
Rate: $10 - $12 (Hourly)
Experience: 6 to 8 Years
Job Description
Responsibilities:
Build data pipelines using Talend and related AWS services.
Create ETL jobs to extract data from multiple data sources, then cleanse, transform, and load the data into target data stores.
Develop and publish Talend jobs.
Perform quality assurance on generated results to ensure accuracy and consistency.
Integrate data from sources such as databases and CSV and XML files, in both batch and real time.
Build complex transformations using Talend and load data into data stores such as S3 and Redshift.
Work with the SIT team to drive defects to closure.
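As a minimal illustration of the extract-cleanse-transform-load flow described above (a Python sketch with hypothetical sample data and helper names, standing in for the Talend jobs and AWS targets this role actually involves):

```python
import csv
import io

# Hypothetical CSV source data (illustrative only).
SOURCE_CSV = """id,name,amount
1, Alice ,100
2,Bob,
3,Carol,250
"""

def extract(raw: str):
    """Extract: read rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def cleanse(rows):
    """Cleanse: trim whitespace and drop rows missing an amount."""
    cleaned = []
    for row in rows:
        row = {k: v.strip() for k, v in row.items()}
        if row["amount"]:
            cleaned.append(row)
    return cleaned

def transform(rows):
    """Transform: cast types and derive a dollar amount from cents."""
    return [
        {"id": int(r["id"]), "name": r["name"],
         "amount_usd": int(r["amount"]) / 100}
        for r in rows
    ]

def load(rows, target):
    """Load: append transformed rows to the target store
    (a plain list here; in practice S3 or Redshift)."""
    target.extend(rows)
    return target

target_store = load(transform(cleanse(extract(SOURCE_CSV))), [])
```

In a real pipeline each stage would be a Talend component or an AWS service call rather than a Python function, but the staged structure is the same.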
Requirements:
· Ability to translate technical requirements into data transformation jobs; strong experience in DWH.
· Build data pipelines to bring data from source systems, and cleanse and transform data to support data analytics and reporting.
· Experience working with AWS cloud services: S3, Redshift.
· Experience developing and implementing with ETL tools such as Informatica, Talend, other AWS ETL integration tools, and Talend Data Catalog.
· Strong knowledge of data warehousing and data modelling concepts.
· Strong experience in data quality, source systems analysis, business rules validation, source-target mapping design, performance tuning, and high-volume data loads.
· Strong knowledge of SQL, Python, PySpark, etc.
Good to have (one or more of):
Talend Data Integration Certified
Talend Data Catalog Certified
Talend Data Value Certified
Informatica PowerCenter Data Integration Developer Certified
AWS Certified Data Analytics - Specialty