
Kandula (RID : 12a29lm095qjc)

Designation : ETL Developer

Location : Chennai, India

Experience : 9 Years

Rate : $13 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
ETL, Replicate, Truncate Tables, XML
Description

Professional Summary

  • Having 9 years of experience in IT: 7 years with Informatica PowerCenter 10.2/10.0/9.x/8.6 development and 3 years with Matillion ELT/ETL Cloud with Snowflake.
  • Implemented ETL processes for data warehouse and migration projects using ETL tools such as Informatica PowerCenter and Matillion ETL Cloud.
  • Created Orchestration and Transformation jobs in Matillion.
  • Experienced in creating Grid Iterator, SQL, Salesforce, Detect, Unite, Truncate Tables, Query Result to Grid, Bash Script, End Failure, End Success and Delete Tables components.
  • Experienced with components such as If, Replicate, Filter, Calculator, Table Update, Rewrite Table and Table Output.
  • Hands-on experience configuring the Salesforce component to extract data from Salesforce.
  • Experienced in creating job variables, environment variables and grid variables.
  • Experience loading data into the LANDING, DATA_LAKE and ANALYTICS layers.
  • Experienced with data extraction, transformation and loading (ETL) from disparate data sources such as Oracle, SQL Server, Snowflake and flat files into the target data warehouse.
  • Involved in developing mappings and mapplets using Informatica Source Analyzer, Target Designer, Mapping Designer and Transformation Developer.
  • Used various transformations such as Expression, Filter, Aggregator, Lookup, Router and Sequence Generator to load consistent data into Oracle and SQL Server databases.
  • Implemented SCD Type 1 and SCD Type 2 methods to preserve current and historical data.
  • Experience integrating various data sources such as Oracle and flat files in fixed-width and delimited formats like .csv, .tsv, .dat, .txt and .IM.
  • Experienced in working with XML file targets; performed XML validation using the XSD file.
  • Experienced in creating dynamic files and extracting data from them.
  • Experienced in working with Salesforce: used Salesforce objects as sources and targets, and wrote SOQL queries to perform data validations.
  • Experienced in migrating code from DEV to UAT and TEST environments; created folders and granted users folder-level permissions.
  • Hands-on experience with SqlDBM data modelling to create data model diagrams.
  • Extensively worked with Agile methodologies and have good knowledge of the JIRA tool.
  • Good knowledge of databases such as Oracle, Microsoft SQL Server 2014 and Snowflake Cloud DB.
  • Performed unit testing, wrote unit test cases and supported UAT testing.
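As an illustration of the SCD Type 2 pattern mentioned above, the sketch below shows the general idea of preserving history: expire the current dimension row when a tracked attribute changes, then insert a new current version. This is a hypothetical, minimal example (table and column names are invented, and SQLite stands in for the actual warehouse so it runs standalone), not code from any project deliverable:

```python
import sqlite3

# In-memory database standing in for the warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,     -- NULL means "current version"
        is_current  INTEGER
    )
""")
# Seed one existing current row.
cur.execute("INSERT INTO dim_customer VALUES (1, 'Chennai', '2020-01-01', NULL, 1)")

def scd2_upsert(cur, customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed,
    then insert the incoming record as the new current version."""
    cur.execute(
        """UPDATE dim_customer
           SET valid_to = ?, is_current = 0
           WHERE customer_id = ? AND is_current = 1 AND city <> ?""",
        (load_date, customer_id, city),
    )
    if cur.rowcount:  # a row was expired, so insert the new version
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, city, load_date),
        )

# Customer 1 moves: the old row is closed out, a new current row is added.
scd2_upsert(cur, 1, "Hyderabad", "2021-06-01")
rows = cur.execute(
    "SELECT city, valid_to, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
# rows -> [('Chennai', '2021-06-01', 0), ('Hyderabad', None, 1)]
```

An SCD Type 1 load would instead overwrite `city` in place, keeping only the latest value.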

 

 

Technical Skills: 

ETL Tools

Informatica PowerCenter 10.x/9.X, Matillion ETL/ELT Cloud

Databases

Oracle 11g/10g, Microsoft SQL Server 2014, Snowflake Cloud DB

Data Modelling

SqlDBM Cloud

Tools

SQL Developer, JIRA, Dataflux Management

Operating Systems

Linux, Unix, Windows Server 2008.

Languages

SQL, basic PL/SQL, Unix shell scripting.

 

Education:

  1. Completed B.Tech in Electrical & Electronics Engineering from JNTU Hyderabad, 2006-2010.

 

Professional Experience:

Project # 1

Project : DWBI Analytics

Client : SOPHOS

Environment : Matillion Cloud ETL/ELT, Snowflake (Cloud DB), Salesforce, Agile Methodologies, JIRA, SqlDBM Cloud data modelling, SourceTree, GitHub

Sophos is a British security software and hardware company. Sophos develops products for communication endpoint, encryption, network security, email security, mobile security and unified threat management. Sophos is primarily focused on providing security software to organizations.

 

Responsibilities

 

  • Effectively used the Matillion ETL tool to extract data from Snowflake and AWS S3 buckets.
  • Used the SQL component to write SQL queries within the Matillion ETL tool.
 