
AJINATH (RID : pp3lnm5bry4)

Designation : AWS + Redshift

Location : Pune, India

Experience : 7 Years

Rate : $17 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
AWS, Pentaho, Redshift, Snowflake, DBT Tool, MySQL
Description

Ajinath Daund

 

 

Designation: Snowflake Developer/ETL Developer

 

Summary of Experience:

 

  • 7 years of experience with databases, ETL tools (Pentaho Data Integration, Talend, Snaplogic), and DBT (Data Build Tool), with a good understanding of data warehouse concepts. Able to understand and transform complex business requirements into software, ensuring applications are delivered on time, to specification, scalable, performance-optimized, and maintainable.
  • 3 years of professional experience working with the Snowflake Cloud Data Warehouse and AWS S3.
  • Hands-on experience with Snowflake utilities: SnowSQL, Time Travel, Zero-Copy Cloning, Staging, Snowpipe, Streams, Tasks, Data Masking, Views, Stored Procedures, Functions, Email Notifications, etc. (a short sketch follows this list).
  • Handled large and complex data sets (JSON, Parquet, and CSV files) from sources such as AWS S3.
  • Worked on query optimization.
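
A minimal Snowflake SQL sketch of two of the utilities listed above, Zero-Copy Cloning and Time Travel; the orders table name is purely illustrative and not taken from the projects below.

    -- Zero-copy clone: a metadata-only copy, no data is physically duplicated
    CREATE TABLE orders_backup CLONE orders;

    -- Time Travel: query the table as it looked one hour ago
    SELECT * FROM orders AT (OFFSET => -3600);

    -- Recover a dropped table while it is still within the retention period
    UNDROP TABLE orders;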

 

Summary of Projects Handled

 

Project 1 – Big Data Project

 

Employer Name : Confidential

 

Languages and Technologies involved – MySQL Workbench, Pentaho Data Integration, AWS, Athena

 

Project Description: The goal of this project was to get data from different sources such as S3, Apache Hadoop, S3 Browser, and external files, apply transformation logic as per client requirements, and load that data into the data warehouse using PDI (Pentaho Data Integration).

 

Responsibilities:

    • Analyzed mapping documents and transformation logic to create fact and dimension tables.
    • Created fact and dimension tables as per client requirements.
    • Created jobs and transformations as per client requirements.
    • Wrote SQL statements in transformations to get data from different sources, apply transformation logic, and load the data into target tables.
    • Created stored procedures, triggers, and cursors in the data warehouse.
    • Validated source and target table data.
    • Scheduled cron jobs on the AWS server.
    • Troubleshot issues and optimized query performance using indexing (a short SQL sketch follows this list).
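
A minimal SQL sketch (MySQL dialect) of the dimension-load and indexing pattern described above; the table and column names (stg_customer, dim_customer, fact_sales) are hypothetical, not actual project objects.

    -- Insert only customers that are not yet present in the dimension table
    INSERT INTO dim_customer (customer_id, customer_name, city, load_date)
    SELECT s.customer_id, s.customer_name, s.city, CURRENT_DATE
    FROM stg_customer s
    LEFT JOIN dim_customer d ON d.customer_id = s.customer_id
    WHERE d.customer_id IS NULL;

    -- Index added while tuning slow lookups against the fact table
    CREATE INDEX idx_fact_sales_customer ON fact_sales (customer_id);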

 

Project 2 – Big Data Project

 

Employer Name : Confidential

 

Languages and Technologies involved – Redshift, Snowflake, DBT, AWS S3

 

Project Description: The goal of this project was to migrate existing processes from Redshift to Snowflake, rewriting the SQL in DBT.
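
A hedged sketch of the Redshift-to-Snowflake data movement behind such a migration; the bucket, IAM role ARN, credentials, and the orders table are placeholders, not actual project objects.

    -- Redshift side: unload the existing table to S3 as Parquet
    UNLOAD ('SELECT * FROM public.orders')
    TO 's3://migration-bucket/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-unload-role'
    FORMAT AS PARQUET;

    -- Snowflake side: stage the same S3 location and copy the files in
    CREATE OR REPLACE STAGE migration_stage
      URL = 's3://migration-bucket/orders/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

    COPY INTO orders
    FROM @migration_stage
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;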

 

Responsibilities:

    • Unloaded data from Redshift to Snowflake using manual scripts.
    • Created stages and file formats in Snowflake to read data from the Amazon S3 location.
    • Loaded data into Snowflake using SQL scripts.
    • Analyzed the existing SQL scripts written in Redshift.
    • Converted Redshift SQL into DBT models (see the model sketch after this list).
    • Validated Redshift data against Snowflake.
    • Ran DBT models through the command line.
    • Created complex SQL DBT models as per client requirements.
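
A minimal example of what one converted DBT model might look like; the model and source names (orders_summary, stg_orders) are hypothetical assumptions.

    -- models/orders_summary.sql  (run with: dbt run --select orders_summary)
    {{ config(materialized='table') }}

    SELECT
        o.customer_id,
        COUNT(*)      AS order_count,
        SUM(o.amount) AS total_amount
    FROM {{ ref('stg_orders') }} o
    GROUP BY o.customer_id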

 

Project 3 – Big Data Project

 

Employer Name : Confidential

 

Languages and Technologies involved – SQL Server, Snowflake, Alteryx, Snaplogic, AWS S3

 

Project Description: The goal of this project was to get data from different sources such as SQL Server and AWS S3, apply transformation logic using the Snaplogic tool, and load the data into Snowflake; Tableau then used that data for reporting.

 

Responsibilities:

    • Converted Alteryx jobs into Snaplogic.
    • Created jobs in Snaplogic to get data from SQL Server and load it into Snowflake.
    • Created stages and file formats in Snowflake to load data from Amazon S3.
    • Created standard views, secure views, and materialized views in Snowflake.
    • Created streams, tasks, stored procedures, functions, and data masking policies (a short sketch follows this list).
    • Worked on cloning, Time Travel, and data sampling.
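
A short Snowflake sketch of the stream/task and masking-policy objects mentioned above; the table, warehouse, column, and role names are illustrative assumptions.

    -- Stream captures new rows landing in the raw table
    CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

    -- Task consumes the stream on a schedule, only when there is new data
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE  = '60 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO orders (order_id, customer_id, email, amount)
      SELECT order_id, customer_id, email, amount
      FROM orders_stream
      WHERE METADATA$ACTION = 'INSERT';

    ALTER TASK load_orders_task RESUME;

    -- Column-level data masking for a sensitive field
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val ELSE '***MASKED***' END;

    ALTER TABLE orders MODIFY COLUMN email SET MASKING POLICY email_mask;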

Project 4 – Big Data Project

 

Employer Name : Confidential

 

Languages and Technologies involved – MySQL, Oracle, Snowflake, Talend, Stitch, AWS S3, CI/CD, GitLab, GitHub

 

Project Description: The goal of this project was to get data from different sources such as Stitch, AWS S3, and the Talend server, apply transformation logic using the Talend tool, and load the data into Snowflake; Tableau then used that data for reporting.

 

Responsibilities:

    • Understood requirements, analyzed the data, and implemented feasible solutions.
    • Created jobs in Talend to get data from Salesforce, Oracle, MySQL, etc., and load it into Snowflake.
    • Promoted Talend jobs to Talend Cloud and scheduled them to run daily, weekly, or monthly.
    • Created storage integrations, stages, file formats, and Snowpipe in Snowflake to load data from Amazon S3 (a short sketch follows this list).
    • Created standard views, secure views, and materialized views in Snowflake.
    • Created streams, tasks, stored procedures, functions, and data masking policies.
    • Worked on cloning, Time Travel, data sampling, and email notifications.
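
A minimal Snowflake sketch of the S3 ingestion objects listed above (storage integration, file format, stage, Snowpipe); the bucket, role ARN, and table names are placeholders.

    CREATE OR REPLACE STORAGE INTEGRATION s3_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-s3-role'
      STORAGE_ALLOWED_LOCATIONS = ('s3://data-bucket/landing/');

    CREATE OR REPLACE FILE FORMAT csv_ff
      TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1;

    CREATE OR REPLACE STAGE landing_stage
      URL = 's3://data-bucket/landing/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = csv_ff;

    -- Snowpipe auto-ingests new files as they land in the S3 location
    CREATE OR REPLACE PIPE landing_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events FROM @landing_stage;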

 

 

Skills:

  • Databases: MySQL, Redshift, Snowflake; data warehouse concepts; AWS, Apache, TOAD.
  • ETL Tools: PDI (Pentaho Data Integration), DBT (Data Build Tool), Talend, Snaplogic, Stitch replication.
  • Programming Languages: basic knowledge of Java and Python.
  • Data Formats: JSON, CSV, Parquet.
  • Version Control/Repositories: SourceTree, Git, SVN, GitLab, GitHub.
  • Project Management Tool: Jira.
  • Cloud Services: Amazon Web Services (AWS), Athena, S3 Browser.

 

 