
Ajinath

Specialist Data Engineer
Location: Ahmednagar
Contact Details
Phone: {{contact.cdata.phone}}
Email: {{contact.cdata.email}}
Candidate Information
  • Experience: 6.2 years
  • Hourly Rate: $10
  • Availability: Immediate
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: July 16, 2024
Key Skills
Data Engineer, Snaplogic, MySQL, Data Warehousing, Snowflake, Redshift
Education
2014-2017
(Information Technology)
Pune University

Summary

Project 1 – Big Data Project
Employer Name: Aloha Technology
Languages and Technologies involved: MySQL Workbench, Pentaho Data Integration tool, AWS, Athena
Project Description: The goal of this project was to get data from different sources such as S3, Apache Hadoop, S3 Browser, and external files, apply client-specified transformation logic, and load the data into the data warehouse using PDI (Pentaho Data Integration).

Responsibilities:

  • Analyzed the mapping document and transformation logic to design fact and dimension tables.
  • Created fact and dimension tables as per client requirements.
  • Created jobs and transformations as per client requirements.
  • Wrote SQL statements in transformations to get data from different sources, apply transformation logic, and load the data into target tables (see the SQL sketch after this list).
  • Created stored procedures, triggers, and cursors in the data warehouse.
  • Validated source and target table data.
  • Scheduled cron jobs on the AWS server.
  • Troubleshot issues and optimized query performance using indexing.
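
A minimal sketch of the kind of transformation SQL and indexing described above; the staging and dimension table names (stg_customer, dim_customer) are illustrative, not from the project:

-- Hypothetical dimension table; names and columns are illustrative.
CREATE TABLE dim_customer (
    customer_key INT AUTO_INCREMENT PRIMARY KEY,
    customer_id  VARCHAR(50) NOT NULL,
    full_name    VARCHAR(200),
    country      VARCHAR(100),
    load_date    DATE
);

-- Transformation step of the kind a PDI job would run: cleanse
-- staging rows and load them into the dimension.
INSERT INTO dim_customer (customer_id, full_name, country, load_date)
SELECT DISTINCT
       s.customer_id,
       CONCAT(TRIM(s.first_name), ' ', TRIM(s.last_name)),
       UPPER(s.country),
       CURRENT_DATE
FROM   stg_customer s
WHERE  s.customer_id IS NOT NULL;

-- Index to speed up lookups against the dimension during fact loads.
CREATE INDEX idx_dim_customer_id ON dim_customer (customer_id);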


Project 2 – Big Data Project
Employer Name: Aloha Technology
Languages and Technologies involved: Redshift, Snowflake, DBT tool, AWS S3
Project Description: The goal of this project was to migrate the existing process from Redshift to Snowflake, rewriting the SQL in DBT.

Responsibilities:

  • Unloaded data from Redshift to Snowflake using manual scripts (see the migration sketch after this list).
  • Created stages and file formats in Snowflake to read data from the Amazon S3 location.
  • Loaded data into Snowflake using SQL scripts.
  • Analyzed the existing SQL scripts written in Redshift.
  • Converted Redshift SQL into DBT models (see the DBT sketch after this list).
  • Validated Redshift data against Snowflake.
  • Ran DBT models through the command line.
  • Created complex SQL DBT models as per client requirements.
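
A minimal sketch of the Redshift-to-Snowflake migration path described above; the bucket, IAM role, credentials, and table names are placeholders, not the project's actual objects:

-- Redshift side: unload a table to S3 as CSV.
UNLOAD ('SELECT * FROM public.orders')
TO 's3://example-bucket/unload/orders_'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-unload-role'
FORMAT AS CSV
HEADER;

-- Snowflake side: file format, stage, and COPY INTO, as in the
-- "Created stages and file formats" step above.
CREATE OR REPLACE FILE FORMAT csv_fmt
  TYPE = 'CSV'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  SKIP_HEADER = 1;

CREATE OR REPLACE STAGE orders_stage
  URL = 's3://example-bucket/unload/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = csv_fmt;

COPY INTO orders
FROM @orders_stage
PATTERN = '.*orders_.*';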
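
And a minimal DBT model of the kind such a conversion produces; the model and source names (stg_orders, raw.orders) are illustrative:

-- models/stg_orders.sql: Redshift SQL rewritten as a DBT model.
-- source() lets DBT resolve the raw table and track lineage.
{{ config(materialized = 'table') }}

SELECT
    order_id,
    customer_id,
    CAST(order_ts AS DATE) AS order_date,
    amount
FROM {{ source('raw', 'orders') }}
WHERE order_id IS NOT NULL

A model like this is then run from the command line with: dbt run --select stg_orders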


Project 3 – Big Data Project
Employer Name: Yash Technology
Languages and Technologies involved: SQL Server, Snowflake, Alteryx, Snaplogic, AWS S3
Project Description: The goal of this project was to get data from different sources such as SQL Server and AWS S3, apply transformation logic using the Snaplogic tool, and load the data into Snowflake, after which Tableau used that data for reporting.

Responsibilities:

  • Converted Alteryx jobs into Snaplogic.
  • Created jobs in Snaplogic to get data from SQL Server and load it into Snowflake.
  • Created stages and file formats in Snowflake to load data from Amazon S3.
  • Created standard views, secure views, and materialized views in Snowflake.
  • Created streams, tasks, stored procedures, functions, and data masking policies (see the first sketch after this list).
  • Worked on cloning, Time Travel, and data sampling (see the second sketch after this list).
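
A minimal sketch of the stream, task, and masking-policy work listed above; the warehouse, table, and role names are illustrative, not from the project:

-- Stream to capture changes on a raw table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- Task that runs every 5 minutes when the stream has data.
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO orders
  SELECT order_id, customer_id, amount
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_orders_task RESUME;

-- Masking policy that hides emails from non-privileged roles.
CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING)
RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val
       ELSE '***MASKED***' END;

ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY mask_email;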
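
And a sketch of cloning, Time Travel, and data sampling in Snowflake, again with illustrative table names:

-- Zero-copy clone of a table.
CREATE TABLE orders_backup CLONE orders;

-- Time Travel: query the table as it looked one hour ago.
SELECT * FROM orders AT (OFFSET => -3600);

-- Time Travel: restore a table dropped by mistake.
UNDROP TABLE orders;

-- Data sampling: roughly 10 percent of rows.
SELECT * FROM orders SAMPLE (10);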
