
CHAITANYA (RID : 210ulon0c2t0)

Designation : DATA ENGINEER

Location : BANGALORE, India

Experience : 4 Years

Rate : $17 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
Data Engineer ETL BIG DATA MATILLION PYTHON AZURE DATABRICKS SNOWFLAKE MONGODB
Description

KOLAGATLA CHAITANYA

DATA ENGINEER

SUMMARY

Results-driven data engineer with a proven track record of designing and implementing scalable data solutions, optimizing data pipelines, and enabling data-driven decision-making.

EXPERIENCE

Data Engineer                05/2022 - Present

Worked as a backend engineer on the ETL tool we created.

 

 

Wrote dynamic Python code that generates Airflow DAGs.
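A minimal, hypothetical sketch of the config-driven pattern behind this kind of dynamic DAG generation: a JSON spec describing tasks and their upstream dependencies is parsed into task ids and edges, which the real tool would then map onto Airflow `DAG` and operator objects. The spec fields and the `build_edges` helper are illustrative assumptions, not the actual code.

```python
import json

# Illustrative pipeline spec; the real tool's schema may differ.
PIPELINE_SPEC = json.loads("""
{
  "dag_id": "example_etl",
  "tasks": [
    {"id": "extract",   "upstream": []},
    {"id": "transform", "upstream": ["extract"]},
    {"id": "load",      "upstream": ["transform"]}
  ]
}
""")

def build_edges(spec):
    """Return (task_ids, edges), each edge being an (upstream, downstream) pair."""
    task_ids = [t["id"] for t in spec["tasks"]]
    edges = [(up, t["id"]) for t in spec["tasks"] for up in t["upstream"]]
    return task_ids, edges

tasks, edges = build_edges(PIPELINE_SPEC)
# In Airflow, each edge would become `task_map[up] >> task_map[down]`
# inside a module that Airflow's scheduler imports to discover the DAG.
```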

 

 

Extensive experience working with various data formats including JSON, Parquet, Excel, and CSV, ensuring data compatibility and integrity throughout the processing lifecycle.
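As a hedged, stdlib-only illustration of moving records between two of these formats (JSON in, CSV out; Parquet and Excel handling would need extra libraries such as pyarrow or openpyxl):

```python
import csv
import io
import json

# Sample records; field names are illustrative.
records = json.loads('[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]')

# Write the parsed records out as CSV with an explicit header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name"])
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()
```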

 

Took a proactive approach to monitoring, managing, and scaling Spark clusters on Databricks, resulting in optimized resource utilization and enhanced overall system performance. Used Spark 3.0 to leverage Adaptive Query Execution (AQE) and other features.

 

Created several Spark UDFs.

 

 

Created Scala notebooks for features such as cleansing, filtering, profiling, replace, and join. Created data pipelines using Airflow and stored all the metadata in MongoDB.
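An illustrative stdlib sketch (not the actual Scala notebooks) of the kinds of row-level features listed — cleansing, filtering, and joining records; the field names and lookup table are invented for the example:

```python
rows = [
    {"id": 1, "name": "  Alice "},
    {"id": 2, "name": None},
    {"id": 3, "name": "Bob"},
]
depts = {1: "data", 3: "platform"}  # hypothetical lookup table for the join

# Cleanse: trim whitespace. Filter: drop rows with missing names.
cleaned = [
    {**r, "name": r["name"].strip()}
    for r in rows
    if r["name"] is not None
]

# Join: attach a department looked up by id (an inner join against depts).
joined = [
    {**r, "dept": depts[r["id"]]}
    for r in cleaned
    if r["id"] in depts
]
```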

 

Worked with file formats such as Parquet and with Delta tables in Databricks. Scheduled the workflows users created from the ETL tool using Airflow.

 

Implemented and maintained the Airflow deployment using Docker as part of the development process, covering the setup, management, and continuous operation of the Airflow platform within the Docker environment.

 

Created Azure containers and used Blob Storage for storage (managed via Microsoft Azure Storage Explorer).

 

 

Completed POCs on Azure Managed Airflow (via Azure Data Factory) and Amazon MWAA.

 

 

Big Data Developer - Client: Telefonica                    06/2019 - 05/2022

 

Developed Spark jobs.

 

 

Used Parquet and other data formats to store data in HDFS.

 

 

Developed Spark applications using Scala.

 

 

Converted SQL queries to Spark transformations.

 

 

Implemented partitions in Hive and monitored Apache NiFi workflows.

 

 

Snowflake Developer - Client: Juniper Networks

 

Developed complex views, external tables, stored procedures, file formats, tasks, streams, etc. Wrote SnowSQL code of low, medium, and high complexity with a focus on efficiency and performance.

 

Created and used file formats, Time Travel, and different types of tables and views. Performed performance and process tuning on existing SnowSQL jobs.

 

Developed Matillion jobs to trigger SnowSQL statements and scheduled those jobs.

 

 

IDQ Developer

 

Restructured and recreated mappings for better data cleansing and fine-tuning.

 

 

Used Postman to read data from the API and for API testing.

 

 

Implemented DNB integration in mappings and transformations.

 

 

Worked on a procedure to automate batch loading (as we had a limit on API calls).
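A minimal sketch of the batching idea behind such a procedure: records are grouped into fixed-size chunks so each API call stays under a per-call quota. The batch size and the record source are illustrative assumptions, not the actual limits.

```python
def batched(records, batch_size):
    """Yield successive lists of at most batch_size records each."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Ten records under a hypothetical 4-records-per-call limit → 3 API calls
# instead of ten individual ones.
calls = [len(b) for b in batched(range(10), 4)]
```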

 

 

 

 

 

 

EDUCATION

2015 - 2019

GEETHANJALI COLLEGE OF ENGINEERING AND TECHNOLOGY

B.TECH

72%

CERTIFICATES

August 2021

AZURE DATA FUNDAMENTALS DP-900

 

 

TECHNICAL STACK AND TOOLS

Programming

Tools

 

 