
Aditya (RID : 14f6gl5c9rm0x)

Designation : AWS Data Engineer

Location : Hyderabad, India

Experience : 8 Years

Rate : $25 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
AWS, Python, Big Data, Hadoop, ETL, Spark, Hive
Description

●     Certified Data Engineer, having cleared the AWS Certified Data Analytics – Specialty certification.

●     Strong proficiency in cutting-edge technologies, including:

-      AWS (Redshift, Lambda, EC2, S3, SageMaker, Glue, DynamoDB, API Gateway, DMS, RDS, SNS, SQS, IAM roles, etc.)

-      Python (JSON, Boto3, Moto, pytest, Flask, pandas, NumPy, OpenCV, SciPy, asyncio, etc.)

-      Big Data (Hadoop, Hive, Oozie, ETL, Spark, Sqoop, Impala, Pig, Flume, Kafka, Ranger, Kerberos, LDAP, Sentry, etc.)

●     Experience working with leading clients across multiple domains:

○ Morgan Stanley [Banking Domain]

○ Asurion [Insurance Domain]

○ Eli Lilly [Healthcare Domain]

○ John Deere [Manufacturing Domain]

○ Gracenote (Nielsen) [Media Domain]

 

●     Currently working as a Module Lead.

●     Working as a Senior Associate, Technology at Deltacubes Technology Pvt Ltd since November 2017.

●     Worked as a Systems Engineer at TCS from July 2014 to October 2017.

●     Consistent top performer on the Morgan Stanley account at TCS, holding an 'A' band throughout, the highest possible performance rating.

●     Experience in developing cloud applications on AWS (S3, EMR, Lambda, DynamoDB, Redshift, SNS, SQS, and Kinesis) with Python; a minimal boto3 sketch follows this list.

●     Experience in creating dashboards in TIBCO Spotfire and Kibana.

●     Experience in developing data lake applications using Hadoop, Sqoop, Hive, Spark, Spark Streaming, Impala, YARN, and Flume.

●     Experience in developing applications using UNIX and shell scripting.

●     Experience working with multiple schedulers, including ActiveBatch, CloudWatch, cron, AutoSys, TWS, and Oozie.

●     Implemented LDAP and Secure LDAP authentication on Hadoop, Hive, Presto, and Starburst Presto.

●     Experience working with different authentication mechanisms such as LDAP, batch, and Kerberos.

●     Good understanding of Software Development Life Cycle phases such as requirement gathering, analysis, design, development, and unit testing.

●     Strong ideation and conceptualization skills; a sought-after contributor for many POCs.

●     Developed multiple utilities used throughout the account, such as auto-cleanup, Workbench, ILM, logging integration, Hadoop job automation, mainframe interaction, and procedure automation, using Python and Unix shell scripting.

●     Self-motivated team motivator with proficient communication skills and a positive, ready-to-learn attitude.

●     Goal-oriented and autonomous when required, with a quick learning curve and a strong appreciation for technology.
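
A minimal, illustrative boto3 sketch of the S3-to-Lambda-to-DynamoDB/SNS pattern referenced above; the table name, topic ARN, and event fields are hypothetical placeholders rather than details from any client project:

import json

import boto3

# Hypothetical resource names, for illustration only.
TABLE_NAME = "file-ingest-audit"
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:ingest-notifications"

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

def lambda_handler(event, context):
    """Record S3 object-created events in DynamoDB and notify subscribers via SNS."""
    table = dynamodb.Table(TABLE_NAME)
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        size = record["s3"]["object"].get("size", 0)

        # Audit entry keyed by bucket/key.
        table.put_item(Item={"pk": f"{bucket}/{key}", "size_bytes": size})

        # Downstream notification for consumers of the topic.
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="New object ingested",
            Message=json.dumps({"bucket": bucket, "key": key, "size": size}),
        )
    return {"statusCode": 200, "body": "processed"}

In practice such a handler would be wired to an S3 event notification and run under an IAM role scoped to the specific table and topic.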


Technology Stack

●     AWS: Lambda, Redshift, Glue, Kinesis, EMR, DMS, S3, Glacier storage, DynamoDB (with TTL), S3 Lifecycle policies, SQS, SNS, SageMaker, API Gateway, RDS, Elasticsearch/Kibana, QuickSight, Athena, Cognito, etc.

●     Python: Python 3, Boto3, pandas, asyncio, OpenCV, etc.

●     Big Data & Others: Hive, ActiveBatch, Unix, Informatica, shell scripting, Sqoop, Presto, Impala, Sentry, Ranger, Java, Teradata, Spark, Oozie, Pig, Flume, AutoSys, TWS, DB2, mainframe, Greenplum, MySQL, TIBCO Spotfire, LDAP, Kerberos, CA certificates, SSL, etc.

Professional Achievements:

●     AWS Certified Data Analytics - Specialty

●     Received the Innovator Award for driving an innovative solution for a Solution Accelerator.

●     Received the Star Team Award for contributions to the Eli Lilly project.

●     Received the 'On the Spot' award twice for outstanding performance and single-handed achievements on the Asurion account at Synechron Technologies.

●     Received an 'A' band consistently for two years in the annual appraisal process for performance on the Morgan Stanley account at TCS.

●     Received multiple client appreciations for on-time delivery on projects.

●     Winner of the IIT Bombay zonal round and secured 5th position in the final round of the Grid Master Robot competition.

●     Participated in the IIT Kharagpur Line Follower Robot competition.

●     Secured 2nd position in Line Follower at a city-level technical fest.

Education:

Bachelor of Engineering - EC

Professional Experience:

Senior Python Developer — Media Domain [Gracenote] — Oct 2021 to Present

Worked in the media domain, integrating media data from different sources and formats onto a single platform so that richer analysis could be performed on top of it.

Accomplishments include:

●     Designed the integration architecture and data flow.

●     Worked with ETL tools and data formats such as Glue, Athena, PostgreSQL, XML, and JSON; a minimal Athena query sketch follows this list.
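
As an illustrative sketch only: one common way to query such integrated data is through Athena from Python with boto3. The database name, table, columns, and S3 output location below are hypothetical placeholders, not details of the actual Gracenote pipeline:

import time

import boto3

athena = boto3.client("athena")

# Hypothetical names, for illustration only.
DATABASE = "media_catalog"
OUTPUT_LOCATION = "s3://example-athena-results/queries/"

def run_athena_query(sql: str) -> list:
    """Start an Athena query, poll until it finishes, and return the result rows."""
    query_id = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )["QueryExecutionId"]

    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    return athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]

# Example: count titles per source system in the integrated dataset.
rows = run_athena_query(
    "SELECT source_system, COUNT(*) AS titles "
    "FROM media_titles GROUP BY source_system"
)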

Senior Python Developer — John Deere — March 2021 to Oct 2021

A migration project to move an Informatica-based tool to AWS Lambda with Python and pandas. We created Lambda functions (Python) that replace the Informatica SQL mappings with equivalent pandas code; a minimal sketch of this conversion follows the accomplishments list below.

Accomplishments include:

●     Designed the approach for the problem statement and migration scenario.

●     Identified the tools to be used, such as AWS Lambda, pandas, SQL, JDBC, and Attunity.
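
A minimal sketch of the conversion idea, assuming a CSV export landing in S3 and a simple SQL-style aggregation; the bucket, keys, and column names are hypothetical placeholders rather than the actual John Deere mappings:

import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Hypothetical locations and columns, for illustration only.
SOURCE_BUCKET = "example-source-bucket"
SOURCE_KEY = "exports/orders.csv"
TARGET_KEY = "curated/orders_by_dealer.csv"

def lambda_handler(event, context):
    """Reproduce an Informatica-style aggregation (SUM of amount per dealer) in pandas."""
    obj = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Equivalent of: SELECT dealer_id, SUM(amount) AS total_amount
    #                FROM orders GROUP BY dealer_id
    summary = (
        df.groupby("dealer_id", as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "total_amount"})
    )

    out = io.StringIO()
    summary.to_csv(out, index=False)
    s3.put_object(Bucket=SOURCE_BUCKET, Key=TARGET_KEY, Body=out.getvalue())
    return {"rows_written": len(summary)}

Packaging pandas for Lambda typically means shipping it through a Lambda layer or a container image, since it exceeds what is comfortable for an inline deployment package.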


 


 