
Anshuman (RID : g8qolgzb339t)

Designation : Scala Developer

Location : Pune, India

Experience : 7 Years

Rate : $14 / Hour

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
Redshift, Bash, Oracle, Python
Description

CURRICULUM VITAE

PRIMARY TECHNICAL SKILLS:

Spark, Scala, Core Java, Hive, AWS, Impala, Python, Apache Phoenix, HBase, Hadoop, LINUX

SECONDARY TECHNICAL SKILLS:

Bash, R, Servlet, MySQL, Jupyter, GIT, JIRA, Maven, SBT, Anaconda, Ambari, Graphite, Grafana

BIG DATA PLATFORMS WORKED ON:

AWS, Cloudera-HortonWorks (HDP), Azure, GCP

WORK EXPERIENCE: 6 Years 2 Months (as of September 22, 2022)

PROJECTS:

Currently working as a Senior Product Developer

Client (Project)

Media EDH

Technologies

EMR, Oracle, Redshift, EC2, Autosys, S3, Bash, AWS CLI

Description

Process data and create final reports for client

Duration

June 2022 – Ongoing

Work Details

Maintain and update the report generation for client use. Data lands on S3; Oracle and Redshift hold data from different teams and for different use cases. The data is loaded into a transient EMR cluster, which processes it for report generation.
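A minimal sketch of launching such a transient EMR cluster with boto3, assuming a client is passed in (the cluster name, release label, and instance types below are illustrative, not the project's actual configuration):

```python
def launch_transient_emr(client, steps, log_uri):
    """Launch a transient EMR cluster that terminates after its steps finish.

    `client` is a boto3 EMR client (or a stub with the same interface).
    """
    return client.run_job_flow(
        Name="media-edh-report",  # hypothetical name
        ReleaseLabel="emr-6.2.0",
        LogUri=log_uri,
        Instances={
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
            ],
            # Transient cluster: shut down automatically once all steps finish
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        Steps=steps,
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )["JobFlowId"]
```

Passing the client in makes the function easy to exercise against a stub in tests while using a real `boto3.client("emr")` in production.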

Client (Project)

Internal (S3-SQS-Lambda-Glue) based automated triggers

Technologies

EC2, S3, SQS, AWS Glue, Bash, Python, Spark-Scala on AWS Glue

Description

Automate the trigger mechanism based on the file upload on S3

Duration

February 2021 – Ongoing

Work Details

Created an application: as soon as a file was placed on S3, it would trigger the appropriate action based on the file pattern (execution of scripts, loading the data file into Glue and other databases, etc.), minimizing user intervention.
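The pattern-to-action routing at the core of such a trigger can be sketched as a small lookup, as a Lambda handler might perform it (the key patterns and action names below are illustrative, not the project's actual configuration):

```python
import re

# Hypothetical routing table: S3 key pattern -> action to trigger
ROUTES = [
    (re.compile(r"^incoming/sales/.*\.csv$"), "load_to_glue"),
    (re.compile(r"^incoming/scripts/.*\.sh$"), "execute_script"),
    (re.compile(r"^incoming/ref/.*\.json$"), "load_to_reference_db"),
]

def route_s3_key(key: str) -> str:
    """Return the action for an uploaded S3 key; unknown keys are ignored."""
    for pattern, action in ROUTES:
        if pattern.match(key):
            return action
    return "ignore"
```

In the real pipeline the S3 event would arrive via SQS, and the chosen action would be dispatched to Glue or a script runner rather than returned as a string.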

 

Client (Project)

Internal (Oracle to AWS Migration)

Technologies

Athena, Python, S3, Oracle

Description

Migrate existing Analytics application from Oracle to AWS

Duration

July 2021 – June 2022

Work Details

To offload Oracle and improve processing, we needed to move the Oracle procedures to AWS. My task was to design the framework and write Python code on EC2 to trigger jobs on Athena and generate the final report.
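A minimal sketch of triggering an Athena job and waiting for it to finish, assuming the boto3 Athena client (or a stub with the same interface) is passed in; database and output names are placeholders, not the project's actual values:

```python
import time

def run_athena_query(client, sql, database, output_s3, poll_seconds=5):
    """Submit a query to Athena and poll until it reaches a terminal state."""
    qid = client.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]
    while True:
        state = client.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return qid, state
        time.sleep(poll_seconds)  # avoid hammering the Athena API while the query runs
```

A report-generation driver would chain several such calls and fetch results from the S3 output location once the final query succeeds.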

ORGANIZATION: IBM INDIA PVT LTD:

Client (Project)

Barclays (DIT – Spark Upgrade)

Technologies

Shell Script, AWS EMR, Spark, Scala

Description

Due to an infrastructure upgrade, Spark needed to be upgraded to 2.4.

Duration

February 2021 – July 2021

Work Details

Upgraded the Scala code from Spark 1.6 to Spark 2.4 for data load and ingestion across 496 tables; also updated the framework from Spark 1.6 to Spark 2.4.


Client (Project)

Barclays (Historical Remediation)

Technologies

Shell Script, Cloudera, Spark, Scala, Protegrity (In-house tokenization tool)

Description

The client targeted remediating the tokenization in the historical data.

Duration

January 2021 – February 2021

Work Details

Worked on creating a tool to extract data from a table, tokenize it, and load the tokenized data back into the table.
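The extract-tokenize-reload flow can be sketched as below. In the real project the tokenization was done by Protegrity; here a deterministic HMAC-based token is a stand-in, and the column names are illustrative:

```python
import hashlib
import hmac

SECRET = b"demo-key"  # stand-in; Protegrity handled tokenization in the real project

def tokenize(value: str) -> str:
    """Deterministic stand-in token: the same input always yields the same token."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

def remediate(rows, sensitive_cols):
    """Extract rows, tokenize sensitive columns, return rows ready to reload."""
    out = []
    for row in rows:
        out.append({col: (tokenize(val) if col in sensitive_cols else val)
                    for col, val in row.items()})
    return out
```

Determinism matters here: historical remediation must map a given clear value to the same token every time, so joins against already-tokenized data still line up.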


Client (Project)

Barclays (Informatica to Ab Initio Migration)

Technologies

Shell Script, HDFS, AWS EMR, Spark, Scala

Description

The client targeted migrating the data source from Informatica to Ab Initio.

Duration

March 2020 – December 2020

Work Details

Worked on creating a comparison tool for data files between the Informatica and Ab Initio extracts, identifying even minute changes such as whitespace differences between the files. Also built and deployed the Spark artifacts to load the data from the new sources (473 files).
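The core line-by-line comparison, distinguishing whitespace-only drift from real content changes, can be sketched as (a simplification of such a tool, not its actual implementation):

```python
def compare_extracts(lines_a, lines_b):
    """Compare two extracts line by line, flagging even whitespace-only changes.

    Returns a list of (line_number, kind) tuples, where kind is
    "whitespace" (tokens match, spacing differs) or "content".
    """
    diffs = []
    for i, (a, b) in enumerate(zip(lines_a, lines_b), start=1):
        if a != b:
            kind = "whitespace" if a.split() == b.split() else "content"
            diffs.append((i, kind))
    if len(lines_a) != len(lines_b):
        # One extract has extra trailing lines
        diffs.append((max(len(lines_a), len(lines_b)), "line-count"))
    return diffs
```

At 473 files, a driver would run this per file pair and aggregate the results into a sign-off report for the migration.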

ORGANIZATION: ZS ASSOCIATES:

Client (Project)

ZS Client Based Project (MDM + ETL + Warehousing + BI)

Technologies

Amazo

 