
Aizaz (RID : sbosl1lxglvz)

Designation : Big Data Engineer

Location : Bangalore, India

Experience : 5 Years

Rate : $15 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
Python, Hadoop, Hive, Spark, SQL, AWS
Description

5+ Years | Aizaz Ahamed | Big Data Engineer

Summary: Big Data Engineer with 5 years of industry experience and strong knowledge of big data, Python, and the Spark framework. Certified as an AWS Solutions Architect Associate and an AWS Developer Associate. Has worked for three leading telecommunications clients and has expertise in leading a team.

Technical Skills:

Languages: Python, PySpark, JavaScript, JSON

Framework: Selenium

IDEs: Eclipse, PyCharm, IntelliJ

Big Data Technologies: Hadoop (Hortonworks), Spark, Hive, Sqoop

Operating Systems: Windows, Linux

Database: MySQL

Cloud Platform: AWS

Project Experience

Project 1

Commonwealth Bank of Australia (CBA)

Description:

The Commonwealth Bank of Australia (CBA) is an Australian multinational bank with businesses across New Zealand, Asia, the United States, and the United Kingdom. It provides a variety of financial services, including retail, business, and institutional banking, funds management, superannuation, insurance, investment, and broking services.

Responsibilities:

Building data transformation solutions on Spark and Hive based on client requirements. Maintaining and managing Hadoop production clusters.
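
By way of illustration, a minimal PySpark sketch of this kind of Spark-on-Hive transformation, assuming a hypothetical source table raw.transactions and target table curated.daily_account_totals (not the client's actual job):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive-enabled session; table and column names below are hypothetical.
spark = (
    SparkSession.builder
    .appName("daily-transaction-rollup")
    .enableHiveSupport()
    .getOrCreate()
)

# Aggregate one day's transactions and write back as a partitioned Hive table.
daily_totals = (
    spark.table("raw.transactions")
    .where(F.col("txn_date") == "2021-06-01")   # placeholder partition filter
    .groupBy("account_id", "txn_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .saveAsTable("curated.daily_account_totals")
)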

Ingesting data from disparate data sources using a combination of SQL, S3, and Lambda functions to create data views in AWS RDS, ensuring zero data loss during ingestion.
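
A minimal sketch of such an ingestion path, assuming an S3-triggered Lambda that loads a CSV object into a hypothetical MySQL staging table on RDS; the bucket, table, and credential names are placeholders, and returning the loaded row count is one simple way to reconcile against the source for zero data loss:

import csv
import io
import os

import boto3
import pymysql  # assumes the PyMySQL package is bundled with the Lambda


def handler(event, context):
    # Locate the object that triggered this invocation (S3 PUT event).
    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.reader(io.StringIO(body)))

    # Hypothetical configuration; real code would pull credentials from
    # AWS Secrets Manager rather than environment variables.
    conn = pymysql.connect(
        host=os.environ["RDS_HOST"],
        user=os.environ["RDS_USER"],
        password=os.environ["RDS_PASSWORD"],
        database="ingest",
    )
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO staging_events (event_id, payload) VALUES (%s, %s)",
                rows,
            )
        conn.commit()
        # Row count lets the caller reconcile against the source file.
        return {"loaded_rows": len(rows)}
    finally:
        conn.close()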

Tech Stack: Spark, Hive, Sqoop, Hadoop, Python, SQL, AWS

Team Size: 6

Project Duration: 16 Months (Ongoing)

Project 2

Charter Communications

Description:

Charter Communications is an American telecommunications and mass media company whose services are branded as Spectrum. With over 26 million customers in 41 states, it is the second-largest cable operator in the United States by subscribers. Over an advanced communications network, the company offers a full range of state-of-the-art residential and business services, including Spectrum Internet, TV, Mobile, and Voice. The company also distributes award-winning news coverage, sports, and high-quality original programming to its customers through Spectrum Networks and Spectrum Originals.

Responsibilities:

Building data transformation solutions on Spark and Hive based on client requirements. Maintaining and managing Hadoop production clusters.

Developed an automation framework in Python to handle Hadoop Method of Procedure (MOP) steps when alerts are triggered, reducing manual intervention to 20%.
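
A minimal sketch of the alert-to-MOP dispatch idea behind such a framework; the alert types, hostnames, and remediation routines shown are hypothetical:

import logging
import subprocess

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mop-runner")


def restart_datanode(alert: dict) -> None:
    """Hypothetical MOP: restart a DataNode flagged by the alert."""
    host = alert["host"]
    log.info("Restarting DataNode on %s", host)
    subprocess.run(
        ["ssh", host, "sudo systemctl restart hadoop-hdfs-datanode"],
        check=True,
    )


def clear_tmp_space(alert: dict) -> None:
    """Hypothetical MOP: reclaim disk on a node reporting disk pressure."""
    host = alert["host"]
    log.info("Clearing temp space on %s", host)
    subprocess.run(["ssh", host, "rm -rf /tmp/hadoop-yarn/*"], check=True)


# Registry of known alerts; unknown alerts escalate to a human operator.
MOPS = {
    "DATANODE_DOWN": restart_datanode,
    "DISK_USAGE_HIGH": clear_tmp_space,
}


def handle_alert(alert: dict) -> None:
    mop = MOPS.get(alert["type"])
    if mop is None:
        log.warning("No MOP for alert %s, escalating to operator", alert["type"])
        return
    mop(alert)


if __name__ == "__main__":
    handle_alert({"type": "DATANODE_DOWN", "host": "dn01.example.com"})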

Managed deployment automation using GitLab and Terraform to automate system operations. Handled work from the initial stage of development, creating branches and automating the build and deploy process.

Tech Stack: Kafka, NiFi, Spark, Hive, Sqoop, Hadoop, Python, SQL, AWS

Team Size: 10

Project Duration: 15 Months

Project 3

Swisscom

Description:

Swisscom is a major telecommunications provider in Switzerland, offering the highest bandwidths in the country alongside its leading mobile network and the largest fiber-optic network.

Responsibilities:

Worked as a TIBCO administrator for the Test Infrastructure Team (TIF), maintaining TIBCO applications and resolving other infrastructure issues during release shifts. Also automated administrative tasks.

Worked on migrating various applications to DevOps-based monitoring platforms such as Splunk, Prometheus, Grafana, and other monitoring tools. Coded SQL queries, explored the required data, and built reporting deliverables.
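
For illustration, a minimal sketch of exposing an application metric for Prometheus to scrape (and Grafana to chart) using the prometheus_client library; the metric name and polled value are placeholders, not the actual instrumentation used:

import random
import time

from prometheus_client import Gauge, start_http_server

# Hypothetical metric; a real exporter would probe the application instead.
queue_depth = Gauge("app_queue_depth", "Pending items in the work queue")

if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes http://host:8000/metrics
    while True:
        queue_depth.set(random.randint(0, 50))  # stand-in for a real probe
        time.sleep(15)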

Tech Stack: Spark, Hive, Sqoop, Hadoop, Python, SQL

Team Size: 7

Project Duration: 19 Months

 