
Ankit (RID : qzyclk2eda6p)

Designation: Data Engineer

Location: Bangalore, India

Experience: 5 Years

Rate: $11 / Hour

Availability: 2 Weeks

Work From: Offsite

Category: Information Technology & Services

Shortlisted: 1
Total Views: 88
Key Skills
Data Engineer, Python, Azure DevOps, ETL, SQL, Spark
Description

PROFILE SUMMARY

  • A competent professional with 5+ years of experience, currently associated with Ernst & Young, Bengaluru as Data Analyst in Technology
  • Excels at utilizing data from diverse information systems; analyses and summarizes large data sets to extract information that supports strategic decision-making and planning
  • Evaluated data findings and communicated them in a clear, structured manner, developing cordial relations with clients and stakeholders
  • Skilled in identifying key metrics and uncovering hidden patterns and trends in data sets to optimize performance
  • Mined and analysed data from multiple sources to drive optimization and improvement, delivering data-driven solutions to business challenges
  • Proficient in data cleansing, data matching and classification, data visualization, and data enrichment
  • Bridged the gap between clients and technical teams to ensure value-driven development of high-quality solutions
  • Works closely with clients to analyse system requirements and technology needs, creating business requirements and functional specifications and communicating them to development teams
  • Participates in sprint planning, backlog grooming, and sprint retrospectives to ensure timely delivery of high-quality software solutions
  • Synthesizes insights from data and consults with clients to support better business decision-making, executing data-driven projects with comprehensive statistical and analytical models

CORE COMPETENCIES

Data Analytics

Data Mining & Cleansing

Data Visualization

Requirement Gathering & Analysis

Agile / Scrum Methodologies

Predictive Modelling & Visualization

Business Intelligence

Project Execution

Team Coordination

Quantitative / Statistical Analysis

Cross-functional Coordination

Reports & Dashboard Development

Data Modelling

Data Warehousing

Client / Business Stakeholder Engagement

CAREER TIMELINE

Since Nov’21: Ernst & Young, Bengaluru as Data Analyst in Technology

Sep’19 – Oct’21: College Dekho, Gurugram as Senior Executive in Finance

Apr’18 – Sep’19: CBRE Group, Gurugram as Finance & Purchase Executive

CERTIFICATIONS

    • Microsoft Certified: Azure AI Fundamentals (AI-900)
    • Microsoft Certified: Azure Data Fundamentals (DP-900)
    • Certification in Advanced Excel from Ducat in 2018
    • NIIT Institute of Finance, Banking & Insurance in 2018

IT SKILLS

Programming: Python

ETL Tools: PowerCenter, SQL, Neo4j, SSMS

Visualization: Power BI

Cloud Computing: Azure DevOps, Azure Data Factory

WORK EXPERIENCE

Since Nov’21 with Ernst & Young, Bengaluru as Data Analyst in Technology

Key Result Areas:

  • Working as a member of the UA team, supervising the collection and integration of data from systems such as Axon, EDC, and others into Neo4j
  • Utilizing Python, Pandas, and Cypher to design and construct the appropriate data model for Neo4j (a minimal sketch of this flow appears after this list)
  • Creating Cypher queries to retrieve data from Neo4j storage and present it in a dashboard
  • Formulating mappings and workflows in Informatica PowerCenter to extract data from a SQL database or a flat file and load it (defining the data source, creating a mapping that specifies how the data is transformed, and developing a workflow to schedule and execute the ETL process)
  • Creating ETL pipelines to process large datasets, performing data cleaning, transformation, and validation
  • Importing client's file into Jupyter Notebook and perf
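
The sketch below illustrates the Pandas-to-Neo4j flow referenced above: cleaning a client file with Pandas, loading it into Neo4j, and reading it back with Cypher. It is a minimal assumed example, not the engagement's actual code; the file name, column names, the :Asset label, and the connection details are all placeholders. It relies on the pandas package and the official neo4j Python driver.

# Minimal sketch, assuming a hypothetical "client_assets.csv" with
# asset_id and source_system columns; connection details are placeholders.
import pandas as pd
from neo4j import GraphDatabase

# Data cleansing and validation with Pandas (illustrative rules)
df = pd.read_csv("client_assets.csv")                 # hypothetical client file
df = df.dropna(subset=["asset_id", "source_system"])  # drop incomplete rows
df = df.drop_duplicates(subset="asset_id")            # de-duplicate on the key

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "password"))

with driver.session() as session:
    # Load each cleaned row into Neo4j as an :Asset node
    for row in df.itertuples(index=False):
        session.run(
            "MERGE (a:Asset {id: $id}) SET a.source = $source",
            id=row.asset_id, source=row.source_system,
        )
    # Cypher query to retrieve the data back, e.g. as a dashboard feed
    counts = session.run(
        "MATCH (a:Asset) RETURN a.source AS source, count(a) AS assets"
    )
    print(counts.data())

driver.close()

Using MERGE rather than CREATE keeps the load idempotent, so re-running the ingest does not create duplicate nodes.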
 