
SURESH (RID : 15cyvln0krshf)

Designation : Big Data Engineer
Location : Bengaluru, India
Experience : 11 Years
Rate : $20 / Hourly
Availability : Immediate
Work From : Offsite
Category : Information Technology & Services

Key Skills
Big Data, Oracle, Python, Spark, Data Analytics, UNIX/Linux, SQL Server, MySQL, Snowflake, Tableau, Informatica
Description
  • A results-driven software professional with 11 years of progressive experience in Data Analysis, Design, Implementation, Administration, and support of Business Intelligence, OLTP (batch and online processing), OLAP, ETL, Data Warehousing, Data Mining, DBMS, and Data Modeling, primarily in the Banking and Financial Services and Telecom domains.
  • Experience in developing Dashboard Reports, Parameterized Reports, linked reports, and Sub reports by Region, Year, Quarter, Month, and Week.
  • Worked extensively with clients on requirements gathering, translating those requirements into design specifications, and preparing estimates for project deliverables.
  • Expert-level skills in Power BI, Snowflake, and the Java and Python programming languages for handling very large datasets with the Hadoop ecosystem (MapReduce/Spark).
  • Expert-level skills in Oracle PL/SQL programming and performance tuning for BI activities.
  • Experienced in all phases of the Software Development Life Cycle (analysis, design, development, testing, and maintenance) using Waterfall and Agile methodologies.
  • Facilitated project inception by building the product backlog, epics, and consumable user stories in JIRA, and identified key stakeholders to ensure engagement.
  • Groomed and refined user stories by writing acceptance criteria and estimating story points with the development teams for velocity tracking.
  • Worked with architecture and development teams to ensure that the proposed design met requirements and was in line with SDLC best practices and enterprise quality standards.
  • Participated in UAT and developed test cases for unit testing and functional testing based on user requirements.
  • Proven experience in handling customers and development teams and proposing solutions to customers.
  • Experience in writing complex SQL queries using joins, stored procedures, functions, triggers, packages, and cursors in Oracle.
  • Expertise in Oracle performance tuning, database design, and troubleshooting on the Oracle 11g platform.
  • Implemented middleware solutions through Tableau Business Works.
  • Developed Unix shell, Perl, and Python scripts and Power BI reports to automate repetitive tasks and generate reports (a hedged example of this kind of automation script follows this list).
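
As a hedged illustration of the join-heavy Oracle SQL and Python report automation mentioned above, the sketch below assumes a hypothetical banking-style schema (accounts, customers, transactions), hypothetical connection details, and the cx_Oracle driver; it is not code from any of the projects listed on this profile.

    # report_extract.py - illustrative sketch only; the schema, credentials and
    # query below are hypothetical placeholders, not taken from this profile.
    import csv
    import cx_Oracle  # Oracle driver assumed to be installed

    WEEKLY_QUERY = """
        SELECT a.account_id, c.customer_name, SUM(t.amount) AS total_amount
        FROM   accounts a
        JOIN   customers c ON c.customer_id = a.customer_id
        JOIN   transactions t ON t.account_id = a.account_id
        WHERE  t.txn_date >= TRUNC(SYSDATE) - 7
        GROUP  BY a.account_id, c.customer_name
        ORDER  BY total_amount DESC
    """

    def export_weekly_report(user, password, dsn, out_path="weekly_report.csv"):
        """Run the join query and dump the result set to a CSV report file."""
        with cx_Oracle.connect(user=user, password=password, dsn=dsn) as conn:
            cur = conn.cursor()
            cur.execute(WEEKLY_QUERY)
            with open(out_path, "w", newline="") as fh:
                writer = csv.writer(fh)
                writer.writerow([col[0] for col in cur.description])  # header row
                writer.writerows(cur)  # cursor iterates over result tuples
        return out_path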
Project Details
Title : Project Profile#1
Duration : 19 Months
Role and Responsibilities :
  • Created databases and schema objects including tables, indexes, and constraints; connected various applications to the database; and wrote functions, stored procedures, and triggers.
  • Generated periodic reports based on statistical analysis of data across various time frames and divisions using Power Pivot, Power Query, and Power View.
  • Designed and developed Power BI graphical and visualization solutions from business requirement documents, and planned the creation of interactive dashboards.
  • Generated ad-hoc reports using Excel Power Pivot and shared them with decision makers for strategic planning through Power BI.
  • Designed and developed a table-driven ETL process in which Perl and shell scripts called OWB mappings and Informatica/Oracle code to load the EDW (a minimal driver sketch follows this list).
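
A minimal sketch of what such a table-driven driver could look like in Python is shown below; the ETL_CONTROL table layout, the command stored per job, and the connection details are all assumptions for illustration, not details from this project.

    # etl_driver.py - hedged sketch of a table-driven ETL driver; the ETL_CONTROL
    # table layout and connection details are illustrative assumptions.
    import subprocess
    import cx_Oracle

    def pending_jobs(conn):
        """Read a (hypothetical) ETL_CONTROL table listing the commands to run."""
        cur = conn.cursor()
        cur.execute("""
            SELECT job_name, run_command
            FROM   etl_control
            WHERE  status = 'PENDING'
            ORDER  BY run_order
        """)
        return cur.fetchall()

    def mark_done(conn, job_name, rc):
        """Record the job outcome back in the control table."""
        cur = conn.cursor()
        cur.execute(
            "UPDATE etl_control SET status = :st WHERE job_name = :jn",
            st=("DONE" if rc == 0 else "FAILED"), jn=job_name)
        conn.commit()

    def main(user, password, dsn):
        with cx_Oracle.connect(user=user, password=password, dsn=dsn) as conn:
            for job_name, run_command in pending_jobs(conn):
                # run_command would hold the OWB / Informatica / SQL*Plus invocation
                rc = subprocess.run(run_command, shell=True).returncode
                mark_done(conn, job_name, rc)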
Description :

eCommerce, Power BI, Snowflake, AWS, Python, Airflow, MicroStrategy, SQL and PL/SQL, Data Analysis

 


Title : Project Profile#2
Duration : 24 Months
Role and Responsibilities :
  • Created databases and schema objects including tables, indexes, and constraints; connected various applications to the database; and wrote functions, stored procedures, and triggers.
  • Generated periodic reports based on statistical analysis of data across various time frames and divisions using Power Pivot, Power Query, and Power View.
  • Designed and developed Power BI graphical and visualization solutions from business requirement documents, and planned the creation of interactive dashboards.
  • Generated ad-hoc reports using Excel Power Pivot and shared them with decision makers for strategic planning through Power BI.
  • Designed and developed a table-driven ETL process in which Perl and shell scripts called OWB mappings and Informatica/Oracle code to load the EDW.
  • Used reporting tools such as Tableau, connected to Hive, to generate daily data reports.
  • Collaborated with the infrastructure, network, database, application, and BI teams to ensure data quality and availability.
  • Analyzed the SQL scripts and designed the solution for implementation in PySpark.
  • Responsible for developing a data pipeline on Amazon AWS to extract data from weblogs and store it in HDFS (see the sketch after this list).
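
A hedged PySpark sketch of such a weblog pipeline is shown below; the S3 source path, the Apache-style log pattern, and the HDFS target are assumptions for illustration, not details taken from this project.

    # weblog_ingest.py - illustrative PySpark sketch; the paths and log format
    # below are assumptions, not details from the project description.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import regexp_extract

    spark = SparkSession.builder.appName("weblog-ingest").getOrCreate()

    # Read raw Apache-style access logs landed in S3 (bucket name is hypothetical).
    raw = spark.read.text("s3a://example-bucket/weblogs/*.log")

    LOG_PATTERN = r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3})'

    parsed = raw.select(
        regexp_extract("value", LOG_PATTERN, 1).alias("client_ip"),
        regexp_extract("value", LOG_PATTERN, 2).alias("timestamp_raw"),
        regexp_extract("value", LOG_PATTERN, 3).alias("method"),
        regexp_extract("value", LOG_PATTERN, 4).alias("url"),
        regexp_extract("value", LOG_PATTERN, 5).alias("status"),
    )

    # Persist parsed records to HDFS as Parquet for downstream Hive/Tableau reports.
    parsed.write.mode("overwrite").parquet("hdfs:///data/weblogs/parsed")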
Description :

Reference Data - KYC, Account Off-Boarding, Client Data Remediation & Adoption

Data Analyst


Title : Project Profile#3: XYZ
Duration : 20 Months
Role and Responsibilities :
  • Involved in the development of end-to-end applications, from data transfer through to the front-end application.
  • Gathered requirements, validated and reviewed the data model, designed (technical design) ETL interfaces, reviewed mapping designs by team members, and led the team in coding and testing.
  • Responsible for the project and for conducting meetings with all stakeholders to gather requirements and create the BRD (Business Requirements Document).
  • Daily responsibilities included preparing business requirements and functional specification documents and handing them over to the developers for development.
  • Used various Informatica transformations to load data into the core tables.
  • As a Big Data engineer, worked with business and engineering teams to define information needs and develop solutions that support the desired business and technical capabilities and requirements.
  • Extensively worked on creating logical and physical data models using SAP PowerDesigner.
  • Experienced in working with large data sets and distributed computing; extensively used Apache Spark (Scala) and Hive.
  • Experienced in extracting data from multiple structured and semi-structured feeds (XML) by building and maintaining scalable ETL pipelines on distributed systems (a hedged XML-flattening sketch follows this list).
  • Participated in developing applications using Informatica PowerCenter, SQL, Python, and UNIX shell scripting, and built AutoSys jobs to schedule the ETL workflows.
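
As a hedged illustration of flattening a semi-structured XML feed into a staging extract, the Python sketch below assumes a hypothetical <record> layout and field names; it is not the actual feed or pipeline from this project.

    # xml_feed_extract.py - hedged sketch; the <record> element and its fields
    # are hypothetical, not taken from the project's actual feeds.
    import csv
    import xml.etree.ElementTree as ET

    FIELDS = ["id", "name", "amount"]  # assumed field names

    def xml_to_rows(xml_path):
        """Yield one flat dict per <record> element in the feed."""
        tree = ET.parse(xml_path)
        for rec in tree.getroot().iter("record"):
            yield {field: rec.findtext(field) for field in FIELDS}

    def write_staging_extract(xml_path, csv_path):
        """Write the flattened feed to a CSV staging file for the ETL load."""
        with open(csv_path, "w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=FIELDS)
            writer.writeheader()
            writer.writerows(xml_to_rows(xml_path))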

 

Description :
  • Development of end-to-end applications, from data transfer through to the front-end application.
  • Requirements gathering, validating and reviewing the data model, designing (technical design) ETL interfaces, reviewing mapping designs by team members, and leading the team in coding and testing.

Title : Project Profile#4: DMS Central
Duration : 24 Months
Role and Responsibilities :
  • Worked with the Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Extensively used transformations such as Router, Joiner, Expression, Filter, Lookup, Update Strategy, Transaction Control, Stored Procedure, and Sequence Generator/Oracle sequences.
  • Developed Informatica mappings for Type 1 and Type 2 Slowly Changing Dimensions (a hedged Type 2 sketch follows this list).
  • Extensively involved in performance tuning of the mappings and improved database performance by building partitioned tables and using bitmap indexes.
  • Experienced in resolving issues within defined timelines, ensuring the service level agreement is met.
  • Analyzed UNIX scripts and session and workflow logs to resolve issues with failed jobs.
  • Took the necessary actions to keep the data marts refreshed so that clients would not face difficulties running their reports.
  • Experienced in handling different categories of issues, including incident management, problem management, and change requests.
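
The pandas sketch below illustrates the Type 2 slowly-changing-dimension logic described above in a hedged way; the column names, business key, and tracked attributes are assumptions, and the project itself implemented this with Informatica mappings rather than Python.

    # scd_type2.py - hedged pandas illustration of Type 2 SCD handling; column
    # names and the business key are assumed, not taken from the project.
    import pandas as pd

    def apply_scd_type2(dim, incoming, key="customer_id",
                        tracked=("address",), load_date="2024-01-01"):
        """Expire changed current rows and append new versions (Type 2)."""
        current = dim[dim["is_current"]]
        merged = current.merge(incoming, on=key, suffixes=("", "_new"))
        changed = (merged[list(tracked)].values !=
                   merged[[c + "_new" for c in tracked]].values).any(axis=1)
        changed_keys = merged.loc[changed, key]

        # 1. Close out the old versions of the changed records.
        expire = dim[key].isin(changed_keys) & dim["is_current"]
        dim.loc[expire, ["is_current", "end_date"]] = [False, load_date]

        # 2. Append the new versions as current rows.
        new_rows = incoming[incoming[key].isin(changed_keys)].copy()
        new_rows["start_date"] = load_date
        new_rows["end_date"] = None
        new_rows["is_current"] = True
        return pd.concat([dim, new_rows], ignore_index=True)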

 

Description :

Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.


 