
Saikiran

Tech Lead
Location: Hyderabad
Contact Details
Phone: {{contact.cdata.phone}}
Email: {{contact.cdata.email}}
Candidate Information
  • Experience: 10 Years
  • Hourly Rate: $15
  • Availability: Immediate
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: July 24, 2024
Key Skills
Informatica, Power BI, SQL, MySQL, PL/SQL, MS SQL Server, SQL Server Integration Services, SQL Server Reporting Services.
Summary

Professional Experience


Employer 1: ItCrats | Role: Technical Lead | Client: Globe Life | Duration: February 2021 – NA


Key Responsibilities:


Involved in full life cycle development, including design, ETL strategy, report troubleshooting, and identification of facts and dimensions.

Reviewed requirements with the business, performing regular follow-ups and obtaining sign-offs.

Worked on IICS Data Integration, Application Integration, and other IICS components.

Data Integration: Created multiple task flows to meet business requirements, using mapping tasks, business services, and Mass Ingestion tasks to push files to multiple downstream applications.

Called parameterized Python scripts to handle exceptions.
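
For illustration only, a minimal sketch of what such a parameterized exception-handling script can look like; the argument names and error-file convention are assumptions, not details from the actual engagement:

import argparse
import sys

def main():
    # Parameters would be passed in by the calling IICS task flow (names are illustrative).
    parser = argparse.ArgumentParser(description="Post-load exception handler")
    parser.add_argument("--job-name", required=True)
    parser.add_argument("--error-file", required=True)
    args = parser.parse_args()

    try:
        with open(args.error_file, encoding="utf-8") as fh:
            errors = fh.read().strip()
    except FileNotFoundError:
        # A missing error file is itself treated as a failure the task flow can branch on.
        print(f"{args.job_name}: error file {args.error_file} not found")
        sys.exit(2)

    if errors:
        # Surface rejected-row details to the scheduler/task flow via a non-zero exit code.
        print(f"{args.job_name}: errors detected:\n{errors}")
        sys.exit(1)
    print(f"{args.job_name}: completed without errors")

if __name__ == "__main__":
    main()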

Application Integration: Extracted data from relational tables and called REST APIs to capture responses and store them in DB tables.
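
A rough Python stand-in for this pattern (the actual work was done with IICS Application Integration); the endpoint, table names, and local SQLite database are placeholders:

import sqlite3
import requests

# Read pending rows, call a REST endpoint per row, and persist each response to a table.
conn = sqlite3.connect("claims.db")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE IF NOT EXISTS pending_members (member_id TEXT);
CREATE TABLE IF NOT EXISTS api_responses (member_id TEXT, status TEXT, body TEXT);
""")

for (member_id,) in cur.execute("SELECT member_id FROM pending_members").fetchall():
    resp = requests.get(f"https://api.example.com/members/{member_id}", timeout=30)
    cur.execute(
        "INSERT INTO api_responses (member_id, status, body) VALUES (?, ?, ?)",
        (member_id, str(resp.status_code), resp.text),
    )

conn.commit()
conn.close()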

Automated communications for all the data loads.

Built 6 different types of interfaces end to end using Python/IICS (Data and Application Integration).

Captured data from XML files and loaded it into relational tables.

Called REST APIs to capture responses, using Business Services in IICS and REST API app connections in Application Integration.

Used Mass Ingestion tasks to deliver bulk files to their respective target locations.

Created XML files and uploaded them to an Amazon S3 bucket.
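
An illustrative sketch of this build-and-upload step using Python's xml.etree and boto3; the bucket name, key, and record layout are placeholders:

import xml.etree.ElementTree as ET
import boto3

# Build a small XML document from sample extract rows (layout is invented for illustration).
root = ET.Element("policies")
for policy_id, status in [("P1001", "ACTIVE"), ("P1002", "LAPSED")]:
    rec = ET.SubElement(root, "policy", id=policy_id)
    ET.SubElement(rec, "status").text = status

ET.ElementTree(root).write("policies.xml", encoding="utf-8", xml_declaration=True)

# Upload the generated file to S3 (bucket and key are placeholders).
s3 = boto3.client("s3")
s3.upload_file("policies.xml", "my-target-bucket", "outbound/policies.xml")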

Translated complex business requirements into ETL processes.


Environment: Informatica Cloud (IICS), MySQL, AWS S3, Python, Unix shell scripting, Control-M Scheduler, Power BI Reporting


Employer 2: Optum Global Solutions | Role: Sr. Data Engineer | Client: Optum | Duration: February 2016 – February 2021


Key Responsibilities:

Gathered requirements from business users, onsite coordinators, and senior architects; analyzed the requirements and performed impact analysis to understand existing system functionality, then proposed optimal technical solutions by creating DB models and ETL models for storing data from the eHP and RALLY systems into ECODS tables for reporting use.

Actively worked on the eHP and ECODS decommission projects, identifying alternative solutions to source the data from other systems and changing the existing data models without impacting end-user report deliverables.

Designed, developed, tested, and implemented ETL processes using Informatica Cloud.

Converted specifications to programs and data mappings in an Informatica Cloud ETL environment.

Built 70+ Informatica PowerCenter workflows and IICS task flows to handle SCD Type 1 and Type 2 logic for fact and dimension loads in the DWH dimensional model, and generated flat-file extracts for data lakes and other downstream consumers using various transformations.
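
To illustrate the SCD Type 2 logic mentioned above (expire the current dimension row, then insert a new current version), here is a minimal Python/SQLite sketch; the dim_member table and its columns are invented, and the real implementation lived in Informatica workflows:

import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_member (
    member_key INTEGER PRIMARY KEY AUTOINCREMENT,
    member_id  TEXT, plan_code TEXT,
    eff_start  TEXT, eff_end TEXT, is_current INTEGER
);
INSERT INTO dim_member (member_id, plan_code, eff_start, eff_end, is_current)
VALUES ('M100', 'GOLD', '2020-01-01', '9999-12-31', 1);
""")

# Incoming change: member M100 moved from plan GOLD to SILVER.
member_id, new_plan, load_date = "M100", "SILVER", date.today().isoformat()

# Type 2: close out the current row only if the tracked attribute changed...
cur.execute(
    "UPDATE dim_member SET eff_end = ?, is_current = 0 "
    "WHERE member_id = ? AND is_current = 1 AND plan_code <> ?",
    (load_date, member_id, new_plan),
)
# ...then insert a new current version of the row.
if cur.rowcount:
    cur.execute(
        "INSERT INTO dim_member (member_id, plan_code, eff_start, eff_end, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (member_id, new_plan, load_date),
    )
conn.commit()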

Created 10+ IDQ mappings to profile the data and standardize policy number variations, addresses, and similar fields using Parser, Labeler, and Address Validator transformations.

Designed ETL_CONTROL tables and mappings to handle incremental loads and statistics collection for Informatica workflows.

As part of the data migration from the on-premise data store to Amazon Aurora databases, created scripts to run in the AWS CLI that first copy file data into S3 buckets and then load it into Aurora DB tables.
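
A hedged sketch of that two-step flow: stage the file in S3 via the AWS CLI, then bulk-load it with Aurora MySQL's LOAD DATA FROM S3 (which assumes the cluster has an S3 access role attached). The host, bucket, table, and credentials below are placeholders:

import subprocess
import pymysql  # assumes an Aurora MySQL-compatible endpoint

# Step 1: copy the extract file to S3 using the AWS CLI.
subprocess.run(
    ["aws", "s3", "cp", "/data/extracts/members.csv", "s3://migration-bucket/members.csv"],
    check=True,
)

# Step 2: bulk-load the staged file into an Aurora table.
conn = pymysql.connect(host="aurora-cluster.example.com", user="etl_user",
                       password="change-me", database="ecods")
with conn.cursor() as cur:
    cur.execute(
        "LOAD DATA FROM S3 's3://migration-bucket/members.csv' "
        "INTO TABLE members FIELDS TERMINATED BY ',' IGNORE 1 LINES"
    )
conn.commit()
conn.close()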

Scheduled meetings with senior architects and proposed suitable, optimized solutions to reduce load-process execution time, presenting the plan in a clear, convincing manner with the help of data flow diagrams and proofs of concept (POCs).

Created Python scripts to automate processes such as converting files from xlsx to csv and connecting to a web server (Ironbox) for file uploads and downloads.
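
A minimal example of the xlsx-to-csv conversion step using pandas (reading .xlsx also requires openpyxl); the file names are illustrative:

import pandas as pd

# Convert an inbound Excel workbook to CSV so downstream ETL can consume it.
df = pd.read_excel("inbound/member_roster.xlsx", sheet_name=0)
df.to_csv("inbound/member_roster.csv", index=False)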

Provided leadership to a team of 8 developers supporting and enhancing the ETL application through fast-growing changes in eHP/RALLY to ECODS; this DB is the second-largest data warehouse at Optum, at approximately 75 TB.

Created a data lineage document for the entire eHP/RALLY to ECODS data mapping flow, covering 180+ tables; it helps downstream systems that use these tables know exactly where their information comes from.

Implemented end-to-end automated solutions for the data load from the RALLY transactional DB into Optum ECODS, reducing manual processing by 80% and resulting in 30% more billable hours.

Converted a 2,700-line PL/SQL package for the UPR process into an ETL process, reducing a 250-hour monthly execution process to 2 hours; this also reduced the complexity of troubleshooting execution failures and data issues.

Parameterized the Informatica ETLs to reduce build time, thus managing ETLs in production more effectively.

Completed an Apache Airflow POC to schedule the project's UNIX cronjobs; installed Airflow on Linux RHEL servers and set up the environment to schedule the jobs using Python scripting.
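
A minimal Airflow DAG of the kind such a POC might use in place of a cron entry, assuming an Airflow 2.x-style deployment; the DAG id, schedule, and script path are placeholders:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# One DAG per former cron entry; this one stands in for a 2 AM daily job.
with DAG(
    dag_id="daily_ecods_load",
    start_date=datetime(2021, 1, 1),
    schedule_interval="0 2 * * *",
    catchup=False,
) as dag:
    run_load = BashOperator(
        task_id="run_daily_load",
        bash_command="/opt/etl/scripts/daily_load.sh ",  # trailing space keeps Jinja from templating the .sh path
    )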

Guided and mentored other developers on the team to resolve technical issues and fully understand the requirements; performed peer reviews, kept an eye on all project activities and deliverables, and provided instant assistance when anything went wrong in the daily production batch cycle or other production loads.

