
Siva

Sr Software Engineer
Location: Hyderabad
Contact Details
Phone: {{contact.cdata.phone}}
Email: {{contact.cdata.email}}
Candidate Information
  • Experience: 13.9 years
  • Hourly Rate: $12
  • Availability: Immediate
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: July 21, 2024
Key Skills
• Python, PySpark, Snowflake, Unix
Summary

Project Details


Project 1(a): Infosario Registry Platform (IRP)                             

Client: American College of Surgeons (ACS), American Heart Association (AHA) and Muscular Dystrophy Association (MDA)

Role: Sr. Software Engineer

 

Vision:

Improve patient care through technology by capturing and enabling data.

Description:

IQVIA, formerly Quintiles and IMS Health, Inc., is an American multinational company serving the combined industries of health information technology and clinical research, and a leading healthcare data science company. The purpose of IRP is to define the solution architecture for a system that may be a product, an application, a group of related applications, or an infrastructure solution. A patient registry is an organized system that uses observational study methods to collect uniform data to evaluate specified outcomes for a population defined by a disease, condition, or exposure, and that serves a predetermined scientific, clinical, or policy purpose. ACS manages several quality-improvement systems (SSR, NSQIP, and TRAUMA) that collect patient-level data on procedures and outcomes and provide feedback to participants. IRP delivers a product to ACS by integrating the data from SSR, NSQIP, and TRAUMA into the ODS.

Responsibilities:

·        Participated in all project phases, including requirement analysis, design, coding, testing, and documentation.

·        Designed the technical specification document based on the functional design document.

·        Developed jobs to extract data from various XML files.

·        Created a joblet to capture execution details of all Talend jobs for audit purposes.

·        Created Talend ESB jobs to enqueue and dequeue data from JMS queues.

·        Built the end-to-end ETL process: Talend jobs read XML data from the queue and load it into the ODS tables.

·        Deployed multiple instances of the same job to maximize the processing rate.

·        Deployed ESB routes and DI jobs from Nexus and scheduled them in TAC.

·        Prepared ETL module specification documents.

·        Performed unit testing of the developed components.

·        Performed performance tuning of the Talend jobs.

·        Carried out impact analysis for any new changes being pushed to production.

·        Involved in production deployments from the initial release through continued releases.

·        Handled knowledge transfer (KT) and training on the end-to-end process.

·        Documented the job process and methodology to facilitate future development.

·        Performed end-to-end investigation and troubleshooting, and prepared application support documents.

·        Used HP Quality Center as the defect-tracking tool initially, later switching to Jira for both work allocation and defect tracking.
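
The XML-extraction step above can be sketched in plain Python. This is a minimal, hypothetical stand-in for the Talend jobs: the feed layout, registry name, and field names below are assumptions for illustration, not the actual ACS payload format.

```python
import xml.etree.ElementTree as ET

# Hypothetical registry feed; the real SSR/NSQIP/TRAUMA payloads differ.
SAMPLE_XML = """
<cases registry="NSQIP">
  <case id="C-100"><procedure>APPY</procedure><outcome>discharged</outcome></case>
  <case id="C-101"><procedure>CHOLE</procedure><outcome>readmitted</outcome></case>
</cases>
"""

def extract_case_rows(xml_text):
    """Flatten one registry XML payload into ODS-style row dicts."""
    root = ET.fromstring(xml_text)
    registry = root.get("registry")
    rows = []
    for case in root.findall("case"):
        rows.append({
            "registry": registry,
            "case_id": case.get("id"),
            "procedure": case.findtext("procedure"),
            "outcome": case.findtext("outcome"),
        })
    return rows

rows = extract_case_rows(SAMPLE_XML)
```

In the actual pipeline this flattening was done inside Talend DI jobs, with the resulting rows written to the ODS tables.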

 

Technologies Used:

·        Talend Data Fabric 8.0.1, ActiveMQ, Oracle 12c and 19c, Linux, SVN, GitLab, Amazon S3, Amazon Athena.


Project 1(b): Infosario Registry Platform (IRP)                             

Client: American Heart Association (AHA)

Registry: The Society of Thoracic Surgeons (STS)

Role: Sr. Software Engineer


Responsibilities:

·        Worked with Python and the Spark (PySpark) framework to process large structured datasets into a data mart.

·        Created PySpark ETL batch jobs to process bulk data from source S3 buckets and database tables.

·        Worked on micro-batch real-time data streaming using Apache Kafka.

·        Created pipelines for data processing and analytics products on AWS, using services such as EMR, EC2, Lambda, S3, RDS, and MSK (Kafka). Worked on Python web service APIs and used Airflow to manage jobs.
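
The micro-batch pattern described above can be sketched in plain Python. The real jobs consumed from Kafka and aggregated with PySpark; here a simple list stands in for the message stream, and the batch size and event shape are illustrative assumptions.

```python
from itertools import islice

def micro_batches(records, batch_size):
    """Yield fixed-size batches from a record stream (Kafka-style micro-batching)."""
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Illustrative (key, amount) events standing in for Kafka messages.
events = [("a", 1), ("b", 2), ("a", 3), ("c", 4), ("b", 5)]

totals = {}
batch_count = 0
for batch in micro_batches(events, 2):
    batch_count += 1
    for key, amount in batch:  # per-batch aggregation into the "mart"
        totals[key] = totals.get(key, 0) + amount
```

Processing in small fixed-size batches bounds memory per cycle while keeping end-to-end latency low, which is the same trade-off Spark's micro-batch streaming makes.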

 

Technologies Used:

·        Python, PySpark, Snowflake, EMR, EC2, Kafka, Airflow, Unix, Git.


Project 2: Price Master                     

Client: Market Share, USA

Role: ETL Developer


Description:

Market Share offers industry-leading cross-media analytics solutions for global marketers. Called out as a leader in the industry by Forrester Research and a "Cool Vendor" by Gartner, Market Share has enabled more than half of the Fortune 50 companies to dramatically improve their marketing effectiveness. Price Master is an analytics tool for dynamic pricing recommendations: it predicts pricing for events such as sports and arts & theatre. Our ETL team processes data received as FTP files from Ticketmaster (www.ticketmaster.com) and loads it into the Price Master database.
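
The feed-to-database load described above can be sketched as follows. This is a hypothetical illustration: the column names and the SQLite target are assumptions, not the actual Ticketmaster file layout or the Price Master schema.

```python
import csv
import io
import sqlite3

# Hypothetical feed layout; the real Ticketmaster FTP files differ.
FEED = """event_id,event_name,price
E1,Lakers vs Celtics,120.50
E2,Hamilton,210.00
"""

# In-memory SQLite stands in for the Price Master database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE price_master (event_id TEXT PRIMARY KEY, event_name TEXT, price REAL)"
)

# Parse the delimited feed and load each record with a parameterized insert.
for row in csv.DictReader(io.StringIO(FEED)):
    conn.execute(
        "INSERT INTO price_master VALUES (?, ?, ?)",
        (row["event_id"], row["event_name"], float(row["price"])),
    )
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM price_master").fetchone()[0]
```

The production pipeline followed the same parse-validate-load shape, with the FTP drop as the source and the Price Master tables as the target.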

