
Ugandar

Data Engineer
Location: Bengaluru
Candidate Information
  • Experience: 6 Years
  • Hourly Rate: $8
  • Availability: Immediate
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: July 23, 2024
Key Skills
SQL, ADF, ADLS, Snowflake, SSIS
Summary

WORK EXPERIENCE

• Created ADF pipelines to extract data from various sources such as flat files and Oracle, then transform and load it into Snowflake (ADW).

• Developed Databricks notebooks using PySpark and Spark SQL to extract, transform, and aggregate data from complex formats and generate simple CSV files for further processing.
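
A minimal PySpark sketch of the kind of notebook logic this bullet describes; the paths, schema, and aggregation are illustrative assumptions, not the actual project code:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: source path, nested schema and output location are assumptions.
spark = SparkSession.builder.appName("orders_extract").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")              # complex, nested source data
flat = raw.select(
    F.col("order_id"),
    F.col("customer.id").alias("customer_id"),         # flatten a nested struct
    F.explode("items").alias("item"),                  # one row per line item
).select("order_id", "customer_id", F.col("item.amount").alias("amount"))

flat.createOrReplaceTempView("order_items")
totals = spark.sql("""
    SELECT customer_id, SUM(amount) AS total_amount
    FROM order_items
    GROUP BY customer_id
""")

# Write a simple CSV extract for downstream processing.
totals.coalesce(1).write.mode("overwrite").option("header", True).csv("/mnt/curated/orders_csv/")
```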

• Leveraged dbutils to troubleshoot and test Databricks notebooks.
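
For context, these are the sort of dbutils calls typically used for that kind of troubleshooting; dbutils is only available inside a Databricks notebook, and the paths and parameters here are placeholders:

```python
# Runs only inside a Databricks notebook, where `dbutils` is injected automatically.
dbutils.fs.ls("/mnt/raw/orders/")                       # inspect files landed in a mount
dbutils.widgets.text("run_date", "2024-07-01")          # expose a test parameter
print(dbutils.widgets.get("run_date"))
dbutils.notebook.run("./transform_orders", 600, {"run_date": "2024-07-01"})  # re-run a child notebook with test arguments
```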

• Developed ADF pipelines with audit tables and error-handling/tracking tables to give analysts and business units greater visibility.

• Configured email notifications for ADF pipeline success/failure using Azure Logic Apps.
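
In ADF this is typically wired up as a Web activity posting to the Logic App's HTTP trigger, which then sends the e-mail. A rough Python equivalent of that call, with an assumed trigger URL and payload schema:

```python
import requests

# Assumptions: LOGIC_APP_URL is the HTTP-trigger URL of a Logic App that sends the
# notification e-mail, and the payload fields match the schema that trigger expects.
LOGIC_APP_URL = "https://prod-00.westus.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?..."

payload = {
    "pipelineName": "pl_load_orders",      # placeholder pipeline name
    "status": "Failed",
    "errorMessage": "Copy activity timed out",
}
response = requests.post(LOGIC_APP_URL, json=payload, timeout=30)
response.raise_for_status()
```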

• Wrote cost-efficient SnowSQL code to work with CSV files via external tables, and created views in Snowflake.
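
A sketch of that pattern using the Snowflake Python connector; the connection details, stage name, and CSV column mapping are all assumptions:

```python
import snowflake.connector

# Placeholder credentials and object names.
conn = snowflake.connector.connect(account="my_account", user="etl_user", password="***",
                                   warehouse="ETL_WH", database="ANALYTICS", schema="STAGING")
cur = conn.cursor()

# External table over CSV files already sitting in an external stage (@csv_stage),
# so they can be queried without loading them into Snowflake storage.
cur.execute("""
    CREATE OR REPLACE EXTERNAL TABLE ext_orders (
        order_id VARCHAR AS (value:c1::VARCHAR),
        amount   NUMBER  AS (value:c2::NUMBER)
    )
    WITH LOCATION = @csv_stage/orders/
    AUTO_REFRESH = FALSE
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Thin view so downstream queries never reference the external table directly.
cur.execute("CREATE OR REPLACE VIEW v_orders AS SELECT order_id, amount FROM ext_orders")
cur.close()
conn.close()
```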

• Created an Azure integration runtime and migrated on-premises SSIS packages to an Azure SQL Managed Instance.

• Made minor changes to SSIS packages wherever a source or destination pointed to the local file system so that they work with Azure Blob Storage.

• Performed all pre- and post-migration checklist tasks.

• Supported the migrated applications for a defined period until production sign-off was obtained.

• Created and maintained documentation for pipelines and other processes. Leveraged Azure Logic Apps in Azure Data Factory to trigger email notifications when any pipeline fails.

SENIOR DATA ENGINEER, Comscore, Pune (NOV 2020 - PRESENT)
DATA ENGINEER, Dell (Brillio Technologies Payroll), Bangalore (FEB 2020 - OCT 2020)
ANALYTICS CONSULTANT, Brillio Technologies, Bangalore (JUL 2018 - JAN 2020)
SENIOR SYSTEMS ENGINEER, LogicMatter India Private Limited, Hyderabad (MAR 2016 - JUL 2018)

• Developed a data warehouse and data marts for analyzing students' performance.

• Implemented parallel loading using SSIS packages.

• Developed packages to extract data from different sources such as relational tables, TXT, and Excel files.

• Refactored complex ETL into simple, easily maintainable ETL processes.

• Created, scheduled, and ran jobs, and troubleshot failed jobs.

• Fixed issues during UAT and assisted production support when needed.

• Completed a POC to replicate existing Tableau reports in Power BI.

• Extracted data from Adobe Analytics as CSV files, stored them in HDFS, used Unix commands to organize the files per campaign, and used Hive scripts to clean the data, apply business logic, and aggregate it.

• Developed an Oozie workflow to orchestrate the execution of the Hive scripts.

• After data QA, moved the aggregated data to SQL Server using SSIS for reporting purposes.

• Compared data between the Adobe Analytics workspace and the aggregated Hive data, identified the cause of any mismatches, and provided solutions.

• Held weekly and monthly meetings with the Marketing and Sales teams to get updates on existing or new marketing campaigns and incorporated the corresponding logic changes into the Hive scripts.
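
The production orchestration described above was an Oozie workflow (an XML definition), but the equivalent manual steps look roughly like this; the campaign name, HDFS layout, extract file, and Hive script are placeholders:

```python
import subprocess

# Placeholders throughout: campaign name, HDFS paths, local extract file, Hive script.
campaign = "summer_sale"

# Organize the Adobe Analytics CSV extract in HDFS by campaign.
subprocess.run(["hdfs", "dfs", "-mkdir", "-p", f"/data/aa/{campaign}/raw"], check=True)
subprocess.run(["hdfs", "dfs", "-put", "-f", "aa_extract.csv", f"/data/aa/{campaign}/raw/"], check=True)

# Run a parameterized Hive script that cleans, applies business logic, and aggregates.
subprocess.run(["hive", "--hivevar", f"campaign={campaign}", "-f", "aggregate_campaign.hql"], check=True)
```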
