
Prasanna (RID : qe71l13qxf9m)

Designation : Azure Technical Engineer

Location : New Delhi, India

Experience : 11 Years

Rate : $24 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
Unix and Windows, BODS Admin, BODS Designer, Information Steward, Azure Data Factory, Azure Databricks, Azure Data Lake Analytics, SAP BODS, Basic Scala, Python, Teradata, Oracle, MS SQL Server, Azure DW


✔ 11+ years of experience in the development and implementation of Data Warehousing solutions.

✔ Experienced in Azure Data Factory, including preparing CI/CD scripts and Azure DevOps pipelines for deployment.

✔ 3 years of experience on the Microsoft Azure cloud platform.

✔ Customer-facing responsibilities: requirements gathering, functional specifications, technical design, reviews with customers, project delivery, test planning, deployments, project stabilization, etc.

✔ Good exposure to Azure PaaS components such as Azure Data Factory, Azure SQL DW, Azure Data Lake Storage, Log Analytics, and Snowflake DB.

✔ Experienced in performance tuning, code/design/architecture reviews, and automating housekeeping and repetitive jobs.

✔ Have played roles including Development Lead, Project Lead, Tech Lead, and individual contributor.

✔ A self-starter with a positive attitude, a willingness to learn new concepts and technologies, and an appetite for challenges.

✔ Hands-on experience with Azure analytics services – Azure Data Lake Store (ADLS), Azure Data Lake Analytics (ADLA), Azure SQL DW, Azure Data Factory (ADF), Azure Databricks (ADB), etc.

 ✔      Microsoft Certified Professional in Perform

 ✔      Excellent knowledge of ADF building components – Integration Runtime, Linked Services, Data Sets, Pipelines, Activities.

✔ Extensively used ETL methodology to extract, transform, and process data from SAP HANA and non-SAP sources (Oracle, SQL Server, Teradata, SFTP, Amazon S3) into Azure Data Lake Storage and Blob Storage using Azure Data Factory.

✔ Designed and developed data ingestion pipelines from on-premises sources into the different layers of ADLS using Azure Data Factory (ADF V2).

✔ Good knowledge of PolyBase external tables in SQL DW.

✔ Designed and developed data transformations using U-SQL in Azure Data Lake Analytics.

✔ Experience building Azure Databricks (ADB) notebooks in Spark Scala, Python, and PySpark to perform data transformations.

✔ Good exposure to DataFrames and Spark SQL.

✔ Designed and developed an audit/error logging framework in ADF.

✔ Orchestrated data integration pipelines in ADF using activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, Until, etc.

✔ Implemented a dynamic pipeline that extracts multiple files into multiple targets through a single parameterized pipeline.

✔ Automated execution of ADF pipelines using triggers.

✔ Retrofitted the framework to take advantage of newly introduced ADF activities.

✔ Knowledge of basic ADF admin activities, such as granting access to ADLS using a service principal, installing Integration Runtimes, and creating services such as ADLS and Logic Apps.

✔ POC using AzCopy to copy files to Azure Storage (Blob, File Share).

✔ POC using AdlCopy to copy from Blob Storage to ADLS and from ADLS to ADLS.

✔ Experience in production support.

✔ Designed Azure Logic Apps to send pipeline success/failure alert emails, file unavailability notifications, etc.

✔ 6 years of experience in SAP BODS development and administration, including the Information Steward data quality tool.

✔ Installation experience with Business Objects Data Services 4.x.

 ✔      Involved in BODS Upgrade activities.

✔ Worked on the ETL tool SAP Business Objects Data Services 4.x/3.x and on all Data Services transforms (Data Integrator, Platform, and Data Quality transforms).

✔ Managed BODS administration tasks: BODS installation, creating new repositories, scheduling batch jobs, configuring datastores, adding repositories to the Designer, setting up Job Servers, and defining flat file and Excel/XML file formats.

✔ Used BODS built-in functions, custom functions, conditional workflows, and the BODS scripting language.

✔ Exposure to the BODS Management Console.

 ✔      Extensively worked on BODS pushdown optimization.

 ✔      Involved in production support activities.
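The dynamic single-pipeline pattern mentioned above (one parameterized pipeline fanning multiple source files out to multiple targets) can be sketched in plain Python. The mapping table, file names, and copy function below are hypothetical stand-ins for ADF's Lookup, ForEach, and parameterized Copy activities:

```python
# Sketch of a config-driven "one pipeline, many files" copy pattern,
# mimicking ADF's Lookup -> ForEach -> parameterized Copy flow.
# All names and paths here are illustrative, not real ADF API calls.

# Lookup-activity equivalent: a control table mapping source files to targets.
COPY_CONFIG = [
    {"source": "sales_2023.csv",  "target": "adls://raw/sales/"},
    {"source": "orders_2023.csv", "target": "adls://raw/orders/"},
    {"source": "hr_2023.csv",     "target": "adls://raw/hr/"},
]

def copy_file(source: str, target: str) -> str:
    """Stand-in for a parameterized Copy activity: returns the landed path."""
    return target + source

def run_pipeline(config: list) -> list:
    """ForEach equivalent: iterate the lookup rows and invoke the copy."""
    return [copy_file(row["source"], row["target"]) for row in config]

landed = run_pipeline(COPY_CONFIG)
for path in landed:
    print(path)
```

Adding a new file then only requires a new row in the control table, not a new pipeline, which is the point of the single-pipeline design.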
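The audit/error logging framework listed above can likewise be sketched in Python. In ADF this would typically be a Stored Procedure or Copy activity writing a status row to an audit table after each activity; the table schema and function names below are assumptions for illustration:

```python
# Minimal sketch of an audit/error-logging pattern for pipeline runs:
# every activity appends one status row (success or failure) to an audit log.
from datetime import datetime, timezone

audit_log = []  # in ADF this would be a database audit table

def log_run(pipeline: str, activity: str, status: str, error: str = "") -> None:
    """Append one audit row; fields mirror a typical audit-table schema."""
    audit_log.append({
        "pipeline": pipeline,
        "activity": activity,
        "status": status,          # "Succeeded" or "Failed"
        "error": error,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    })

def run_activity(pipeline: str, activity: str, fn) -> bool:
    """Run one activity, recording its outcome in the audit log."""
    try:
        fn()
        log_run(pipeline, activity, "Succeeded")
        return True
    except Exception as exc:
        log_run(pipeline, activity, "Failed", str(exc))
        return False

run_activity("ingest_sales", "copy_to_raw", lambda: None)
run_activity("ingest_sales", "bad_step", lambda: 1 / 0)
```

An alerting step (such as the Logic Apps emails mentioned above) can then be driven off the "Failed" rows in the audit table.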

Copyright© Cosette Network Private Limited All Rights Reserved