
Komal (RID : a49xle45gszn)

Designation: Data Architect

Location: Noida

Experience: 8 Years

Rate: $19 / Hour

Availability: Immediate

Work From: Offsite

Category: Information Technology & Services

Key Skills
Azure Data Factory, Azure Databricks, Azure Storage, REST API
Description

Komal

Profile Summary

  • Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure Storage, and Azure Data Lake, as well as data warehousing and ETL (Informatica PowerCenter) techniques
  • Working knowledge of ETL tools such as Informatica PowerCenter, Teradata/Oracle databases, and job schedulers such as Control-M/AutoSys
  • Hands-on experience in handling major data issues by backtracking nested views and identifying corrupted or duplicate data
  • Hands-on experience in performance tuning and query optimization
  • Proficient in client requirement gathering, engagement, and interaction
  • Proficient in handling risk level 1 and 2 performance issues, coordinating with all teams involved and following process and SLA to resolve issues as quickly as possible
  • Performing regular system checks to maintain data sanity
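
The duplicate-data checks described above usually come down to grouping on a business key and flagging keys seen more than once. A minimal, self-contained sketch using SQLite (the table name and columns here are hypothetical, purely for illustration; the actual work was against Teradata/Oracle):

```python
import sqlite3

# In-memory table standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (emp_id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?)",
    [(1, "A"), (2, "B"), (2, "B"), (3, "C")],
)

# Duplicate check: group on the business key and keep keys seen more than once.
dupes = conn.execute("""
    SELECT emp_id, COUNT(*) AS cnt
    FROM employee
    GROUP BY emp_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [(2, 2)]
```

The same `GROUP BY ... HAVING COUNT(*) > 1` pattern works unchanged on Teradata or Oracle; for corrupted rows the filter is typically a predicate on the suspect column instead of a count.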

Current Employer: ****

Current Designation: Sr. Consultant/Azure Data Engineer

Professional Experience:

  • Worked as an Azure Developer on multiple Azure projects, developing, deploying, and maintaining pipelines end to end; owned the Oracle HR module end to end, including documentation
  • Working knowledge of enhancing existing ADF pipelines and handling ADF failures
  • On Azure Databricks, developed Type 1 and Type 2 logic for several sources per business requirements, maintaining historical data using SQL and the PySpark programming language
  • Working knowledge of development, deployment, and CI/CD tools
  • Hands-on experience in development from scratch, preparing and maintaining documents (STMs, DDD, DG documents), and understanding and implementing client requirements
  • Hands-on experience ingesting data from different sources such as REST APIs and Oracle using the respective Azure Data Factory connectors
  • Deployment and promotion of code from Dev to QA using Visual Studio and a GitHub repository
  • Working knowledge of unit testing the code after deployment
  • Working knowledge of data refresh activity from PROD to QA/DEV and data validation
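
The Type 2 logic mentioned above (slowly changing dimensions) boils down to closing the current version of a row when a tracked attribute changes and opening a new one. A minimal sketch of that logic in plain Python, assuming a hypothetical `city` tracked attribute and `start_date`/`end_date`/`is_current` versioning columns (the actual implementation was SQL/PySpark on Databricks):

```python
from datetime import date

def scd2_merge(dim_rows, incoming, today):
    """SCD Type 2 merge sketch: dim_rows is a list of dicts with a business
    key, tracked attributes, and versioning columns."""
    out = list(dim_rows)
    for rec in incoming:
        current = next(
            (r for r in out if r["key"] == rec["key"] and r["is_current"]), None
        )
        if current is None:
            # New key: open the first version.
            out.append({**rec, "start_date": today, "end_date": None,
                        "is_current": True})
        elif current["city"] != rec["city"]:
            # Tracked attribute changed: close the old version, open a new one.
            current["end_date"] = today
            current["is_current"] = False
            out.append({**rec, "start_date": today, "end_date": None,
                        "is_current": True})
    return out

dim = [{"key": 1, "city": "Noida", "start_date": date(2020, 1, 1),
        "end_date": None, "is_current": True}]
dim = scd2_merge(dim, [{"key": 1, "city": "Pune"}], date(2024, 6, 1))
print(len(dim), dim[-1]["city"])  # 2 Pune
```

Type 1 is the degenerate case: overwrite the attribute in place with no history row.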

Current Employer: ***********

Current Designation: Sr. Associates - Project

Professional Experience:

  • Hands-on experience handling PROD, TEST (QA), and DEV systems, supporting the Informatica PowerCenter (IPC) tool, IDR/TAS tools, AutoSys, ServiceNow (SNOW), IDQ, and the Teradata database
  • Proactively working on daily activities including system health checks, DBC/disk space checks, business-critical jobs, data sanity, RCA, and SLA tracking
  • Handling P1 incidents proficiently, involving and coordinating all the respective teams
  • Proficient in handling high-priority incidents and service tasks
  • Hands-on experience in handling major data issues by backtracking nested views and identifying corrupted or duplicate data
  • Hands-on experience in capturing monthly snapshots per business need and correcting the data when inconsistencies are found
  • Experience in identity and access management, networking, storage, and compute infrastructure
  • Involved in GCP migration activities as a GCP coordinator
  • Efficient in handling various types of Informatica job failures
  • Working knowledge of IDR/TAS configs
  • Hands-on experience with the AutoSys tool: monitoring jobs, resolving space-related issues, editing scripts, identifying and updating passwords in various folders, and updating a script's start time in the JIL file per user requirements
  • Working knowledge of deployment in PROD, QA, and DEV environments
  • Worked on enhancement activities, changing code per business requirements and implementing the changes in all environments
  • Proactively monitoring Informatica workflows and Teradata BTEQ jobs
  • Knowledge of restoring missing files
  • Actively involved in outages, coordinating with the team on pre- and post-outage activities
  • Attending business requirement meetings
  • Working on and coordinating change orders with teams on Production and Development environments per requirements
  • Generating reports per requirements
  • Troubleshooting system/performance issues under tight deadlines
  • Generating capacity council reports and presenting them to the customer
  • Actively involved in knowledge-sharing sessions for new resources joining the team
  • Assisting teammates in daily tasks
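
The JIL start-time update mentioned above is typically a small `update_job` fragment applied with the `jil` command-line tool. A sketch, with a hypothetical job name:

```
/* Hypothetical job; reschedules only the start time, leaving other
   attributes of the job definition unchanged. */
update_job: daily_hr_extract
start_times: "02:30"
```

Feeding this fragment to `jil` updates the existing job definition in place rather than recreating it.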

Previous Employer: ***********

Designation: Teradata DBA

Professional Experience:

  • Supported multiple international projects, which included Retail,