No. of Positions: 2
Location: Gurgaon
Tentative Start Date: May 14, 2023
Work From: Offsite
Rate: $6 - 8 (Hourly)
Experience: 8 to 12 Years
Key Responsibilities:
• Design, architect, deploy, and maintain solutions on MS Azure using different Cloud & Big Data technologies.
• Manage the full life-cycle of a Data Lake / Modern DWH solution, from requirement gathering and analysis through platform selection and architecture design to deployment.
• Be responsible for implementing solutions that can scale on the Cloud.
• Collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions.
• Explore and learn new technologies for creative business problem solving, and mentor a team of Data Engineers.
Required Experience, Skills & Competencies:
• Strong hands-on experience implementing Data Lakes with technologies such as Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics / SQL DWH, and Cosmos DB.
• Experience using big data technologies such as Hadoop, Spark, Kafka, Hive, NoSQL (HBase / MongoDB / Cassandra), Impala, Sqoop, etc.
• Strong programming experience in Python is a must.
• Experience supporting BI and Data Science teams in consuming data in a secure and governed manner.
• Very good understanding of, and experience using, CI/CD with Git and Jenkins / Azure DevOps.
• Experience setting up cloud-computing infrastructure solutions.
• Hands-on experience with, or exposure to, NoSQL databases and data modelling.
• 10+ years of technical experience, with at least 2 years on MS Azure and 2 years on the Hadoop (CDH/HDP) ecosystem.
• B.Tech/B.E. from a reputed institute preferred.