Senior Data Engineer

  • No. of positions: 2
  • REMOTE
  • Last Active Date: 09 Mar, 2024

Budget

₹122,000 - ₹153,000 (Monthly)

Experience

5 to 8 Years

Work From

Offsite

Job Duration

6-12 Months

Job Applied

6

Required Skills

Snowflake, Terraform, dbt, SnapLogic, Kafka, Data Engineer

Job Description

Job Description: Senior Data Engineer

Location: Permanent WFH/ Remote

Experience: 5+ years

Technology:

Mandatory Skills: Terraform, Snowflake, dbt, SnapLogic, Kafka, SQL

Primary Skills: CI/CD, production deployment, client management, and Salesforce

Key Roles and Responsibilities:

• Design, develop, and maintain data pipelines for collecting, transforming, and loading data into various data stores.

• Build and maintain data warehousing and data lake solutions

• Develop and deploy data models that support various business requirements

• Write efficient and scalable code in languages such as Python, Scala, or Java

• Lead the design of data solutions with quality, automation, and performance in mind

• Own the data pipelines feeding into the Data Platform, ensuring they are reliable and scalable

• Ensure data is available in a fit-for-purpose and timely manner for business and analytics consumption

• Work closely with the Data Product Manager to support alignment of requirements and sources of data from line-of-business systems and other endpoints

• Communicate complex solutions in a clear and understandable way to both experts and non-experts

• Interact with stakeholders and clients to understand their data requirements and provide solutions

Requirements:

• Extensive experience leading AWS and Snowflake projects

• Proven track record of delivering large-scale data and analytical solutions in a cloud environment

• Hands-on experience with end-to-end data pipeline implementation on AWS, including data preparation, extraction, transformation & loading, normalization, aggregation, warehousing, data lakes, and data governance

• Expertise in developing data warehouses

• In-depth understanding of modern data architecture such as Data Lake, Data Warehouse, Lakehouse, and Data Mesh

• Strong knowledge of data architecture and data modeling practices

• Cost-effective management of data pipelines

• Familiarity with CI/CD-driven data pipelines and infrastructure (e.g., Bitbucket)

• Agile delivery approach using Scrum and Kanban methodologies

• Supporting QA and user acceptance testing processes

• Self-driven and constantly seeking opportunities to improve data processes

• Technology: Python, Snowflake, AWS, Terraform, and Bitbucket

• Experience working in an Agile/Scrum development process.

• Experience with performance and/or security testing is a huge plus.

• Experience with load testing is a huge plus.

• Self-motivator with a desire to learn new skills and embrace new technologies in a constantly changing technology landscape

• Ability to thrive in a fast-moving environment

• Ability to show initiative, innovation and work independently when required

• Ability to work at pace and tackle project challenges in a collegiate, collaborative way

• Thoroughness and attention to detail

• Good communication skills (ability to present, inform, and guide others)
