
Senior Data Engineer

No. of Positions: 2

Location: Remote

Tentative Start Date: March 19, 2024

Work From: Offsite

Rate: $9 - $11 (Hourly)

Experience: 5 to 8 Years

Job Applicants: 5
Job Views: 122
Job Category: Information Technology & Services
Duration: 6-12 Months
Required Skills: Snowflake, Terraform, dbt, SnapLogic, Kafka, Data Engineering
Description

Job Description: Senior Data Engineer

Location: Permanent WFH / Remote

Experience: 5+ years

Technology:

Mandatory Skills: Terraform, Snowflake, dbt, SnapLogic, Kafka, SQL

Primary Skills: CI/CD, production deployment, client management, and Salesforce

Key Roles and Responsibilities:

• Design, develop, and maintain data pipelines for collecting, transforming, and loading data into various data stores.

• Build and maintain data warehousing and data lake solutions

• Develop and deploy data models that support various business requirements

• Write efficient and scalable code in languages such as Python, Scala, or Java

• Lead the design of data solutions with quality, automation, and performance in mind

• Own the data pipelines feeding into the Data Platform, ensuring they are reliable and scalable

• Ensure data is available in a fit-for-purpose and timely manner for business and analytics consumption

• Work closely with the Data Product Manager to align requirements and data sources from line-of-business systems and other endpoints

• Communicate complex solutions in a clear and understandable way to both experts and non-experts

• Interact with stakeholders and clients to understand their data requirements and provide solutions

Requirements:

• Extensive experience leading AWS and Snowflake projects

• Proven track record of delivering large-scale data and analytical solutions in a cloud environment

• Hands-on experience with end-to-end data pipeline implementation on AWS, including data preparation, extraction, transformation & loading, normalization, aggregation, warehousing, data lakes, and data governance

• Expertise in developing Data Warehouses

• In-depth understanding of modern data architecture such as Data Lake, Data Warehouse, Lakehouse, and Data Mesh

• Strong knowledge of data architecture and data modeling practices

• Cost-effective management of data pipelines

• Familiarity with CI/CD-driven data pipelines and infrastructure (e.g., Bitbucket)

• Agile delivery approach using Scrum and Kanban methodologies

• Supporting QA and user acceptance testing processes

• Self-driven and constantly seeking opportunities to improve data processes

• Technology: Python, Snowflake, AWS, Terraform, and Bitbucket

• Experience working in an Agile/Scrum development process.

• Experience with performance and/or security testing is a huge plus.

• Experience with load testing is a huge plus.

• Self-motivated, with a desire to learn new skills and embrace new technologies in a constantly changing technology landscape

• Ability to thrive in a fast-moving environment

• Ability to show initiative and innovation, and to work independently when required

• Ability to work at pace and tackle project challenges in a collegial, collaborative way

• Thoroughness and attention to detail

• Good communication skills (ability to present, inform, and guide others)
