No. of Positions: 1
Location: Remote
Tentative Start Date: May 13, 2024
Work From: Any Location
Rate: $6-15 (Hourly)
Experience: 4 to 5 Years

Data Engineer
Contract | Remote | 5+ years

Mandatory Technical Skills: ETL, AWS, Python, SQL
Job Description
As a Contract Data Engineer, you will be responsible for designing, implementing, and optimizing data systems to support our data lake/data mesh infrastructure and data processing needs. We are looking for an individual who possesses strong technical skills, problem-solving abilities, and a passion for data engineering, with a specific emphasis on AWS, AWS Glue, and data lake expertise.
Responsibilities:
* Design, implement, and optimize data systems to support our data mesh infrastructure.
* Utilize AWS Glue for data extraction, transformation, and loading (ETL) processes.
* Refactor and improve existing data systems for enhanced efficiency and performance.
* Collaborate closely with product development and production support teams to meet their data engineering requirements.
* Participate actively in peer code reviews to maintain code quality and consistency.
* Diagnose and resolve production issues, offering corrective measures and workarounds when necessary.
* Continuously assess key performance indicators (KPIs) and identify opportunities for improvements and optimizations within our data mesh.
Job Requirements:
* Strong understanding of AWS services, including AWS Glue, Athena, and Lambda functions.
* Minimum of 5 years of relevant experience in data engineering.
* Strong understanding of complex data architecture, especially for data lake environments.
* Proficiency in ETL concepts within modern data applications.
* Excellent SQL skills, with a focus on PostgreSQL and query optimization.
* Proficiency in Python programming.
* Familiarity with MongoDB is a plus.
* Familiarity with agile development practices.
* Demonstrated ability in analysis and design of data systems.
Desired Qualifications:
* Experience in building data systems with real-time data streams (e.g., Kafka, Kinesis).
* Bachelor of Science degree in a technical major (Computer Science, Engineering, Mathematics) or equivalent industry experience.
* Familiarity with other ETL tools (Pentaho, Informatica, SSIS, etc.).
* Knowledge or experience with microservices and messaging systems.
* Experience with supply chain management is a plus.