No of Positions: 2
Location: Thiruvananthapuram
Tentative Start Date: June 30, 2022
Work From: Any Location
Rate: $16 - $20 (Hourly)
Experience: 5 to 8 Years
Job Description
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet business requirements
- Optimize data delivery and re-design infrastructure for greater scalability
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Kafka and Azure technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs.
- Work closely with the business intelligence team: understand their requirements exposed through APIs, check them against the backend Postgres and Snowflake data warehouses, and update the fact and dimension tables accordingly. A strong understanding of data warehouse concepts is needed.
Responsibilities
The candidate will work closely with the business intelligence team, understand their requirements through APIs, check the backend Postgres and Snowflake data warehouses, and update the fact and dimension tables accordingly. A strong understanding of data warehouse concepts is needed.
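As an illustration of the data warehouse concepts this role calls for, below is a minimal, self-contained sketch of a Type 2 slowly-changing-dimension merge, one common way fact and dimension tables are kept in sync with changing source data. The function, field names, and sample records are hypothetical, and a real implementation would run inside Postgres or Snowflake (e.g. via a MERGE statement or an Airflow task) rather than in plain Python.

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, today):
    """Type 2 SCD merge: when a tracked attribute changes, close the
    current dimension row and open a new one, preserving history.
    dim_rows: existing dimension rows (dicts with valid_from/valid_to/current)
    incoming: fresh records from the source system (dicts)
    key: natural key column name; tracked: attribute columns to compare."""
    result = list(dim_rows)
    # Index the currently-active row for each natural key.
    current_by_key = {row[key]: row for row in dim_rows if row["current"]}
    for rec in incoming:
        cur = current_by_key.get(rec[key])
        if cur is None:
            # New member: open its first current row.
            result.append({**rec, "valid_from": today,
                           "valid_to": None, "current": True})
        elif any(cur[col] != rec[col] for col in tracked):
            # Attribute changed: expire the old row, open a new one.
            cur["valid_to"] = today
            cur["current"] = False
            result.append({**rec, "valid_from": today,
                           "valid_to": None, "current": True})
        # Unchanged records leave the dimension untouched.
    return result

dim = [{"customer_id": 1, "city": "Kochi",
        "valid_from": date(2021, 1, 1), "valid_to": None, "current": True}]
fresh = [{"customer_id": 1, "city": "Thiruvananthapuram"},
         {"customer_id": 2, "city": "Chennai"}]
merged = scd2_merge(dim, fresh, "customer_id", ["city"], date(2022, 6, 30))
```

After the merge, the original Kochi row is closed (no longer current) and two new current rows exist, so downstream fact tables can join to either the historical or the current version of each customer.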
Primary Skills
Databases, data warehouses, Kafka, AWS Data Lake, Postgres, Snowflake, APIs, data ingestion, Python, distributed systems, Apache Airflow
Secondary Skills (If Any)
MongoDB, OpenAPI, FastAPI, business intelligence