No. of Positions: 2
Location: Bengaluru
Tentative Start Date: February 15, 2022
Work From: Offsite
Rate: $11 - $12 (Hourly)
Experience: 2 to 4 Years
Responsibilities:
● Implement monitoring and alerting solutions to provide insight into overall platform
health.
● As an integral part of the Data Platform team, onboard various data sources by
creating data pipelines.
● Provide resolutions and/or workarounds for data pipeline-related queries and issues as
appropriate.
● Ensure that the ingestion pipelines that power the Data Lake and Data Warehouses are
up and running.
● Assist end users of the Data Platform with query debugging and optimization.
● Collaborate with other teams to understand and resolve data availability and
consistency issues.
● Contribute to knowledge sharing (knowledge base articles, documentation, forums,
blogs, etc.).
● Continuously improve technical knowledge and problem-resolution skills, and strive
for excellence.
Requirements:
● 2+ years of experience in technical/application support, building and supporting data pipelines.
● Ability to read and write SQL, and an understanding of at least one relational database such as
MySQL, Oracle, Postgres, or SQL Server.
● Good knowledge of Java and Python programming.
● Comfortable with Linux, with the ability to write small scripts in Bash/Python and to work
with log files and Unix processes.
● Prior experience working with cloud services, preferably AWS.
● Ability to learn complex new things quickly
● Willingness to occasionally provide support over weekends, early mornings, or late nights.
● A team player, able to work under pressure, with good time-management skills.
● Excellent written and verbal communication skills
● Troubleshooting skills and the ability to perform root cause analysis (RCA).
● Prior experience with big data technologies such as Spark, Hadoop, Kafka, or Airflow is preferred.