No. of Positions: 15
Location: Bengaluru
Tentative Start Date: May 16, 2021
Work From: Any Location
Rate: $7 - $15 (Hourly)
Experience: 5 to 9 Years
Essential Responsibilities:
- Build data pipelines to extract, load, and transform data from source systems to the data lake, both in batch and in near real time
- Build Hive tables from files/folders in the data lake
- Assemble large, complex datasets
- Ensure that developed solutions are aligned with the architectural roadmap
- Ensure data security
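The extract/load/transform flow described above can be sketched in plain Python. This is an illustration only, with hypothetical record names and an in-memory list standing in for a lake table; the role's actual pipelines would use tools such as Azure Data Factory or Azure Databricks.

```python
# Minimal batch ETL sketch (illustrative; names and data are hypothetical).

def extract(source_rows):
    """Extract: pull raw records from a source system (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize field types and drop incomplete records."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in rows
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(rows, sink):
    """Load: append cleaned records to the target store
    (a list standing in for a data lake table)."""
    sink.extend(rows)
    return len(rows)

source = [
    {"id": 1, "amount": "19.99"},
    {"id": 2, "amount": None},   # incomplete record, filtered out in transform
    {"id": 3, "amount": "7.5"},
]
lake_table = []
loaded = load(transform(extract(source)), lake_table)
print(loaded)  # 2
```

A near-real-time variant would replace the batch `extract` with a streaming source (e.g. Azure Event Hub) feeding the same transform and load steps.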
Essential Skills / Technologies:
- Azure Data Lake Store
- Azure Databricks
- Azure ingestion and transformation tools such as Azure Data Factory, Azure Event Hub, Azure Stream Analytics
- Python
Good to Have Skills / Technologies:
- SQL
- PowerShell