
AWS Data Engineer

No of Positions: 3

Location: Bengaluru

Tentative Start Date: April 23, 2024

Work From: Offsite

Rate: $9 - $11 (Hourly)

Experience: 6 to 10 Years

Job Category: Information Technology & Services
Duration: 6-12 Months
Required Skills: AWS, Spark, Python, S3, Lambda, Glue, Athena, RDS, Redshift

Job Title: AWS Data Engineer

Work Type: Remote

Experience: 6+ years

Working Time Zone: IST


Primary Skills: AWS, Spark, Python, S3, Lambda, Glue, Athena, RDS, Redshift.


Candidate Profile:


  • Seeking a Senior Business Intelligence Analyst with experience building solutions that support data reporting and meet business requirements.
  • As a Sr. Business Intelligence Analyst, you will work with key stakeholders and internal customers to support existing solutions and build new ones, while providing technical guidance, system-development best practices, and operational support.
  • This analyst will leverage Business Intelligence and data analytics best practices to deliver relevant, timely, and insightful information to the business and champion BI across the organization.
  • This includes requirements gathering, data modeling, report creation and automation, and end-user education and training.
  • Bachelor’s degree in Information Technology, Business Analytics, or a related field.
  • 3+ years of hands-on Business Intelligence experience with an emphasis on analytical and/or reporting tools, preferably working with SaaS Oracle Fusion ERP data sources.
  • 2+ years of hands-on experience working with AWS Glue, Spark SQL, Python, and Redshift.
  • Strong proficiency in writing AWS Glue jobs, Spark SQL, and Python.
  • Strong proficiency with AWS technologies such as S3, EC2, CloudWatch, Lambda, Redshift, DMS, and RDS.
  • Experience with advanced and statistical analytics.
  • The ability to engage in active dialogue with enterprise stakeholders and data providers is central to success in this role.
  • Excellent communication skills at all levels. Proven ability to lead and communicate orally, in written documents, and in formal presentations is required.
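
To give candidates a flavor of the Lambda-centric work above, here is a minimal sketch of an AWS Lambda handler that collects object references from an S3 event so a downstream load can pick them up. The event shape follows the standard S3 notification format; the bucket and key names are hypothetical, and this is an illustration, not this employer's actual code.

```python
import json

def handler(event, context):
    """Minimal Lambda handler for an S3 event: collects (bucket, key)
    pairs so a downstream Glue or Redshift load can process them."""
    keys = [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
        if r.get("eventSource") == "aws:s3"
    ]
    return {"statusCode": 200, "body": json.dumps({"objects": keys})}

# Local smoke test with a hand-built S3 event (no AWS account needed)
sample_event = {
    "Records": [
        {"eventSource": "aws:s3",
         "s3": {"bucket": {"name": "raw-zone"},
                "object": {"key": "erp/2024/04/orders.csv"}}}
    ]
}
print(handler(sample_event, None))
```

Running the handler locally against a hand-built event like this is a common way to unit-test Lambda code before deploying.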

Key Job Responsibilities:


  • Supports the Operational Reporting Team in the production of various internal reports.
  • Interprets & analyzes metrics to create meaningful dashboards & reporting both at a detailed and executive level.
  • Codes, implements, and optimizes data management processes to design reporting and analytics solutions.
  • Works cross-departmentally to gather business requirements and develops schedules to ensure all deadlines and deliverables are met.
  • Monitors and manages business system data interfaces, files, integrity, and security.
  • Assists with design, testing and maintenance of data warehouse.
  • Provides training and end-user support for customized reports.
  • Codes, implements, and supports AWS data lake technology.
  • Codes webhooks/APIs to pull and push data.
  • Codes complex SQL/MySQL queries per business requirements.
  • Responsible for performance tuning of data warehouses running on-premises or on AWS Cloud (Redshift).
  • Manages multiple priorities in an extremely dynamic environment, initiating necessary tasks and ensuring complete follow-up.
  • Develops new functionality on our existing database and provides development support and assistance for escalations and issues.
  • Ensures the security of confidential and proprietary information and materials.
  • Responsible for managing activities in collaboration with data services department to accomplish planned growth.
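
The "complex SQL queries" responsibility above might look like the following sketch: an aggregate reporting query (join, GROUP BY, HAVING) run here against an in-memory SQLite database as a stand-in for MySQL/Redshift. The customers/orders schema and the revenue threshold are hypothetical.

```python
import sqlite3

# Hypothetical customers/orders schema for a reporting example
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                     amount REAL, ordered_on TEXT);
INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA'), (3, 'APAC');
INSERT INTO orders VALUES (1, 1, 120.0, '2024-01-05'),
                          (2, 1,  80.0, '2024-02-11'),
                          (3, 2, 300.0, '2024-01-20'),
                          (4, 3,  50.0, '2024-03-02');
""")

# Revenue per region, keeping only regions above a reporting threshold
rows = conn.execute("""
SELECT c.region,
       COUNT(o.id)             AS order_count,
       ROUND(SUM(o.amount), 2) AS revenue
FROM orders o
JOIN customers c ON c.id = o.customer_id
GROUP BY c.region
HAVING SUM(o.amount) > 100
ORDER BY revenue DESC
""").fetchall()
print(rows)
```

The same join/aggregate/HAVING pattern carries over to MySQL or Redshift with only minor dialect changes.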

Job Requirements:


  • Good understanding of and hands-on experience with AWS Cloud technologies such as S3, Lambda, Glue, Athena, RDS, and Redshift.
  • Experience with Databricks integrated with AWS Cloud.
  • Experience working with AWS RDS SQL and MySQL Aurora.
  • Experience coding complex queries in SQL Server and MySQL.
  • Solid understanding of and experience with AWS data lake architecture.
  • Solid understanding of and experience with database and BI programming.
  • Experience writing complex queries and understanding their execution plans.
  • Experience writing AWS Glue jobs, Spark SQL, Lambda, and Python.
  • Experience with relational database concepts, SQL queries, and NoSQL.
  • Experience with and understanding of AWS DynamoDB.
  • Experience with JIRA.
  • Passionate about exploring new technologies.
  • Strong problem-solving skills and the ability to make sound technical decisions.
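
The requirement on understanding query execution plans can be sketched with SQLite's EXPLAIN QUERY PLAN as a lightweight stand-in for the SQL Server/MySQL plan tools named above; the table and index names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail);
    # we keep only the human-readable detail column
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT * FROM events WHERE user_id = 42"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)   # with the index: an index search
print(before, after)
```

Comparing a plan before and after adding an index is exactly the kind of check this role would perform when tuning slow reporting queries, whatever the database engine.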

Copyright© Cosette Network Private Limited All Rights Reserved