
Data Architect

No of Positions: 1

Location: Bengaluru

Tentative Start Date: April 03, 2022

Work From: Offsite

Rate: $16 - $18 (Hourly)

Experience: 8 to 13 Years

Job Category: Information Technology & Services
Duration: 6-12 Months
Required Skills
Spark, Scala, Big Data, Hive, AWS, Kafka
Description

Job Objective:

The Data Architect will transform data into sets that the entire organization can easily work with. In this role, they will develop a wide range of methods to improve the quality and efficiency of data and help implement these methods across the organization. The Data Architect will work closely with software and application development teams to recommend database structures based on data storage and retrieval needs, and will be responsible for constantly monitoring databases and immediately addressing any database issues.


Primary Responsibilities

  • Provide insight into changing database storage and utilization requirements and offer suggestions for solutions.
  • Analyze database implementation methods to make sure they are in line with company data policies and any external regulations that may apply.
  • Develop database design and architecture documentation for the big data systems.
  • Help maintain the integrity and security of the big data systems.
  • Oversee the migration of data from legacy systems to new big data solutions.


Desired Qualifications

  • Bachelor's Degree in Computer Engineering or related field required (Master's degree preferred)
  • 5+ years' experience in data analysis
  • Strong knowledge of database structures, systems, and data mining.
  • Knowledge of C and PHP languages
  • Proven ability to work in distributed systems (Spark, Hadoop)
  • Must be able to develop creative solutions to problems
  • Experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS/Azure cloud services
  • Experience with stream-processing systems: Storm, Spark Streaming, etc. (a minimal sketch of this stack follows this list)
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
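
As a rough, hedged illustration of the kind of stack named in the requirements above, the Scala sketch below reads a Kafka topic with Spark Structured Streaming and lands the raw records as Parquet for downstream Hive or analytics jobs. The broker address, topic name, and output paths are placeholder assumptions for illustration only, not details from this listing.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: ingest a Kafka topic with Spark Structured Streaming.
// Requires the spark-sql-kafka connector on the classpath; all endpoints
// and paths below are hypothetical placeholders.
object KafkaIngestSketch {
  def main(args: Array[String]): Unit = {
    // Local session for illustration; a real deployment would run on a cluster.
    val spark = SparkSession.builder()
      .appName("kafka-ingest-sketch")
      .master("local[*]")
      .getOrCreate()

    // Read the stream of events from Kafka (placeholder broker and topic).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")

    // Persist the raw records as Parquet so downstream jobs can query them.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/tmp/events-parquet")
      .option("checkpointLocation", "/tmp/events-checkpoint")
      .start()

    query.awaitTermination()
  }
}
```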