

Software Developer
Location: Chennai
Member since: 30+ days ago
Candidate Information
  • Experience: 12 years
  • Hourly Rate: $15
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: July 14, 2024
Key Skills
Hadoop Admin, Linux, Mapper, Python, Docker


Seeking a position as a Hadoop/Cloud/DevOps engineer in an esteemed organization, where I can utilize my knowledge, versatile experience, and skills in balance to make a meaningful contribution to the organization and to my career.

Experience Summary:

☆   Carrying 12 years of IT experience, including 5 years relevant to Hadoop, Docker, Jenkins, and Ansible.
☆   Exposure to Remote Infrastructure management & support operations.
☆   Performing assigned tasks in accordance with established standards and guidelines.
☆   Provide coverage for BAU Linux Server Management teams.

☆   Coordinate with vendors and support teams as necessary.

Experience Profile:

☆  Currently working with PayPal Holdings, Inc. as Software Engineer 3 since October 2021.


☆  Previously worked with Ford Motor Company as Lead Hadoop Engineer from June 2020 to September 2021.


☆  Previously worked with Tech Mahindra Limited as Tech Lead from September 2017 to June 2020.


☆  Previously worked with Wipro Technologies Limited as Senior Administrator from October 2015 to August 2017.


☆  Previously worked with IBM India Private Limited as Senior Operations Professional from May 2011 to September 2015.


☆  Previously worked with IBM India Private Limited in Server System Operations (on the payroll of Maintec Technologies) from August 2009 to April 2011.

Project Profile                 

Duration    : June 2020 – September 2021

Company     : Ford Motor Company

OS          : RHEL, SUSE

Application : Hortonworks/Cloudera Hadoop


Daily Activities:


·        Received knowledge transfer (KT) from Ford North America senior SMEs, Resident Architects, and a Resident Engineer from Cloudera.

·        Technically trained and developed the team supporting Hadoop operations.

·        Implemented a follow-the-sun support model; provided 24/7 support.

·        Performed capacity management of YARN queues for sharing and forecasting.
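
As a hypothetical illustration of the queue-share arithmetic behind YARN capacity management (the queue names and percentages below are invented, not taken from this profile):

```python
# Illustrative sketch of YARN CapacityScheduler queue math: a child queue's
# absolute cluster share is its configured percentage of its parent's share.

def absolute_capacity(parent_pct: float, queue_pct: float) -> float:
    """Absolute cluster share (%) of a child queue given its parent's share."""
    return parent_pct * queue_pct / 100.0

# Hypothetical layout: root (100%) -> prod (70%); prod -> etl (60%), adhoc (40%)
prod = absolute_capacity(100, 70)    # 70.0% of the cluster
etl = absolute_capacity(prod, 60)    # 42.0%
adhoc = absolute_capacity(prod, 40)  # 28.0%

# Sibling queues' shares must account for 100% of the parent's share.
assert abs((etl + adhoc) - prod) < 1e-9
print(etl, adhoc)  # 42.0 28.0
```

The same arithmetic supports forecasting: projected growth in a queue's workload can be checked against its absolute share before changing the configured percentages.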

·        Managed and configured GitHub branches per project specifications.

·        Identified improvements to enhance CI/CD; planned and prioritized CI automation scope and backlog.

·        Installed and configured Docker containers for Hadoop data nodes.

·        Implemented various phases of the project, including system integration and Big Data technologies across the Hadoop ecosystem: HDFS, MapReduce, YARN, Pig, Hive, Oozie, HBase, Sqoop, and ZooKeeper.

·        Experience in administering, installing, configuring, supporting, and maintaining Hadoop clusters using Hortonworks.

·        Hands-on experience with major Hadoop ecosystem components, including HDFS, YARN, Hive, ZooKeeper, and Oozie; optimized MapReduce, Spark, and Hive job configurations for better performance.

·        Experience in setting up, configuring, and monitoring Ambari and Hadoop clusters on Hortonworks.

·        Performed Ambari and HDP upgrades after each new patch release; currently using HDP 2.6.4.

·        Experience in design, maintenance, and support of Big Data analytics using Hadoop ecosystem components such as HDFS, Hive, HBase, Sqoop, MapReduce, and Oozie.

·        Extensive knowledge of the MapReduce/HDFS framework.
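
The Map/Reduce pattern referred to above can be sketched in-process with a toy word count (purely illustrative; real jobs run distributed on YARN over HDFS):

```python
from collections import defaultdict

def mapper(line):
    """Emit (word, 1) pairs, as a Hadoop mapper would."""
    for word in line.split():
        yield word.lower(), 1

def reducer(word, counts):
    """Sum the counts for one key, as a Hadoop reducer would."""
    return word, sum(counts)

def map_reduce(lines):
    shuffled = defaultdict(list)  # the "shuffle" phase: group values by key
    for line in lines:
        for key, value in mapper(line):
            shuffled[key].append(value)
    return dict(reducer(k, v) for k, v in shuffled.items())

print(map_reduce(["hadoop yarn hadoop", "yarn hdfs"]))
# {'hadoop': 2, 'yarn': 2, 'hdfs': 1}
```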

·        Commissioned and decommissioned Hadoop cluster nodes, including balancing HDFS block data.

·        Expert-level skills in managing, scheduling, and troubleshooting jobs through Oozie.

·        Hands-on experience in analyzing log files for Hadoop and ecosystem services and finding root causes.
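
A minimal sketch of that kind of log triage, assuming a typical `LEVEL component: message` log layout (the sample lines below are invented for illustration):

```python
import re

# Tally ERROR/FATAL entries by component to narrow down a root cause.
LOG_PATTERN = re.compile(r"(?P<level>ERROR|FATAL)\s+(?P<component>[\w.]+)")

def error_summary(lines):
    """Return {(level, component): count} for all error-level log lines."""
    counts = {}
    for line in lines:
        m = LOG_PATTERN.search(line)
        if m:
            key = (m.group("level"), m.group("component"))
            counts[key] = counts.get(key, 0) + 1
    return counts

sample = [
    "2021-07-01 10:00:01 INFO  datanode.DataNode: heartbeat ok",
    "2021-07-01 10:00:02 ERROR datanode.DataNode: disk failure on /data1",
    "2021-07-01 10:00:03 ERROR namenode.FSNamesystem: lease recovery failed",
    "2021-07-01 10:00:04 ERROR datanode.DataNode: disk failure on /data1",
]
print(error_summary(sample))
# {('ERROR', 'datanode.DataNode'): 2, ('ERROR', 'namenode.FSNamesystem'): 1}
```

Sorting the resulting counts highlights the noisiest failing component first, which is usually the right place to start reading.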

·        Experience in implementing NameNode High Availability and in Hadoop cluster capacity planning to add and remove nodes.
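
The node-count side of capacity planning reduces to simple arithmetic; a hypothetical sketch (replication factor, per-node disk, and headroom figures below are illustrative assumptions, not values from this profile):

```python
import math

def nodes_needed(data_tb: float, replication: int = 3,
                 per_node_tb: float = 40, headroom: float = 0.25) -> int:
    """Data nodes required to hold data_tb of raw data, given HDFS
    replication and a fraction of disk kept free as headroom."""
    required_tb = data_tb * replication / (1 - headroom)
    return math.ceil(required_tb / per_node_tb)

# 500 TB raw * 3 replicas / 75% usable disk = 2000 TB -> 50 nodes of 40 TB.
print(nodes_needed(500))  # 50
```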

·        Involved in installing and configuring Kerberos for authentication of users and Hadoop daemons.

·        Experience in installing and configuring Hive, its services, and the Metastore. Exposure to Hive Query Language and to table operations such as importing data, altering tables, and dropping tables.

·        Hadoop cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting.

·        Helped application teams import and export their databases using Sqoop and troubleshot their issues.

·        Created collections for Solr.

·        Troubleshot MapReduce jobs (via Oozie and Tez) and Spark jobs (via PySpark) together with Hadoop developers and designers.

·        Managed and reviewed data backups and log files; performed maintenance cleanup of old log files.
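
A minimal sketch of such a retention cleanup, assuming age-based deletion of `*.log` files (directory layout and the 7-day retention are invented for the demo):

```python
import os
import tempfile
import time
from pathlib import Path

def cleanup_old_logs(log_dir: Path, max_age_days: float) -> list:
    """Delete *.log files older than max_age_days; return the removed names."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in log_dir.glob("*.log"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed

# Demo against a throwaway directory with one "old" and one fresh file.
with tempfile.TemporaryDirectory() as d:
    old = Path(d) / "old.log"
    old.write_text("x")
    os.utime(old, (time.time() - 10 * 86400,) * 2)  # backdate mtime 10 days
    fresh = Path(d) / "fresh.log"
    fresh.write_text("y")
    print(cleanup_old_logs(Path(d), max_age_days=7))  # ['old.log']
```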

·        Created, configured, and managed access for Kafka topics.

·        Administered component accessibility via Ranger; created and managed policies.

·        Expertise in installing, configuring, and managing Red Hat Linux.

·        Expertise in handling hardware issues, patch management, and firmware upgrades in the cluster.

·        Expertise in shell scripting to automate daily activities.

·        Learning Python vigorously to gain further knowledge of automation.

·        Monitored the Hadoop cluster using Grafana and Ambari Metrics.


Copyright © Cosette Network Private Limited. All Rights Reserved.