RAKESH

Lead Big Data Administrator
Location: Gurgaon
Total Views: 242
Shortlist: 0
Member Since: 30+ days ago
Contact Details
Phone: {{contact.cdata.phone}}
Email: {{contact.cdata.email}}
Candidate Information
  • Experience: 10 Years
  • Hourly Rate: $20
  • Availability: Immediate
  • Work From: Any
  • Category: Information Technology & Services
  • Last Active On: October 04, 2022
Key Skills
Microsoft Azure Admin, Shell Scripting, Jenkins, Hadoop, TFS, Eclipse
Summary

Hi,


I feel that my skills and experience are a great fit for this position. 

Please feel free to contact me to arrange an interview. I look forward to learning more about this opportunity.

Please find the attached resume.


Experience Summary: I have 10 years of working experience across a mix of profiles, including Hadoop Administration and Software Testing.

I have 4 years of work experience as a Hadoop Administrator.


Big Data Exposure:

*Installation, configuration, support, and management of Hadoop clusters using Apache Hadoop, including verification of the installation.

*Verifying backup configuration and recovery from a NameNode failure.
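
A minimal sketch of the metadata-backup side of this, assuming a non-HA cluster and the HDFS superuser (the /backup/nn path is hypothetical):

    hdfs dfsadmin -safemode enter          # saveNamespace requires safe mode
    hdfs dfsadmin -saveNamespace           # force a fresh fsimage checkpoint
    hdfs dfsadmin -fetchImage /backup/nn/  # download the latest fsimage locally
    hdfs dfsadmin -safemode leave

Recovery from a NameNode failure would then start from this fsimage plus the retained edit logs.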

*Verifying installation of the various Hadoop ecosystem components and Hadoop daemons.

*Checking system health via the heartbeat mechanism.
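
For instance, the DataNode liveness that the NameNode derives from heartbeats can be checked from the CLI:

    hdfs dfsadmin -report        # capacity summary plus live/dead DataNode counts
    hdfs dfsadmin -report -dead  # only the nodes that have stopped heartbeating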

*Good experience in designing, configuring, and managing backup and disaster recovery for Hadoop data.

*Hands-on experience in analyzing log files for Hadoop and ecosystem services and finding the root cause.

*As an administrator, performed cluster maintenance, troubleshooting, and monitoring, and followed proper backup and recovery strategies.

*Experience in HDFS data storage and support for running MapReduce jobs.

*Verifying installation and configuration of Hadoop ecosystem components such as Sqoop, Pig, Hive, HBase, Flume, Oozie, and Kafka.

*Configured the various property files (core-site.xml, hdfs-site.xml, mapred-site.xml, yarn-site.xml) based upon job requirements.
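
As a quick sanity check after editing these files, the values the daemons actually load can be queried (the keys below are standard; the values are site-specific):

    hdfs getconf -confKey fs.defaultFS     # from core-site.xml
    hdfs getconf -confKey dfs.replication  # from hdfs-site.xml
    hdfs getconf -namenodes                # configured NameNode hosts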

*Importing and exporting data between HDFS and relational databases using Sqoop.
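
A typical import, sketched against a hypothetical MySQL source (host, database, table, and password file are placeholders):

    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl \
      --password-file /user/etl/.sqoop.pw \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4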

*Copying data between HDFS and the local file system, and between local paths.
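
The corresponding shell commands, with hypothetical paths:

    hdfs dfs -put /tmp/orders.csv /data/staging/          # local -> HDFS
    hdfs dfs -get /data/staging/orders.csv /tmp/restore/  # HDFS -> local
    cp /tmp/orders.csv /tmp/archive/                      # local -> local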

*Good working knowledge of Hadoop security, including Kerberos and Sentry.

*Experienced in Cloudera installation, configuration, and deployment on Linux distributions.

*Commissioning and decommissioning of nodes as required.
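
Decommissioning a DataNode, sketched for a cluster where dfs.hosts.exclude points at /etc/hadoop/conf/dfs.exclude (hostname and path are hypothetical):

    echo "dn07.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes   # NameNode begins draining the node's blocks
    yarn rmadmin -refreshNodes    # same for the ResourceManager's exclude list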

*Managing and monitoring Hadoop services such as the NameNode, DataNodes, and YARN.
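
On the YARN side, a quick status pass might look like this (complementing the hdfs dfsadmin -report check above):

    yarn node -list -all      # NodeManager states as seen by the ResourceManager
    yarn application -list    # applications currently submitted or running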

*Performance tuning and resolving Hadoop issues via the CLI or the web UI.

*Troubleshooting Hadoop cluster runtime errors and ensuring that they do not recur.

*Accountable for storage and volume management of Hadoop clusters.

*Ensuring that the Hadoop cluster is up and running at all times (high availability).
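
With NameNode HA enabled, each NameNode's state can be checked, and a failover triggered manually if needed (nn1/nn2 are hypothetical service IDs from hdfs-site.xml):

    hdfs haadmin -getServiceState nn1  # prints 'active' or 'standby'
    hdfs haadmin -getServiceState nn2
    hdfs haadmin -failover nn1 nn2     # manual failover from nn1 to nn2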

*Evaluating Hadoop infrastructure requirements and designing/deploying solutions.

*Backup and recovery tasks: creating snapshot policies and backup schedules, and recovering from node failures.
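
A snapshot round trip on a hypothetical /data/warehouse directory:

    hdfs dfsadmin -allowSnapshot /data/warehouse
    hdfs dfs -createSnapshot /data/warehouse daily-2022-10-04
    # restore a single file from the snapshot:
    hdfs dfs -cp /data/warehouse/.snapshot/daily-2022-10-04/orders.csv /data/warehouse/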

*Responsible for configuring alerts for the different services running in the Hadoop ecosystem.

*Moving data from one cluster to another.
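
Typically done with DistCp; a sketch with hypothetical NameNode endpoints:

    hadoop distcp -update -p \
      hdfs://nn-a.example.com:8020/data/warehouse \
      hdfs://nn-b.example.com:8020/data/warehouse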

 

Warm Regards,

Rakesh Dubey

8744855286








