DevOps
Location : Jaipur
Experience : 4 Years
Rate : $20 / Hour
Availability : 1 Week
Work From : Any
Category : Information Technology & Services
Shivm - DevOps Engineer (Exp: 4+ years)
Currently on the payroll of Nine Hertz India Pvt. Ltd.

PROFILE:
AWS Certified Developer Associate. Served as AWS developer/architect for migration and application hosting to AWS Cloud.

EXPERIENCE:
I served as DevOps Engineer for an EdTech startup, where I was responsible for deploying all the applications and APIs to any cloud infrastructure provider. Highlights:
- Migrated/deployed everything to AWS/Azure/GCP Cloud, depending on many factors.
- Automated build/deployment with an industry-standard CI/CD setup on Jenkins.
- Set up all internal apps such as GitLab, Jenkins, databases, HR portal, IT portal, etc.
- Took on Site Reliability Engineer responsibilities.
- Set up firewalls and jump servers/proxy servers where required.
- Set up a private cloud for internal applications.
- Seamless autoscaling setup. For instance, when an exam begins at 9 AM, we run a single t2.small instance to host the login website; within a minute, thousands of students start their exam and the infrastructure scales to hundreds of t2.small instances to cope with the load for the 2-3 hours of the exam. When the students log off, production scales back down to 1 or 2 instances.
- Created scripts for automating application deployment using EC2, RDS, Route53, and many more AWS resources.
- Deployment automation using PowerShell scripting and Windows and Unix shell scripting.
- ETL practice for data migration.
- Hosted databases on RDS/DynamoDB and EC2 instances using Terraform.
- Serverless setup using AWS Lambda and API Gateway.
- Provided cost-effective solutions for deployments and AWS resource usage.
- As part of the DevOps team, we provided 24x7 support to keep applications available at all times.
- Worked on numerous Jira tasks on a daily basis to deploy new applications to our pool of services.
- Set up CI/CD architecture using Bitbucket and Bamboo. Currently migrating all existing environments to the AWS ECS, ECR, and EKS services, and migrating Docker Swarm to Kubernetes.
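The exam-day autoscaling described above can be sketched with the AWS CLI. This is a minimal illustration, not the production setup: the group name, launch template, subnets, instance limits, and CPU target are all assumed values.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of an exam-portal autoscaling setup.
# All names, subnet IDs, and thresholds below are illustrative assumptions.
set -euo pipefail

# Auto Scaling group over a pre-built launch template:
# idles at 1 t2.small instance, bursts to hundreds during an exam.
aws autoscaling create-auto-scaling-group \
  --auto-scaling-group-name exam-portal-asg \
  --launch-template LaunchTemplateName=exam-portal-lt,Version='$Latest' \
  --min-size 1 --max-size 200 --desired-capacity 1 \
  --vpc-zone-identifier "subnet-aaaa,subnet-bbbb"

# Target-tracking policy: add instances while average CPU exceeds 60%,
# and scale back in automatically once students log off.
aws autoscaling put-scaling-policy \
  --auto-scaling-group-name exam-portal-asg \
  --policy-name exam-portal-cpu60 \
  --policy-type TargetTrackingScaling \
  --target-tracking-configuration '{
    "PredefinedMetricSpecification": {"PredefinedMetricType": "ASGAverageCPUUtilization"},
    "TargetValue": 60.0
  }'
```

A target-tracking policy handles both scale-out and scale-in from a single CPU target, which matches the "scales up in a minute, scales back down after the exam" behavior without separate alarms.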
- Working in 3 rotational shifts of 8 hours each. Created cron jobs to monitor system health and take the necessary actions to fix the most common issues.

TOOLS / SERVICES:
- Jenkins, Terraform, Packer, Puppet
- Linux shell/PowerShell
- AWS CLI
- Python
- Windows Active Directory management
- Git, Jira, Confluence, Assyst, SourceTree
- Apache Hadoop, Hive, Pig, Sqoop, Spark, R, etc.

SKILLS & ABILITIES:
- Terraform scripting
- Windows/Unix shell scripting and PowerShell
- Jenkins - code repo integration for CI/CD setup
- Packer/Puppet/Ansible scripting for automated server configuration and management
- Serverless setup using AWS Lambda, S3, API Gateway
- Database migration
- JSON and YAML scripting
- Working knowledge of Apache Hadoop, Pig, Spark, Sqoop, and various data analytics Python libraries such as Pandas, NumPy, SciPy, Matplotlib, etc.

INTERNSHIP: DATA ANALYSIS
During a college internship on big data analysis using Apache Hadoop, Pig, Hive, Sqoop, Spark, R, and various Python libraries (matplotlib, Pandas, Tkinter, plotly, etc.), I configured Hadoop's master-slave architecture across different machines and AWS EC2 instances. I learned to store and clean textual data containing billions of entries, using Hive's SQL queries to get results. I worked on US government logs and records, available at data.gov, to analyze the frequency of particular readings and predict future trends.

PROFESSIONAL QUALIFICATION:
Bachelor of Technology - Punjab Technical University - 2014-2018
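A cron-driven health check of the kind mentioned above might look like the following sketch. The disk threshold, health-endpoint URL, and service name are assumptions for illustration, not values from the original setup.

```shell
#!/usr/bin/env bash
# Hypothetical system-health cron job: checks root-disk usage and restarts
# a service if its HTTP health endpoint stops answering.
# THRESHOLD, HEALTH_URL, and SERVICE are assumed example values.
set -u

THRESHOLD=90                                # act when root disk usage exceeds 90%
HEALTH_URL="http://localhost:8080/health"   # assumed app health endpoint
SERVICE="myapp"                             # assumed systemd unit name

# Disk check: 'df' prints usage like "84%"; strip non-digits for comparison.
usage=$(df --output=pcent / | tail -1 | tr -dc '0-9')
if [ "$usage" -gt "$THRESHOLD" ]; then
  echo "disk usage at ${usage}% - removing week-old compressed logs"
  find /var/log -name '*.gz' -mtime +7 -delete
fi

# App check: restart the service if the health endpoint fails to respond.
if ! curl -fsS --max-time 5 "$HEALTH_URL" >/dev/null; then
  echo "health check failed - restarting ${SERVICE}"
  systemctl restart "$SERVICE"
fi
```

Scheduled from crontab, e.g. `*/5 * * * * /usr/local/bin/health_check.sh`, this covers the two most common pages (full disk, hung app) without human intervention.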