
Venkat (RID : 1gkbsl9y681nd)

Designation : DevOps Developer

Location : Noida

Experience : 8 Years

Rate : $20 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
DevOps, Oracle, BigQuery, Apache, Jira, ITSM, Data Warehousing
Description

Venkat R.

Production Analyst

 

PROFESSIONAL SUMMARY:

  • 8 years of IT experience in Informatica, GCP, SQL, and Unix shell scripting development, with good knowledge of business analysis, requirements, development, and production support using industry-accepted methodologies and procedures.
  • In-depth understanding of the major cloud services in the market.
  • Exposure to Snowflake and data warehousing.
  • Experience with Google Cloud Platform, Informatica, Oracle, UNIX, Apache Airflow DAGs, and BigQuery.
  • Good understanding of cloud platforms such as Google Cloud Platform, Azure, and AWS.
  • Skilled in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.
  • Experience in ITSM, working with incident management tools such as Jira, Remedy, and ServiceNow.
  • Experience in developing and maintaining applications in the banking, insurance, and sales domains.
  • Experience with crontab, Control-M, UC4, and Airflow scheduling and automation tools to monitor and handle jobs.
  • Experience using ETL data integration tools such as Informatica alongside Hadoop and GCP.
  • Certified Google Data Engineer.

EDUCATIONAL PROFILE:

  • Bachelor of Science (Computers) from Andhra University, India.

PROFESSIONAL EXPERIENCE:

  • Working as a GCP Engineer at Technology from 2022 till date.
  • Worked as a Senior Consultant in Hyderabad, India, from March 2021 to December 2021.
  • Worked as a Senior Software Engineer at Software Pvt Ltd, Bangalore, from September 2019 to February 2021.
  • Worked as a Senior Software Engineer at CGI, Hyderabad, from November 2016 to August 2019.
  • Worked as a Software Engineer in Hyderabad from December 2015 to July 2016.
  • Worked as a Software Engineer with SAN Information from April 2013 to October 2015.

TECHNICAL SKILLS:

  • Development Tools (ETL) : GCP, BigQuery, Informatica 9.1, 10.2.0
  • Case Tools : Toad
  • Databases : Oracle 9i, 11g, SQL Server
  • Programming Languages : SQL, PL/SQL, Unix shell scripting, Scala
  • Operating Systems : Windows 2000/XP/7, UNIX, macOS
  • Scheduling Tools : Crontab, UC4, Control-M, Airflow
  • Reporting Tools : Tableau
  • Good knowledge of GCP and Snowflake

FORTES:

  • Strong Interpersonal and Excellent Communication skills.
  • Strong team player with the ability to work both in a team and as an individual contributor.
  • Ability to work independently with minimum guidance.
  • Fast learner, flexible and able to work on multiple tasks.
  • Keen to learn and work on new technologies such as Hadoop, Informatica BDM, and reporting tools.

 

PROJECTS

PROJECT PROFILE: #1

Project Title : Data Migration from On-Prem to Cloud

Location : Bangalore

Role : Senior Software Engineer

Client : StubHub

Operating Systems : Windows, macOS

Programming Languages/Tools : GCP, Airflow, Informatica, Oracle, Unix, Hadoop, Hive

Start Date : 25-Sep-2019

End Date : 2021

Project Description:

StubHub is an American ticket exchange and resale company. It provides services for buyers and sellers of tickets for sports, concerts, theater, and other live entertainment events. It has grown from the largest secondary-market ticket marketplace in the United States into the world's largest ticket marketplace. While the company does not currently disclose its financials, in 2015 it had over 16 million unique visitors and nearly 10 million live events per month.

  • The project migrates historical data from the on-premises database (Informatica/Oracle) to Google Cloud Platform.
  • Flat files are generated by the Informatica workflow, and the bq command-line tool loads them into BigQuery on GCP.
  • A Python script performs the incremental load from the e-commerce source to BigQuery.
  • The daily load is scheduled in Airflow; a sketch of the orchestration follows this list.
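
For illustration, a minimal Airflow DAG along the lines described above could look like the sketch below. This is an assumption-laden example, not the project's actual code: the project ID, bucket, dataset, table names, and the updated_at watermark column are hypothetical placeholders, and it assumes Airflow 2.x with the bq CLI and the google-cloud-bigquery client available on the workers.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def incremental_load(**context):
    # Append only rows newer than the latest already-loaded row
    # (watermark column and table names are hypothetical placeholders).
    client = bigquery.Client()
    client.query(
        """
        INSERT INTO `my-project.sales_dw.orders`
        SELECT *
        FROM `my-project.staging.orders_ecomm`
        WHERE updated_at > (SELECT MAX(updated_at)
                            FROM `my-project.sales_dw.orders`)
        """
    ).result()  # block until the BigQuery job finishes


with DAG(
    dag_id="onprem_to_bq_daily_load",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",  # matches the daily load described above
    catchup=False,
) as dag:

    # Flat file produced by the Informatica workflow is pushed into
    # BigQuery with the bq command-line tool (paths are placeholders).
    load_flat_file = BashOperator(
        task_id="bq_load_flat_file",
        bash_command=(
            "bq load --source_format=CSV --skip_leading_rows=1 "
            "sales_dw.orders_history "
            "gs://my-bucket/exports/orders_{{ ds }}.csv"
        ),
    )

    # Incremental delta from the e-commerce source into BigQuery.
    load_increment = PythonOperator(
        task_id="incremental_load_ecomm",
        python_callable=incremental_load,
    )

    load_flat_file >> load_increment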

 

 