
RAMAKANTH (RID : pp3lo7ajstt)

Designation: GCP Developer

Location: Delhi, India

Experience: 16 Years

Rate: $20 / Hour

Availability: Immediate

Work From: Offsite

Category: Information Technology & Services

Key Skills
GCP, ETL, BigQuery, AWS, Azure, CI/CD, MySQL, PostgreSQL

Description

Ramakant Singh

Seeking Position of:          Cloud Data Architect

Primary Skills:

  • GCP: Extensive experience in IT data analytics projects. Planned and designed data migrations from on-premises systems to the cloud, using Compute Engine to design the initial infrastructure. Hands-on experience migrating on-premises ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Cloud SQL, Cloud Data Fusion, Dataproc, Google Cloud Storage, Composer, GCS buckets, Cloud Functions, Cloud Dataflow (Apache Beam, Java/Python), Pub/Sub, Cloud Shell, Vertex AI, Document AI, and the gsutil and bq command-line utilities, with monitoring via Stackdriver. Worked with Compute, networking, IAM, VPC, firewall rules, Load Balancer, Cloud Deployment Manager, Logging and Monitoring, Cloud Studio, and data security and governance with BigQuery and ETL workflows. Migrated streaming workloads and implemented Document AI.
  • Power BI/Tableau: Designed Power BI architecture; experienced in developing Power BI reports on sources such as SSAS Tabular cubes, Azure, and SQL Server, transforming data through Query Editor and building parameterized reports. Designed reports and dashboards in Tableau using forecasting, formatting, charts, actions, filters, and calculations; created Tableau scorecards and dashboards with stacked bars, bar graphs, scatter plots, and geographical maps. Combined measures and dimensions from various databases using data blending.
  • Data Modeling: Conceptual, Logical, Physical, and Dimensional Modeling; Entity Relationship, Key-Based, Attributed, and Transformation Models; Referential Integrity, Denormalization, Normalization, Domain Dictionary, Forward Engineering, Reverse Engineering, and testing and maintenance of data models.
  • AWS: AWS Glue, S3; designed data warehouses in Amazon Redshift; Lambda, Athena, RDS, DynamoDB, EC2, IAM, Airflow.
  • Azure: Azure services including Data Factory, Data Lake, and Azure Synapse.

Secondary Skills:  

  • Matillion Cloud ETL, MySQL, PostgreSQL, Snowflake, Neo4j, CI/CD, Agile methodology

Experience:               Total Yrs. Exp. in IT: 16 yrs.

 

Cloud Data Architect

Confidential, HYD                                                                                               June 27th, 2022 – Onwards  

Role & Responsibilities: Participate in Cloud Data Integration, business intelligence (BI), and enterprise information management programs by rationalizing the data architecture to support reuse and creating reports. Work with Data Engineers and DBAs on database design and development, with working knowledge of software development methodologies including Agile, Extreme Programming, SDLC, and Waterfall. Develop and maintain enterprise data models for enterprise-class data initiatives, developing and enforcing data modeling standards and best practices. Design cloud data landing zone strategies. Prepare RFPs for different clients.

Project 1:                  Fraud Detection Pipeline for payment transaction

                                    Role: Architect and Data Engineer

Environment: Pub/Sub, Bigtable, Dataflow, BigQuery, Vertex AI, Python

Role & Responsibilities: The objective of this project is that when someone initiates a credit or debit card purchase, the transaction is sent for processing before the purchase can be completed. Processing includes validating the card, checking for fraud, and adding the transaction to the user's transaction history.
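The three per-transaction steps described above (validate the card, screen for fraud, append to history) can be sketched in plain Python. This is a minimal illustration, not the project's actual logic: the field names, the Luhn check, and the amount-threshold fraud rule are assumptions; in the real pipeline these checks would run as Dataflow/Beam transforms between Pub/Sub and Bigtable/BigQuery.

```python
# Illustrative sketch of per-transaction processing in a fraud-detection
# pipeline. Field names and the fraud rule are assumptions for this example.

def luhn_valid(card_number: str) -> bool:
    """Validate a card number with the Luhn checksum."""
    digits = [int(d) for d in card_number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return len(digits) > 1 and checksum % 10 == 0

def process_transaction(txn: dict, history: list, amount_limit: float = 5000.0) -> dict:
    """Run the three steps: card validation, fraud screen, history append."""
    if not luhn_valid(txn["card_number"]):
        return {**txn, "status": "rejected", "reason": "invalid_card"}
    if txn["amount"] > amount_limit:   # assumed simple fraud rule
        return {**txn, "status": "flagged", "reason": "amount_over_limit"}
    result = {**txn, "status": "approved"}
    history.append(result)             # in the pipeline, a Bigtable write
    return result
```

In the production version, each step would be a `ParDo` in a streaming Beam job, with a Vertex AI model replacing the threshold rule.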

 
Copyright © Cosette Network Private Limited. All Rights Reserved.