
Veeraj (RID : 6bz5ld02kawq)

Designation : Data Engineer

Location : Ahmedabad, India

Experience : 4 Years

Rate : $14 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
Data Engineer, BigQuery, SQL, GCS
Description

Projects
Project 02
Client: Eshopbox
Project : Sales data migration
Role: GCP Data Engineer
Technologies: SQL, BigQuery, Airflow/Composer, GCP
 Description :
Eshopbox is an all-in-one ecommerce logistics platform using modern software to provide fast and affordable fulfilment. The company helps brands manage their ecommerce lifecycle, from inventory management and order fulfilment to delivery and returns, through its digital platform. For quick delivery at affordable rates, Eshopbox stores inventory on behalf of its customers in fulfilment centres across India and works with multiple courier partners.
 Roles and Responsibilities :
➢ Developed BigQuery DDL and DML scripts to load data into BigQuery from GCS buckets (a minimal load sketch follows this list).
➢ Analyzed and organized the raw data as per the business requirements.
➢ Evaluated business requirements and objectives.
➢ Created BigQuery datasets, tables, and pipelines to store and analyze the data.
➢ Used the Cloud SDK shell to interact with GCP components and configure services and storage.
➢ Handled CSV, Avro, and Parquet file formats.
➢ Developed audit tables for reconciliation and metadata tracking.
➢ Created authorized views in BigQuery to provide required data to downstream applications (see the authorized-view sketch after this list).
➢ Managed source code in a GitHub repository.
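
The GCS-to-BigQuery load step above can be sketched with the BigQuery Python client. This is a minimal illustration under assumed names, not the project's actual code: the project, dataset, table, and bucket below are placeholders, and the source format is assumed to be Parquet (CSV or Avro would only change SourceFormat).

# Minimal sketch: create a dataset and load a file from a GCS bucket into BigQuery.
# Project, dataset, table, and bucket names below are placeholders, not project values.
from google.cloud import bigquery

client = bigquery.Client(project="example-gcp-project")

# Create the target dataset if it does not already exist.
dataset = bigquery.Dataset("example-gcp-project.sales_raw")
dataset.location = "asia-south1"
client.create_dataset(dataset, exists_ok=True)

# Configure the load job; swap SourceFormat for CSV or AVRO as needed.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/sales/*.parquet",
    "example-gcp-project.sales_raw.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(client.get_table("example-gcp-project.sales_raw.orders").num_rows, "rows loaded")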
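
The authorized-view step can be sketched in the same way: create the view with a query job, then add it to the source dataset's access entries so that consumers of the view never need direct access to the raw tables. All names and the SELECT list below are illustrative assumptions.

# Minimal sketch: create a view and authorize it against its source dataset.
# All project, dataset, and view names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-gcp-project")

# 1. Create (or replace) the view in a dataset that downstream users can query.
client.query(
    """
    CREATE OR REPLACE VIEW `example-gcp-project.shared_views.orders_summary` AS
    SELECT order_id, order_date, net_amount
    FROM `example-gcp-project.sales_raw.orders`
    """
).result()

# 2. Add the view to the source dataset's access entries so that users of the
#    view do not need direct access to the underlying raw tables.
source_dataset = client.get_dataset("example-gcp-project.sales_raw")
view = client.get_table("example-gcp-project.shared_views.orders_summary")

entries = list(source_dataset.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))
source_dataset.access_entries = entries
client.update_dataset(source_dataset, ["access_entries"])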
Project 01
Client: AppsFlyer
Project : Sales Data Analytics
Role: Big Data Developer
Technologies: BigQuery, SQL, GCS
 Description :
AppsFlyer is a leading mobile attribution and marketing analytics platform, used by top B2C brands worldwide to measure the effectiveness of their marketing campaigns, as well as the adoption and usage of their mobile apps. The platform serves more than 100,000 users and handles traffic of more than 80 billion events a day.
 Roles & Responsibilities :
➢ Responsible for creating GCS buckets, datasets, and BigQuery tables in the different layers of the BQ projects.
➢ Loaded data from Hadoop into BigQuery using Dataproc, processed it in BigQuery, and created authorized views on top of it as per the entitlement logic.
➢ Processed data from the landing zone into the BigQuery stage-audit and history-audit layers (a layered-processing sketch follows this list).
➢ Created high-level and low-level specification documents.
➢ Conducted SIT walkthroughs with the business to obtain sign-off.
➢ Used GitHub to manage and deploy code versions.
➢ Actively participated in all sprint phases and activities.
➢ Worked with software development and testing team members to design and develop robust solutions that meet client requirements for functionality, scalability, and performance.
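
One stage-audit to history-audit hop could look roughly like the sketch below: a MERGE run as a BigQuery query job that upserts staged rows into the history table. The dataset and table names, the event_id key, and the columns are assumptions for illustration; the actual layer names and entitlement logic are not spelled out in this profile.

# Minimal sketch of one stage-audit to history-audit hop as a BigQuery MERGE.
# Dataset/table names, the event_id key, and the columns are illustrative only.
from google.cloud import bigquery

client = bigquery.Client(project="example-gcp-project")

merge_sql = """
MERGE `example-gcp-project.history_audit.events` AS hist
USING `example-gcp-project.stage_audit.events` AS stage
ON hist.event_id = stage.event_id
WHEN MATCHED THEN
  UPDATE SET hist.payload = stage.payload,
             hist.updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (event_id, payload, loaded_at, updated_at)
  VALUES (stage.event_id, stage.payload, CURRENT_TIMESTAMP(), CURRENT_TIMESTAMP())
"""

job = client.query(merge_sql)
job.result()  # wait for the MERGE to finish
print(job.num_dml_affected_rows, "rows affected in history_audit.events")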

 