
Brijesh

Software Developer
Location: Palwal
Total Views: 68
Shortlist: 0
Member Since: 30+ days ago
Contact Details
Phone: {{contact.cdata.phone}}
Email: {{contact.cdata.email}}
Candidate Information
  • Experience: 5.6 Years
  • Hourly Rate: $8
  • Availability: Immediate
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: July 25, 2024
Key Skills
GCP, Python, Big Data
Summary

PROJECTS

Title: Teradata migration to BigQuery
Client: Bed Bath & Beyond
Location: Gurugram, India
Designation: Senior Data Engineer
Tools: Airflow, Teradata, SAS, Netezza, BigQuery

About the Project (Netezza sunset migration): In this project we worked with the dynamic pricing team. The existing pipeline used SAS and Netezza to compare each product's price against competitors' prices and adjust our product pricing accordingly. Under the new architecture, BigQuery stores the data and Airflow is used as the scheduling tool.

• Prepared the design document as per the current architecture

• Created new tables in BigQuery as per the new architecture

• Converted the SAS scripts into BigQuery scripts

• Developed the DAG for executing the BigQuery scripts (see the sketch after this list)

• Airflow is used as the orchestration tool
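The profile does not include the DAG itself; the sketch below shows what a DAG of this shape might look like, with the converted SAS logic represented as a BigQuery stored procedure call and all project, dataset, and procedure names being hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# All names below are illustrative placeholders, not the project's real identifiers.
with DAG(
    dag_id="dynamic_pricing_bigquery_scripts",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Runs one of the SAS-to-BigQuery converted scripts as a BigQuery job.
    compare_competitor_prices = BigQueryInsertJobOperator(
        task_id="compare_competitor_prices",
        configuration={
            "query": {
                "query": "CALL `example-project.pricing.compare_competitor_prices`()",
                "useLegacySql": False,
            }
        },
        location="US",
    )
```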

PDM migration to BigQuery: In this project we worked with the PDM team on migrating the existing Teradata pipeline to BigQuery. The pipeline receives daily product information: product attributes are updated and new SKUs are added or deleted, and we manage this product-related information in the pipeline. Under the new architecture, Nexla handles real-time data processing from Google Pub/Sub, and the resulting files are stored in a GCS bucket.

• Prepared the design document as per the current architecture

• Created the new ingress-layer, operational-layer and analytical tables

• Converted the existing Teradata stored procedures into BigQuery stored procedures

• Built the ingress-layer DAG for loading data into the raw layer from the GCS bucket (the three-layer DAG is sketched after this list)

• Built the operational-layer DAG for loading data from the raw layer to the operational layer

• Built the analytical-layer DAG for loading data from the operational layer to the analytical layer

• We serve this analytical layer to the BI team for reporting
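The layer DAGs themselves are not reproduced in the profile; the sketch below illustrates the general shape of such a three-layer pipeline, with all bucket, dataset, and stored-procedure names being hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# All bucket, dataset, and procedure names below are illustrative placeholders.
with DAG(
    dag_id="pdm_ingress_to_analytical",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingress layer: land the daily product files from the GCS bucket into a raw table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_layer",
        bucket="example-pdm-landing",
        source_objects=["product/{{ ds }}/*.json"],
        destination_project_dataset_table="example-project.raw.product",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Operational layer: apply a converted Teradata-to-BigQuery stored procedure.
    load_operational = BigQueryInsertJobOperator(
        task_id="load_operational_layer",
        configuration={"query": {
            "query": "CALL `example-project.operational.sp_load_product`()",
            "useLegacySql": False,
        }},
    )

    # Analytical layer: aggregate operational data for the BI reporting tables.
    load_analytical = BigQueryInsertJobOperator(
        task_id="load_analytical_layer",
        configuration={"query": {
            "query": "CALL `example-project.analytical.sp_load_product_summary`()",
            "useLegacySql": False,
        }},
    )

    load_raw >> load_operational >> load_analytical
```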

MAO JSON feed: This project involves loading JSON data from a Pub/Sub topic into Nexla and then ingesting it into BigQuery tables, with Cloud Composer used to orchestrate the jobs. Currently EOM is used as the OMS, providing the CO-export data to the EDW through a TIBCO queue and ABI. As the IT infrastructure migrates to the cloud, BBBY decided to replace EOM with MAO (Manhattan Active Omni), a cloud-based order management system.

• Created new tables for the ingress and operational layers as per the new architecture

• Built the data pipeline for exporting data from BigQuery tables to a GCS bucket (sketched below)

• Worked on the implementation of CCPA on the existing ecom tables that hold PII data
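As a rough illustration of the BigQuery-to-GCS export pipeline mentioned above, an Airflow task of the following shape could be used; the table, bucket, and DAG names are hypothetical, not taken from the profile.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator

# Hypothetical table and bucket names; the real feed names are not given in the profile.
with DAG(
    dag_id="mao_co_export_feed",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Exports the BigQuery export table to newline-delimited JSON files in GCS.
    export_to_gcs = BigQueryToGCSOperator(
        task_id="export_co_feed_to_gcs",
        source_project_dataset_table="example-project.operational.mao_co_export",
        destination_cloud_storage_uris=["gs://example-mao-export/{{ ds }}/part-*.json"],
        export_format="NEWLINE_DELIMITED_JSON",
    )
```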

Verizon Media API: We used the Yahoo Verizon Media API for marketing our products and for retrieving information about users. Under the existing architecture the data was stored in Teradata; under the new architecture it is stored in BigQuery.

• Created tables in BigQuery as per the new architecture

• Created a custom Airflow operator for fetching customer information from the Verizon Media API (a sketch follows this list)

• Processed the information and stored it in a CSV file

• Loaded that CSV file into a BigQuery table
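The custom operator is not reproduced in the profile; the skeleton below shows the general pattern (subclass BaseOperator, fetch records in execute(), write a CSV for a downstream BigQuery load). The API call is stubbed out and every name here is a hypothetical placeholder.

```python
import csv

from airflow.models.baseoperator import BaseOperator


def _fetch_customer_records(api_conn_id):
    """Stub for the Verizon Media API call; the real operator would authenticate
    with the stored Airflow connection and page through the API results."""
    return [{"customer_id": "123", "segment": "example"}]


class VerizonMediaToCsvOperator(BaseOperator):
    """Hypothetical custom operator: pulls customer records from the Verizon
    Media API and writes them to a CSV file for a downstream BigQuery load."""

    def __init__(self, api_conn_id: str, output_path: str, **kwargs):
        super().__init__(**kwargs)
        self.api_conn_id = api_conn_id
        self.output_path = output_path

    def execute(self, context):
        records = _fetch_customer_records(self.api_conn_id)
        with open(self.output_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=records[0].keys())
            writer.writeheader()
            writer.writerows(records)
        # The returned path can be consumed by a downstream CSV-to-BigQuery load task.
        return self.output_path
```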
