OnBenchMark

Bindu (RID : yb8pllexhjqf)

Designation : Snowflake Developer

Location : Jaipur, India

Experience : 5 Years

Rate : $18 / Hourly

Availability : Immediate

Work From : Any

Category : Information Technology & Services

Shortlisted : 0
Total Views : 94
Key Skills
Snowflake, APIs, Data Governance, Snowpipe, Data Security, JSON, CSV, XML, SnowSQL, Python
Description

Bindu – 5 Years – Snowflake Developer


TOTAL WORK EXPERIENCE
Total IT work experience is 5 years 1 month; relevant experience on Snowflake is 5 years.
SKILLS

Skills : SQL, PL/SQL, SnowSQL, Snowflake, Python, Data Warehouse, Virtual Warehouse, Snowpipe, Data Masking, Data Security, Data Sharing, Data Governance, APIs, Data Cleaning, Statistical Analysis, Data Modeling, Pipelines, Data Ingestion
Cloud technologies : AWS, Azure

UTILITIES WORKED ON
SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Data Sharing, Stored Procedures, SCDs (slowly changing dimensions), Functions, RDBMS, Virtual Warehouse Monitoring, Partitioning, Clustering, Cloning, Views, Materialized Views, Star Schema, OLTP, OLAP, Airflow, ELT, Snowflake architecture and processing, AWS S3 data storage (Lambda) and implementation.
SQL : all DDL and DML commands, Joins & Analytical Functions; knowledge of Snowflake databases, schemas & table structures.

TOOLS
ETL : Pentaho, Apache Airflow, Informatica, Fivetran
DBT
Experience with JSON, CSV, XML, ORC, Parquet


WORK EXPERIENCE
Confidential | Snowflake Developer | Jan 2023 – Present

Customer Name: ETO Motors Pvt Ltd.
Analyzed technical specifications of all ETO Motors products and compared them with their fossil-fuel counterparts, resulting in valuable insights for the business.
Loaded the customer's own data into Snowflake per their requirements; created Snowpipe to load live data from an AWS S3 bucket into Snowflake and set up e-mail alerts.
Loaded data from different structured and unstructured sources (CSV, JSON, XML, Parquet).
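A minimal sketch of a Snowpipe setup like the one described above; the stage, storage integration, table, and pipe names are all illustrative assumptions, not the actual project objects:

```sql
-- Sketch only: bucket, integration, stage, table, and pipe names are hypothetical.
CREATE OR REPLACE STAGE s3_raw_stage
  URL = 's3://example-bucket/incoming/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- AUTO_INGEST lets S3 event notifications trigger the pipe for live loading.
CREATE OR REPLACE PIPE live_load_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.vehicle_data
  FROM @s3_raw_stage
  ON_ERROR = 'CONTINUE';
```

E-mail alerts on pipe failures would be wired up separately through a Snowflake notification integration; the exact configuration depends on the account setup.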

Confidential | Snowflake Developer | June 2018 – Jan 2023

Customer Name : COMPUTER LIABILITY INSURANCE COVERAGE
Environment : SQL, Snowflake, AWS
Role : SQL Developer, Snowflake Developer

Duration : From JUN 2018 to April 2020
Responsibilities:
• Involved in Requirement Analysis & preparation of Functional Design documents.
• Table DDLs creation in snowflake development database.
• Have used COPY statements to ingest data from stage to Tables.
• Implemented snow pipe for real-time data ingestion.
• Cloned Production data for code modifications and testing.
• Worked with multiple data sources.
• Created data sharing out of snowflake with testing team.
• Worked with streams for change data capture and implemented SCD.
• Implemented solutions using snowflake’s data sharing, cloning and time travel.
• Involved in unit testing and Integration Testing.
• Establish and ensure adoption of best practices and development standards.
• Communicate with peers and supervisors routinely, document work, meetings, decisions.
• Bulk loading from an external stage (AWS S3) to an internal stage (Snowflake) using COPY.
• Loading data into Snowflake tables from the internal stage and from the local machine.
• Used COPY, LIST, PUT and GET commands for validating internal and external stage files.
• Used import and export between the internal stage (Snowflake) and the external stage (S3 bucket).
• Writing complex Snowflake SQL scripts in the Snowflake cloud data warehouse to meet business requirements.
• Responsible for task distribution among the team.
• Perform troubleshooting analysis and resolution of critical issues.
• Created database according to business requirements.
• Used COPY to bulk load the data.
• Created Snow pipe for continuous data load.
• Worked on Data validation between Oracle and Snowflake databases.
• Created internal, external stage and transformed data during load.
• Shared sample data using grant access to customer for User Acceptance Testing (UAT).
• Involved in data cleansing: detecting and correcting or removing corrupt or inaccurate records.
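The stage-and-COPY workflow in the bullets above (PUT, LIST, COPY, GET) can be sketched as follows; the table name and local file path are hypothetical:

```sql
-- Sketch only: table and file names are assumptions.
CREATE OR REPLACE TABLE claims_raw (doc VARIANT);

-- PUT uploads a local file to the table's internal stage (run from the SnowSQL CLI).
PUT file:///tmp/claims.json @%claims_raw;

-- LIST validates the staged files before loading.
LIST @%claims_raw;

-- COPY ingests the staged file into the table; GET would download it back for inspection.
COPY INTO claims_raw
  FROM @%claims_raw
  FILE_FORMAT = (TYPE = 'JSON')
  ON_ERROR = 'ABORT_STATEMENT';
```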


Responsibilities:

• Understanding the business functionality of the system.
• Involved in Migrating Objects from SQL to Snowflake.
• Created Snowpipe for continuous data loads; used COPY for bulk loads.
• Created internal and external stage and transformed data during load.
• Cloned Production data for code modifications and testing.
• Shared sample data using grant access to customer for UAT.
• Retrieved historical data using Time Travel, also used to recover missed data.
• Heavily involved in testing Snowflake to understand the best possible approaches.
• Analyzed areas of service improvement and implemented fixes for the same.
• Developed SQL queries based on customer requirement using various joins.
• Performing all SDLC phases to complete ETL development work

• Created views, indexes, and hints.
• Executed the automated scripts and updated the logs; involved in script execution.
• Reported issues and clarifications; raised bugs for product issues.
• Attended daily stand-up meetings and gave updates.
• Involved in daily regression runs.
• Team Management in knowledge transfer as well as mentoring activities
• Providing weekly & monthly report on project activities and performance.
• Preparing solution approach documents, design documents and Requirement Traceability Matrix for new requirements.
• Working on effort estimation of change requests.
• Performing data analysis and resolution of issues as part of production support
• Developed parameterized reports, drill-down, drill-through reports and sub-reports as per the report requirements.
• Implemented complex business logic through T-SQL stored procedures, Functions, Views
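Time Travel and zero-copy cloning, as used in the responsibilities above, look roughly like this; the database, schema, and table names and the one-hour offset are illustrative assumptions:

```sql
-- Sketch only: object names and offsets are assumptions.
-- Time Travel: read the table as it was one hour ago.
SELECT * FROM prod.sales.orders AT (OFFSET => -3600);

-- Recover rows deleted within the retention window.
INSERT INTO prod.sales.orders
  SELECT * FROM prod.sales.orders AT (OFFSET => -3600)
  EXCEPT
  SELECT * FROM prod.sales.orders;

-- Zero-copy clone of production data for code modifications and testing.
CREATE OR REPLACE TABLE dev.sales.orders_clone CLONE prod.sales.orders;
```

A clone shares the source table's micro-partitions until either side is modified, so it is cheap to create and safe to experiment on.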

EDUCATION
PG : MBA (HR), June 2018, Percentage : 72%
Bachelor's in Technology, majoring in ECE, Percentage : 76%
Intermediate in Mathematics, Physics and Chemistry, Percentage : 85.5%
SSC, Percentage : 83.2%


******THANK YOU******