OnBenchMark

Sachin (RID : yb8pllexy3fe)

Designation : Snowflake Developer

Location : Jaipur, India

Experience : 15 Years

Rate : $18 / Hour

Availability : Immediate

Work From : Any

Category : Information Technology & Services

Key Skills
Snowflake, Sqoop, Hive, MapReduce, Storm, Spark, Azure SQL, AWS S3, SnowSQL, Kafka, Azure DevOps, Pig
Description

Sachin – 15 Years – Snowflake Developer

Summary
- Azure Solutions Architect and Snowflake certified professional with more than 15 years of experience in data-related projects (data migration, data warehousing, data analytics, data visualization, big data, cloud platforms, data governance, etc.) in roles spanning architecture, design, development, and operations.
- Proven track record of successfully managing multiple projects as Solution Designer and Data Architect in a multi-vendor global environment for clients across multiple domains.
- Experience in designing and developing cloud solutions using Snowflake, Azure Data Lake, Azure Data Factory, Databricks, Delta Lake, Azure SQL, Azure Synapse, Power BI, Cosmos DB, Stream Analytics, Event Hub, Azure DevOps (for CI/CD), AWS S3, Glue, Athena, Redshift, etc.
- Knowledge of the Hadoop ecosystem and its component frameworks (HDFS, MapReduce, Pig, Hive, Sqoop, ZooKeeper, Oozie, Spark, Storm, Kafka), including real-time processing frameworks.
- Expertise in logical and physical data modeling (ER and dimensional), including star schemas, snowflake schemas, and highly normalized data models, using tools like Erwin.
- Strong hands-on experience in designing and building data warehouses and marts using Snowflake components such as Snowpipe, stages, tasks, stored procedures, SnowSQL, streams, and data sharing.
- Develop best practices, standards, and methodologies to assist in the implementation and execution of data engineering projects.
- Strong ability to analyze and integrate data from multiple sources and multiple formats to produce meaningful data insights with precision and efficiency.
- Extensive experience with traditional, Agile, and hybrid software development methodologies, both onshore and offshore. Certified SAFe (Scaled Agile Framework) Agilist.


Skills

Cloud : Azure Data Engineer, Azure Data Factory, Databricks, Data Lake, Delta Lake, Lakehouse, Azure Functions, Logic Apps, Azure Event Hub, Azure Synapse Analytics, Azure Analysis Services, Azure Stream Analytics, AWS S3, AWS Glue, Athena, Kinesis, Snowflake
Data Visualization & BI Tools : Power BI, Tibco Spotfire, OBIEE, IBM Cognos, SSRS, SSAS
Databases : Teradata, Redshift, Oracle, SQL Server, Sybase, Cosmos DB, HBase
Integration & ETL Tools : SSIS, IBM DataStage, Informatica, Oracle Warehouse Builder, Talend
Languages : Python, PL/SQL, SQL, Scala, Kusto Query Language, U-SQL
ML & Data Science : Python data science packages, Azure ML
Big Data : Hadoop and other components of the Hadoop ecosystem (Hive, Pig, Sqoop, Impala, Oozie, Spark, Storm, etc.)
Domain Knowledge : ETRM/CTRM, Health and Environment, Telecom

Experience
Mar 2010 to Dec 2022

Project: Live Data Factory – Snowflake
Role: Data Architect – Snowflake, AWS Lambda, AWS S3, AWS Glue, Power BI
Description:
The client identified several causes of its poor data management: an undefined data management strategy and governance within the organization, inconsistent data management operations, below-average data quality, and no visibility into data lineage and ownership. This led to issues such as poor data exchange processes, a poor stakeholder experience when interacting with the client's data products, and difficulty onboarding new businesses with automatic scaling.
We helped the client realize its vision of a consolidated data platform by delivering a Snowflake-based PaaS solution leveraging common Snowflake components such as Snowpipe, warehouses, tasks, streams, stored procedures, and functions. ETL jobs from the existing on-premises solution were ported to Snowflake and refactored to improve efficiency and reduce unnecessary complexity. Existing Power BI reports were re-wired in the cloud.
Responsibilities:
- Architect the overall PaaS data solution using Snowflake, AWS S3, Lambda, Glue, etc.
- Design the data migration strategy and framework, using Snowflake, for migrating historical data from on-premises databases to S3 and Snowflake.
- Copy data from source applications in different formats (CSV, XLSX, JSON, etc.) into an AWS S3 landing zone; data files are converted into Snowflake-compatible formats using AWS Lambda.
- Integrate data from multiple sources and multiple formats (both full and incremental loads) using Snowpipe and AWS Lambda to produce meaningful data insights.
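The landing-zone conversion step described above (source files converted into a Snowflake-compatible format by AWS Lambda) could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the handler shape, function names, and field names are assumptions, and the S3 read/write calls are omitted for brevity.

```python
import csv
import io
import json

def jsonl_to_csv(json_lines, fieldnames):
    """Convert newline-delimited JSON records into CSV text that a
    Snowflake COPY INTO / Snowpipe load can ingest.
    Hypothetical helper: names and layout are illustrative only."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for line in json_lines.strip().splitlines():
        if line:
            writer.writerow(json.loads(line))
    return buf.getvalue()

def lambda_handler(event, context):
    """Lambda-style entry point: take raw JSON lines from the event and
    return CSV for staging back to S3 (actual S3 I/O omitted)."""
    return jsonl_to_csv(event["body"], event["fieldnames"])
```

In a real pipeline the converted CSV would be written to the S3 stage location that the Snowpipe watches, so ingestion into Snowflake happens automatically on file arrival.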

 