OnBenchMark

Tulasi (RID : g8qolgkntn9z)

Designation: Power BI Developer

Location: Noida, India

Experience: 6 Years

Rate: $13 / Hour

Availability: 1 Week

Work From: Offsite

Category: Information Technology & Services

Shortlisted: 3
Total Views: 125
Key Skills
Power BI, Snowflake, Oracle SQL, Snowpipe, DAX, ETL, Windows Server, UNIX
Description

Tulasi Reddy

(Snowflake Developer, SQL Server, PowerBI)

CAREER OBJECTIVE
I have 6 years of experience in SQL, Snowflake, and Business Intelligence tools including Power BI, Power BI Service, and DAX. I am looking forward to working with an esteemed organization that gives me an opportunity to utilize my analytical skills and help make better business decisions.


Summary of Experience
✔ Experience in connecting to multiple data sources, such as SQL Server and Excel, from Power BI to build complex data sets.
✔ Experience in performing ETL activities in Power Query.
✔ Good experience in filtering unwanted data in Power Query.
✔ Creating visualizations such as pie charts, treemaps, tables, and matrices in Power BI.
✔ Designing different types of reports, such as drill-down and drill-through reports, in Power BI.
✔ Created dashboard-style reports using Power BI Desktop visualizations such as bar charts, line charts, area charts, slicers, etc.
✔ Implemented Row-Level Security (RLS) in Power BI.
✔ Experience in sharing reports and dashboards with end users.
✔ Software engineer with years of experience in Snowflake development, support, and performance tuning.
✔ Exposure to writing complex SQL queries; involved in creating tables, inserting rows, adding relations, defining primary and foreign keys, and setting default values for columns.
✔ Strong skills in SQL and Snowflake.
✔ Detailed knowledge of Snowflake, with experience in constructing stored procedures, functions, exception handling, and views.
✔ Hands-on experience with Snowflake utilities such as Snowpipe, Tasks, Streams, Time Travel, the optimizer, metadata management, data sharing, and stored procedures.
✔ Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
✔ Working with semi-structured data, views, and joins.
✔ Working with Snowflake Data Sharing.
✔ Bulk loading from the external stage (AWS S3) and the internal stage to the Snowflake cloud using the COPY command.
✔ Loading data into Snowflake tables from the internal stage using SnowSQL.
✔ Used the COPY, LIST, PUT, and GET commands for validating internal stage files.
✔ Imported and exported data between the internal stage (Snowflake) and the external stage (AWS S3).
✔ Writing complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
✔ Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
✔ Used Snowpipe for continuous data ingestion from the S3 bucket.
✔ Developed Snowflake procedures for executing branching and looping.
✔ Created cloned objects using zero-copy cloning.
✔ Performed data validations through INFORMATION_SCHEMA.
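The staging, loading, and cloning workflow described above can be sketched in Snowflake SQL. This is a minimal illustration, not production code; the stage, bucket, table, and column names are all hypothetical.

```sql
-- Hypothetical external stage pointing at an S3 bucket (URL is a placeholder).
CREATE STAGE my_s3_stage
  URL = 's3://example-bucket/data/'
  FILE_FORMAT = (TYPE = 'JSON');

-- Bulk load semi-structured JSON into a VARIANT column with the COPY command.
CREATE TABLE raw_events (payload VARIANT);
COPY INTO raw_events FROM @my_s3_stage;

-- LATERAL FLATTEN expands an array nested inside the VARIANT column.
SELECT e.payload:order_id::NUMBER AS order_id,
       f.value:sku::STRING       AS sku
FROM raw_events e,
     LATERAL FLATTEN(input => e.payload:items) f;

-- Zero-copy clone: a point-in-time copy without duplicating storage.
CREATE TABLE raw_events_backup CLONE raw_events;
```

For continuous ingestion, the same COPY statement would be wrapped in a `CREATE PIPE ... AS COPY INTO ...` definition so Snowpipe loads new S3 files as they arrive.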


EDUCATION
B.Sc. (Computers) from Calorx Teachers University, Ahmedabad


SKILLSET
Databases: Snowflake, Oracle SQL
Languages: SQL, DAX
Snowflake Utilities: Snowpipe
Operating Systems: Windows XP/7/8, UNIX
Development Tools: SQL Workbench, DBeaver, Power BI Desktop

CAREER CONTOUR


Project #1:
Project Title: Merck
Role: Snowflake Developer
Environment: Snowflake Cloud

Roles & Responsibilities:
● Creating tables, functions, procedures, etc.
● Hands-on experience with Snowflake utilities such as Snowpipe, Tasks, Streams, Time Travel, the optimizer, metadata management, data sharing, and stored procedures.
● Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
● Working with semi-structured data, views, and joins.
● Working with Snowflake Data Sharing.
● Bulk loading from the external stage (AWS S3) and the internal stage to the Snowflake cloud using the COPY command.
● Loading data into Snowflake tables from the internal stage using SnowSQL.
● Used the COPY, LIST, PUT, and GET commands for validating internal stage files.
● Imported and exported data between the internal stage (Snowflake) and the external stage (AWS S3).
● Writing complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
● Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
● Used Snowpipe for continuous data ingestion from the S3 bucket.
● Developed Snowflake procedures for executing branching and looping.
● Created cloned objects using zero-copy cloning.
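The branching-and-looping procedures mentioned above can be sketched with Snowflake Scripting. This is an illustrative example only; the procedure name, table, and region logic are hypothetical.

```sql
-- Sketch of a Snowflake Scripting procedure using IF branching and a FOR loop.
-- Table name, region ids, and retention logic are hypothetical.
CREATE OR REPLACE PROCEDURE purge_old_events(days_to_keep NUMBER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  deleted_total NUMBER DEFAULT 0;
BEGIN
  -- Branching: reject an invalid argument early.
  IF (days_to_keep < 1) THEN
    RETURN 'days_to_keep must be at least 1';
  END IF;

  -- Looping: iterate over three hypothetical region ids.
  FOR i IN 1 TO 3 DO
    DELETE FROM sales_events
     WHERE region_id = :i
       AND event_date < DATEADD(day, -:days_to_keep, CURRENT_DATE());
    deleted_total := deleted_total + SQLROWCOUNT;
  END FOR;

  RETURN 'Deleted rows: ' || deleted_total;
END;
$$;

CALL purge_old_events(90);
```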

Project #2:
Project Title: Roche
Role: Snowflake Developer
Environment: Snowflake Cloud
Roles & Responsibilities:
● Writing complex SQL queries; involved in creating tables, inserting rows, adding relations, defining primary and foreign keys, and setting default values for columns.
● Strong skills in SQL and Snowflake.
● Detailed knowledge of Snowflake, with experience in constructing stored procedures, functions, exception handling, and views.
● Extensive experience with Data Definition, Data Manipulation, Data Query, and Transaction Control Language, including complex queries, aggregate functions, set operations, joins, etc.
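The table-creation and constraint work listed above follows a standard pattern; a minimal sketch with hypothetical table and column names:

```sql
-- Hypothetical schema illustrating primary/foreign keys and default values.
-- Note: Snowflake parses PRIMARY KEY / FOREIGN KEY but does not enforce them
-- (except NOT NULL); they serve as metadata for tools and query design.
CREATE TABLE customers (
  customer_id NUMBER        PRIMARY KEY,
  full_name   VARCHAR(100)  NOT NULL,
  created_at  TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

CREATE TABLE orders (
  order_id    NUMBER PRIMARY KEY,
  customer_id NUMBER REFERENCES customers (customer_id),  -- foreign key
  status      VARCHAR(20) DEFAULT 'NEW'
);

INSERT INTO customers (customer_id, full_name) VALUES (1, 'Example Customer');
INSERT INTO orders (order_id, customer_id) VALUES (100, 1);

-- A join with an aggregate function, in the style of the reporting work above.
SELECT c.full_name, COUNT(o.order_id) AS order_count
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.customer_id
GROUP BY c.full_name;
```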

 
Copyright © Cosette Network Private Limited. All Rights Reserved.