OnBenchMark

SANTHI (RID : 210ulon0fphq)

Designation : Data Engineer

Location : Delhi, India

Experience : 6.3 Years

Rate : $19 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills

Data Engineer, AWS, ETL, Data Warehouse, Matillion, Snowflake, Data Analysis, Python
Description

Santhipriya Penumaka

SUMMARY

  • Around 6 years of experience in the design, development, and implementation of ETL solutions and Snowflake development.
  • Experience in gathering, analyzing, and documenting business and functional requirements, and in designing and developing mappings based on those requirements.
  • Knowledge of Snowflake core concepts: Zero-Copy Cloning, Time Travel, Snowpipe, Streams, Tasks, Stages, Secure Shares, Caches, SnowSQL, etc.
  • Experienced in the Matillion ETL tool.
  • Experienced in implementing REST API calls with Python and the Matillion API component.
  • Experienced in implementing business rules by creating Matillion jobs using the S3 Load, S3 Unload, Detect Changes, SQL, Data Transfer, Bash, and Python components, SNS notifications, etc.
  • Strong knowledge of data warehousing concepts: data marts, Star Schema and Snowflake Schema modeling, fact tables, and dimension tables. Implemented Slowly Changing Dimensions for accessing the full history of accounts and transaction information.
  • Experience using the Informatica PowerCenter client. Used various transformations such as Expression, Aggregator, Unconnected and Connected Lookup, Router, Update Strategy, Filter, Joiner, and Union.
  • Experience using AWS services such as S3, SNS, and SQS.
  • Basic knowledge of and experience with the BODS, Boomi, HVR, and Perspectium tools.
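The Slowly Changing Dimension (Type 2) approach mentioned above keeps full history by expiring the current row and inserting a new version instead of updating in place. A minimal sketch of that pattern follows; all table and column names are illustrative (not from any actual project), and Python's built-in sqlite3 stands in for Snowflake so the example runs anywhere.

```python
import sqlite3

# Illustrative SCD Type 2 dimension: each account version carries a
# validity window; valid_to IS NULL marks the current row.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_account (
        account_id INTEGER,
        balance    INTEGER,
        valid_from TEXT,
        valid_to   TEXT,         -- NULL = current version
        is_current INTEGER
    )
""")
# Initial load: account 1 opens with balance 100.
cur.execute("INSERT INTO dim_account VALUES (1, 100, '2024-01-01', NULL, 1)")

def apply_change(account_id, new_balance, change_date):
    """Expire the current row, then insert the new version (SCD Type 2)."""
    cur.execute(
        "UPDATE dim_account SET valid_to = ?, is_current = 0 "
        "WHERE account_id = ? AND is_current = 1",
        (change_date, account_id),
    )
    cur.execute(
        "INSERT INTO dim_account VALUES (?, ?, ?, NULL, 1)",
        (account_id, new_balance, change_date),
    )

apply_change(1, 250, '2024-06-01')
history = cur.execute(
    "SELECT balance, valid_from, valid_to FROM dim_account "
    "WHERE account_id = 1 ORDER BY valid_from"
).fetchall()
# Both versions are retained, so the full balance history stays queryable.
print(history)
```

Because old rows are never overwritten, a point-in-time query only needs a `valid_from`/`valid_to` range predicate.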

 

SKILLS

  • ETL/DWH Tools: Matillion, Informatica PowerCenter, Talend, BODS, Boomi
  • Databases: Oracle, HANA
  • Cloud: AWS, Snowflake
  • Programming Languages: SQL, UNIX Shell Scripting, Python
  • Operating Systems: Windows, Unix
  • Scheduling Tools: HWA, TMC
  • Data Analysis: Data Design/Analysis, Business Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis

 

PROFESSIONAL SUMMARY

  • Working with Confidential as Senior Developer from Aug 2022 till date
  • Worked with Confidential as Senior Associate from Nov 2021 to Aug 2022
  • Worked with Confidential USI as Consultant from Dec 2020 to Nov 2021
  • Worked with Confidential India as Data Engineer from Sep 2018 to Nov 2020
  • Worked with Confidential as Application Development Associate from Sep 2017 to Sep 2018

 

EDUCATION

  • University College of Engineering Kakinada, B.Tech in Computer Science, 2017
  • Diploma in Computer Science and Engineering from M.B.T.S Polytechnic College, Guntur

 

PROJECTS

Project Title: RTB

Role: Senior Data Engineer

Description:

The RTB project involves developing new integrations, enhancing existing integrations, and optimizing integrations in Matillion and Snowflake.

Responsibilities:

  • Designed and developed various Matillion jobs such as flat file loads, database loads, API-based integrations, and BigQuery data loads to Snowflake.
  • Developed SnowSQL scripts to find duplicates at the table level.
  • Implemented audit mechanisms for Matillion jobs and created common framework jobs for flat file loads and database loads.
  • Worked on various credit optimization tasks on the Snowflake data warehouse.
  • Enhanced a few of the BODS jobs and migrated them from BODS to Matillion.
  • Replicated some of the tables to Snowflake using Perspectium.

Environment: Matillion, Snowflake, Python, SQL, BODS, Perspectium
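The table-level duplicate check mentioned above boils down to grouping on the business key and keeping groups with more than one row. A minimal sketch, with illustrative table and column names and sqlite3 standing in for Snowflake/SnowSQL:

```python
import sqlite3

# Illustrative staging table with a business key (order_id) that should
# be unique but isn't.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount INTEGER)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?)",
    [(1, 10), (2, 20), (2, 20), (3, 30), (3, 35)],
)
# GROUP BY the key, keep only keys seen more than once.
dupes = cur.execute("""
    SELECT order_id, COUNT(*) AS cnt
    FROM stg_orders
    GROUP BY order_id
    HAVING COUNT(*) > 1
    ORDER BY order_id
""").fetchall()
print(dupes)  # order_ids 2 and 3 each appear more than once
```

The same `GROUP BY ... HAVING COUNT(*) > 1` statement runs unchanged in SnowSQL; extending the key list to all columns turns it into a full-row duplicate check.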

 

 

 

Project Title: HCM (Waste Management)

Role: Snowflake / Matillion Developer

Description:

HCM involves employee payroll-related data for different regions. The main focus of the project is migrating data from PeopleSoft to HCM Cloud.

Responsibilities:

  • Designed and developed Matillion jobs to load data into various Snowflake tables.
  • Developed SnowSQL scripts to validate, cleanse, standardize, and summarize the data.
  • Implemented error handling for Matillion jobs and worked on different logging mechanisms.
  • Created shared jobs for reusability of the error framework and HCM loads.
  • Implemented a Python script for triggering HDL jobs to Oracle Cloud.
  • Worked with different file formats such as fixed width, flat files, and semi-structured data.
  • Used various components such as Truncate, S3 Load, S3 Unload, SNS, Data Transfer, Detect Changes, Filter, SQL, Excel Query, Python, Bash Script, API Extract, etc.

Environment: Matillion, Snowflake, AWS
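Job-level error handling of the kind described above usually combines logging with bounded retries before escalating to an alert. The sketch below shows that generic pattern in plain Python; the step and job names are hypothetical, and a real Matillion job would use its built-in retry, logging, and SNS components rather than hand-rolled code.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_with_retry(step, name, retries=3, delay=0.0):
    """Run one job step, logging each failure and retrying before giving up.

    On the final failure the exception is re-raised so the surrounding
    audit/alerting layer (e.g. an SNS notification) can pick it up.
    """
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("step %s failed (attempt %d/%d): %s",
                        name, attempt, retries, exc)
            if attempt == retries:
                raise
            time.sleep(delay)

# Usage: a flaky step that succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

result = run_with_retry(flaky, "s3_load", retries=3)
```

Keeping the retry policy in one wrapper means every step logs failures in the same format, which is what makes a shared error framework reusable across loads.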

 

Project Title: AMI G&A (NISSAN)

Role: Talend Developer

Description:

The G&A Finance Data Warehouse involves consolidation of general and admin costs from different entities in the AMI region. This is part of the AMI Data Warehouse program, which deals with the creation of an AMI dashboard for Finance and Admin that will leverage the visual analytics of Tableau.

Foundation of IT infrastructure on the cloud that will become the platform for all digital initiatives of AMI.

Responsibilities:

  • Involved in the design phase of the Talend jobs.
  • Created ETL/Talend jobs, both design and code, to process data to target databases.
  • Created Talend jobs to copy files from one server to another, utilizing Talend FTP components.
  • Created implicit, local, and global context variables in the jobs.
  • Extracted data from different sources such as SAP and flat files, loaded it into the Amazon S3 bucket and the Snowflake internal stage to landing, and then from staging to the AMI data warehouse (DWH).
  • Used various components such as tMap, tFileInput, tLogRow, tDBRow, tJava, tContextLoad, tFileList, tFileExist, tFileRowCount, tJoin, tDBInput, etc.
  • Implemented custom error handling in Talend jobs and worked on different logging methods.

Environment: Snowflake, Talend, AWS S3, Flat files, SAP, Java

 
Copyright © Cosette Network Private Limited. All Rights Reserved.