
Garvit

Sr. Software Engineer
Location: Jaipur
Total Views: 35
Shortlist: 0
Member Since: 30+ days ago
Contact Details
Phone: +91-7297889996
Email: Garvit.maheshwari00@gmail.com
Candidate Information
  • Experience: 4 years
  • Hourly Rate: $8
  • Availability: More than 1 month
  • Work From: Offsite
  • Category: Information Technology & Services
  • Last Active On: April 17, 2024
Key Skills
Software Development Life Cycle (SDLC), API, XML
Summary

Personal Profile
Self-directed and motivated IT professional with 3.6 years of experience in Data Integration, experienced in design, development, and SDLC processes, aiming to deliver effective business solutions with the best available technology.

Contact
LinkedIn: https://www.linkedin.com/in/garvitmaheshwari
Phone: +91-7297889996
Email: Garvit.maheshwari00@gmail.com
Address: 2-G-19, 20 Mahaveer Nagar Extension, Kota, Rajasthan

Work Experience
Software Engineer, Appcino Technologies | June 2021 - Present
NeosAlpha Technologies | July 2019 - June 2021

Certifications
Dell Boomi, MuleSoft

Technical Skills
MySQL DB
HTML, CSS, JavaScript, Groovy script
Java, Servlet, and JSP
Salesforce, Netsuite, Jira, Workday, ADO, APIs, JMS
Jitterbit (on-premise)

Projects

Dell Boomi: Salesforce to Kong API | July 2021 - Present
This integration picks the customer's data from Salesforce and sends it to the Kong API, which is later used for backend integration from the API to SAP. It is a simple integration with complex JSON mapping, consisting of 28 objects and 45 child objects in one go. For error handling, if a record is not synced, its status is updated back in Salesforce.

Database to API Integration (JMS)
This integration fetches data from a database and posts it to an API endpoint, using JMS queues and topics to ingest and move data from the incremental process to the delivery process. It is a complex integration because of historical data: lakhs (hundreds of thousands) of records need to be processed, and the chunk size for API logging differs for every integration according to the load the API can handle. The flow consists of four processes. The first is a delta-load process that decides whether a run is a delta run or a historical load and sends the data to the topic accordingly. A delivery process inserts the data chunks into a temp table, queries the database again by chunk ID, and sends the data to the API. If any chunk's record fails, error handling retries up to five times and then sends the same record to another process that runs independently. If it fails there as well, the chunk/record goes to a DL queue, is re-run the next day as per the schedule, and is then discarded from the queue.
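The Salesforce-to-Kong project above is built in Dell Boomi rather than hand-written code, but its error-handling pattern (push a mapped record to the API; on failure, write the sync status back to the source record) can be sketched in Java. The endpoint URL, the CustomerSync class, and the updateSalesforceStatus helper below are illustrative assumptions, not the actual integration's names.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch of the Salesforce -> Kong sync with status write-back.
// KONG_ENDPOINT and the status-update helper are placeholders, not the
// real integration's configuration.
public class CustomerSync {

    private static final String KONG_ENDPOINT = "https://kong.example.com/customers"; // placeholder
    private final HttpClient http = HttpClient.newHttpClient();

    /** Pushes one mapped customer JSON payload to the Kong API. */
    boolean pushToKong(String customerJson) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(KONG_ENDPOINT))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(customerJson))
                .build();
        HttpResponse<String> resp = http.send(req, HttpResponse.BodyHandlers.ofString());
        return resp.statusCode() / 100 == 2; // treat any 2xx as synced
    }

    /** Error-handling path from the project description: if the record is
        not synced, update the status back in Salesforce. */
    void syncCustomer(String recordId, String customerJson) throws Exception {
        if (!pushToKong(customerJson)) {
            updateSalesforceStatus(recordId, "SYNC_FAILED"); // hypothetical helper
        }
    }

    void updateSalesforceStatus(String recordId, String status) {
        // In the real flow this would be a Salesforce record update; it is
        // left abstract here because the object and field names are not given.
        System.out.printf("Salesforce %s -> status %s%n", recordId, status);
    }
}
```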
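The retry policy in the second project (up to five delivery attempts, a hand-off to an independent retry process, then the DL queue for next-day replay) can likewise be sketched in plain Java. The names ChunkDelivery, deliverChunk, and sendToDlQueue are hypothetical; the real flow is orchestrated in Dell Boomi over JMS.

```java
import java.util.function.Predicate;

// Hypothetical sketch of the chunk retry policy described above: retry an
// API delivery up to 5 times, then hand the chunk to an independent retry
// process, and finally route it to a dead-letter (DL) queue for the next
// scheduled day. All names here are illustrative.
public class ChunkDelivery {

    static final int MAX_RETRIES = 5;

    /** Attempts delivery up to MAX_RETRIES times; true if any attempt succeeds. */
    static boolean deliverWithRetry(String chunkId, Predicate<String> deliverChunk) {
        for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
            if (deliverChunk.test(chunkId)) {
                return true;
            }
        }
        return false;
    }

    static void process(String chunkId, Predicate<String> primary, Predicate<String> independent) {
        if (deliverWithRetry(chunkId, primary)) return;
        // Failed 5 times: re-run the same record in the independent process.
        if (deliverWithRetry(chunkId, independent)) return;
        // Still failing: park it on the DL queue; a scheduled job replays it
        // the next day and then discards it from the queue.
        sendToDlQueue(chunkId);
    }

    static void sendToDlQueue(String chunkId) {
        System.out.println("Chunk " + chunkId + " routed to DL queue for next-day replay");
    }
}
```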
