
Paneer (RID : 14f6gl5autwl5)

Designation : BI Developer

Location : Noida, India

Experience : 11 Years

Rate : $24 / Hourly

Availability : Immediate

Work From : Offsite

Category : Information Technology & Services

Key Skills
Power BI, Azure, Software Development Life Cycle (SDLC), MS SQL Server

Overall 11+ years of IT experience, including 7+ years building data-intensive applications and tackling challenging architectural problems in the retail domain. Currently working as a Business Intelligence Engineer in the Plan Reporting (Business Intelligence) team at GAP Inc. Previously worked at CSC India Pvt Ltd for 11 years as a Software Developer on client projects including GAP Inc, eBay, UNUM, and Progressive Insurance.

● Valid U.S. B-1 visa holder until 2027.

● Onsite business visit in 2019 to the client office in California, USA, for business discussions/proposals.

● 11+ years of IT experience across the complete SDLC, including requirements gathering, system design, data modeling, implementation, configuration, testing, and delivery of software applications using DevOps and Agile methodologies.

● Relational database experience (MS SQL Server 2014), SSIS package creation, and a deep understanding of data-modeling concepts.

● Experience in Azure: Azure Data Factory, Logic Apps, and Azure Blob Storage.

● Highly familiar with supporting tools such as Jira, Postman, Confluence, GitHub, PagerDuty, ServiceNow, Nagios, and Fiddler.

● Strong knowledge of MS SQL Server: stored procedures and functions.

● Deep understanding of object-oriented programming.

● Strong performer in root cause analysis, providing technical leadership to improve software quality.

● Ensure quality assurance across all aspects of solutions, including establishing metrics, applying industry best practices, and developing new automation tools using Python.

● Experience in automating web testing and deployment through Jenkins CI/CD pipelines.

● Deep understanding of the retail domain.

● Quick learner, open to learning any technology.
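The Python automation-tooling work mentioned above can be sketched with a minimal, hypothetical example — the log format and metric names here are assumptions, not taken from the actual projects:

```python
import re
from collections import Counter

def build_log_metrics(log_lines):
    """Count ERROR/WARN occurrences in a build log -- the kind of simple
    quality metric a small Python automation tool might report."""
    counts = Counter()
    for line in log_lines:
        match = re.search(r"\b(ERROR|WARN)\b", line)
        if match:
            counts[match.group(1)] += 1
    return dict(counts)
```

A tool like this could, for instance, scan Jenkins console output and flag builds whose warning counts exceed an agreed threshold.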


Bachelor of Engineering, Kongu Engineering College (Bharathiar University), Coimbatore


.NET Framework 4.5/4.0/3.5/3.0,

Visual Studio 2017/2015/2013/2012,

C#, ASP.NET, XML, Web Services and MS SQL,

Azure Data Factory, Cube Reporting.


GAP Inc. PROJECT: PETE BI Integrator

The PETE BI Integrator application is one of the integral applications in the supply chain management of The Gap, Inc.'s retail space. The business objective of this project is to provide information about business volume, growth, and revenue across regions, countries, markets, and seasons for brands such as Gap, Old Navy, and Banana Republic. The application is built in C# and Java and runs as microservices, with Azure SQL DB as the backend. SSIS packages and ADF pipelines handle the ETL, a Jenkins CI/CD pipeline automates application deployment, and code is maintained in GitHub. Nagios, New Relic, PagerDuty, and Azure monitoring systems are used for monitoring and alerting. The application consumes REST APIs and uses RabbitMQ as its event-queue messaging system: microservices listen for events on RabbitMQ, process them by calling multiple REST API endpoints, transform the data, and write it to the Azure SQL database.
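The event-driven flow described above (queue message → REST enrichment → SQL write) can be sketched in Python. This is a minimal stand-in, not the production code: `queue.Queue` plays the role of RabbitMQ, a callback stands in for the REST API calls, and an in-memory SQLite database stands in for Azure SQL; the table and field names are hypothetical.

```python
import json
import queue
import sqlite3

def process_events(event_queue, fetch_details, conn):
    """Drain pending events, enrich each one via the lookup callback
    (the stand-in for the REST API calls), and persist the result."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS volume "
                "(region TEXT, brand TEXT, revenue REAL)")
    while not event_queue.empty():
        event = json.loads(event_queue.get())   # message off the queue
        details = fetch_details(event["id"])    # REST enrichment step
        cur.execute("INSERT INTO volume VALUES (?, ?, ?)",
                    (details["region"], details["brand"], details["revenue"]))
    conn.commit()

# Demo wiring: one event, a stubbed lookup, an in-memory database.
q = queue.Queue()
q.put(json.dumps({"id": 1}))
db = sqlite3.connect(":memory:")
process_events(q, lambda _id: {"region": "NA", "brand": "Gap", "revenue": 10.0}, db)
```

In the real system the loop would be a long-running RabbitMQ consumer rather than a drain-until-empty pass, but the shape — consume, enrich, transform, write — is the same.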

ROLE: Business Intelligence Engineer


● Maintain the applications on the .NET platform.

● Build ADF pipelines and SSIS packages for reporting and ETL.

● Perform operational DevOps duties for the team on a rotational basis.

● Participate in deployment activities during the respective windows.

● Create test cases and test plans to validate the applications, and document them in Confluence.

● Update the status of bugs and changes in Confluence and Jira.

● Perform fine-tuning and query optimization of the database as required by the application.

● Check ServiceNow tickets for this application and ensure resolution within the defined SLA.

● Conduct root cause analysis for assigned defects and issues.

● Automate monotonous and repetitive tasks.

● Provide post-implementation, application maintenance, and enhancement support to the client.


eBay PROJECT: Automotive Component Solution

The eBay Automotive component solution supports catalog acquisition by preparing fitment catalog data for automotive aftermarket parts and analyzing automotive catalog data.

ROLE: Software Engineer


● Managed the end-to-end data quality reporting process and quality auditing, including defining reports and metrics; continuously assessed data quality metrics and refined them as needed; refreshed and monitored the data quality dashboard.

● Conducted data cleansing on customer, vendor, and product master data.

● Reviewed the catalog for content quality against pre-defined guidelines and prepared SOPs.

● Gained hands-on experience with data processing/analysis technologies; maintained reports and dashboards to support the CRM and BI teams.

● Applied low-cost business solutions to complex problems.

● Used ADO.NET objects such as DataReader, DataSet, and DataAdapter for consistent access to the SQL data source.

● Used Jira for bug tracking.
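The data-quality metric work above can be illustrated with a small sketch; the field names are assumptions for illustration, not the actual eBay catalog metrics:

```python
def completeness(records, required_fields):
    """Fraction of records in which every required field is present and
    non-empty -- a basic metric for a data-quality dashboard."""
    if not records:
        return 0.0
    complete = sum(
        1 for rec in records
        if all(rec.get(field) not in (None, "") for field in required_fields)
    )
    return complete / len(records)

# Hypothetical master-data sample: one clean record, one with a blank SKU.
master_data = [
    {"sku": "A1", "brand": "Gap"},
    {"sku": "", "brand": "Gap"},
]
```

Metrics like this would be recomputed on each refresh and trended on the dashboard, with cleansing actions targeting the records that fail the check.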

Copyright© Cosette Network Private Limited. All Rights Reserved.