Sai is a Cloud Engineer with 10+ years of experience specializing in Azure Data solutions, including ETL and data warehousing. He has a proven track record in designing and implementing robust data pipelines and cloud-native solutions.
Designed and deployed Azure Data Factory pipelines for varied data sources, ensuring reliable data ingestion.
Engineered Azure Databricks notebooks for advanced data transformations, optimizing downstream analytics.
Implemented CI/CD pipelines using Azure DevOps, facilitating automated deployments and migrations.
Migrated on-premises SSIS packages to Azure Data Factory, modernizing ETL workflows.
Provided technical guidance to teams, ensuring successful project delivery and bug resolution.
Developed complex T-SQL Stored Procedures, Views, and Functions for comprehensive database management.
Overview: This project involved working for a multinational pharmaceutical company, focusing on data related to a safety bundle. Responsibilities: Involved in the design, development, and bug fixing phases of the project. Created Azure Data Factory pipelines to copy data from API sources to ADLS Gen2 in CSV format. Developed multiple Azure Databricks notebooks using PySpark and Pandas to extract and transform CSV data into delta tables. Loaded transformed data into a data warehouse using a medallion architecture and created views for reporting.
Key outcomes:
Successfully implemented API to ADLS data ingestion pipelines.
Transformed CSV data to delta tables for enhanced data analytics.
Deployed Azure services to higher environments via automated DevOps pipelines.
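The CSV-to-delta transformations above ran as PySpark notebooks on Azure Databricks. As a minimal stdlib-only sketch (no Spark, no Delta Lake), the bronze-to-silver cleaning step of a medallion architecture can be illustrated like this; the column names (`patient_id`, `event_date`) and cleaning rules are hypothetical, not the project's actual schema:

```python
import csv
import io

def clean_bronze_rows(raw_csv: str) -> list[dict]:
    """Simulate a bronze -> silver medallion step: parse raw CSV,
    drop rows missing the key column, and trim stray whitespace.
    In the real pipeline this logic ran in PySpark and wrote delta
    tables; column names here are illustrative only."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    silver = []
    for row in reader:
        if not row.get("patient_id"):      # reject rows without a key
            continue
        silver.append({k: (v or "").strip() for k, v in row.items()})
    return silver

raw = "patient_id,event_date\n 101 ,2023-01-05\n,2023-01-06\n102,2023-01-07\n"
rows = clean_bronze_rows(raw)  # keyless second row is dropped
```

The same shape scales directly to PySpark, where the filter and strip become `DataFrame` operations and the result is written with `format("delta")`.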
Overview: This project supported a multinational pharmaceutical company, focusing on data-related operations and system enhancements. Responsibilities: Involved in the development and bug fixing phases. Created data domain provisioning for Azure Data Factory, Azure Databricks, and SQL Server in Azure DevOps via PowerShell. Applied ACLs to containers and folders in ADLS Gen2 to manage data access.
Key outcomes:
Implemented data domain provisioning for key Azure services.
Enhanced data security by applying ACLs in ADLS Gen2.
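ADLS Gen2 access control uses POSIX-style ACL entries of the form `[default:]type:principal:permissions`, passed as a comma-separated string to tooling such as the Azure CLI (`az storage fs access set --acl ...`). A small sketch of assembling such a spec; the AAD object ID below is a placeholder, not a real principal:

```python
def build_acl_spec(entries) -> str:
    """Build the comma-separated POSIX ACL string accepted by ADLS
    Gen2 tooling. Each entry is (is_default, ace_type, principal, perms);
    an empty principal refers to the owning user/group."""
    parts = []
    for is_default, ace_type, principal, perms in entries:
        ace = f"{ace_type}:{principal}:{perms}"
        if is_default:
            ace = "default:" + ace      # inherited by new child items
        parts.append(ace)
    return ",".join(parts)

acl = build_acl_spec([
    (False, "user", "", "rwx"),    # owning user
    (False, "group", "", "r-x"),   # owning group
    # placeholder AAD object id, granted read/execute now and by default:
    (False, "user", "00000000-0000-0000-0000-000000000000", "r-x"),
    (True,  "user", "00000000-0000-0000-0000-000000000000", "r-x"),
    (False, "other", "", "---"),
])
```

In the project this provisioning was scripted in PowerShell inside Azure DevOps; the ACL string format is the same regardless of the client used to apply it.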
Overview: This project involved a technology refresh for a multinational pharmaceutical company, focusing on updating and enhancing data systems. Responsibilities: Involved in the design, development, and bug fixing phases. Created Azure Data Factory pipelines to copy data from on-premises SQL Server to ADLS Gen2 in CSV format.
Key outcomes:
Successfully migrated on-premises SQL Server data to Azure Data Lake Storage Gen2.
Enhanced reporting capabilities through delta table views.
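Daily copies from on-premises SQL Server typically land in ADLS Gen2 under date-partitioned folders so downstream delta loads can pick up one day's extract at a time. A sketch of that path convention; the container and folder layout are illustrative assumptions, not the project's actual naming:

```python
from datetime import date

def adls_csv_path(container: str, table: str, run_date: date) -> str:
    """Build a date-partitioned ADLS Gen2 path for a daily CSV
    extract. The raw/year=/month=/day= layout is a common lake
    convention, assumed here for illustration."""
    return (
        f"{container}/raw/{table}/"
        f"year={run_date.year:04d}/month={run_date.month:02d}/"
        f"day={run_date.day:02d}/{table}.csv"
    )

path = adls_csv_path("datalake", "dbo_orders", date(2023, 4, 9))
```

In Azure Data Factory the same effect is achieved with dynamic content expressions in the sink dataset's folder path, parameterized by the trigger's scheduled time.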
Overview: This project focused on business intelligence for an insurance company offering various specialty and niche-market insurance products. Responsibilities: Interacted with onsite personnel for requirements clarification and analyzed functional specifications with managers and leads.
Key outcomes:
Designed and implemented a robust BI solution on SQL Server 2014.
Automated data export/import and warehousing processes using SSIS packages.
Overview: This project supported Metagenics, a nutrigenomic data company, focusing on Azure Data Factory pipelines that move data from on-premises sources into the data warehouse.
Key outcomes:
Established a multi-stage data ingestion pipeline from on-premises to DWH using Azure Data Factory.
Ensured data quality and integrity through comprehensive unit testing of ADF pipelines.
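Unit testing of ADF pipeline output usually boils down to schema and row-count checks on the landed files. A minimal sketch of such a check; the expected columns and thresholds are hypothetical:

```python
import csv
import io

def validate_extract(raw_csv: str, expected_columns: list[str],
                     min_rows: int) -> list[str]:
    """Lightweight data-quality checks of the kind applied to a
    pipeline landing file: verify the header matches the expected
    schema and the file is not unexpectedly small. Returns a list
    of failure messages (empty list means the file passed)."""
    reader = csv.reader(io.StringIO(raw_csv))
    header = next(reader, [])
    failures = []
    if header != expected_columns:
        failures.append(f"schema mismatch: {header}")
    row_count = sum(1 for _ in reader)
    if row_count < min_rows:
        failures.append(f"too few rows: {row_count} < {min_rows}")
    return failures

ok = validate_extract("sku,qty\nA,1\nB,2\n", ["sku", "qty"], 1)   # passes
bad = validate_extract("sku\nA\n", ["sku", "qty"], 5)             # fails both checks
```

Checks like these can run as a lightweight validation notebook step after each ingestion stage, failing the pipeline run before bad data reaches the warehouse.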
SAI NAG
Azure Lead