I'm the Azure Data Engineer behind robust data pipelines: the one who turns complex datasets into actionable strategies. Over nearly five years at Deloitte and Celebal Technologies, I've designed and delivered end-to-end pipelines across the auto-finance, real-estate, manufacturing, and automotive sectors. My toolkit spans Azure Databricks, Data Factory, Synapse Analytics, Delta Lake, and PySpark; along the way I've migrated 2,500+ tables to modern Lakehouse architectures, cut runtimes by 65%, and shipped with zero data loss.

From enterprise migrations at Deloitte to multi-client delivery at Celebal Technologies.
Deep expertise across the Microsoft Azure data ecosystem.
Real-world data pipelines I've designed, built, and shipped to production.
Led the migration of 2,000+ production tables from a legacy Informatica ETL system to Azure Databricks, modernizing the entire data platform. Designed end-to-end workflows and job orchestration using notebooks, tasks, and job clusters to automate scheduled data loads. Built a dynamic, reusable loading framework handling SCD Type-1/2, Truncate Load, and Append Load — reducing per-table build time by ~70%. Analyzed complex business specifications and converted them into efficient SQL scripts. Applied Delta Lake optimizations (Z-ordering, OPTIMIZE, partition pruning, caching) to meet aggressive SLA windows.
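The core of the reusable loading framework above is the SCD Type-2 pattern: when a tracked attribute changes, the current dimension row is expired and a new current version is appended. In Databricks this is a Delta Lake MERGE; here is a minimal Spark-free sketch of the same logic over lists of dicts (the column names and `is_current`/`start_date`/`end_date` fields are illustrative, not the framework's actual schema):

```python
from datetime import date

def scd2_merge(dim_rows, source_rows, key, tracked_cols, today=None):
    """Minimal SCD Type-2 merge: expire changed current rows and
    append new current versions. Plain-Python illustration only."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)  # note: shallow copy; expired rows are mutated in place
    for src in source_rows:
        cur = current.get(src[key])
        if cur and all(cur[c] == src[c] for c in tracked_cols):
            continue  # unchanged row: nothing to do
        if cur:  # attribute changed: close out the current version
            cur["is_current"] = False
            cur["end_date"] = today
        # append the new current version (covers both changed and brand-new keys)
        out.append({key: src[key], **{c: src[c] for c in tracked_cols},
                    "start_date": today, "end_date": None, "is_current": True})
    return out
```

SCD Type-1 is the degenerate case (overwrite in place, no history rows), which is why one parameterized framework can serve both.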
Configured Synapse Link to establish real-time data ingestion from Dynamics 365 Finance & Operations into ADLS in Delta format — replacing overnight batch processing entirely. Led the migration of 500+ tables ensuring 100% data integrity throughout. Designed and implemented Delta Live Tables pipelines in Azure Databricks, building scalable Bronze, Silver, and Gold layers. Implemented complex transformations and quality enforcement in the Silver layer for data consistency. Engineered 200+ curated Gold views for Power BI reporting, applying advanced query optimization that reduced dashboard load times by ~40%.
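The Bronze/Silver/Gold flow described above can be sketched as three pure functions: land raw records, enforce quality and types, then aggregate for reporting. This is a toy stand-in for the actual Delta Live Tables pipelines (the `invoice_id`/`customer`/`amount` fields and the quality rules are hypothetical, not the D365 F&O schema):

```python
def bronze(raw_records):
    """Bronze layer: land records as-is, tagging ingestion metadata."""
    return [{**r, "_ingested": True} for r in raw_records]

def silver(bronze_rows):
    """Silver layer: enforce quality rules (drop rows missing the
    business key) and normalize types. Rules here are illustrative."""
    return [{**r, "amount": float(r["amount"])}
            for r in bronze_rows
            if r.get("invoice_id") is not None]

def gold(silver_rows):
    """Gold layer: a curated aggregate of the kind a Power BI view
    would read, here totals per customer."""
    totals = {}
    for r in silver_rows:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals
```

In DLT the same shape is expressed declaratively, with expectations (`@dlt.expect`-style rules) playing the role of the Silver filter.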
Developed an end-to-end data engineering solution using Delta Live Tables on Databricks to process and optimize data pipelines for manufacturing analytics. Ingested high-frequency sensor data from ADLS into a Delta Bronze table using Auto Loader with schema evolution support — handling unpredictable formats without pipeline interruptions. Designed Silver-layer transformations for unit normalization, outlier handling, and business rule enforcement. Collaborated with the client to define Gold table requirements, delivering actionable insights that enabled faster decision-making across production lines.
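A Silver-layer pass like the one above typically normalizes units and neutralizes outliers without dropping rows, so downstream record counts stay stable. A minimal sketch, assuming a hypothetical Fahrenheit-to-Celsius rule and a plausible sensor range (neither taken from the actual project):

```python
def normalize_and_filter(readings, low, high):
    """Normalize sensor rows to Celsius and null out out-of-range
    values rather than dropping the row. Unit rule and bounds are
    illustrative assumptions."""
    out = []
    for r in readings:
        value = r["value"]
        if r.get("unit") == "F":
            value = (value - 32) * 5.0 / 9.0  # Fahrenheit -> Celsius
        out.append({**r, "unit": "C",
                    "value": value if low <= value <= high else None})
    return out
```

Keeping outliers as nulls instead of deleting rows is a common choice when Gold-layer metrics need accurate event counts alongside clean averages.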
Created fully automated ADF pipelines with parameterized designs supporting multiple load types — full load, delta load, truncate and reload. Implemented scheduled triggers to refresh updated data into tables at defined intervals. Integrated Azure Logic Apps for automated email notifications on pipeline failures, cutting mean time to resolution by ~50%. Built SQL Server views and stored procedures per business requirements, enabling BI teams to access production data directly without pipeline dependency — unlocking faster reporting and analysis cycles.
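The parameterized design above boils down to one pipeline dispatching on a load-type parameter per table. Here is a dict-based sketch of that dispatch, standing in for the real ADF activities and SQL (the `load_type` names echo the text; the `id`-based delta detection is an illustrative assumption):

```python
def run_load(table, load_type, target, source):
    """One parameterized load, mirroring an ADF pipeline driven by a
    load_type parameter. `target` is a dict standing in for the sink."""
    if load_type in ("full", "truncate_reload"):
        # both overwrite the sink; in ADF they differ only in whether
        # an explicit TRUNCATE precedes the copy activity
        target[table] = list(source)
    elif load_type == "delta":
        existing = {r["id"] for r in target.get(table, [])}
        target.setdefault(table, []).extend(
            r for r in source if r["id"] not in existing)
    else:
        raise ValueError(f"unknown load_type: {load_type}")
    return target[table]
```

Because the load type is just a parameter, one pipeline definition covers every table in the control list, which is what makes the scheduled-trigger setup maintainable.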
Validated expertise across Microsoft Azure and Databricks ecosystems.