parasgautam33@gmail.com

Hello, my name is

Paras Gautam.

I build data architectures that scale.

Senior Data Engineer with 7+ years of experience designing, building, and optimizing data pipelines across healthcare and banking sectors. Currently architecting modern data platforms with Databricks, Azure, and Delta Lake.

7+ Years Experience
4 Companies
30% Performance Gains
5x Report Efficiency
pipeline_status.sh
$ spark-submit --master yarn etl_pipeline.py
[INFO] Connecting to Azure Data Lake...
[OK] Bronze layer ingestion complete
[OK] Silver layer transformation done
[OK] Gold layer aggregation finished
[INFO] Data quality checks: ALL PASSED
[INFO] Pipeline completed in 2.4 min
$

01. About Me

I'm a data engineer who thrives on transforming raw, chaotic data into reliable, high-performance systems. My journey started in 2018 with banking data migrations, where I learned the critical importance of data integrity during high-stakes system transitions.

Over the years, I've evolved from migration projects to building end-to-end data platforms. At Abacus Insights, I led the migration of healthcare data to the Databricks Medallion architecture for BCBSMA, managing complex Master Data Management initiatives that enhanced data governance across the organization.

Currently at Insight Workshop, I architect modern data platforms using Azure Data Lake, Databricks, and Delta Lake -- building resilient ETL/ELT pipelines with CDC-based ingestion, schema evolution, and comprehensive data quality frameworks.
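
To give a flavor of what that medallion-style layering looks like in practice, here is a minimal PySpark sketch of a bronze -> silver -> gold flow. The table and column names are illustrative placeholders, not a client schema; the real pipelines layer CDC handling, schema evolution, and automated quality checks on top of this basic shape.

# Minimal medallion layering sketch (bronze -> silver -> gold).
# Table and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw records landed as-is.
bronze = spark.table("lakehouse.bronze.claims")

# Silver: typed, deduplicated, quality-filtered records.
silver = (
    bronze
    .filter(F.col("claim_id").isNotNull())                  # basic quality gate
    .withColumn("service_date", F.to_date("service_date"))  # enforce types
    .dropDuplicates(["claim_id"])
)
silver.write.mode("overwrite").saveAsTable("lakehouse.silver.claims")

# Gold: business-level aggregate ready for reporting.
gold = (
    silver
    .groupBy("member_id", F.date_trunc("month", "service_date").alias("month"))
    .agg(F.sum("paid_amount").alias("total_paid"),
         F.count("claim_id").alias("claim_count"))
)
gold.write.mode("overwrite").saveAsTable("lakehouse.gold.member_monthly_spend")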

Beyond engineering, I'm passionate about mentoring. I've trained and guided junior engineers, helping them grow into confident data professionals. I believe the best data systems are built by strong, well-supported teams.

Paras Gautam
Kirtipur, Kathmandu
+977 9843078022
M.S. Data Science

02. Experience

Senior Data Engineer @ Insight Workshop

2025 - Present

  • Architect and maintain a modern data platform using Azure Data Lake, Azure Data Factory, Databricks, and Delta Lake.
  • Build and orchestrate ETL/ELT pipelines using Databricks Lakeflow (DLT) for both batch and streaming data (a simplified sketch follows this list).
  • Implement Unity Catalog and fine-grained access control to ensure secure and governed data access.
  • Collaborate with data analysts, scientists, and business stakeholders to design reusable data models and high-performance tables.
  • Enable CDC-based ingestion, schema evolution, and data quality checks for resilient data workflows.
  • Monitor and optimize compute resources for cost-effective and high-throughput processing.
  • Drive best practices around data versioning, lineage, and CI/CD for data pipelines using Git-integrated workflows.
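
As a concrete, deliberately simplified illustration of the Lakeflow/DLT pattern referenced above, the sketch below combines Auto Loader ingestion, expectation-based data quality checks, and CDC into a silver table via apply_changes. The dataset, paths, and column names are hypothetical, not the actual platform configuration.

# Hypothetical Delta Live Tables (Lakeflow) pipeline sketch:
# bronze ingestion -> expectations -> CDC merge into a silver table.
# Runs inside a DLT pipeline, where `spark` is provided by the runtime.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="members_bronze", comment="Raw member change feed (illustrative).")
def members_bronze():
    return (
        spark.readStream.format("cloudFiles")                       # Auto Loader
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaEvolutionMode", "addNewColumns")  # schema evolution
        .load("/Volumes/raw/members/")                              # placeholder path
        .withColumn("_ingested_at", F.current_timestamp())
    )

# Quality gate: drop rows without a key, warn on missing effective dates.
@dlt.view(name="members_clean")
@dlt.expect_or_drop("valid_member_id", "member_id IS NOT NULL")
@dlt.expect("has_effective_date", "effective_date IS NOT NULL")
def members_clean():
    return dlt.read_stream("members_bronze")

# Silver target kept in sync with the change feed (SCD type 1).
dlt.create_streaming_table("members_silver")

dlt.apply_changes(
    target="members_silver",
    source="members_clean",
    keys=["member_id"],
    sequence_by=F.col("_ingested_at"),
    stored_as_scd_type=1,
)

Expectations let bad records be dropped or merely flagged without hand-rolled validation code, and apply_changes keeps the silver table current as the change feed evolves.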

Senior Data Engineer @ Abacus Insights Nepal

2022 - 2025

  • Led the development and migration of the BCBSMA client's data to the Databricks Medallion architecture, ensuring a seamless transition and high data integrity.
  • Directed multiple Master Data Management projects focused on data accuracy and governance, improving overall efficiency across systems.
  • Spearheaded initiatives to optimize SQL queries, resulting in a 30% increase in application performance and reduced processing times (a generic tuning sketch follows this list).
  • Improved team productivity by 25% through automation of deployment processes, significantly reducing manual errors.
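
For a flavor of the kind of tuning behind that 30% figure, here is a generic Databricks sketch of two common techniques: file compaction with Z-ordering and partition-pruning-friendly filters. The table and column names are placeholders, not the actual client workloads.

# Generic Delta/Databricks tuning sketch -- illustrative names only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows by a frequent filter column,
# so lookups and range scans touch far fewer files.
spark.sql("OPTIMIZE silver.claims ZORDER BY (member_id)")

# Keep the partition column directly in the WHERE clause so the engine can
# prune partitions instead of scanning the whole table.
report = spark.sql("""
    SELECT member_id, SUM(paid_amount) AS total_paid
    FROM   silver.claims
    WHERE  service_date >= '2024-01-01'   -- partition column, prunable
    GROUP  BY member_id
""")
report.explain()   # confirm the partition filter is pushed down in the plan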

Senior Data Engineer @ eXtenso Data

2020 - 2022

  • Developed a robust data pipeline from client sources to eXtenso's data warehouse, tripling the number of reports delivered to premium clients such as Fonepay and eSewa.
  • Created dimensional tables for reporting, simplifying processes and improving report generation efficiency fivefold (a simplified sketch follows this list).
  • Trained and mentored five junior staff members, fostering skills development and enhancing team capabilities.
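
Here is a simplified PySpark sketch of that dimensional pattern: a conformed dimension with surrogate keys and a slim fact table that reports join against. The names are illustrative, not the actual Fonepay or eSewa reporting schemas.

# Hypothetical dimensional-model sketch -- illustrative table names only.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

transactions = spark.table("staging.transactions")   # placeholder source feed

# Dimension: one row per merchant, with a deterministic surrogate key.
# A global window is acceptable here because the dimension is small.
dim_merchant = (
    transactions
    .select("merchant_code", "merchant_name", "merchant_category")
    .dropDuplicates(["merchant_code"])
    .withColumn("merchant_key",
                F.row_number().over(Window.orderBy("merchant_code")))
)

# Fact: keys and measures only, so reports join on integers instead of
# re-deriving merchant attributes from the raw feed every time.
fact_payments = (
    transactions
    .join(dim_merchant.select("merchant_code", "merchant_key"), "merchant_code")
    .select("merchant_key", "transaction_date", "amount", "status")
)

dim_merchant.write.mode("overwrite").saveAsTable("warehouse.dim_merchant")
fact_payments.write.mode("overwrite").saveAsTable("warehouse.fact_payments")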

Migration Lead @ CAS Trading House

2018 - 2020

  • Led data migration during the implementation of the Finacle system for Citizen International Bank, ensuring timely and successful project delivery.
  • Collaborated with teams from Global IME Bank Ltd. and Janata Bank Ltd. to prepare ETL processes for data merging, streamlining bank operations (a simplified reconciliation sketch follows this list).
  • Acted as migration lead for the merger of Kumari Bank Ltd. and Deva Bikas Bank Ltd., ensuring a smooth transition for all stakeholders.
  • Customized Finacle using scripting and Jasper reporting to meet client specifications.
  • Developed analytical BI reports incorporating big data solutions that improved decision-making processes across departments.
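
Migrations like these live or die on reconciliation. Below is a simplified PySpark sketch of the kind of source-to-target checks involved: row counts, balance totals, and key-level diffs. Table names are placeholders, not the actual Finacle or core-banking schemas.

# Simplified post-migration reconciliation sketch -- placeholder table names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

source = spark.table("legacy.accounts")      # extracted from the legacy core
target = spark.table("migrated.accounts")    # loaded into the new system

# 1. Row-count parity.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# 2. Balance totals must match exactly.
src_total = source.agg(F.sum("balance")).first()[0]
tgt_total = target.agg(F.sum("balance")).first()[0]
assert src_total == tgt_total, f"Balance drift: {src_total} vs {tgt_total}"

# 3. Key-level diff: accounts present on only one side.
missing = source.join(target, "account_no", "left_anti")
extra = target.join(source, "account_no", "left_anti")
print(f"Missing in target: {missing.count()}, unexpected in target: {extra.count()}")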

03. Technical Skills

Data Engineering

SQL · Python · PySpark · ETL/ELT · Delta Live Tables

Cloud & Big Data

Azure Data Lake · Azure Data Factory · Databricks · Unity Catalog · AWS · Snowflake

Data Visualization

Power BI · Databricks Dashboards · BigQuery

Architecture & Tools

Apache Kafka · Apache Airflow · Git · CI/CD Pipelines

Databases

Delta Lake · MySQL · PostgreSQL · ClickHouse

Methodologies

Agile · Data Governance · Master Data Management · Data Migration · CDC · Schema Evolution · Query Optimization

04. Featured Projects

05. Education

Master's in Data Science & Computational Intelligence

Softwerica College of Ecommerce (Coventry University, UK)

2025

Bachelor's in Computer Science & Information Technology

New Summit College

2018

06. Blog

Thoughts on data engineering, architecture patterns, and lessons from the field.

Architecture Dec 15, 2025

Understanding the Medallion Architecture: Bronze, Silver, Gold

A deep dive into the Medallion architecture pattern used in modern data lakehouses, with practical implementation examples using Databricks and Delta Lake.

Read Article →
Data Pipelines Nov 28, 2025

Building Resilient CDC Pipelines with Delta Live Tables

How to implement Change Data Capture pipelines using Databricks DLT, with schema evolution and data quality checks for production-grade workflows.

Read Article →
Migration Oct 10, 2025

Lessons from Large-Scale Banking Data Migrations

Key lessons and strategies from leading data migration projects across multiple banks, covering ETL design, data reconciliation, and stakeholder management.

Read Article →
Performance Sep 5, 2025

SQL Query Optimization: Achieving 30% Performance Gains

Practical techniques for optimizing SQL queries in data-heavy environments, including indexing strategies, query planning, and partition pruning in Databricks.

Read Article →
Governance Aug 18, 2025

Data Governance with Unity Catalog: A Practical Guide

How to implement fine-grained access control, data lineage tracking, and governance policies using Databricks Unity Catalog in enterprise environments.

Read Article →
MDM Jul 3, 2025

Master Data Management in Healthcare: Challenges and Solutions

Navigating the complexities of MDM in healthcare data -- from patient record matching and deduplication to building a single source of truth across systems.

Read Article →

07. Get In Touch

I'm always open to discussing data engineering challenges, architecture design, or potential collaboration opportunities. Whether you have a project in mind or just want to connect, feel free to reach out.