Available for new opportunities

Senior Data Engineer & Platform Architect

Building scalable data platforms and engineering solutions with 10+ years of experience. Specializing in cloud architecture, ETL pipelines, and Medallion data architectures.

Arnob Kumar Dey - Senior Data Engineer
// About Me

Crafting Data Solutions for Enterprise Scale

With over 10 years of experience in data engineering and platform architecture, I've helped Fortune 500 companies transform their data infrastructure. My expertise ranges from designing robust ETL pipelines to implementing enterprise-wide data governance frameworks.

I hold an M.Tech from BITS Pilani and have developed deep expertise in cloud-native data architectures, particularly the Medallion Architecture pattern, which enables reliable, scalable, and maintainable data lakes.
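
To make the pattern concrete, here is a minimal PySpark sketch of a bronze → silver → gold flow. The lake paths, table names, and columns are purely illustrative, and Delta Lake is assumed to be available on the cluster.

# Minimal Medallion-style flow in PySpark; paths and schema are illustrative only,
# and writing Delta tables assumes the delta-spark package is configured.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw source data as-is, preserving fidelity for replay and audit
bronze = spark.read.json("s3://example-lake/raw/orders/")
bronze.write.format("delta").mode("append").save("s3://example-lake/bronze/orders")

# Silver: cleanse and conform (typed columns, deduplication, basic filtering)
silver = (
    spark.read.format("delta").load("s3://example-lake/bronze/orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("s3://example-lake/silver/orders")

# Gold: business-level aggregates ready for reporting and ML features
gold = silver.groupBy("customer_id").agg(F.sum("amount").alias("lifetime_value"))
gold.write.format("delta").mode("overwrite").save("s3://example-lake/gold/customer_value")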

10+ Years Experience
50+ Projects Delivered
5+ Fortune 500 Clients
// Technical Skills

My Tech Stack

Expertise in modern data engineering tools and cloud platforms

AWS
Python
SQL
Terraform
Data Engineering
ETL
Data Pipeline
PySpark
Medallion Architecture
Databricks
// Case Studies

Featured Projects

Enterprise-scale data solutions that drive business value

Data Modernization · Finance

Recall Modernization & Data Mart Development

S&P Global

A legacy system hosted on VMware needed modernization for better scalability. Migrated the end-to-end system to a managed AWS environment and architected a centralized "Recall Data Mart".

Migrated legacy VMware system to managed AWS
Architected centralized Recall Data Mart
Optimized query performance with AWS Glue & Apache Iceberg (see the sketch below)
AWS Glue · Apache Iceberg · Data Mart
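
For context, a minimal sketch of the write pattern such a mart can build on, assuming a Glue 4.0 Spark job with the Iceberg and Glue Data Catalog configuration already set on the job; the bucket, database, and table names are hypothetical.

# Sketch of an AWS Glue Spark job writing a curated dataset as an Apache Iceberg table.
# Assumes the job enables Iceberg (e.g. --datalake-formats iceberg) and that a Spark
# catalog named "glue_catalog" is configured to point at the Glue Data Catalog.
import sys
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session

# Read staged recall data and publish it as an Iceberg table, which gives the mart
# ACID writes, schema evolution, and partition pruning for faster queries.
recalls = spark.read.parquet("s3://example-bucket/staging/recalls/")
(recalls.writeTo("glue_catalog.recall_mart.recalls")
        .using("iceberg")
        .createOrReplace())
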
Cloud-Native · Aviation

OE Scheduling Optimizer

Delta Airlines

Manual pilot scheduling needed automation to ensure compliance with strict FAA regulations. Led the design of a cloud-native scheduling optimizer using AWS CDK, with Lambda for orchestration and SageMaker for model deployment (sketched below).

Automated pilot scheduling with FAA compliance
Cloud-native architecture with AWS CDK
ML model deployment via SageMaker
AWS CDK · Lambda · SageMaker
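
A minimal AWS CDK (v2, Python) sketch of what such an orchestration layer can look like; the stack, handler, schedule, and endpoint names are hypothetical rather than the production design.

# Hypothetical CDK stack: a Lambda orchestrator that calls a SageMaker endpoint on a schedule.
from aws_cdk import App, Duration, Stack
from aws_cdk import aws_events as events
from aws_cdk import aws_events_targets as targets
from aws_cdk import aws_lambda as _lambda
from constructs import Construct

class SchedulingOptimizerStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Lambda that gathers crew data, invokes the SageMaker endpoint, and stores schedules
        orchestrator = _lambda.Function(
            self, "ScheduleOrchestrator",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="orchestrator.handler",
            code=_lambda.Code.from_asset("lambda"),
            timeout=Duration.minutes(5),
            environment={"SAGEMAKER_ENDPOINT": "pilot-scheduling-optimizer"},  # hypothetical endpoint name
        )

        # Trigger the optimizer nightly so schedules are refreshed ahead of crew assignment
        events.Rule(
            self, "NightlyRun",
            schedule=events.Schedule.cron(minute="0", hour="2"),
            targets=[targets.LambdaFunction(orchestrator)],
        )

app = App()
SchedulingOptimizerStack(app, "SchedulingOptimizerStack")
app.synth()
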
Data Quality · Healthcare

Data Validation Framework

Internal Project

Designed a custom framework using Great Expectations to enforce data quality standards across pipelines handling FHIR healthcare data, ensuring compliance and reliability (sketched below).

Custom Great Expectations framework
FHIR healthcare data compliance
Automated data quality enforcement
Great Expectations · FHIR · Data Quality
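
A minimal sketch of the validation idea using Great Expectations' classic pandas API (the v0.x "from_pandas" style); the flattened FHIR Patient fields and values are illustrative, not the framework's actual suite.

# Hypothetical example: validating a flattened FHIR Patient extract with Great Expectations.
import pandas as pd
import great_expectations as ge

patients = pd.DataFrame({
    "resource_id": ["pat-001", "pat-002"],
    "gender": ["female", "male"],
    "birth_date": ["1984-02-17", "1979-11-03"],
})

df = ge.from_pandas(patients)
df.expect_column_values_to_not_be_null("resource_id")
df.expect_column_values_to_be_unique("resource_id")
# FHIR administrative-gender value set
df.expect_column_values_to_be_in_set("gender", ["male", "female", "other", "unknown"])
df.expect_column_values_to_match_strftime_format("birth_date", "%Y-%m-%d")

results = df.validate()
print(results.success)  # gate the pipeline on this flag
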
// Credentials

Professional Certifications

Verified expertise in cloud and data engineering technologies

AWS Certified Developer Associate

Expertise in AWS services, deployment, and security best practices

Databricks Certified Data Engineer Associate

Proficiency in Databricks, Apache Spark, and lakehouse architecture

// Get In Touch

Let's Connect

Have a project in mind or want to discuss opportunities? I'd love to hear from you.