As a Data Engineer, I specialize in building data infrastructure that drives analytics and business decision-making. With hands-on experience in Python, SQL, Airflow, and Databricks, I create robust ETL pipelines and ML workflows that scale with organizational needs.
To engineer data ecosystems that turn raw data into reliable insights and drive business transformation through automation and modern cloud tooling.
To deliver efficient, secure, and scalable data solutions that enable organizations to extract maximum value from their data.
Building scalable ETL workflows using Apache Airflow, dbt, Spark, and Databricks for fast, reliable data processing.
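To illustrate the shape of such an ETL workflow, here is a minimal, self-contained sketch of extract/transform/load steps in plain Python. The function names and sample records are hypothetical; in a production setting each step would typically be wrapped as an Airflow task:

```python
import json

# Hypothetical raw records standing in for a source extract (API, S3, database).
RAW_RECORDS = [
    {"order_id": 1, "amount": "19.99", "region": "us-east"},
    {"order_id": 2, "amount": "5.50", "region": "eu-west"},
    {"order_id": 2, "amount": "5.50", "region": "eu-west"},  # duplicate row
]

def extract():
    # In a real pipeline this would pull from an external system.
    return list(RAW_RECORDS)

def transform(records):
    # Deduplicate on order_id and cast string amounts to floats.
    seen, cleaned = set(), []
    for rec in records:
        if rec["order_id"] in seen:
            continue
        seen.add(rec["order_id"])
        cleaned.append({**rec, "amount": float(rec["amount"])})
    return cleaned

def load(records):
    # Stand-in for a warehouse write; here we just serialize the result.
    return json.dumps(records)

result = load(transform(extract()))
```

Splitting the pipeline into small, pure functions like this is what makes it easy to schedule, retry, and test each stage independently in an orchestrator.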
Expertise in SQL, Redshift, BigQuery, Snowflake, and PostgreSQL for designing schema models and optimizing queries.
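As a small, hedged example of query optimization, the snippet below uses Python's built-in `sqlite3` (standing in for a warehouse engine; the table and index names are hypothetical) to show how an index on a filter column changes the query plan from a full scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("us-east", 10.0), ("eu-west", 20.0), ("us-east", 5.0)],
)

# An index on the filter column lets the planner avoid a full table scan.
conn.execute("CREATE INDEX idx_orders_region ON orders(region)")

# Inspect the plan: SQLite reports a SEARCH using the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE region = ?",
    ("us-east",),
).fetchall()

total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = ?", ("us-east",)
).fetchone()[0]
```

The same habit — checking the plan before and after adding an index — carries over directly to `EXPLAIN` in PostgreSQL, Redshift, BigQuery, and Snowflake.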
Automating ML deployment with MLflow, CI/CD tools, and Terraform, enabling efficient model delivery in cloud environments.
Designing insightful dashboards in Power BI and Tableau using DAX and row-level security to drive business decisions.
Using Docker, Jenkins, GitHub Actions, and Terraform to automate infrastructure provisioning and monitor data pipelines.
Working with AWS, Azure, and GCP to build, scale, and secure data workflows using Glue, Synapse, Lambda, and BigQuery.
Built regression models using NumPy and Pandas to predict housing prices based on features like lot size, cooling, and bedrooms. Cleaned 10K+ rows, compared RMSE scores of tree-based vs. linear models, and visualized trends using Matplotlib.
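The model comparison described above can be sketched as follows. This is an illustrative example on synthetic data (the feature names and coefficients are invented, not the project's actual dataset), comparing the RMSE of a linear model against a tree-based one with scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the housing dataset: lot size, bedrooms, cooling flag.
rng = np.random.default_rng(0)
n = 500
lot_size = rng.uniform(2000, 10000, n)
bedrooms = rng.integers(1, 6, n)
has_cooling = rng.integers(0, 2, n)
price = 50 * lot_size + 20000 * bedrooms + 15000 * has_cooling + rng.normal(0, 10000, n)
X = np.column_stack([lot_size, bedrooms, has_cooling])

X_train, X_test, y_train, y_test = train_test_split(X, price, random_state=0)

# Fit both model families and compare held-out RMSE.
rmses = {}
for name, model in [("linear", LinearRegression()),
                    ("tree", DecisionTreeRegressor(random_state=0))]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmses[name] = mean_squared_error(y_test, pred) ** 0.5
```

On data with a genuinely linear price relationship, as here, the linear model tends to generalize better, while an unpruned tree overfits the noise; plotting predictions against actuals with Matplotlib makes that gap visible.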
Developed a React Native mobile application with features like image upload, swipe compare, login control, and real-time form updates using Redux. Delivered store-ready builds and resolved 20+ permission and routing issues across 10+ screens.