I am building my career in Data Engineering, backed by a strong background in finance & operations.
I focus on creating well-structured, reliable data workflows with Python, SQL, and cloud services.
Data Engineering
- ETL / ELT workflows
- Data cleaning & transformation
- Working with APIs
- Data modelling (star schema, dimensional approach)
Programming & Tools
- Python (pandas, PySpark)
- SQL (PostgreSQL)
- FastAPI basics
- Git & GitHub Actions
Cloud & Infrastructure
- AWS: S3, Lambda, IAM
- Serverless workflows
- Infrastructure as Code (Terraform)
Current Focus
- Building cloud-based ETL pipelines
- Strengthening Spark/PySpark skills
- Deploying small serverless data workflows
- Improving data modelling and architecture fundamentals
- Developing practical skills in machine learning (classification pipelines, embeddings)
- Exploring modern AI tools and LLM-based workflows to enhance data engineering solutions
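
The ETL workflows listed above can be sketched in miniature — a minimal, stdlib-only example of extract / transform / load, where the payload, field names, and `orders` table are all hypothetical stand-ins for a real API response and warehouse:

```python
import json
import sqlite3

# Extract: a hypothetical raw API payload -- in a real pipeline this
# would come from an HTTP call or an S3 object, not an inline string.
raw = json.loads("""[
    {"order_id": "A-1", "amount": "19.99", "currency": "gbp"},
    {"order_id": "A-2", "amount": "5.00",  "currency": "GBP"},
    {"order_id": "A-2", "amount": "5.00",  "currency": "GBP"}
]""")

def transform(records):
    """Clean and deduplicate: cast amounts to float, normalise currency codes."""
    seen, cleaned = set(), []
    for r in records:
        key = r["order_id"]
        if key in seen:
            continue  # drop duplicate orders
        seen.add(key)
        cleaned.append({
            "order_id": key,
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),
        })
    return cleaned

def load(records, conn):
    """Load: write cleaned rows into a SQLite table via named parameters."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id TEXT PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :currency)", records
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

The same extract/transform/load shape scales up by swapping the pieces: pandas or PySpark for the transform, S3 and Lambda for the extract, and a warehouse such as PostgreSQL for the load.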
I previously worked in finance & operations in the e-commerce sector, where I developed strong analytical skills, a solid understanding of business processes, and experience with data-driven decision making.
That background now supports my transition into Data Engineering.
I am currently studying at Northcoders (Data Engineering Bootcamp), developing cloud-first and data-focused engineering skills.
“Decode the system. Understand the pattern. Engineer the solution.”
— yanade
