// BUILDING RESILIENT DATA ECOSYSTEMS

anwesha@macbook — zsh — 80×24

About

My name is Anwesha Chakraborty, and I am a Lead Data Engineer / Data Architect

Results-driven Data Architect and Lead Engineer dedicated to building data platforms that meet the highest standards of operational resilience. With over a decade of experience in the Australian financial sector, I lead the delivery of end-to-end data integration frameworks designed for auditability, lineage, and scale. I am a hands-on mentor who excels in fast-paced, regulated environments, leveraging Delta Lake, CI/CD, and DevOps practices to keep data assets secure, high-quality, and fully compliant with APRA standards.

Platform & Pipeline Expertise

Ingestion & Orchestration

Apache Airflow Apache Spark Apache Kafka Azure Data Factory Cloud Composer Dataflow Oozie

Transformation

dbt Python / PySpark SQL Databricks Dataproc Delta Lake

Governance & Quality

dbt-expectations Great Expectations Azure Purview GitHub Actions Codefresh

Skills

Languages

Python SQL R Java C# C Shell

Cloud Platforms

Azure
Synapse ADF Databricks
GCP
BigQuery Cloud Composer Dataflow Dataproc
AWS
Redshift S3 Lambda
On-Prem
Cloudera Red Hat OpenShift

Methodologies

Data Modelling Solution Design Ingestion Profiling Transformation

Tools & Databases

dbt Airflow SQL Server Teradata

Data Management

Data Quality
dbt-expectations Great Expectations
Data Lineage
dbt
CI/CD
Codefresh GitHub Actions
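
The data-quality side of this stack can be sketched in plain Python. The following is a hand-rolled illustration of the kind of declarative checks that tools like Great Expectations and dbt-expectations automate (not-null and uniqueness assertions run as a suite); all function, table, and column names here are hypothetical.

```python
# Hand-rolled sketch of declarative data-quality checks, in the spirit of
# Great Expectations / dbt-expectations. All names are illustrative.

def expect_not_null(rows, column):
    """Return rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def expect_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

def run_suite(rows, suite):
    """Apply each (name, check) pair and collect non-empty failures."""
    return {name: bad for name, check in suite if (bad := check(rows))}

transactions = [
    {"txn_id": 1, "amount": 120.0},
    {"txn_id": 2, "amount": None},
    {"txn_id": 2, "amount": 35.5},
]

suite = [
    ("amount_not_null", lambda rows: expect_not_null(rows, "amount")),
    ("txn_id_unique", lambda rows: expect_unique(rows, "txn_id")),
]

failures = run_suite(transactions, suite)
print(failures)  # a non-empty dict means the pipeline should halt or quarantine
```

In a real pipeline these checks live as dbt tests or an expectation suite in version control and run in CI before data is published.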

Experience

Senior Data Engineer – ANZ | KYC Uplift

Nov 2025 – Present

Working as a Lead Data Engineer on the KYC Uplift program, building end-to-end data solutions that deliver trusted insights for financial crime, AML, and regulatory obligations. Responsible for stakeholder management across business and technology, embedding data governance, lineage, and quality controls, optimising data pipelines for scale and reliability, and coaching data engineers to support sustainable delivery.

Senior Data Engineer – ANZ | Customer Care

May 2024 – Nov 2025

Worked as a Senior Data Engineer building a proactive, customer-centric data approach across high-impact processes (HIPs) such as bereavement, hardship, and remediation. Partnered closely with business, risk, and operations stakeholders, owning data governance and quality frameworks, improving pipeline performance and resilience, and mentoring data engineers to uplift team capability and delivery maturity.

Senior Data Engineer – ANZ | Retail & Commercial

Apr 2023 – May 2024

Architected enterprise Delta Lake transaction assets in Spark. Led 6 engineers through Teradata-to-OpenShift migration, standardising Airflow + dbt for data observability at scale.
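
The idempotent upsert (MERGE) pattern at the heart of Delta Lake transaction assets can be sketched without a Spark cluster using SQLite's `ON CONFLICT` clause. This is a minimal, stdlib-only illustration of the semantics that `MERGE INTO` provides on Delta tables, not actual Spark/Delta code; the table and column names are assumptions.

```python
import sqlite3

# Sketch of an idempotent upsert, mirroring what Delta Lake's MERGE INTO
# does for transaction assets on Spark. Table/column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE txn (txn_id INTEGER PRIMARY KEY, amount REAL, status TEXT)"
)

def merge_batch(conn, batch):
    """Insert new transactions; update matching ones in place (safe to re-run)."""
    conn.executemany(
        """
        INSERT INTO txn (txn_id, amount, status) VALUES (?, ?, ?)
        ON CONFLICT(txn_id) DO UPDATE SET
            amount = excluded.amount,
            status = excluded.status
        """,
        batch,
    )

merge_batch(conn, [(1, 120.0, "pending"), (2, 35.5, "pending")])
merge_batch(conn, [(1, 120.0, "settled"), (3, 9.99, "pending")])  # replay-safe

rows = conn.execute("SELECT txn_id, status FROM txn ORDER BY txn_id").fetchall()
print(rows)  # [(1, 'settled'), (2, 'pending'), (3, 'pending')]
```

Because a replayed batch updates rather than duplicates, downstream assets stay consistent when an orchestrator such as Airflow retries a task.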

Data Engineer – ANZ | Commercial & Responsible Banking

2020 – 2023

Built geospatial and payment pipelines for regulatory-grade analytics. Delivered R Shiny tools and automated CI/CD for legacy Python platform modernisation.

Data Engineer – AGL Energy

2019 – 2020

Designed Azure data platforms (ADF, Synapse, Databricks) for energy analytics and LRET/SRES compliance. Reduced ETL latency by ~35% and implemented Microsoft Purview for data governance.

Earlier Roles – HCL, Infosys, Wipro

Software Engineer → Senior Software Engineer

Built Hadoop, Spark, Kafka & SSIS pipelines for BFSI and FMCG clients. Delivered AUSTRAC compliance systems and cross-border payment reporting at enterprise scale.

Work

dbt

TBD-dbt_learn_ac

Analytics engineering learning project using dbt for data transformation, testing, and documentation — applying real-world patterns from enterprise data pipelines.

View on GitHub

Blog

November 8, 2025

Data Foundations: Because Regulators Don't Accept 'It Worked on My Machine'!

Why mastering data governance, lineage, and quality matters more than chasing the latest tech — especially in regulated banking environments.

Read Post →

More posts on data engineering, regulatory compliance, and life in a regulated data platform team.

All Posts →

Connect

Name Email Message