Dian Octaviani

Senior Data Engineer

With 10 years of experience in the tech industry, across both consulting and product-focused organisations, I combine analytical thinking with a solid technical foundation to solve business-critical problems and uncover opportunities hidden in data.

I enjoy building things with data and turning it into interesting insights, but I'm also passionate about working cohesively in a team-oriented environment. That's how I find myself most productive!

Experience

Senior Data Engineer - Xero
Feb 2022 - Present

I am currently working remotely as a Senior Data Engineer at Xero.

Role: Delivering insights on product stability, operational health, release controls, and engineering analytics.

Key Responsibilities:

  • Designing data pipelines from developer source platforms such as GitHub, AWS, and Cortex
  • Conducting code reviews and mobbing sessions with the team
  • Building data models using Data Vault, dbt, SQL, and Snowflake
  • Orchestrating pipelines and deployments using Prefect and GitHub Actions (a minimal sketch follows this list)
  • Visualizing key metrics in Tableau and MicroStrategy
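
To give a flavour of the Prefect orchestration piece, here is a minimal sketch of how a flow can stitch extraction and dbt modelling together. The repo list, landing path, and dbt selector are illustrative assumptions, not the actual Xero pipeline.

```python
# Minimal Prefect flow sketch: extract repo metadata from the GitHub REST API,
# land it as JSON, then trigger a dbt build. Names, paths, and the repo list
# are illustrative assumptions rather than the real pipeline.
import json
import subprocess
from pathlib import Path

import requests
from prefect import flow, task


@task(retries=2, retry_delay_seconds=60)
def extract_repo_metadata(repo: str, token: str) -> dict:
    """Pull basic repository metadata from the GitHub REST API."""
    resp = requests.get(
        f"https://api.github.com/repos/{repo}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


@task
def land_raw(payload: dict, out_dir: str = "raw/github") -> Path:
    """Write the raw payload to a landing folder for downstream modelling."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    out_path = Path(out_dir) / f"{payload['name']}.json"
    out_path.write_text(json.dumps(payload))
    return out_path


@task
def run_dbt_models() -> None:
    """Rebuild the dbt models that sit on top of the landed data."""
    # 'engineering_metrics' is an assumed selector name for illustration only.
    subprocess.run(["dbt", "build", "--select", "engineering_metrics"], check=True)


@flow(log_prints=True)
def engineering_metrics_pipeline(repos: list[str], token: str) -> None:
    for repo in repos:
        payload = extract_repo_metadata(repo, token)
        land_raw(payload)
    run_dbt_models()


if __name__ == "__main__":
    engineering_metrics_pipeline(["my-org/example-repo"], token="ghp_xxx")
```
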
Data Engineer - JobAdder
Aug 2021 - Feb 2022

Role: Contributed to data analytics and data pipeline development for JobAdder, a recruitment SaaS company.

Key Responsibilities:

  • Introduced a Python environment and package management workflow using Conda.
  • Introduced collaborative coding practices (Git branching, PRs, issues, releases).
  • Automated the build, deployment, and testing of pre-releases to test environments via CI/CD pipelines using GitHub Actions, Bash, S3, and an internal DevOps framework.
  • Developed an ETL module in Python for data sourced from Monday.com's GraphQL API (a minimal sketch follows this list).
  • Delivered bug fixes and enhancements (SQL, Python).
  • Managed AWS infrastructure (RDS, Redshift, EC2, SSM, IAM).
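
As a flavour of the Monday.com work, here is a minimal extract-transform-load sketch against its GraphQL endpoint. The query shape, board ID, and CSV target are illustrative assumptions; the real module did more than this.

```python
# Minimal ETL sketch for Monday.com's GraphQL API: extract board items,
# flatten them, and write them out. Query shape and board ID are assumptions.
import csv
import os

import requests

MONDAY_API_URL = "https://api.monday.com/v2"


def extract_board_items(board_id: int, api_token: str) -> list[dict]:
    """Fetch items and their column values for one board via GraphQL."""
    query = """
    query ($board_id: [ID!]) {
      boards (ids: $board_id) {
        items_page {
          items {
            id
            name
            column_values { id text }
          }
        }
      }
    }
    """
    resp = requests.post(
        MONDAY_API_URL,
        json={"query": query, "variables": {"board_id": [board_id]}},
        headers={"Authorization": api_token},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["boards"][0]["items_page"]["items"]


def transform(items: list[dict]) -> list[dict]:
    """Flatten nested column values into one row per item."""
    rows = []
    for item in items:
        row = {"item_id": item["id"], "item_name": item["name"]}
        for col in item["column_values"]:
            row[col["id"]] = col["text"]
        rows.append(row)
    return rows


def load_to_csv(rows: list[dict], path: str) -> None:
    """Write the flattened rows out; a real pipeline would load a warehouse."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=sorted({k for r in rows for k in r}))
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    items = extract_board_items(1234567890, os.environ["MONDAY_API_TOKEN"])
    load_to_csv(transform(items), "monday_items.csv")
```
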
Data Engineering Consultant - Contino
Jan 2020 - Aug 2021

Role: Data engineer in a consulting firm, delivering projects for various clients within the Data Practice. The longest engagement was with Victoria's Department of Health.

Key Projects:

  • Everlight Radiology: Developed a dimensional model for radiologists' timesheet data using Amazon RDS (MySQL), the Deputy API, and Kimball data modelling.
  • Woolworths: API testing for SuccessFactors using Postman, Firebase, and Python.
  • Victoria’s Department of Health:
    • Migrated data for Melbourne's COVID contact-tracing system from the legacy PHESS system to Salesforce using Azure Data Factory, Azure Functions, SQL, and the Salesforce API.
    • Integrated data using ArcGIS for geographic information such as LGAs and addresses.
    • Fixed data issues and bugs to ensure accurate reporting of COVID data.
    • Deployed releases and hotfixes to production during low user-traffic windows.
  • NAB: Optimized data deliveries and transitioned data warehouses from Redshift to Snowflake.
Data Engineering Consultant - Analytics8
Oct 2018 - Dec 2019

Role: Data engineer in a consulting firm, delivering projects to a long-term client, KFC (Yum! Brands).

Key Responsibilities:

  • Built data pipelines using Azure, AWS S3, Snowflake, and Python for ELT/ETL of online customer orders (Deliveroo and Menulog), payment data (PayPal, Braintree, and Adyen), Wi-Fi analytics (Purple), and KFC mobile app interactions (a minimal loading sketch follows this list).
  • Integrated data between Braze, Google Analytics, and Facebook for audience segmentation in ad campaigns, in liaison with MediaCom and Ogilvy.
  • Assisted in migrating data from an on-prem SQL database, Azure SQL DW, and Amazon RDS to Snowflake.
  • Supported daily operations by maintaining parts of the data warehouse and SSIS scripts for data across 650+ KFC stores, including bug fixes, health checks, and performance tuning during migration.
  • Conducted data analysis for operational reporting issues in MicroStrategy and Power BI.
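
As an illustration of the S3-to-Snowflake loading pattern used for the online-order data, here is a minimal sketch with the Snowflake Python connector. The stage, table, warehouse, and environment-variable names are assumptions, not the actual KFC pipeline.

```python
# Minimal ELT sketch: copy raw online-order files landed in S3 into a
# Snowflake landing table; downstream SQL handles the transformations.
# Stage, table, and environment-variable names are illustrative assumptions.
import os

import snowflake.connector


def load_orders_from_s3() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="RAW",
        schema="ONLINE_ORDERS",
    )
    try:
        cur = conn.cursor()
        # COPY INTO picks up any new files on the external stage pointing at S3;
        # files that were already loaded are skipped by Snowflake automatically.
        # deliveroo_orders is assumed to be a single VARIANT-column landing table.
        cur.execute(
            """
            COPY INTO deliveroo_orders
            FROM @s3_orders_stage/deliveroo/
            FILE_FORMAT = (TYPE = 'JSON')
            ON_ERROR = 'SKIP_FILE'
            """
        )
        for row in cur.fetchall():
            print(row)  # one result row per file processed
    finally:
        conn.close()


if __name__ == "__main__":
    load_orders_from_s3()
```
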
BI Developer - QBT Consulting
Nov 2017 - Oct 2018

Role: Improved reporting and streamlined operational data for clients of QBT’s managed IT services.

Key Responsibilities:

  • Automated billing reconciliation and sales monitoring using Power BI, SQL, and SSIS.
  • Developed reports based on SLA data and trained staff on new implementations.
  • Semi-automated the restructuring of SLAs in ConnectWise to enable automated renewals, pricing updates, and invoice generation across 248 active small clients.
Technical Business Analyst - Optika
Jul 2015 - Nov 2017

Role: Contributed to data analytics, automation, migration, and systems design for clients. The longest engagement was with Woodside Energy.

Key Projects:

  • Catholic Education: Migrated data from on-prem Oracle databases to Azure Data Warehouse across 200 private schools.
  • Woodside Energy: Developed ETL processes and BI reports for LNG markets using R and TIBCO Spotfire.
  • Internal Projects: Improved DevOps processes and optimized product release cycles using GitLab CI/CD and Chef.

Education

2012 - 2015
Bachelor of Science in Information Systems
Murdoch University
2012 - 2015
Bachelor of IT Management
Murdoch University
2010 - 2011
Certificate IV in IT (Networking & Web Development)
South Metropolitan TAFE

Get in Touch

My inbox is always open. Whether you have a question or just want to say hi, I’ll try my best to get back to you!