Senior Data Engineer
Xero | Feb 2022 - Present
Role: Working remotely, delivering insights on product stability, operational health, release controls, and engineering analytics.
Key Responsibilities:
- Design data pipelines from developer source platforms such as GitHub, AWS, and Cortex
- Conduct code reviews and mobbing sessions with the team
- Build data models using Data Vault, dbt, SQL, and Snowflake
- Orchestrate pipelines and deployments using Prefect and GitHub Actions
- Visualize key metrics in Tableau and MicroStrategy
Data Engineer
JobAdder | Aug 2021 - Feb 2022
Role: Contributed to data analytics and pipelining for JobAdder, a recruitment SaaS company.
Key Responsibilities:
- Introduced Python environment and package management using Conda
- Introduced collaborative coding practices (Git branching, PRs, issues, releases)
- Automated the build, deployment, and testing of pre-releases to test environments via CI/CD pipelines using GitHub Actions, Bash, S3, and an internal DevOps framework
- Developed ETL module for data sourced from Monday.com’s GraphQL API in Python
- Fixed bugs and delivered enhancements (SQL, Python)
- Managed AWS infrastructure (RDS, Redshift, EC2, SSM, IAM)
Data Engineering Consultant
Contino | Jan 2020 - Aug 2021
Role: Data engineer at a consulting firm, delivering projects across various clients within the Data Practice; longest engagement was with Victoria's Department of Health.
Key Projects:
- Everlight Radiology: Developed dimensional models for radiologists' timesheet data using Amazon RDS (MySQL), the Deputy API, and Kimball data modelling.
- Woolworths: Performed API testing for SuccessFactors using Postman, Firebase, and Python.
- Victoria’s Department of Health:
- Migrated data for Melbourne's COVID-19 contact tracing system from the legacy PHESS system to Salesforce using Azure Data Factory, Azure Functions, SQL, and the Salesforce API.
- Integrated data using ArcGIS for information on LGAs, addresses, etc.
- Fixed data issues and bugs to ensure accurate reporting of COVID-19 data.
- Deployed releases and hotfixes to Production during low-traffic time windows.
- NAB: Optimized data deliveries and transitioned data warehouses from Redshift to Snowflake.
Data Engineering Consultant
Analytics8 | Oct 2018 - Dec 2019
Role: Data engineer at a consulting firm, delivering projects to a long-term client, KFC (Yum! Brands).
Key Responsibilities:
- Built data pipelines using Azure, AWS S3, Snowflake, and Python for ELT/ETL of online customer orders (Deliveroo, Menulog), payment data (PayPal, Braintree, Adyen), Wi-Fi analytics (Purple), and KFC mobile app interactions.
- Integrated data between Braze, Google Analytics, and Facebook for audience segmentation in ad campaigns, in liaison with MediaCom and Ogilvy.
- Assisted in migrating data from an on-prem SQL database, Azure SQL DW, and Amazon RDS to Snowflake.
- Supported daily operations by maintaining parts of the data warehouse and SSIS scripts covering 650+ KFC stores, including bug fixes, health checks, and performance tuning during the migration.
- Conducted data analysis for operational reporting issues in MicroStrategy and Power BI.
Role: Improved reporting and streamlined operational data for clients of QBT’s managed IT services.
Key Responsibilities:
- Automated billing reconciliation and sales monitoring using Power BI, SQL, and SSIS.
- Developed reports based on SLA data and trained staff on new implementations.
- Semi-automated the restructuring of SLAs in ConnectWise to enable automated renewals, pricing updates and invoice generation across 248 active small clients.
Technical Business Analyst
Optika | Jul 2015 - Nov 2017
Role: Contributed to data analytics, automation, migration, and systems design for clients; longest engagement was with Woodside Energy.
Key Projects:
- Catholic Education: Migrated data from on-prem Oracle databases to Azure Data Warehouse across 200 private schools.
- Woodside Energy: Developed ETL processes and BI reports for LNG markets using R and TIBCO Spotfire.
- Internal Projects: Improved DevOps processes and optimized product release cycles using GitLab CI/CD and Chef.