Senior Recruitment Consultant (Tech | Data & Analytics) at Cox Purtell Staffing Services
Are you an experienced Data Engineer looking for an opportunity to work on high-impact government projects? This role offers the chance to develop and optimise data pipelines in Azure, leveraging Python, SQL, Azure Data Factory, and Databricks to drive efficient and scalable data solutions.
This is a 4-month contract role with the possibility of a 12-month extension, based in Canberra with a hybrid work arrangement (minimum 3 days in-office per week). The position requires candidates to hold or be eligible for a Baseline security clearance and allows for a maximum of 40 hours per week.
Role Responsibilities:
* Develop, optimise, and maintain data pipelines using Python & SQL in Azure Databricks.
* Design and implement ETL/ELT workflows in Azure Data Factory for efficient data transformation and loading.
* Apply Kimball dimensional modelling and Medallion architecture best practices for structured, scalable data solutions.
* Collaborate with business stakeholders to translate requirements into technical data solutions.
* Implement and manage CI/CD pipelines using Azure DevOps & Git to ensure automated deployments and version control.
* Monitor, troubleshoot, and enhance Databricks jobs and queries for performance optimisation.
* Work closely with data analysts and BI teams to deliver well-structured, high-quality datasets.
* Ensure compliance with data governance, security, and privacy best practices.
* Contribute to code quality improvement through peer reviews, best practices, and knowledge sharing.
Requirements
Technical Skills & Experience:
* Proven experience in developing ETL/ELT processes for large-scale data movement and transformation in a cloud environment.
* Expertise in Python, SQL, and Azure Databricks for data pipeline development.
* Experience optimising query performance in distributed computing engines (e.g. Spark), Azure SQL, Python, or R.
* Hands-on experience with Azure Data Factory, Azure DevOps, and Git for CI/CD pipelines.
* Strong understanding of Kimball dimensional modelling (fact/dimension tables, star/snowflake schemas).
* Knowledge of Medallion architecture for structuring data lakes (bronze, silver, and gold layers).
Collaboration & Problem-Solving:
* Experience working within agile development teams and applying DevOps best practices.
* Strong problem-solving skills, with the ability to analyse and resolve complex data integration challenges.
* Excellent communication and stakeholder engagement skills to translate business requirements into technical solutions.
Other
* Security Clearance: Candidates must hold a valid Baseline security clearance or be willing and eligible to obtain one.
* Location Requirement: This role is Canberra-based, and candidates must be available to work on-site at least 3 days per week (hybrid work arrangement).
* Work Authorisation: Candidates must have Australian citizenship to be eligible for government security clearance.
Ready to take your data engineering career to the next level? Apply now and be part of a high-impact government transformation project!