Bluefin Resources is proud to partner with a prestigious government enterprise to recruit a Senior Data Engineer for a long-term contract. This exciting opportunity calls for a highly skilled professional with strong technical expertise across the Azure Data Platform, contributing to impactful data-driven initiatives.

Position Purpose

The Senior Data Engineer will play a key role in delivering data-driven insights by providing expert service and support to the business in managing its data assets. This includes designing and maintaining efficient ETL processes to ensure seamless data flow between source systems and the data warehouse. With a strong commitment to safety and technical excellence, the Senior Data Engineer will ensure the reliability and integrity of the organisation's data infrastructure.

Key Accountabilities

- Safety: ensure all activities are undertaken with the safety of our people as the number one priority, and always role model safe behaviour.
- Values: behave and make decisions in accordance with the agency's Values at all times.
- Design and build scalable, reliable, and efficient data pipelines and architectures on the Azure platform.
- Develop and maintain data models (conceptual, logical, and physical), metadata management, and data cataloguing processes.
- Conduct diagnostic procedures to identify system improvements.
- Implement data optimisation methods and scale data storage.
- Plan and deploy technology infrastructure and CI/CD data pipelines to store and process large volumes of data to serve analytics models.
- Develop and maintain databases by acquiring data from primary and secondary sources, and build scripts that make the data evaluation process more flexible and scalable across datasets.
- Maintain relevant procedures, documentation, monitoring, and control of all relevant systems and data to support information governance, integrity, privacy, security, and regulatory requirements.
- Enhance and maintain actuarial data marts within the data warehouse.
- Monitor daily data quality and processes, and remediate any operational issues.
- Design, code, test, and deploy data pipelines for core datasets.
- Design and maintain data models, enabling the creation and sharing of key data assets.
- Provide Level 3 support for incidents within agreed SLAs and quality standards.
- Research emerging data technologies for their application in agency environments, and recommend innovation trials as relevant to the agency.

Key Challenges

- Standardising Data Quality tooling across the enterprise data assets while maintaining business process alignment.
- Cost optimisation while managing big data complexity and performance.

Mandatory Candidate Requirements

Qualifications:
- Relevant tertiary qualification in Computer Science, e.g., a Bachelor's degree in Management Information Systems, Information Technology, or a related Computer Sciences field.
- Relevant intermediate/advanced Azure Data certifications.
- Current NSW Driver's Licence.

Knowledge:
- Proven experience with data management.
- Strong experience with Azure Data Warehouse and Azure Data Explorer.
- Experience with Azure Data Factory (ADF) for setting up data warehouses on Azure.
- Agile delivery methodologies.

Experience:
- Experience with technologies such as Power Platform, Synapse, Databricks, Azure Data Factory, Azure Data Explorer, and Power BI.
- Strong data warehousing experience.
- Experience with modern data pipelines, data streaming, and real-time analytics tools.
- Expertise in data modelling and ELT/ETL strategies.
- Ability to build and design data platforms for integration, reporting, and data science workloads.
- Strong SQL skills; proficiency in SSIS, SQL, and KQL.
- A self-starter who works with minimal supervision across a diverse range of problems, contexts, and at times changing priorities.
- Analyses, designs, plans, executes, and evaluates work to time, cost, and quality targets.
- Excellent communication skills and the ability to work collaboratively with cross-functional teams.
- Solid understanding of data security, data classification, and access control measures.
- Good understanding of Data Governance.
- Knowledge of industry-leading data quality and data protection management practices.

Favourable Candidate Requirements

- Familiarity with Azure services such as Azure Synapse Analytics, Azure Data Lake Storage, Azure Stream Analytics, Azure Data Catalog, ML Studio, AI/ML, Azure Functions, Azure DevOps, CI/CD, Azure Event Hubs, Logic Apps, Apache Spark, Docker containers, Azure HDInsight, and Azure Databricks.
- Experience with big data technologies.
- Good programming skills in Python and/or Scala.
- Experience with MDM, metadata management, and data cataloguing.
- Familiarity with Microsoft Power BI.
- Excellent problem-solving skills.
- Consulting experience on Azure is a plus!
- Knowledge of data governance practices, business and technology issues related to the management of enterprise information assets, and approaches to data protection.
- Knowledge of data-related government regulatory requirements and emerging trends and issues.
- Experience working in DevOps and Agile environments, as well as with continuous integration and delivery.

What's Available:

This opportunity is an initial 8-month contract with the possibility of further extensions. The role is hybrid, with 2-3 days on-site (Sydney, Parramatta) and 2-3 days working from home.

You'll collaborate with a team of dedicated individuals and contribute to a meaningful and fulfilling project where your impact will truly matter.

If you wish to be considered for this position, please submit your application online along with your most up-to-date resume.
Alternatively, you can reach out to Lee Bartlett at 0410 744 438 for a more in-depth discussion.