Bluefin Resources is proud to partner with a prestigious government enterprise to recruit a Data Pipeline Engineer for a long-term contract with a strong likelihood of extension. This is an exciting opportunity to join a multi-year greenfield project, driving cutting-edge data modernisation initiatives.

2 x Data Pipeline Engineer

The Data Pipeline Engineer
A Data Pipeline Engineer designs, builds, tests, and maintains the data pipelines that move data between systems, working with data scientists and analysts to ensure data quality and security. Typical activities include:

Design: Creating scalable data pipelines that meet project requirements
Maintain: Optimising pipelines for performance and scalability
Monitor: Troubleshooting issues and documenting processes
Collaborate: Working with analysts and data scientists to understand data needs
Automate: Writing scripts to automate repetitive tasks

The Data Pipeline Engineer will work with a project team and other SMEs to ensure that the data pipelines support the 'Crack Down on Fraud' program of work and meet the business requirements of the program.

Key duties and responsibilities
The Data Pipeline Engineer will provide data pipeline design and advice across one or more of the above-mentioned capabilities. The candidate will:

have technical expertise in pipeline scripting and infrastructure-as-code tooling (e.g. AWS Glue, HashiCorp Terraform)
have a solid understanding of cloud-based architecture relating to data storage and pipelines
understand stream data processing (e.g. AWS Kinesis)
understand, and be able to implement, security and access controls around data pipelines
have a high level of communication skills, with a demonstrated ability to communicate at both technical and business levels
have experience with Agile development methodologies
have experience with data pipeline ETL tools (e.g. AWS Glue, Informatica, dbt, Talend)
have very good SQL skills
hold tertiary qualifications in an ICT-related field or applicable industry certifications.
Essential criteria

Minimum of five years' experience working as a Data Pipeline Engineer in a cloud computing environment
Demonstrated experience configuring (design, build, test, deploy) varying types of data pipeline solutions, e.g. MFT, API, file extracts
Good command of Amazon Web Services as it relates to data storage, extraction, transformation, and data processing frameworks
A proficient understanding of security concepts around the above areas
Experience with Agile development methodologies (Scrum, Kanban, Lean, Extreme Programming)
Good problem-solving and communication skills

Desirable criteria

Experience with GoAnywhere MFT
Experience with Control-M orchestration
Experience with CI/CD automation / GitHub / Bitbucket

What's Available:
This is an initial 12-month contract with a strong possibility of extension. The role offers flexible working options and is based in the Sydney, Canberra or Melbourne office. You'll be working alongside a team of dedicated professionals, contributing to a meaningful and high-impact project where your expertise will make a real difference.

To apply, please submit your updated resume online. For a more in-depth discussion, feel free to reach out to Lee Bartlett at 0410 744 438.

Reference number: BH-59565
Profession: Data & Analytics, Data Engineering
Company: Bluefin Resources
Date posted: 7th Apr, 2025