About {company} and the Role:
We're a big company with a start-up attitude, built on hiring outstanding people and accepting nothing less than being the best at what we do. As a Junior Data Modeller in the Business Intelligence & Analytics team, you will leverage large, complex data sets to help build robust, reusable and optimised data models for use across the organisation.
This is a hybrid role (2 days in the office per week) that can be based in Sydney, Melbourne or Brisbane. You will bring broad technical skills in developing strategic models that support a wide range of use cases: analytics, business intelligence, self-service, machine learning and other data-driven projects.
Key Responsibilities:
* Assist in creating standardised logical and physical data models and tables, following best practices, to ensure high data quality and reduce redundancy.
* Contribute to the design, development and maintenance of scalable data pipelines for ingesting, transforming and storing data.
* Collaborate with data scientists and analysts to understand data requirements and implement solutions.
* Work closely with the business and BI analysts to surface enriched enterprise-level views that enable data-driven decisions.
* Advise on processes that support appropriate modelling, including designing and documenting mappings and ensuring all new development conforms to the agreed data model design.
* Enable self-service by assisting in applying data governance standards at the implementation level.
* Support data quality and validation by analysing data and understanding its meaning.
* Demonstrate a focus on continual self-improvement across both the technical and business aspects of delivering high-quality data enrichment solutions.
Requirements:
* A Bachelor's degree (minimum) in engineering, technology or a quantitative discipline.
* Experience with cloud-based technologies and databases, particularly GCP and Snowflake.
* Knowledge of data warehousing, ETL processes and data integration techniques in cloud environments.
* Familiarity with Python and shell scripting.
* Experience designing and implementing dbt pipelines, including setting up monitoring and troubleshooting issues.
* A strong data and technical background demonstrating innovation, initiative and a passion for data.
* Confidence in collecting, collating and interpreting business requirements to produce quality outcomes.
* Some experience working with stakeholders, such as product managers or platform owners, to gather data requirements and deliver solutions.