Exciting Opportunities for Senior GCP Data Engineers!

Are you a highly skilled GCP Data Engineer looking for a new challenge? Join our team at INGRITY, where we have multiple opportunities for talented professionals like you. Your expertise in data engineering will be instrumental in delivering innovative solutions for our clients.

About INGRITY:
INGRITY is a Microsoft data and AI solution partner, working closely with Microsoft to create value for our customers. We collaborate with some of the best ASX-listed companies and many medium-sized businesses, delivering transformative data and AI-driven solutions. Our success is built on innovation, customer advocacy, and our strategic partnership with Microsoft.

About the Role:
As a Senior Data Engineer, you will play a key role in designing, developing, and optimising data solutions on Google Cloud Platform (GCP). You will work closely with top-tier clients to implement scalable and reliable data pipelines, ensuring efficient data processing and analytics.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow (Apache Beam), and Cloud Storage.
- Implement ETL/ELT processes for data ingestion, transformation, and loading.
- Optimise data workflows for performance, scalability, and reliability.
- Ensure data governance, security, and compliance best practices.
- Collaborate with data scientists and analysts to enable seamless access to structured and unstructured data.
- Maintain code repositories and CI/CD pipelines to ensure efficient deployment processes.
- Troubleshoot and resolve data infrastructure issues, minimising downtime and performance bottlenecks.
- Stay up to date with industry trends and emerging technologies, continuously improving data engineering practices.

Key Skills & Experience:
- 5+ years of experience in data engineering, with a strong background in Google Cloud Platform (GCP).
- Expert knowledge of SQL for data querying and manipulation within BigQuery.
- Experience with Dataform, Pub/Sub, BigQuery, Cloud Storage, and other GCP services.
- Strong proficiency in Python for data processing tasks.
- Deep understanding of and experience in data warehousing, schema design, and data modelling.
- Experience with ETL tools and frameworks for data ingestion and transformation.
- Knowledge of containerisation technologies (Docker, Kubernetes) is a plus.
- Strong problem-solving skills and the ability to thrive in a fast-paced environment.
- Excellent communication and collaboration skills.
- Bachelor’s degree in Computer Science, Engineering, or a related field (Master’s preferred).

Note:
- This position is based in Sydney, Australia.
- The candidate must have full working rights in Australia to be eligible for this role.
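For candidates weighing whether their background fits, the ETL/ELT pattern described in the responsibilities can be sketched in plain Python. This is a toy illustration only: in production such logic would typically run as an Apache Beam pipeline on Dataflow writing to BigQuery, and all record fields and names below are invented for the example.

```python
# Toy extract-transform-load sketch. The "source" is a hard-coded list
# (standing in for Pub/Sub or Cloud Storage) and the "warehouse" is a dict
# (standing in for a BigQuery table) so the example is self-contained.

raw_events = [  # extract: hypothetical raw records
    {"user": "alice", "amount": "12.50", "country": "au"},
    {"user": "bob", "amount": "bad-value", "country": "AU"},
    {"user": "alice", "amount": "7.50", "country": "AU"},
]

def transform(event):
    """Clean one record; return None for rows that fail validation."""
    try:
        amount = float(event["amount"])
    except ValueError:
        return None  # a real pipeline would route this to a dead-letter sink
    return {"user": event["user"], "amount": amount,
            "country": event["country"].upper()}

def load(events):
    """Aggregate spend per user, as a SQL GROUP BY in BigQuery would."""
    totals = {}
    for e in events:
        totals[e["user"]] = totals.get(e["user"], 0.0) + e["amount"]
    return totals

cleaned = [t for t in (transform(e) for e in raw_events) if t is not None]
warehouse = load(cleaned)
print(warehouse)  # {'alice': 20.0}
```

The same three stages (ingest, validate/transform, load) map directly onto the GCP services named above; only the scale and the runtime differ.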