We are seeking a highly skilled Data Operations Engineer to join our team at {company}, a leading financial services company. In this role, you will be responsible for building and managing secure, scalable data solutions using AWS or Google Cloud Platform (GCP).
The ideal candidate will have a strong background in cloud and data engineering, proficiency in Terraform for Infrastructure as Code (IaC), and strong automation skills in SQL, Python, and Bash.
Key responsibilities include:
* Designing, automating, and optimizing data pipelines, datasets, and processes on either AWS or GCP
* Using Terraform to manage and streamline cloud data infrastructure
* Collaborating with Data Engineering and DevOps teams to support data ingestion, transformation, and analytics workflows
* Implementing data governance, security policies, and IAM controls to maintain secure data access and compliance
* Monitoring data platform performance, data quality, and capacity to ensure alignment with service levels and customer satisfaction
* Participating in on-call rotations for incident response and occasional off-hours change implementations
Requirements include:
* Bachelor's degree in Computer Science, IT, or related field
* 3+ years of experience in cloud and data engineering on either AWS or GCP
* Certification in AWS or GCP is highly preferred
* Proficiency in Terraform for IaC, along with strong automation skills in SQL, Python, and Bash
* Experience building data pipelines with tools such as AWS Glue or dbt
* Strong problem-solving and communication skills, with the ability to multitask in high-pressure environments
The estimated salary for this position is $120,000 - $180,000 per year, depending on experience and qualifications.