Kindred is executing on its biggest product transformation to date: delivering a best-in-class sports betting product. As part of this, we are building new sportsbook capabilities from the ground up, including our Data and Reporting capabilities. In this role, you will ensure the quality of the data products we build for Product Owners and sportsbook operations teams, driving high standards for sportsbook reporting data quality across both batch and real-time use cases, with a strong emphasis on automation and on tracking defects through to resolution.
What you will do
Data Quality Assurance & Testing
1. Test and identify defects in batch and real-time data processes, ensuring high data integrity.
2. Develop and execute test cases for ETL pipelines and analytical datasets to validate data transformations and aggregations.
3. Design and execute data validation procedures to verify accuracy, completeness, and consistency across data sources.
4. Build an automated regression test suite to ensure every data release is automatically validated.
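To illustrate the kind of automated validation this role involves, here is a minimal sketch in Python. It uses the built-in sqlite3 module as a stand-in for a production warehouse; the `bets` table, its columns, and the specific checks are illustrative assumptions, not an actual Kindred schema.

```python
import sqlite3

def run_quality_checks(conn):
    """Run simple completeness and consistency checks; return a list of failures."""
    failures = []
    cur = conn.cursor()
    # Completeness: no bet may be missing a stake or a settlement status.
    cur.execute("SELECT COUNT(*) FROM bets WHERE stake IS NULL OR status IS NULL")
    if cur.fetchone()[0] > 0:
        failures.append("completeness: NULL stake or status")
    # Consistency: a settled payout must never exceed stake * odds.
    cur.execute(
        "SELECT COUNT(*) FROM bets "
        "WHERE status = 'settled' AND payout > stake * odds"
    )
    if cur.fetchone()[0] > 0:
        failures.append("consistency: payout exceeds stake * odds")
    return failures

# Demo with an in-memory database and one deliberately bad row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bets (id INTEGER, stake REAL, odds REAL, payout REAL, status TEXT)")
conn.execute("INSERT INTO bets VALUES (1, 10.0, 2.0, 20.0, 'settled')")
conn.execute("INSERT INTO bets VALUES (2, 5.0, 3.0, 99.0, 'settled')")  # bad payout
print(run_quality_checks(conn))  # → ['consistency: payout exceeds stake * odds']
```

In practice such checks would run against every release as part of a regression suite, with each failure tracked as a defect through to resolution.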
Automation & Monitoring
1. Automate daily data quality checks for proactive monitoring of data pipelines.
2. Identify statistical anomalies and unexpected data patterns to detect quality issues before they impact business decisions.
3. Implement data profiling, define data quality metrics, and track improvements over time.
4. Advocate and embed a quality-driven mindset across the team, ensuring data quality is a priority from the start.
5. Collaborate with business users, product owners, data engineers, and engineering teams to define and validate data requirements.
6. Work closely with IT teams to implement and optimize data quality solutions.
7. Prepare data quality reports, dashboards, and KPIs to communicate insights on data health and issue resolution progress.
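As a sketch of the statistical anomaly detection mentioned above, the following uses a crude z-score over daily row counts to flag a pipeline day that silently loaded far fewer rows than usual. The counts and threshold are invented for illustration.

```python
import statistics

def detect_anomalies(daily_counts, threshold=3.0):
    """Flag indices of days whose row count deviates from the mean
    by more than `threshold` population standard deviations."""
    mean = statistics.mean(daily_counts)
    stdev = statistics.pstdev(daily_counts)
    if stdev == 0:
        return []
    return [i for i, c in enumerate(daily_counts)
            if abs(c - mean) / stdev > threshold]

# A pipeline that normally loads ~10k rows a day, with one silent drop.
counts = [10_120, 9_980, 10_050, 10_210, 312, 10_090, 9_940]
print(detect_anomalies(counts, threshold=2.0))  # → [4]
```

A real deployment would compute these metrics on a schedule and surface breaches in a dashboard (e.g. Grafana) before they impact business decisions.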
Your experience
* Strong SQL skills, including hands-on experience with PostgreSQL, are a must.
* Must have experience scripting in Python to process large datasets.
* Working knowledge of logging and monitoring tools such as Grafana, Splunk, and Prometheus.
* Test automation experience in big data warehouse environments, with a focus on quality.
* Exposure to, or hands-on experience with, Kafka for data streaming.