Created on
December 5, 2025

Data Quality Engineer Job at Campaignswell

About Us

At Campaignswell, we're transforming marketing analytics with our AI-powered predictive platform used by apps, games, and e-commerce brands worldwide. We deliver highly accurate lifetime value (LTV) predictions within hours and unify product, marketing, and monetization analytics in a single platform.
Founded in 2023 and headquartered in San Francisco, we're building the future of data-driven growth. As we scale, data quality and reliability are becoming foundational pillars of our product, and that's where you come in.

Responsibilities

  • Own and improve data quality across multiple pipelines, datasets, and integrations.
  • Design and maintain automated data validation frameworks (freshness, completeness, schema checks, anomaly detection).
  • Build monitoring and alerting systems to ensure reliability of ELT/ETL pipelines.
  • Investigate data inconsistencies and work cross-functionally to resolve root causes.
  • Develop and maintain data SLAs, incident response playbooks, and documentation.
  • Partner with Data Engineering, Analytics, and Customer Success teams to ensure data used by clients is always accurate, timely, and trustworthy.
  • Improve internal tools and workflows related to data ingestion, observability, lineage, and testing.
  • Contribute to continuous improvements of our data platform and operational excellence.
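To make the validation work above concrete, here is a minimal, self-contained sketch of the three check types named in the second bullet: freshness, completeness, and schema verification. The table and column names (events, event_ts, user_id) are illustrative only, not from the posting, and a production setup would typically express these as dbt tests, Great Expectations suites, or Soda checks rather than hand-rolled SQL.

```python
# Illustrative data quality checks: schema, freshness, and completeness.
# Uses an in-memory SQLite table so the example is fully self-contained.
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_ts TEXT, user_id TEXT)")
now = datetime.now(timezone.utc)
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(now.isoformat(), "u1"), (now.isoformat(), None)],  # one null user_id
)

def check_schema(conn, table, expected_cols):
    """Schema check: the table exposes exactly the expected columns."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    return cols == expected_cols

def check_freshness(conn, table, ts_col, max_lag):
    """Freshness check: the newest row is no older than max_lag."""
    (latest,) = conn.execute(f"SELECT MAX({ts_col}) FROM {table}").fetchone()
    return datetime.fromisoformat(latest) >= datetime.now(timezone.utc) - max_lag

def check_completeness(conn, table, col, max_null_rate):
    """Completeness check: the null rate of col stays at or under a threshold."""
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM({col} IS NULL) FROM {table}"
    ).fetchone()
    return (nulls / total) <= max_null_rate

print(check_schema(conn, "events", ["event_ts", "user_id"]))           # True
print(check_freshness(conn, "events", "event_ts", timedelta(hours=1))) # True
print(check_completeness(conn, "events", "user_id", 0.6))              # True (1 null of 2 rows)
```

In practice, checks like these would run on a schedule inside an orchestrator such as Airflow, with failures routed to alerting rather than printed.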

Requirements

  • 3+ years of experience in Data Quality, Data Ops, Analytics Engineering, or Data Engineering.
  • Solid SQL and Python skills.
  • Experience implementing data testing frameworks (e.g., dbt tests, Great Expectations, Soda, or custom tooling).
  • Strong understanding of ETL/ELT pipelines and data warehousing concepts.
  • Hands-on experience with orchestration tools (Airflow or equivalents).
  • Experience with AWS cloud services (S3, Lambda, ECS, etc.).
  • Understanding of schema design, data modeling, and data lineage.
  • Strong analytical mindset and exceptional attention to detail.
  • Excellent written and verbal communication skills.

Nice to Have

  • Experience with Snowflake and/or ClickHouse.
  • Knowledge of monitoring/observability tools (e.g., Prometheus, Grafana, OpenTelemetry).
  • Familiarity with event-based architectures and webhook ingestion.
  • Experience supporting ML pipelines from a data reliability standpoint.