Overview
WOOP is looking for a Data Operations & Analytics Manager to make our data accurate, unified, secure, and truly useful across teams and markets. You'll own data quality and enrichment, data modeling and performance, integration reliability, governance and security, cost-efficient infrastructure, and smooth day-to-day operations. This is a hands-on manager role: you'll lead a small team, collaborate with Tech/Marketing, and build repeatable processes that scale.
What You'll Do
1. Accurate, Unified, and Trusted Data
Data Quality & Standardization
- Maintain clean, de-duplicated, and correct consumer data at scale
- Standardize formats (country, device, channels, dates) across databases/tools
- Ensure numbers align across key dashboards (e.g., Metabase) and reports
- Perform regular data audits and quality assessments to spot trends/anomalies early
Data Enrichment & Updates
- Enrich profiles with device type (iPhone/Android), channel, segment, and tags
- Operationalize scheduled enrichment runs (weekly/monthly) with minimal backlog
Data Modeling, Structure & Performance
- Design logical/physical models, schemas, and views for speed, stability, and cost
- Continuously optimize query performance and reduce report failures/timeouts
- Maintain data lineage maps to trace issues end-to-end
- Assess impact of schema/pipeline changes on downstream reports/users
Common Naming, Tags & Metrics
- Own the data dictionary and business metric definitions
- Enforce naming/tagging standards across teams, tools, and markets
2. Maximize Value from Every Lead & Interaction (Zero Data Loss)
Integration Data Reliability (Clients & APIs)
- Partner with Tech to stabilize data flows with client systems; monitor API errors, mappings, formats; close gaps fast
Internal Data Flow Reliability (Pointerpro, Quickwork (QW), etc.)
- Map internal tool flows (e.g., Pointerpro ↔ QW), maintain diagrams/docs, ensure complete records
Tagging & Segmentation Systems
- Design universal tagging for device, campaign, creative, audience; ensure tags flow end-to-end into the warehouse and Metabase
Ad-Level Data & Granular Targeting
- Restore/preserve ad-level signals (e.g., from Facebook) and reflect device/ad tags accurately for targeting and reporting
3. Cost-Efficient and Scalable Data Infrastructure
Cloud & Storage Optimization
- Right-size compute/storage, archive low-value data, and track costs vs. performance
Integration & Tool Cost Management
- Reduce redundant flows/tools; consolidate via smart orchestration (e.g., Quickwork or similar)
- Evaluate and recommend data tools/tech (e.g., new ETL/orchestration vs. Quickwork)
- Manage vendor relationships for data platforms/APIs; negotiate contracts for cost/reliability
In-house Data Capabilities
- Build reusable components and bring critical processes in-house to reduce vendor dependence
4. Strong Governance, Privacy, and Security
Compliance & Governance
- Align practices with ISO standards, GDPR, and local laws; maintain retention, access, and sharing policies; support audits
Data Security & Access Control
- Implement role-based access; mask/encrypt PII; maintain access/export logs
Secure Data Sharing & File Transfers
- Enforce encryption and approved channels; maintain traceability for all exports
Monitoring & Incident Readiness
- Monitor for unusual access and pipeline issues; maintain incident response runbooks; partner with Tech on vulnerability scans
5. Stakeholder Reporting & Strategic Insights
Executive Reporting
- Create and present data ops reports/dashboards to leadership (e.g., monthly quality/cost reviews)
- Extract insights from ops data to recommend process improvements (e.g., "API errors spike on weekends")
6. Smooth Operations, Clear Requests, and a Skilled, Healthy Data Team
Request & Ticket Management
- Use a single ticketing system; triage, prioritize, and communicate status/SLAs
- Analyze ticket trends to proactively fix root causes (e.g., recurring API failures)
Ongoing Process Improvement
- Reduce manual steps via automation; improve pipeline stability and documentation
- Build/maintain monitoring dashboards for pipeline health, data freshness, and SLAs
Training & Skill Development
- Upskill the team on tools, security, and modern data stack; promote knowledge sharing
Team Leadership & Well-being
- Set clear priorities and realistic workloads; coach and grow a small team; foster collaboration and psychological safety
- Own data ops budget (cloud, tools, team); forecast needs and track ROI
Success Metrics (How Your Impact is Measured)
- Reduced data errors (duplicates, wrong formats, missing fields); higher % of complete profiles
- Faster average query/report run-times; fewer failed queries/broken dashboards
- Standard naming/metrics adopted across key dashboards; reduced rework due to misalignment
- Data loss from API/mapping issues trending to near-zero; robust ad-level and device tagging visible in Metabase
- Lower cloud/integration costs with stable or improved performance; fewer redundant tools/flows
- Passing audits with minimal findings; secure, auditable file sharing and access logs
- Lower ticket resolution time; fewer repeat issues; improved team engagement and skill levels
- Audit findings resolved within defined SLAs; 95%+ data freshness maintained
- Stakeholder satisfaction with ops reporting; insights leading to 10%+ efficiency gains
Minimum Qualifications
Education:
Bachelor's in Computer Science, Information Systems, Statistics, or a related field (preferred, not required)
Experience:
3+ years in data operations/data management, including hands-on work with data quality, enrichment, data modeling, and integrations
Leadership:
Prior people leadership experience (e.g., 1–5 direct reports or leading a pod/squad); ability to set priorities, coach, and review work
Technical Skills:
- Strong SQL and at least one scripting language for data management (Python preferred; R or Scala acceptable)
- Experience with BI/analytics tools (e.g., Metabase) and maintaining consistent metrics and definitions
- Practical experience designing and optimizing schemas, queries, and pipelines for performance and cost
- ETL/pipeline optimization experience; exposure to Agile/Scrum methodologies
Integration Experience:
Comfortable with APIs, webhooks, and data exchange formats (JSON/CSV); mapping/tagging across ad platforms and forms
Governance & Security:
Familiar with governance and security practices (role-based access, masking/encryption, audit logs) and regulations (GDPR; ISO-aligned processes)
Infrastructure:
Experience with a major cloud/data warehouse (e.g., BigQuery, Snowflake, Redshift) and storage/compute cost control
Operations:
Proficiency with a ticketing system (e.g., Jira, Asana, Zendesk) and documentation standards
Soft Skills:
Urgency in resolving client/stakeholder issues; collaborative communication style
Nice-to-Haves
- Experience with Pointerpro and Quickwork (QW) or similar workflow/integration platforms
- Exposure to dbt, Airflow, or similar orchestration/transformation tools
- Data lineage/observability tools (e.g., Monte Carlo, DataHub) and job schedulers
- Incident response/runbooks for data pipelines
- Ad platform data (e.g., Facebook) and granular audience/device-level reporting
- Certifications in data privacy/security or cloud platforms
- Experience with distributed team management if remote
The Person We're Looking For
- You're pragmatic and detail-oriented.
- You lead by example.
- You're a systems thinker.
- You're collaborative, not siloed.
- Finally, you care about the craft.