We are seeking a highly skilled Data Engineer with deep expertise in designing, maintaining, and optimizing multi-tenant PostgreSQL architectures. This role owns the platform’s data backbone and ensures its integrity, scalability, security, and performance. You will also own external data integrations, data pipelines, and Professional Services Automation (PSA) domain-specific models.
Key Responsibilities
Database Architecture & Data Modeling
- Design, implement, and maintain the multi-tenant PostgreSQL architecture, including schema-per-tenant and RLS.
- Define logical and physical data models that scale with product growth.
- Establish standards for normalization, denormalization, partitioning, indexing, and materialized views.
- Optimize application-level data interactions by improving schema design and query paths.
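As a rough illustration of the schema-per-tenant and RLS model described above, the sketch below generates the DDL for a shared table protected by a row-level security policy keyed to a per-session tenant setting. All identifiers here (`projects`, `tenant_id`, `app.current_tenant`) are hypothetical, not part of any actual platform schema:

```python
# Illustrative sketch only: build the statements that enable RLS on a table
# and add a policy restricting rows to the tenant named in a session setting.

def rls_ddl(table: str, tenant_col: str = "tenant_id") -> list[str]:
    """Return DDL enabling row-level security on `table`, with a policy
    that limits visible rows to the tenant in app.current_tenant."""
    return [
        f"ALTER TABLE {table} ENABLE ROW LEVEL SECURITY;",
        f"ALTER TABLE {table} FORCE ROW LEVEL SECURITY;",
        (
            f"CREATE POLICY tenant_isolation ON {table} "
            f"USING ({tenant_col} = current_setting('app.current_tenant')::uuid);"
        ),
    ]

for stmt in rls_ddl("projects"):
    print(stmt)
```

In this pattern the application sets `app.current_tenant` at the start of each session (e.g. via `SET`), and every query against the table is then transparently scoped to that tenant.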
Data Integrity, Security & Performance
- Ensure data integrity, consistency, and reliability across all modules.
- Continuously monitor and identify performance bottlenecks across queries, storage, indexing, and schema design.
- Recommend and implement database design improvements, execution plan optimizations, and schema refactoring.
- Manage connection pooling with tools such as PgBouncer or Pgpool-II, defining pooling strategies and ensuring efficient connection management and stability under load.
- Tune VACUUM behavior, autovacuum settings, and memory configuration for sustained performance.
- Implement enterprise-grade database security, encryption, RBAC, and auditing.
Backup, Restore & Disaster Recovery
- Manage and maintain multiple schemas with complete tenant isolation.
- Design, automate, and manage full, incremental, and differential backup strategies.
- Enable point-in-time recovery, schema-level restoration, and tenant-specific rollback workflows.
- Execute backup integrity checks, periodic restoration drills, and disaster recovery validations.
- Maintain WAL archiving, replication, and failover setups.
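To make the point-in-time-recovery workflow above concrete, here is a toy sketch of its first step: choosing the latest base backup taken at or before the requested restore target, after which WAL replay brings the cluster forward to that exact point. The timestamps are made up for illustration:

```python
# Toy illustration of PITR step one: pick the newest base backup that does
# not postdate the restore target. WAL replay (not shown) does the rest.

from datetime import datetime

def pick_base_backup(backups: list[datetime], target: datetime) -> datetime:
    """Return the most recent base backup not newer than `target`."""
    candidates = [b for b in backups if b <= target]
    if not candidates:
        raise ValueError("no base backup precedes the restore target")
    return max(candidates)

# Weekly base backups (hypothetical schedule), restoring to mid-month:
backups = [datetime(2024, 3, d) for d in (1, 8, 15, 22)]
print(pick_base_backup(backups, datetime(2024, 3, 18, 14, 30)))  # 2024-03-15 00:00:00
```

Periodic restoration drills exercise exactly this path end to end: restore the chosen base backup, replay archived WAL to the target, and verify the result.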
External Integrations & Data Pipelines
- Build and maintain ETL/ELT pipelines from PSA/ERP/CRM and other external systems.
- Implement ingestion workflows that ensure deduplication, normalization, mapping, and validation.
- Manage integration challenges such as authentication, throttling, retries, scheduling, and reconciliation.
- Ensure synchronized and reliable data exchange aligned with internal models.
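The deduplication and normalization steps above can be sketched as a minimal in-memory ingestion pass: normalize each external record onto the internal shape, then keep one row per source key, preferring the most recent version. Field names (`source_id`, `email`, `updated_at`) are hypothetical stand-ins for whatever the external PSA/ERP/CRM system provides:

```python
# Minimal in-memory sketch of an ingestion step: normalize, then deduplicate
# on a source key, keeping the latest version of each record.

from datetime import datetime

def normalize(record: dict) -> dict:
    """Map an external record onto the internal shape: trim strings,
    lower-case emails, parse ISO timestamps."""
    return {
        "source_id": str(record["source_id"]).strip(),
        "email": record.get("email", "").strip().lower(),
        "updated_at": datetime.fromisoformat(record["updated_at"]),
    }

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep one row per source_id, preferring the latest updated_at."""
    latest: dict[str, dict] = {}
    for rec in map(normalize, records):
        key = rec["source_id"]
        if key not in latest or rec["updated_at"] > latest[key]["updated_at"]:
            latest[key] = rec
    return list(latest.values())

raw = [
    {"source_id": "42", "email": "A@Example.com ", "updated_at": "2024-01-01T10:00:00"},
    {"source_id": "42", "email": "a@example.com", "updated_at": "2024-02-01T10:00:00"},
    {"source_id": "7",  "email": "b@example.com", "updated_at": "2024-01-15T09:30:00"},
]
print(len(deduplicate(raw)))  # 2 -- the two "42" records collapse to the newer one
```

A production pipeline would add the concerns listed above (authentication, throttling, retries, scheduling, reconciliation) around this core transform.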
Collaboration & Product Support
- Work closely with backend engineers to refine ORM patterns, reduce query load, and optimize data access.
- Support analytics/reporting needs by enabling efficient query models and derived datasets.
- Translate product requirements into scalable database structures.
- Document schemas, data flows, backups, performance guidelines, and integration logic.
DevOps & Operational Responsibilities
- Manage database migrations, environment alignment, and safe deployment rollouts.
- Implement monitoring and observability for slow queries, replication lag, connection saturation, and storage growth.
- Contribute to CI/CD pipelines for data and database lifecycle processes.
- Handle capacity planning, HA setups, and infrastructure tuning.
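As a small example of the slow-query monitoring described above, the sketch below triages rows shaped like `pg_stat_statements` output (query text, call count, mean execution time in milliseconds), flagging queries over a latency threshold and ranking them by total time consumed. The threshold, row shape, and sample queries are illustrative assumptions, not a prescribed setup:

```python
# Rough sketch of a slow-query triage step for a monitoring job, over rows
# shaped like pg_stat_statements output. Threshold and data are illustrative.

def flag_slow_queries(rows: list[dict], mean_ms_threshold: float = 250.0) -> list[dict]:
    """Return rows whose mean execution time exceeds the threshold,
    ordered by total time spent (mean * calls), worst offender first."""
    slow = [r for r in rows if r["mean_exec_time"] > mean_ms_threshold]
    return sorted(slow, key=lambda r: r["mean_exec_time"] * r["calls"], reverse=True)

stats = [
    {"query": "SELECT * FROM timesheets WHERE ...", "calls": 1200,  "mean_exec_time": 480.0},
    {"query": "INSERT INTO audit_log ...",          "calls": 90000, "mean_exec_time": 2.1},
    {"query": "SELECT ... FROM invoices JOIN ...",  "calls": 300,   "mean_exec_time": 900.0},
]
for row in flag_slow_queries(stats):
    print(row["query"])
```

Ranking by total time rather than mean alone keeps attention on queries that actually dominate the workload, not just occasional outliers.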
Required Technical Competencies
Core Skills
- PostgreSQL Expertise:
Schema design, multi-tenancy, advanced SQL, indexing, partitioning, query optimization, execution planning, RLS.
- Performance Optimization:
Ability to identify bottlenecks, analyze slow queries, recommend schema/query improvements, and tune workloads.
- Connection Pooling:
Experience with PgBouncer/PgPool, pool configuration, load handling, transaction pooling, session management.
- Backup & Disaster Recovery:
PITR, WAL archiving, incremental backups, tenant-level restore, HA and DR strategies.
- Data Integration:
ETL/ELT pipelines, ingestion frameworks, transformation logic, CDC pipelines. Experience working with PSA/ERP/CRM platforms and external APIs.
- Python:
Strong Python skills for automation, ETL, tooling, and data workflows.
- Cloud & DevOps:
Managed PostgreSQL services, CI/CD for database changes, containerized workflows.
Domain Expertise
- Practical experience in the Professional Services Automation (PSA) domain.
- Understanding of projects, timesheets, billing, utilization, and financial data models.
Good-to-Have Qualifications
- Knowledge of FastAPI or backend microservices patterns.
- Familiarity with workflow/BPMN engines.
- Experience with Azure PostgreSQL Flexible Server, Data Factory, or AKS.
- Exposure to event-driven data architectures.
Why Join Us?
This is an opportunity to be a foundational member of a fast-paced, high-growth product team. You will be empowered to make significant technical decisions and see your contributions directly impact the business. We offer a challenging yet rewarding environment where technical innovation and ownership are highly valued. If you are a seasoned technologist ready to take on a senior role and build a world-class product, we encourage you to apply.

Apply Now and let us redefine the future of Services Management.