We are seeking a highly skilled Data Engineer with deep expertise in designing, maintaining, and optimizing multi-tenant PostgreSQL architectures. This role owns the platform's data backbone and ensures its integrity, scalability, security, and high performance. You will also handle external data integrations, data pipelines, and PSA domain-specific models.
Key Responsibilities
1. Database Architecture & Data Modeling
Design, implement, and maintain the multi-tenant PostgreSQL architecture, including schema-per-tenant isolation and row-level security (RLS). Define logical and physical data models that scale with product growth. Establish standards for normalization, denormalization, partitioning, indexing, and materialized views. Optimize application-level data interactions by improving schema design and query paths.
2. Data Integrity, Security & Performance
Ensure data integrity, consistency, and reliability across all modules. Continuously monitor and identify performance bottlenecks across queries, storage, indexing, and schema design. Recommend and implement database design improvements, execution-plan optimizations, and schema refactoring. Manage connection pooling using tools such as PgBouncer or PgPool, ensuring efficient connection management, pooling strategies, and stability under load. Tune vacuuming, autovacuum settings, and memory configurations for sustained performance.
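As a sketch of the pooling work described above, the snippet below derives a starting pool size from a common heuristic (roughly cores × 2 plus spindle count) and renders an ini-style PgBouncer fragment. The heuristic and values are illustrative starting points, not tuned recommendations; `pool_mode`, `default_pool_size`, and `max_client_conn` are real PgBouncer settings.

```python
# Illustrative sketch: derive a PgBouncer pool size from a common
# heuristic and render an ini-style config fragment. Values are
# starting points for load testing, not tuned numbers.

def pool_size(cores: int, spindles: int = 1) -> int:
    # Common rule of thumb: connections ~ cores * 2 + spindles.
    return cores * 2 + spindles

def pgbouncer_fragment(cores: int, max_clients: int = 1000) -> str:
    return "\n".join([
        "[pgbouncer]",
        "pool_mode = transaction",          # transaction pooling
        f"default_pool_size = {pool_size(cores)}",
        f"max_client_conn = {max_clients}",
    ])
```

Transaction pooling gives the highest connection reuse but is incompatible with session-level features such as prepared statements held across transactions, which is part of why pooling strategy is a design decision rather than a default.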
Implement enterprise-grade database security, encryption, RBAC, and auditing.
3. Backup, Restore & Disaster Recovery
Manage and maintain multiple schemas with complete tenant isolation. Design, automate, and manage full, incremental, and differential backup strategies. Enable point-in-time recovery (PITR), schema-level restoration, and tenant-specific rollback workflows. Execute backup integrity checks, periodic restoration drills, and disaster-recovery validations.
Maintain WAL archiving, replication, and failover setups.
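The backup and PITR workflow above can be sketched as building a physical base backup command and a recovery configuration fragment. The archive path and recovery target below are hypothetical placeholders; `pg_basebackup` flags and the `restore_command` / `recovery_target_time` settings are real PostgreSQL facilities.

```python
# Illustrative sketch: assemble a physical base backup command and a
# point-in-time-recovery (PITR) config fragment. Paths and the target
# timestamp are placeholders, not real environment values.

def base_backup_cmd(backup_dir: str) -> list[str]:
    # pg_basebackup with tar output and streamed WAL (-Ft -X stream).
    return ["pg_basebackup", "-D", backup_dir, "-Ft", "-X", "stream", "-P"]

def pitr_conf(archive_dir: str, target_time: str) -> str:
    # Recovery settings for replaying archived WAL up to a timestamp.
    return "\n".join([
        f"restore_command = 'cp {archive_dir}/%f %p'",
        f"recovery_target_time = '{target_time}'",
    ])
```

Scripting the commands keeps backup and drill procedures reproducible, which matters for the periodic restoration drills this role owns.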
4. External Integrations & Data Pipelines
Build and maintain ETL/ELT pipelines from PSA/ERP/CRM and other external systems. Implement ingestion workflows that ensure deduplication, normalization, mapping, and validation. Manage integration challenges such as authentication, throttling, retries, scheduling, and reconciliation.
Ensure synchronized and reliable data exchange aligned with internal models.
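As a sketch of the deduplication and normalization steps above, the snippet below normalizes incoming records and drops exact duplicates via a content hash. The field names are hypothetical; real PSA/ERP/CRM payloads differ.

```python
# Illustrative sketch: normalize external records, then deduplicate
# on a stable content hash before loading. Field names are examples.
import hashlib

def normalize(record: dict) -> dict:
    return {
        "email": record.get("email", "").strip().lower(),
        "amount": round(float(record.get("amount", 0)), 2),
    }

def dedupe_key(record: dict) -> str:
    # Stable hash of the normalized content, so re-ingesting the same
    # payload on a retry does not produce duplicate rows.
    return hashlib.sha256(repr(sorted(record.items())).encode()).hexdigest()

def ingest(records: list[dict]) -> list[dict]:
    seen, out = set(), []
    for rec in map(normalize, records):
        key = dedupe_key(rec)
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out
```

Hashing after normalization is the key design choice here: it makes retried or re-delivered payloads idempotent, which is what reconciliation-friendly ingestion requires.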
5. Collaboration & Product Support
Work closely with backend engineers to refine ORM patterns, reduce query load, and optimize data access.
Support analytics/reporting needs by enabling efficient query models and derived datasets.
Translate product requirements into scalable database structures. Document schemas, data flows, backups, performance guidelines, and integration logic.
6. DevOps & Operational Responsibilities
Manage database migrations, environment alignment, and safe deployment rollouts. Implement monitoring and observability for slow queries, replication lag, connection saturation, and storage growth. Contribute to CI/CD pipelines for data and database lifecycle processes.
Handle capacity planning, HA setups, and infrastructure tuning.
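The slow-query monitoring mentioned above can be sketched as filtering rows shaped like `pg_stat_statements` output (a real PostgreSQL extension exposing `query`, `calls`, and `mean_exec_time` in milliseconds). The threshold and sample rows below are hypothetical.

```python
# Illustrative sketch: flag slow statements from rows shaped like
# pg_stat_statements output. Threshold and data are hypothetical.

SLOW_MS = 250.0  # assumed alerting threshold, not a standard value

def slow_queries(rows: list[dict], threshold_ms: float = SLOW_MS) -> list[dict]:
    """Return statements at or above the latency threshold,
    slowest first, for dashboards or alerting."""
    flagged = [r for r in rows if r["mean_exec_time"] >= threshold_ms]
    return sorted(flagged, key=lambda r: r["mean_exec_time"], reverse=True)
```

In practice the same filter would run as a query against `pg_stat_statements` itself; doing the shaping in code keeps the threshold and ranking logic testable without a live database.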
Required Technical Competencies
Core Skills
PostgreSQL (Expert):
Schema design, multi-tenancy, advanced SQL, indexing, partitioning, query optimization, execution planning, RLS.
Performance Optimization:
Ability to identify bottlenecks, analyze slow queries, recommend schema/query improvements, and tune workloads.
Connection Pooling:
Experience with PgBouncer/PgPool, pool configuration, load handling, transaction pooling, session management.
Backup & Recovery:
PITR, WAL archiving, incremental backups, tenant-level restore, HA and DR strategies.
Data Engineering:
ETL/ELT pipelines, ingestion frameworks, transformation logic, CDC pipelines.
Integrations:
Experience working with PSA/ERP/CRM platforms and external APIs.
Scripting:
Strong Python skills for automation, ETL, tooling, and data workflows.
Cloud & DevOps:
Managed PostgreSQL services, CI/CD for database changes, containerized workflows.
Domain Expertise
Practical experience in the Professional Services Automation (PSA) domain. Understanding of projects, timesheets, billing, utilization, and financial data models.
Good-to-Have Qualifications
Knowledge of FastAPI or backend microservices patterns.
Familiarity with workflow/BPMN engines.
Experience with Azure PostgreSQL Flexible Server, Data Factory, or AKS.
Exposure to event-driven data architectures.
Why Join Us
This is an opportunity to be a foundational member of a fast-paced, high-growth product team. You will be empowered to make significant technical decisions and see your contributions directly impact the business. We offer a challenging yet rewarding environment where technical innovation and ownership are highly valued. If you are a seasoned technologist ready to take on a Senior role and build a world-class product, we encourage you to apply.
Apply Now and let us redefine the future of Services Management.