About Acelab
Our mission at Acelab is to transform how the building industry makes material decisions. We've created a comprehensive platform that connects architects with the right materials for their projects because we believe materials are fundamental to transforming inspired designs into exceptional buildings.

At the heart of our mission is the understanding that material choices shape aesthetics, performance, sustainability, and ultimately, the human experience of built spaces. We recognize that architects face overwhelming challenges in navigating hundreds of thousands of products while trying to capture and maintain their firm's collective material expertise.

Through our Material Hub platform, we aim to:
- Empower architects to make material choices that truly matter
- Provide easier access to innovative products and collective knowledge
- Elevate not just individual buildings but the entire built environment
- Create a shared language for materials that enables seamless communication between architects, manufacturers, contractors, and clients
- Preserve institutional knowledge within architecture firms
- Streamline the material selection and specification process
Our mission statement, "Because Materials Matter," encapsulates our commitment to elevating material selection from a fragmented, time-consuming process to a strategic aspect of architectural excellence. By connecting the industry's deepest technical database with material decision-making workflow tools, we're working to ensure that every project benefits from better-informed material decisions.
About The Role
We are seeking a Senior Data Engineer to join our AI-focused data team and lead the development of scalable data processing and enrichment pipelines. You'll work at the cutting edge of AI-powered data engineering, building production-grade systems that power our Material Hub platform and enable architects to access comprehensive, up-to-date material information.
The Work You'll Do
- Production Engineering: Transform experimental AI workflows into robust, automated production systems with comprehensive monitoring and quality assurance
- System Architecture: Design scalable data processing pipelines, create reusable modular components, and establish engineering standards for team collaboration
- Data Quality & Automation: Build evaluation frameworks to monitor pipeline quality, implement automated error handling, and reduce manual intervention requirements
- Technical Leadership: Mentor team members, conduct thorough code reviews, and set best practices for AI system design
Requirements
Required Qualifications
Technical Skills
- Advanced Python programming with experience building production data processing systems
- Proven experience productionizing and scaling AI/ML systems using LangChain or similar LLM orchestration frameworks
- Advanced SQL skills with PostgreSQL experience and familiarity with vector databases
- Experience with Google Cloud Platform or another major cloud platform
- Experience with containerization using Docker
- Proficiency with workflow orchestration tools such as Airflow
- Strong system design skills and experience with CI/CD pipelines
Professional Competencies
- Strategic problem-solving, with the judgment to choose between AI-driven and deterministic approaches as appropriate
- Experience mentoring team members and setting technical standards
- Experience conducting thorough code reviews with focus on quality, security, and performance
- Self-motivated with proven ability to take ownership of complex technical initiatives
- Excellent communication skills for working with other engineers, business subject matter experts, and product teams
- Enthusiasm for learning new technologies and sharing knowledge with colleagues
Preferred Qualifications
- Data science experience, including evaluation and monitoring
- MCP (Model Context Protocol) experience
- Graph database (e.g. Neo4j) experience
- Kubernetes experience
- Experience with event-driven architectures (Kafka, GCP Pub/Sub, AWS SQS, or Azure Event Hubs)
- Experience with web scraping
- Interest in or experience with the AEC (Architecture, Engineering, Construction) industry
Responsibilities
- Work within a team to design and implement scalable AI-powered data processing pipelines
- Automate manual processes and build intelligent resource management systems
- Transform proof-of-concept solutions from other team members into production-grade deployed systems
- Establish monitoring, logging, and CI/CD infrastructure
- Provide technical mentorship and conduct code reviews
- Architect reusable components and quality assurance frameworks
Benefits
Impact & Growth Opportunity
- Drive Business Growth: Enable significant expansion of our data capabilities and platform reach
- Shape Technical Direction: Influence system architecture decisions and establish patterns for platform evolution
- Own Critical Infrastructure: Take ownership of core data workflows after onboarding
- Work with Cutting-Edge Technology: Leverage the latest AI/LLM tooling and supporting technologies in production
Join us in building the future of how the architecture industry discovers, evaluates, and specifies building materials.