The Principal DevOps Engineer, Database & Data Operations, will drive the strategic implementation of automation, pipelines, and infrastructure solutions supporting large-scale data platforms. This role involves designing and optimizing CI/CD workflows for data services, implementing Infrastructure as Code with Terraform and Ansible, and managing containerized environments. You will partner closely with database and data engineering teams to ensure scalability, reliability, and performance across critical systems. As a senior contributor, you'll solve complex technical challenges, guide cross-functional initiatives, and align DevOps practices with organizational goals.
Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
- Partner with DBA and Data Engineering teams to optimize automation for PostgreSQL, Oracle, SQL Server, and cloud-native databases.
- Improve database reliability, availability, and cost efficiency through DevOps practices.
- Lead enterprise adoption of Terraform, Ansible, Jenkins, and similar frameworks to manage infrastructure, configuration, and deployments.
- Establish scalable, secure patterns for database infrastructure and supporting services.
- Design and optimize CI/CD workflows for data pipelines and database migrations, integrating schema changes, versioning, and provisioning.
- Implement automation to ensure fast, safe delivery of database and analytics solutions.
- Oversee strategic use of cloud technologies (AWS, Azure, GCP, OCI) for data workloads and automation.
- Provide technical guidance on hybrid and multi-cloud deployments.
- Manage data-centric containerized deployments using Docker, Kubernetes, and Helm.
- Optimize orchestration for stateful workloads and large-scale data clusters.
- Develop and maintain robust observability frameworks to monitor database health, data pipelines, and infrastructure.
- Anticipate issues through metrics, logging, and alerting.
- Implement automation for database security policies, patching, and compliance audits.
- Drive initiatives that reduce manual overhead, cut costs, and improve uptime and performance.
- Ensure DevOps strategies support business objectives, SLAs, and cost-saving initiatives for data operations.
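The "fast, safe delivery" of schema changes described above typically rests on versioned, idempotent migrations: each change runs exactly once, in order, and is recorded so pipeline reruns are harmless. The following is a minimal sketch of that pattern, using an in-memory SQLite database and a hypothetical `MIGRATIONS` list as stand-ins for a real migration tool and production database:

```python
import sqlite3

# Hypothetical migration list: (version, SQL) pairs applied in order.
MIGRATIONS = [
    (1, "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE users ADD COLUMN email TEXT"),
]

def apply_migrations(conn):
    """Apply any not-yet-applied migrations; safe to call repeatedly."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (version INTEGER PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_version")}
    for version, sql in MIGRATIONS:
        if version in applied:
            continue  # already applied on an earlier run; skip for idempotency
        conn.execute(sql)
        conn.execute("INSERT INTO schema_version (version) VALUES (?)", (version,))
    conn.commit()

conn = sqlite3.connect(":memory:")
apply_migrations(conn)
apply_migrations(conn)  # rerunning the pipeline applies nothing twice
print([col[1] for col in conn.execute("PRAGMA table_info(users)")])
```

Production tools such as Flyway or Liquibase implement the same idea with richer locking and rollback support; the essential invariant is the recorded version table that makes CI/CD reruns safe.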
To ensure you're set up for success, you will bring the following skillset & experience:
- 12+ years of professional experience with a bachelor's degree, OR
- 9+ years of experience with an advanced degree.
- Strong background in database DevOps, data operations, or cloud database automation.
- IaC & Automation: HashiCorp Terraform Professional, Ansible Automation certifications.
- Containerization: Docker Certified Associate, Certified Kubernetes Administrator (CKA).
- DevOps: DevOps Institute certifications (DevOps Foundation, DevSecOps Engineering).
- Technical Proficiency: Advanced knowledge of Terraform, Ansible, CI/CD tooling, Kubernetes, and database automation.
- Database Expertise: Experience supporting PostgreSQL, Oracle, SQL Server, and/or modern data platforms.
- Scripting: Skilled in writing and reading scripting and programming languages, including Bash and Python.
- Pipeline Management: Skilled in building, optimizing, and automating data pipelines for large-scale operations.
- Business Acumen: Aligns DevOps automation with business vision and cost savings.
- Collaboration & Leadership: Works closely with DBAs, Data Engineers, and cross-functional stakeholders.
- Communication: Clearly explains complex technical solutions to both technical and non-technical audiences.
- Problem Solving: Anticipates and mitigates risks to maintain database reliability and performance.
- Adaptability: Stays ahead of evolving data technologies and prepares the team for adoption.
- Mentoring: Provides technical mentorship in DevOps and database automation best practices.