Solution Design & Development
- Architect, design, and implement large-scale, high-volume, high-performance, and highly available data solutions and pipelines on the Databricks Lakehouse platform in the AWS cloud.
- Develop and maintain data lakes, lakehouses, and data warehouses across on-premises and cloud environments.
- Build robust ETL processes and analytics solutions to support business needs (an illustrative pipeline sketch follows this list).
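
To give a concrete flavor of the ETL and pipeline work described above, the sketch below shows a minimal PySpark job that ingests raw files into a Delta table. It is only an illustration under assumed names: the bucket, schema, table, and columns (example-landing-zone, silver.orders, order_id, order_total) are hypothetical, and a production pipeline on Databricks would add orchestration, schema enforcement, and monitoring.

```python
# Minimal illustrative PySpark/Delta ingestion job; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Read raw JSON files from an assumed S3 landing zone.
raw = spark.read.format("json").load("s3://example-landing-zone/orders/")

# Basic cleansing: deduplicate, stamp the ingest date, drop invalid rows.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingest_date", F.current_date())
       .filter(F.col("order_total") >= 0)
)

# Append to a Delta table partitioned by ingest date.
(
    cleaned.write.format("delta")
           .mode("append")
           .partitionBy("ingest_date")
           .saveAsTable("silver.orders")
)
```

On Databricks, logic like this would typically run as a scheduled job, with the Delta format providing ACID guarantees for the lakehouse layers.
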
Data Engineering & Governance
- Manage the full data lifecycle, including ingestion, ETL, pruning, modeling, and governance, ensuring compliance with regulatory requirements.
- Partner with data owners and platform teams to establish a single source of truth leveraging modern data architecture principles.
Technical Expertise
- Apply a deep understanding of Databricks architecture and AWS-native services.
- Write scalable, reliable, and high-performance Python code, using related frameworks where appropriate.
- Implement CI/CD pipelines, automated testing, and code review best practices (see the test sketch after this list).
- Apply security best practices throughout application development.
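
As a small illustration of the automated-testing practice mentioned above, the sketch below shows a pytest-style unit test for a hypothetical transformation helper; in a typical setup a CI server such as Jenkins would run these tests on every commit. The helper's name and behavior are assumptions for illustration only.

```python
# Illustrative unit tests for a hypothetical transformation helper.
# A CI job (e.g. on Jenkins) would typically run `pytest` on every commit.
import pytest


def normalize_currency(amount: str) -> float:
    """Parse a currency string such as '$1,234.50' into a float."""
    return float(amount.replace("$", "").replace(",", "").strip())


def test_normalize_currency_parses_formatted_amounts():
    assert normalize_currency("$1,234.50") == pytest.approx(1234.50)


def test_normalize_currency_rejects_non_numeric_input():
    with pytest.raises(ValueError):
        normalize_currency("not a number")
```
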
Collaboration & Leadership
- Drive technical decision-making, balancing business priorities, technical constraints, and user experience.
- Identify process inefficiencies and lead continuous improvement initiatives.
- Work effectively with global development and engineering teams.
Preferred Skills
- Experience with financial systems (Oracle Financials, Oracle Fusion Cloud, Hyperion) and with optimizing their integration into broader data ecosystems.
- Familiarity with DevOps practices, cloud security models, encryption strategies, and hybrid network architectures.
- Knowledge of cloud data governance, compute optimization, and cost control.
- Hands-on experience with tools such as Bitbucket, Rally, and Jenkins.
- Exposure to large-scale cloud data migration projects.
- Understanding of SDLC processes, standards, and governance frameworks.
- Proficiency in at least one database technology (Oracle, Exadata, Netezza, SQL Server).
What We Look For
- Strong problem-solving skills and the ability to work in an agile environment.
- Excellent communication and collaboration skills to work across global teams.
- Passion for innovation and continuous learning in modern data engineering practices.