What you will do:
In this vital role, you will be part of the Data Platform Engineering team, supporting and maintaining cloud and big data solutions used by functional teams such as Manufacturing, Commercial, and Research and Development
- Work closely with the Enterprise Data Lake/Fabric delivery and platform teams to ensure that the applications are aligned with the overall architectural and development guidelines
- Research and evaluate technical solutions, including Databricks, AWS services, Kubernetes/EKS, NoSQL databases, and data science packages, platforms, and tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, maintainability, and cost management
- Manage Enterprise Data Lake/Fabric platform incidents related to Databricks, AWS services, Kubernetes/EKS, NoSQL databases, and related platforms and tools
- Assist in building and managing relationships with internal and external business stakeholders
- Develop a basic understanding of core business problems and identify opportunities to automate manual resolution processes
- Collaborate with Enterprise Data Lake/Fabric ecosystem leads to support the integration and operational readiness of new data management and processing tools identified for the platform
- Work to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management
- Work in an Agile environment and participate in Agile ceremonies
- Take on new responsibilities, embrace challenges, and master new technologies
- Work during US business hours to support cross-functional teams including Manufacturing, Commercial, and Research & Development in their use of Enterprise Data Lake/Fabric
Basic Qualifications:
- Bachelor's degree and 2 to 6 years of relevant experience
- Experience with Databricks capabilities including but not limited to cluster setup, execution, and tuning
- Experience with AWS services including but not limited to MSK, IAM, EC2, EKS, and S3
- Knowledge of data lake, data fabric, and data mesh concepts
- Experience with platform performance optimization
- Knowledge of relational databases and relational database concepts
- Knowledge of ETL/ELT pipelines; hands-on experience with SQL and NoSQL
- Programming skills in one or more languages such as SQL, Python, or Java
- Experience with software engineering best practices, including but not limited to version control (Git, GitLab), CI/CD (GitLab or similar), automated unit testing, and DevOps
- Ability to assist and support business users effectively
Preferred Qualifications:
- Experience with cloud technologies; AWS preferred
- Cloud certifications (AWS, Databricks, or Microsoft)
- Familiarity with the use of AI for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent
- Knowledge of Agile and DevOps practices
- Knowledge of disaster recovery planning
- Familiarity with load testing tools (JMeter, Gatling)
- Basic understanding of AI/ML for monitoring
- Knowledge of distributed systems and microservices
- Data visualization skills (Tableau, Power BI)
- Understanding of compliance and auditing requirements
- Knowledge of low-code/no-code platforms such as Prophecy
- Familiarity with shell scripting
- Working experience with ServiceNow
Soft Skills:
- Excellent analytical and problem-solving skills
- Excellent written and verbal communication skills (English), with the ability to translate technical content into business language for audiences at various levels
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong time and task management skills to estimate and meet project timelines, with the ability to bring consistency and quality assurance across projects