Job Location: Singapore
Job Title: Data Engineer + AI/ML (hands-on experience), 6-8 years of experience

Qualifications:
- A relevant academic degree such as an M.Sc., M.Tech, or B.Tech in Computer Science, Data Engineering, or a similar field
- Knowledge of and experience with data warehousing on Azure Data Lake
- Experience with Spark on Databricks and Delta Lake tables
- Strong programming experience, specifically in Python or PySpark
- Experience setting up and working with DevOps pipelines
- Experience with Infrastructure as Code (ARM/Bicep or Terraform) is a plus
- Experience with data management, BI architecture, and dashboard creation is preferred
- Excellent communication skills and fluency in spoken and written English

Your responsibilities:
- Combine data engineering skills with sound business sense to facilitate data-driven decision making across the organization
- Act as the subject matter expert within data engineering, with a primary focus on delivering digital services and products
- Design and develop data pipelines over batch and streaming data using Azure PaaS services
- Ensure that infrastructure, tools, and procedures are available to facilitate efficient execution of the digitalization roadmap and empowerment of end users

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,200,000.00 per year
Application Deadline: 15/09/2025
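To give candidates a flavor of the Delta Lake work described above, here is a minimal, hypothetical sketch of the upsert (MERGE) semantics that Delta Lake tables provide, written in plain Python so it can be read without a Spark cluster; on Databricks this would be expressed with PySpark and `MERGE INTO`, and all names below are illustrative assumptions.

```python
# Illustrative sketch only: emulates the MERGE (upsert) semantics that
# Delta Lake tables provide, using plain Python dicts instead of Spark.
# Table contents and the "id" key are hypothetical.

def merge_upsert(target, updates, key="id"):
    """Upsert `updates` into `target`: rows with matching keys are
    overwritten, new keys are appended (MERGE INTO behavior)."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update if matched, insert if not
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
updates = [{"id": 2, "amount": 250}, {"id": 3, "amount": 300}]
result = merge_upsert(target, updates)
```

The same update-if-matched/insert-if-not logic is what a candidate would write declaratively against a Delta table in a batch or streaming pipeline.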
We’re looking for professionals with 5-8 years of experience in Kubernetes, Apache Kafka, and API Gateway application installation/deployment. Candidates from Hyderabad, Singapore, or remote locations are preferred. Experience in DevOps or platform setup is an added advantage.
Job Type: Full-time
Pay: ₹390,223.06 - ₹1,567,505.77 per year
Work Location: Hybrid remote in Hyderabad, Telangana
As a professional with 5-8 years of experience in Kubernetes, Apache Kafka, and API Gateway application installation/deployment, you will be responsible for:
- Installing and deploying Kubernetes, Apache Kafka, and API Gateway applications
- Collaborating with team members to ensure successful deployments
- Providing technical support and troubleshooting as needed
Preferred candidates will have experience in DevOps or platform setup. Candidates from Hyderabad, Singapore, or remote locations are preferred for this full-time position. The work location is a hybrid remote setup in Hyderabad, Telangana.
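As a rough illustration of the deployment work this role involves, the sketch below assembles a minimal Kubernetes Deployment manifest in Python for a hypothetical Kafka-client service; the service name and image are assumptions, and in practice such manifests are written as YAML and applied with `kubectl` or Helm.

```python
import json

# Illustrative only: builds a minimal Kubernetes Deployment manifest for a
# hypothetical service ("kafka-bridge"); the name and image are assumptions.
def deployment_manifest(name, image, replicas=2):
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

manifest = deployment_manifest("kafka-bridge", "example/kafka-bridge:1.0")
print(json.dumps(manifest, indent=2))
```

The key structural point a deployment engineer must get right is that `spec.selector.matchLabels` matches the pod template's labels, which the sketch guarantees by reusing one `labels` dict.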
We are looking for a Junior Data Engineer with hands-on experience in Azure Data Factory (ADF), Talend, and Snowflake to support our data integration, workflow automation, and cloud data warehousing initiatives. The ideal candidate has strong fundamentals in ETL/ELT pipelines and is eager to grow in the data engineering space.

Key Responsibilities:
- Build, schedule, and monitor ETL/ELT pipelines using Azure Data Factory
- Develop data integration workflows using Talend (Open Studio / Enterprise)
- Load, transform, and validate data within the Snowflake cloud data warehouse
- Work with SQL queries, stored procedures, and data modeling concepts
- Assist in migrating on-premises data solutions to cloud platforms
- Maintain data quality, perform debugging, and support production issues
- Collaborate with senior engineers, analysts, and stakeholders to understand data requirements
- Document data flows, pipeline processes, and technical configurations

Job Type: Full-time
Pay: ₹300,000.00 - ₹450,000.00 per year
Work Location: In person
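The data-quality and validation duties above can be sketched as a small Python check of the kind a junior engineer might run after a load into Snowflake; the column names and rules here are hypothetical, not a real schema.

```python
# Illustrative data-quality check of the kind run after an ETL/ELT load.
# The column names ("order_id", "amount") and rules are hypothetical.

def validate_rows(rows, required=("order_id", "amount")):
    """Split rows into (good_rows, errors): every required field must be
    present and non-null, and the amount must be non-negative."""
    good, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in required if row.get(c) is None]
        if missing:
            errors.append((i, f"missing {missing}"))
        elif row["amount"] < 0:
            errors.append((i, "negative amount"))
        else:
            good.append(row)
    return good, errors

rows = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "A2", "amount": -5.0},  # fails: negative amount
    {"order_id": None, "amount": 3.0},   # fails: missing key
]
good, errors = validate_rows(rows)
```

In a real pipeline the same checks would typically be pushed down into ADF data flows, Talend components, or SQL constraints rather than row-by-row Python.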
We are looking for a skilled Vault Engineer with 2-3 years of hands-on experience. The ideal candidate has strong knowledge of Vault administration, secrets management, security best practices, and automation.

Key Responsibilities:
- Install, configure, and maintain Vault in production and non-production environments
- Manage authentication methods, dynamic secrets, policies, and access controls
- Monitor Vault performance; conduct regular maintenance, upgrades, and backups
- Implement best practices for secrets management, credential rotation, and encryption workflows
- Integrate Vault with applications, CI/CD tools, Kubernetes, cloud platforms, and databases
- Develop automation using Terraform, Ansible, or scripting languages (Shell/Python)
- Troubleshoot Vault issues, perform root cause analysis, and ensure high availability
- Work closely with DevOps, Security, and Development teams to support secure delivery pipelines

Job Type: Full-time
Pay: ₹341,317.79 - ₹863,369.02 per year
Work Location: In person
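To illustrate the policy and access-control side of the role, the sketch below renders a least-privilege Vault policy document (HCL text) from Python; the app name and KV mount path are hypothetical, and in practice such a policy would be applied with `vault policy write` or managed through Terraform.

```python
# Illustrative only: renders a least-privilege Vault policy (HCL text)
# granting an app read access to its own secrets under a KV v2 mount.
# The mount name ("secret") and app name are hypothetical.

def kv_read_policy(app, mount="secret"):
    """Return HCL granting read-only access to the app's KV v2 data path."""
    path = f"{mount}/data/{app}/*"
    return (
        f'path "{path}" {{\n'
        f'  capabilities = ["read"]\n'
        f'}}\n'
    )

policy = kv_read_policy("payments-api")
print(policy)
```

Scoping each application to its own subtree like this is the day-to-day expression of the "secrets management and access controls" responsibility listed above.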