Role: Python Developer
Location: Remote - India
Experience: Min 6-7 years (at least 2 years of experience in the Insurance domain)

Job Description (client Q&A):

1. What specific Python skills or frameworks are most important?
   Python developers with knowledge of:
   1) Core Python
   2) Understanding of creating libraries, preferably on the Hx Platform
   3) General understanding of Azure APIM
   4) Experience writing/consuming APIs using OpenAPI specs

2. How critical is hyperexponential (Hx) experience: required, or just nice to have?
   Highly desired, but we concede that Hx experience is a niche skill that is difficult to source.

3. Do you have a preferred offshore location, or just need time-zone overlap? Preference for work hours? Can they work remotely, or do they need to be in an office?
   The working day can end around 12 pm ET, so there is overlap with the UK and with North America.

4. Is this a single-developer role, or could it expand to a small team?
   This would add to the small team we currently have.

5. What tools/processes do you use (Agile, Jira, GitHub, etc.)?
   Agile (still a work in progress), Azure DevOps, Git.
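To illustrate the OpenAPI skill listed above, here is a minimal Python sketch that reads an OpenAPI 3.0 document and enumerates its operations. The spec fragment (a hypothetical "Policy API"), the `list_operations` helper, and all field values are invented for illustration; they are not part of the client's actual platform.

```python
import json

# A tiny, hypothetical OpenAPI 3.0 fragment of the kind this role would consume.
SPEC_JSON = """
{
  "openapi": "3.0.0",
  "info": {"title": "Policy API", "version": "1.0.0"},
  "paths": {
    "/policies": {
      "get": {"operationId": "listPolicies", "summary": "List policies"},
      "post": {"operationId": "createPolicy", "summary": "Create a policy"}
    },
    "/policies/{id}": {
      "get": {"operationId": "getPolicy", "summary": "Get one policy"}
    }
  }
}
"""

def list_operations(spec: dict) -> list[tuple[str, str, str]]:
    """Return (METHOD, path, operationId) for every operation in the spec."""
    ops = []
    for path, methods in spec.get("paths", {}).items():
        for method, details in methods.items():
            ops.append((method.upper(), path, details.get("operationId", "")))
    return sorted(ops)

if __name__ == "__main__":
    spec = json.loads(SPEC_JSON)
    for method, path, op_id in list_operations(spec):
        print(f"{method:6} {path:16} -> {op_id}")
```

In practice the same walk over `paths` is what client-generation tools do before emitting typed request functions; candidates comfortable with this structure can both consume and publish OpenAPI-described services.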
Title: Azure Databricks Engineer with Ingestion experience
Location: Remote (India)
Duration: 6 months (extendable to 1+ year for a strong performer)
Time: 10:00 AM - 7:00 PM IST
Note: Good communication skills are required; the candidate will coordinate with US-based colleagues on a daily basis.

We are seeking an Ingestion Engineer with expertise in CI/CD and Kubernetes to join our team. This role requires a strong understanding of data ingestion processes, focusing on integrating various data sources into Databricks. The ideal candidate will have hands-on experience with data pipelines, automation, and infrastructure management to support scalable and efficient ingestion workflows.

Key Responsibilities:
- Design, develop, and optimize data ingestion pipelines for integrating multiple sources into Databricks.
- Implement and maintain CI/CD pipelines for data workflows.
- Deploy and manage containerized applications using Kubernetes.
- Ensure data integrity, security, and compliance throughout the ingestion process.
- Collaborate with Data Engineers and other stakeholders to streamline data ingestion strategies.
- Troubleshoot and optimize ingestion pipelines for performance and scalability.

Required Skills & Qualifications:
- Proven experience in data ingestion and pipeline development.
- Hands-on experience with CI/CD tools (GitHub Actions, Jenkins, Azure DevOps, etc.).
- Strong knowledge of Kubernetes and container orchestration.
- Experience with Databricks, Spark, and data lake architectures.
- Proficiency in Python, Scala, or SQL for data processing.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Strong problem-solving and analytical skills.

Preferred Qualifications:
- Experience with Infrastructure as Code (Terraform, Helm, etc.).
- Background in streaming data ingestion (Kafka, Kinesis, etc.).
- Knowledge of data governance and security best practices.
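To give a concrete flavor of the ingestion work described above, here is a minimal, single-machine Python sketch of a "keep the newest version of each record" merge across multiple sources. It is a simplified stand-in for the kind of Spark/Delta upsert an ingestion pipeline would run on Databricks; the `ingest` function, the source names, and the sample records are all hypothetical.

```python
def ingest(batches, key="id"):
    """Merge record batches from multiple sources, keeping the newest
    version of each key (newest judged by the ISO-8601 'updated_at' field,
    which compares correctly as a string)."""
    latest = {}
    for batch in batches:
        for record in batch:
            k = record[key]
            # Keep the record with the most recent updated_at timestamp.
            if k not in latest or record["updated_at"] > latest[k]["updated_at"]:
                latest[k] = record
    return sorted(latest.values(), key=lambda r: r[key])

if __name__ == "__main__":
    # Two hypothetical upstream sources feeding the lake.
    crm = [{"id": 1, "name": "Acme", "updated_at": "2024-01-01"}]
    erp = [{"id": 1, "name": "Acme Corp", "updated_at": "2024-03-01"},
           {"id": 2, "name": "Globex", "updated_at": "2024-02-15"}]
    for row in ingest([crm, erp]):
        print(row)
```

On Databricks the same logic would typically be expressed as a Delta Lake `MERGE INTO` (or an Auto Loader stream feeding a merge), with the CI/CD and Kubernetes skills in the posting covering how that job is tested, packaged, and deployed.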