5.0 - 8.0 years
22 - 30 Lacs
Noida, Hyderabad, Bengaluru
Hybrid
Role: Data Engineer
Experience: 5 to 8 years
Location: Bangalore, Noida, and Hyderabad (hybrid; two days per week in office required)
Notice period: Immediate to 15 days (only immediate joiners preferred)
Note: Candidates must have experience in Python, Kafka Streams, PySpark, and Azure Databricks. Candidates with experience only in PySpark and not in Python will not be considered.

Job Title: SSE - Kafka, Python, and Azure Databricks (Healthcare Data Project)

Role Overview: We are looking for a highly skilled engineer with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.

Key Responsibilities:
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
- Architect scalable data streaming and processing solutions to support healthcare data workflows.
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
- Stay current with the latest cloud technologies, big data frameworks, and industry trends.

Required Skills & Qualifications:
- 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
- Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
- Experience with Azure Databricks (or willingness to learn and adopt it quickly).
- Hands-on experience with cloud platforms (Azure preferred; AWS or GCP is a plus).
- Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
- Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
- Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
- Strong analytical skills, a problem-solving mindset, and the ability to lead complex data projects.
- Excellent communication and stakeholder management skills.

Email: Sam@hiresquad.in
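For context on what a minimal real-time pipeline of this kind can look like, below is a hedged sketch of a PySpark Structured Streaming job (as run on Databricks) that reads a Kafka topic and writes to a Delta table. The broker addresses, topic name, checkpoint/table paths, and event schema are hypothetical placeholders, not details taken from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# On Databricks a SparkSession already exists; getOrCreate reuses it.
spark = SparkSession.builder.appName("claims-stream").getOrCreate()

# Hypothetical schema for an inbound healthcare claims event.
schema = StructType([
    StructField("claim_id", StringType()),
    StructField("member_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder brokers
    .option("subscribe", "claims-events")                # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("evt"))
       .select("evt.*")
)

query = (
    parsed.writeStream
    .format("delta")                                    # Delta is the default table format on Databricks
    .option("checkpointLocation", "/tmp/chk/claims")    # placeholder checkpoint path
    .outputMode("append")
    .start("/tmp/tables/claims")                        # placeholder table path
)
query.awaitTermination()
```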
Posted 6 days ago
5.0 - 10.0 years
5 - 10 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities Looking exp in 5+ Yrs exp in Confluent Kafka Administrator-Technology Lead Kafka Administrator Required Skills & Experience: Hands-on experience in Kafka Cluster Management Proficiency with Kafka Connect Knowledge of Cluster Linking and MirrorMaker Experience setting up Kafka clusters from scratch Experience on Terraform/Ansible script Ability to install and configure the Confluent Platform Understanding of rebalancing, Schema Registry, and REST Proxies Familiarity with RBAC (Role-Based Access Control) and ACLs (Access Control Lists) Interested candidate share me your updated resume in recruiter.wtr26@walkingtree.in
Posted 2 weeks ago
5.0 - 10.0 years
6 - 11 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities Kafka Administrator Required Skills & Experience: Looking exp around 5+ Years. Hands-on experience in Kafka Cluster Management Proficiency with Kafka Connect Knowledge of Cluster Linking and MirrorMaker Experience setting up Kafka clusters from scratch Experience on Terraform/Ansible script Ability to install and configure the Confluent Platform Understanding of rebalancing, Schema Registry, and REST Proxies Familiarity with RBAC (Role-Based Access Control) and ACLs (Access Control Lists) Share me your updated resume recruiter.wtr26@walkingtree.in
Posted 2 weeks ago
8 - 10 years
27 - 32 Lacs
Hyderabad
Work from Office
Your Primary Responsibilities:
- Lead technical processes and designs, considering reliability, data integrity, maintainability, reuse, extensibility, usability, and scalability.
- Collaborate with infrastructure partners to identify and deploy optimal hosting environments.
- Define scalability and performance criteria for assigned applications.
- Ensure applications meet performance, privacy, and security requirements.
- Tune application performance to eliminate or reduce issues.
- Verify test plans to ensure compliance with performance and security requirements.
- Support business and technical presentations relating to technology platforms and business solutions.
- Mitigate risk by following established procedures, monitoring controls, spotting key errors, and demonstrating strong ethical behavior.
- Help develop solutions that balance cost and delivery while meeting business requirements.
- Implement technology-specific best practices that are consistent with corporate standards.
- Partner with cross-functional teams to ensure the success of product strategy and project deliverables.
- Manage the software development process and drive new technical and business process improvements.
- Estimate total costs of modules/projects, covering both hours and expenses.
- Research and evaluate specific technologies and applications, and contribute to the solution design.
- Construct application architecture encompassing end-to-end designs.

Qualifications:
- Minimum 8-10 years of related experience.
- Bachelor's degree preferred, or equivalent experience.

Talents Needed for Success:
- Hands-on experience in software development using design patterns, Java, TypeScript, Java EE, Spring Boot, Angular 8+, JMS, REST APIs, PL/SQL, and Python.
- Experience with microservices and layered (SOA/MVC) architecture, on premises and in the cloud (AWS preferred).
- Familiarity with developing and running applications in Windows and Linux environments.
- Expertise in deploying scalable solutions in Kubernetes/Docker containers that are highly resilient and perform well in an environment spanning legacy systems and future-centric microservice architecture.
- Demonstrable experience in software development using CI/CD tools, especially Git, Bitbucket, Maven, Jenkins, and Jira.
- Experience using development tools such as Visual Studio, IntelliJ, or Eclipse.
- Demonstrated capability working with middleware such as IBM MQ, Solace, Tomcat, Liberty server, WebSphere, WebLogic, or JBoss application servers.
- Familiarity with relational databases, including DB2 or Oracle.
- Experience with microservices and event-driven architecture.
- Experience with Apache Kafka (or Confluent Kafka) and Kafka APIs and tooling (e.g., Kafka Connect, KStreams, KSQL); a minimal consumer sketch follows this listing.
- Proficiency in all phases of the system development life cycle, including project planning, analysis, design, development, and testing.
- Solid focus on software testing with JUnit, Mockito, Jasmine, and Karma.
- Familiarity with different software development methodologies (Waterfall, Agile, Scrum, Kanban).

Additional Qualifications:
- Strong ability to gather and analyze requirements and translate them into technical specifications.
- Writing and implementing unit test scenarios/cases to ensure code quality and reliability.
- Deep understanding of all lifecycle components (code, test, deploy).
- Ability to present designs to peers and the Product Owner for approval.
- Fixing and debugging code to resolve technical issues.
- Optimizing application performance to ensure efficient use of software resources.
- Good verbal and written communication and interpersonal skills.
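The Kafka items above boil down to event-driven consumption with explicit offset management. A minimal, hedged sketch of such a consumer loop is shown below in Python for brevity (the role itself is Java-centric). The broker address, group id, topic, and the process() handler are hypothetical.

```python
from confluent_kafka import Consumer

def process(payload: bytes) -> None:
    # Hypothetical business handler; real code would parse and persist the event.
    print(payload)

consumer = Consumer({
    "bootstrap.servers": "broker-1:9092",   # placeholder broker
    "group.id": "settlement-service",       # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,            # commit only after successful processing
})
consumer.subscribe(["trade-events"])        # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consume error: {msg.error()}")  # real code would distinguish fatal errors
            continue
        process(msg.value())
        consumer.commit(msg)                # at-least-once: commit after processing
finally:
    consumer.close()
```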
Posted 2 months ago
10 - 12 years
32 - 37 Lacs
Bengaluru
Work from Office
Are you passionate about working with innovative technologies? Do you have experience in DevOps engineering with a strong focus on Azure DevOps and Terraform? We are looking for a talented DevOps Engineer to join our SAP Development & Integration CoE team at Novo Nordisk. If you are ready for the next step in your career and want to be part of a global healthcare company that is making a difference in the lives of millions, read on and apply today for a life-changing career. Apply Now!

The position: As a Senior IT Developer I at Novo Nordisk, you will develop and maintain integrations and APIs using Confluent Kafka as a message broker. You will have the opportunity to:
- Design, build, and maintain CI/CD pipelines using Azure DevOps.
- Develop and manage infrastructure as code (IaC) using Terraform.
- Implement software development best practices to ensure high-quality, scalable, and secure solutions.
- Design, develop, and maintain Kafka-based solutions.
- Work with cross-functional teams to understand business requirements and translate them into technical requirements.
- Develop and maintain documentation for Kafka-based solutions.
- Troubleshoot and debug issues in Kafka-based solutions.
- Optimize Kafka-based solutions for performance, scalability, and availability.
- Collaborate with global teams to define and deliver projects.

Qualifications: To be successful in this role, you should have:
- A master's or bachelor's degree in computer science, IT, or a related field, with a total of 10+ years of experience.
- 6+ years of relevant experience as a DevOps Engineer, with a strong focus on Azure DevOps and Terraform.
- A solid understanding of software development best practices.
- Basic knowledge of Confluent Kafka, AWS, and integrations in general.
- Knowledge of Kafka architecture, Kafka connectors, and Kafka APIs.
- Experience implementing Kafka-based solutions in a cloud environment (AWS, Azure, Google Cloud, etc.).
- The ability to troubleshoot and debug complex issues in Kafka-based solutions.
- Experience with platform technologies such as Confluent Cloud Kafka, AWS EKS, Kafka Connect, KSQL, and Schema Registry, as well as security.
- Experience with CI/CD tools such as Azure DevOps, Azure Pipelines, Terraform, and Helm charts.
- Experience using Agile, Scrum, and iterative development practices.
- Good communication skills and the ability to work with global teams to define and deliver on projects.
- A self-driven, fast-learning attitude with a high sense of ownership.
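For a sense of what a Kafka-based solution in a managed cloud environment involves on the client side, below is a hedged sketch of a producer configured for a SASL_SSL endpoint in the style of Confluent Cloud, with a delivery-report callback. The bootstrap server, API key/secret, and topic are placeholders; in a real pipeline these would come from secret storage, not source code.

```python
from confluent_kafka import Producer

# Placeholder Confluent Cloud-style connection settings (hypothetical values).
producer = Producer({
    "bootstrap.servers": "pkc-example.eu-west-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",      # injected from pipeline secrets in practice
    "sasl.password": "<api-secret>",
    "acks": "all",                     # wait for full in-sync replica acknowledgment
})

def on_delivery(err, msg):
    # Delivery report callback: runs on poll()/flush() once the broker responds.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

producer.produce("integration-events", value=b'{"ping": true}', callback=on_delivery)
producer.flush()
```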
Posted 2 months ago
5 - 9 years
25 - 35 Lacs
Pune, Bengaluru
Work from Office
Job Description: We are seeking a seasoned Full Stack Software Engineer with a strong background in backend engineering and proficiency in frontend development. The ideal candidate will have extensive experience in Java and Kotlin programming, a deep understanding of functional programming principles, and expertise in real-time data streaming with Apache Kafka. Proficiency in UI programming using either React or Angular is also essential.

Key Responsibilities:
- Kafka Expertise: Develop and maintain data streaming solutions using Apache Kafka. Ensure seamless integration of Kafka with other systems.
- Backend Development: Design, develop, and maintain robust and scalable backend systems using Kotlin and Java.
- Frontend Development: Develop and maintain user interfaces using React or Angular, collaborating with UI/UX designers to implement responsive and intuitive designs, ensuring their technical feasibility, and optimizing applications for speed and scalability.
- Java Development: Write clean, maintainable, and efficient Java code. Lead the development of key components and services.
- Collaboration: Work closely with product managers, software engineers, and other stakeholders to deliver high-quality software solutions.
- Performance Tuning: Identify and address performance bottlenecks in the system. Implement solutions to enhance system performance and scalability.
- Monitoring and Troubleshooting: Implement monitoring and logging solutions to ensure the health and performance of applications. Troubleshoot and resolve issues as they arise.
- Continuous Improvement: Stay up to date with the latest industry trends and technologies. Continuously seek opportunities to improve existing processes and solutions.

We are looking for candidates with a proven performance track record and the following:

Required Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience: Minimum of 6+ years of experience in application architecture and software development.

Technical Skills:
- Proficiency in Java: Strong understanding of Java SE and EE, including multithreading, concurrency, and design patterns.
- Frameworks: Experience with Spring, Spring Boot, Hibernate, and JPA.
- JavaScript/TypeScript: Proficiency in modern JavaScript (ES6+) and TypeScript.
- UI/UX Principles: Knowledge of responsive design, cross-browser compatibility, and web accessibility standards.
- Event-Driven Architecture and Kafka: In-depth knowledge of Apache Kafka, including setup, configuration, partitioning, replication, producers, consumers, and Kafka Connect. Experience with Kafka topic design, retention policies, and offset management. Ability to design and implement stream processing applications using the Kafka Streams DSL (Domain-Specific Language) and Processor API (a rough analogue of such a topology is sketched after this listing).
- Solid understanding of microservices architecture and RESTful API design.
- Experience with CI/CD pipelines and tools (e.g., Jenkins, GitLab CI).
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) is a plus.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work effectively in a team-oriented environment.
- Demonstrated ability to lead and mentor junior engineers.
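The Kafka Streams DSL and Processor API are Java/Kotlin libraries; as a rough, hedged analogue only, the Python sketch below shows the same consume-filter-map-produce shape using the plain confluent-kafka client rather than Kafka Streams itself. Topic names, the broker address, and the payload fields are hypothetical.

```python
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "broker-1:9092",  # placeholder broker
    "group.id": "order-enricher",          # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "broker-1:9092"})
consumer.subscribe(["orders-raw"])          # hypothetical input topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        order = json.loads(msg.value())
        # Roughly what a Streams DSL filter().mapValues() step would express:
        if order.get("quantity", 0) <= 0:
            continue
        order["gross"] = order["quantity"] * order.get("unit_price", 0.0)
        producer.produce("orders-enriched", json.dumps(order).encode())  # hypothetical output topic
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```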
Posted 3 months ago
6 - 8 years
8 - 12 Lacs
Chennai
Work from Office
- Around 5+ years of hands-on experience in Java-based application development, with integration into Kafka messaging systems.
- Mandatory: development experience implementing Spring, Spring Boot, and microservices for at least one year.
- Preferred: hands-on experience with Apache Kafka (producers, consumers, and stream processors).
- Familiarity with Kafka internals such as brokers, ZooKeeper, topics, and partitions.
- Familiarity with tools like Kafka Connect, Kafka Streams, and Schema Registry.
- Very strong hands-on experience with Java 8 features such as generics, exception handling, the Collections API, functional interfaces, multithreading, lambda expressions, the Stream API, etc.
- Mandatory knowledge of deploying microservices in an ECS environment (Kubernetes, Docker, Light speed, etc.).
- Knowledge of and experience with JUnit are a must.
- Experience writing Oracle PL/SQL queries.
- Good to have: Angular, CSS, banking domain, capital markets.
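One of the items above is Schema Registry familiarity; the hedged sketch below serializes a record against a hypothetical Avro value schema using the confluent-kafka Python Schema Registry client before producing it. The registry URL, broker, topic, and schema are placeholders, not details from this posting.

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Hypothetical Avro schema for a payment event.
schema_str = """
{
  "type": "record",
  "name": "Payment",
  "fields": [
    {"name": "payment_id", "type": "string"},
    {"name": "amount", "type": "double"}
  ]
}
"""

sr_client = SchemaRegistryClient({"url": "http://localhost:8081"})  # placeholder registry URL
serializer = AvroSerializer(sr_client, schema_str)

producer = Producer({"bootstrap.servers": "localhost:9092"})        # placeholder broker
payload = serializer(
    {"payment_id": "P-1001", "amount": 250.0},
    SerializationContext("payments", MessageField.VALUE),           # hypothetical topic
)
producer.produce("payments", value=payload)
producer.flush()
```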
Posted 3 months ago