56 Apache Kafka Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10 - 15 years

32 - 37 Lacs

Bengaluru

Work from Office

Naukri logo

The Role

Are you an experienced Software Architect looking for an exciting opportunity to shape the future of software development? Look no further than Kyndryl, where you'll have the chance to make a meaningful impact as you lead the design and development of cutting-edge software applications. As a Software Engineering Architect at Kyndryl, you will be responsible for leading the charge in creating technical blueprints, defining product/solution architecture, and developing the technical specifications for our projects. You'll be a key player in integrating the features of our software applications into a cohesive and functioning system.

Responsibilities

The chosen candidate will play the following roles and take on the responsibilities below. As Lead Integration Architect for project delivery: engage with clients to design applications based on business requirements, covering applications, data, cloud technology, and application authentication and authorization. Perform quality reviews of project deliverables. Lead work sessions and client discussions. Collaborate with enterprise architecture, information security, and application and infrastructure teams to produce optimal, high-level, conceptual, and cost-effective application designs using microservices patterns and best practices.

Who You Are

You're good at what you do and possess the required experience to prove it. Equally important, you have a growth mindset, keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.

Required Technical and Professional Experience

Overall 10+ years of relevant IT experience, with a minimum of 8 years of experience in the design, deployment, and maintenance of middleware solutions in application development and modernization programs.
Middleware: MQ, WebSphere, MuleSoft, IBM Integration Bus. Experience in deployment of applications to various integration components. Must have experience with a DevOps framework. Web and application servers and tooling: Tomcat, JBoss, Open Liberty, Apache Kafka, Nginx, EAP, Jetty, Oracle WebLogic, NodeJS, RabbitMQ, JIRA, Terraform, Ansible, Jenkins, Git, Nexus.

Preferred Technical and Professional Experience

Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience. Experience with one or more automation tools like Ansible, Puppet, Chef, or Terraform. Experience with containerization packaging, tooling, and DevOps methodologies. Experience in management and orchestration of distributed, federated edge K8s clusters using tools like Helm. Understanding of computer networking and protocols.

Posted 2 months ago

Apply

7 - 10 years

20 - 22 Lacs

Chennai, Pune, Noida

Work from Office

Experience in Java, Apache Kafka, Streams, clusters, application development, topic management, data pipeline development, producer and consumer implementation, integration and connectivity, cluster administration, security and compliance, and Apache ZooKeeper.

Required candidate profile: 7-10 years' experience covering Kafka expertise, programming skills, big data and streaming technologies, database knowledge, cloud and DevOps, event-driven architecture, security and scalability, and problem solving and teamwork.
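The producer and consumer work listed above centers on Kafka's partitioned-log model: producers append records to topic partitions, same-key records keep their order within one partition, and consumers read forward from an offset. A minimal in-memory sketch of that model (illustrative only, not the Kafka client API):

```python
# Conceptual sketch of Kafka's partitioned-log model (in-memory stand-in,
# not the real Kafka client API).

class Topic:
    """A topic is a set of append-only, ordered partitions."""
    def __init__(self, name, num_partitions=3):
        self.name = name
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Records with the same key land in the same partition, which is
        # what preserves per-key ordering in Kafka's default partitioner.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

    def consume(self, partition, offset):
        # A consumer tracks its own offset and reads from it onward.
        return self.partitions[partition][offset:]

orders = Topic("orders")
p, off = orders.produce("user-42", "created")
orders.produce("user-42", "paid")
backlog = orders.consume(p, off)
print(backlog)  # both events for user-42, in order, from one partition
```

The key design point this illustrates: ordering is guaranteed only within a partition, which is why keys that must stay ordered (here, a user ID) are hashed to the same one.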

Posted 2 months ago

Apply

8 - 13 years

15 - 30 Lacs

Chennai, Bengaluru

Hybrid

Extensive testing experience, especially in a data environment, with strong SQL skills and an understanding of databases. Test automation frameworks such as BDD/Cucumber, plus basic Java programming and scripting experience. Agile experience with good Jira/Confluence skills. Experience with cloud platforms such as GCP and associated technology such as BigQuery and Spanner. Experience with data integration tools such as Apache Kafka and IBM MQ. Experience with data visualization tools such as Looker or Tableau. Performance-test experience using tools such as LoadRunner and JMeter. Test environment management.

Posted 2 months ago

Apply

10 - 20 years

45 - 65 Lacs

Bengaluru

Work from Office

Lead messaging and scheduling systems, manage real-time data processing and predictive analytics, work with time-series and streaming data platforms (InfluxDB, Kafka, OSIsoft), design event-driven architectures (Flink, Spark, Azure Event Hubs), and oversee IT teams.

Posted 2 months ago

Apply

7 - 10 years

27 - 32 Lacs

Pune

Work from Office

Job Title: Big Data Developer. Job Location: Pune. Experience: 7+ years. Job Type: Hybrid.

Strong skills in messaging technologies like Apache Kafka or equivalent; programming skills in Scala, Spark (with optimization techniques), and Python. Should be able to write queries through Jupyter Notebook. Orchestration tools like NiFi and Airflow. Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics. Experience with SQL and distributed systems. Strong understanding of cloud architecture. Ensure a high-quality code base by writing and reviewing performant, well-tested code. Demonstrated experience building complex products. Knowledge of Splunk or other alerting and monitoring solutions. Fluent in the use of Git and Jenkins. A broad understanding of software engineering concepts and methodologies is required.

Posted 2 months ago

Apply

6 - 9 years

6 - 16 Lacs

Chennai, Pune, Bengaluru

Work from Office

Dear candidate,

We are currently looking for skilled professionals for an exciting opportunity at Deloitte India in the domain of Java backend / Java full stack. Based on your background and expertise, we believe you could be a great fit for this position. (Immediate joiners preferred.) If you're interested in this opportunity, please share your updated resume to sanvenkatesan@deloitte.com along with the following details at your earliest convenience: Name, Email ID, Phone No., Company, Skill, Total Experience, Relevant Experience, Current Location, Preferred Locations (Pune / Chennai / Bangalore), Current CTC, Expected CTC, Notice Period (LWD date and month).

Job Description: The ideal candidate will have strong expertise in backend development using Java and Spring Boot, along with frontend development using React.js, Angular, etc.

Experience: 6 to 9 years. Strong system design experience with data structures and algorithms. Strong working experience in Java programming, including Java 8 and multithreading features; Java 15 knowledge is a plus. Strong experience and knowledge in Spring / Spring Boot (creating endpoints, integrations, CRUD operations, etc.). Strong experience and understanding of OOP concepts. Good exposure to TDD and BDD. Good working experience in SQL and NoSQL databases (use cases, querying, joins, triggers, etc.). Low-level design, API design, and database table design experience is required. General awareness of architecture and design patterns. Experience in Docker, Kubernetes, cloud platforms, and DevOps is an added advantage. Good experience in software lifecycles (waterfall/agile/others) and processes. Strong analytical and problem-solving skills. Good communication skills.

Key Responsibilities: Design, develop, and maintain RESTful APIs using Java and Spring Boot. Build dynamic and responsive UI components using React.js etc. Collaborate with cross-functional teams to define, design, and ship new features. Optimize applications for performance, scalability, and security. Write clean, maintainable, and efficient code following best practices. Work with databases (SQL/NoSQL) and ensure efficient data management. Implement authentication and authorization mechanisms (OAuth, JWT, etc.). Debug and troubleshoot issues across the full stack. Participate in code reviews, testing, and deployments.

Required Skills: Backend: Java, Spring Boot, Microservices, Hibernate, REST APIs. Frontend: React.js, JavaScript, TypeScript, Redux, HTML, CSS. Databases: MySQL, PostgreSQL, MongoDB (any one or more). DevOps & Tools: Docker, Kubernetes, CI/CD pipelines (Jenkins, GitHub Actions). Cloud: AWS, Azure, or GCP (basic understanding preferred). Version Control: Git, GitHub/GitLab. Frameworks: Apache Kafka, Apache Camel, Spring Integration. Testing Frameworks: JUnit, Jest, Cypress (optional).

Looking forward to hearing from you.

Best regards,
Santha Kumar V

Posted 2 months ago

Apply

6 - 7 years

11 - 14 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office

Location: Remote / Pan India (Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune). Notice Period: Immediate.

iSource Services is hiring for one of its clients for the position of Java Kafka Developer. We are seeking a highly skilled and motivated Confluent Certified Developer for Apache Kafka to join our growing team. The ideal candidate will possess a deep understanding of Kafka architecture, development best practices, and the Confluent platform. You will be responsible for designing, developing, and maintaining scalable and reliable Kafka-based data pipelines and applications. Your expertise will be crucial in ensuring the efficient and robust flow of data across our organization.

Responsibilities: Develop Kafka producers, consumers, and stream processing applications. Implement Kafka Connect connectors and configure Kafka clusters. Optimize Kafka performance and troubleshoot related issues. Utilize Confluent tools like Schema Registry, Control Center, and ksqlDB. Collaborate with cross-functional teams and ensure compliance with data policies.

Qualifications: Bachelor's degree in Computer Science or a related field. Confluent Certified Developer for Apache Kafka certification. Strong programming skills in Java/Python. In-depth Kafka architecture and Confluent platform experience. Experience with cloud platforms and containerization (Docker, Kubernetes) is a plus. Experience with data warehousing and data lake technologies. Experience with CI/CD pipelines and DevOps practices. Experience with Infrastructure as Code tools such as Terraform or CloudFormation.
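Much of the consumer development mentioned above is about delivery semantics: process the polled batch first, then commit the offset, so a crash re-delivers records rather than losing them (at-least-once). A stripped-down, in-memory sketch of that loop (the real Kafka consumer exposes poll() and commit calls; the names here are illustrative):

```python
# Sketch of at-least-once consumption with manual offset commits
# (in-memory stand-in, not the real Kafka consumer API).

log = ["evt-1", "evt-2", "evt-3"]   # one partition of a topic
committed = 0                       # last committed offset for this group
processed = []

def poll(offset, max_records=2):
    """Return the next batch of records starting at `offset`."""
    return log[offset:offset + max_records]

while committed < len(log):
    batch = poll(committed)
    for record in batch:
        processed.append(record)    # process first...
    committed += len(batch)         # ...then commit; a crash before the
                                    # commit re-delivers the whole batch
print(processed, committed)
```

Committing before processing would flip this to at-most-once; exactly-once needs transactional producers on top, which is one reason the posting calls out data consistency troubleshooting.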

Posted 2 months ago

Apply

7 - 11 years

8 - 10 Lacs

Bhubaneshwar, Kolkata

Work from Office

Data Science professional with a proven track record in training Engineering, IT, Diploma, Polytechnic, and technical candidates. With over 7 years of experience in Artificial Intelligence, Machine Learning, Big Data, and Cloud Computing, specialises in delivering industry-oriented, hands-on training that equips candidates with the technical proficiency required in today's data-driven world. Experienced in training B.Sc./M.Sc. graduates in Mathematics, Statistics, or Computer Science.

Keywords: Statistical Modelling, Machine Learning, Deep Learning, Hadoop, Spark, Apache Kafka, Python, Scala.
Mandatory Key Skills: Data Science Techniques, Big Data Technologies, Python, R, Scala, Data Engineering, Cloud Platforms & DevOps Integration.
Work Experience Required: 7 years.

Posted 2 months ago

Apply

8 - 10 years

15 - 25 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Scope of services: Incident Management for Confluent Platform; Change Management for Confluent Platform; connectivity between producers and consumers; Confluent deployment services for the overall cluster; managing and executing fail-forward scenarios; Confluent Platform monitoring; connectivity and networking between SaaS and PaaS; capacity management; Confluent availability management; Confluent disaster recovery and execution; Confluent cluster management; RBAC (Role-Based Access Control); ACLs (Access Control Lists); OAuth - SIMAAS integration; org cloud API key creation (create client and secret); org cloud API key management; Azure Key Vault; environment management; cluster creation; cluster management; Confluent-managed connectors; pool IDs; cluster-level API keys; L1 incident management; Incident Management for Confluent app topics; Change Management for Confluent topics; Confluent deployment services for applications; application monitoring; Cluster Linking; KSQL; self-managed connectors; Schema Registry.
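The RBAC and ACL duties in the list above reduce to evaluating whether a (principal, operation, resource) triple is permitted. A toy sketch of that check (illustrative only; real Kafka ACLs are stored broker-side and also support prefixed resources, wildcards, and DENY-over-ALLOW precedence):

```python
# Toy sketch of Kafka-style ACL evaluation: an ACL binds a principal to an
# operation on a resource. Names and bindings here are made up for
# illustration, not taken from any real cluster.

ACLS = {
    ("User:app-a", "READ",  "topic:orders"),
    ("User:app-a", "WRITE", "topic:orders"),
    ("User:app-b", "READ",  "topic:orders"),
}

def is_authorized(principal, operation, resource):
    """Allow only if an exact ACL binding exists (default-deny)."""
    return (principal, operation, resource) in ACLS

print(is_authorized("User:app-b", "WRITE", "topic:orders"))  # False
```

The default-deny stance shown here mirrors how secured clusters behave once an authorizer is enabled: anything not explicitly granted is refused.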

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru, Hyderabad, Gurgaon

Work from Office

Minimum 5+ years of experience in Java backend development. Strong understanding of data structures and algorithms. Sound understanding of object-oriented programming and excellent software design skills. Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs). Strong understanding of database design and SQL (MySQL/MariaDB) development. Should have experience in Apache Kafka, RabbitMQ, or other queueing systems. Drive discussions to create/improve the product, process, and technology. Provide end-to-end solution and design details. Lead development of formalized solution methodologies. Passion to work in a startup-like environment.

Posted 3 months ago

Apply

5 - 7 years

11 - 20 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Job Title: Kafka Developer. Location: Chennai/Hyderabad/Bangalore. Job Type: Full-Time. Experience: 5-9 years.

Introduction: We are seeking an experienced Kafka Developer with Java to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining real-time data streaming systems using Apache Kafka. You will work closely with cross-functional teams to build scalable, high-performance data pipelines and enable the efficient flow of data across various applications.

Responsibilities: Design, implement, and maintain scalable, high-performance data streaming systems using Apache Kafka. Build and deploy Kafka topics, producers, and consumers for real-time data processing. Collaborate with backend engineers, data engineers, and other team members to integrate Kafka into various systems and platforms. Optimize Kafka clusters for performance, scalability, and high availability. Develop Kafka Streams applications for real-time data processing and transformation. Troubleshoot and resolve Kafka-related issues, including cluster performance, message processing, and data consistency problems. Implement security best practices within the Kafka ecosystem, including access control, encryption, and authentication. Monitor Kafka clusters and pipelines to ensure uptime and performance metrics are met. Ensure proper data governance and compliance measures are implemented across the Kafka pipeline. Develop and maintain documentation, including setup guides, technical specifications, and architecture diagrams. Stay up to date with the latest Kafka features, improvements, and industry best practices.

Requirements: Proven experience as a Kafka Developer, Data Engineer, or similar role with hands-on expertise in Apache Kafka. Strong knowledge of Kafka's core concepts: topics, partitions, producers, consumers, brokers, and Kafka Streams. Experience with Kafka ecosystem tools like Kafka Connect, Kafka Streams, and KSQL. Expertise in Java for developing Kafka-based solutions. Experience in deploying and managing Kafka clusters in cloud environments (AWS, Azure, GCP). Strong understanding of distributed systems, message brokers, and data streaming architectures. Familiarity with stream processing and real-time data analytics. Experience in building, optimizing, and monitoring Kafka-based systems. Knowledge of containerization technologies (e.g., Docker, Kubernetes) for managing Kafka deployments. Excellent problem-solving skills and the ability to troubleshoot complex Kafka-related issues. Strong communication and collaboration skills for working in a team environment.

Preferred Qualifications: Experience with other messaging systems like Apache Pulsar or RabbitMQ. Familiarity with data storage technologies like HDFS, NoSQL, or relational databases. Experience in DevOps practices and CI/CD pipelines. Knowledge of cloud-native architectures and microservices.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).

Why Join Us: Photon Interactive Systems offers a dynamic and inclusive work environment with opportunities for personal and professional growth. Competitive salary and benefits package. Work with the latest technologies in the field of data streaming and big data analytics.
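The Kafka Streams work described above (stateful, real-time transformation of a record stream) can be pictured with a toy word count, the canonical Streams example. This is a plain-Python stand-in for illustration only; the real API is Kafka Streams' Java DSL (stream(), groupBy(), count()) backed by fault-tolerant state stores:

```python
# Sketch of a stateful stream transformation in the spirit of the
# Kafka Streams word-count example (plain Python; not the Streams API).

from collections import Counter

def word_count(stream):
    counts = Counter()                  # stands in for a state store
    for line in stream:                 # each record as it arrives
        counts.update(line.lower().split())
        yield dict(counts)              # emit the updated table downstream

updates = list(word_count(["hello kafka", "hello streams"]))
print(updates[-1])  # {'hello': 2, 'kafka': 1, 'streams': 1}
```

The point the sketch makes: a stream processor keeps running state and emits an update per input record, rather than computing one final answer over a finished dataset.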

Posted 3 months ago

Apply

10 - 18 years

30 - 45 Lacs

Chennai

Work from Office

Details on tech stack: GCP Services: BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, Cloud Storage. Data Processing: Apache Beam (batch/stream), Apache Kafka, Cloud Dataprep. Programming: Python, Java/Scala, SQL. Orchestration: Apache Airflow (Cloud Composer), Terraform. Security: IAM, Cloud Identity, Cloud Security Command Center. Containerization: Docker, Kubernetes (GKE). Machine Learning: Google AI Platform, TensorFlow, AutoML. Certifications: Google Cloud Data Engineer, Cloud Architect (preferred).

Proven ability to design scalable and robust AI/ML systems in production, with a focus on high-performance and cost-effective solutions. Strong experience with cloud platforms (Google Cloud, AWS, Azure) and cloud-native AI/ML services (e.g., Vertex AI, SageMaker). Expertise in implementing MLOps practices, including model deployment, monitoring, retraining, and version control. Strong leadership skills with the ability to guide teams, mentor engineers, and collaborate with cross-functional teams to meet business objectives. Deep understanding of frameworks like TensorFlow, PyTorch, and Scikit-learn for designing, training, and deploying models. Experience with data engineering principles, scalable pipelines, and distributed systems (e.g., Apache Kafka, Spark, Kubernetes).

Nice-to-have qualities: Strong leadership and mentorship capabilities, guiding teams toward best practices and high-quality deliverables. Excellent problem-solving skills, with a focus on designing efficient, high-performance systems. Effective project management abilities to handle multiple initiatives and ensure timely delivery. Strong emphasis on collaboration and teamwork, fostering a positive and productive work environment.

Posted 3 months ago

Apply

3 - 8 years

25 - 27 Lacs

Chennai, Bengaluru

Work from Office

Top-notch programming skills, with an interest in functional programming languages. - Solid coding skills in Java/J2EE technologies with Spring Boot experience - Solid understanding of developing and supporting large-scale, cloud-based distributed systems - Experience with REST-based API development - Relational DB knowledge, e.g., Postgres - SOA architecture experience is needed to develop scalable APIs - Exposure to event-based and asynchronous processing - Strong operational excellence and testing skills - Someone who values automation; we don't like solving the same problem manually over and over - Experience working with distributed databases (Cosmos/Cassandra/etc.) - Fluent in a few programming languages: functional, dynamic, and static - Solid understanding of how to use data structures to solve problems optimally - Good understanding of event-driven systems - Experience with Apache Kafka or a similar streaming platform - Understands DevOps: how to support large-scale distributed systems, how to prioritize and escalate issues. The role is to modernize the existing Java application to a modern tech stack with a BOM upgrade; while doing that, the team plans a rewrite for better optimization and performance. Further duties: supporting content management systems for the e-commerce site; informing the blueprint and template of the e-commerce site; helping provide appropriate content for display.

Posted 3 months ago

Apply

4 - 8 years

10 - 20 Lacs

Hyderabad

Hybrid

Preferred candidate profile. Mandatory: 3+ years' hands-on working experience with Apache NiFi and Apache Kafka. Good to have: Google Cloud BigQuery scripting skills or Snowflake; AWS or Cloud SQL knowledge. Working experience with Scala/Java is an added advantage. Mandatory: SQL and PL/SQL scripting experience. Mandatory: any one of Python/Linux/Unix skills.

Posted 3 months ago

Apply

5 - 10 years

25 - 35 Lacs

Bengaluru

Hybrid

We're always looking for talented and creative engineers to join our team. The Event & Streaming Group offers a relaxed but fast-paced environment where creative, collaborative, talented people are rewarded. We are very active and passionate about catching up with and introducing cutting-edge technology from OSS (open-source software). Our solutions for data engineering and event management are used by various services across Rakuten, Inc. and continue to grow, following the needs of systems for a data-driven strategy. Users' requirements and needs change continuously, and our solutions also evolve fast to catch up with and support those needs.

Role: We are in search of a talented engineer to work with members in India and Japan. In the Event & Streaming Group, which collects and engineers tremendous amounts of data using data engineering solutions, you will play a core role in the administration, monitoring, and problem resolution of the current data engineering platform, as well as R&D on cutting-edge data engineering technology.

Responsibilities: Administration and maintenance of the data pipeline system that transfers and wrangles terabytes of data from various services using ELK, Apache Kafka, and Apache NiFi. Collaboration with the SRE team in Japan and India. Implement automated operation systems. L1/L2 incident response.

Requirements: Excellent hands-on experience with Linux (more than 3 years). Must have experience administrating and maintaining one of the following in production: an Apache Kafka, ELK, or NiFi cluster (more than 1 year); Apache Pulsar or Confluent Kafka (more than 1 year). Hands-on experience with Apache Pulsar or Confluent Kafka on K8s (more than 1 year). Hands-on experience with a deployment system like Chef, Ansible, etc. Hands-on experience with a metrics collection system like Prometheus, Graphite, etc. Experience in one of the programming languages Java (or Scala), Python, or shell script (more than 1 year). Must have experience administrating and maintaining a client-server backend system in production (more than 1 year). Must be self-organized and gritty about continuous improvement of the platform. Must be a self-starter and a good collaborator with good communication skills.

Preferred Knowledge, Skills and Abilities: Hands-on experience with HDP (HDFS, Hive/Hive LLAP, MapReduce, Spark on YARN) or CDP. Hands-on experience or strong knowledge of Docker and Kubernetes. Hands-on experience or strong knowledge of GCP, AWS, or Azure. Fluent or business level of Japanese.

Looking for immediate joiners / candidates who can join within 30 days.

Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran status, color, religion, disability, sexual orientation, and beliefs.

Posted 3 months ago

Apply

4 - 7 years

6 - 9 Lacs

Bengaluru

Work from Office

Job Responsibilities: Assist in the design and implementation of Snowflake-based analytics solutions (data lake and data warehouse) on AWS. Requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, as well as the design of data integration and publication pipelines. Develop Snowflake deployment and usage best practices. Help educate the rest of the team on the capabilities and limitations of Snowflake. Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines. Design, build, test, and maintain data management systems. Work in sync with internal and external team members like data architects, data scientists, and data analysts to handle all sorts of technical issues. Act as a technical leader within the team. Work in an Agile/Lean model. Deliver quality deliverables on time. Translate complex functional requirements into technical solutions.

Expertise and Qualifications

Essential skills, education and experience: Should have a B.E./B.Tech./MCA or equivalent degree along with 4-7 years of experience in data engineering. Strong experience in DBT concepts like model building and configuration, incremental load strategies, macros, and DBT tests. Strong experience in SQL. Strong experience in AWS. Creation and maintenance of an optimum data pipeline architecture for ingestion and processing of data. Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake. Experience in data storage technologies like Amazon S3, SQL, and NoSQL. Data modeling technical awareness. Experience in working with stakeholders in different time zones.

Good to have: AWS data services development experience. Working knowledge of big data technologies. Experience collaborating with data quality and data governance teams. Exposure to reporting tools like Tableau. Apache Airflow, Apache Kafka (nice to have). In-depth understanding of the payments domain (CRM, accounting, etc.). Regulatory reporting exposure.

Other skills: Good communication skills. Team player. Problem solver. Willing to learn new technologies, share your ideas, and assist other team members as needed. Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.

Posted 3 months ago

Apply

10 - 15 years

35 - 40 Lacs

Hyderabad

Work from Office

Responsibilities

1. Integration Strategy & Architecture: Define the enterprise integration strategy, aligning with business goals and IT roadmaps. Design scalable, resilient, and secure integration architectures using industry best practices. Develop API-first and event-driven integration strategies. Establish governance frameworks, integration patterns, and best practices.

2. Technology Selection & Implementation: Evaluate and recommend the right integration technologies, such as: Middleware & ESB: TIBCO, MuleSoft, WSO2, IBM Integration Bus. Event Streaming & Messaging: Apache Kafka, RabbitMQ, IBM MQ. API Management: Apigee, Kong, AWS API Gateway, MuleSoft. ETL & Data Integration: Informatica, Talend, Apache NiFi. iPaaS (Cloud Integration): Dell Boomi, Azure Logic Apps, Workato. Lead the implementation and configuration of these platforms.

3. API & Microservices Architecture: Design and oversee API-led integration strategies. Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations. Define API security standards (OAuth, JWT, OpenID Connect, API Gateway). Establish API versioning, governance, and lifecycle management.

4. Enterprise Messaging & Event-Driven Architecture (EDA): Design real-time, event-driven architectures using: Apache Kafka for streaming and pub/sub messaging; RabbitMQ, IBM MQ, TIBCO EMS for message queuing; event-driven microservices using Kafka Streams, Flink, or Spark Streaming. Ensure event sourcing, CQRS, and eventual consistency in distributed systems.

5. Cloud & Hybrid Integration: Develop hybrid integration strategies across on-premises, cloud, and SaaS applications. Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, and Google Cloud Pub/Sub. Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, and Workday.

6. Security & Compliance: Ensure secure integration practices, including encryption, authentication, and authorization. Implement zero-trust security models for APIs and data flows. Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).

7. Governance, Monitoring & Optimization: Establish enterprise integration governance frameworks. Use observability tools for real-time monitoring (Datadog, Splunk, New Relic). Optimize integration performance and troubleshoot bottlenecks.

8. Leadership & Collaboration: Collaborate with business and IT stakeholders to understand integration requirements. Work with DevOps and cloud teams to ensure CI/CD pipelines for integration. Provide technical guidance to developers, architects, and integration engineers.

Qualifications

Technical skills: Candidates should have 10+ years of experience. Expertise in integration platforms: Informatica, TIBCO, MuleSoft, WSO2, Dell Boomi. Strong understanding of API management and microservices. Experience with enterprise messaging and streaming (Kafka, RabbitMQ, IBM MQ, Azure Event Hubs). Knowledge of ETL and data pipelines (Informatica, Talend, Apache NiFi, AWS Glue). Experience in cloud and hybrid integration (AWS, Azure, GCP, OCI). Hands-on with security and compliance (OAuth2, JWT, SAML, API security, zero trust).

Soft skills: Strategic thinking and architecture design. Problem-solving and troubleshooting. Collaboration and stakeholder management. Agility in digital transformation and cloud migration.
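API security via JWT appears twice in the posting above (API security standards and the OAuth2/JWT skills list). As a rough illustration of what an HS256-signed JWT involves under the hood, here is a stdlib-only sketch; production code would use a maintained library such as PyJWT or an API gateway's built-in validation, and would also check claims like expiry:

```python
# Minimal HS256 JWT sign/verify sketch (stdlib only, illustration only:
# no expiry/claims validation, which real deployments require).
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{body}".encode(),
                        hashlib.sha256).digest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "svc-a", "scope": "read"}, b"shared-secret")
print(verify_jwt(token, b"shared-secret"))  # True
```

Because the signature covers header and payload, any tampering with either segment, or verification with the wrong secret, fails the check.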

Posted 3 months ago

Apply

7 - 12 years

10 - 20 Lacs

Bengaluru

Hybrid

Developing Modern Data Warehouse solutions using Databricks and AWS/Azure/GCP. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop data models to fulfil them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines via Airflow.

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 6+ years of total IT experience
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Experience with the AWS/Azure stack
- Desirable: ETL with batch and streaming (Kinesis)
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
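The Star-schema modelling this posting asks for boils down to narrow fact tables joined to descriptive dimension tables at query time. A minimal, self-contained sketch using Python's built-in sqlite3 (the table names and figures are invented for illustration; a Databricks warehouse would express the same shape in Delta tables and Spark SQL):

```python
import sqlite3

# In-memory warehouse: one fact table keyed to one dimension (a star schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                              name TEXT, category TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY,
                              product_key INTEGER, qty INTEGER, amount REAL);
""")
conn.executemany(
    "INSERT INTO dim_product VALUES (?, ?, ?)",
    [(1, "Widget", "Hardware"), (2, "Gizmo", "Hardware"), (3, "SaaS Plan", "Software")],
)
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
    [(101, 1, 2, 20.0), (102, 2, 1, 15.0), (103, 3, 5, 250.0), (104, 1, 1, 10.0)],
)

# A typical reporting query: roll the fact table up by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('Hardware', 45.0), ('Software', 250.0)]
```

Keeping facts numeric and additive while pushing all descriptive attributes into dimensions is what makes this rollup (and the reporting layer built on it) simple and fast.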

Posted 3 months ago


10 - 12 years

15 - 20 Lacs

Bengaluru

Work from Office


Collaborate with stakeholders to align information requirements, develop architecture artifacts, and drive innovative IA solutions. Ensure compliance with technology & audit standards, manage production changes, and maintain LMS expertise (e.g., integration, calculations, interfaces). Required Candidate profile: Experience with Java 9+. 10-15 years in software development, 8+ years on mission-critical systems, 5+ years in technical leadership & SOA. Expert in architecture, agile, EA, databases, web technologies, and EAI.

Posted 3 months ago


4 - 8 years

10 - 16 Lacs

Bengaluru

Work from Office


Ensure quality software delivery & enhancements. Manage the full software lifecycle. Provide Tier 3+ support & implement safe changes. Specialize in an LMS functional area (e.g., Integrations, Calculations, Equipment Interfaces). Required Candidate profile: CS degree & 4-8 years' experience in agile, coding, testing & ops. Experience in Java 9+, JavaScript, Ruby, REST APIs, HTML, JSON, SOLID. Skilled in TDD/BDD, Agile, Docker/K8s, SQL/NoSQL, Kafka, and RabbitMQ.

Posted 3 months ago


8 - 10 years

20 - 30 Lacs

Chennai

Work from Office


P2 C3 STS. Primary skills: Java, Spring Boot, SQL, Camunda, Splunk, Design. 6+ years of hands-on experience designing and implementing web applications on the Java platform (Java 8). Working experience in Spring Boot, Spring Data JPA, Spring MVC, Spring Security, Spring Cloud Config. Experience designing and building service-oriented or microservice architectures. Understanding and implementation of RESTful API principles. Experience with messaging queues using Apache Kafka. Good experience with Oracle Database. Testing framework experience in JUnit/Mockito. Solid experience with continuous integration and continuous delivery tools such as Git, GitLab, Docker, App Engine. Agile methodologies (Scrum) and the software development life cycle. Tools: Swagger, Postman, Insomnia, Splunk, ServiceNow, Jira.
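One practical consequence of the Kafka messaging experience asked for here: Kafka gives at-least-once delivery, so a consumer may see the same message twice and must process idempotently. A plain-Python stand-in for that consumer pattern (no broker involved; the event shape and names are invented for illustration, and a real service would use a Kafka client such as Spring Kafka or confluent-kafka):

```python
# At-least-once delivery means a consumer can receive a message more than
# once; tracking processed event IDs makes the handler idempotent.
processed_ids = set()
balances = {"acct-1": 0}


def handle(message: dict) -> bool:
    """Apply a payment event exactly once; return False for duplicates."""
    if message["event_id"] in processed_ids:
        return False  # duplicate redelivery, safely ignored
    balances[message["account"]] += message["amount"]
    processed_ids.add(message["event_id"])
    return True


# Simulated redelivery: event e-1 arrives twice.
stream = [
    {"event_id": "e-1", "account": "acct-1", "amount": 100},
    {"event_id": "e-1", "account": "acct-1", "amount": 100},
    {"event_id": "e-2", "account": "acct-1", "amount": 50},
]
results = [handle(m) for m in stream]
```

In production the dedupe set would live in a durable store (or be replaced by an idempotent upsert keyed on the event ID) so it survives consumer restarts.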

Posted 3 months ago


6 - 10 years

8 - 18 Lacs

Kolkata

Hybrid


Lead a team of Blockchain Developers; write & review high-quality code. Design & build blockchain frameworks, accelerators & assets. Design & deploy smart contracts on Ethereum & Layer 2 sidechains. Collaborate on decentralized finance projects & TDD. Required Candidate profile: B.Tech/MCA, 6+ years' experience in Solidity smart contract programming. Hands-on with blockchain APIs, Ethereum standards (ERC-20, ERC-721) & DeFi projects, Docker, Kubernetes, Node.js, open-source tools, React/Angular.

Posted 3 months ago


7 - 9 years

11 - 12 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office


Location: Remote / Pan India - Hyderabad, Ahmedabad, Pune, Chennai, Kolkata. Notice Period: Immediate. iSource Services is hiring for one of their clients for the position of RoR Engineer.

About the Role - An RoR Engineer is responsible for maintaining all the applications: the primary back-end application API, the order admin tool, the eCommerce application based on Solidus, and various supporting services used by our fulfilment partners and by the web and mobile customer-facing applications.

Roles & Responsibilities:
- Primary technology: Ruby on Rails
- Monitoring the #escalated-support and #consumer-eng Slack channels and addressing any issues that require technical assistance
- Monitoring logs (via Rollbar/Datadog) and resolving any errors
- Monitoring Sidekiq's job morgue and addressing any dead jobs
- Maintaining libraries in all applications with security updates
- Understanding security requirements and scope
- Must have knowledge of databases such as MySQL, PostgreSQL, SQLite
- Good knowledge of deploying applications to servers
- 7 years in RoR and 3 years in AngularJS
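The "job morgue" mentioned above is Sidekiq's dead set: jobs that exhausted their retries are parked there for manual inspection rather than lost. A minimal Python analog of that retry-then-morgue behaviour (all names here are illustrative, not Sidekiq's actual API, which is Ruby):

```python
# A job that keeps failing is retried up to max_retries times, then parked
# in a dead-job list (the "morgue") for an operator to inspect or re-run.
def run_with_retries(jobs, max_retries=3):
    done, dead = [], []
    for name, fn in jobs:
        for attempt in range(1, max_retries + 1):
            try:
                done.append((name, fn()))
                break  # success: stop retrying this job
            except Exception as exc:
                if attempt == max_retries:
                    dead.append((name, str(exc)))  # exhausted -> morgue
    return done, dead


done, dead = run_with_retries([
    ("ok_job", lambda: "result"),
    ("bad_job", lambda: 1 / 0),  # always fails, ends up in the morgue
])
```

Sidekiq additionally spaces real retries out with an exponential backoff schedule, which this sketch omits for brevity.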

Posted 3 months ago


5 - 10 years

30 - 32 Lacs

Chennai

Work from Office


Roles and Responsibilities
- Implement the design and architecture of complex web applications using the Angular framework, ensuring adherence to best practices and architectural principles.
- Collaborate closely with product managers, UX/UI designers, and development teams to translate business requirements into technical specifications and architectural designs.
- Define and implement scalable and maintainable front-end architecture, including component-based architecture, state management, and data flow patterns.
- Provide technical guidance and mentorship to development teams, promoting code quality, performance optimization, and maintainability.
- Conduct code reviews and architectural reviews to ensure compliance with established standards and design guidelines.
- Evaluate and recommend tools, libraries, and frameworks to enhance productivity and efficiency in Angular development.
- Stay current with industry trends and emerging technologies in front-end development, and incorporate them into the architectural roadmap.
- Drive continuous improvement initiatives to streamline development processes, increase development velocity, and elevate overall product quality.

Preferred Skills
- Knowledge of continuous integration
- Excellent teamwork and communication abilities
- Excellent organizational and time management abilities
- Effective Scrum Master experience
- Good to have: knowledge of API design using SwaggerHub
- Good to have: knowledge of SignalR for real-time web functionality and data broadcasting

Requirements
- Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 8+ years' experience.
- Proven experience as a Software Architect, Solution Architect, or Senior Full Stack Developer in web application development.
- Hands-on experience in C#, ASP.NET development.
- Expert-level proficiency in the Angular framework and its ecosystem (Angular CLI, RxJS, Angular Material and related technologies).
- Expert-level proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices.
- Architect-level cloud certification is recommended.
- Deep knowledge of front-end technologies such as HTML5, CSS3, JavaScript/TypeScript, and RESTful APIs.
- Experience with state management libraries (e.g., NgRx, Redux) and reactive programming concepts.
- Strong understanding of software design principles, design patterns, and architectural styles, with a focus on building scalable and maintainable front-end architectures.
- Excellent communication and collaboration skills, with the ability to convey technical concepts to non-technical stakeholders.
- Experience working in Agile/Scrum development environments; familiarity with DevOps practices is a plus.
- Experience working in multiple cloud environments: Azure, AWS, and GCP.
- Experience developing and consuming gRPC web services.
- Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML, and microservices using serverless cloud technologies.
- Design, implementation and integration of data storage solutions: databases, key-value stores, blob stores.
- User authentication and authorization across multiple systems, servers, and environments.
- Management of the hosting environment and deployment of update packages.
- Excellent analytical and problem-solving abilities.
- Strong understanding of object-oriented programming.
- Strong unit testing and debugging skills.
- Proficient understanding of code versioning tools such as Git and SVN.
- Hands-on experience with PostgreSQL.
- Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes, and Docker is a plus.
- Good understanding of Agile-based software development and the software delivery process.
- Experience with requirements management tools such as Polarion (preferred) or any other requirements management system.
- Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.
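The NgRx/Redux state management named in the requirements follows one pattern regardless of framework: a single store whose state changes only when a pure reducer is applied to a dispatched action, with subscribers notified of each new state. A language-agnostic sketch of that shape (Python is used here only to keep the example self-contained; a real Angular app would express this in TypeScript with NgRx):

```python
def reducer(state: dict, action: dict) -> dict:
    """Pure function: (old state, action) -> new state, never mutated in place."""
    if action["type"] == "ADD_ITEM":
        return {**state, "items": state["items"] + [action["payload"]]}
    if action["type"] == "CLEAR":
        return {**state, "items": []}
    return state  # unknown actions leave state untouched


class Store:
    """Single source of truth; all changes flow through dispatch()."""

    def __init__(self, reducer, initial_state):
        self._reducer, self.state = reducer, initial_state
        self._listeners = []

    def subscribe(self, fn):
        self._listeners.append(fn)

    def dispatch(self, action):
        self.state = self._reducer(self.state, action)
        for fn in self._listeners:
            fn(self.state)


store = Store(reducer, {"items": []})
seen = []
store.subscribe(lambda s: seen.append(len(s["items"])))
store.dispatch({"type": "ADD_ITEM", "payload": "a"})
store.dispatch({"type": "ADD_ITEM", "payload": "b"})
```

Because the reducer is pure and state transitions are explicit, this architecture gives the predictable data flow and easy time-travel debugging that make large Angular front ends maintainable.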

Posted 3 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies