
78 Confluent Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Punjab

On-site

As a Senior Lead Developer in Sydney, you will be responsible for:
- Demonstrating extensive experience with the development and administration of Kafka (Confluent) in production.
- Showing a strong understanding of how to deploy and manage Kafka efficiently.
- Understanding how to provision a secure, highly available cluster architecture.
- Operating experience with service mesh technology such as Istio/Consul.
- Being proficient in modern cloud platforms, especially AWS.
- Engaging in large-scale development of digital applications.
- Delivering cloud solutions end to end and being hands-on throughout the process.
- Implementing CI/CD pipelines efficiently.
- Utilizing configuration as code effectively.
- Having experience working in an Agile environment.
- Demonstrating outstanding communication skills.

Kindly note that no additional details of the company were provided in the job description.
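Provisioning a highly available cluster of the kind mentioned above usually comes down to topics with enough replicas and a sensible in-sync-replica floor. Below is a minimal sketch assuming the confluent-kafka Python client; the broker address and topic name are placeholders, not part of the listing.

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Placeholder bootstrap address; a secured cluster would also carry SSL/SASL settings.
admin = AdminClient({"bootstrap.servers": "broker1:9092"})

# A topic sized for high availability: 3 replicas with min.insync.replicas=2,
# so acknowledged writes survive the loss of a single broker.
topic = NewTopic(
    "orders.events",                      # hypothetical topic name
    num_partitions=6,
    replication_factor=3,
    config={"min.insync.replicas": "2", "retention.ms": str(7 * 24 * 3600 * 1000)},
)

futures = admin.create_topics([topic])
for name, future in futures.items():
    try:
        future.result()                   # raises if creation failed
        print(f"created {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```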

Posted 6 days ago

Apply

4.0 - 7.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Overall 8+ years of experience, with 3+ years in RabbitMQ and administering Kafka (Apache, Confluent, MSK) in production environments. Proven track record in monitoring, optimization, and incident resolution. Deep understanding of the RabbitMQ and Kafka ecosystems: brokers, connectors, ZooKeeper/KRaft, schema registry. Proficiency with monitoring tools and middleware performance metrics. Strong collaboration, communication, and documentation abilities. Experience supporting cross-functional teams and mentoring junior engineers. Strong problem-solving skills and attention to detail.
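Monitoring middleware performance metrics usually starts with consumer lag. The sketch below is a minimal illustration assuming the confluent-kafka Python client; broker address, consumer group, topic, and partition count are placeholders.

```python
from confluent_kafka import Consumer, TopicPartition

# Placeholder connection details for an existing consumer group.
consumer = Consumer({
    "bootstrap.servers": "broker1:9092",
    "group.id": "orders-processor",       # hypothetical group
    "enable.auto.commit": False,
})

partitions = [TopicPartition("orders.events", p) for p in range(6)]
committed = consumer.committed(partitions, timeout=10)

for tp in committed:
    low, high = consumer.get_watermark_offsets(tp, timeout=10)
    # If the group has never committed, offset is negative; fall back to the full range.
    lag = high - tp.offset if tp.offset >= 0 else high - low
    print(f"partition {tp.partition}: committed={tp.offset} end={high} lag={lag}")

consumer.close()
```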

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Java Backend Developer working with Kafka, you will be responsible for demonstrating strong proficiency in Core Java, Spring Boot, and microservices architecture. Your role will involve hands-on experience with Apache Kafka, including Kafka Streams, Kafka Connect, Schema Registry, and Confluent. You will also work with REST APIs, JSON, and event-driven systems. In this position, you should have knowledge of SQL databases such as MySQL and PostgreSQL, as well as NoSQL databases like MongoDB, Cassandra, and Redis. Familiarity with Docker, Kubernetes, and CI/CD pipelines is essential for success in this role. Experience in multi-threading, concurrency, and distributed systems will be beneficial. An understanding of cloud platforms such as AWS, Azure, or GCP is desired. You should possess strong problem-solving skills and excellent debugging abilities to address complex technical challenges effectively. Join our team in Bangalore (WFO) and contribute your expertise to our dynamic projects.
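The event-driven consumption pattern this role describes looks roughly like the following minimal sketch (shown in Python with the confluent-kafka client for brevity, even though the role itself is Java-centric); the broker address, consumer group, and topic are placeholders.

```python
import json
from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",   # placeholder
    "group.id": "payments-service",        # hypothetical service group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments.requested"])  # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            # Partition EOF and transient errors are reported through the same channel.
            if msg.error().code() != KafkaError._PARTITION_EOF:
                print(f"consume error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # Business logic would go here; committing after processing gives
        # at-least-once delivery semantics.
        print(f"processed event {event.get('id')} from partition {msg.partition()}")
        consumer.commit(msg)
finally:
    consumer.close()
```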

Posted 1 week ago

Apply

7.0 - 14.0 years

0 Lacs

Karnataka

On-site

The ideal candidate should possess 10-14 years of experience in software development/engineering, with a focus on Debezium/Confluent and Kafka, ETL tools, database architectures, and data replication. As a Software Development Engineer based in Bangalore, you will be responsible for architecting, developing, designing, and maintaining Debezium/Kafka/Snowflake-based replication solutions for a large ERP system. In addition, you will develop custom connectors to integrate with multiple platforms and demonstrate expertise in Snowflake internals, integrating Snowflake with other technologies for data processing and reporting.

Your key responsibilities will include designing and implementing Kafka-based messaging systems to stream data in real time to Snowflake, troubleshooting and resolving data-related issues such as performance bottlenecks and data quality problems, and responding promptly to production incidents and service interruptions. You will be expected to implement monitoring tools and metrics to track the performance and health of data replications, analyze system logs and metrics to proactively identify issues, optimize resource utilization, and improve overall system efficiency. Furthermore, you will take ownership of complex technical challenges and drive them to successful resolution.

To be successful in this role, you should have a minimum of 7 years of experience in data engineering, replication, and ETL tools, with at least 4 years of experience with Debezium/Confluent and Kafka-based solutions preferred. You should possess a strong understanding of database architectures, database administration (Oracle and SQL Server), and the database activities that impact replication. Demonstrated experience in designing and implementing data replication strategies using Debezium/Kafka/Snowflake or a similar technology stack is essential for this role.

At CGI, we pride ourselves on ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to be an owner from day one, shaping the company's strategy and direction. Your work will create value by developing innovative solutions, building relationships with teammates and clients, and accessing global capabilities to scale your ideas. You will have the chance to shape your career in a company built to grow and last, supported by leaders who care about your health and well-being and provide opportunities for skill development and growth. Join our team at CGI, one of the largest IT and business consulting services firms globally, and embark on a journey where you can turn meaningful insights into action.
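Change-data-capture replication of this kind is typically wired up by registering a Debezium source connector with the Kafka Connect REST API. The sketch below is illustrative only: the Connect worker URL, credentials, table list, and the choice of the SQL Server connector class are assumptions, and the exact property names should be checked against the Debezium documentation for the version in use.

```python
import json
import requests

# Hypothetical Connect worker and connector settings.
connector = {
    "name": "erp-orders-cdc",
    "config": {
        "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
        "database.hostname": "erp-db.internal",
        "database.port": "1433",
        "database.user": "cdc_user",
        "database.password": "${secrets:erp-db/password}",
        "database.names": "ERP",
        "topic.prefix": "erp",
        "table.include.list": "dbo.orders,dbo.order_lines",
        "snapshot.mode": "initial",
    },
}

resp = requests.post(
    "http://connect.internal:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```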

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Thane, Navi Mumbai, Mumbai (all areas)

Work from Office

Location: Thane (Work from Office). Mandate experience: 3-6 years of experience in Kafka.

Role & Responsibilities
Minimum 6+ years of operations experience in a 24x7 high-availability Linux production environment.

Operating systems:
- Administer and tune Linux systems (Red Hat, SUSE, Ubuntu) in a 24x7 production environment.
- Perform kernel tuning, system hardening, and firewall configuration.
- Manage file systems (EXT3, XFS, NFS), LVM, and patching.
- Implement and maintain software load balancers such as HAProxy and Keepalived for traffic distribution and high availability.
- Troubleshoot system issues using tools like iostat, vmstat, netstat, sar, and strace.

Kafka & streaming systems:
- Design, deploy, and manage Kafka clusters for real-time data ingestion and analytics.
- Integrate Kafka with enterprise systems and data pipelines.
- Monitor and optimize Kafka performance for low latency and high throughput.
- Implement Kafka Connect, Kafka Streams, and Schema Registry.
- Conduct performance testing and optimization to ensure high throughput and low latency.
- Develop monitoring and alerting solutions to ensure the health of the Kafka cluster.

Elasticsearch administration:
- Deploy, configure, and maintain Elasticsearch clusters in production and development environments.
- Monitor cluster health, performance, and availability using tools like Kibana, Grafana, and Prometheus.
- Optimize indexing, querying, and storage strategies for performance and scalability.
- Manage Elasticsearch upgrades, backups, and disaster recovery plans.
- Implement and manage index lifecycle policies and shard allocation strategies.

CI/CD & automation:
- Build and maintain CI/CD pipelines using Jenkins, GitLab CI, or similar tools.
- Automate infrastructure provisioning and configuration using Ansible, Terraform, Puppet, or similar tools.
- Implement GitOps workflows and Infrastructure as Code (IaC) practices.
- Integrate Elasticsearch into CI/CD pipelines for log aggregation and monitoring.

Observability:
- Set up monitoring and alerting using Prometheus, Grafana, the ELK Stack, or Zabbix.
- Develop custom scripts for log parsing, system health checks, and alert automation.
- Familiarity with cloud platforms (AWS, Azure, GCP) and services like EC2, S3, IAM, and VPC.
- Familiarity with containerization (Docker) and orchestration (Kubernetes, Helm).

Security & compliance:
- Role-based access control (RBAC), TLS encryption, and audit logging.
- Security best practices including SELinux, firewalls, and auditing.
- Integration of Elasticsearch with authentication systems (LDAP, SSO, etc.).
- Compliance with data protection and operational standards.

Must possess significant work experience in production, mission-critical environments. Red Hat and Certified Kubernetes Administrator (CKA) certifications will be an added advantage.

Competencies: team working, strong analytical and troubleshooting skills, excellent communication and collaboration abilities, self-driven, detail-oriented, and results-focused.
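A first-pass health check of a Kafka cluster can be scripted against the admin API. This is a minimal sketch assuming the confluent-kafka Python client, with a placeholder bootstrap address; a production check would also feed alerting rather than print.

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # placeholder

# Cluster metadata: brokers, topics, partitions and their replicas/ISRs.
md = admin.list_topics(timeout=10)

print(f"cluster id: {md.cluster_id}, brokers: {len(md.brokers)}")
for broker in md.brokers.values():
    print(f"  broker {broker.id} at {broker.host}:{broker.port}")

for name, topic in md.topics.items():
    if topic.error is not None:
        print(f"topic {name}: error {topic.error}")
        continue
    under_replicated = sum(
        1 for p in topic.partitions.values() if len(p.isrs) < len(p.replicas)
    )
    if under_replicated:
        print(f"topic {name}: {under_replicated} under-replicated partition(s)")
```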

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of senior consultant specialist. In this role, you will:
- Be an approachable and supportive team member with a collaborative attitude within a demanding, maturing Agile environment
- Influence and champion new ideas and methods
- Communicate well - convey your thoughts, ideas and opinions clearly and concisely, face-to-face or virtually, to all levels up and down stream
- Bring an understanding of AEM concepts (AEM 6.5 and above)
- Bring an understanding of DevOps, Infrastructure as Code concepts and Terraform
- And equally important - listen and reflect back what others communicate to you
- Regularly demonstrate drive, motivation, determination, dedication, resiliency, honesty and enthusiasm
- Show excellent organizational and presentation skills and the ability to communicate with non-technical people

Requirements
To be successful in this role you should meet the following requirements:
- Working knowledge of one of the Java, JavaScript/Node.js or Python scripting languages
- Advanced knowledge of AWS Cloud and AWS services like VPC, CloudFront, CloudWatch, Lambda, S3, etc., or any other cloud environment
- Experience with agile development tools like Git, Visual Studio Code, IntelliJ, Confluent and JIRA
- Experience in full automation and configuration management desirable
- Knowledge of incident and problem management
- Understanding of and working experience with CI/CD and the available tools, e.g. Jenkins, Sonar, Maven, etc.
- Working experience in an agile environment
- Strong ability to quickly learn new skills and tools
- Ability to troubleshoot production issues in a timely manner
- Clear communication - document your work, share your ideas, review and be reviewed by your peers
- Experience deploying to production

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSBC Software Development India

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Chennai, Bengaluru

Work from Office

Data Engineer: Experienced Kafka Streams + ksqlDB developer with in-depth knowledge of specific client systems: the TAHI Contract and Application modules and the ISP Contract and Application modules. Performs data analysis and writes code to implement functional requirements per the LLD and client processes. Minimum skill levels in this specific area: current roles require 5+ years of experience plus insurance domain experience. These are technical roles, and the prime requirement is Kafka Streams / Java / ksqlDB / Kafka.
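ksqlDB work of this kind usually means declaring streams and tables over Kafka topics and submitting the statements to the ksqlDB REST endpoint. A minimal sketch follows; the server URL, topic, stream, and column names are placeholders and not taken from the listing.

```python
import json
import requests

KSQLDB_URL = "http://ksqldb.internal:8088/ksql"   # hypothetical ksqlDB server

statements = """
CREATE STREAM policy_events (
    policy_id VARCHAR KEY,
    event_type VARCHAR,
    premium DOUBLE
) WITH (KAFKA_TOPIC='policy.events', VALUE_FORMAT='JSON');

CREATE TABLE premium_by_policy AS
    SELECT policy_id, SUM(premium) AS total_premium
    FROM policy_events
    GROUP BY policy_id
    EMIT CHANGES;
"""

resp = requests.post(
    KSQLDB_URL,
    headers={"Content-Type": "application/vnd.ksql.v1+json"},
    data=json.dumps({"ksql": statements, "streamsProperties": {}}),
    timeout=30,
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```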

Posted 2 weeks ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

Pune, Bengaluru

Hybrid

Role & responsibilities: Kafka Engineer. Work mode: Hybrid. Location: Bangalore/Pune. 7+ years of experience with Apache Kafka - designing and owning solutions, and troubleshooting. Please share your updated profile with puneet@mounttalent.com

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of senior software engineer. In this role, you will:
- Be an approachable and supportive team member with a collaborative attitude within a demanding, maturing Agile environment
- Influence and champion new ideas and methods
- Communicate well - convey your thoughts, ideas and opinions clearly and concisely, face-to-face or virtually, to all levels up and down stream
- Bring an understanding of AEM concepts (AEM 6.5 and above)
- Bring an understanding of DevOps, Infrastructure as Code concepts and Terraform
- And equally important - listen and reflect back what others communicate to you
- Regularly demonstrate drive, motivation, determination, dedication, resiliency, honesty and enthusiasm
- Show excellent organizational and presentation skills and the ability to communicate with non-technical people

Requirements
To be successful in this role you should meet the following requirements:
- Working knowledge of one of the Java, JavaScript/Node.js or Python scripting languages
- Advanced knowledge of AWS Cloud and AWS services like VPC, CloudFront, CloudWatch, Lambda, S3, etc., or any other cloud environment
- Experience with agile development tools like Git, Visual Studio Code, IntelliJ, Confluent and JIRA
- Experience in full automation and configuration management desirable
- Knowledge of incident and problem management
- Understanding of and working experience with CI/CD and the available tools, e.g. Jenkins, Sonar, Maven, etc.
- Working experience in an agile environment
- Strong ability to quickly learn new skills and tools
- Ability to troubleshoot production issues in a timely manner
- Clear communication - document your work, share your ideas, review and be reviewed by your peers
- Experience deploying to production

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSBC Software Development India

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 15 Lacs

Bengaluru

Work from Office

Overall 5+ years of experience in the IT industry, including 3+ years of experience with Snowflake and 3+ years of experience in Oracle SQL, PL/SQL, dbt, and Apache Kafka, working as a data engineer; a highly motivated individual with a proven ability to learn fast and work well under pressure.

Core technologies:
- Apache Kafka: a distributed streaming platform used for building real-time data pipelines.
- Confluent Platform/Cloud: a platform for building and managing Kafka-based streaming data pipelines.
- Snowflake Cloud Data Warehouse: a cloud-based data warehouse used for analytics and business intelligence.
- Kafka Connect: a framework for connecting Kafka to external systems, including databases and data warehouses.
- Confluent Cloud connectors: fully managed connectors for various data sources and sinks, including Snowflake.
- SQL: used for querying and manipulating data in Snowflake.
- ksqlDB: a stream processing platform built on top of Kafka.

Roles & Responsibilities:
- Designing and implementing data pipelines: creating and managing the flow of data between Kafka (often used for streaming data) and Snowflake (used for data warehousing and analytics).
- Kafka Connect and Confluent Cloud connectors: utilizing Kafka Connect to move data efficiently and effectively, potentially leveraging Confluent's fully managed connectors for Snowflake.
- Snowflake expertise: deep understanding of Snowflake's architecture, data loading, performance optimization, and security best practices.
- ETL processes: developing and optimizing Extract, Transform, Load (ETL) processes to move data from various sources into Snowflake via Kafka.
- Data modeling and architecture: designing scalable and efficient data models within Snowflake to support analytical workloads.
- Monitoring and troubleshooting: ensuring the reliability and performance of data pipelines, and identifying and resolving issues related to data ingestion and processing.
- Reverse ETL: potentially moving data from Snowflake back into operational systems via Kafka.
- Real-time analytics: working with streaming data and near real-time analytics use cases, leveraging the combined power of Kafka and Snowflake.
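Wiring Kafka topics into Snowflake is usually a matter of registering the Snowflake sink connector with Kafka Connect (or using the fully managed Confluent Cloud equivalent). The sketch below assumes a self-managed Connect worker; the host, account URL, credentials, and topic are placeholders, and the exact property names should be verified against the Snowflake connector documentation for the version in use.

```python
import json
import requests

# Hypothetical Connect worker and connector settings.
connector = {
    "name": "snowflake-orders-sink",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "orders.events",
        "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
        "snowflake.user.name": "KAFKA_LOADER",
        "snowflake.private.key": "${secrets:snowflake/private-key}",
        "snowflake.database.name": "RAW",
        "snowflake.schema.name": "KAFKA",
        "buffer.count.records": "10000",
        "buffer.flush.time": "60",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter",
    },
}

resp = requests.post(
    "http://connect.internal:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["name"], "registered")
```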

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Punjab

On-site

Americana Restaurants International PLC is a pioneering force in the MENA region and Kazakhstan's out-of-home dining industry, and ranks among the world's leading operators of Quick Service Restaurants (QSR) and casual dining establishments. With an extensive portfolio of iconic global brands and a dominant regional presence, we have consistently driven growth and innovation for over 60 years. Our expansive network of 2,600+ restaurants spans 12 countries throughout the Middle East, North Africa, and Kazakhstan - from Kazakhstan in the east to Morocco in the west - powered by a team of over 40,000 talented individuals committed to delivering exceptional food, superior service, and memorable experiences. In line with our vision for innovation and operational excellence, we have established our Center of Excellence in Mohali, India. This facility plays a pivotal role in product development, IT, digital, AI, and analytics, as well as in implementing global IT best practices. Serving as a strategic hub, it is integral to strengthening our technology backbone and driving digital transformation across our worldwide operations.

Your Impact: In this role, you will lead and inspire a high-performing technical support team, ensuring every customer interaction is handled efficiently, professionally, and with care.

Your Role Will Include: We are looking for a skilled Data Engineer to join our growing team. As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data pipelines and infrastructure to support the extraction, transformation, and loading of data into our data warehouse and other data repositories. You will collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that enable data-driven decision-making.

Responsibilities:

Data pipeline development:
- Design, build, and maintain scalable and robust data pipelines for ingesting, transforming, and storing large volumes of data from various sources.
- Implement ETL (Extract, Transform, Load) processes to ensure data quality and reliability.
- Optimize data pipelines for performance, reliability, and cost-effectiveness.

Data modeling and architecture:
- Design and implement data models and schemas to support business requirements.
- Work closely with data architects to ensure alignment with the overall data architecture and standards.
- Implement and maintain data warehouse solutions and data lakes.

Data integration and API development:
- Integrate data from multiple sources and third-party APIs.
- Develop and maintain RESTful APIs for data access and integration.

Data quality and governance:
- Implement data quality checks and monitoring processes to ensure data accuracy, completeness, and consistency.
- Define and enforce data governance policies and best practices.

Performance tuning and optimization:
- Monitor and optimize data storage and query performance.
- Identify and resolve performance bottlenecks in data pipelines and databases.

Collaboration and documentation:
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver solutions.
- Document data pipelines, data flows, and data models.
- Provide technical guidance and support to junior team members.

What You Bring:
- Bachelor's degree in computer science, engineering, or a related field, or equivalent work experience.
- Proven experience as a Data Engineer or in a similar role, with a strong understanding of data management and integration techniques.
- Hands-on experience with big data technologies and frameworks such as Hadoop, Spark, Confluent, Kafka, data lakes, PostgreSQL, Data Factory, etc.
- Proficiency in programming languages such as Python, Scala, or Java for data manipulation and transformation.
- Experience with cloud platforms and services (e.g., Confluent, Azure, Google Cloud).
- Solid understanding of relational databases, SQL, and NoSQL databases.
- Familiarity with data warehousing concepts and technologies (e.g., Redshift, Snowflake, BigQuery).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Ability to work effectively in a fast-paced environment and manage multiple priorities.

Preferred Qualifications:
- Master's degree in data science, computer science, or a related field.
- 5+ years of experience in software development, with a focus on full stack web development using Java technologies.
- Experience with containerization and orchestration tools such as Docker and Kubernetes.
- Knowledge of machine learning and data analytics techniques.
- Experience with data streaming technologies (e.g., Apache Kafka, Kinesis).
- Familiarity with DevOps practices and tools.

Posted 3 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

Pune, Maharashtra

On-site

Join the Team That's Redefining Wireless Technology At Tarana, you will have the opportunity to contribute to building a cutting-edge cloud product, a management system for wireless networks that scales to millions of devices. You will be involved in utilizing modern cloud-native architecture and open-source technologies. Your responsibilities will include designing and implementing distributed software in a microservices architecture. This will involve tasks such as requirements gathering, high-level design, implementation, integrations, operations, troubleshooting, performance tuning, and scaling. As a key member of the team, you will provide technical and engineering leadership to an R&D team that manages multiple microservices end-to-end. You can expect to work on Proof of Concepts, customer pilots, and production releases in an agile engineering environment. The role will present daily challenges that will allow you to enhance and expand your skills. Meeting high standards of quality and performance will be a core focus, and the necessary mentoring will be provided to support your success. The position is based in Pune and will require your in-person presence in the office for collaboration with team members. Responsibilities: - Minimum of 15 years of software development experience, with at least 5 years in large-scale distributed software - Experience in product architecture and design, including providing technical leadership to engineering teams - Familiarity with building SaaS product offerings or IoT applications - Bonus points for experience in not only developing but also operating and managing such systems Required Skills & Experience: - Bachelor's degree (or higher) in Computer Science or a related field from a reputable university; a Master's or Ph.D. is preferred - Proficiency in software design and development in Java and its associated ecosystem (e.g., Spring Boot, Hibernate, etc.) - Expertise in microservices and RESTful APIs, covering design, implementation, and consumption - Strong understanding of distributed systems, including concepts like clustering, asynchronous messaging, scalability & performance, data consistency, and high availability - Experience with distributed messaging systems like Kafka/Confluent, Kinesis, or Google Pub/Sub - Proficiency in databases (relational, NoSQL, search engines), caching, and distributed persistence technologies. Knowledge of Elastic Search or any time series databases is a plus - Familiarity with cloud-native platforms like Kubernetes and service-mesh technologies such as Istio - Knowledge of network protocols (TCP/IP, HTTP), standard network architectures, and RPC mechanisms (e.g., gRPC) - Understanding of secure coding practices, network security, and application security Join Tarana and be part of shaping the future of wireless connectivity. About Us: Tarana's mission is to accelerate the deployment of fast, affordable internet access globally. With over a decade of R&D and significant investment, the Tarana team has developed a unique next-generation fixed wireless access (ngFWA) technology, embodied in its initial commercial platform, Gigabit 1 (G1). G1 represents a significant advancement in broadband economics for both mainstream and underserved markets, utilizing licensed or unlicensed spectrum. Since its production launch in mid-2021, G1 has been adopted by over 250 service providers in 19 countries and 41 US states. Headquartered in Milpitas, California, Tarana also conducts research and development in Pune, India. 
Visit our website to learn more about G1.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We are looking for a Data Engineer (AWS, Confluent & SnapLogic).
- Data integration: integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
- Data processing: implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
- Data storage: store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
- Data transformation: apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Data products: create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
- Workflow management: use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
- Real-time data streaming: utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
- ETL processes: design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
- Monitoring and logging: use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as having:
- Experience: 3+ relevant years of experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
- Technical skills: proficiency in AWS services, particularly AWS Glue; experience with Iceberg tables and Snowflake; knowledge of Confluent Kafka for real-time data streaming; familiarity with SnapLogic for ETL processes; experience with Apache Airflow for workflow management; understanding of Splunk for monitoring and logging.
- Programming skills: proficiency in Python, SQL, and other relevant programming languages.
- Data modeling: experience with data modeling and database design.
- Problem-solving: strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred qualities:
- Attention to detail: meticulous attention to detail, ensuring data accuracy and quality.
- Communication skills: excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
- Adaptability: ability to adapt to changing technologies and work in a fast-paced environment.
- Team player: strong team player with a collaborative mindset.
- Continuous learning: eagerness to learn and stay updated with the latest trends and technologies in data engineering.

This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future.
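Airflow orchestration of this kind typically chains extract/transform/load steps into a scheduled DAG. Below is a minimal sketch assuming the Airflow 2.x TaskFlow API; the DAG name, schedule, and task bodies are placeholders rather than anything specified in the listing.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False, tags=["demo"])
def kafka_to_snowflake():
    """Hypothetical hourly pipeline: pull a batch, transform it, load it."""

    @task
    def extract() -> list[dict]:
        # Placeholder: a real pipeline would read files staged by a streaming
        # job or pull a micro-batch from a Kafka topic.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Simple illustrative transformation.
        return [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: would write to Snowflake via a provider hook or connector.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


kafka_to_snowflake()
```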

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Pune

Hybrid

Job Title: Data Engineer - Ingestion, Storage & Streaming (Confluent Kafka)

Job Summary: As a Data Engineer specializing in ingestion, storage, and streaming, you will design, implement, and maintain robust, scalable, and high-performance data pipelines for the efficient flow of data through our systems. You will work with Confluent Kafka to build real-time data streaming platforms, ensuring high availability and fault tolerance. You will also ensure that data is ingested, stored, and processed efficiently and in real time to provide immediate insights.

Key Responsibilities:

Kafka-based streaming solutions:
- Design, implement, and manage scalable and fault-tolerant data streaming platforms using Confluent Kafka.
- Develop real-time data streaming applications to support business-critical processes.
- Implement Kafka producers and consumers for ingesting data from various sources.
- Handle message brokering, processing, and event streaming within the platform.

Ingestion & data integration:
- Build efficient data ingestion pipelines to bring real-time and batch data from various data sources into Kafka.
- Ensure smooth data integration across Kafka topics and handle multi-source data feeds.
- Develop and optimize connectors for data ingestion from diverse systems (e.g., databases, external APIs, cloud storage).

Data storage and management:
- Manage and optimize data storage solutions in conjunction with Kafka, including topics, partitions, retention policies, and data compression.
- Work with distributed storage technologies to store large volumes of structured and unstructured data, ensuring accessibility and compliance.
- Implement strategies for schema management, data versioning, and data governance.

Data streaming & processing:
- Leverage Kafka Streams and other stream processing frameworks (e.g., Apache Flink, ksqlDB) to process real-time data and provide immediate analytics.
- Build and optimize data processing pipelines to transform, filter, aggregate, and enrich streaming data.

Monitoring, optimization, and security:
- Set up and manage monitoring tools to track the performance of Kafka clusters and of ingestion and streaming pipelines.
- Troubleshoot and resolve issues related to data flows, latency, and failures.
- Ensure data security and compliance by enforcing appropriate data access policies and encryption techniques.

Collaboration and documentation:
- Collaborate with data scientists, analysts, and other engineers to align data systems with business objectives.
- Document streaming architecture, pipeline workflows, and data governance processes to ensure system reliability and scalability.
- Provide regular updates on streaming and data ingestion pipeline performance and improvements to stakeholders.

Required Skills & Qualifications:

Experience:
- 3+ years of experience in data engineering, with a strong focus on Kafka, data streaming, ingestion, and storage solutions.
- Hands-on experience with Confluent Kafka, Kafka Streams, and related Kafka ecosystem tools.
- Experience with stream processing and real-time analytics frameworks (e.g., ksqlDB, Apache Flink).

Technical skills:
- Expertise in Kafka Connect, Kafka Streams, and the Kafka producer/consumer APIs.
- Proficiency in data ingestion and integration techniques from diverse sources (databases, APIs, etc.).
- Strong knowledge of cloud data storage and distributed systems.
- Experience with programming languages like Java, Scala, or Python for Kafka integration and stream processing.
- Familiarity with tools such as Apache Spark, Flink, Hadoop, or other data processing frameworks.
- Experience with containerization and orchestration tools such as Docker and Kubernetes.
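The ingestion side described above usually boils down to a producer that publishes source records to a topic and tracks delivery. A minimal sketch, assuming the confluent-kafka Python client; the broker address, topic, and record shape are placeholders.

```python
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker1:9092",  # placeholder
    "acks": "all",                        # wait for the full ISR for durability
    "enable.idempotence": True,           # avoid duplicates on retries
})


def on_delivery(err, msg):
    # Called from poll()/flush() for every produced message.
    if err is not None:
        print(f"delivery failed for key={msg.key()}: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}]@{msg.offset()}")


records = [{"sensor": "s1", "value": 21.4}, {"sensor": "s2", "value": 19.8}]
for record in records:
    producer.produce(
        "telemetry.raw",                   # hypothetical topic
        key=record["sensor"],
        value=json.dumps(record),
        on_delivery=on_delivery,
    )
    producer.poll(0)                       # serve delivery callbacks

producer.flush(10)                         # block until everything is acknowledged
```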

Posted 3 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Jaipur

Work from Office

Project Role: Application Developer. Project Role Description: Design

Posted 4 weeks ago

Apply

5.0 - 9.0 years

13 - 18 Lacs

Mumbai, Pune, Chennai

Work from Office

Project description: Our client is one of the leading multinational food and drink processing corporations. As one of the largest food companies in the world, the company runs strategic programs for regional platform management and optimization.

Responsibilities:
- Take part in design discussions
- Develop components and integrations using MuleSoft
- Conduct code reviews
- Communicate with clients regarding the architecture design and other technical topics

Skills - must have:
- 5 years of experience in supporting and developing applications using MuleSoft
- Hands-on experience in designing RAML (REST API Modeling Language)
- Strong hands-on experience with Anypoint Studio, Mule Expression Language (MEL), and Mule ESB (Enterprise Service Bus)
- Good experience with Mule connectors (endpoint operations), filters, components, transformers, routers (flow controls), scopes, and exception handling mechanisms
- Experience working with SAP APIM and NWG, ASAPIO
- Experience upgrading MuleSoft to the newer Java version (Java 17)
- Hands-on experience with the Anypoint Platform
- Experience with security strategies such as encryption/decryption techniques
- Hands-on experience applying security policies in Mule and converting HTTP APIs to HTTPS
- Experience creating Mule flows, sub-flows, and common exception handling for projects
- Hands-on experience with validating and debugging techniques
- Advanced MuleSoft knowledge such as RTF, CloudHub 2.0, and JDK migration strategies

Nice to have: Java, Azure, MS SQL, experience working with Kafka/Confluent

Location: Pune, Mumbai, Chennai, Bangalore

Posted 4 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald's: One of the world's largest employers, with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Senior Manager, Integrated Test Lead - Data Product Engineering & Delivery (Sr Manager, Technology Testing). Lead comprehensive testing strategy and execution for complex data engineering pipelines and product delivery initiatives. Drive quality assurance across integrated systems, data workflows, and customer-facing applications while coordinating cross-functional testing efforts.

Who we are looking for - primary responsibilities:

Test strategy & leadership:
- Design and implement end-to-end testing frameworks for data pipelines, ETL/ELT processes, and analytics platforms
- Ensure test coverage across ETL/ELT, data transformation, lineage, and consumption layers
- Develop integrated testing strategies spanning multiple systems, APIs, and data sources
- Establish testing standards, methodologies, and best practices across the organization

Data engineering testing:
- Create comprehensive test suites for data ingestion, transformation, and output validation
- Design data quality checks, schema validation, and performance testing for large-scale datasets
- Implement automated testing for streaming and batch data processing workflows
- Validate data integrity across multiple environments and systems and against business rules

Cross-functional coordination:
- Collaborate with data engineers, software developers, product managers, and DevOps teams
- Coordinate testing activities across multiple product streams and release cycles
- Manage testing dependencies and critical-path items in complex delivery timelines

Quality assurance & process improvement:
- Establish metrics and KPIs for testing effectiveness and product quality to drive continuous improvement in testing processes and tooling
- Lead root cause analysis for production issues and testing gaps

Technical leadership:
- Mentor junior QA engineers and promote testing best practices
- Evaluate and implement new testing tools and technologies
- Design scalable testing infrastructure and CI/CD integration

Skills:
- 10+ years in software testing, with 3+ years in leadership roles
- 8+ years of experience testing data engineering systems, ETL pipelines, or analytics platforms
- Proven track record with complex, multi-system integration testing
- Experience in agile/scrum environments with rapid delivery cycles
- Strong SQL experience with major databases (Redshift, BigQuery, etc.)
- Experience with cloud platforms (AWS, GCP) and their data services
- Knowledge of data pipeline tools (Apache Airflow, Kafka, Confluent, Spark, dbt, etc.)
- Proficiency in data warehousing, data architecture, reporting, and analytics applications
- Scripting languages (Python, Java, bash) for test automation
- API testing tools and methodologies
- CI/CD/CT tools and practices
- Strong project management and organizational skills
- Excellent verbal and written communication abilities
- Experience managing multiple priorities and competing deadlines

Work location: Hyderabad, India. Work pattern: full-time role. Work mode: hybrid.
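Automated data-quality checks of the kind described above are often just assertions over a batch of records. A minimal, self-contained pytest-style sketch follows; the schema, rules, and sample rows are hypothetical and only illustrate the pattern.

```python
# A toy batch standing in for rows pulled from a pipeline's output table.
BATCH = [
    {"order_id": 1, "amount": 42.0, "currency": "USD"},
    {"order_id": 2, "amount": 19.5, "currency": "EUR"},
]

REQUIRED_COLUMNS = {"order_id", "amount", "currency"}


def test_schema_has_required_columns():
    for row in BATCH:
        assert REQUIRED_COLUMNS <= row.keys(), f"missing columns in {row}"


def test_no_null_or_negative_amounts():
    for row in BATCH:
        assert row["amount"] is not None and row["amount"] >= 0, row


def test_primary_key_is_unique():
    keys = [row["order_id"] for row in BATCH]
    assert len(keys) == len(set(keys)), "duplicate order_id values found"
```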

Posted 1 month ago

Apply

6.0 - 8.0 years

10 - 15 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Application Integration Engineer - experience level 6-8 years.

Skills: Python, AWS S3, AWS MWAA (Airflow), Confluent Kafka, API development.

- Experienced Python developer with very good experience with Confluent Kafka and Airflow
- API development experience using Python
- Good experience with AWS cloud services
- Very good experience with DevOps CI/CD processes and tools like Git, Jenkins, AWS ECR/ECS, AWS EKS, etc.
- Analyzes functional and non-functional requirements (FR/NFR) and prepares technical designs based on them
- Builds code based on the technical design
- Can independently resolve technical issues and also help other team members with technical issue resolution
- Helps with testing and efficiently fixes bugs
- Follows the DevOps CI/CD processes and change management processes for any code deployment
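API development plus Kafka in Python often means an HTTP endpoint that validates a payload and publishes it to a topic. A minimal sketch, assuming FastAPI and the confluent-kafka client; the endpoint path, topic, and broker address are placeholders.

```python
import json

from confluent_kafka import Producer
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
producer = Producer({"bootstrap.servers": "broker1:9092"})  # placeholder


class Event(BaseModel):
    source: str
    payload: dict


@app.post("/events")
def publish_event(event: Event):
    # Publish the validated request body to a hypothetical integration topic.
    producer.produce(
        "integration.events",
        key=event.source,
        value=json.dumps({"source": event.source, "payload": event.payload}),
    )
    producer.poll(0)  # serve delivery callbacks without blocking the request
    return {"status": "queued"}
```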

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

In the modern banking age, financial institutions are required to bring classical data drivers and evolving business drivers together on a single platform. However, traditional data platforms face limitations in communicating with evolving business drivers due to technological constraints. A Modern Data Platform is essential to bridge this gap and elevate businesses to the next level through data-driven approaches, enabled by recent technology transformations.

As a technology leader with an academic background in Computer Science / Information Technology / Data Technologies [BE/BTech/MCA], you will have the opportunity to lead the Modern Data Platform Practice. This role involves providing solutions to customers on traditional data warehouses across on-prem and cloud platforms. You will be responsible for architecting data platforms, defining data engineering designs, selecting appropriate technologies and tools, and enhancing the organization's Modern Data Platform capabilities. Additionally, you will lead pre-sales discussions, provide technology architecture in RFP responses, and spearhead technology POC/MVP initiatives.

To excel in this role, you are expected to possess the following qualifications and experience:
- 12-16 years of data engineering and analytics experience, including hands-on experience with big data systems across on-prem and cloud environments
- Leadership of data platform architecture and design projects for mid-to-large-size firms
- Implementation experience with batch data and streaming/online data integrations using third-party tools and custom programs
- Proficiency in SQL and one of the programming languages: Core Java / Scala / Python
- Hands-on experience with Kafka for enabling event-driven data pipes/processing
- Knowledge of leading data services offered by AWS, Azure, Snowflake, and Confluent
- Strong understanding of distributed computing and related data structures
- Implementation of data governance and quality capabilities for data platforms
- Analytical and presentation skills, along with the ability to build and lead teams
- Exposure to leading RDBMS technologies and data visualization platforms
- Demonstrated experience applying AI/ML models for data processing and generating insights
- A team player able to work independently with minimal direction

Your responsibilities at Oracle will be at Career Level IC4, and the company values diversity and inclusion to foster innovation and excellence. Oracle offers a competitive suite of employee benefits emphasizing parity, consistency, and affordability, including medical, life insurance, and retirement planning. The company encourages employees to contribute to the communities where they live and work. Oracle believes that innovation stems from diversity and inclusion, and is committed to creating a workforce where all individuals can thrive and contribute their best work. The company supports individuals with disabilities by providing reasonable accommodations throughout the job application and interview process, and in potential roles, to ensure successful participation in crucial job functions.

As a global leader in cloud solutions, Oracle is dedicated to leveraging tomorrow's technology to address today's challenges. The company values inclusivity and empowers its workforce to drive innovation and growth. Oracle careers offer opportunities for global engagement, work-life balance, and competitive benefits. The company is committed to promoting an inclusive workforce that supports opportunities for all individuals. If you require accessibility assistance or accommodation for a disability at any point during the employment process at Oracle, kindly reach out by emailing accommodation-request_mb@oracle.com or calling +1 888 404 2494 in the United States.

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 20 Lacs

Bengaluru

Remote

Description - position overview: BA - AWS Data Engineer Lead
- Work address: Remote
- Relevant experience: 10+ years
- Top 3 mandatory skills: Kafka, AWS, Confluent
- Shift timings: UK shift
- Start date: ASAP

Data Engineer Lead:
- Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git, and CI/CD pipelines (mandatory)
- Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink
- Experienced with software support for applications written in Python and SQL
- Administration, configuration, and maintenance of Snowflake and dbt
- Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform, and GitHub
- Debugging issues, root cause analysis, and applying fixes
- Management and maintenance of ETL processes (bug fixing and batch job monitoring)

Training & certification:
- Apache Kafka administration
- Snowflake fundamentals/advanced training

Experience:
- 10 years of experience in a technical role working with AWS
- At least 2 years in a leadership or management role

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You are a seasoned Confluent & Oracle EBS Cloud Engineer with over 10 years of experience, responsible for leading the design and implementation of scalable, cloud-native data solutions. Your role involves modernizing enterprise data infrastructure, driving real-time data streaming initiatives, and migrating legacy ERP systems to AWS-based platforms.

Your key responsibilities include architecting and implementing cloud-based data platforms using AWS services such as Redshift, Glue, DMS, and Data Lake solutions. You will lead the migration of Oracle E-Business Suite or similar ERP systems to AWS while ensuring data integrity and performance. Additionally, you will design and drive the implementation of Confluent Kafka for real-time data streaming across enterprise systems. It is essential for you to define and enforce data architecture standards, governance policies, and best practices. Collaborating with engineering, data, and business teams to align architecture with strategic goals is also a crucial aspect of your role. Furthermore, you will optimize data pipelines and storage for scalability, reliability, and cost-efficiency.

To excel in this role, you must possess 10+ years of experience in data architecture, cloud engineering, or enterprise systems design. Deep expertise in AWS services, including Redshift, Glue, DMS, and Data Lake architectures, is required. Proven experience with Confluent Kafka for real-time data streaming and event-driven architectures is essential. Hands-on experience in migrating large-scale ERP systems (e.g., Oracle EBS) to cloud platforms is a must. A strong understanding of data governance, security, and compliance in cloud environments, as well as proficiency in designing scalable, fault-tolerant data systems, is also necessary.

Preferred qualifications include experience with data modeling, metadata management, and lineage tracking; familiarity with infrastructure-as-code and CI/CD practices; and strong communication and leadership skills to guide cross-functional teams.

Posted 1 month ago

Apply

13.0 - 17.0 years

40 - 55 Lacs

Hyderabad, Pune

Work from Office

Job Title: Engineering Manager (Java, Spring Boot, Microservices, Kafka)
Location: Pune (On-site)
Experience: 13+ years total, including 2+ years in people management

We are hiring an Engineering Manager / Technical Manager for a leading product-based company in Pune and Hyderabad. This is a full-time, on-site role and an exciting opportunity for a seasoned engineering leader who thrives in a hands-on technical environment.

Key Responsibilities:
- Lead and mentor a team of engineers, overseeing performance management, appraisals, and talent development.
- Drive the design and development of scalable backend systems using Java, Spring Boot, microservices, and Confluent Kafka.
- Collaborate with cross-functional teams to ensure timely delivery and high-quality outcomes.
- Maintain a hands-on approach to coding, technical problem-solving, and code reviews.

Requirements:
- 13+ years of total industry experience with strong backend engineering skills.
- At least 2 years of proven experience in engineering leadership or people management roles.
- Proficiency in Java, Spring Boot, microservices architecture, and Kafka (preferably Confluent).
- Strong interpersonal and communication skills, with a passion for mentoring and growing engineering talent.
- Must be willing to work from the Pune office full-time.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Chennai, Bengaluru

Hybrid

Role: API Developer - Node.js

What awaits you / job profile:
- Design, develop, and maintain RESTful APIs using Node.js
- Develop serverless applications using AWS Lambda, API Gateway, Redis, and other AWS services
- Design and maintain infrastructure using AWS CloudFormation for Infrastructure as Code (IaC), or similar services from Azure/Google Cloud
- Ensure the application is scalable, performant, secure, and highly available
- Good analytical/problem-solving skills, algorithms, and logical thinking
- API testing, error resolution, and the overall software development process
- Maintain and upgrade existing applications
- Collaborate with team members

What you should bring along:
- Expert in programming using Node.js with TypeScript
- Exposure to AWS Cloud/Azure Cloud/Google Cloud
- Knowledge of API tools like Insomnia/Postman

Must-have technical skills:
- Strong proficiency with JavaScript and Node.js
- AWS Lambda, API Gateway, AWS CloudWatch, or corresponding services from other cloud providers like Azure or Google Cloud
- Implement streaming solutions using Confluent Kafka
- Docker for containerization and deployment
- Understanding of Agile methodologies and hands-on experience using Jira, Confluence, or similar agile project management tools
- Excellent communication and collaboration skills
- Domain background in financial services/banking

Good-to-have skills:
- Certifications in cloud computing platforms (e.g., Microsoft Certified: Azure Developer Associate, AWS Certified Developer - Associate)
- Kubernetes and Docker
- GitHub and Jenkins

Posted 2 months ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Pune

Hybrid

About the team and your role

We are currently looking for integration consultants who are passionate about integration and understand what it takes to deliver top-quality integration solutions to our clients and partners. You have an eagle eye for identifying integration challenges and the ability to translate those business challenges into the best integration solutions. You can listen and stay calm under pressure, and you can be the linking pin between business and IT. You have seen integrations in different shapes, sizes, and colours, and you can integrate any to any, either on-premise or in the cloud. You advise our clients about the best integration strategies based on best practices, analysts' recommendations, and architecture patterns. Last but not least, to be successful in this position we expect you to apply your strong consultancy and integration skills. Part of your responsibilities is to support the different development teams during the entire lifecycle of an interface, from requirements gathering, analysis, design, development, and testing, through handover to operational support.

We are looking for experienced Enterprise Integration Consultants to join our team. The ideal candidate has:
- Strong knowledge of integration principles and the consultancy skills to translate business requirements into IT requirements.
- Hands-on experience in the integration of SAP and non-SAP systems in A2A and B2B scenarios.
- Deep expertise and hands-on experience in Dell Boomi as the primary integration platform.
- Working knowledge of API Management platforms (e.g., SAP API Management, Apigee, or others).
- Familiarity with event-driven architecture and distributed streaming platforms like Solace or Confluent Kafka.
- In-depth technical, functional, and architectural expertise in integrating applications using different technologies such as, but not limited to, REST, SOAP, ALE-IDocs, EDI, RFC, XI, HTTP, IDOC, JDBC, File/FTP, Mail, and JMS.
- Solid middleware knowledge and web service skills.
- Good understanding of REST APIs and web services.
- Extensive experience integrating third-party applications using REST-based services.
- Experience using tools like Postman and SoapUI for service testing.
- Proven experience with full life cycle integration implementation or rollout projects.
- Demonstrated experience with deliverables planning, client-facing roles, and high-pace environments.

What is Rojo all about?
Founded in 2011, Rojo Integrations has transformed from a consulting firm into a comprehensive SAP integration leader, partnering with top software vendors like SAP, Coupa, SnapLogic, and Solace. As the leading SAP integration partner and ultimate expert, we provide seamless enterprise integration and data analytics solutions, enabling real-time insights and empowering digital transformation. Trusted by global blue-chip companies such as Heineken and Siemens, we deliver tailored services to meet unique business needs. Rojo is headquartered in the Netherlands and operates globally from its offices in the Netherlands, Spain, and India. We specialize in SAP integration modernization and business processes, improving data integration and business strategies. Our 360-degree portfolio includes consultancy, software development, and managed services to streamline integration, enhance observability, and drive growth.

Requirements to succeed in this role:
- Experience using Dell Boomi, SAP PO, SAP Cloud Integration, SnapLogic, and/or API Management.
- Quick learner, able to adapt to new tools and technologies and evaluate their applicability.
- Team player with good technical, analytical, and communication skills and a client-driven mindset.
- A bright mind and the ability to understand a complex platform.
- Ability to understand technical/engineering concepts and to learn integration product functionality and applications.
- Demonstrated user-focused technical writing ability; must be able to communicate complex technical concepts clearly and effectively.
- Strong analytical and problem-solving skills.
- Ability to work independently in a dynamic environment.
- Ability to work on multiple complex projects simultaneously.
- Strong interpersonal communication skills; communicates effectively in one-to-one and group situations.
- At least three years of previous experience in a similar role.

Additional desired skills:
- At least a Bachelor's degree in computer engineering or a related field.
- Experience with any API Management platform.
- Experience with distributed streaming platforms and event-based integration architecture such as Kafka or Solace.
- Extensive experience in integration of SAP and non-SAP systems in A2A and B2B scenarios using SAP Integration Suite or Cloud Integration (CPI).
- Experience in integration with main SAP backend systems (SAP ERP, SAP S/4HANA, SAP S/4HANA Cloud).
- SAP PO experience in programming UDFs, modules, look-ups (RFC, SOAP, JDBC), BPM, and inbound and outbound ABAP proxies.
- Extensive knowledge of Java, JavaScript and/or Groovy script.
- Good understanding of CI/CD concepts.
- Fluency in spoken and written English.
- Affinity and experience with integration platforms/software like Dell Boomi, SAP Cloud Integration, or SnapLogic is desirable.

What do we offer?
The chance to gain work experience in a dynamic and inspiring environment and launch your career. Plenty of growth opportunities while working in a high-energy and fun environment. The opportunity to work on innovative projects with colleagues who are genuinely proud of their contribution. Training and mentoring to support your professional development, with a yearly education budget. An international atmosphere with a multicultural environment (around 20 nationalities). A global, inclusive and diverse working climate within a world-conscious organization. Plus other exciting benefits specific to each region.

Rojo is committed to achieving diversity and inclusion in terms of gender, caste, race, religion, nationality, ethnic origin, sexual orientation, disability, age, pregnancy, or other status. All qualified candidates are encouraged to apply. No one fits a job description perfectly, and there is no such thing as the perfect candidate. If you don't meet all the criteria, we'd still love to hear from you. Does that spark your interest? Apply now.

Posted 2 months ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Jaipur

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Kafka
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: Minimum 15 years of full-time education

Key Responsibilities:
A. Strong experience as an administrator/platform engineer for Kafka
B. Expertise in Confluent Kafka administration
C. Experience in implementing Kafka on Confluent Cloud
D. Hands-on experience with Kafka clusters hosted on cloud and on-prem platforms
E. Design, build, assemble, and configure application or technical architecture components using business requirements
F. A plus to have AWS expertise and familiarity with CI/CD DevOps, in addition to skills in Spring Boot, Microservices, and Angular

Technical Experience:
A. Token-based auth, OAuth, basic auth, key pair concepts, the OpenSSL library
B. Manage Kafka clusters in on-prem and cloud environments
C. Confluent Cloud backup and restore for data
D. Kafka load balancing and auto-scaling on the basis of load
E. Confluent Cloud Centre and KSQL knowledge (must have)

Professional Attributes:
A. Interpersonal skills along with the ability to work in a team
B. Good presentation skills

Qualification: Minimum 15 years of full-time education
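Connecting a client to a Confluent Cloud cluster typically uses SASL_SSL with an API key and secret. A minimal sketch assuming the confluent-kafka Python client; the bootstrap endpoint, credentials, and topic are placeholders and would normally come from cluster settings and a secrets manager.

```python
import json
from confluent_kafka import Producer

# Placeholder Confluent Cloud endpoint and API credentials.
conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "acks": "all",
}

producer = Producer(conf)
producer.produce("audit.events", value=json.dumps({"action": "login"}))  # hypothetical topic
producer.flush(10)
```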

Posted 2 months ago

Apply