
56 Apache Kafka Jobs - Page 3

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3 - 5 years

5 - 7 Lacs

Bengaluru

Work from Office


Responsibilities

Position Purpose: Design and develop data pipelines integrating with messaging systems such as Redpanda and ActiveMQ, using Apache Camel and Kafka Connect.

Primary duties will include:
- Contribute to design, understanding architectural trade-offs including scalability, resiliency, high availability, storage and security
- Perform capacity planning and solution review related to the messaging and streaming environment
- Adopt and promote modern automated self-service paradigms
- Perform proof-of-concept work for new tools and components, and turn proof-of-concept work into production-grade systems
- Provide end-user support to development teams and implement change requests
- Prioritize across various tasks and manage changes in daily workload
- Concisely document instructions, user guides and policies in Confluence or in source control
- Guide fellow engineers on stack standards and efficient utilization
- This role includes an out-of-hours support rota providing support for our US-based workloads; it requires the ability to adapt to a US time-zone work schedule

To land this role you will need:
- Proven history of implementation, configuration and support of a modern messaging platform and its ecosystem (RH AMQ, ActiveMQ Artemis, IBM MQ, Solace)
- Proven history of implementation, configuration and support of a modern streaming platform and its ecosystem (Apache Kafka, Redpanda, Apache Pulsar)
- Ability to use programming languages (e.g. Java, Python, PowerShell) to create production-grade APIs and integrations
- Working knowledge of the Linux environment
- Experience with Git, CI and CD
- Willingness to learn and adapt to innovative technologies
- Effective communication and interpersonal skills; forward-thinking, self-starting, able to work independently

Qualifications - what makes you stand out:
- Configuration management / deployment tools such as Ansible, Octopus
- Experience with cloud providers (preferably Azure)
- GitOps, CI/CD mindset
- Networking knowledge
- Kubernetes and Helm knowledge
- Ansible / configuration as code
- University / college bachelor-level (or equivalent) degree
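For illustration only (not from the posting): a minimal sketch of the kind of integration described above, using an Apache Camel route to bridge an ActiveMQ queue into a Kafka topic. The queue name, topic name and broker address are placeholders, and the camel-activemq and camel-kafka components are assumed to be on the classpath.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class AmqToKafkaRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Consume from an ActiveMQ queue and publish each message to a Kafka topic.
        // Endpoint names and broker address are placeholders.
        from("activemq:queue:orders.inbound")
            .log("Bridging message: ${body}")
            .to("kafka:orders.events?brokers=localhost:9092");
    }

    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.configure().addRoutesBuilder(new AmqToKafkaRoute());
        main.run(args); // runs until stopped
    }
}
```

A production route of this kind would typically add error handling, dead-letter queues and monitoring, which are omitted here for brevity.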

Posted 3 months ago


8 - 13 years

25 - 40 Lacs

Bengaluru

Work from Office


Specialized open positions in Azure: Cloud Engineer, DevOps Engineer, Solutions Architect, Azure Data Engineer, Security Engineer, API Developer, Data Hosting, IAM, OS, Integration.

C&B: best in the industry.

Required candidate profile: Mid level - 5+ years, senior level - 10+ years of relevant experience in cloud engineering; must have hands-on working experience. Send your CV to itjobs@consultasia.in and mention your specialization in Azure Cloud.

Posted 3 months ago


8 - 10 years

32 - 35 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office


Profile: Senior Java Developer (8-10 years), Java backend + AWS
- Web application design and architecture
- Core Java (JDK 8/11), Spring Framework (Spring Boot, Spring Core), Hibernate / Spring JPA, Spring RESTful web services, microservice architecture
- Front end: Angular JS / React JS / Node JS, HTML5, jQuery, AJAX, CSS3
- Apache Kafka / Apache Spark
- Application server, web server, cloud: AWS cloud-based services such as DynamoDB, Lambda, API Gateway, AWS S3
- Splunk dashboards, Datadog/Grafana
- JUnit/Mockito frameworks for unit testing, API testing (REST Assured / Newman), CI/CD, Jenkins, Sonar
- Oracle/MySQL/NoSQL DB
- SVN/Git, Docker, Kubernetes, Jenkins
Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
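Purely as an illustration of the stack listed above (not the employer's code): a small Spring Boot REST endpoint that publishes an incoming request payload to an Apache Kafka topic. The controller, topic name and payload handling are invented, and a configured Spring Boot application with spring-kafka on the classpath is assumed.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class OrderController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/orders")
    public ResponseEntity<String> createOrder(@RequestBody String orderJson) {
        // Publish the raw order payload to a Kafka topic for downstream consumers.
        kafkaTemplate.send("orders.created", orderJson);
        return ResponseEntity.accepted().body("queued");
    }
}
```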

Posted 3 months ago


8 - 12 years

40 - 45 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office


Roles & Responsibilities:

Data Engineering Leadership & Strategy: Lead and mentor a team of data engineers, fostering a culture of technical excellence and collaboration. Define and implement data engineering best practices, standards, and processes.

Data Pipeline Architecture & Development: Design, build, and maintain scalable, robust, and efficient data pipelines for ingestion, transformation, and loading of data from various sources. Optimize data pipelines for performance, reliability, and cost-effectiveness. Implement data quality checks and monitoring systems to ensure data integrity. Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Cloud-Based Data Infrastructure: Design, implement, and manage cloud-based data infrastructure using platforms like AWS, Azure, or GCP. Leverage cloud services (e.g., data lakes, data warehouses, serverless computing) to build scalable and cost-effective data solutions. Leverage open-source tools such as Airbyte, Mage AI and similar. Ensure data security, governance, and compliance within the cloud environment.

Data Modeling & Warehousing: Design and implement data models to support business intelligence, reporting, and analytics. Optimize data warehouse performance for efficient querying and reporting.

Collaboration & Communication: Collaborate effectively with cross-functional teams including product managers, software engineers, and business stakeholders.

Requirements:
- Bachelor's or master's degree in computer science, engineering, or a related field
- 8+ years of proven experience in data engineering, with at least 3+ years in a lead role
- Expertise in building and maintaining data pipelines using tools such as Apache Spark, Apache Kafka, Apache Beam, or similar
- Proficiency in SQL and one or more programming languages like Python, Java, or Scala
- Hands-on experience with cloud-based data platforms (AWS, Azure, GCP) and services

Work timings: 2.30 pm - 11.30 pm IST
Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
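As an illustrative sketch of the pipeline skills listed above (tool choices and names are assumptions, not this team's actual pipeline): a Spark Structured Streaming job in Java that reads a Kafka topic and lands it as Parquet. The topic, paths and bootstrap servers are placeholders, and the spark-sql-kafka connector is assumed to be available.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToParquetJob {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-parquet")
                .getOrCreate();

        // Ingest raw events from a Kafka topic (placeholder names).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "events.raw")
                .load()
                .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value");

        // Land the stream as Parquet files, with checkpointing for recovery.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "/data/lake/events_raw")
                .option("checkpointLocation", "/data/checkpoints/events_raw")
                .start();

        query.awaitTermination();
    }
}
```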

Posted 3 months ago


10 - 15 years

30 - 35 Lacs

Hyderabad

Work from Office


As a Principal Engineer with in-depth knowledge of data-intensive application development, you'll be part of the Network & Technology IT Organization, focusing on end-to-end implementation of data movement/data pipeline solutions. Responsibilities include:
- Working on the overall IT architecture and design for complex applications, system architecture and functionality development
- Providing technical guidance to cross-functional teams in crafting solutions and describing the impact of new solutions on systems and processes
- Supporting mission- and business-critical applications in Network Systems covering data movement platforms
- Playing a critical role, in alignment with the Solution Architecture team, in preparing HLDs and LLDs, and taking responsibility for development and delivery on time
- Participating in planning, definition, and high-level design of the solution and exploring solution alternatives
- Developing complex application migrations to the North Star technology platforms, aligning with SOE/SRE standards
- Creating, leading, shaping and executing programs and initiatives for process optimization and modern technology adoption to create higher value for the business
- Defining reusable components/frameworks, common schemas, standards and tools to be used, and helping bootstrap the data movement / data pipeline platforms
- Preparing and reviewing designs to make sure they are aligned with the North Star architecture and data engineering industry standards
- Driving technology, with best software engineering practices, to impact the entire architecture ecosystem of the product
- Participating in code reviews, design reviews and architecture discussions
- Working with solution architects, customers, stakeholders, and suppliers to establish high-level solution intent, and the solution intent information models and documentation requirements

Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

You'll need to have:
- Master's or Bachelor's degree with more than ten years of work experience
- Ten or more years of work experience in Big Data and data engineering concepts like aggregations, transformations, rollups, etc.
- Hands-on experience in data movement/data pipeline platforms

Primary must-have skills:
- Experience in distributed systems and data-intensive applications
- Experience in data pipeline design / ETL frameworks
- Experience in data ingestion/processing frameworks such as Apache Flink, Apache Spark, Apache Kafka (see the sketch after this listing)
- Experience in container orchestration tools like Kubernetes, managed Kubernetes
- Experience in Big Data formats like Apache Parquet, Apache ORC and Apache Avro
- Experience in at least one OLAP database

Good to have:
- Orchestration tools: Apache Airflow, Dagster, Prefect, Oozie
- Data modeling: logical/conceptual data models, snowflake vs. star schema, SCD types, etc.
- Experience in data quality / data lineage / data governance
- Hands-on experience in coding (Python, Java, Spring Boot); ability to drive teams through massive refactoring exercises and improve coding standards across large code bases
- Experience in providing solutions/architecture for modernization
- Experience in compute and storage architectures
- Experience in data visualization tools like Apache Superset, Grafana, etc.
- Experience in data storage formats (Iceberg/Delta), data retrieval (Databricks/Snowflake) and orchestration (Airflow)
- Knowledge of cloud technologies: AWS / on-prem cloud, or Docker containers and Kubernetes (K8s)

Even better if you have one or more of the following:
- A Master's degree
- Experience with object-oriented programming, SOA, modular code development, design patterns, multi-threading
- Contribution to industry forums (e.g., ETSI, 3GPP, TMForum, MEF)
- Knowledge of wireline and wireless networks and services
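The following is an illustrative sketch, not this employer's code: a minimal Apache Flink DataStream job consuming a Kafka topic, representative of the data-ingestion framework experience the listing asks for. The topic, broker address and group id are made up, and the flink-connector-kafka dependency is assumed.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class NetworkEventsIngestJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source with placeholder topic, broker and group names.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("network.events")
                .setGroupId("network-events-ingest")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Print is a stand-in sink; a real job would write to Parquet/Iceberg or an OLAP store.
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source")
           .print();

        env.execute("network-events-ingest");
    }
}
```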

Posted 3 months ago


6 - 10 years

10 - 19 Lacs

Chennai, Pune

Hybrid


Location: Pune/Chennai. Immediate joiners within 2-3 weeks.
- Strong system design experience with data structures/algorithms
- Strong working experience in Java programming, including Java 8 and multithreading features; Java 15 knowledge is a plus
- Strong experience and knowledge in Spring / Spring Boot (creating endpoints, integrations, CRUD operations, etc.)
- Strong experience and understanding of OOP concepts
- Strong experience in event-driven architecture and messaging products like Apache Kafka, ActiveMQ, RabbitMQ, etc. (see the sketch after this listing)
- Good working experience in SQL and NoSQL databases (use cases, querying, joins, triggers, etc.)
- Low-level design, API design and database table design experience is required
- General awareness of architecture and design patterns
- Experience in Docker, Kubernetes, cloud platforms and DevOps is an added advantage
- Good experience in the software lifecycle (waterfall/agile/others) and processes
- Strong analytical and problem-solving skills
- Good communication skills
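For illustration of the event-driven messaging experience mentioned above (all names are placeholders, not the employer's system): a minimal Spring Boot Kafka listener that consumes order events from a topic. It assumes spring-kafka is on the classpath and configured with a consumer factory.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderEventsListener {

    // Spring Kafka invokes this method for each record on the topic.
    @KafkaListener(topics = "orders.created", groupId = "order-service")
    public void onOrderCreated(String payload) {
        // A real handler would deserialize the payload and update state or trigger workflows.
        System.out.println("Received order event: " + payload);
    }
}
```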

Posted 3 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
