
7 KSQL Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 10.0 years

1 Lac

Remote, India

On-site


Job Responsibilities:
- Develop and maintain data pipelines for large-scale data processing.
- Work with streaming data technologies, including Kafka, kSQL, and MirrorMaker.
- Design and implement near real-time data streaming solutions.
- Optimize ETL processes for performance, scalability, and reliability.
- Collaborate with cross-functional teams to ensure seamless data integration.
- Ensure data quality, security, and compliance with best practices.

Mandatory Skills & Qualifications:
- At least 7 years of experience in data-focused development projects.
- Expertise in the Kafka framework, including kSQL and MirrorMaker.
- Proficiency in at least one programming language: Groovy or Java.
- Strong knowledge of data structures, ETL design, and storage optimization.
- Hands-on experience in real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi, or similar frameworks.
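The near real-time pipelines described above are what kSQL expresses directly. A minimal sketch, assuming a hypothetical `clickstream` topic already exists (stream, topic, and column names are all illustrative):

```sql
-- Declare a stream over an existing Kafka topic (names are illustrative).
CREATE STREAM clickstream (
    user_id VARCHAR KEY,
    page VARCHAR,
    ts BIGINT
) WITH (
    KAFKA_TOPIC = 'clickstream',
    VALUE_FORMAT = 'JSON'
);

-- Near real-time aggregation: page views per user in 1-minute windows.
CREATE TABLE page_views_per_minute AS
    SELECT user_id, COUNT(*) AS views
    FROM clickstream
    WINDOW TUMBLING (SIZE 1 MINUTE)
    GROUP BY user_id
    EMIT CHANGES;
```

The second statement runs as a persistent query, continuously materializing the windowed counts as new events arrive.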

Posted 1 week ago

Apply

5.0 - 10.0 years

2 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Job Description:
- Design, develop, and manage Kafka-based data pipelines.
- Architect and create reference architectures for Kafka implementations.
- Maintain the availability, performance, and security of Kafka infrastructure; troubleshoot Kafka-related issues.
- Strong understanding of secure deployment of Kafka solutions.
- Provide backup & recovery and problem-determination strategies for projects.
- Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
- Provide administration and operations support for the Kafka platform: provisioning, access lists, Kerberos, and SSL configurations.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices.
- Automate routine tasks using scripts or automation tools to lessen manual work, decrease the chance of human error, and boost system reliability, e.g., provisioning with BladeLogic, Ansible, Chef, Jenkins, and GitLab.

Good-to-Have:
- Capable of working independently and handling multiple projects at a time, with good analytical, communication, and organizational skills.
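As a sketch of the provisioning duties above: ksqlDB can create the backing Kafka topic when a stream is declared with explicit sizing, and it exposes basic operational introspection from its CLI. Names, partition counts, and replica counts below are hypothetical:

```sql
-- Declaring a stream with explicit PARTITIONS/REPLICAS also provisions
-- the underlying Kafka topic if it does not yet exist (values illustrative).
CREATE STREAM orders (
    order_id VARCHAR KEY,
    amount DOUBLE
) WITH (
    KAFKA_TOPIC = 'orders',
    PARTITIONS = 6,
    REPLICAS = 3,
    VALUE_FORMAT = 'AVRO'
);

-- Operational introspection from the ksqlDB CLI:
SHOW TOPICS;
SHOW QUERIES;
DESCRIBE orders EXTENDED;
```

Broker-level tasks such as ACLs, Kerberos, and SSL are configured on the cluster itself rather than through kSQL.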

Posted 1 week ago

Apply

6 - 11 years

8 - 13 Lacs

Bengaluru

Work from Office


Skills: Apache Kafka or Confluent Kafka; optimization and tuning; Schema Registry, KSQL, and Connectors; replication across organizations and Cluster Linking.
Notice Period: 0-30 days
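Connector management of the kind listed above can be done declaratively from ksqlDB. A hedged sketch assuming a hypothetical Postgres source; the connector class is the standard Confluent JDBC source connector, but the connection details are made up:

```sql
-- ksqlDB can manage Kafka Connect connectors declaratively
-- (connection URL, topic prefix, and column name are illustrative).
CREATE SOURCE CONNECTOR jdbc_source WITH (
    'connector.class' = 'io.confluent.connect.jdbc.JdbcSourceConnector',
    'connection.url'  = 'jdbc:postgresql://db.example.com:5432/sales',
    'topic.prefix'    = 'pg_',
    'mode'            = 'incrementing',
    'incrementing.column.name' = 'id'
);

SHOW CONNECTORS;
```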

Posted 2 months ago

Apply

8 - 10 years

27 - 32 Lacs

Hyderabad

Work from Office


Your Primary Responsibilities:
- Lead technical processes and designs, considering reliability, data integrity, maintainability, reuse, extensibility, usability, and scalability.
- Collaborate with infrastructure partners to identify and deploy optimal hosting environments.
- Define scalability and performance criteria for assigned applications.
- Ensure applications meet performance, privacy, and security requirements.
- Tune application performance to eliminate and reduce issues.
- Verify test plans to ensure compliance with performance and security requirements.
- Support business and technical presentations in relation to technology platforms and business solutions.
- Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior.
- Help develop solutions that balance cost and delivery while meeting business requirements.
- Implement technology-specific best practices that are consistent with corporate standards.
- Partner with multi-functional teams to ensure the success of product strategy and project deliverables.
- Manage the software development process and drive new technical and business process improvements.
- Estimate total costs of modules/projects, covering both hours and expenses.
- Research and evaluate specific technologies and applications, and contribute to the solution design.
- Construct application architecture encompassing end-to-end designs.

Qualifications:
- Minimum 8-10 years of related experience.
- Bachelor's degree preferred, or equivalent experience.

Talents Needed for Success:
- Hands-on experience in software development using design patterns, Java, TypeScript, Java EE, Spring Boot, Angular 8+, JMS, REST APIs, PL/SQL, and Python.
- Experience with microservices and layered (SOA/MVC) architecture, on-prem and on-cloud (AWS preferred).
- Familiarity with developing and running applications in Windows and Linux environments.
- Expertise in deploying scalable solutions in Kubernetes/Docker containers that are highly resilient and perform well in an environment spanning legacy systems and future-centric microservice architecture.
- Demonstrable experience in software development using CI/CD tools, especially Git, Bitbucket, Maven, Jenkins, and Jira.
- Experience with development tools such as Visual Studio, IntelliJ, or Eclipse.
- Demonstrated capability working with middleware such as IBM MQ, Solace, Tomcat, Liberty server, WebSphere, WebLogic, or JBoss application servers.
- Familiarity with relational databases, including DB2 or Oracle.
- Experience with microservices and event-driven architecture.
- Experience with Apache Kafka (or Confluent Kafka) and Kafka APIs and tooling (e.g., Kafka Connect, KStreams, KSQL).
- Proficiency in all phases of the system development life cycle, including project planning, analysis, design, development, and testing.
- Solid focus on software testing with JUnit, Mockito, Jasmine, and Karma.
- Familiarity with different software development methodologies (Waterfall, Agile, Scrum, Kanban).

Additional Qualifications:
- Strong ability to gather and analyze requirements and translate them into technical specifications.
- Writing and implementing unit test scenarios/cases to ensure code quality and reliability.
- Deep understanding of all lifecycle components (code, test, deploy).
- Ability to present designs to peers and the Product Owner for approval.
- Fixing and debugging code to resolve technical issues.
- Optimizing application performance to ensure efficient use of software resources.
- Good verbal and written communication and interpersonal skills.
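The KStreams/KSQL tooling mentioned above is commonly used for stream-table enrichment joins in event-driven architectures. A minimal, self-contained ksqlDB sketch; all topic, stream, and column names are hypothetical:

```sql
-- Reference data as a table, keyed by customer id (names illustrative).
CREATE TABLE customers (
    customer_id VARCHAR PRIMARY KEY,
    name VARCHAR
) WITH (KAFKA_TOPIC = 'customers', PARTITIONS = 3, VALUE_FORMAT = 'JSON');

-- Event data as a stream.
CREATE STREAM orders (
    order_id VARCHAR KEY,
    customer_id VARCHAR,
    amount DOUBLE
) WITH (KAFKA_TOPIC = 'orders', PARTITIONS = 3, VALUE_FORMAT = 'JSON');

-- Stream-table join: enrich each order event with the customer's name.
CREATE STREAM enriched_orders AS
    SELECT o.order_id, c.name, o.amount
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    EMIT CHANGES;
```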

Posted 2 months ago

Apply

10 - 12 years

32 - 37 Lacs

Bengaluru

Work from Office


Are you passionate about working with innovative technologies? Do you have experience in DevOps engineering, with a strong focus on Azure DevOps and Terraform? We are looking for a talented DevOps Engineer to join our SAP Development & Integration CoE team at Novo Nordisk. If you are ready for the next step in your career and want to be part of a global healthcare company that is making a difference in the lives of millions, read on and apply today for a life-changing career.

The position:
As a Senior IT Developer at Novo Nordisk, you will develop and maintain integrations and APIs using Confluent Kafka as a message broker. You will have the opportunity to:
- Design, build, and maintain CI/CD pipelines using Azure DevOps.
- Develop and manage infrastructure as code (IaC) using Terraform.
- Implement software development best practices to ensure high-quality, scalable, and secure solutions.
- Design, develop, and maintain Kafka-based solutions, and optimize them for performance, scalability, and availability.
- Work with cross-functional teams to understand business requirements and translate them into technical requirements.
- Develop and maintain documentation for Kafka-based solutions.
- Troubleshoot and debug issues in Kafka-based solutions.
- Collaborate with global teams to define and deliver projects.

Qualifications:
- Master's or Bachelor's degree in computer science, IT, or a related field, with 10+ years of total experience.
- 6+ years of relevant experience as a DevOps Engineer, with a strong focus on Azure DevOps and Terraform.
- Solid understanding of software development best practices.
- Basic knowledge of Confluent Kafka, AWS, and integrations in general.
- Knowledge of Kafka architecture, Kafka connectors, and Kafka APIs.
- Experience implementing Kafka-based solutions in a cloud environment (AWS, Azure, Google Cloud, etc.).
- Ability to troubleshoot and debug complex issues in Kafka-based solutions.
- Experience with platform technologies such as Confluent Cloud Kafka, AWS EKS, Kafka Connect, KSQL, and Schema Registry, and with security.
- Experience with CI/CD tools such as Azure DevOps, Azure Pipelines, Terraform, and Helm charts.
- Experience using Agile, Scrum, and iterative development practices.
- Good communication skills and the ability to work with global teams to define and deliver on projects.
- Self-driven fast learner with a high sense of ownership.

Posted 2 months ago

Apply

5 - 7 years

7 - 9 Lacs

Mumbai

Hybrid


- 5-7 years of Confluent Kafka platform experience.
- Administration of the Confluent Kafka platform on-prem and in the cloud.
- Knowledge of Confluent Kafka operations: administration of topics, partitions, consumer groups, and kSQL queries to maintain optimal performance.
- Knowledge of the Kafka ecosystem, including Kafka brokers, ZooKeeper/KRaft, kSQL, Connectors, Schema Registry, Control Center, and platform interoperability.
- Knowledge of Kafka Cluster Linking and replication.
- Experience administering multi-regional Confluent clusters.
- System performance: knowledge of performance tuning of messaging systems and clients to meet application requirements.
- Operating systems: RedHat Linux.
Employee Type: Permanent
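Day-to-day administration of kSQL queries, as described above, typically revolves around a few CLI statements. An illustrative sketch; the query ID and the materialized table name are hypothetical and assume a persistent aggregation query is already running:

```sql
-- List running persistent queries and inspect one's execution plan
-- (the CTAS_... id is hypothetical; real ids come from SHOW QUERIES).
SHOW QUERIES;
EXPLAIN CTAS_PAGE_VIEWS_0;

-- Pull query: a point lookup against an assumed materialized table.
SELECT views FROM page_views_per_minute WHERE user_id = 'u123';

-- Terminate a misbehaving persistent query by id.
TERMINATE CTAS_PAGE_VIEWS_0;
```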

Posted 3 months ago

Apply

4 - 6 years

6 - 9 Lacs

Bengaluru

Work from Office


- Must have experience creating Kafka producers/consumers.
- Experience in Java, Spring Boot, and microservices.
- Must understand how Kafka works and its basic architecture.
- Hands-on experience creating Kafka topics/brokers.

Required Candidate Profile:
- Must be aware of the various configuration parameters exposed by Kafka for tuning producers/consumers.
- Must have knowledge of handling exceptions and failure scenarios.
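Although this role asks for Java producers/consumers, the same produce/consume round-trip can be sketched in ksqlDB; stream, topic, and column names are hypothetical:

```sql
-- Create a stream (and its backing topic, since PARTITIONS is given).
CREATE STREAM readings (
    sensor_id VARCHAR KEY,
    temp DOUBLE
) WITH (KAFKA_TOPIC = 'readings', PARTITIONS = 1, VALUE_FORMAT = 'JSON');

-- "Producer" side: write a record into the backing topic.
INSERT INTO readings (sensor_id, temp) VALUES ('s1', 21.5);

-- "Consumer" side: push query that receives new records as they arrive.
SELECT sensor_id, temp FROM readings EMIT CHANGES;
```

In a Java client the equivalent roles are played by `KafkaProducer.send()` and a `KafkaConsumer` poll loop, with tuning exposed through producer/consumer configuration properties.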

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
