
10 KSQL Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 - 17.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Who we are

We're a leading, global security authority that's disrupting our own category. Our encryption is trusted by the major ecommerce brands, the world's largest companies, the major cloud providers, entire countries' financial systems, entire internets of things, and even the little things like surgically embedded pacemakers. We help companies put trust - an abstract idea - to work. That's digital trust for the real world.

Job summary

In the Product Engineering group, you will have the opportunity to be part of a team of engineering and product management experts working to deliver new, innovative, world-class solutions for our large-scale enterprise customers. We are innovating and creating security technologies to solve complex problems for our large customers. The group is high-talent and high-energy, pushing new technologies that must achieve the best performance and best user experience in the marketplace.

What you will do

- Lead a team of development engineers toward project/program goals.
- Be involved in all aspects of product design and completely own the technical roadmap for the product to make it best-of-breed.
- Develop functional processes and operational policies within the area managed.
- Contribute to the development and achievement of organizational goals and objectives.
- Provide solutions for a wide range of complex problems; independently determine and develop approaches to solutions under only limited direction.
- Maintain a general understanding of the business environment; foster teamwork and collaboration within and across work groups.
- Learn and apply new tools and applications, with full use and application of standard engineering principles, theories, and concepts.
- Interface with senior management to report on project and program milestones and to present project needs.
- Develop and provide challenging yet appropriate assignments, evaluate work, and communicate progress toward career development and goals.
- Assist in developing and communicating the organization's vision and strategic direction; serve as functional consultant and technical leader.
- Be accountable for HR processes and actions.

What you will have

- Strong academic credentials (a master's in computer science will be an added plus).
- 15+ years of strong product development experience, including 5+ years in development management; proven people management skills preferred.
- Knowledge and hands-on experience in applying strong development practices.
- Superlative analytical and problem-solving skills.
- Very strong development background with hands-on knowledge of large-scale, distributed, and complex system design and implementation.
- Hands-on experience with J2EE-based, microservices-based architecture; large-scale cloud application deployments; large-scale data processing; modern front-end web design patterns and frameworks; and the design, development, deployment, and monitoring of enterprise software.
- Ability to select and collect the correct metrics, identify trends, and make forecasts.
- Understanding of and hands-on experience with distributed Agile/Scrum process environments.
- Security domain / security testing knowledge a big plus.
- Strong communication and presentation skills; self-driven, innovative, and proactive.
- A comfortable working knowledge of Java, Golang, and web UI technologies, and comfortable system-level knowledge of Windows and Linux.

Nice to have

- Knowledge of Kafka, KSQL, Elasticsearch, and Cassandra is desirable.

Benefits: Generous time-off policies. Top-shelf benefits. Education, wellness, and lifestyle support.

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 10 years of experience in the field, with a strong background in Kafka Streams / KSQL architecture and the associated clustering model. Your expertise should include solid Java programming skills, along with best practices in development, automation testing, and streaming APIs. Practical experience in scaling Kafka, KStreams, and Connector infrastructures is required, as is the ability to optimize the Kafka ecosystem for specific use cases and workloads.

As a developer, you should have hands-on experience building producer and consumer applications using the Kafka API, and proficiency in implementing KStreams components. You should also have developed KStreams pipelines and deployed KStreams clusters. Experience developing KSQL queries and understanding the best practices of using KSQL vs. KStreams is essential.

Strong knowledge of the Kafka Connect framework is necessary, including experience with various connector types such as HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, and Salesforce, and the ability to support wire-format translations. Familiarity with connectors available from Confluent and the community, as well as hands-on experience designing, writing, and operationalizing new Kafka connectors using the framework, is a plus. Knowledge of Schema Registry is also beneficial.

Nice-to-have qualities include providing thought leadership for the team, excellent verbal and written communication skills, being a good team player, and a willingness to go the extra mile to support the team.

In terms of educational qualifications, a four-year college degree in Science, Engineering, Technology, Business, or Humanities is required. Candidates with a master's degree and/or certifications in the relevant technologies are preferred. The working mode for this position is hybrid, full-time (three days per week in the office), and the notice period is a maximum of 30 days.
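For illustration, here is a minimal sketch of the producer and consumer applications this role describes, written against the standard Kafka Java client. It assumes a local broker on localhost:9092 and a hypothetical "orders" topic; the key and payload are invented for the example, and a real deployment would add error handling and schema-based serialization (e.g., Avro with Schema Registry) rather than plain strings.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrdersProducerConsumer {
    public static void main(String[] args) {
        // Producer: write one record to a hypothetical "orders" topic.
        Properties p = new Properties();
        p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
            producer.send(new ProducerRecord<>("orders", "order-1001", "{\"amount\": 42.50}"));
        } // close() flushes any buffered records

        // Consumer: read records back from the same topic.
        Properties c = new Properties();
        c.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        c.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-reader");
        c.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        c.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        c.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
            consumer.subscribe(List.of("orders"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> r : records) {
                System.out.printf("key=%s value=%s%n", r.key(), r.value());
            }
        }
    }
}
```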

Posted 3 days ago

Apply

12.0 - 14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description

Global Technology Partners is a premier partner for digital transformation, with a diverse team of software engineering experts in the US and India. They combine strategic thinking, innovative design, and robust engineering to deliver exceptional results for their clients.

Job Summary

We are seeking a highly experienced and visionary Principal/Lead Java Architect to play a pivotal role in designing and evolving our next-generation, high-performance, scalable event-driven platforms. This role demands deep expertise in Java, extensive experience with Kafka as a core component of event streaming architectures, and a proven track record of leading architectural design and implementation across complex enterprise systems. You will be instrumental in defining technical strategy, establishing best practices, and mentoring engineering teams to deliver robust and resilient solutions.

Key Responsibilities

Architectural Leadership:
- Lead the design, development, and evolution of highly scalable, resilient, and performant event-driven architectures using Java and Kafka.
- Define architectural patterns, principles, and standards for event sourcing, CQRS, stream processing, and microservices integration with Kafka.
- Drive technical vision and strategy for our core platforms, ensuring alignment with business objectives and the long-term technology roadmap.
- Conduct architectural reviews, identify technical debt, and propose solutions for continuous improvement.
- Stay abreast of emerging technologies and industry trends, evaluating their applicability and recommending adoption where appropriate.

Design & Development:
- Design and implement robust, high-throughput Kafka topics, consumers, producers, and streams (Kafka Streams/KSQL).
- Architect and design Java-based microservices that integrate effectively with Kafka for event communication and data synchronization.
- Lead the selection and integration of appropriate technologies and frameworks for event processing, data serialization, and API development.
- Develop proofs of concept (POCs) and prototypes to validate architectural choices and demonstrate technical feasibility.
- Contribute hands-on to critical-path development when necessary, demonstrating coding excellence and leading by example.

Kafka Ecosystem Expertise:
- Deep understanding of Kafka internals, distributed systems concepts, and high-availability configurations.
- Experience with Kafka Connect for data integration, Schema Registry for data governance, and KSQL/Kafka Streams for real-time stream processing.
- Proficiency in monitoring, optimizing, and troubleshooting Kafka clusters and related applications.
- Knowledge of Kafka security best practices (authentication, authorization, encryption).

Technical Governance & Mentorship:
- Establish and enforce architectural governance, ensuring adherence to design principles and coding standards.
- Mentor and guide engineering teams on best practices for event-driven architecture, Kafka usage, and Java development.
- Foster a culture of technical excellence, collaboration, and continuous learning within the engineering organization.
- Communicate complex technical concepts effectively to both technical and non-technical stakeholders.

Performance, Scalability & Reliability:
- Design for high availability, fault tolerance, and disaster recovery.
- Define and implement strategies for performance optimization, monitoring, and alerting across the event-driven ecosystem.
- Ensure solutions scale to handle significant data volumes and transaction rates.
Required Skills & Experience:
- 12+ years of progressive experience in software development, with at least 5 years in an Architect role designing and implementing large-scale enterprise solutions.
- Expert-level proficiency in Java (Java 8+, Spring Boot, Spring Framework).
- Deep and extensive experience with Apache Kafka: designing and implementing Kafka topics, producers, and consumers; hands-on experience with the Kafka Streams API or KSQL for real-time stream processing; familiarity with Kafka Connect, Schema Registry, and Avro/Protobuf; understanding of Kafka cluster operations, tuning, and monitoring.
- Strong understanding of and practical experience with Event-Driven Architecture (EDA) principles and patterns: Event Sourcing, CQRS, Saga, Choreography vs. Orchestration.
- Extensive experience with microservices architecture principles and patterns.
- Proficiency in designing RESTful APIs and asynchronous communication mechanisms.
- Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra).
- Solid understanding of cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes).
- Experience with CI/CD pipelines (e.g., Jenkins, GitLab CI, Azure DevOps).
- Strong problem-solving skills, analytical thinking, and attention to detail.
- Excellent communication, presentation, and interpersonal skills.
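As a rough sketch of the "Kafka Streams API for real-time stream processing" requirement above, the topology below filters events from one topic into another using the Kafka Streams DSL. The "payments" and "payments-audited" topic names and the string-matching predicate are assumptions for illustration; real logic would parse the payload properly rather than substring-match.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaymentEventsTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-events-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Route records that carry an "amount" field to an audit topic.
        KStream<String, String> payments = builder.stream("payments");
        payments
            .filter((key, value) -> value != null && value.contains("\"amount\""))
            .to("payments-audited");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```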

Posted 3 days ago

Apply

3.0 - 8.0 years

8 - 10 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

3 to 7 years of experience in Core and Advanced Java, along with experience developing robust applications using Spring Boot, working with Kafka, and applying design patterns. Hands-on skills in multithreading, databases, and basic DevOps tools.

Required candidate profile: Develop and maintain scalable Java-based applications using Spring Boot. Design and implement solutions using Core Java, Advanced Java, and multithreading techniques. Work with MySQL, PostgreSQL, and MongoDB databases.

Perks and benefits: To be disclosed post-interview.
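As a sketch of the Spring Boot plus Kafka combination this posting asks for, the snippet below uses the spring-kafka @KafkaListener annotation to consume messages. It assumes spring-boot-starter and spring-kafka on the classpath, a broker configured via the spring.kafka.bootstrap-servers property, and a hypothetical "orders" topic.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class OrderServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }
}

@Component
class OrderEventListener {
    // Consumes from a hypothetical "orders" topic; spring-kafka manages the
    // listener container threads and offset commits behind this annotation.
    @KafkaListener(topics = "orders", groupId = "order-service")
    public void onOrderEvent(String payload) {
        System.out.println("Received order event: " + payload);
    }
}
```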

Posted 2 weeks ago

Apply

13.0 - 20.0 years

30 - 45 Lacs

Pune

Hybrid

Hi, greetings from GSN! Pleasure connecting with you!

GSN has been in corporate search services, identifying and placing stellar, talented professionals with our reputed IT and non-IT clients in India, and successfully serving our clients' needs for the last 20 years. At present, GSN is hiring a DATA ENGINEERING - Solution Architect for one of our leading MNC clients. Please find the details below:

1. Work location: PUNE
2. Job role: DATA ENGINEERING - Solution Architect
3. Experience: 13+ yrs
4. CTC range: Rs. 35 LPA to Rs. 50 LPA
5. Work type: WFO hybrid

****** Looking for SHORT JOINERS ******

Job Description - who we are looking for:

Architectural Vision & Strategy: Define and articulate the technical vision, strategy, and roadmap for Big Data, data streaming, and NoSQL solutions, aligning with the overall enterprise architecture and business goals.

Required skills:
- 13+ years of progressive experience in software development, data engineering, and solution architecture roles, with a strong focus on large-scale distributed systems.
- Expertise in Big Data technologies. Apache Spark: deep expertise in Spark architecture, Spark SQL, Spark Streaming, performance tuning, and optimization techniques, with experience across batch and real-time data processing paradigms. Hadoop ecosystem: strong understanding of HDFS, YARN, Hive, and other related Hadoop components.
- Real-time data streaming. Apache Kafka: expert-level knowledge of Kafka architecture, topics, partitions, producers, consumers, Kafka Streams, KSQL, and best practices for high-throughput, low-latency data pipelines.
- NoSQL databases. In-depth experience with Couchbase (or MongoDB or Cassandra), including data modeling, indexing, querying (N1QL), replication, scaling, and operational best practices.
- API design & development: extensive experience designing and implementing robust, scalable, and secure APIs (RESTful, GraphQL) for data access and integration.
- Programming & code review: hands-on coding proficiency in at least one relevant language (Python, Scala, Java), with a preference for Python and/or Scala for data engineering tasks. Proven experience leading and performing code reviews, ensuring code quality, performance, and adherence to architectural guidelines.
- Cloud platforms: extensive experience designing and implementing solutions on at least one major cloud platform (AWS, Azure, GCP), leveraging their Big Data, streaming, and compute services.
- Database fundamentals: solid understanding of relational database concepts, SQL, and data warehousing principles.
- System design & architecture patterns: deep knowledge of architectural patterns (e.g., Microservices, Event-Driven Architecture, Lambda/Kappa Architecture, Data Mesh) and their application in data solutions.
- DevOps & CI/CD: familiarity with DevOps principles, CI/CD pipelines, infrastructure as code (IaC), and automated deployment strategies for data platforms.

****** Looking for SHORT JOINERS ******

If interested, don't hesitate to call NAK @ 9840035825 / 9244912300 for an IMMEDIATE response.

Best,
ANANTH | GSN | Google review: https://g.co/kgs/UAsF9W
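To make the Spark-plus-Kafka streaming requirement above concrete, here is a minimal Structured Streaming sketch in Java that reads a hypothetical "events" topic and echoes it to the console. It assumes the spark-sql-kafka connector dependency and a local broker; it is illustrative, not a production pipeline.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToConsole {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
            .appName("kafka-stream-poc")
            .master("local[*]") // local mode for the sketch; a cluster master in production
            .getOrCreate();

        // Read a hypothetical "events" topic as an unbounded streaming DataFrame.
        Dataset<Row> events = spark.readStream()
            .format("kafka")
            .option("kafka.bootstrap.servers", "localhost:9092")
            .option("subscribe", "events")
            .load();

        // Kafka rows carry binary key/value columns; cast them for display.
        StreamingQuery query = events
            .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
            .writeStream()
            .format("console")
            .start();

        query.awaitTermination();
    }
}
```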

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Key Skills: Confluent Kafka, Kafka Connect, Schema Registry, Kafka Brokers, KSQL, KStreams, Java/J2EE, Troubleshooting, RCA, Production Support.

Roles & Responsibilities:
- Design and develop Kafka pipelines.
- Perform unit testing of the code and prepare test plans as required.
- Analyze, design, and develop programs in a development environment.
- Support applications and jobs in the production environment for issues or failures.
- Develop operational documents for applications, including DFD, ICD, HLD, etc.
- Troubleshoot production issues and provide solutions within the defined SLA.
- Prepare RCA (Root Cause Analysis) documents for production issues.
- Provide permanent fixes for production issues.

Experience Requirement:
- 5-10 years of experience working with Confluent Kafka.
- Hands-on experience with Kafka Connect using Schema Registry.
- Strong knowledge of Kafka brokers and KSQL.
- Familiarity with Kafka Control Center, ZooKeeper, and KStreams is good to have.
- Experience with Java/J2EE is a plus.

Education: B.E., B.Tech.
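As an illustration of working with the Kafka Connect framework mentioned above, the sketch below registers a file-source connector through Connect's REST API (default port 8083) using Java's built-in HttpClient. The connector name, file path, and topic are hypothetical, and a production setup would more likely use a JDBC or SFTP connector with Schema Registry-backed converters.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical connector config: stream lines of a local file into a topic.
        String body = """
            {
              "name": "demo-file-source",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/tmp/demo-input.txt",
                "topic": "demo-file-lines"
              }
            }
            """;
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8083/connectors"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        // 201 Created indicates the Connect worker accepted the connector.
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```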

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a results-driven Data Project Manager (PM) responsible for leading data initiatives within a regulated banking environment, focusing on leveraging Databricks and Confluent Kafka. Your role involves overseeing the successful end-to-end delivery of complex data transformation projects aligned with business and regulatory requirements.

In this position, you will lead the planning, execution, and delivery of enterprise data projects using Databricks and Confluent. This includes developing detailed project plans, delivery roadmaps, and work breakdown structures, as well as ensuring resource allocation, budgeting, and adherence to timelines and quality standards.

Collaboration with data engineers, architects, business analysts, and platform teams is essential to align on project goals. You will act as the primary liaison between business units, technology teams, and vendors, facilitating regular updates, steering committee meetings, and issue/risk escalations.

Your technical oversight responsibilities include managing solution delivery on Databricks for data processing, ML pipelines, and analytics, as well as overseeing real-time data streaming pipelines via Confluent Kafka. Ensuring alignment with data governance, security, and regulatory frameworks such as GDPR, CBUAE, and BCBS 239 is crucial.

Risk and compliance management are key aspects of the role, including ensuring that regulatory-reporting data flows comply with local and international financial standards, and managing controls and audit requirements in collaboration with Compliance and Risk teams.

The required skills and experience for this role include 7+ years of project management experience within the banking or financial services sector, proven experience leading data platform projects, a strong understanding of data architecture, pipelines, and streaming technologies, experience managing cross-functional teams, and proficiency in Agile/Scrum and Waterfall methodologies. Technical exposure to Databricks (Delta Lake, MLflow, Spark), Confluent Kafka (Kafka Connect, kSQL, Schema Registry), Azure or AWS cloud platforms, integration tools, CI/CD pipelines, and Oracle ERP implementation is expected.

Preferred qualifications include PMP/PRINCE2/Scrum Master certification, familiarity with regulatory frameworks, and a strong understanding of data governance principles. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.

Key performance indicators for this role include on-time, on-budget delivery of data initiatives, uptime and SLAs of data pipelines, user satisfaction, and compliance with regulatory milestones.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Chennai, Bengaluru

Work from Office

Data Engineer: Experienced KStreams + KSQL developer with in-depth knowledge of specific client systems: the TAHI Contract and Application and ISP Contract and Application modules. Performs data analysis and writes code to implement functional requirements per the LLD and client processes. Minimum skill level in this specific area: current roles require 5+ years of experience plus insurance domain experience. These are technical roles, and the prime requirement is KStreams / Java / ksqlDB / Kafka.
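As a sketch of the KStreams/ksqlDB work described, assuming for illustration a "claims" topic keyed by policy id, the topology below counts claim events per policy into a changelog-backed KTable; in ksqlDB the equivalent would be a CREATE TABLE ... GROUP BY aggregation over the same stream.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class ClaimsPerPolicy {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "claims-per-policy");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Count claim events per policy id (the record key) into a stateful KTable.
        KTable<String, Long> counts = builder.<String, String>stream("claims")
            .groupByKey()
            .count();
        counts.toStream().to("claims-count-by-policy",
            Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```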

Posted 1 month ago

Apply

7.0 - 10.0 years

1 Lacs

Remote, India

On-site

Job Responsibilities:
- Develop and maintain data pipelines for large-scale data processing.
- Work with streaming data technologies, including Kafka, kSQL, and MirrorMaker.
- Design and implement near-real-time data streaming solutions.
- Optimize ETL processes for performance, scalability, and reliability.
- Collaborate with cross-functional teams to ensure seamless data integration.
- Ensure data quality, security, and compliance with best practices.

Mandatory Skills & Qualifications:
- At least 7 years of experience in data-focused development projects.
- Expertise in the Kafka framework, including kSQL and MirrorMaker.
- Proficiency in at least one programming language: Groovy or Java.
- Strong knowledge of data structures, ETL design, and storage optimization.
- Hands-on experience in real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi, or similar frameworks.
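A minimal consume-transform-produce loop like the one below is the simplest form of the streaming ETL pipeline this role describes. The topic names and the trivial normalization step are placeholders; frameworks such as Kafka Streams, Spark, or NiFi would normally handle offsets, scaling, and fault tolerance instead of a hand-rolled loop.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleEtlLoop {
    public static void main(String[] args) {
        Properties c = new Properties();
        c.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        c.put(ConsumerConfig.GROUP_ID_CONFIG, "etl-normalizer");
        c.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        c.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Properties p = new Properties();
        p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaConsumer<String, String> in = new KafkaConsumer<>(c);
             KafkaProducer<String, String> out = new KafkaProducer<>(p)) {
            in.subscribe(List.of("raw-events"));
            while (true) { // runs until the process is stopped
                for (ConsumerRecord<String, String> r : in.poll(Duration.ofMillis(500))) {
                    // "Transform" step: trim and lower-case the payload before re-publishing.
                    String cleaned = r.value() == null ? "" : r.value().trim().toLowerCase();
                    out.send(new ProducerRecord<>("clean-events", r.key(), cleaned));
                }
            }
        }
    }
}
```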

Posted 1 month ago

Apply

5.0 - 10.0 years

2 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Job Description:
- Design, develop, and manage Kafka-based data pipelines.
- Architect and create reference architectures for Kafka implementations.
- Maintain the availability, performance, and security of the Kafka infrastructure; troubleshoot Kafka-related issues.
- Strong understanding of secure deployment of Kafka solutions.
- Provide backup and recovery and problem-determination strategies for projects.
- Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
- Provide administration and operations support for the Kafka platform, including provisioning, access lists, Kerberos, and SSL configurations.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices.
- Automate routine tasks using scripts or automation tools to lessen manual work, decrease the chance of human error, and boost system reliability, using tools such as BladeLogic, Ansible, Chef, Jenkins, and GitLab.

Good to have: Capable of working independently and handling multiple projects at a time, with good analytical, communication, and organizational skills.
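For the topic-provisioning side of the administration work above, here is a small AdminClient sketch that creates a topic programmatically. The topic name, partition count, and replication factor are placeholders: replication factor 1 only suits a single local broker, while production clusters typically use 3.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // 3 partitions, replication factor 1 for a single local broker.
            NewTopic topic = new NewTopic("audit-events", 3, (short) 1);
            admin.createTopics(List.of(topic)).all().get();
            System.out.println("Existing topics: " + admin.listTopics().names().get());
        }
    }
}
```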

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
