
14 Streams Jobs

JobPe aggregates listings for easy application access; you apply directly on the original job portal.

6.0 - 10.0 years

6 - 15 Lacs

Gurugram

Work from Office

Requirements Elicitation, Understanding, Analysis, & Management
Understand the project's vision and requirements, and contribute to the creation of the supplemental requirements, building the low-level technical specifications for a particular platform and/or service solution.

Project Planning, Tracking, & Reporting
Estimate the tasks and resources required to design, build, and test the code for assigned modules. Provide inputs for the detailed project schedule. Support the team in project planning, in evaluating risks, and in reprioritizing around unresolved issues. During development and testing, ensure that assigned parts of the project stay on track for schedule and quality. Note scope changes within the assigned modules and work with the team to reprioritize accordingly. Communicate regularly with the team about development changes, scheduling, and status; participate in project review meetings; and track and report progress for assigned modules.

Design
Create a detailed low-level design (LLD) for the assigned pieces, with possible alternative solutions. Ensure the LLD meets business requirements, submit it for review, and revise it based on the comments received from the team.

Development & Support
Build the code of high-priority and complex systems according to the functional specifications, detailed design, and maintainability, coding, and efficiency standards. Use code management processes and tools to avoid versioning problems. Ensure that the code does not affect the functioning of any external or internal systems. Perform peer reviews of code, and act as the primary reviewer of application code written by software engineers, recommending changes as required to ensure compliance with defined standards.

Testing & Debugging
Attend test design walkthroughs to help verify that the plans and conditions will exercise all functions and features effectively. Perform impact analysis for issues assigned to yourself and to software engineers. Actively assist with project- and code-level problem solving, such as suggesting paths to explore when test engineers or software engineers encounter a debugging problem, and escalate urgent issues.

Documentation
Review the code's technical documentation for accuracy, completeness, and usability. Document and maintain the reviews conducted and the unit test results.

Process Management
Adhere to the project and support processes, and comply with approved policies, procedures, and methodologies, such as the SDLC for different project sizes. Show responsibility for corporate funds, materials, and resources. Ensure adherence to SDLC and audit requirements.

Position Summary
As a Lead Collaboration Engineer at Guardian Life Insurance, you will be responsible for designing, building, testing, deploying, and supporting Microsoft 365 collaboration capabilities for 16,000 users globally.

You are
* An excellent problem solver
* A strong collaborator with team members and other teams
* A strong communicator, documenter, and presenter
* Strong in project ownership and execution, ensuring timely, quality delivery
* A continuous self-learner and subject matter expert for Microsoft 365

You have
* A Bachelor's degree in Computer Science or Information Technology, or significant relevant experience
* 5+ years of experience, preferably in a large financial services enterprise
* Expert-level experience with Microsoft 365: administration, Outlook/Exchange Online/Exchange Server, Teams, SharePoint Online/OneDrive, Power Automate, Viva Engage (Yammer), Stream, PowerShell scripting, advanced troubleshooting diagnostics, Copilot, Word, Excel, PowerPoint, OneNote, Visio, Project, Whiteboard, To Do, Planner, Lists, Viva Insights, Power Apps, Loop, Azure
* Intermediate-level experience with Proofpoint Email Protection or a similar email security service: administration, routing, allow/block lists, encryption, DLP, Send Securely, Secure Portal, SPF/DKIM/DMARC, delivery troubleshooting, incident response
* Knowledge of other complementary collaboration applications is desired: Zoom, BitTitan MigrationWiz, or ShareGate
* Strong knowledge of IT Service Management and ITIL, preferably using ServiceNow: incidents, tasks, problems, knowledge, CMDB, reporting, dashboards
* Proven ability to manage support and request tickets within SLAs and to drive Microsoft support cases to closure
* Knowledge of project management using waterfall and agile frameworks, with a proven ability to complete projects reliably and with quality
* Knowledge of networking and security: DNS, Active Directory, Entra ID (Azure AD) including conditional access policies, certificates, firewalls, proxies, cloud access security brokers (CASB), single sign-on (SSO), multi-factor authentication (MFA), data loss prevention (DLP), and identity and access management (IAM)
* Knowledge of endpoints, servers, and cloud: devices, operating systems, browsers, Intune, System Center, Nexthink, Amazon AWS, Azure
* Microsoft certifications are desired, preferably MS-900, MS-700, MS-721, MS-102

You will
* Deliver excellent support for collaboration capabilities to achieve service level agreements; participation in the team on-call support rotation is required
* Design, build, test, and deploy new collaboration capabilities to achieve strategic goals and key deliverables reliably and with quality; current goals are focused on Copilot and service improvements

Reporting Relationships
As our Collaboration Engineer, you will report administratively to our Delivery Manager / Head of IT, who reports to our Head of Infrastructure IT, and functionally to the Head of Collaboration Technology.

Location: This position can be based in Gurgaon.
For internal use only: R000106866
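
The email-security skills above include SPF/DKIM/DMARC and delivery troubleshooting. As a rough, hedged illustration of what checking those DNS records can look like (this sketch is not part of the posting), here is a minimal Python example using the third-party dnspython package; the domain is a placeholder.

```python
# Minimal SPF/DMARC record check (illustrative sketch, not from the posting).
# Assumes the third-party "dnspython" package: pip install dnspython
import dns.resolver

def txt_records(name: str) -> list[str]:
    """Return the TXT records for a DNS name, or [] if none exist."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []
    # A TXT answer may be split into several character strings; join them.
    return [b"".join(r.strings).decode() for r in answers]

def check_email_auth(domain: str) -> None:
    spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
    dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]
    print(f"{domain} SPF:   {spf or 'none found'}")
    print(f"{domain} DMARC: {dmarc or 'none found'}")

check_email_auth("example.com")  # placeholder domain
```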

Posted 22 hours ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.

This is an AWS Data/API Gateway Pipeline Engineer role, responsible for designing, building, and maintaining real-time, serverless data pipelines and API services. It requires extensive hands-on experience with Java, Python, Redis, DynamoDB Streams, and PostgreSQL, along with working knowledge of AWS Lambda and AWS Glue for data processing and orchestration. The position involves collaboration with architects, backend developers, and DevOps engineers to deliver scalable, event-driven data solutions and secure API services across cloud-native systems.

Key Responsibilities

API & Backend Engineering
* Build and deploy RESTful APIs using AWS API Gateway, Lambda, Java, and Python.
* Integrate backend APIs with Redis for low-latency caching and pub/sub messaging.
* Use PostgreSQL for structured data storage and transactional processing.
* Secure APIs using IAM, OAuth2, and JWT, and implement throttling and versioning strategies.

Data Pipeline & Streaming
* Design and develop event-driven data pipelines that use DynamoDB Streams to trigger downstream processing.
* Use AWS Glue to orchestrate ETL jobs for batch and semi-structured data workflows.
* Build and maintain Lambda functions to process real-time events and orchestrate data flows.
* Ensure data consistency and resilience across services, queues, and databases.

Cloud Infrastructure & DevOps
* Deploy and manage cloud infrastructure using CloudFormation, Terraform, or AWS CDK.
* Monitor system health and service metrics using CloudWatch, SNS, and structured logging.
* Contribute to CI/CD pipeline development for testing and deploying Lambda/API services.

So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; you are keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
* Bachelor's degree in Computer Science, Engineering, or a related field.
* Over 6 years of experience developing backend or data pipeline services using Java and Python.
* Strong hands-on experience with AWS API Gateway, Lambda, and DynamoDB Streams; Redis (caching, messaging); PostgreSQL (schema design, tuning, SQL); and AWS Glue for ETL jobs and data transformation.
* Solid understanding of REST API design principles, serverless computing, and real-time architecture.

Preferred Skills and Experience
* Familiarity with Kafka, Kinesis, or other message streaming systems
* Swagger/OpenAPI for API documentation
* Docker and Kubernetes (EKS)
* Git and CI/CD tools (e.g., GitHub Actions)
* Experience with asynchronous event processing, retries, and dead-letter queues (DLQs)
* Exposure to data lake architectures (S3, Glue Data Catalog, Athena)

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate and build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
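
For context on the event-driven pattern this role centers on (DynamoDB Streams triggering Lambda for downstream processing), here is a minimal, hedged Python sketch of a Lambda handler. It is illustrative only; the logging and downstream step are assumptions, not anything specified in the posting, though the record fields follow the standard DynamoDB Streams event shape.

```python
# Illustrative sketch: an AWS Lambda handler consuming DynamoDB Streams events.
# Assumes the function is wired to a table's stream via an event source mapping;
# the downstream logic is hypothetical, not from the job posting.
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handler(event, context):
    """Process a batch of DynamoDB Streams records."""
    for record in event.get("Records", []):
        event_name = record["eventName"]           # INSERT | MODIFY | REMOVE
        keys = record["dynamodb"].get("Keys", {})  # DynamoDB-typed key attributes
        if event_name in ("INSERT", "MODIFY"):
            new_image = record["dynamodb"].get("NewImage", {})
            # Hypothetical downstream step: forward the change for processing.
            logger.info("change %s: %s", event_name, json.dumps(new_image))
        elif event_name == "REMOVE":
            logger.info("deleted item with keys: %s", json.dumps(keys))
    # With partial batch responses enabled, an empty failure list marks
    # the whole batch as successfully processed.
    return {"batchItemFailures": []}
```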

Posted 1 week ago

Apply

5.0 - 7.0 years

35 - 55 Lacs

Bengaluru

Work from Office

Serko is a cutting-edge tech platform in global business travel & expense technology. When you join Serko, you become part of a team of passionate travellers and technologists bringing people together, using the world's leading business travel marketplace. We are proud to be an equal opportunity employer; we embrace the richness of diversity, showing up authentically to create a positive impact. There's an exciting road ahead of us, where travel needs real, impactful change. With offices in New Zealand, Australia, North America, and China, we are thrilled to be expanding our global footprint, landing our new hub in Bengaluru, India. With a rapid growth plan in place for India, we're hiring people from different backgrounds, experiences, abilities, and perspectives to help us build a world-class team and product.

As a Senior Principal Engineer, you'll play a key role in shaping our technical vision and driving engineering excellence across our product streams. Your leadership will foster a high-performance culture that empowers teams to build innovative solutions with real-world impact.

Requirements
Working closely with stream leadership (including the Head of Engineering, Senior Engineering Managers, Architects, and domain specialists), you'll provide hands-on technical guidance and help solve complex engineering challenges. As a Senior Principal Engineer, you'll also lead targeted projects and prototypes, shaping new technical approaches and ensuring our practices stay ahead of the curve.

What you'll do
* Champion best practices across engineering teams, embedding them deeply within the stream
* Proactively resolve coordination challenges within and across streams to keep teams aligned and unblocked
* Partner with Product Managers to ensure customer value is delivered in the most pragmatic and impactful way
* Lead or contribute to focused technical projects that solve high-priority problems
* Collaborate with cross-functional teams to define clear requirements, objectives, and timelines for key initiatives
* Explore innovative solutions through research and analysis, bringing fresh thinking to technical challenges
* Mentor engineers and share technical expertise to uplift team capability and growth
* Continuously evaluate and enhance system performance, reliability, and scalability
* Stay ahead of the curve by tracking industry trends, emerging technologies, and evolving best practices
* Drive continuous improvement across products and processes to boost quality, efficiency, and customer satisfaction
* Maintain strong communication with stakeholders to gather insights, provide updates, and incorporate feedback

What you'll bring to the team
* Strong proficiency in stream-specific technologies, tools, and programming languages
* Demonstrated expertise in specific areas of specialization related to the stream
* Excellent problem-solving skills and attention to detail
* Ability to lead teams through complex changes to engineering-related areas and maintain alignment across Product and Technology teams
* Effective communication and interpersonal skills
* Proven ability to work independently and collaboratively in a fast-paced environment
* Tertiary-level qualification in a relevant engineering discipline, or equivalent

Benefits
At Serko we aim to create a place where people can come and do their best work. This means you'll be operating in an environment with great tools and support, enabling you to perform at the highest level of your abilities, producing high-quality work and delivering innovative, efficient results. Our people are fully engaged, continuously improving, and encouraged to make an impact. Some of the benefits of working at Serko are:
* A competitive base pay
* Medical benefits
* A discretionary incentive plan based on individual and company performance
* A focus on development: access to a learning & development platform and the opportunity to own your career pathways
* A flexible work policy

Apply
Hit the 'apply' button now, or explore more about what it's like to work at Serko and all our global opportunities at www.Serko.com.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 - 1 Lacs

Ahmedabad, Chennai, Bengaluru

Hybrid

Job Summary:
We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics.

Key Responsibilities:
* Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake.
* Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools.
* Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching.
* Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives.
* Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions.
* Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring.
* Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data.
* Ensure data quality, reliability, and governance through testing and validation strategies.
* Document data flows, definitions, processes, and architecture.

Required Skills and Qualifications:
* 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems.
* 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel.
* Strong experience in SQL and performance tuning for complex queries and large datasets.
* Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts.
* Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.).
* Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc.
* Familiarity with Python or Java or Scala for data manipulation and pipeline development.
* Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps.
* Knowledge of data governance, data quality, and data security best practices.
* Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
* Snowflake SnowPro Core Certification or Advanced Architect Certification.
* Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake.
* Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.).
* Knowledge of Data Vault 2.0 or other advanced data modeling methodologies.
* Experience with data cataloging and metadata management tools (e.g., Alation, Collibra).
* Exposure to machine learning pipelines and data science workflows is a plus.
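
As a rough illustration of the Snowpipe/Streams/Tasks pattern this role calls for, the sketch below uses the snowflake-connector-python package to create a stream on a staging table and a scheduled task that merges captured changes. All object names and connection parameters are hypothetical placeholders, not details from the posting.

```python
# Illustrative sketch: change capture with a Snowflake Stream plus a scheduled Task.
# Requires: pip install snowflake-connector-python
# All names and credentials below are placeholders, not from the job posting.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="***",         # placeholder
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

statements = [
    # Capture inserts/updates/deletes on the staging table.
    "CREATE STREAM IF NOT EXISTS ORDERS_STREAM ON TABLE RAW_ORDERS",
    # Every 5 minutes, fold captured changes into the reporting table.
    """
    CREATE TASK IF NOT EXISTS MERGE_ORDERS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO CURATED.ORDERS t
      USING ORDERS_STREAM s ON t.ORDER_ID = s.ORDER_ID
      WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
      WHEN NOT MATCHED THEN INSERT (ORDER_ID, AMOUNT) VALUES (s.ORDER_ID, s.AMOUNT)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK MERGE_ORDERS_TASK RESUME",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```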

Posted 2 weeks ago

Apply

7.0 - 10.0 years

15 - 22 Lacs

Pune

Work from Office

As an experienced member of our Core Banking Base Development / Professional Services Group, you will be responsible for effective microservice development in Scala and delivery of our NextGen transformation / professional services projects and programs.

What You Will Do:
• Adhere to the processes followed for development in the program.
• Report status, and proactively identify issues to the Tech Lead and management team.
• Take personal ownership and accountability for delivering assigned tasks and deliverables within the established schedule.
• Facilitate a strong and supportive team environment that enables the team, as well as individual team members, to overcome any political, bureaucratic, and/or resource barriers to participation.
• Recommend and implement solutions. Be totally hands-on and able to work independently.

What You Will Need to Have:
• 4 to 8 years of recent hands-on experience in Scala and the Akka framework
• Technical skill set required:
o Hands-on experience in Scala development, including the Akka framework.
o A good understanding of Akka Streams.
o Test-driven development.
o Awareness of message brokers.
o Hands-on experience in the design and development of microservices.
o Good awareness of event-driven microservices architecture.
o gRPC protocol + Protocol Buffers.
o Hands-on experience with Docker containers.
o Hands-on experience with Kubernetes.
o Awareness of cloud-native applications.
o Jira, Confluence, Ansible, Terraform.
o Good knowledge of cloud platforms (preferably AWS) and their IaaS, PaaS, and SaaS solutions.
o Good knowledge of and hands-on experience with scripting languages like Batch and Bash; hands-on experience with Python would be a plus.
o Knowledge of integration and unit testing and behavior-driven development.
o Good problem-solving skills.
o Good communication skills.

What Would Be Great to Have:
• Experience integrating with third-party applications
• Agile knowledge
• A good understanding of configuration management
• Financial industry and core banking integration experience

Posted 2 weeks ago

Apply

6.0 - 11.0 years

4 - 9 Lacs

Bengaluru

Work from Office

SUMMARY
Job Role: Apache Kafka Admin
Experience: 6+ years
Location: Pune (preferred), Bangalore, Mumbai
Must-Have: The candidate should have 6 years of relevant experience with Apache Kafka.

Job Description:
We are seeking a highly skilled and experienced Senior Kafka Administrator to join our team. The ideal candidate will have 6-9 years of hands-on experience in managing and optimizing Apache Kafka environments. As a Senior Kafka Administrator, you will play a critical role in designing, implementing, and maintaining Kafka clusters to support our organization's real-time data streaming and event-driven architecture initiatives.

Responsibilities:
* Design, deploy, and manage Apache Kafka clusters, including installation, configuration, and optimization of Kafka brokers, topics, and partitions.
* Monitor Kafka cluster health, performance, and throughput metrics, and implement proactive measures to ensure optimal performance and reliability.
* Troubleshoot and resolve issues related to Kafka message delivery, replication, and data consistency.
* Implement and manage Kafka security mechanisms, including SSL/TLS encryption, authentication, authorization, and ACLs.
* Configure and manage Kafka Connect connectors for integrating Kafka with various data sources and sinks.
* Collaborate with development teams to design and implement Kafka producers and consumers for building real-time data pipelines and streaming applications.
* Develop and maintain automation scripts and tools for Kafka cluster provisioning, deployment, and management.
* Implement backup, recovery, and disaster recovery strategies for Kafka clusters to ensure data durability and availability.
* Stay up to date with the latest Kafka features, best practices, and industry trends, and provide recommendations for optimizing our Kafka infrastructure.

Requirements:
* 6-9 years of experience as a Kafka Administrator or in a similar role, with a proven track record of managing Apache Kafka clusters in production environments.
* In-depth knowledge of Kafka architecture, components, and concepts, including brokers, topics, partitions, replication, and consumer groups.
* Hands-on experience with Kafka administration tasks such as cluster setup, configuration, performance tuning, and monitoring.
* Experience with Kafka ecosystem tools and technologies such as Kafka Connect, Kafka Streams, and Confluent Platform.
* Proficiency in scripting languages such as Python, Bash, or Java.
* Strong understanding of distributed systems, networking, and Linux operating systems.
* Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues.
* Strong communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
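
To give a flavor of the scripted administration this role mentions (Python is one of the named scripting languages), here is a minimal, hedged sketch using the confluent-kafka package's AdminClient to create a topic with explicit partition, replication, and retention settings. The broker address, topic name, and settings are placeholders, not values from the posting.

```python
# Illustrative sketch: creating a Kafka topic with the confluent-kafka AdminClient.
# Requires: pip install confluent-kafka
# Broker address, topic name, and settings are placeholders, not from the posting.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # placeholder broker

topic = NewTopic(
    "orders.events",            # placeholder topic name
    num_partitions=6,
    replication_factor=3,
    config={
        "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # keep messages 7 days
        "cleanup.policy": "delete",
    },
)

# create_topics() is asynchronous: it returns a dict of topic name -> Future.
futures = admin.create_topics([topic])
for name, future in futures.items():
    try:
        future.result()  # raises KafkaException on failure (e.g. topic exists)
        print(f"created topic {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```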

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Role & responsibilities
Urgent hiring for one of the reputed MNCs.
Experience: 5+ years
Location: Pan India
Immediate joiners only.
Skills: Snowflake developer, PySpark, Python, API, CI/CD, cloud services, Azure, Azure DevOps

Job Description:
* Strong hands-on experience in Snowflake development, including Streams, Tasks, and Time Travel
* Deep understanding of Snowpark for Python and its application to data engineering workflows
* Proficient in PySpark, Spark SQL, and distributed data processing
* Experience with API development
* Proficiency in cloud services (preferably Azure, but AWS/GCP also acceptable)
* Solid understanding of CI/CD practices and tools like Azure DevOps, GitHub Actions, GitLab, or Jenkins for Snowflake
* Knowledge of Delta Lake, Data Lakehouse principles, and schema evolution is a plus
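
For a sense of the Snowpark for Python workflow named above, here is a minimal, hedged sketch that opens a Snowpark session and runs a small DataFrame transformation inside Snowflake. The connection parameters, table, and column names are invented placeholders.

```python
# Illustrative sketch: a small Snowpark for Python transformation.
# Requires: pip install snowflake-snowpark-python
# Connection parameters and object names are placeholders, not from the posting.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "my_account",   # placeholder
    "user": "my_user",         # placeholder
    "password": "***",         # placeholder
    "warehouse": "ETL_WH",
    "database": "ANALYTICS",
    "schema": "CURATED",
}

session = Session.builder.configs(connection_parameters).create()

# Aggregate order amounts per customer; the computation runs inside Snowflake.
orders = session.table("ORDERS")
totals = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by("CUSTOMER_ID")
          .agg(sum_("AMOUNT").alias("TOTAL_AMOUNT"))
)
totals.write.save_as_table("CUSTOMER_TOTALS", mode="overwrite")

session.close()
```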

Posted 3 weeks ago

Apply

5 - 10 years

20 - 30 Lacs

Bengaluru

Work from Office

Purpose of the Job:
The person who joins us as a Lead Product Engineer will work in the capacity of an individual contributor. He/she will work closely with the Product Owner to deliver high-quality, responsive web applications. As part of the job, he/she is expected to prepare artefacts to document the design and code, conduct design and code reviews for work done by the team, and mentor the junior engineers in the team.

Key Tasks
• Develop original algorithms, logic, and code, and ensure that they withstand any test
• Understand the difference between creating a product and working on a turnkey project, and write code accordingly
• Demonstrate significant abstraction skills to convert requirements into usable product features
• Create original algorithms and ideas to solve complex issues

Educational Background
• Bachelor's degree or higher in Computer Science engineering or related fields
• Agile certification
• Certifications in the financial domain

Preferred Experience
• 10 to 12 years of software development experience building web applications using Java (1.8+)/J2EE technology
• Expertise in the object-oriented programming paradigm and standard frameworks like Spring MVC and Hibernate
• Hands-on experience in building UI applications using React or Angular together with Spring and Hibernate
• Strong experience in database design and stored procedure development
• Proven experience in performance tuning of both online and batch applications
• Expertise in agile development methodologies and DevOps practices, including continuous integration, static code analysis, etc.
• Expertise in test-driven development and experience in rapid prototyping and testing with minimum viable products
• Experience in product implementation
• Experience working in financial industries and/or product development organizations building financial products

Key Skills and Competencies
• Teamwork
• Intellectual curiosity
• Financial business acumen
• Effective communication

Posted 1 month ago

Apply

4 - 8 years

5 - 15 Lacs

Hyderabad

Hybrid

Responsibilities:
* Perform software development and maintenance of entity-based microservices using Java/J2EE, including Java 1.8 features like lambdas, streams, and filters, with frameworks such as Spring Boot, Hibernate, Spring Cloud, Jenkins, Kafka, and REST.
* Perform software development on Angular, JavaScript, and React user interfaces.
* Implement test suites to test the developed functionality based on test scenarios proposed by the Product Owner, using test automation frameworks like JUnit, Mockito, Cucumber, etc., to ensure code quality and performance.
* Work in an agile environment with bi-weekly sprints and provide work updates in daily scrum calls, maintaining industry-level best practices for software development using a wide variety of the latest technologies and frameworks.
* Participate in software/hardware configuration, release, and installation tasks.
* Participate in technical planning and requirements-gathering phases, including designing, coding, testing, troubleshooting, and documenting engineering software applications.
* Demonstrate the ability to adapt to and work with team members of various experience levels.
* Develop REST APIs (microservices) to expose REST endpoints that will be consumed by end users as per internal business requirements.
* Deploy the code to OpenShift platforms through a continuous integration and continuous deployment (Jenkins) methodology after each successful implementation of a new feature.
* Participate in code reviews, testing, and debugging to ensure high-quality software.

Required Skills:
* Proficiency in Java
* Familiarity with Angular and JavaScript UIs
* Familiarity with the Spring Boot framework and REST APIs
* Strong problem-solving skills and willingness to learn
* Able to adapt to a fast-paced work environment
* Able to work independently and as part of a team
* Knowledge of microservices development and OpenShift
* Knowledge of agile development methodologies

Posted 2 months ago

Apply

8 - 13 years

18 - 19 Lacs

Bengaluru

Work from Office

Responsibilities:
* Maintain and improve the backend of existing mobile applications.
* Write reliable, well-structured, and testable code.
* Participate in regular code reviews.

Posted 2 months ago

Apply

4 - 9 years

15 - 27 Lacs

Chennai, Bengaluru, Gurgaon

Work from Office

Immediate positions. Please call 7208835287 / 9359055605.

Role & responsibilities
Java, Spring Boot, microservices, data structures and algorithms, multi-threading, Collections, Streams, Hibernate.
HackerRank, GFG, LeetCode: candidates need to have completed a minimum of 60 questions on any of the mentioned platforms.
Only B.Tech/M.Tech/MCA.

Posted 3 months ago

Apply

3 - 8 years

5 - 15 Lacs

Chennai

Work from Office

A Snowflake Architect is responsible for designing, implementing, and optimizing data solutions using the Snowflake Cloud Data Platform. They ensure scalability, security, and high performance in data warehousing, analytics, and cloud data solutions.

Role & responsibilities

1. Architecture & Design
* Design end-to-end data solutions using the Snowflake Cloud Data Platform.
* Define data architecture strategy, ensuring scalability and security.
* Establish best practices for Snowflake implementation, including data modeling, schema design, and query optimization.
* Design data lakes, data marts, and enterprise data warehouses (EDW) in Snowflake.

2. Data Engineering & Development
* Oversee ETL/ELT pipelines using Snowflake Snowpipe, Streams, Tasks, and stored procedures.
* Ensure efficient data ingestion, transformation, and storage using SQL, Python, or Scala.
* Implement data partitioning, clustering, and performance tuning for optimized query execution.

3. Security & Compliance
* Implement role-based access control (RBAC) and data governance policies.
* Ensure encryption, auditing, and data masking for security compliance.
* Define multi-cloud strategies (AWS, Azure, GCP) for Snowflake deployments.

4. Performance Optimization
* Optimize query performance and warehouse compute resources to reduce costs.
* Implement materialized views, query acceleration, and caching to improve performance.
* Monitor Snowflake usage, cost management, and auto-scaling capabilities.

5. Integration & Automation
* Integrate Snowflake with BI tools (Tableau, Power BI, Looker) and data lakes (S3, Azure Blob, GCS).
* Automate data workflows and pipeline orchestration using Airflow, dbt, or Snowflake Tasks.
* Implement CI/CD pipelines for data model deployments and schema changes.

6. Stakeholder Collaboration & Leadership
* Work closely with business analysts, data scientists, and IT teams to define requirements.
* Act as a technical advisor for Snowflake-related decisions and best practices.
* Provide mentorship and training to data engineers and analysts on Snowflake architecture.

Key Skills Required:
* Snowflake Data Warehouse (warehouses, Secure Data Sharing, multi-cluster architecture)
* SQL, Python, Scala (for data processing and scripting)
* ETL/ELT & data pipelines (Informatica, Talend, dbt, Airflow)
* Cloud services (AWS, Azure, GCP integration)
* Performance tuning (query optimization, Snowflake caching)
* Security & governance (RBAC, PII data masking, compliance)
* BI tools integration (Tableau, Power BI, Looker)
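
To illustrate the RBAC and data-masking responsibilities above, here is a hedged sketch that issues the relevant Snowflake SQL through snowflake-connector-python. The role, table, and column names are invented placeholders, and the statements assume a role with sufficient privileges.

```python
# Illustrative sketch: Snowflake RBAC plus a column masking policy,
# issued via snowflake-connector-python. All names are placeholders.
# Requires: pip install snowflake-connector-python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***",  # placeholders
    role="SECURITYADMIN", warehouse="ADMIN_WH",
    database="ANALYTICS", schema="CURATED",
)

statements = [
    # Role-based access: analysts may read the curated schema.
    "CREATE ROLE IF NOT EXISTS ANALYST",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE ANALYST",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE ANALYST",
    # Masking policy: only PII_ADMIN sees raw email addresses.
    """
    CREATE MASKING POLICY IF NOT EXISTS EMAIL_MASK AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```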

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
