7 Job openings at KMCCorp India
Java Full Stack Developer

Hyderabad, Telangana

0 - 8 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 04/29/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 5-7 years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500003

Job Description
Job Title: Java Full Stack Developer
Location: Hyderabad
Experience: 6-8 Years

Job Summary
We are looking for an experienced Java Full Stack Developer with hands-on experience in designing, developing, and deploying scalable web applications. The ideal candidate should have strong expertise in Java, Spring Boot, Microservices, RESTful APIs, and frontend technologies (Angular/React). You will be responsible for developing end-to-end solutions, optimizing performance, and ensuring seamless integration between frontend and backend systems.

Key Responsibilities
- Design, develop, and maintain high-performance, scalable Java-based applications using Spring Boot, Microservices, and RESTful APIs.
- Build responsive and dynamic frontend applications using Angular/React, JavaScript/TypeScript, HTML5, and CSS3.
- Develop and optimize APIs and microservices for high availability, security, and performance.
- Implement cloud-native solutions on AWS and containerization (Docker, Kubernetes).
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure code quality, performance, and security through unit testing, integration testing, and CI/CD pipelines.
- Troubleshoot and debug complex issues across the full stack.
- Follow Agile/Scrum methodologies for project delivery.

Required Skills & Qualifications
- 6-8 years of experience in Java full stack development.
- Strong expertise in Java 8/11+, Spring Boot, Spring MVC, Spring Security, and Hibernate/JPA.
- Hands-on experience with Microservices architecture, RESTful APIs, and API Gateways (Spring Cloud, Netflix OSS).
- Proficiency in frontend technologies (Angular/React, JavaScript/TypeScript, HTML5, CSS3).
- Experience with database systems (SQL: MySQL/PostgreSQL; NoSQL: MongoDB).
- Knowledge of cloud platforms (AWS/Azure/GCP) and containerization (Docker, Kubernetes).
- Familiarity with CI/CD tools (Jenkins, GitLab CI, GitHub Actions).
- Strong understanding of design patterns, OOP, and SOLID principles.
- Experience with message brokers (Kafka, RabbitMQ) is a plus.
- Excellent problem-solving skills and ability to work in a fast-paced environment.

Java Full Stack Lead

Hyderabad, Telangana

0 - 15 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 04/29/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 10-15 years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500003

Job Description
Job Title: Java Full Stack Lead
Location: Hyderabad
Experience Level: 10+ Years

Job Summary
We are seeking an experienced Java Full Stack Lead with strong expertise in Microservices, API development, and Java-based frameworks to design, develop, and lead the implementation of scalable, high-performance applications. The ideal candidate will have hands-on experience with both frontend and backend technologies, along with a deep understanding of cloud-native architectures and RESTful APIs.

Key Responsibilities
- Technical Leadership: Lead a team of developers in designing, developing, and deploying Java-based microservices and full stack applications.
- Microservices Architecture: Design and implement scalable, resilient, and secure microservices using Spring Boot, Spring Cloud, and Docker/Kubernetes.
- API Development: Develop and maintain RESTful and GraphQL APIs, ensuring high performance, security, and scalability.
- Full Stack Development: Work on both frontend (Angular/React) and backend (Java, Spring, Hibernate) components.
- Cloud Integration: Deploy and manage applications on AWS/Azure/GCP, leveraging serverless and containerized solutions.
- Code Quality & Best Practices: Enforce clean code principles, CI/CD pipelines, and automated testing (JUnit, Mockito, Selenium).
- Performance Optimization: Identify and resolve bottlenecks in databases (SQL/NoSQL), caching (Redis), and API response times.
- Mentorship: Guide and mentor junior developers, conduct code reviews, and promote best practices.
- Agile Collaboration: Work closely with Product Managers, Architects, and DevOps teams in an Agile/Scrum environment.

Required Skills & Qualifications
- 10+ years of hands-on experience in Java/J2EE, Spring Boot, and Microservices.
- Strong expertise in RESTful API design, API Gateways (Kong, Apigee), and security (OAuth2, JWT).
- Proficiency in frontend frameworks (Angular/React, JavaScript/TypeScript).
- Experience with containerization (Docker, Kubernetes) and cloud platforms (AWS/Azure/GCP).
- Solid understanding of database systems (RDBMS, NoSQL) and ORM frameworks (Hibernate/JPA).
- Familiarity with message brokers (Kafka, RabbitMQ) and event-driven architectures.
- Knowledge of DevOps tools (Jenkins, Git, Terraform) and monitoring (Prometheus, ELK).
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication and leadership skills.

UI Developer with React

Hyderabad, Telangana

0 - 10 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 05/08/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 8-10 years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500003

Job Description
Job Title: Senior Full Stack Developer (Frontend-Leaning)
Location: Hyderabad
Experience: 8-10 Years

Job Summary
We are looking for a Senior Full Stack Developer with strong frontend expertise to design and build high-performance applications. The ideal candidate will have deep experience in modern frontend frameworks (React/Angular/Vue.js) along with proficiency in Java/Node.js backend development. You will play a key role in shaping frontend architecture while ensuring seamless integration with backend services.

Key Responsibilities
- Architect and develop scalable, responsive SPAs using React, Angular, or Vue.js with TypeScript.
- Lead UI component design, state management (Redux/NgRx), and performance optimization.
- Implement design systems (Material UI, Storybook) and ensure WCAG accessibility compliance.
- Collaborate with UX designers to transform Figma/Sketch wireframes into production-ready interfaces.
- Establish frontend testing strategies (Jest, Cypress, React Testing Library) and CI/CD pipelines.
- Drive adoption of modern tools (Next.js, Web Components, GraphQL, Micro Frontends).
- Develop and maintain RESTful APIs using Java/Spring Boot (or Node.js).
- Design database schemas and optimize queries (PostgreSQL, MongoDB).
- Implement authentication/authorization (JWT, OAuth2) and security best practices.
- Collaborate with DevOps to containerize (Docker) and deploy applications on AWS/Azure.

Required Skills
- 8+ years with React/Angular/Vue.js, TypeScript, and modern CSS.
- Expertise in state management, build tools, and testing frameworks.
- Strong knowledge of UI/UX principles, responsive design, and web performance.
- 4+ years with Java/Spring Boot (or Node.js/Express) and REST/GraphQL APIs.
- Experience with SQL/NoSQL databases and caching strategies.

Senior QA Automation Engineer

Hyderabad, Telangana

10 - 15 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 05/29/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 10-15 years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500039

Job Description
KMC is seeking a detail-oriented QA Engineer with a mix of manual and automation testing experience. You will be responsible for ensuring software quality by designing test cases, executing tests, reporting defects, and contributing to test automation efforts. If you enjoy working in an Agile environment and have exposure to both manual and automated testing, we’d love to hear from you!

Requirements
- Analyze requirements, user stories, and design documents to create clear, detailed test cases.
- Perform functional, UI, regression, smoke, and exploratory testing manually.
- Document test results with evidence and maintain test documentation.
- Identify, report, and track defects using JIRA or similar tools.
- Validate bug fixes and conduct retesting/regression testing.
- Participate in sprint planning, stand-ups, and retrospectives.
- Work closely with developers, product owners, and other stakeholders.
- Contribute to UI or API automation scripts (Selenium, Postman, REST Assured, Cypress).
- Assist in maintaining and improving existing automation frameworks.
- Help integrate test scripts into CI/CD pipelines (Jenkins, GitLab CI, etc.).
- Collaborate with automation engineers to expand test coverage.
- Conduct performance, load, and stress testing to identify bottlenecks.
- Analyze system behavior under load and suggest optimizations.

Required Skills & Experience
- Manual Testing: Strong experience in test case design and execution.
- Automation Exposure: Familiarity with Selenium, Postman, REST Assured, or Cypress.
- Programming Basics: Knowledge of Java, Python, or JavaScript for scripting.
- CI/CD Tools: Experience with Git, Jenkins, or similar tools.
- API Testing: Ability to test and debug RESTful APIs.
- Agile Methodology: Experience working in Scrum/Kanban environments.

Benefits
- Insurance: Family Term Insurance
- PF
- Paid Time Off: 20 days
- Holidays: 10 days
- Flexi timing
- Competitive Salary
- Diverse & Inclusive workspace

Platform Engineer – Event-Driven Platform (Confluent Cloud)

Hyderabad, Telangana

0 - 7 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 05/29/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 5+ years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500039

Job Description
KMC is seeking a highly motivated and technically skilled Platform Engineer to join our team in developing and managing an event-driven platform built on Confluent Cloud. This role involves working closely with internal teams and client applications to build scalable, reliable, and secure data streaming solutions using Kafka.

Responsibilities
- Design, develop, and manage scalable event-driven architecture on Confluent Cloud or Apache Kafka.
- Evaluate and enable new Confluent features, tools, and enhancements in alignment with business use cases.
- Automate platform operations and DevOps workflows to reduce manual interventions.
- Onboard new producer and consumer applications onto the Kafka platform by collaborating with client development teams.
- Troubleshoot platform and integration issues in both lower environments and production, ensuring minimal downtime.
- Define and implement best practices, governance, monitoring, and operational procedures for the Kafka-based platform.
- Collaborate with security, infrastructure, and application teams to ensure secure and compliant Kafka deployments.
- Continuously explore emerging technologies, stay updated on Kafka ecosystem enhancements, and provide insights for platform evolution.

Required Skills and Experience
- 3–7 years of experience in platform engineering or infrastructure development roles.
- Hands-on experience with Apache Kafka (Open Source) or Confluent Kafka (Cloud or On-Premises).
- Proficient in programming/scripting (Java, Python, Shell, or similar) to build automation tools and CI/CD integrations.
- Solid understanding of Kafka architecture, including brokers, topics, partitions, producers, consumers, schema registry, and Kafka Connect.
- Experience in client onboarding, access provisioning, schema management, and topic lifecycle management.
- Troubleshooting experience in multi-environment (DEV, STAGE, PROD) setups.
- Knowledge of observability tools (Confluent Control Center, Datadog, Grafana, etc.) and log analysis.
- Familiarity with Terraform, Git, Jenkins, or similar DevOps tools is a plus.

Preferred Qualifications
- Confluent Certification (e.g., Confluent Certified Developer/Admin).
- Exposure to Kubernetes, AWS/GCP, or other cloud-native platforms.
- Experience working in agile teams and cross-functional collaboration.

Benefits
- Insurance: Family Term Insurance
- PF
- Paid Time Off: 20 days
- Holidays: 10 days
- Flexi timing
- Competitive Salary
- Diverse & Inclusive workspace

NIFI/ETL Engineer

Hyderabad, Telangana

0 - 4 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 05/23/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 4-5 years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500059

Job Description
KMC is seeking a motivated and adaptable NiFi/Astro/ETL Engineer with 3-4 years of experience in ETL workflows, data integration, and data pipeline management. The ideal candidate will thrive in an operational setting, collaborate well with team members, and demonstrate a readiness to learn and embrace new technologies. This role will focus on the development, maintenance, and support of ETL processes to ensure efficient data workflows and high-quality deliverables.

Roles and Responsibilities
- Design, implement, and maintain ETL workflows using Apache NiFi, Astro, and other relevant tools.
- Support data extraction, transformation, and loading (ETL) processes to ensure efficient data flow across systems.
- Collaborate with data teams to ensure seamless integration of data from various sources, supporting data consistency and availability.
- Configure and manage data ingestion processes from both structured and unstructured data sources.
- Monitor ETL processes and data pipelines; troubleshoot and resolve issues in real time to ensure data accuracy and availability.
- Provide on-call support as necessary to maintain smooth data operations.
- Work closely with cross-functional teams to gather requirements, refine workflows, and ensure optimal data solutions.
- Contribute actively to team discussions and solution planning, and provide input for continuous improvement.
- Stay updated with industry trends and emerging technologies in data integration and ETL practices.
- Show willingness to learn and adapt to new tools and methodologies as required by project or team needs.

Requirements
- 3-4 years of experience in ETL workflows, specifically with Apache NiFi and Astro (or similar platforms).
- Proficient in SQL, with experience in data warehousing concepts.
- Familiarity with scripting languages (e.g., Python, Shell scripting) is a plus.
- Basic understanding of cloud platforms (AWS, Azure, or Google Cloud).

Soft Skills
- Strong problem-solving abilities with an operational mindset.
- Team player with effective communication skills to collaborate well within and across teams.
- Quick learner, adaptable to new tools, and willing to take on challenges with a positive attitude.

Benefits
- Insurance: Family Term Insurance
- PF
- Paid Time Off: 20 days
- Holidays: 10 days
- Flexi timing
- Competitive Salary
- Diverse & Inclusive workspace

Big Data Engineer

Hyderabad, Telangana

5 years

INR Not disclosed

On-site

Full Time

Job Information
Date Opened: 07/23/2025
Industry: Information Technology
Job Type: Full time
Work Experience: 5+ years
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500039

Job Description
Core Responsibilities
- Design and optimize batch/streaming data pipelines using Scala, Spark, and Kafka.
- Implement real-time tokenization/cleansing microservices in Java.
- Manage production workflows via Apache Airflow (batch scheduling).
- Conduct root-cause analysis of data incidents using Spark/Dynatrace logs.
- Monitor EMR clusters and optimize performance via YARN/Dynatrace metrics.
- Ensure data security through HashiCorp Vault (Transform Secrets Engine).
- Validate data integrity and configure alerting systems.

Technical Requirements
- Programming: Scala (Spark batch/streaming), Java (real-time microservices).
- Big Data Systems: Apache Spark, EMR, HDFS, YARN resource management.
- Cloud & Storage: Amazon S3, EKS.
- Security: HashiCorp Vault, tokenization vs. encryption (FPE).
- Orchestration: Apache Airflow (batch scheduling).
- Operational Excellence: Spark log analysis, Dynatrace monitoring, incident handling, data validation.

Mandatory Competencies
- Expertise in distributed data processing (Spark on EMR/Hadoop).
- Proficiency in shell scripting and YARN job management.
- Ability to implement format-preserving encryption (tokenization solutions).
- Experience with production troubleshooting (executor logs, metrics, RCA).

Benefits
- Insurance: Family Term Insurance
- PF
- Paid Time Off: 20 days
- Holidays: 10 days
- Flexi timing
- Competitive Salary
- Diverse & Inclusive workspace

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.