3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfil them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark) (see the sketch below)
- Experience with the AWS/Azure stack
- Desirable: ETL with batch and streaming (Kinesis)
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data stores (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks
Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL
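As an illustration of the hands-on PySpark/Delta Lake work this role calls for, here is a minimal sketch of a batch transformation on Databricks; the paths, column names and schema are hypothetical, not taken from the posting.

```python
# Minimal PySpark sketch: read raw files, apply simple cleansing, write a Delta table.
# Paths and columns are illustrative; Delta support is assumed (available on Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_to_delta").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("/mnt/raw/orders/"))          # hypothetical landing path

curated = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"]))

(curated.write
 .format("delta")
 .mode("overwrite")
 .save("/mnt/curated/orders"))            # Delta output for the warehouse layer
```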
Posted 2 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsibilities:
- Design, implement and upgrade to cost-effective, fault-tolerant architecture
- Manage databases in cloud environments such as Amazon Web Services (AWS)
- Resolve alerts and production issues
- Troubleshoot and support existing database processes; deliver fixes and optimizations where appropriate
- Support software projects involving the database infrastructure and data migration
- Develop automation procedures and applications (a small example follows below)

Must-have skills:
- 5-8+ years of experience in information technology
- At least 5 years of experience working as a MongoDB and MySQL DBA in a fast-paced, rapid-growth, intensive environment
- 2+ years of experience with cloud environments, especially AWS managed services and tools
- Some experience with other NoSQL technologies such as Redis and Kafka
- Experience with database performance tuning
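For context on the automation procedures this role mentions, here is a small, hedged pymongo sketch of a routine DBA task (ensuring an index exists and reading server health metrics); the connection string, database and field names are placeholders.

```python
# Sketch of a small DBA automation task with pymongo.
# Host, database and field names are placeholders, not from the posting.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # assumed connection string
db = client["orders"]

# Idempotent: create_index is a no-op if an equivalent index already exists.
db.orders.create_index([("customer_id", ASCENDING), ("created_at", ASCENDING)])

# Pull a couple of basic health metrics from serverStatus.
status = client.admin.command("serverStatus")
print("connections:", status["connections"]["current"])
print("opcounters:", status["opcounters"])
```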
Posted 2 days ago
9.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Join our digital revolution in NatWest Digital X

In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter. Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India and as such all normal working days must be carried out in India.

Job Description
Join us as a Software Engineer
- This is an opportunity for a technically minded individual to join us as a Software Engineer
- You'll be designing, producing, testing and implementing working software, working across the lifecycle of the system
- Hone your existing software engineering skills and advance your career in this critical role
- We're offering this role at vice president level

What you'll do
Working in a permanent feature team, you'll be developing knowledge of aspects of the associated platform across the disciplines of business, applications, data and infrastructure. You'll also be liaising with principal engineers, architects in the domain and other key stakeholders to understand how the platform works and how it supports business objectives.

You'll also be:
- Applying Agile methods to the development of software on the backlog
- Producing resilient and long-lived software and acting flexibly to cope with future needs
- Delivering intentional architecture and formulating emergent design through innovative ideas, experimentation and prototyping
- Designing and developing software with a focus on the automation of build, test and deployment activities, using executable patterns

The skills you'll need
We're looking for someone with strong experience in Selenium, Cucumber, Java, GitLab, DevOps, CI/CD, Python, microservices, Camunda, SQL and AWS, as well as experience in the Java full stack, including microservices, ReactJS, Spring, Spring Boot, Spring Batch, PL/SQL, Oracle, PostgreSQL, JUnit, Mockito, cloud, REST APIs, API Gateway, Kafka and API development. You'll have 9+ years of experience. You'll also need to be capable of complex requirements analysis, capture and validation against and with business and systems requirements.

Additionally, you'll demonstrate:
- Experience of leading the implementation of programming best practice, especially around scalability, automation, virtualisation, optimisation, availability and performance
- Sound collaboration skills with the ability to work with business teams to produce pragmatic solutions that work for the business
- Experience of information security policies and practices within the financial sector
- Strong stakeholder management and communication skills, with the ability to communicate complex technical concepts in a simple way
Posted 2 days ago
40.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
Analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications. Write code, complete programming and perform testing and debugging of applications.

Career Level - IC3

Responsibilities
As a member of the software engineering division, you will perform high-level design based on provided external specifications. Specify, design and implement minor changes to existing software architecture. Build highly complex enhancements and resolve complex bugs. Build and execute unit tests and unit plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products.

Responsibilities
- Working with the team to develop and maintain full-stack SaaS solutions
- Collaborate with engineering and product teams, contribute to the definition of specifications for new features, and own the development of those features
- Define and implement web services and the application backend microservices
- Implement and/or assist with the web UI/UX development
- Be a champion for cloud-native best practices
- Have a proactive mindset about bug fixes, solving bottlenecks and addressing performance issues
- Maintain code quality, organization, and automation
- Ensure the testing strategy is followed within the team
- Support the services you build in production

Essential Skills and Background
- Expert knowledge of Java
- Experience with microservice development at scale
- Experience working with Kafka
- Experience with automated test frameworks at the unit, integration and acceptance levels
- Use of source code management systems such as Git

Preferred Skills and Background
- Knowledge of issues related to scalable, fault-tolerant architectures
- Knowledge of Python
- Experience with SQL and RDBMS (Oracle and/or MySQL preferred)
- Experience deploying applications in Kubernetes with Helm
- Experience with DevOps tools such as Prometheus and Grafana
- Experience in Agile development methodology
- Experience with Terraform is preferred
- Use of build tools like Gradle and Maven

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
🚀 We're Hiring: Java Developer (Full-Time)
📍 Location: Chennai (Chennai One IT Park) / Pune (BNY ODC, Kharadi) – Work From Office – 5 Days a Week
💼 Experience: 7+ Years
💰 Budget: Up to ₹25 LPA (strictly as per budget)
📅 Notice Period: Immediate joiners or max 2 weeks only

Tech Stack Requirements:
🔹 Must-Have Skills:
- Java (JDK 11+)
- Spring Boot / Spring MVC
- Microservices Architecture
- REST API Design
- Messaging Queues (Kafka/MQ)
- Databases (any)
- Concurrent Programming
- Functional Programming (Lambdas, Streams, Functional Interfaces)

🔹 Good-to-Have Skills:
- MongoDB
- Distributed Caching (Hazelcast / Redis)
- Cloud Development (Docker, AWS, Azure, Cloud Foundry)
- Spring JDBC / MyBatis / Hibernate

If you're a skilled Java Developer ready to take on a challenging role in a dynamic environment, we want to hear from you!
📩 Apply now: rajesh@reveilletechnologies.com
Posted 2 days ago
5.0 years
0 Lacs
Greater Chennai Area
On-site
Job Description: Candidates with a minimum of 5+ years of experience in JavaScript, Node.js, TypeScript and MongoDB.

Node.js Developer
1. JavaScript - mandatory
2. TypeScript
3. MongoDB
4. Node.js
5. Microservices
6. Design / system design - capable of handling a service/system
7. Databases - how a database works behind the scenes, basic questions on MongoDB (intermediate), Kafka, event-driven architecture
8. Experience in ReactJS (TypeScript), HTML, CSS pre-processors, or CSS-in-JS in creating high-performance, responsive enterprise web applications
Posted 2 days ago
2.0 - 3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Dot Net Developer

Job Description: We are seeking a skilled C#/.NET Core developer (2-3 years).

Responsibilities:
- Develop applications using C# and .NET Core
- Strong analytical skill set
- Ensure the performance, quality, and responsiveness of applications
- Identify and correct bottlenecks and fix bugs
- Help maintain code quality, organization, and automation

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Proven experience as a .NET developer
- Strong knowledge of integration solutions with external APIs and experience with queueing frameworks like Kafka and AMQ
- Strong in developing microservices
- Experience with RESTful APIs and web services
- Knowledge of SQL and database management
- Excellent problem-solving skills and attention to detail
- Ability to work independently and as part of a team

Preferred Qualifications:
- Experience with cloud platforms (e.g., Azure, AWS)
- Knowledge of Agile/Scrum methodologies
- Strong understanding of software development principles and design patterns
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Position: Senior Java Full Stack Developer
Experience: 5 to 8 Years
Location: GIFT City, Gandhinagar (work from office only)

Job Description:
We are seeking a highly skilled and experienced Senior Java Full Stack Developer with a strong background in enterprise application development.

Key Requirements - Core Expertise:
· Proficient in Java 11/17, including advanced concepts and the Collections Framework
· Strong understanding of design patterns
· Deep experience with multithreading, concurrency, and ExecutorService
· Hands-on experience with J2EE, JSF 1.2, PrimeFaces 13, JSF (Jakarta Faces 4.x), Tomcat, Apache2, Azure, JBoss, WildFly, Vaadin, Spring Boot 3.x, Spring Data JPA, Hibernate, and Spring Batch
· Experience with microservices architecture / REST APIs / SOAP APIs

Good to Have:
· Familiarity with microservice design patterns such as Saga and CQRS
· Working knowledge of RabbitMQ and/or Apache Kafka for message brokering and streaming
· Experience with Elasticsearch
· Exposure to relational and NoSQL databases like MySQL, PostgreSQL, or MongoDB
· Experience with containerization and orchestration tools like Docker and Kubernetes

Tools & Version Control:
· Proficient with Git, GitHub, GitLab, Bitbucket
· Build tools: Maven, Gradle
· IDEs & editors: IntelliJ IDEA, Eclipse, Visual Studio Code

Key Responsibilities:
· Design, develop, and maintain scalable and high-performance backend services using Java 11/17 and Spring Boot 3.x
· Implement and manage microservices-based architecture following best practices
· Integrate and optimize messaging systems using RabbitMQ and/or Apache Kafka
· Design and optimize database schemas for relational and NoSQL databases
· Implement batch processing using Spring Batch for large-scale data workflows
· Apply appropriate design patterns and coding standards to build robust and maintainable code
· Work with Docker and Kubernetes for containerization, deployment, and orchestration of services
· Collaborate with DevOps teams for CI/CD pipeline setup and deployment automation
· Participate in code reviews, unit testing, and system integration testing
· Troubleshoot and resolve issues across development, test, and production environments
· Collaborate closely with front-end developers, QA teams, and other stakeholders for end-to-end delivery

Nice to Have:
· Knowledge of CI/CD pipelines and DevOps practices
· Experience with monitoring tools and performance tuning
· Understanding of cloud platforms (AWS, GCP, or Azure)
Posted 2 days ago
3.0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
Job Title: Java Developer
Location: Lucknow
Job Type: Full-time
Experience Level: Mid-Level/Senior

Job Summary
We are seeking a skilled Java Developer with expertise in Spring Boot, PostgreSQL, and other Java frameworks to design, develop, and maintain scalable back-end applications. The ideal candidate should have experience with multiple Java ecosystems, including MicroProfile, Quarkus, Micronaut, Jakarta EE (formerly Java EE), and Hibernate, along with a strong understanding of modern software architecture.

Key Responsibilities
1. Design and develop Java-based applications using Spring Boot, Quarkus, Micronaut, or Jakarta EE.
2. Build RESTful APIs, GraphQL, or gRPC services for seamless integrations.
3. Work with PostgreSQL (and optionally NoSQL databases like MongoDB) for optimized data storage.
4. Use JPA/Hibernate, jOOQ, or MyBatis for efficient database interactions.
5. Implement reactive programming with Spring WebFlux, Vert.x, or Akka.
6. Develop microservices using Spring Cloud, Kubernetes, or Docker.
7. Ensure application security via OAuth2, JWT, or Keycloak.
8. Write unit/integration tests using JUnit, TestNG, or Mockito.
9. Optimize performance using caching (Redis, Ehcache) and async messaging (Kafka, RabbitMQ).
10. Participate in Agile/Scrum development cycles.

Required Skills & Qualifications
- 3+ years of Java development experience
- Strong expertise in Spring Boot and at least one other framework (e.g., Quarkus, Micronaut, Jakarta EE)
- Experience with PostgreSQL (query optimization, indexing, stored procedures)
- Knowledge of ORM tools (Hibernate, jOOQ, MyBatis)
- Familiarity with REST, GraphQL, or gRPC API design
- Experience with Maven/Gradle build tools
- Knowledge of Docker, Kubernetes, and CI/CD pipelines
- Understanding of Agile/Scrum methodologies

Preferred Skills (Nice to Have)
- Reactive programming (Spring WebFlux, Vert.x, Project Reactor)
- Cloud platforms (AWS, Azure, GCP)
- NoSQL databases (MongoDB, Cassandra)
- Frontend basics (React, Angular, Thymeleaf)
- Event-driven architecture (Kafka, RabbitMQ)
Posted 2 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Experience: 7+ years

Key Responsibilities
• Lead the architecture, design, and development of enterprise-scale web applications with integrated Generative AI capabilities.
• Define technical standards, architectural patterns, and best practices for scalable and secure software systems.
• Collaborate with product, AI/ML, DevOps, and security teams to ensure alignment between business goals and technical execution.
• Oversee the full-stack development lifecycle, ensuring robust design, performance tuning, and security compliance.
• Evaluate and integrate AI components such as LLMs, embedding services, and vector databases into the product architecture.

Mandatory Technical Skills
1. Cloud-Native Architecture & Microservices: proven experience designing scalable, cloud-native applications using microservices and event-driven patterns on the Azure ecosystem.
2. Performance Optimization: implementing caching strategies (Redis, CDN), asynchronous job processing (RabbitMQ, Kafka), and load-balanced architectures using Kubernetes or serverless platforms (a minimal sketch of the caching pattern follows below).
3. GenAI Integration & LLM Orchestration: hands-on with Generative AI technologies such as Azure OpenAI, prompt engineering, RAG pipelines, embeddings, and vector search (e.g., Azure AI Search, Azure PostgreSQL).
4. Backend Engineering (APIs, Data, Security): strong expertise in REST APIs with versioning, throttling, and gateway integrations; PostgreSQL/MongoDB; OAuth2/SSO; and secure coding aligned with GDPR/SOC 2 standards.
5. Frontend Development (React/Next.js): deep experience building enterprise-grade UIs using React.js, component libraries, and modern design systems.
6. DevOps & Infrastructure as Code: proficiency in Azure CI/CD pipelines, Docker, and Kubernetes for automated, scalable deployments.
7. Real-Time & Scalable UI Patterns: experience with WebSockets/SSE, UI performance optimization, and handling large-scale dynamic frontends.
8. Testing, Observability & Quality Engineering: competence in automated unit/E2E testing, frontend performance profiling, and monitoring of deployed solutions.
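As a minimal sketch of the read-through caching pattern named under Performance Optimization above, here is a hedged Python example using redis-py; the host, key scheme and the fetch_profile() backend call are placeholders, not from the posting.

```python
# Read-through cache sketch with redis-py: try the cache, fall back to the source,
# then populate the cache with a TTL. Host and key names are illustrative.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_profile(user_id: str) -> dict:
    # Placeholder for the real backend/database call.
    return {"id": user_id, "name": "example"}

def get_profile(user_id: str, ttl_seconds: int = 300) -> dict:
    key = f"profile:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)          # cache hit
    profile = fetch_profile(user_id)       # cache miss: go to the source
    cache.setex(key, ttl_seconds, json.dumps(profile))
    return profile
```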
Posted 2 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Java Developer (Spring Boot + AWS)
Location: Hyderabad (Hybrid)
Experience: 4-6 Years

Key Responsibilities:
- Design and implement scalable features using Java, Spring Boot, and AWS.
- Manage CI/CD pipelines (GitLab CI/CD, Kubernetes, Docker).
- Collaborate in Agile teams to deliver high-quality backend solutions.
- Optimize cloud services (Lambda, API Gateway, SQS, IAM).
- Mentor junior developers and ensure SDLC best practices.

Must-Have Skills:
✔ Java 8+ / Spring Boot (OOP, design patterns, JPA, Spring IoC)
✔ AWS serverless (Lambda, API Gateway, SQS)
✔ DevOps (GitLab CI/CD, Kubernetes, Docker)
✔ Event-driven systems (Kafka or similar)

Good-to-Have:
✔ NodeJS/TypeScript exposure
✔ Java Swing/Struts
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Description:
The WMI Core stream provides core banking capabilities across WM International locations and works towards integration and synergies across WMI locations, driving a capability-driven and modular platform strategy for core banking. We are seeking a Technical Lead with strong experience in Temenos Transact development, particularly in core banking customizations, interfaces, and API integrations. This role requires hands-on leadership to deliver scalable and maintainable banking solutions.

Responsibilities:
- Lead the design, development, and customization of Temenos Transact (T24) modules.
- Develop and enhance local developments, AA products, and integration components.
- Define and implement API-based integrations with external systems (internal/external APIs, OFS, TCI).
- Guide and mentor a team of developers, ensuring adherence to coding standards and best practices.
- Collaborate with Business Analysts and Solution Architects for technical feasibility and solution design.
- Conduct code reviews, unit testing, and performance optimizations.
- Support DevOps practices for automated builds, deployments, and version control.

Mandatory Skills:
- 5-8 years of hands-on experience in both functional and automation testing.
- Proficiency in backend, UI, and database testing.
- Hands-on testing experience for REST/SOAP services and Kafka/MQ event-based applications.
- A solid understanding of requirements analysis, test case creation, regression testing, and defect tracking.
- Familiarity with Agile best practices.
- Programming languages: Java or Kotlin.
- Build tools: Maven or Gradle.
- BDD testing frameworks.

Nice-to-Have Skills:
- Experience in an Agile framework.
Posted 2 days ago
0.0 - 5.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
Job Title: Software Engineer
Location: Trivandrum, Kerala
Employment Type: Full-Time
Experience Required: 3 to 5 years

Job Summary:
We are looking for a passionate and skilled Mid-Level Java Developer to join our engineering team. The ideal candidate must have strong hands-on experience in Core Java, Spring Boot, Kafka, and AWS Cloud Services. You will be responsible for developing and maintaining scalable backend systems and integrating cloud-native solutions in a fast-paced, collaborative environment.

Key Responsibilities:
● Design, develop, and maintain robust backend applications using Core Java and Spring Boot.
● Build and integrate event-driven architectures using Apache Kafka.
● Deploy, monitor, and manage applications on AWS Cloud.
● Write clean, maintainable, and efficient code following best practices.
● Participate in system design, code reviews, and technical discussions.
● Collaborate closely with DevOps, QA, and frontend teams to deliver high-quality solutions.
● Troubleshoot production issues and implement fixes with minimal turnaround time.
● Continuously explore, evaluate, and implement new technologies and best practices.

Required Skills:
● Strong proficiency in Core Java (OOP concepts, collections, multithreading, exception handling, etc.)
● Hands-on experience with Spring Boot, Spring MVC, and related Spring modules
● Solid understanding and real-world usage of Kafka (producers, consumers, topics, message flow, etc.)
● Experience with AWS services (EC2, S3, Lambda, RDS, CloudWatch, etc.)
● Familiarity with RESTful APIs and microservices architecture
● Proficiency in version control tools like Git
● Good understanding of CI/CD pipelines and containerization (Docker is a plus)
● Excellent problem-solving skills and the ability to work independently or in a team

Job Types: Full-time, Permanent
Pay: ₹484,064.22 - ₹1,488,524.85 per year
Schedule: Day shift
Work Location: In person
Posted 2 days ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
TCS Hiring for Java Developer Role!!
TCS presents an excellent opportunity for a Java Developer.

Role: Java Developer
Desired Experience Range: 7-12 years
Location: Pune
Mode of Interview: Virtual
Date: 19-06-2025 (Thursday)

Must have:
- Java
- Spring Boot
- Microservices
- Kafka (mandatory)

Roles & Responsibilities:
- Excellent knowledge of Java supported by 5+ years of professional experience
- Implementation knowledge and experience of technologies such as Spring and Spring Boot
- Hands-on experience working with microservices architecture
- Hands-on experience with Apache Kafka and RabbitMQ is an advantage
- Bring energy and passion to your work; be versatile and collaborative in style, empathetic in nature, confident in content and focused on outcomes at all levels in the client organization
Posted 2 days ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Mindstix Software Labs
Mindstix accelerates digital transformation for the world's leading brands. We are a team of passionate innovators specialized in Digital Experiences, Enterprise Mobility, Cloud Engineering, and Data Science. Our UX studio and modern-stack engineers deliver world-class products for our global customers, including Fortune 500 enterprises and Silicon Valley startups. Our work impacts a diverse set of industries such as eCommerce, Luxury Retail, SaaS, Consumer Tech, Health Tech, and Hospitality. A fast-moving open culture powered by curiosity and craftsmanship. A team committed to bold thinking and innovation at the very intersection of business, technology, and design. That's our DNA.

Roles and Responsibilities:
Mindstix is looking for a passionate and detail-oriented Python Developer to join our engineering team. You are a problem solver who enjoys building scalable backend systems, writing clean and efficient code, and collaborating across teams to deliver high-quality solutions. You take ownership of your work and thrive in a performance-driven environment:
- Design, develop, and maintain scalable Python-based applications, services, and APIs.
- Collaborate with cross-functional teams including frontend developers, DevOps, and product managers to understand requirements and deliver robust backend solutions.
- Write reusable, testable, and efficient code following best practices and design patterns.
- Optimize applications for speed, scalability, and security.
- Integrate third-party APIs and services as needed to support business functionality.
- Troubleshoot, debug, and upgrade existing software systems.
- Implement automated testing frameworks and unit tests to ensure high code quality.
- Participate in code reviews and contribute to continuous improvement in development processes.
- Stay up to date with the latest Python libraries, frameworks, and backend technologies.
- Document code, processes, and systems for maintainability and future enhancements.

Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 2+ years of hands-on experience in backend development using Python.
- Strong understanding of Python frameworks such as Django, Flask, or FastAPI (see the sketch below).
- Experience working with RESTful APIs, asynchronous programming, and microservice architecture.
- Familiarity with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Redis).
- Solid understanding of data structures, algorithms, and object-oriented programming.
- Experience with Git, CI/CD pipelines, and containerization tools like Docker.
- Knowledge of cloud platforms like AWS, Azure, or GCP is a plus.
- Exposure to message brokers like RabbitMQ or Kafka is desirable.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Bonus: experience with testing frameworks such as PyTest or unittest.
- Good to have: familiarity with GraphQL, WebSockets, or event-driven architecture.

Who Fits Best?
You are a passionate programmer with a knack for solving complex engineering problems. You thrive in a fast-paced, creative environment and enjoy taking on new challenges. You value great design, have a strong aesthetic sense, and pay close attention to detail. You excel in customer-centric environments - actively listening, empathizing, and collaborating with globally distributed teams. You're a team player who takes pride in mentoring and inspiring others to do their best work. You communicate ideas clearly, both in writing and in conversation, with strong English language skills. You're detail-oriented and take pride in craftsmanship across every aspect of your work.

Benefits
An opportunity to work in a competitive environment with top-tier engineers in your industry. Flexible working environment, competitive compensation and perks, health insurance coverage, rewards and recognition, accelerated career planning. An opportunity to build products and solutions at a truly global scale.

Location
This position is primarily based at our Pune (India) headquarters, requiring all potential hires to work from this location. A modern workplace is deeply collaborative by nature, while also demanding a touch of flexibility. We embrace deep collaboration at our offices, with reasonable flexi-timing and hybrid options for our seasoned team members.

Equal Opportunity Employer
Mindstix is committed to an inclusive and diverse work environment. We do not discriminate based on race, colour, ethnicity, ancestry, national origin, religion, gender, gender identity, gender expression, sexual orientation, age, disability, veteran status, genetic information, marital status or any other legally protected status.
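Since the role centres on Python REST services with frameworks such as FastAPI, here is a minimal, hedged FastAPI sketch of a single typed endpoint; the route, model fields and persistence note are illustrative only.

```python
# Minimal FastAPI sketch (assumes FastAPI and Pydantic v2): one typed POST endpoint.
# Route and fields are illustrative, not from the posting.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

@app.post("/items")
def create_item(item: Item) -> dict:
    # In a real service this would persist to PostgreSQL/MongoDB.
    return {"status": "created", "item": item.model_dump()}

# Run locally with: uvicorn main:app --reload
```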
Posted 2 days ago
5.0 - 10.0 years
15 - 30 Lacs
Chennai
Hybrid
Job Summary:
We are looking for a highly skilled Backend Data Engineer to join our growing FinTech team. In this role, you will design and implement robust data models and architectures, build scalable data ingestion pipelines, and ensure data quality across financial datasets. You will play a key role in enabling data-driven decision-making by developing efficient and secure data infrastructure tailored to the fast-paced FinTech environment.

Key Responsibilities:
- Design and implement scalable data models and data architecture to support financial analytics, risk modeling, and regulatory reporting.
- Build and maintain data ingestion pipelines using Python or Java to process high-volume, high-velocity financial data from diverse sources.
- Lead data migration efforts from legacy systems to modern cloud-based platforms.
- Develop and enforce data validation processes to ensure accuracy, consistency, and compliance with financial regulations.
- Create and manage task schedulers to automate data workflows and ensure timely data availability (see the DAG sketch below).
- Collaborate with product, engineering, and data science teams to deliver reliable and secure data solutions.
- Optimize data processing for performance, scalability, and cost-efficiency in a cloud environment.

Required Skills & Qualifications:
- Proficiency in Python and/or Java for backend data engineering tasks.
- Strong experience in data modelling, ETL/ELT pipeline development, and data architecture.
- Hands-on experience with data migration and transformation in financial systems.
- Familiarity with task scheduling tools (e.g., Apache Airflow, cron, Luigi).
- Solid understanding of SQL and experience with relational and NoSQL databases.
- Knowledge of data validation frameworks and best practices in financial data quality.
- Experience with cloud platforms (AWS, GCP, or Azure), especially in data services.
- Understanding of data security, compliance, and regulatory requirements in FinTech.

Preferred Qualifications:
- Experience with big data technologies (e.g., Spark, Kafka, Hadoop).
- Familiarity with CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
- Exposure to financial data standards (e.g., FIX, ISO 20022) and regulatory frameworks (e.g., GDPR, PCI-DSS).
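The posting mentions task schedulers such as Apache Airflow for automating data workflows; below is a minimal, hedged Airflow 2.x sketch of a daily ingest-then-validate DAG, with illustrative task names and function bodies.

```python
# Minimal Apache Airflow sketch (assumes Airflow 2.x): a daily DAG with two
# sequenced tasks, ingest -> validate. Names and bodies are illustrative.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw financial data from source systems")

def validate():
    print("run data-quality checks before loading downstream")

with DAG(
    dag_id="fintech_daily_ingestion",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task   # run validation only after ingestion succeeds
```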
Posted 2 days ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are a Rakuten Group company, providing global B2B/B2C services for the mobile telco industry and enabling next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan's newest mobile network, we are now taking our mobile offering global! To support our ambitions to provide an innovative cloud-native telco platform for our customers, we are looking to recruit and develop top talent in Digital Product Management. Let's build the future of mobile telecommunications together!

Role: Technical Program Manager
You will independently lead cross-organisation programs, influencing the roadmap priorities and technical direction across teams. You will work with stakeholders across the organisation and own the communication of all aspects of the program, including surfacing risks and progress towards the goal. You will guide the team towards technical solutions and make trade-off decisions, and drive program management best practices across the organisation. The role requires working closely with multiple functional teams (including but not limited to business, architects, engineering, and operations support) in building and maintaining program delivery timelines, unblocking teams, defining and streamlining cross-functional dependencies, and increasing the efficiency and velocity of project execution. You will likely spend most days in Agile, Kanban, or other project planning tools and scheduling meetings with relevant stakeholders to keep projects moving forward, delivering a program execution strategy and timeline as well as regular reporting of project health to stakeholders throughout a project's life cycle.

Team: RBSS Delivery organization

Skills and Qualifications
- Up to 15 years of hands-on technical project/program management experience, with at least 10+ years managing programs or working in Scrum.
- Must have a telecom background, with exposure to working with telecom operators / ISPs (B2B, B2C customer solutions) in software delivery/integration for at least 5+ years in the BSS domain.
- Technology stack: managed complex data migration projects involving technologies such as cloud (AWS, GCP or compatible), microservices, various DB solutions (Oracle, MySQL, Couchbase, Elastic DB, Camunda, etc.), data streaming technologies (such as Kafka), and the tools associated with this stack.
- Excellent knowledge of project management methodology and software development life cycles, including Agile, with excellent client-facing and internal communication skills.
- Ability to plan, organize, prioritize, and deliver multiple projects simultaneously.
- In-depth knowledge and understanding of telecom BSS business needs, with the ability to establish and maintain a high level of customer trust and confidence, and solid organizational skills including attention to detail and multitasking.
- Good understanding of the challenges associated with the BSS business and of its high-level modules (CRM, order management, revenue management and billing services).
- Excellent verbal, written, and presentation skills to effectively communicate complex technical and business issues (and solutions) to diverse audiences.
- Strong analytical, planning, and organizational skills with an ability to manage competing demands.
- Always curious; passionate about learning continuously in a fast-moving environment.
- Strong working knowledge of Microsoft Office, Confluence, JIRA, etc.
- Good to have: Project Management Professional (PMP) / Certified Scrum Master certification.
- Good to have: knowledge of external solutions integrated with ETL software, billing, and warehouse/supply-chain-related migration projects.

Key Job Responsibilities
- Manage and streamline program planning by evaluating incoming project demand across multiple channels against available capacity.
- Regularly define and review KPIs, proactively seeking out new and improved mechanisms for visibility and ensuring the program stays aligned with organization objectives.
- Develop and maintain Kanban boards and workstream dashboards.
- Work with stakeholders during the entire life cycle of the program: execute project requirements, prepare detailed project plans, identify risks, manage vendors and vendor resources, measure program metrics, and take corrective and preventive actions.
- Adopt Agile best practices (such as estimation techniques) and define and optimize processes.
- Coordinate with the product management team to plan features and stories into sprints, understand business priorities, and align required stakeholders so the team can deliver the expected outcome.
- Manage technology improvements and other enhancements from conceptualization to delivery; understand their impact and pros/cons, work through the required detail, and collaborate with all stakeholders until they are successfully deployed in production.
- Manage and deliver planned RBSS releases by working with customers; work with Scrum Masters, plan Scrum capacity, and manage team productivity.
- Monitor the progress and quality of software delivered by the Scrum teams.
- Work with engineering and product teams to scope product delivery, define solution strategies, and understand development alternatives.
- Be available to the team to answer questions and provide direction.
- Work across multiple teams and vendors (cross-cutting across programs, business/engineering teams, and/or technologies) to drive delivery strategy and dependency management, ensuring active delivery and proactive communications.
- Forecast and manage infrastructure and resourcing demand against the operational growth of the platform, in collaboration with engineering teams.
- Deliver Agile projects that offer outstanding business value to users.
- Support stakeholders in implementing an effective project governance system.

"Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran status, color, religion, disability, sexual orientation and beliefs."
Posted 2 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, and is listed on NASDAQ. It operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line, and has major delivery centers in India, including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: Java Full Stack with React.js
Location: Hyderabad
Experience: 6+ Years
Job Type: Contract to hire
Notice Period: Immediate joiners
Payroll: People Prime
Client: MNC Client

Mandatory Skills: Java, React, AWS, cloud, Kafka, SQL DB

Skill set (with hands-on experience):
- Java full stack with React
- AWS and GCP cloud
- Kafka, SQL DB
Posted 2 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.

About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. It operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.

Job Title: Kafka Administrator
Location: Pan India
Experience: 5+ Years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: Kafka Connect, Kafka Connect clusters, Schema Registry, ksqlDB, Kafka Streams, Confluent Kafka

Job Summary:
- Proficient in designing and implementing a robust Kafka cluster on Azure, considering factors such as scalability for future growth, fault tolerance, performance, and multi-zone DR.
- Expertise in developing integration pipelines using Kafka connectors to facilitate seamless communication between applications; this includes enabling Kafka sources and sinks using different connectors (see the sketch below).
- Actively monitor the Kafka cluster's health and performance and address issues promptly.
- Experience with Confluent Kafka.
- Sound communication and presentation skills.

Seniority Level: Mid-Senior level
Industry: IT Services and IT Consulting
Employment Type: Contract
Job Functions: Business Development, Consulting
Skills: Kafka Connect, Kafka Connect clusters, Schema Registry, ksqlDB, Kafka Streams, Confluent Kafka
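As a sketch of the connector enablement work described above, the snippet below registers a source connector through the Kafka Connect REST API (default worker port 8083); the Connect URL, connector name, file path and topic are placeholders, not from the posting.

```python
# Register a source connector on a Kafka Connect worker via its REST API.
# Endpoint, connector name and config values are illustrative placeholders.
import requests

CONNECT_URL = "http://localhost:8083/connectors"  # assumed Connect worker endpoint

connector = {
    "name": "demo-file-source",
    "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
        "tasks.max": "1",
        "file": "/tmp/input.txt",
        "topic": "demo-topic",
    },
}

resp = requests.post(CONNECT_URL, json=connector)
resp.raise_for_status()
print(resp.json())                       # the created connector definition

# Quick health check: list connectors currently registered on the worker.
print(requests.get(CONNECT_URL).json())
```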
Posted 2 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins.

Why join Coupa?
🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend.
🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence.
🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other.
Learn more on the Life at Coupa blog and hear from our employees about their experiences working at Coupa.

The Impact of a Lead Software Engineer - Data to Coupa:
The Lead Data Engineer plays a critical role in shaping Coupa's data infrastructure, driving the design and implementation of scalable, high-performance data solutions. Collaborating with teams across engineering, data science, and product, this role ensures the integrity, security, and efficiency of our data systems. Beyond technical execution, the Lead Data Engineer provides mentorship and defines best practices, supporting a culture of excellence. Their expertise will directly support Coupa's ability to deliver innovative, data-driven solutions, enabling business growth and reinforcing our leadership in cloud-based spend management.

What You'll Do:
- Lead and drive the development and optimization of scalable data architectures and pipelines.
- Design and implement best-in-class ETL/ELT solutions for real-time and batch data processing.
- Optimize data analysis and computation for performance, reliability, and cost efficiency, implementing monitoring solutions to identify bottlenecks.
- Architect and maintain cloud-based data infrastructure leveraging AWS, Azure, or GCP services.
- Ensure data security and governance, enforcing compliance with industry standards and regulations.
- Develop and promote best practices for data modeling, processing, and analytics.
- Mentor and guide a team of data engineers, fostering a culture of innovation and technical excellence.
- Collaborate with stakeholders, including Product, Engineering, and Data Science teams, to support data-driven decision-making.
- Automate and streamline data ingestion, transformation, and analytics processes to enhance efficiency.
- Develop real-time and batch data processing solutions, integrating structured and unstructured data sources (see the consumer sketch below).

What you will bring to Coupa:
We are looking for a candidate with 10+ years of experience in data engineering and application development, with at least 3+ years in a technical lead role, and a graduate degree in Computer Science or a related field of study. They must also bring:
- Experience with programming languages such as Python and Java; expertise in Python is a must.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Expertise in processing and analyzing large data workloads.
- Experience designing and implementing scalable data warehouse solutions to support analytical and reporting needs.
- Experience with API development and design with REST or GraphQL.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases.
- Experience with data pipeline and workflow management tools.
- Experience with AWS cloud services.

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees. Please be advised that inquiries or resumes from recruiters will not be accepted.

By submitting your application, you acknowledge that you have read Coupa's Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
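To illustrate the streaming-ingestion side of the role (Kafka plus Python), here is a minimal, hedged consumer loop using the confluent-kafka client; the broker address, topic and group id are placeholders.

```python
# Minimal Kafka consumer loop for a streaming ingestion path (confluent-kafka client).
# Broker, topic and group id are illustrative placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "ingestion-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])

try:
    while True:
        msg = consumer.poll(1.0)          # wait up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # In a real pipeline, validation/transformation would happen here
        # before writing to the warehouse or lake.
        print(msg.topic(), msg.value().decode("utf-8"))
finally:
    consumer.close()
```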
Posted 2 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
.NET - Technical Project Manager
Location: Bangalore
Experience: 10-15 years

About the Role:
We are looking for an experienced Engineering Manager / Technical Project Manager with deep expertise in .NET technologies to lead high-performing development teams. The ideal candidate should have a strong background in software development, project management, and agile methodologies, with a proven track record of delivering scalable and high-quality software solutions.

Key Responsibilities:
- Lead and manage the end-to-end software development life cycle, from requirements gathering to deployment and support.
- Oversee multiple projects (Fixed Price, T&M) and ensure timely delivery while maintaining high quality.
- Drive agile development processes (Scrum, Kanban) across distributed teams.
- Provide technical leadership in designing, developing, and implementing solutions using .NET, Angular, HTML, CSS, and JavaScript.
- Work closely with Business Analysts, Product Owners, and development teams to align technology with business goals.
- Ensure best practices in microservices architecture, Kafka, MFE, and cloud technologies.
- Oversee CI/CD pipelines, Kubernetes, Docker, and monitoring/logging tools for efficient DevOps practices.
- Proactively identify, troubleshoot, and resolve technical issues while driving performance optimization.
- Mentor and guide team members, fostering a culture of collaboration, innovation, and excellence.

Key Skills & Qualifications:
- 10-15 years of experience in software development, testing, and project management.
- 5+ years of experience in technical project management with expertise in handling complex software projects.
- Hands-on experience in .NET/Java, microservices, Kafka, and MFE.
- Strong knowledge of agile methodologies, CI/CD, Kubernetes, and Docker.
- Proven ability to manage distributed teams across different locations.
- Excellent problem-solving, communication, and collaboration skills.
Posted 2 days ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Technical Project Manager / Engineering Manager – Java
Location: Bangalore (Hybrid/On-site)
Experience: 10-15 Years
About the Role:
We are seeking a Technical Project Manager / Engineering Manager with a strong background in Java development and project leadership to manage and deliver complex, scalable software solutions. This role demands a blend of technical expertise and project management acumen to lead high-performing teams and drive multiple projects to success.
Key Responsibilities:
Lead and manage the end-to-end software development life cycle, from requirements gathering and architecture design to development, deployment, and support.
Manage multiple project types, including Fixed Price and Time & Material (T&M) engagements.
Oversee agile development practices (Scrum/Kanban), sprint planning, and cross-functional team collaboration across distributed locations.
Provide technical leadership in solution design and development using Java, Angular, HTML, CSS, JavaScript.
Align technical solutions with business objectives by working closely with Product Owners, Business Analysts, and Development Teams.
Drive best practices in Microservices Architecture, Kafka, Micro-Frontend (MFE) design, and Cloud Platforms (AWS/Azure).
Lead DevOps initiatives, ensuring robust CI/CD pipelines, containerization using Docker, orchestration with Kubernetes (K8s), and efficient monitoring/logging tools.
Proactively identify and resolve technical bottlenecks and ensure high performance and scalability of software systems.
Foster a culture of excellence through mentorship, technical guidance, and career development of team members.
Key Skills & Qualifications:
10-15 years of experience in software engineering, including development, testing, and delivery.
Minimum 5 years in technical project/engineering management, handling large-scale software projects.
Hands-on expertise in:
Java, Spring Boot
Microservices architecture
Kafka
Micro-Frontend (MFE) frameworks
Strong grasp of:
Agile methodologies (Scrum, Kanban)
CI/CD tools and pipelines
Containerization and orchestration: Docker, Kubernetes
Cloud platforms: AWS / Azure
Proven ability to lead distributed teams and collaborate with global stakeholders.
Excellent problem-solving, communication, and team management skills.
Why Join Us?
Opportunity to lead impactful projects with cutting-edge technologies.
Collaborate with a passionate and skilled technology team.
Competitive compensation and fast-paced career growth.
Culture of innovation, transparency, and continuous learning.
Posted 2 days ago
0.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Description: Analyst Quality Engineering
Will be responsible for software quality engineering and test automation for Service Assurance applications. Required work includes collaborating with system engineers, participating in scrum, designing, creating, and executing test plans and test cases, test automation, and performance testing.
Required Skills:
Strong hands-on experience in software testing of back-end applications and test automation using Robot Framework.
Working experience in Linux/Unix and shell scripting.
Experience in end-to-end test automation using Robot Framework.
Hands-on experience in performance testing of back-end applications using industry-standard tools.
Hands-on experience in programming languages such as Java and Python.
Knowledge of containerization and orchestration with Docker, Helm, and Kubernetes.
Experience with databases (SQL/NoSQL) such as Cassandra, Postgres, MySQL, and Snowflake.
Knowledge of microservices architecture and deployment.
Knowledge of real-time data streaming and messaging solutions like Kafka.
Experience with tools like Maven, Git, Jenkins, and JFrog.
Knowledge of key networking technologies such as 5G Core, RAN, Transport, IP Routing, Ethernet, and Access Wireline Networks.
Experience with data collection approaches using industry-standard methods and open-source capabilities, and with building dashboards in Grafana.
Experience with the TICK stack (Telegraf, InfluxDB, Chronograf, Kapacitor) is a plus.
Experience with the ELK stack (Elasticsearch, Logstash, Kibana) is a plus.
Knowledge of Azure cloud and hands-on experience in deploying and troubleshooting microservices on AKS.
Understanding of key protocols including HTTP, SNMP, DNS, and SSH.
Any prior experience in the telecommunications industry is a plus.
Excellent written and verbal communication.
Overall Experience: 3-6 years in relevant technologies.
Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India
It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
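As an illustration of the kind of back-end test automation this role describes, here is a small pytest-style check that publishes an event to Kafka and verifies it can be read back; it could equally be wrapped as a Robot Framework keyword. The broker address and topic name are placeholders, and the sketch assumes the kafka-python client and a reachable test broker.

```python
# test_kafka_roundtrip.py -- illustrative sketch, not a production test suite
import json
import uuid

from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"            # assumption: local test broker
TOPIC = "service-assurance-events"   # hypothetical topic name


def test_event_roundtrip():
    """Publish a uniquely tagged test event and verify it can be consumed again."""
    marker = str(uuid.uuid4())

    # Produce a single JSON event carrying the unique marker.
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"id": marker, "status": "OK"})
    producer.flush()

    # Consume from the beginning of the topic and stop after a 10s idle timeout.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        auto_offset_reset="earliest",
        consumer_timeout_ms=10000,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    received = [msg.value for msg in consumer]
    consumer.close()

    # The event we just produced should be among the consumed messages.
    assert any(event.get("id") == marker for event in received)
```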
Posted 2 days ago
Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
These cities are known for their thriving tech industries and have a high demand for Kafka professionals.
The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.
Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.
In addition to Kafka expertise, employers often look for professionals with skills in:
- Apache Spark
- Apache Flink
- Hadoop
- Java/Scala programming
- Data engineering and data architecture
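For readers new to the ecosystem, the programming model behind most of these roles comes down to producing and consuming messages. Below is a minimal, illustrative producer using the confluent-kafka Python client; the broker address and topic name are assumptions, not drawn from any posting above.

```python
from confluent_kafka import Producer

# Assumption: a broker reachable on the default local port; topic name is illustrative.
producer = Producer({"bootstrap.servers": "localhost:9092"})


def on_delivery(err, msg):
    """Report whether each message was written to the broker."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")


# Send a handful of keyed messages; delivery reports arrive asynchronously.
for i in range(5):
    producer.produce("page-views", key=str(i), value=f"view-{i}", callback=on_delivery)

producer.flush()  # block until all queued messages are delivered
```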
As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!