6.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Title: Senior Java Developer (Remote)
Experience: 6 to 8 Years
Location: Remote
Employment Type: Full-Time
Notice Period: Immediate Joiner
Job Summary: We are looking for a highly skilled and experienced Senior Java Developer to join our distributed team. The ideal candidate should have a strong background in developing scalable, enterprise-grade applications using Java and related technologies, with exposure to full-stack development, system integration, and performance optimization.
Key Responsibilities:
- Design and develop high-performance, scalable, and reusable Java-based applications.
- Build RESTful APIs with a strong understanding of RESTful architecture.
- Implement enterprise integration patterns using Apache Camel or Spring Integration.
- Ensure application security in compliance with OWASP guidelines.
- Write and maintain unit, integration, and BDD tests using JUnit, Cucumber, and Selenium.
- Conduct performance and load testing; optimize through memory and thread dump analysis.
- Collaborate with product owners, QA teams, and other developers in Agile/Scrum environments.
- Participate in code reviews and architecture discussions, and mentor junior developers.
Technical Skills & Experience Required:
- Core Backend: Strong proficiency in Java (8 or higher); proficient in Spring Boot, Spring Security, Spring MVC, and Spring Data; solid experience with REST API design, implementation, and testing using Postman and SoapUI; unit, integration, and BDD testing.
- Web Services and Integration: Experience with XML, web services (RESTful and SOAP), and Apache CXF; knowledge of enterprise integration patterns; exposure to Apache Camel or Spring Integration.
- Frontend & Full Stack: Familiarity with HTML5 and CSS3; experience with TypeScript, JavaScript, jQuery, and Node.js; working knowledge of Webpack and Gulp.
- Database & Data Streaming: Strong in RDBMS and database design (e.g., Oracle, PL/SQL); exposure to MongoDB and NoSQL; understanding of Kafka architecture and Kafka as a data streaming platform.
- Performance & Security: Experience in performance analysis and application tuning; understanding of security aspects and OWASP guidelines; experience with memory and thread dump analysis.
- Cloud & DevOps: Working knowledge of Kubernetes; familiarity with Elastic solutions at the enterprise level; experience with identity and access management tools such as ForgeRock.
About IGT Solutions: IGT Solutions is a next-gen customer experience (CX) company, defining and delivering transformative experiences for the most innovative global brands using digital technologies. By combining digital and human intelligence, IGT is the preferred partner for managing end-to-end CX journeys across the travel and high-growth tech industries. We have a global delivery footprint spread across 30 delivery centers in China, Colombia, Egypt, India, Indonesia, Malaysia, the Philippines, Romania, South Africa, Spain, the UAE, the US, and Vietnam, with 25,000+ CX and technology experts from 35+ nationalities. IGT's Digital team collaborates closely with our customers' business and technology teams to take solutions to market faster while sustaining quality, focusing on business value, and improving the overall end-customer experience. Our offerings include industry solutions as well as digital services. We work with leading global enterprise customers to improve synergies between business and technology by enabling rapid business value realization leveraging digital technologies.
These include lifecycle transformation and rapid development/technology solution delivery services, delivered leveraging traditional as well as digital technologies, deep functional understanding, and software engineering expertise. IGT is ISO 27001:2013, CMMI SVC Level 5, and ISAE 3402 compliant for IT, and COPC® Certified v6.0, ISO 27001:2013, and PCI DSS 3.2 certified for BPO processes. The organization follows Six Sigma rigor for process improvements. It is our policy to provide equal employment opportunities to all individuals based on job-related qualifications and ability to perform a job, without regard to age, gender, gender identity, sexual orientation, race, color, religion, creed, national origin, disability, genetic information, veteran status, citizenship or marital status, and to maintain a non-discriminatory environment free from intimidation, harassment or bias based upon these grounds.
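As a rough illustration of the "build RESTful APIs with Spring Boot" responsibility above, here is a minimal, self-contained sketch. The resource name, endpoints, and in-memory store are hypothetical and stand in for a real domain model and repository.

```java
// Minimal Spring Boot REST controller sketch; the "orders" resource and its
// fields are hypothetical and only illustrate the REST-with-Spring-Boot stack.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

@SpringBootApplication
@RestController
@RequestMapping("/api/orders")
public class OrderApiApplication {

    // In-memory store standing in for a real repository.
    private final Map<Long, String> orders = new ConcurrentHashMap<>();

    @GetMapping("/{id}")
    public ResponseEntity<String> getOrder(@PathVariable Long id) {
        String order = orders.get(id);
        return order == null ? ResponseEntity.notFound().build() : ResponseEntity.ok(order);
    }

    @PostMapping
    public ResponseEntity<String> createOrder(@RequestBody String payload) {
        long id = orders.size() + 1L;
        orders.put(id, payload);
        return ResponseEntity.status(201).body("created:" + id);
    }

    public static void main(String[] args) {
        SpringApplication.run(OrderApiApplication.class, args);
    }
}
```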
Posted 2 days ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.
Responsibilities:
- Manage end-to-end feature development and resolve challenges faced in implementing it.
- Learn new technologies and apply them to feature development within the time frame provided.
- Manage debugging, perform root cause analysis, and fix the issues reported on the Content Management back-end software system.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Overall, more than 6 years of experience, with more than 4 years of strong hands-on experience in Python and Spark.
- Strong technical ability to understand, design, write and debug applications in Python and PySpark.
- Strong problem-solving skills.
- Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Preferred technical and professional experience:
- Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 days ago
4.0 - 9.0 years
6 - 11 Lacs
Kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
Responsibilities:
- Experience building data pipelines to ingest, process, and transform data from files, streams and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Experience developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform.
- Experience developing streaming pipelines.
- Experience working with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka and cloud computing services.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on AWS.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.
Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
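The role above centres on Spark ingest/transform pipelines on AWS, with PySpark or Scala named as the target languages. As a rough illustration of the read-filter-write pattern it describes, here is a minimal sketch using Spark's structurally equivalent Java API; the S3 paths and column names are hypothetical.

```java
// Spark batch ingest/transform sketch using the Java Dataset API. The posting
// asks for PySpark or Scala; the Java API shown here is structurally equivalent.
// S3 paths and column names are hypothetical placeholders.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

public class IngestJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("raw-to-curated")
                .getOrCreate();

        // Read raw CSV landed in S3, keep valid rows, write curated Parquet.
        Dataset<Row> raw = spark.read()
                .option("header", "true")
                .csv("s3a://example-raw-bucket/events/");

        Dataset<Row> curated = raw
                .filter(col("event_id").isNotNull())
                .withColumnRenamed("ts", "event_time");

        curated.write().mode("overwrite").parquet("s3a://example-curated-bucket/events/");
        spark.stop();
    }
}
```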
Posted 2 days ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Your Role & Responsibilities: Looking to make a significant impact? This is your chance to become a key part of a dynamic team of talented professionals, leading the development and deployment of innovative, industry-leading, cloud-based AI services. We are seeking an experienced AI & Cloud Software Engineer to join us. This role involves designing, developing, and deploying AI-based services. You will be instrumental in problem-solving, automating a wide range of tasks, interfacing with other teams, and solving complex problems.
Responsibilities:
- Develop AI capabilities in IBM Cloud based applications.
- Be an avid coder who can get their hands dirty and be involved in coding at the deepest level.
- Work in an agile environment of continuous delivery. You'll have access to all the technical training courses you need to become the expert you want to be.
- Define all aspects of development, from appropriate technology and workflow to coding standards.
- Collaborate with other professionals to determine functional and non-functional requirements.
- Participate in technical reviews of requirements, specifications, designs, code and other artifacts.
- Learn new skills and adopt new practices readily in order to develop innovative and cutting-edge software products that maintain the company's technical leadership position.
Required education: Bachelor's Degree
Required technical and professional expertise:
- Minimum 7-12 years of experience as a Full Stack Developer with a focus on AI projects.
- Experience with AI and machine learning frameworks such as scikit-learn, TensorFlow, PyTorch, LLMs, and Generative AI; familiarity with AI model deployment and integration.
- Solid understanding of backend technologies, including server-side languages (Node.js, Python, Java, etc.) and databases (Cassandra, PostgreSQL, etc.).
- Understanding of and experience with RESTful APIs, Java/J2EE, Kafka and GitHub.
- Strong experience with cloud technologies, Kubernetes-based microservices architecture, Kafka, object storage, the Cassandra database and Docker container technologies. Knowledge of IBM Cloud technologies will be an added advantage.
- At least 6 years of hands-on development experience building applications with one or more of the following: Java, Spring, Liberty, Node.js, Express.js, Golang, NoSQL DBs, Redis, distributed caches, containers, etc.
- At least 3 years of experience in building and operating highly secured, distributed cloud services with one or more of the following: IBM Cloud, AWS, Azure, SRE, CI/CD, Docker, container orchestration, performance testing, etc.
- At least 3 years of experience in web technologies: HTTP, REST, JSON, HTML, Ajax, JavaScript, etc.
- Solid understanding of microservices architecture and modern cloud programming practices.
- Strong ability to design a clean, developer-friendly API.
- Passionate about constant, continuous learning and applying new technologies, as well as mentoring others.
- Keen troubleshooting skills and strong verbal/written communication skills.
Preferred technical and professional experience:
- Experience in using messaging brokers like RabbitMQ, Kafka, etc.
- Operating systems such as Red Hat, Ubuntu, etc.
- Knowledge of network protocols such as TCP/IP, HTTP, etc.
- Experience and working knowledge of version control systems like GitHub and build tools like Maven/Gradle.
- Ability to learn and apply new technologies quickly.
- Experience working on a SaaS application with high industry-standard CI/CD and development cycle processes.
- Strong sense of ownership of deliverables.
- UI test automation skills - Selenium and/or Puppeteer.
Beyond the requirements, candidates should be passionate about:
- Continuous learning and the ability to adapt to change.
- Working across global teams and collaborating across team and organization boundaries.
- Finding innovative ways to solve complex problems with cutting-edge technologies.
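The backend expectations above include Kafka alongside REST-based microservices. Below is a minimal, hypothetical Kafka consumer using the plain kafka-clients API; the broker address, topic and group id are placeholders, not anything specified by the posting.

```java
// Minimal Kafka consumer using the kafka-clients API; broker address, topic,
// and group id are hypothetical placeholders.
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class EventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "ai-events-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("inference-requests"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand the payload to downstream processing logic here.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```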
Posted 2 days ago
12.0 - 17.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Your Role & Responsibilities: Looking to make a significant impact? This is your chance to become a key part of a dynamic team of talented professionals, leading the development and deployment of innovative, industry-leading, cloud-based AI services. We are seeking an experienced AI & Cloud Software Engineer to join us. This role involves designing, developing, and deploying AI-based services. You will be instrumental in problem-solving, automating a wide range of tasks, interfacing with other teams, and solving complex problems.
Responsibilities:
- Develop AI capabilities in IBM Cloud based applications.
- Be an avid coder who can get their hands dirty and be involved in coding at the deepest level.
- Work in an agile environment of continuous delivery. You'll have access to all the technical training courses you need to become the expert you want to be.
- Define all aspects of development, from appropriate technology and workflow to coding standards.
- Collaborate with other professionals to determine functional and non-functional requirements.
- Participate in technical reviews of requirements, specifications, designs, code and other artifacts.
- Learn new skills and adopt new practices readily in order to develop innovative and cutting-edge software products that maintain the company's technical leadership position.
Required education: Bachelor's Degree
Required technical and professional expertise:
- Full Stack & AI/ML: 7–12 years' experience with AI/ML tools (scikit-learn, TensorFlow, PyTorch, LLMs), model deployment, and full-stack development.
- Backend & APIs: Strong in Java, Python, Node.js, REST APIs, Kafka, and databases like Cassandra and PostgreSQL.
- Cloud & DevOps: Expertise in IBM Cloud/AWS/Azure, Kubernetes, Docker, microservices, CI/CD, and SRE practices.
- Web & Architecture: Proficient in web technologies (HTTP, JSON, HTML, JS) and modern cloud/microservices architecture, with API design skills.
Preferred technical and professional experience:
- Messaging & OS: Experience with Kafka, RabbitMQ, and Linux environments (Red Hat, Ubuntu).
- Networking & Tools: Knowledge of TCP/IP, HTTP protocols, GitHub, Maven/Gradle.
- SaaS & CI/CD: Background in SaaS apps, CI/CD pipelines, and agile development cycles.
- Testing & Automation: Familiarity with UI test tools like Selenium or Puppeteer.
- Mindset: Ownership, adaptability, global collaboration, and eagerness to solve complex problems with new tech.
Posted 2 days ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
As a Senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions and acting as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and its associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.
Your primary responsibilities include:
- Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total 5-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering skills.
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.
Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
Kochi
Work from Office
Develop user-friendly web applications using Java and React.js while ensuring high performance. Design, develop, test, and deploy robust and scalable applications. Build and consume RESTful APIs. Collaborate with the design and development teams to translate UI/UX design wireframes into functional components. Optimize applications for maximum speed and scalability. Stay up to date with the latest Java and React.js trends, techniques, and best practices. Participate in code reviews to maintain code quality and ensure alignment with coding standards. Identify and address performance bottlenecks and other issues as they arise. Help us shape the future of event-driven technologies, including contributing to Apache Kafka, Strimzi, Apache Flink, Vert.x and other relevant open-source projects. Collaborate within a dynamic team environment to comprehend and dissect intricate requirements for event processing solutions. Translate architectural blueprints into actualized code, employing your technical expertise to implement innovative and effective solutions. Conduct comprehensive testing of the developed solutions, ensuring their reliability, efficiency, and seamless integration. Provide ongoing support for the implemented applications, responding promptly to customer inquiries, resolving issues, and optimizing performance. Serve as a subject matter expert, sharing insights and best practices related to product development and fostering knowledge sharing within the team. Continuously monitor the evolving landscape of event-driven technologies, remaining updated on the latest trends and advancements. Collaborate closely with cross-functional teams, including product managers, designers, and developers, to ensure a holistic and harmonious product development process. Take ownership of technical challenges and lead your team to ensure successful delivery, using your problem-solving skills to overcome obstacles. Mentor and guide junior developers, nurturing their growth and development by providing guidance, knowledge transfer, and hands-on training. Engage in agile practices, contributing to backlog grooming, sprint planning, stand-ups, and retrospectives to facilitate effective project management and iteration. Foster a culture of innovation and collaboration, contributing to brainstorming sessions and offering creative ideas to push the boundaries of event processing solutions. Maintain documentation for the developed solutions, ensuring comprehensive and up-to-date records for future reference and knowledge sharing. Be involved in building and orchestrating containerized services.
Required education: Bachelor's Degree
Preferred education: Bachelor's Degree
Required technical and professional expertise:
- Proven 5+ years of experience as a full stack developer (Java and React.js) with a strong portfolio of previous projects.
- Proficiency in Java, JavaScript, HTML, CSS, and related web technologies.
- Familiarity with RESTful APIs and their integration into applications.
- Knowledge of modern CI/CD pipelines and tools like Jenkins and Travis.
- Strong understanding of version control systems, particularly Git.
- Good communication skills and the ability to articulate technical concepts to both technical and non-technical team members.
- Familiarity with containerization and orchestration technologies like Docker and Kubernetes for deploying event processing applications.
- Proficiency in troubleshooting and debugging.
- Exceptional problem-solving and analytical abilities, with a knack for addressing technical challenges.
- Ability to work collaboratively in an agile and fast-paced development environment.
- Leadership skills to guide and mentor junior developers, fostering their growth and skill development.
- Strong organizational and time management skills to manage multiple tasks and priorities effectively.
- Adaptability to stay current with evolving event-driven technologies and industry trends.
- Customer-focused mindset, with a dedication to delivering solutions that meet or exceed customer expectations.
- Creative thinking and an innovation mindset to drive continuous improvement and explore new possibilities.
- Collaborative and team-oriented approach to work, valuing open communication and diverse perspectives.
Preferred technical and professional experience
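The role above involves building and consuming RESTful APIs from Java. As a rough illustration of the consuming side, here is a minimal sketch using the JDK 11+ HttpClient; the URL is a placeholder.

```java
// Consuming a REST endpoint with the JDK 11+ HttpClient; the URL is hypothetical.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestClientExample {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/v1/orders/42"))
                .header("Accept", "application/json")
                .GET()
                .build();

        // Send the request synchronously and print the status and JSON body.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("status=" + response.statusCode());
        System.out.println(response.body());
    }
}
```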
Posted 2 days ago
4.0 - 6.0 years
6 - 8 Lacs
Pune
Work from Office
The Developer leads cloud application development and deployment for clients based on AWS development methodology, tools and best practices. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Strong proficiency in Java, the Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns; strong knowledge of ORM tools like Hibernate or JPA.
- Java-based microservices frameworks; hands-on experience with Spring Boot microservices.
- Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL).
- Experience in container platforms such as Docker and Kubernetes.
- Experience in messaging platforms such as Kafka or IBM MQ.
- Good understanding of Test-Driven Development; familiarity with Ant, Maven or other build automation frameworks; good knowledge of basic UNIX commands.
Preferred technical and professional experience:
- Significant software development experience, including 4-6+ years of experience in web UI application development.
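The posting above asks for ORM experience with Hibernate/JPA inside Spring Boot microservices. A minimal Spring Data JPA sketch follows (an entity plus a repository); the entity fields and query method are hypothetical, and Jakarta Persistence (Spring Boot 3) is assumed.

```java
// Spring Data JPA sketch: an entity plus a derived-query repository.
// Entity name, fields, and table mapping are hypothetical; Spring Boot 3 /
// Jakarta Persistence is assumed.
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;

import java.util.List;

@Entity
class Customer {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    private String name;
    private String city;

    protected Customer() { }  // required by JPA
    Customer(String name, String city) { this.name = name; this.city = city; }

    Long getId() { return id; }
    String getName() { return name; }
    String getCity() { return city; }
}

// Spring Data derives the SQL query from the method name at runtime.
interface CustomerRepository extends JpaRepository<Customer, Long> {
    List<Customer> findByCity(String city);
}
```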
Posted 2 days ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
- Experience building data pipelines to ingest, process, and transform data from files, streams and databases.
- Process data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
- Experience developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform.
- Experience developing streaming pipelines.
- Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka and cloud computing services.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total 5-8 years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering skills.
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on Azure.
- Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB.
- Good to excellent SQL skills.
Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Spark certified developers.
- Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB.
- Knowledge or experience of Snowflake will be an added advantage.
Posted 2 days ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Work with the broader team to build, analyze and improve AI solutions. You will also work with our software developers in consuming different enterprise applications.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 5-7 years of experience.
- Sound knowledge of Python and how to use ML-related services.
- Proficient in Python with a focus on data analytics packages.
- Strategy: Analyse large, complex data sets and provide actionable insights to inform business decisions.
- Strategy: Design and implement data models that help in identifying patterns and trends.
- Collaboration: Work with data engineers to optimize and maintain data pipelines.
- Perform quantitative analyses that translate data into actionable insights and support analytical, data-driven decision-making.
- Identify and recommend process improvements to enhance the efficiency of the data platform.
- Develop and maintain data models, algorithms, and statistical models.
Preferred technical and professional experience:
- Experience with conversation analytics.
- Experience with cloud technologies.
- Experience with data exploration tools such as Tableau.
Posted 2 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Quality Engineering Lead (Test Lead)
Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution.
Must have skills: Automated Testing
Good to have skills: Core Banking, Java, Selenium
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Quality Engineering Lead (Test Lead), you will lead a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will apply business and functional knowledge to develop end-to-end testing strategies using quality processes and methodologies.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead team planning and ecosystem integration
- Develop end-to-end testing strategies
- Define and implement key metrics to manage and assess the testing process
Professional & Technical Skills:
- Must have 5-8 years of experience in API test automation, with a strong focus on developing automated test scripts and frameworks.
- Must have hands-on experience with API testing tools like Postman, Rest Assured or similar tools.
- Must Have Skills: Proficiency in Automated Testing, Selenium, API Testing
- Strong understanding of test automation frameworks
- Must have proficiency in scripting languages like Java, JavaScript, or Python to automate test scripts.
- Expertise in mocking and stubbing APIs using tools like WireMock, MockServer, or other service virtualization tools.
- Hands-on experience with testing, Newman automation, and Karate API automation
- Experience in enhancing/creating BDD automation frameworks for GUI/API.
- Experience with BDD concepts and tools such as Cucumber, Maven, TestNG, etc.
- Good to Have Skills: Experience with Selenium and Core Banking.
- Knowledge of microservices architecture and API interactions, and experience with Docker.
- Experience with Kafka consumer/producer testing
- Ability to create and validate API data for testing purposes.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Automated Testing.
- This position is based at our Gurugram client office; working from Gurugram 3 days/week is mandatory.
- 15 years of full-time education is required.
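The role above requires hands-on Rest Assured experience. A minimal, hypothetical Rest Assured + JUnit 5 test follows; the base URI, path and expected fields are invented for illustration.

```java
// Rest Assured API test sketch; base URI, path, and expected fields are hypothetical.
import io.restassured.RestAssured;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

class AccountApiTest {

    @BeforeAll
    static void setUp() {
        RestAssured.baseURI = "https://api.example-bank.test";
    }

    @Test
    void getAccountReturnsOwnerAndStatus() {
        given()
            .header("Accept", "application/json")
        .when()
            .get("/accounts/1001")
        .then()
            .statusCode(200)
            .body("owner", equalTo("Asha"))
            .body("status", equalTo("ACTIVE"));
    }
}
```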
Posted 2 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Quality Engineering Lead (Test Lead)
Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution.
Must have skills: Automated Testing
Good to have skills: Selenium, Java
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Quality Engineering Lead (Test Lead), you will lead a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will apply business and functional knowledge to develop end-to-end testing strategies using quality processes and methodologies.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead team planning and ecosystem integration
- Develop end-to-end testing strategies
- Define and implement key metrics to manage and assess the testing process
Professional & Technical Skills:
- Must have overall 6-8 years of experience in testing (5+ years of experience in test automation).
- Must have hands-on experience with API testing tools like Postman, Rest Assured or similar tools.
- Must Have Skills: Selenium/WDIO
- Strong understanding of test automation frameworks
- Must have proficiency in scripting languages like Java, JavaScript, or Python to automate test scripts.
- Expertise in mocking and stubbing APIs using tools like WireMock, MockServer, or other service virtualization tools.
- Hands-on experience with testing, Newman automation, and Karate API automation
- Experience in enhancing/creating BDD automation frameworks for GUI/API.
- Experience with BDD concepts and tools such as Cucumber, Maven, TestNG, etc.
- Good to Have Skills: Experience with Selenium and Core Banking.
- Knowledge of microservices architecture and API interactions, and experience with Docker.
- Experience with Kafka consumer/producer testing
- Ability to create and validate API data for testing purposes.
Additional Information:
- The candidate should have a minimum of 6-8 years of experience in Automated Testing.
- This position is based at our Gurugram office; working from Gurugram 3 days/week is mandatory.
- 15 years of full-time education is required.
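The skills above include stubbing APIs with WireMock for service virtualization. A minimal, hypothetical WireMock stub follows; the port, path and response body are placeholders.

```java
// WireMock stub sketch: starts a local mock server and stubs one endpoint so a
// dependent API can be tested in isolation; port, path, and body are hypothetical.
import com.github.tomakehurst.wiremock.WireMockServer;

import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;

public class CustomerServiceStub {
    public static void main(String[] args) {
        WireMockServer server = new WireMockServer(8089);
        server.start();

        // Any GET to /customers/42 now returns a canned JSON payload.
        server.stubFor(get(urlEqualTo("/customers/42"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withHeader("Content-Type", "application/json")
                        .withBody("{\"id\":42,\"name\":\"Asha\"}")));

        // Point the system under test at http://localhost:8089, run the scenario,
        // then call server.stop() when finished.
    }
}
```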
Posted 2 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Quality Engineering Lead (Test Lead)
Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution.
Must have skills: Automated Testing
Good to have skills: Selenium, Core Banking, Java
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Quality Engineering Lead (Test Lead), you will lead a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will apply business and functional knowledge to develop end-to-end testing strategies using quality processes and methodologies.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead team planning and ecosystem integration
- Develop end-to-end testing strategies
- Define and implement key metrics to manage and assess the testing process
Professional & Technical Skills:
- Must have 5-8 years of experience in API test automation, with a strong focus on developing automated test scripts and frameworks.
- Must have hands-on experience with API testing tools like Postman, Rest Assured or similar tools.
- Must Have Skills: Proficiency in Automated Testing, Selenium, API Testing
- Strong understanding of test automation frameworks
- Must have proficiency in scripting languages like Java, JavaScript, or Python to automate test scripts.
- Expertise in mocking and stubbing APIs using tools like WireMock, MockServer, or other service virtualization tools.
- Hands-on experience with testing, Newman automation, and Karate API automation
- Experience in enhancing/creating BDD automation frameworks for GUI/API.
- Experience with BDD concepts and tools such as Cucumber, Maven, TestNG, etc.
- Good to Have Skills: Experience with Selenium and Core Banking.
- Knowledge of microservices architecture and API interactions, and experience with Docker.
- Experience with Kafka consumer/producer testing
- Ability to create and validate API data for testing purposes.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Automated Testing.
- This position is based at our Gurugram client office; working from Gurugram 3 days/week is mandatory.
- 15 years of full-time education is required.
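The skills above include BDD frameworks built with Cucumber. A minimal, hypothetical Cucumber step-definition class in Java follows; the scenario wording and the in-memory "transfer" logic stand in for a real API or service call.

```java
// Cucumber (BDD) step-definition sketch in Java; the scenario wording and the
// balance arithmetic are hypothetical stand-ins for real application calls.
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class TransferSteps {

    private double balance;

    @Given("an account with balance {double}")
    public void anAccountWithBalance(double opening) {
        balance = opening;
    }

    @When("I transfer {double}")
    public void iTransfer(double amount) {
        balance -= amount;  // stand-in for calling the real transfer API
    }

    @Then("the remaining balance is {double}")
    public void theRemainingBalanceIs(double expected) {
        assertEquals(expected, balance, 0.001);
    }
}
```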
Posted 2 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.
Description: United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.
Job Overview and Responsibilities: United Airlines is seeking talented people to join the Data Engineering team. The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus. You will work as a Senior Engineer - Machine Learning and collaborate with data scientists and data engineers to:
- Build high-performance, cloud-native machine learning infrastructure and services to enable rapid innovation across United
- Build complex data ingestion and transformation pipelines for batch and real-time data
- Support large-scale model training and serving pipelines in distributed and scalable environments
This position is offered on local terms and conditions within United's wholly owned subsidiary United Airlines Business Services Pvt. Ltd. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law.
Qualifications
Required:
- BS/BA in Computer Science, Data Science, Engineering, Mathematics, or a related discipline required
- Strong software engineering experience with Python and at least one additional language such as Go, Java, or C/C++
- Familiarity with ML methodologies and frameworks (e.g., PyTorch, TensorFlow) and preferably building and deploying production ML pipelines
- Experience developing cloud-native solutions with Docker and Kubernetes
- Cloud-native DevOps and CI/CD experience using tools such as Jenkins or AWS CodePipeline; preferably experience with GitOps using tools such as ArgoCD, Flux, or Jenkins X
- Experience building real-time and event-driven stream processing pipelines with technologies such as Kafka, Flink, and Spark
- Experience setting up and optimizing data stores (RDBMS/NoSQL) for production use in an ML application context
- Strong desire to stay aligned with the latest developments in cloud-native and ML ops/engineering and to experiment with and learn new technologies
Experience:
- 3+ years of software engineering experience with languages such as Python, Go, Java, Scala, Kotlin, or C/C++
- 2+ years of experience working in cloud environments (AWS preferred)
- 2+ years of experience with Big Data technologies such as Spark and Flink
- 2+ years of experience with cloud-native DevOps and CI/CD
- At least one year of experience with Docker and Kubernetes in a production environment
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English and Hindi (written and spoken)
- Successful completion of an interview is required to meet job qualifications
- Reliable, punctual attendance is an essential function of the position
Preferred:
- Master's in computer science or a related STEM field
GGN00001744
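The role above builds event-driven stream processing pipelines with Kafka. As a rough illustration (the posting is Python-first, but Java is among the listed languages), here is a minimal Kafka producer sketch; the broker, topic, key and payload are hypothetical.

```java
// Minimal Kafka producer sketch for an event-driven pipeline; broker, topic,
// key, and payload are hypothetical placeholders.
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class FeatureEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("feature-events", "flight-123", "{\"delayMinutes\":7}");
            // Asynchronous send with a callback that logs the partition/offset.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("sent to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```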
Posted 2 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Quality Engineering Lead (Test Lead)
Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution.
Must have skills: Automated Testing
Good to have skills: Selenium, Core Banking
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Quality Engineering Lead (Test Lead), you will lead a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will apply business and functional knowledge to develop end-to-end testing strategies using quality processes and methodologies.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead team planning and ecosystem integration
- Develop end-to-end testing strategies
- Define and implement key metrics to manage and assess the testing process
Professional & Technical Skills:
- Must have overall 6-8 years of experience in testing (5+ years of experience in test automation).
- Must have hands-on experience with API testing tools like Postman, Rest Assured or similar tools.
- Must Have Skills: Selenium/WDIO
- Strong understanding of test automation frameworks
- Must have proficiency in scripting languages like Java, JavaScript, or Python to automate test scripts.
- Expertise in mocking and stubbing APIs using tools like WireMock, MockServer, or other service virtualization tools.
- Hands-on experience with testing, Newman automation, and Karate API automation
- Experience in enhancing/creating BDD automation frameworks for GUI/API.
- Experience with BDD concepts and tools such as Cucumber, Maven, TestNG, etc.
- Good to Have Skills: Experience with Selenium and Core Banking.
- Knowledge of microservices architecture and API interactions, and experience with Docker.
- Experience with Kafka consumer/producer testing
- Ability to create and validate API data for testing purposes.
Additional Information:
- The candidate should have a minimum of 6-8 years of experience in Automated Testing.
- This position is based at our Gurugram office; working from Gurugram 3 days/week is mandatory.
- 15 years of full-time education is required.
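The must-have skills above include Selenium/WDIO for UI automation. A minimal, hypothetical Selenium WebDriver sketch in Java follows; the URL and element locators are invented for illustration.

```java
// Selenium WebDriver sketch in Java; the URL and element locators are hypothetical.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class LoginSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            // Drive a simple login flow and read the landing-page heading.
            driver.get("https://portal.example-bank.test/login");
            driver.findElement(By.id("username")).sendKeys("qa_user");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("submit")).click();

            String heading = driver.findElement(By.tagName("h1")).getText();
            System.out.println("Landing page heading: " + heading);
        } finally {
            driver.quit();
        }
    }
}
```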
Posted 2 days ago
170.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Strategy:
- Develop the strategic direction and roadmap for SCPAY, aligning with the business strategy, ITO strategy and investment priorities.
- Tap into the latest industry trends and innovative products and solutions to deliver effective and faster product capabilities.
- Support Cash Management Operations, leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls.
Business:
- Work hand in hand with the Payments business, taking product programs from investment decisions into design, specification, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other SCB teams.
- Ensure delivery to the business meets time, cost and high quality constraints.
- Support respective businesses in growing return on investment, commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations and addressing defects, and continuous improvement of systems.
- Foster an ecosystem of innovation and enable business through technology.
Processes:
- Responsible for the end-to-end delivery of the technology portfolio comprising key business product areas such as Payments & Clearing.
- Own technology delivery of projects and programs across global SCB markets that a) develop/enhance core product capabilities, b) ensure compliance with regulatory mandates, c) support operational improvements, process efficiencies and the zero-touch agenda, and d) build a payments platform aligned with the latest technology and architecture trends, improved stability and scale.
Key Responsibilities
People & Talent:
- Employ, engage and retain high quality talent to ensure the Payments Technology team is adequately staffed and skilled to deliver on business commitments.
- Lead through example and build appropriate culture and values. Set appropriate tone and expectations for the team and work in collaboration with risk and control partners.
- Bridge skill/capability gaps through learning and development.
- Ensure roles, job descriptions and expectations are clearly set and periodic feedback is provided to the entire team.
- Ensure the optimal blend and balance of in-house and vendor resources.
Risk Management:
- Be proactive in ensuring regular assurance that the Payments ITO team is performing to acceptable risk levels and control standards.
- Act quickly and decisively when any risk and control weakness becomes apparent and ensure it is addressed within prescribed timeframes and escalated through the relevant committees.
- Balance business delivery on time, quality and cost constraints with risks and controls to ensure that they do not materially threaten the Group's ability to remain within acceptable risk levels.
- Ensure business continuity and disaster recovery planning for the entire technology portfolio.
Governance:
- Promote an environment of compliance with internal control functions and the external regulatory framework.
Regulatory & Business Conduct:
- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
- Lead the team to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
- Serve as a Director of the Board of [insert name of entities]
- Exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent)
Key Stakeholders:
- Solution Architect - SCPAY
- SCPAY Programme Managers
- Group Payments Product Development Heads
- Group Cash Operations
Skills and Experience:
- Java / Spring Boot
- Kafka Streams, REST, JSON
- Design principles
- Hazelcast & ELK
- Oracle & Postgres
(A minimal Kafka Streams topology sketch follows this listing.)
Qualifications:
- Minimum 10 years of experience in a development role; a couple of years of experience in a dev lead role is an added advantage. Good knowledge of Java, microservices and Spring Boot.
- Technical knowledge: Java / Spring Boot, Kafka Streams, REST, JSON, the Netflix microservices suite (Zuul, Eureka, Hystrix, etc.), 12-Factor Apps, Oracle, PostgreSQL, Cassandra and ELK.
- Ability to work with geographically dispersed and highly varied stakeholders.
- Very good communication and interpersonal skills to manage senior stakeholders and top management.
- Knowledge of JIRA and Confluence tools is desired.
About Standard Chartered: We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term
What We Offer: In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
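The stack listed for this role centres on Java, Spring Boot and Kafka Streams. The sketch below is a minimal, hypothetical Kafka Streams topology, not SCPAY's actual implementation; the topic names and the routing rule are invented for illustration.

```java
// Kafka Streams topology sketch matching the Java + Kafka Streams stack listed
// above; topic names and the priority-routing rule are hypothetical.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class PaymentRoutingTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-routing-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> payments = builder.stream("payments-in");

        // Route high-priority payments to a dedicated topic; the rest pass through.
        payments.filter((key, value) -> value.contains("\"priority\":\"HIGH\""))
                .to("payments-priority");
        payments.filterNot((key, value) -> value.contains("\"priority\":\"HIGH\""))
                .to("payments-standard");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```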
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Java Full Stack
Experience: 7+ yrs
Location options: Chennai / Pune - work from office all 5 days
Tech stack required: Java (version 8+), Spring Boot, microservices with any database and any messaging queue (preferably Kafka), AND Angular 2+ versions (not AngularJS).
Java Full Stack: We are looking for Full Stack Developers who can continue to develop and enhance our platform to meet our client needs. In this role, you will work with our stakeholders, senior engineers and Product team to understand business requirements, architect technology solutions to solve the problems, and build out the solutions. Our middle office platform's current tech stack includes Java, Angular, Spring Boot, Docker, Ruby, and the Rails application framework. We use Postgres/Oracle as our RDBMS and IBM MQ/Kafka for messaging.
Requirements:
- A Bachelor's degree in Computer Science, Engineering, or a related discipline with 5+ years of work experience.
- Strong fundamentals in data structures, algorithms, and object-oriented design.
- Proficiency in Java 17 or higher and front-end UI technologies.
- Strong experience in the Spring Framework and Hibernate, and proficiency with Spring Boot.
- Experience in Angular 11 or higher, JavaScript frameworks, CSS, HTML.
- Experience with and a good understanding of messaging frameworks like IBM MQ/Kafka.
- Experience in test-driven and behavior-driven development.
- Experience with Agile software development methodologies, tools and processes.
- Knowledge of architectural patterns including microservices architecture.
- Knowledge of the securities or financial services domain is a plus.
Job responsibilities:
- Work within a scrum team of 8+ people highly focused on service delivery, resiliency and interoperability with other services in the middle office platform.
- Consult and collaborate with other technologists to leverage and contribute to reusable code and services.
- Develop subject matter expertise in one or more functional areas.
- Drive the design of scalable, high-performing and robust applications and represent the software in design/code reviews with senior staff.
- Help the tech leadership team shape best practices for developing, sharing and continuously improving our software platform.
Apply: shruthi@letzbizz.com
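The platform described above pairs Spring Boot with IBM MQ/Kafka messaging. A minimal, hypothetical spring-kafka listener sketch follows; the topic and group id are placeholders, and broker settings are assumed to come from application properties.

```java
// Spring Boot + spring-kafka listener sketch for a messaging-backed service;
// the topic and group id are hypothetical placeholders.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class TradeEventsApplication {
    public static void main(String[] args) {
        SpringApplication.run(TradeEventsApplication.class, args);
    }
}

@Component
class TradeEventListener {

    // spring-kafka wires the consumer from application properties
    // (spring.kafka.bootstrap-servers, deserializers, and so on).
    @KafkaListener(topics = "trade-events", groupId = "middle-office-sketch")
    public void onTradeEvent(String payload) {
        // Stand-in for real processing of the incoming message.
        System.out.println("received: " + payload);
    }
}
```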
Posted 2 days ago
5.0 - 7.0 years
7 - 9 Lacs
Hyderabad, Ahmedabad
Work from Office
5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales).
- Create and manage cloud resources in AWS.
- Ingest data from different data sources which expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems.
- Implement data ingestion and processing with the help of Big Data technologies.
- Production experience with HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, Amazon Web Services (AWS), Docker/Kubernetes, Snowflake.
- Data processing/transformation using various technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform.
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations.
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible.
- Identify and interpret trends and patterns from complex data sets.
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams.
- Proficient at developing queries, writing reports and presenting findings.
- Mentor junior members and bring best industry practices.
- Sound knowledge of AWS Glue, AWS Lambda, Python, PySpark.
Posted 2 days ago
3.0 years
0 Lacs
Delhi, India
On-site
Responsibilities
Design, build, and maintain scalable data pipelines and streaming systems.
Develop real-time WebSocket and API integrations for data ingestion and delivery.
Manage and optimize relational and non-relational databases for performance, reliability, and scalability.
Collaborate with AI and product teams to build quantitative models and serve them in production environments.
Design, build, and maintain robust ETL/ELT pipelines for ingesting and transforming on-chain and off-chain data.
Build a real-time data streaming infrastructure using tools like Kafka or equivalent.
Architect and optimize relational and non-relational databases to support complex queries and financial data models.
Collaborate with product and analytics teams to design and deploy quantitative models for TVL, yield tracking, protocol metrics, etc.
Implement tools and practices to ensure data integrity, quality, and observability across the platform.
Contribute to our indexing infrastructure, working with smart contract data, subgraphs, or custom indexers.
Requirements
3+ years of experience in data engineering, backend systems, or infrastructure roles.
Strong knowledge of databases (SQL, NoSQL) and experience with data modeling and schema design.
Proficient with PostgreSQL, TimescaleDB, or other time-series/analytical databases.
Hands-on experience with stream processing frameworks (Kafka, Flink, etc.).
Expertise in building and consuming RESTful APIs and WebSocket protocols.
Familiarity with blockchain data or financial data.
Strong programming skills in Python or Go.
Experience with quantitative finance modeling, DeFi metrics, or financial KPIs is a strong plus.
Solid understanding of cloud infrastructure (e.g., AWS, GCP, or similar).
Nice To Have
Experience with subgraphs, The Graph, or building custom blockchain indexers.
Background in data visualization platforms or interactive dashboards.
Knowledge of DeFi protocols, tokenomics, and governance systems.
Prior experience working in a fast-paced startup or early-stage product environment.
This job was posted by Utsav Agarwal from Sharpe Labs.
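For illustration only: the posting itself asks for Python or Go, but the streaming-consumption pattern it describes looks roughly like the minimal consumer loop below, written here in Java with the plain Kafka client to stay consistent with the rest of this page. The broker address, topic, and consumer group are hypothetical placeholders.

```java
// Illustrative only: a minimal poll loop with the plain Kafka Java client.
// Broker, topic, and group id are hypothetical; a real pipeline would validate and persist records.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PriceFeedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");
        props.put("group.id", "price-feed-ingestors");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("price-feed"));
            while (true) {
                // Poll for new records and print them; real code would write to a database or downstream topic.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```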
Posted 2 days ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are seeking a talented Full Stack Developer experienced in Java, Kotlin, Spring Boot, Angular, and Apache Kafka to join our dynamic engineering team. The ideal candidate will design, develop, and maintain end-to-end web applications and real-time data processing solutions, leveraging modern frameworks and event-driven architectures.
Location: Noida/Pune/Bangalore/Hyderabad/Chennai
Timings: 2 pm to 11 pm
Experience: 4-6 Years
Key Responsibilities
Design, develop, and maintain scalable web applications using Java, Kotlin, Spring Boot, and Angular.
Build and integrate RESTful APIs and microservices to connect frontend and backend components.
Develop and maintain real-time data pipelines and event-driven features using Apache Kafka.
Collaborate with cross-functional teams (UI/UX, QA, DevOps, Product) to define, design, and deliver new features.
Write clean, efficient, and well-documented code following industry best practices and coding standards.
Participate in code reviews, provide constructive feedback, and ensure code quality and consistency.
Troubleshoot and resolve application issues, bugs, and performance bottlenecks in a timely manner.
Optimize applications for maximum speed, scalability, and security.
Stay updated with the latest industry trends, tools, and technologies, and proactively suggest improvements.
Participate in Agile/Scrum ceremonies and contribute to continuous integration and delivery pipelines.
Required Qualifications
Experience with cloud-based technologies and deployment (Azure, GCP).
Familiarity with containerization (Docker, Kubernetes) and microservices architecture.
Proven experience as a Full Stack Developer with hands-on expertise in Java, Kotlin, Spring Boot, and Angular (Angular 2+).
Strong understanding of object-oriented and functional programming principles.
Experience designing and implementing RESTful APIs and integrating them with frontend applications.
Proficiency in building event-driven and streaming applications using Apache Kafka.
Experience with database systems (SQL/NoSQL), ORM frameworks (e.g., Hibernate, JPA), and SQL.
Familiarity with version control systems (Git) and CI/CD pipelines.
Good understanding of HTML5, CSS3, JavaScript, and TypeScript.
Experience with Agile development methodologies and working collaboratively in a team environment.
Excellent problem-solving, analytical, and communication skills.
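As a hedged illustration of the "event-driven features using Apache Kafka" responsibility above (not part of the posting), a consumer in this stack might look like the spring-kafka listener sketched below. The topic name, group id, and handling logic are hypothetical.

```java
// Illustrative sketch only: an event-driven consumer using spring-kafka.
// Topic name, group id, and handling logic are hypothetical placeholders.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class PaymentEventListener {

    private static final Logger log = LoggerFactory.getLogger(PaymentEventListener.class);

    // Invoked for every record arriving on the (hypothetical) "payment-events" topic.
    @KafkaListener(topics = "payment-events", groupId = "payments-service")
    public void onPaymentEvent(String payload) {
        // A real implementation would deserialize the payload and update application state.
        log.info("Received payment event: {}", payload);
    }
}
```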
Posted 2 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Chief Technology Officer (CTO)
Role Overview: We are seeking a visionary Chief Technology Officer to lead our technology function and drive the development of innovative AdTech solutions. In this leadership role, you will define and implement the company's technical strategy while overseeing engineering, data science, and product technology teams. Your focus will be on building scalable, high-performance platforms, including RTB, DSP, and SSP systems.
Key Responsibilities:
Develop and execute a forward-looking technology roadmap aligned with business goals.
Lead cross-functional teams in engineering and product development.
Architect and manage real-time bidding systems, data infrastructure, and platform scalability.
Drive innovation in AI/ML, big data, and real-time analytics.
Ensure system reliability, security, DevOps, and data privacy best practices.
Collaborate with leadership to deliver impactful tech-driven products.
Represent the company in technical partnerships and industry events.
Requirements:
10+ years in software engineering, with 5+ years in a leadership role.
Strong background in AdTech (RTB, DSP, SSP, OpenRTB).
Expertise in AI/ML, cloud (AWS/GCP), and big data (Kafka, Spark, Hadoop).
Proven experience in building scalable backend systems and leading high-performing teams.
Bachelor's or Master's in Computer Science or Engineering; MBA/PhD is a plus.
Posted 2 days ago
6.0 - 12.0 years
0 Lacs
Greater Bengaluru Area
On-site
Job Title: Node.js Developer
Years of Experience: 6 to 12 years
Notice period: Immediate to 30 Days
Location: Bangalore, Chennai, Dubai
Work Mode: WFO
About Us: We prioritize our employees, fostering a collaborative and inclusive culture. Our mission is to empower our team while delivering exceptional solutions that enhance business performance and user experiences. GenAI Product Development | Digital Technology Solutions | ValueLabs
Key Responsibilities:
Design and develop scalable, high-performance Node.js applications.
Develop and deploy RESTful APIs using Node.js, Express.js, and related technologies.
Collaborate with cross-functional teams to identify business requirements and develop solutions.
Troubleshoot and resolve technical issues related to Node.js applications.
Stay up to date with the latest Node.js technologies and best practices.
Required:
Minimum 5 years of coding experience in Node.js, JavaScript, and databases.
At least 1 year of hands-on experience in TypeScript.
Hands-on experience in performance tuning, debugging, and monitoring.
Technical Skills:
Excellent knowledge of developing scalable and highly available RESTful APIs using Node.js technologies.
Practical experience with GraphQL.
Well versed in CI/CD principles, and actively involved in solving and troubleshooting issues in a distributed services ecosystem.
Understanding of containerization; experienced with Docker and Kubernetes.
Exposure to API gateway integrations such as 3scale.
Understanding of Single Sign-On and token-based authentication (REST, JWT, OAuth).
Expert knowledge of task/message queues, including but not limited to AWS, Microsoft Azure, Pushpin, and Kafka.
Thanks,
Monica P
Posted 2 days ago
Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
Hiring is concentrated in the locations that appear throughout these listings, such as Bengaluru, Hyderabad, Chennai, Pune, and the Delhi NCR (Gurugram/Noida). These cities are known for their thriving tech industries and have a high demand for Kafka professionals.
The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.
Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.
In addition to Kafka expertise, employers often look for professionals with skills in:
- Apache Spark
- Apache Flink
- Hadoop
- Java/Scala programming
- Data engineering and data architecture
As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!