7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
You are invited to join NTT DATA as a Software Development Senior Specialist in Chennai, Tamil Nadu, India. NTT DATA is dedicated to hiring exceptional individuals who are innovative and passionate about their work. If you are looking to be part of an inclusive and forward-thinking organization, this opportunity is for you.

As a Software Development Senior Specialist, you will play a crucial role in the development, testing, and maintenance of software applications and systems. Your responsibilities will include leading the planning and design of product and technical initiatives, mentoring developers and team members, and driving improvements in engineering techniques and processes. Collaboration is key in this role: you will work closely with team members to ensure high-quality deliverables that meet performance standards, engage with key internal stakeholders to understand user requirements, prepare low-level design documents, and participate in Agile planning and estimation activities to keep projects on schedule.

To thrive in this position, you should have at least 7 years of experience developing Java and microservices applications. Excellent communication skills, hands-on coding experience, and a background in the banking domain are essential; a BE/B.Tech graduate with English proficiency is preferred. Additionally, you will be responsible for designing and maintaining robust Java-based backend systems, building real-time messaging pipelines, creating responsive front-end interfaces, and managing relational databases. Staying updated with emerging technologies and collaborating effectively with distributed teams are also crucial aspects of this role.

NTT DATA is a trusted global innovator in business and technology services, serving Fortune Global 100 companies. As part of the NTT Group, we are committed to helping clients innovate, optimize, and transform for long-term success. If you are ready to contribute your expertise to a dynamic and collaborative engineering team, we encourage you to apply for this exciting opportunity.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
noida, uttar pradesh
On-site
You will be responsible for architecting, designing, and leading the development of robust middleware solutions at Coforge. Your expertise in Apache Camel, Kafka, and Spring Boot, combined with a strong understanding of enterprise integration patterns, microservices architecture, and cloud-native development, will be crucial in this role.

You will lead the design and implementation of middleware solutions, architect scalable integration platforms, and collaborate with cross-functional teams to translate requirements into technical solutions. Furthermore, you will define and enforce best practices for middleware development, mentor and guide a team of developers, optimize performance, troubleshoot integration issues, and ensure the security, scalability, and maintainability of middleware components. Staying updated with emerging technologies and proposing innovative solutions will also be part of your responsibilities.

Required Skills & Qualifications:
- Proven experience of 8+ years in middleware and integration technologies.
- Strong hands-on experience with Apache Camel, including routes, processors, and components.
- Expertise in Apache Kafka, covering producers, consumers, topics, and stream processing.
- Proficiency in Spring Boot and the related Spring ecosystem, such as Spring Cloud and Spring Integration.
- Solid understanding of RESTful APIs, JSON/XML, and message transformation.
- Experience with containerization tools like Docker and Kubernetes, as well as CI/CD pipelines.
- Familiarity with cloud platforms such as AWS, Azure, and GCP, and knowledge of cloud-native design.
- Understanding of enterprise integration patterns (EIP) and event-driven architecture.
- Excellent problem-solving, communication, and leadership skills.

Preferred Qualifications:
- Experience with API Gateways like Kong or Apigee.
- Knowledge of monitoring tools such as Prometheus, Grafana, and the ELK stack.
- Exposure to DevOps practices and Infrastructure as Code tools like Terraform and Ansible.
- Certifications in relevant technologies or cloud platforms.
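Since the role centres on enterprise integration patterns (EIP), here is a minimal content-based router sketch in Python. The message fields (`type`, `payload`) and queue names are invented for illustration; in Apache Camel the same idea is typically expressed with a `choice()`/`when()` route.

```python
# Minimal content-based router sketch (EIP). Message fields, handler names,
# and queue names are illustrative, not from any specific system.
def route(message, handlers, default=None):
    """Dispatch a message to the handler registered for its 'type' field."""
    handler = handlers.get(message.get("type"), default)
    if handler is None:
        raise ValueError(f"no route for message type {message.get('type')!r}")
    return handler(message)

# Example handlers: each returns the queue it would forward the payload to.
handlers = {
    "order":   lambda m: ("orders-queue", m["payload"]),
    "invoice": lambda m: ("billing-queue", m["payload"]),
}

# Unroutable messages fall through to a dead-letter channel.
dead_letter = lambda m: ("dead-letter-queue", m.get("payload"))
```

The dead-letter default mirrors the EIP "dead letter channel" pattern: rather than dropping an unrecognized message, the router parks it for later inspection.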
Posted 2 weeks ago
7.0 - 10.0 years
27 - 32 Lacs
Pune
Hybrid
Job Title: Big Data Developer
Job Location: Pune
Experience: 7+ years
Job Type: Hybrid

Required skills:
- Strong skills in messaging technologies like Apache Kafka or equivalent.
- Programming skills in Scala and Spark (with optimization techniques) and Python; should be able to write queries through a Jupyter Notebook.
- Orchestration tools like NiFi or Airflow.
- Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics.
- Experience with SQL and distributed systems; strong understanding of cloud architecture.
- Ensure a high-quality code base by writing and reviewing performant, well-tested code.
- Demonstrated experience building complex products.
- Knowledge of Splunk or other alerting and monitoring solutions.
- Fluent in the use of Git and Jenkins.
- A broad understanding of software engineering concepts and methodologies is required.
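Orchestration tools such as Airflow or NiFi ultimately guarantee that pipeline tasks run in dependency order. A toy sketch of that guarantee, with invented task names (not from any real pipeline), using only Python's standard library:

```python
# Toy scheduler sketch: compute a valid execution order for a task graph,
# as an orchestrator such as Airflow would. Task names are invented.
from graphlib import TopologicalSorter

deps = {
    "load_raw":  set(),            # no upstream dependencies
    "clean":     {"load_raw"},     # runs after load_raw
    "aggregate": {"clean"},
    "publish":   {"aggregate", "clean"},
}

# static_order() yields tasks so every task appears after its dependencies.
order = list(TopologicalSorter(deps).static_order())
```

A real DAG run adds scheduling, retries, and parallelism on top, but the dependency contract is the same.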
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Software Engineer specializing in Java for App2App Integration, you will play a crucial role in the Business Data Cloud Foundation Services team at SAP. Your primary responsibility will be to develop robust and scalable integration mechanisms that facilitate seamless data movement and real-time interoperability across SAP's business applications and the data fabric. By leveraging your expertise in Java, RESTful APIs, Apache Kafka, DevOps, BTP, and Hyperscaler ecosystems, you will contribute to the end-to-end development of services and pipelines supporting distributed data processing, data transformations, and intelligent automation.

Your key responsibilities will include developing App2App integration components and services using Java and messaging frameworks, collaborating with cross-functional teams to ensure secure and performant communication across SAP applications, building and maintaining distributed data processing pipelines, and working closely with DevOps to enhance CI/CD pipelines and deployment strategies. Additionally, you will contribute to the platform's reliability, scalability, and security by implementing automated testing, logging, and telemetry, as well as supporting cloud-native deployment on SAP BTP and major Hyperscalers.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, and have at least 5 years of hands-on experience in backend development using Java. Strong object-oriented design skills, integration patterns, and familiarity with Apache Kafka or similar messaging systems in distributed environments are essential. Experience with SAP BTP, SAP Datasphere, SAP Analytics Cloud, or HANA would be highly advantageous, along with knowledge of CI/CD pipelines, containerization, Kubernetes, and DevOps best practices.

Furthermore, your passion for clean code, automated testing, performance tuning, and continuous improvement, coupled with strong communication skills and the ability to collaborate effectively with global teams, will be key to your success in this role. Joining the Foundation Services team within the Business Data Cloud organization at SAP will offer you a collaborative, inclusive, and high-impact environment in Bangalore, India, where you can drive cutting-edge engineering efforts and contribute to SAP's Data & AI strategy.

At SAP, we believe in fostering a culture of inclusion, promoting health and well-being, and offering flexible working models to ensure that every individual, regardless of background, feels valued and empowered to perform at their best. We are committed to creating a diverse and equitable workplace where all talents are unleashed and every employee has the opportunity to realize their full potential. By joining SAP, you will be part of a purpose-driven and future-focused company that values collaboration, personal development, and innovation, ultimately striving to create a better and more equitable world.
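The messaging-based App2App integration described above rests on publish/subscribe decoupling: producers emit events to a topic without knowing who consumes them. Below is a deliberately simplified in-memory sketch in Python; a real broker such as Apache Kafka adds partitioning, persistence, and consumer groups, and the topic name here is invented.

```python
# In-memory publish/subscribe sketch illustrating the decoupling a broker
# such as Apache Kafka provides between producers and consumers.
from collections import defaultdict

class Bus:
    """Toy topic-based event bus: subscribers never see the publisher."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, event):
        # Fan out the event to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(event)

bus = Bus()
received = []
bus.subscribe("sales.orders", received.append)   # a consumer
bus.publish("sales.orders", {"id": 1})           # a producer
```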
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
You are an experienced backend developer with 5.5+ years of total experience. You have extensive knowledge in back-end development using Java 8 or higher, the Spring Framework (Core/Boot/MVC), Hibernate/JPA, and WebFlux. Your expertise includes a good understanding of Data Structures, Object-Oriented Programming, and Design Patterns. You are well-versed in REST APIs and Microservices Architecture and proficient in working with relational and NoSQL databases, preferably PostgreSQL and MongoDB. Experience with CI/CD tools such as Jenkins, GoCD, or CircleCI is essential. You are familiar with test automation tools like xUnit, Selenium, or JMeter, and have hands-on experience with Apache Kafka or similar messaging technologies. Exposure to automated testing frameworks, performance testing tools, containerization tools like Docker, orchestration tools like Kubernetes, and cloud platforms, preferably Google Cloud Platform (GCP), is required. You have a strong understanding of UML and design patterns, excellent problem-solving skills, and a continuous improvement mindset. Effective communication and collaboration with cross-functional teams are key strengths of yours.

Your responsibilities include writing and reviewing high-quality code, thoroughly understanding functional requirements, and analyzing clients' needs. You should be able to envision the overall solution for defined functional and non-functional requirements, determine and implement design methodologies and tool sets, and lead or support UAT and production rollouts. Creating, understanding, and validating the WBS and estimated effort for a given module or task, addressing issues promptly, giving constructive feedback to team members, troubleshooting and resolving complex bugs, and providing solutions during code and design reviews are part of your daily tasks. Additionally, you are expected to carry out POCs to ensure that suggested designs and technologies meet the requirements.

You hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Posted 2 weeks ago
13.0 - 20.0 years
30 - 45 Lacs
Pune
Hybrid
Hi, wishes from GSN! Pleasure connecting with you!

We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT and non-IT clients in India, and have been successfully meeting the varied needs of our clients for the last 20 years. At present, GSN is hiring a DATA ENGINEERING - Solution Architect for one of our leading MNC clients. Please find below the details:

1. WORK LOCATION: PUNE
2. Job Role: DATA ENGINEERING - Solution Architect
3. EXPERIENCE: 13+ yrs
4. CTC Range: Rs. 35 LPA to Rs. 50 LPA
5. Work Type: WFO Hybrid

****** Looking for SHORT JOINERS ******

Job Description:

Who are we looking for:

Architectural Vision & Strategy: Define and articulate the technical vision, strategy, and roadmap for Big Data, data streaming, and NoSQL solutions, aligning with overall enterprise architecture and business goals.

Required Skills:
- 13+ years of progressive experience in software development, data engineering, and solution architecture roles, with a strong focus on large-scale distributed systems.
- Expertise in Big Data technologies:
  - Apache Spark: deep expertise in Spark architecture, Spark SQL, Spark Streaming, performance tuning, and optimization techniques, with experience in both batch and real-time data processing paradigms.
  - Hadoop ecosystem: strong understanding of HDFS, YARN, Hive, and other related Hadoop components.
- Real-time data streaming (Apache Kafka): expert-level knowledge of Kafka architecture, topics, partitions, producers, consumers, Kafka Streams, KSQL, and best practices for high-throughput, low-latency data pipelines.
- NoSQL databases: in-depth experience with Couchbase, MongoDB, or Cassandra, including data modeling, indexing, querying (e.g., N1QL), replication, scaling, and operational best practices.
- API design & development: extensive experience designing and implementing robust, scalable, and secure APIs (RESTful, GraphQL) for data access and integration.
- Programming & code review: hands-on coding proficiency in at least one relevant language (Python, Scala, Java), with a preference for Python and/or Scala for data engineering tasks, and proven experience leading and performing code reviews to ensure code quality, performance, and adherence to architectural guidelines.
- Cloud platforms: extensive experience designing and implementing solutions on at least one major cloud platform (AWS, Azure, GCP), leveraging their Big Data, streaming, and compute services.
- Database fundamentals: solid understanding of relational database concepts, SQL, and data warehousing principles.
- System design & architecture patterns: deep knowledge of various architectural patterns (e.g., Microservices, Event-Driven Architecture, Lambda/Kappa Architecture, Data Mesh) and their application in data solutions.
- DevOps & CI/CD: familiarity with DevOps principles, CI/CD pipelines, infrastructure as code (IaC), and automated deployment strategies for data platforms.

****** Looking for SHORT JOINERS ******

If interested, don't hesitate to call NAK @ 9840035825 / 9244912300 for an IMMEDIATE response.

Best,
ANANTH | GSN | Google review: https://g.co/kgs/UAsF9W
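Part of the Kafka expertise this role asks for is understanding why records with the same key preserve their order: the producer maps each key to a fixed partition, so one consumer sees that key's records in sequence. A simplified sketch of that mapping (Kafka's default partitioner actually uses murmur2 hashing; CRC32 is used here only to keep the sketch to the standard library):

```python
# Sketch of key-based partition assignment: records with the same key land
# on the same partition, which is what gives Kafka per-key ordering.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Deterministically map a record key to a partition index."""
    return zlib.crc32(key) % num_partitions

# Two records for the same customer always share a partition.
p1 = partition_for(b"customer-42", 6)
p2 = partition_for(b"customer-42", 6)
```

Note the flip side, which matters when scaling: changing `num_partitions` changes the mapping, so repartitioning a topic breaks the old key-to-partition assignment.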
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior ETL & Data Streaming Engineer at DataFlow Group, a global provider of Primary Source Verification solutions and background screening services, you will be a key player in the design, development, and maintenance of robust data pipelines. With over 10 years of experience, you will leverage your expertise in both batch ETL processes and real-time data streaming technologies to ensure efficient data extraction, transformation, and loading into our Data Lake and Data Warehouse.

Your responsibilities will include designing and implementing highly scalable ETL processes using industry-leading tools, as well as architecting batch and real-time data streaming solutions with technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into effective pipeline solutions, ensuring data quality, integrity, and security across all storage solutions. Monitoring, troubleshooting, and optimizing existing data pipelines for performance, cost-efficiency, and reliability will be a crucial part of your role. Additionally, you will develop comprehensive documentation for all ETL and streaming processes, contribute to data governance policies, and mentor junior engineers to foster a culture of technical excellence and continuous improvement.

To excel in this position, you should have 10+ years of progressive experience in data engineering, with a focus on ETL, ELT, and data pipeline development. Deep expertise in ETL tools like Talend, proficiency in data streaming technologies such as AWS Glue and Apache Kafka, and extensive experience with AWS data services like S3, Glue, and Lake Formation will be essential. Strong knowledge of traditional data warehousing concepts, dimensional modeling, programming languages like SQL and Python, and relational and NoSQL databases will also be required.

If you are a problem-solver with excellent analytical skills, strong communication abilities, and a passion for staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming, we invite you to join our team at DataFlow Group and make a significant impact in the field of data management.
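A batch ETL pipeline of the kind described reduces to three stages: extract rows from a source, transform them (cleaning and type conversion), and load them into a target. The sketch below is a minimal illustration with invented field names and an in-memory dictionary standing in for the warehouse; real pipelines would use a tool such as Talend or AWS Glue.

```python
# Minimal batch ETL sketch. All field names and the in-memory "warehouse"
# are invented for illustration only.
def extract():
    """Pretend source system: raw, messy rows."""
    return [
        {"name": " Alice ", "country": "in", "score": "91"},
        {"name": "Bob",     "country": "IN", "score": "78"},
    ]

def transform(rows):
    """Clean whitespace, normalize country codes, cast scores to int."""
    return [
        {"name": r["name"].strip(),
         "country": r["country"].upper(),
         "score": int(r["score"])}
        for r in rows
    ]

def load(rows, warehouse):
    """Upsert rows into the target store, keyed by name."""
    for r in rows:
        warehouse[r["name"]] = r
    return warehouse

warehouse = load(transform(extract()), {})
```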
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Principal AI Software Engineer, you will be responsible for designing, developing, and integrating AI-first microservices, APIs, and workflows to create intelligent systems for cutting-edge digital banking and financial technology solutions. Collaborating with AI architects and full-stack teams, you will embed intelligent automation, real-time analytics, and explainable AI into production environments. Your expertise will be crucial in traditional backend software development and in designing APIs and microservices that consume, host, or orchestrate AI/ML workloads.

Key responsibilities include developing scalable, distributed microservices using Java and Python with embedded AI/ML capabilities, building RESTful and GraphQL APIs for AI-driven features like fraud detection and KYC automation, managing data using MySQL and MongoDB, integrating AI/ML models using various tools, and implementing real-time pipelines with Apache Kafka and Redis Streams. Additionally, you will align DevOps practices, leverage AWS services, develop APIs with authentication protocols, and collaborate with cross-functional teams.

In terms of technical skills, you are expected to have at least 7 years of backend software development experience and proficiency in Java, Python, microservices, and API design. Experience with MySQL, MongoDB, GraphQL, RESTful API development, secure financial systems, AI/ML model integration in production, Apache Kafka, Redis, AWS, Azure, or GCP services, containerization tools, CI/CD pipelines, and modern frontend frameworks is required. Preferred experience includes real-time AI API deployment, AI applications in FinTech, working with data scientists and AI researchers, and exposure to Southeast Asia FinTech products.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a seasoned and adaptable Senior Software Engineer / Technical Lead with 8-12 years of experience in software development, you will play a crucial role in designing and constructing scalable, resilient, and high-performance systems. Your expertise in Java or .NET, profound understanding of microservices architecture, and practical experience with streaming platforms, databases, and a test-first development mindset will be invaluable in this role.

Your responsibilities will include designing, developing, and maintaining enterprise-grade applications using Java or .NET frameworks. You will also architect and implement microservices and REST APIs to ensure modularity, scalability, and performance. Working with relational (RDBMS) and big data technologies to handle large-scale datasets, as well as integrating and leveraging streaming platforms like Apache Kafka for real-time data processing, will be part of your daily tasks. Applying robust software design principles and adhering to test-first/TDD approaches to deliver clean and maintainable code is essential, as is collaborating closely with UI/UX and front-end teams to guarantee a seamless end-to-end product experience. Additionally, you will have the opportunity to lead or contribute to code reviews, architecture discussions, and mentorship of junior engineers while staying updated with emerging technologies and remaining open to adopting new tools, languages, or frameworks as required.

To be successful in this role, you should possess 3-6 years of hands-on software development experience along with a strong command of Java or .NET technologies and related ecosystems. Experience with RDBMS (e.g., MySQL, PostgreSQL), big data platforms (e.g., Hadoop, Spark), Apache Kafka or similar streaming technologies, and software architecture patterns, particularly microservices, is crucial. Proficiency in RESTful services and API design, familiarity with UI technologies (e.g., JavaScript, Angular, React), and demonstrated use of test-first methodologies (TDD, BDD, unit testing frameworks) are also required. Excellent problem-solving and communication skills, along with the ability to quickly learn and adapt to new technologies and frameworks, are essential for this role.

While not mandatory, experience with cloud platforms such as AWS, Azure, or GCP, exposure to DevOps practices and CI/CD tools, and a background in containerization (Docker, Kubernetes) would be considered advantageous.

We are committed to supporting your needs for any adjustments during the application and hiring process. If you require special assistance or accommodation to use our website, apply for a position, or perform a job, please contact us at accommodationrequests@maersk.com.
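The test-first mindset this role calls for is easy to illustrate: write the assertion before the implementation, then add the minimal code that makes it pass. The function below (an order total with a discount threshold) is invented purely as an example, not taken from any real codebase:

```python
# Test-first sketch: the assertions at the bottom were "written first",
# then the minimal implementation was added to make them pass.
def order_total(prices, discount_over=100.0, discount=0.1):
    """Sum prices; apply a percentage discount above a threshold."""
    subtotal = sum(prices)
    if subtotal > discount_over:
        subtotal *= (1 - discount)
    return round(subtotal, 2)

# The tests that drove the implementation:
assert order_total([30, 40]) == 70          # below threshold: no discount
assert order_total([80, 40]) == 108.0       # 120 minus 10%
```

In a real TDD cycle these assertions would live in a test framework (JUnit, xUnit, pytest) and would fail first, forcing the implementation into existence.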
Posted 2 weeks ago
11.0 - 15.0 years
0 Lacs
hyderabad, telangana
On-site
As an AI Azure Architect, your primary responsibility will be to develop the technical vision for AI systems that cater to existing and future business requirements. This involves architecting end-to-end AI applications and ensuring seamless integration with legacy systems, enterprise data platforms, and microservices. Collaborating closely with business analysts and domain experts, you will translate business objectives into technical requirements and AI-driven solutions. Additionally, you will partner with product management to design agile project roadmaps aligning technical strategies with market needs, and coordinate with data engineering teams to ensure smooth data flows, quality, and governance across different data sources.

Your role will also involve leading the design of reference architectures, roadmaps, and best practices for AI applications, and evaluating emerging technologies and methodologies to recommend suitable innovations for integration into the organizational strategy. You will identify and define system components such as data ingestion pipelines, model training environments, CI/CD frameworks, and monitoring systems, leveraging containerization (Docker, Kubernetes) and cloud services to streamline the deployment and scaling of AI systems. Implementing robust versioning, rollback, and monitoring mechanisms to ensure system stability, reliability, and performance will be part of your duties.

Moreover, you will oversee the planning, execution, and delivery of AI and ML applications, ensuring they are completed within budget and timeline constraints. Managing project goals, allocating resources, and mitigating risks will fall under your project management responsibilities. You will be responsible for overseeing the complete lifecycle of AI application development, from conceptualization and design through development, testing, deployment, and post-production optimization, emphasizing security best practices during each development phase with a focus on data privacy, user security, and risk mitigation.

In addition to technical skills, the ideal candidate for this role should possess key behavioral attributes such as the ability to mentor junior developers, take ownership of project deliverables, and contribute towards risk mitigation, along with an understanding of business objectives and functions to support data needs.

Mandatory technical skills for this position include a strong background in working with agents using LangGraph, AutoGen, and CrewAI; proficiency in Python, along with knowledge of machine learning libraries like TensorFlow, PyTorch, and Keras; experience with cloud computing platforms (AWS, Azure, Google Cloud Platform), containerization tools (Docker), orchestration frameworks (Kubernetes), and DevOps tools (Jenkins, GitLab CI/CD); and proficiency in SQL and NoSQL databases, designing distributed systems, RESTful APIs, GraphQL integrations, and event-driven architectures.

Preferred technical skills include experience with monitoring and logging tools, cutting-edge libraries like Hugging Face Transformers, and large-scale deployment of ML projects; training and fine-tuning of Large Language Models (LLMs) is an added advantage.

Educational qualifications for this role include a Bachelor's or Master's degree in Computer Science, along with certifications in cloud technologies (AWS, Azure, GCP) and TOGAF certification. The ideal candidate should have 11 to 14 years of relevant work experience in this field.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Senior Software Engineer on Procore's Product & Technology Team, you will play a crucial role in leading complex projects, providing technical guidance, and mentoring other engineers. Your focus will be on high-level design and architecture to ensure alignment with the organization's strategic goals. Collaborating with Product Managers, Designers, and fellow engineers, you will develop innovative features leveraging cutting-edge BIM 3D technology to address challenges in the construction industry.

Your responsibilities will include:
- Developing product features using Procore's BIM technologies
- Setting standards for development teams and collaborating on initiatives with infrastructure and other software engineering teams
- Designing and building systems and features aligned with Procore's technical vision of a service-oriented architecture
- Contributing to code development for microservices, React front ends, and Rails apps
- Driving innovation to meet the needs of enterprise and international customers
- Collaborating with Engineering, Product, and UX teams to create user-centric solutions
- Mentoring fellow engineers in best practices and assisting in delivering high-quality software

The ideal candidate will have:
- A Bachelor's Degree in Computer Science or a related field, or equivalent work experience
- 5+ years of experience in programming fundamentals, Test Driven Development, and design principles (Golang, TypeScript)
- Proficiency in frontend technologies such as HTML, CSS, React, Vue.js, or Angular
- A strong understanding of RESTful API design, Golang, and databases like PostgreSQL
- Experience in Service-Oriented Architecture and modern web development practices
- A track record of designing solutions for technical challenges in large-scale projects
- Experience in building Continuous Integration and Continuous Delivery systems
- Familiarity with BIM, serverless frameworks, event streaming platforms, and automated testing frameworks

At Procore, we value our employees and offer a comprehensive range of benefits to support your professional growth and well-being. You will have access to paid time off, healthcare coverage, career development programs, and more. Join us in building the software that constructs the world and be a part of a culture that encourages innovation and ownership of your work. If you are passionate about leveraging technology to revolutionize the construction industry, apply now to join our team of Groundbreakers in Pune. Take the next step in your career and contribute to creating solutions that make a meaningful impact.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You are a talented and highly motivated .NET Core Developer with expertise in Apache Kafka, ready to join our development team. Your main responsibility will be to design, develop, and optimize real-time data streaming applications using .NET Core and Kafka. Your work will involve implementing Kafka producers and consumers for data ingestion, processing, and consumption to ensure high availability and fault tolerance. You will also be building event-driven architectures that leverage Kafka for efficient communication between microservices and systems.

Your key responsibilities include developing real-time data streaming solutions, integrating Kafka with .NET Core applications, optimizing performance, handling message serialization, ensuring data integrity and fault tolerance, collaborating with cross-functional teams, and following continuous improvement and best practices.

In this role, you will work with cutting-edge technologies to build scalable and fault-tolerant systems that process large volumes of data in real time. You will be instrumental in designing and implementing fault-tolerant and resilient messaging systems that can recover from failures with minimal downtime. Additionally, you will participate in Agile ceremonies such as daily stand-ups, sprint planning, and code reviews, and contribute to the design and architectural decisions regarding Kafka and .NET Core integration.

Your success in this position will involve staying up to date with the latest developments in .NET Core and Kafka, incorporating best practices and new features into your development process. You will continuously strive to improve the performance, scalability, and maintainability of the systems you develop, promoting a culture of high-quality code by writing clean, modular, and maintainable code.
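Fault tolerance around message processing usually comes down to bounded retries with backoff plus a dead-letter fallback. The Python sketch below shows that shape under stated assumptions: the handler, message, and dead-letter convention are invented, delays are zero so the sketch runs instantly, and a real .NET Core consumer would wrap the same logic around Kafka offset commits.

```python
# Retry-with-backoff sketch for message processing. Handler and message
# are invented; this illustrates the recovery shape, not a real consumer.
import time

def process_with_retry(message, handler, retries=3, delay=0.0):
    """Try the handler up to `retries` times, then dead-letter the message."""
    for attempt in range(retries):
        try:
            return ("ok", handler(message))
        except Exception:            # in production, catch specific errors
            time.sleep(delay * (2 ** attempt))  # exponential backoff
    return ("dead-letter", message)

# A handler that fails twice, then succeeds (simulating a transient fault).
calls = {"n": 0}
def flaky(msg):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return msg.upper()

result = process_with_retry("payment-event", flaky)
```

Only after the retry budget is exhausted does the message leave the happy path, so transient broker or network hiccups never reach the dead-letter store.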
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a strong understanding of the tech stack, including GCP services such as BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Experience with data processing tools like Apache Beam (batch/stream), Apache Kafka, and Cloud Dataprep is crucial, and proficiency in programming languages like Python, Java/Scala, and SQL is required. Your expertise should extend to orchestration tools like Apache Airflow (Cloud Composer) and Terraform, and to security aspects including IAM, Cloud Identity, and Cloud Security Command Center. Knowledge of containerization using Docker and Kubernetes (GKE) is essential, and familiarity with machine learning platforms such as Google AI Platform, TensorFlow, and AutoML is expected. Candidates with certifications like Google Cloud Data Engineer and Cloud Architect are preferred.

You should have a proven track record of designing scalable AI/ML systems in production, focusing on high-performance and cost-effective solutions, with strong experience across cloud platforms (Google Cloud, AWS, Azure) and cloud-native AI/ML services like Vertex AI and SageMaker. Your role will involve implementing MLOps practices, including model deployment, monitoring, retraining, and version control. Leadership skills are key to guiding teams, mentoring engineers, and collaborating effectively with cross-functional teams to achieve business objectives. A deep understanding of frameworks like TensorFlow, PyTorch, and Scikit-learn for designing, training, and deploying models is necessary, as is experience with data engineering principles, scalable pipelines, and distributed systems (e.g., Apache Kafka, Spark, Kubernetes).

Nice-to-have qualities include strong leadership and mentorship capabilities to guide teams towards best practices and high-quality deliverables, excellent problem-solving skills focused on designing efficient, high-performance systems, effective project management abilities to handle multiple initiatives and ensure timely delivery, and a collaborative, team-oriented approach that fosters a positive and productive work environment.
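Stream processing with Apache Beam or Spark commonly aggregates events into fixed (tumbling) windows. A minimal windowing sketch in plain Python, with invented event tuples of `(timestamp_seconds, value)`; Beam would express the same idea with `FixedWindows` plus a combiner:

```python
# Tumbling-window aggregation sketch: group timestamped events into fixed
# 60-second windows and sum their values. Event data is invented.
from collections import defaultdict

def tumbling_window_sums(events, window_seconds=60):
    """Return {window_start: sum_of_values} for non-overlapping windows."""
    sums = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        sums[window_start] += value
    return dict(sums)

# Events at t=0s, 30s fall in window [0, 60); t=61s in [60, 120); etc.
events = [(0, 1.0), (30, 2.0), (61, 5.0), (130, 7.0)]
result = tumbling_window_sums(events)
```

Real streaming engines add the hard parts this sketch omits: late data, watermarks, and triggering partial results before a window closes.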
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
Genpact is a global professional services and solutions firm committed to delivering outcomes that shape the future. With a workforce of over 125,000 individuals across more than 30 countries, we are fueled by our inherent curiosity, entrepreneurial agility, and dedication to creating lasting value for our clients. Our purpose is to pursue a world that works better for people, serving and transforming leading enterprises, including the Fortune Global 500, through our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are looking for a Lead Consultant - Camunda 8 Administrator to join our team! As a Camunda 8 Administrator, your responsibilities will include installing, configuring, and administering the Camunda BPM platform. You will be tasked with ensuring the smooth operation of Camunda applications, monitoring system performance, and providing technical support to users. Collaboration with cross-functional teams to understand business requirements and implement Camunda-based solutions that align with organizational needs will be a crucial aspect of this role.

**Responsibilities:**
- Install, configure, and maintain the Camunda BPM platform using Java and the Spring Framework.
- Collaborate with business analysts and stakeholders to analyze business requirements related to process automation and workflow management.
- Design and implement process models, workflows, decision tables, and forms using Camunda Modeler or similar tools.
- Develop custom plugins, extensions, and integrations with external systems using Java and REST APIs to enhance the functionality of the Camunda platform.
- Monitor the performance and health of the Camunda infrastructure, including application servers, databases, and middleware technologies.
- Troubleshoot and resolve technical issues related to Camunda applications, workflows, and integrations.
- Perform backups and disaster recovery planning, and ensure data integrity and security within the Camunda environment.
- Collaborate with development teams to deploy and test Camunda-based solutions in various environments.
- Create and maintain technical documentation, including installation guides and troubleshooting procedures.
- Provide technical support and training to end users on Camunda applications and processes.
- Stay updated with the latest releases, features, and best practices of the Camunda BPM platform.

**Qualifications:**

**Minimum Qualifications:**
- Proven experience as a Camunda Administrator working with Camunda BPM platform version 8.
- Strong knowledge of Camunda BPM platform architecture, installation, configuration, and administration.
- Proficiency in the Java programming language and experience with the Spring Framework.
- Familiarity with frontend technologies such as HTML, CSS, JavaScript, AngularJS, and React.
- Experience in developing custom plugins, extensions, and integrations using Java and REST APIs.
- Understanding of databases and middleware technologies used in conjunction with Camunda.
- Strong troubleshooting and problem-solving skills.
- Knowledge of backup and disaster recovery procedures for Camunda infrastructure.
- Excellent communication and collaboration skills.
- Ability to work independently and manage multiple tasks effectively.

**Preferred Qualifications/Skills:**
- Passion for technology and problem-solving.
- Excellent client-facing skills and technical curiosity.
- Experience with Agile development methodologies.
- Camunda certification or training is desirable.

If you are looking to be part of a dynamic team that values innovation and excellence, we invite you to apply for the Lead Consultant - Camunda 8 Administrator role with Genpact.
*Job Title:* Lead Consultant
*Primary Location:* India-Hyderabad
*Schedule:* Full-time
*Education Level:* Bachelor's / Graduation / Equivalent
*Job Posting Date:* Aug 27, 2024, 5:24:51 AM
*Job Category:* Full Time
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
You are a skilled QA / Data Engineer with 3-5 years of experience, joining a team focused on ensuring the quality and reliability of data-driven applications. Your expertise lies in manual testing and SQL, with additional knowledge of automation and performance testing being highly valuable. Your responsibilities include performing thorough testing and validation to guarantee the integrity of the applications.

Must-have skills:
- Extensive experience in manual testing within data-centric environments
- Strong SQL skills for data validation and querying
- Familiarity with data engineering concepts such as ETL processes, data pipelines, and data warehousing
- Experience with Geo-Spatial data
- A solid understanding of QA methodologies and best practices for software and data testing
- Excellent communication skills

Good-to-have skills:
- Experience with automation testing tools and frameworks such as Selenium and JUnit for data pipelines
- Knowledge of performance testing tools such as JMeter and LoadRunner for evaluating data systems
- Familiarity with data engineering tools and platforms such as Apache Kafka, Apache Spark, and Hadoop
- Understanding of cloud-based data solutions (AWS, Azure, Google Cloud) and their testing methodologies
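To illustrate the SQL-driven data validation this role centers on, here is a minimal, hypothetical sketch (not part of the posting): it reconciles a target table against its source after an ETL load using Python's built-in sqlite3 module. The table names, columns, and checks are invented for the example.

```python
import sqlite3

def validate_load(conn, source_table, target_table, key_column):
    """Reconcile a target table against its source after an ETL load.

    Checks row-count parity, duplicate keys in the target, and source
    keys that never arrived in the target.
    """
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    duplicate_keys = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_column} FROM {target_table} "
        f"GROUP BY {key_column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    missing_rows = cur.execute(
        f"SELECT COUNT(*) FROM {source_table} s WHERE NOT EXISTS "
        f"(SELECT 1 FROM {target_table} t WHERE t.{key_column} = s.{key_column})"
    ).fetchone()[0]
    return {
        "counts_match": src_count == tgt_count,
        "duplicate_keys": duplicate_keys,
        "missing_rows": missing_rows,
    }

# Demo on an in-memory database with a deliberately incomplete load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")
result = validate_load(conn, "src_orders", "tgt_orders", "order_id")
print(result)  # {'counts_match': False, 'duplicate_keys': 0, 'missing_rows': 1}
```

In practice the same checks would run against the production warehouse via parameterized queries or a framework such as JUnit-driven pipeline tests; the sketch only shows the reconciliation logic itself.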
Posted 3 weeks ago
7.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Lead Data Engineer, you will be responsible for leveraging your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud (AWS), and Data Governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

You will need experience with configuration management and version control tools such as Git, along with familiarity working within a CI/CD framework. Experience in building frameworks will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial for this role.

Your hands-on experience with AWS Cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position.

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team.

Regards,
Omkar
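As a hedged illustration of the extract-transform-load pattern this role revolves around (in practice this would be a Spark/PySpark or Glue job, not plain Python), here is a self-contained sketch with an invented CSV feed; all names and the feed itself are assumptions for the example.

```python
import csv
import io

# Hypothetical raw feed; in production this would arrive from S3, Kafka, etc.
RAW_FEED = """order_id,amount,country
1,10.50,IN
2,,US
3,99.00,IN
"""

def extract(feed):
    """Read the raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(feed)))

def transform(rows):
    """Drop rows with missing amounts and cast types -- the cleansing
    a Spark job would express as filters and column casts."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # a real pipeline would route these to a dead-letter store
        clean.append({"order_id": int(row["order_id"]),
                      "amount": float(row["amount"]),
                      "country": row["country"]})
    return clean

def load(rows):
    """Aggregate into a per-country summary, standing in for a mart table."""
    mart = {}
    for row in rows:
        mart[row["country"]] = mart.get(row["country"], 0.0) + row["amount"]
    return mart

mart = load(transform(extract(RAW_FEED)))
print(mart)  # {'IN': 109.5} -- the US row is dropped for its missing amount
```

The same three stages map directly onto a Spark job (read, filter/cast, write to the warehouse) or an Airflow DAG of tasks; only the scale changes, not the shape.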
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
raipur
On-site
You are a highly skilled Java Developer with 4 to 6 years of professional experience in building robust web applications. You possess a deep understanding of Java technologies and have a proven track record in software development and system design.

Your key responsibilities will include designing and implementing microservices using Spring, Spring Boot, and Spring Cloud. You will be developing and maintaining Java/J2EE applications using JDK 8 or higher, managing both relational and NoSQL databases, and designing and deploying cloud-based architectures on platforms like AWS, Azure, or Google Cloud.

As part of the development lifecycle, you will write comprehensive unit tests using JUnit, utilize Jenkins for automated build and deployment processes following CI/CD principles, optimize JVM performance, and apply performance enhancement techniques. Additionally, you will monitor system performance using tools like Splunk or Dynatrace, conduct log analysis, manage code repositories using version control systems such as Git, Subversion, or SourceTree, implement containerization with Docker, and orchestrate containers using Kubernetes. Working with message queuing systems like RabbitMQ or Apache Kafka and upholding web application security principles are also key responsibilities. You will be expected to solve complex problems in distributed systems, showcasing excellent analytical and troubleshooting skills.

As for qualifications, a Bachelor's degree in Computer Science, Engineering, or a related field is required. You should have a strong foundation in Java programming and experience with software design. Effective collaboration in a team environment and clear communication skills are essential.

Please note that moonlighting or holding additional employment outside of this position is strictly prohibited. Engaging in any form of secondary employment without explicit prior approval will lead to immediate termination of the contract.
This is a temporary job with a contract length of 6 months. The location type is in-person, with a day shift and US shift schedule. The ability to commute/relocate to Raipur, Chhattisgarh is required before starting work. A Bachelor's degree is preferred, along with 5 years of total work experience, including 5 years of experience in Java and Spring Boot. Thank you for considering this opportunity.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai, Pune, Chennai
Work from Office
Core Skills: Java, J2EE, JUnit, Spring Boot, Microservices

ROLE OVERVIEW
We are seeking a proficient and result-oriented Java/Spring Boot Developer with hands-on experience in building scalable and high-performance systems.

Job Description
- Expert in Core Java (Java 11 or 17), J2EE, Spring Boot, and JUnit
- Knowledge of Node.js, Maven, and GitHub/Bitbucket
- Experience with RESTful services, RabbitMQ, ActiveMQ, JSON, GraphQL, Apache Kafka, and PostgreSQL is a plus
- Excellent problem-solving/troubleshooting skills on Java/J2EE technologies

Roles & Responsibilities
- Participate in system design discussions, planning, and performance tuning
- Write clean, testable, and scalable code following industry best practices
- Resolve technical issues through debugging, research, and investigation
- Work in an Agile/Scrum development lifecycle and participate in daily stand-ups and sprint planning
- Complete assigned tasks within the given timelines
- Continuously learn new technologies

Education: UG - BE/B.Tech in any specialization; PG - MCA/M.Tech
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
The Applications Development Intermediate Programmer Analyst position is an intermediate level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

As an ideal candidate, you should have the following technical skills:
- Full stack developer with expertise in .NET, Angular, and Oracle Database
- Exposure to JIRA and Agile methodologies
- Familiarity with DevOps practices

Your responsibilities will include:
- Using your knowledge of applications development procedures and concepts to identify necessary system enhancements
- Consulting with users, clients, and technology groups to recommend programming solutions
- Analyzing applications for vulnerabilities, conducting testing, and debugging
- Serving as an advisor to new or lower-level analysts
- Identifying problems, analyzing information, and recommending solutions
- Developing and maintaining full-stack applications using Angular and Java
- Implementing responsive and user-friendly interfaces
- Collaborating with UX/UI designers and participating in code reviews

Qualifications required for this role are:
- 5-8 years of relevant experience in the Financial Service industry
- Intermediate level experience in Applications Development
- Strong communication skills and problem-solving abilities
- Proficiency in Angular, TypeScript, Java, Spring Framework, MongoDB, and Oracle
- Experience with microservices architecture, RESTful API design, Apache Kafka, and frontend performance optimization

Preferred qualifications include:
- Experience with Angular migration between major versions
- Knowledge of state management solutions and containerization technologies
- Experience with CI/CD pipelines, DevOps practices, and contributions to open-source projects

Education:
- Bachelor's degree/University degree or equivalent experience

Please note that this job description provides a high-level overview of the responsibilities. Other job-related duties may be assigned as required. Citi is an equal opportunity and affirmative action employer.
Posted 3 weeks ago
6.0 - 10.0 years
0 - 0 Lacs
pune, maharashtra
On-site
As a Senior Java Software Engineer based in Pune, you will be playing a crucial role in bridging the gap between software development and operations. Your primary responsibility will be to ensure the scalability, reliability, and performance of our systems. By joining our team, you will actively contribute to optimizing system health, automating processes, and supporting the growth and stability of our platform.

Your key responsibilities will include designing, developing, and maintaining high-quality Java applications using best practices and design patterns. Collaborating with cross-functional teams to define, design, and implement new features will also be a critical part of your role. You will be instrumental in implementing and managing CI/CD pipelines to facilitate smooth and efficient deployment processes. Monitoring and enhancing system reliability, performance, and scalability will be essential tasks to ensure minimal downtime and optimal performance. Additionally, troubleshooting and resolving production issues will be part of your routine, along with participating in code reviews to offer constructive feedback to your peers.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Engineering, or a related field, along with a minimum of 6 years of experience in Java development, demonstrating a strong understanding of design patterns. You should also have 6+ years of experience in a Java development, DevOps, or similar role.

Your required skills should include a strong background in Java development and related frameworks such as Spring and Hibernate. A deep understanding of high-volume, high-transaction, and high-availability systems and architectures is essential. Proficiency in SOLID principles and design patterns, along with hands-on experience with Docker and Kubernetes in a production environment, ideally having scaled 2-3 projects, will be beneficial.
Familiarity with DevOps and DevSecOps best practices, including CI/CD pipelines, infrastructure as code, and automated testing, will also be advantageous.

The following will be highly valuable:
- Expertise in defining and implementing scalable, secure, and high-performance software architectures
- Proficiency in version control systems like Git and build tools such as Maven or Gradle
- Experience with cloud platforms like AWS, Azure, or Google Cloud
- Knowledge of microservices architecture and event-driven design
- Familiarity with container orchestration using Kubernetes
- Proficiency in Apache Kafka and RESTful API development
- Experience with automation tools like Jenkins, Terraform, or Ansible
- Understanding of Agile methodologies and best practices

You should also bring a proactive and solution-oriented mindset, great communication skills, strong problem-solving abilities, and the capability to work effectively in a fast-paced environment.

If you meet these qualifications and are excited about this opportunity, please mail your resume to pradnya.dhiware@talentcorner.in.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Java Developer at GlobalLogic, you will work in a fast-paced, highly collaborative, and agile development environment that supports short iterative cycles. Flexibility in understanding changing business priorities is essential for this position. You should have a good understanding of Agile development methodologies such as test-driven development, continuous integration, code coverage, and code analysis. Demonstrated ability to develop and deliver complex software projects is crucial. Your interpersonal communication skills should be effective in a highly collaborative team environment. A self-directed work style is expected, where you will provide technical leadership and mentorship to less experienced professional staff.

Minimum Qualifications and Experience:
- 3+ years of commercial experience in software development with Java and Cloud/Kafka
- Knowledge of Cloud Computing and Virtualization technologies such as OpenStack and OPNFV, and of industry standards like ETSI NFV in the Telecom market, is a plus
- Experience in developing applications with a microservices architecture is beneficial

Must-Have Skills:
- Apache Kafka and other streaming message queue products
- Java 17, Spring Boot 2
- RESTful, SOAP, JMS
- JPA/Hibernate, MyBatis, SQL
- JBoss 7.x
- Docker, Kubernetes, Helm
- Maven, Jenkins, Git, Gerrit, Bash, PostgreSQL, JUnit, OpenStack, RESTEasy

Would Be a Plus:
- Spring Boot
- Linux x86
- Microservices

Your major responsibilities and tasks will include reviewing new features' requirements, implementing new solution features, self-testing developed features, covering code with unit tests, writing automation test cases for functional tests, and providing demo sessions for developed functionality.

At GlobalLogic, we prioritize a culture of caring, where you'll experience an inclusive environment of acceptance and belonging. Continuous learning and development opportunities are provided to help you grow personally and professionally.
You'll have the chance to work on interesting and meaningful projects that make a real impact. We believe in balance and flexibility, offering various work arrangements to help you achieve a perfect balance between work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you're placing your trust in a safe, reliable, and ethical global company that values truthfulness, candor, and integrity in everything we do.

About GlobalLogic: GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution, creating innovative and widely used digital products and experiences. Join us in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
Senior .NET Engineer - Chennai, Mumbai

Role: Senior DotNet Engineer / Senior Software Engineer
Experience: 2-6 years
Qualification: Bachelor's or Master's degree in Computer Engineering or Science from exceptional institutions
Job Type: Permanent, Full Time
Industry: IT SaaS, Retail

PRISTINE IS LOOKING FOR A SENIOR .NET ENGINEER

Does creating transformational business solutions energize you? Do you want to immerse yourself in a small, aggressive team that specializes in Consumer Behavior and AI? Are you an original thinker, an articulate problem solver, somebody who is driven by the team's success?

We are looking for exceptional Senior DotNet Engineers who have a passion for developing end-to-end solutions that position our customers to win. You will work with top-notch Solution Engineers, Retail Experts, AI Scientists, and Enterprise Architects.

Headquartered in Boston, Pristine is a pioneer in Consumer Behavior and AI-driven solutions for the Retail and Consumer Product industries. Retailers and Consumer Product Manufacturers are experiencing unprecedented business transformation. We want our customers to lead this transformation. We are multiplying our capabilities so our customers can take advantage of these massive opportunities. Every day our SaaS platform receives and learns from 40 million+ new transactions, representing 12,000+ stores and websites and 35 million+ customers.

Send your resume to career@pristineinfotech.com. Please describe in a few sentences what makes you supremely qualified for this role.

Key Attributes and Abilities:
- Empathy towards End Users
- Innate Quality and Process Orientation
- Belief in Standards and Best Practices
- Passion for the Team's Success and Your Own Advancement
- Sharing Knowledge, Expressing Your Own Viewpoints, Encouraging Colleagues to Express Theirs, and Striving towards Objective-driven Consensus
- Strong Requirements Analysis and Creative Problem Solving
- Prioritizing Competing Objectives and Multi-tasking
- Writing Clear, Concise, and Highly Readable Problem and Solution Statements
- Articulating Complex Subjects via Engaging Presentations

Critical Experiences:
- Complete Solution Life Cycle Participation (Ideation, Requirements Elicitation, Architecture, Data Structure and Algorithm Design, Development, Testing, Deployment, Implementation, Training, Change Management, Monitoring and Enhancement)
- Designing and Implementing Effective, Simple and Flexible Technical Solutions to High-dimension Problems, using Service Oriented Architecture, REST and Microservices
- Delivering On-time, High Quality Solutions in High-pressure Environments

Key Knowledge Areas:
- Design Patterns
- Design Thinking
- Big Data and Streaming Technologies

Technical Expertise and Familiarity:
- Microsoft: .NET Core, ASP.NET, C#, NUnit
- Frameworks/Technologies: MVC, Web Services, REST API, JavaScript, jQuery, CSS, ReactJS/AngularJS, Testing Frameworks
- App/Web Servers: Tomcat/IIS
- Databases & Streaming: Oracle, MySQL, NoSQL databases such as MongoDB, Apache Kafka, In-memory Processing
- Big Data Technology: Azure App Services, Azure Active Directory, Key Vault, Azure DevOps, Application Insights, Azure Storage, Redis Cache
- O/S: Linux, Unix, Windows

Experience in Big Data technologies and advanced coursework in AI and Machine Learning would be a plus.

Responsibilities:
- Review business requirements and understand business solution directions; analyze and dissect system requirements and technical specifications.
- Develop software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.
- Propose and implement best-in-class solutions for large business problems.
- Develop effective approaches for optimizing response times and computing resources, and automate work processes.
- Support continuous improvement by investigating alternative approaches and technologies and presenting them for architectural review.
- Implement enhancements and fix bugs as per client priorities. Take ownership of issues and meet SLAs.
- Support and develop software engineers by providing advice, coaching, and educational opportunities. Provide guidance and standards to the team. Conduct design and code reviews.

REMUNERATIONS AND BENEFITS:
- Start-up environment where you will help shape the industry's future
- Work with amazingly talented and driven Retail Business Practitioners, Consumer Behavior Analysts, AI Scientists, and Engineers
- Annual 2 weeks study time
- Competitive salaries and generous equity
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Java Developer at Virtusa, you will leverage your 6+ years of hands-on experience to design and implement web applications using the Java Platform. Your expertise in Java 8, Spring Boot, Spring Data JPA, Spring MVC, Spring Security, and Spring Cloud Config will be essential in developing service-oriented or microservice architectures. You will also be responsible for implementing RESTful API principles and working with messaging queues using Apache Kafka.

In this role, you will utilize your solid experience with Oracle Database, testing frameworks like JUnit/Mockito, and continuous integration tools such as Git, GitLab, Docker, and App Engine. Your familiarity with Agile methodologies (Scrum) and the software development life cycle will be crucial in delivering high-quality solutions. Additionally, you will work with tools like Swagger, Postman, Insomnia, Splunk, ServiceNow, and Jira to streamline development processes.

At Virtusa, we prioritize teamwork, quality of life, and professional development. As part of our global team of 30,000 professionals, you will have the opportunity to work on exciting projects and leverage state-of-the-art technologies. We foster a collaborative environment that values new ideas and excellence, providing you with a dynamic platform to grow both personally and professionally.

Join Virtusa and be part of a team that values your expertise and encourages innovation, enabling you to reach your full potential in a supportive and stimulating work environment.
Posted 3 weeks ago
5.0 - 8.0 years
3 - 6 Lacs
Navi Mumbai
Work from Office
Required Details:
1. Total IT Exp:
2. Exp in Kafka:
3. Exp in Kafka Connect, Schema Registry, Kafka Streams:
4. Exp in Kafka cluster:
5. Current CTC:
6. Expected CTC:
7. Notice Period/LWD:
8. Current Location:
9. Willing to relocate to Navi Mumbai:
10. Willing to work on alternate Saturdays:

Job Title: Kafka Administrator (5+ Years Experience)
Location: CBD Belapur, Navi Mumbai
Job Type: Full-time
Experience Required: 5+ Years
Educational Qualification: B.E / B.Tech / BCA / B.Sc-IT / MCA / M.Sc-IT / M.Tech

Job Summary:
We are looking for a skilled and experienced Kafka Administrator with a minimum of 5 years of experience in managing Apache Kafka environments. The ideal candidate will be responsible for the deployment, configuration, monitoring, and maintenance of Kafka clusters to ensure system scalability, reliability, and performance.

Key Responsibilities:
- Install, configure, and maintain Apache Kafka clusters in production and development environments.
- Monitor Kafka systems using appropriate tools and proactively respond to issues.
- Set up Kafka topics, manage partitions, and define data retention policies.
- Perform upgrades and patch management for Kafka and its components.
- Collaborate with application teams to ensure seamless Kafka integration.
- Troubleshoot and resolve Kafka-related production issues.
- Develop and maintain scripts for automation of routine tasks.
- Ensure security, compliance, and data governance for Kafka infrastructure.
- Maintain documentation and operational runbooks.

Required Skills:
- Strong experience with Apache Kafka and its ecosystem (Kafka Connect, Schema Registry, Kafka Streams).
- Proficiency in Kafka cluster monitoring and performance tuning.
- Experience with tools such as Prometheus, Grafana, and the ELK stack.
- Solid knowledge of Linux/Unix system administration.
- Hands-on experience with scripting languages like Bash and Python.
- Familiarity with DevOps tools (Ansible, Jenkins, Git).
- Experience with cloud-based Kafka deployments (e.g., Confluent Cloud, AWS MSK) is a plus.
Qualification Criteria:
Candidates must hold at least one of the following degrees:
- B.E (Bachelor of Engineering)
- B.Tech (Bachelor of Technology)
- BCA (Bachelor of Computer Applications)
- B.Sc-IT (Bachelor of Science in Information Technology)
- MCA (Master of Computer Applications)
- M.Sc-IT (Master of Science in Information Technology)
- M.Tech (Master of Technology)

Preferred Certifications (Not Mandatory):
- Confluent Certified Administrator for Apache Kafka (CCAAK)
- Linux and Cloud Administration Certifications (RHCSA, AWS, Azure)
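One of the responsibilities this posting lists is scripting the automation of routine Kafka tasks. As a purely hypothetical sketch of such a script (not from the posting), the function below flags under-replicated partitions by parsing `kafka-topics.sh --describe`-style output; the line format mirrors the standard Kafka CLI but should be verified against your broker version, and the sample input is synthetic.

```python
def under_replicated(describe_output):
    """Return (topic, partition) pairs whose in-sync replica set (ISR)
    is smaller than the assigned replica set.

    The input mimics `kafka-topics.sh --describe` output; treat the
    exact line format as an assumption to check against your version.
    """
    flagged = []
    for line in describe_output.splitlines():
        parts = line.split()
        # Per-partition lines alternate "Key:" tokens with their values.
        fields = {key.rstrip(":"): value
                  for key, value in zip(parts[::2], parts[1::2])}
        if "Partition" not in fields:
            continue  # skip blank lines and the per-topic summary line
        replicas = fields["Replicas"].split(",")
        isr = fields["Isr"].split(",")
        if len(isr) < len(replicas):
            flagged.append((fields["Topic"], int(fields["Partition"])))
    return flagged

# Synthetic sample: partition 1 has only one of its three replicas in sync.
sample = (
    "Topic: orders\tPartition: 0\tLeader: 1\tReplicas: 1,2,3\tIsr: 1,2,3\n"
    "Topic: orders\tPartition: 1\tLeader: 2\tReplicas: 2,3,1\tIsr: 2\n"
)
print(under_replicated(sample))  # [('orders', 1)]
```

A real runbook script would feed this from the CLI (or use the broker's JMX `UnderReplicatedPartitions` metric directly) and push the result to an alerting tool such as Prometheus.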
Posted 3 weeks ago
8.0 - 13.0 years
8 - 18 Lacs
Gurugram
Work from Office
We are looking for an immediate resource for a Senior Java Developer role at our Gurgaon location.

Key Skills: Java, Spring Boot, Apache Kafka, Apache Camel, OpenShift
Posted 3 weeks ago