6.0 - 9.0 years
6 - 10 Lacs
Pune
Work from Office
Primary skills (must have; hands-on project experience, not just classroom training):
- Strong understanding and hands-on experience with Core Java and Maven
- Strong understanding and hands-on experience with Selenium WebDriver (v3.x) and Selenium Grid
- Hands-on experience with Cucumber, BDD methodology, and Gherkin syntax
- Hands-on experience with the REST Assured/JUnit framework for API testing
- Hands-on experience in a Java IDE (e.g. Eclipse)
- Hands-on experience with Git, JIRA, and Jenkins
- Good understanding of REST API methods (GET, PUT, POST, DELETE) and how they work
- Good understanding of the HTTP and JSON protocols and syntax
- Good understanding of message schemas, RAML, and the message request-response mechanism
- Working experience in Agile/SAFe methodology with in-sprint automation
- Strong experience testing against Chrome, Firefox, Safari, and Edge browsers
- Strong written and verbal communication skills

Secondary skills:
- Ability to understand retail banking functions/requirements
- Ability to test responsive UI aspects
- Experience with or understanding of BrowserStack, Sauce Labs, or any execution grid service
- Understanding of stubs/service virtualization
- Understanding of API testing tools such as Postman, REST Client, Anypoint, etc.

Role: Software Engineer / QA Automation Engineer (REST Assured), seven to ten years. Contact: anusha.r@ideslabs.com
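The must-have list centers on API testing with REST Assured (a Java library). As a language-neutral sketch of the request-then-assert pattern it embodies, here is a Python example against a stubbed response; the endpoint, fields, and values are invented for illustration, not taken from the posting.

```python
import json

# Hypothetical canned response standing in for a live GET /accounts/{id} call.
def fake_get_account(account_id):
    body = {"id": account_id, "type": "savings", "balance": 1250.75}
    return 200, json.dumps(body)

status, payload = fake_get_account(42)
data = json.loads(payload)

# REST-Assured-style given/when/then checks, written as plain assertions
assert status == 200
assert data["id"] == 42
assert data["type"] == "savings"
print("GET /accounts/42 validated, balance:", data["balance"])
```

In REST Assured itself the same checks would be chained fluently (`given().when().get(...).then().statusCode(200)`); the point is the same: issue the request, parse the JSON, assert on status and fields.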
Posted 1 month ago
7.0 - 9.0 years
9 - 11 Lacs
Telangana
Work from Office
Experience: 5-7 years of experience supporting Jira, including workflow configuration, schema configuration, and JQL
Skills: Understanding of Jira APIs, webhooks, and web technologies (HTML, CSS, JavaScript, PHP, Java)
Certifications: SAFe certifications or strong knowledge preferred

Desired skills:
- Technical proficiency: Knowledge of Confluence APIs, DNS, databases, networking, and Active Directory
- Integration: Develop and maintain integrations between Jira and other systems
- Problem-solving: Ability to create logic flow charts and process diagrams based on client needs

Key responsibilities:
- Configuration and administration: Provide configuration and administrative support for the Atlassian tool suite, including Jira, Confluence, and key third-party add-ons
- Workflow management: Design and manage workflows, fields, screens, reports, and dashboards within Jira
- User support: Provide application support for Jira and Confluence to IT and the rest of the business
- Customization: Customize Jira projects with various schemas to match business needs
- Training: Train and mentor team members on the effective use of Jira
- Documentation: Generate documentation on workflows and processes implemented in Jira to support runbooks
- Troubleshooting: Address user concerns and queries; troubleshoot technical issues
- Performance monitoring: Monitor system performance and implement improvements where necessary
Posted 1 month ago
3.0 - 6.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Functional Area: IT Software - Software Development
Role Category: Programming & Design
Role: Software Developer
Experience: 4+ years

We are seeking a .NET developer responsible for building web-based applications using ASP.NET, WCF/WPF, jQuery, and JavaScript. Your primary responsibility will be to design and develop these layers of our applications and to coordinate with the rest of the team working on other layers of the infrastructure. Good communication skills and a commitment to collaborative problem solving, sophisticated design, and a quality product are essential.

Relevant industry experience: 4+ years of relevant software development experience in .NET and the required skill set.

Required skill set:
- Strong knowledge of ASP.NET, jQuery, SQL, MySQL, HTML, JSON, XML, and WCF, plus the JMeter and Postman tools
- Proficient in C#, Windows services, APIs, and AWS, with good knowledge of their ecosystems
- Strong understanding of object-oriented programming
- Skill for writing reusable libraries
- Familiarity with various design and architectural patterns
- Knowledge of concurrency patterns
- Familiarity with databases such as SQL Server, MySQL, Oracle, etc.
- Experience with popular web application frameworks
- Familiarity with Windows Presentation Foundation
- Knack for writing clean, readable, and easily maintainable code
- Understanding of fundamental design principles for building a scalable application
- Experience creating database schemas that represent and support business processes
- Basic understanding of the Common Language Runtime (CLR), its limitations, weaknesses, and workarounds
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools

Roles and responsibilities:
- Translate application storyboards and use cases into functional applications
- Design, build, and maintain efficient, reusable, and reliable code
- Integrate data storage solutions
- Ensure the best possible performance and quality
Posted 1 month ago
6.0 - 9.0 years
10 - 14 Lacs
Telangana
Work from Office
Experience: 5-7 years of experience supporting Jira, including workflow configuration, schema configuration, and JQL
Skills: Understanding of Jira APIs, webhooks, and web technologies (HTML, CSS, JavaScript, PHP, Java)
Certifications: SAFe certifications or strong knowledge preferred

Desired skills:
- Technical proficiency: Knowledge of Confluence APIs, DNS, databases, networking, and Active Directory
- Integration: Develop and maintain integrations between Jira and other systems
- Problem-solving: Ability to create logic flow charts and process diagrams based on client needs

Key responsibilities:
- Configuration and administration: Provide configuration and administrative support for the Atlassian tool suite, including Jira, Confluence, and key third-party add-ons
- Workflow management: Design and manage workflows, fields, screens, reports, and dashboards within Jira
- User support: Provide application support for Jira and Confluence to IT and the rest of the business
- Customization: Customize Jira projects with various schemas to match business needs
- Training: Train and mentor team members on the effective use of Jira
- Documentation: Generate documentation on workflows and processes implemented in Jira to support runbooks
- Troubleshooting: Address user concerns and queries; troubleshoot technical issues
- Performance monitoring: Monitor system performance and implement improvements where necessary
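JQL queries like the ones this role supports are typically submitted to Jira's REST search endpoint as a URL-encoded `jql` parameter. A minimal sketch of building such a request URL (the host and project key are hypothetical; the `/rest/api/2/search` path and `jql` parameter are part of Jira's documented REST API):

```python
from urllib.parse import urlencode

# Hypothetical Jira host and project key, for illustration only
base_url = "https://jira.example.com/rest/api/2/search"
jql = 'project = OPS AND status = "In Progress" AND assignee = currentUser() ORDER BY updated DESC'

# Jira expects the JQL text URL-encoded in the "jql" query parameter
url = base_url + "?" + urlencode({"jql": jql, "maxResults": 50})
print(url)
```

An admin would send this with any HTTP client (authenticated with a token) and page through results via `startAt`/`maxResults`.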
Posted 1 month ago
4.0 - 8.0 years
7 - 10 Lacs
Mumbai, Chennai, Bengaluru
Work from Office
Notice Period:

Full-stack developers with 4+ years of experience developing internet-scale solutions, primarily using Java, Spring Boot, and NoSQL databases.

Must have demonstrated proficiency and experience in the following tools and technologies:
- Java 8 (lambdas, streams, CompletableFuture, Optional, generics)
- Spring Boot (WebFlux, Reactor 3), Spring Data, REST
- REST APIs using Spring Boot 2.0 (reactive); skilled in OpenAPI (Swagger) specification
- Designing database schemas, index design, and optimizations for query tuning
- Good knowledge of creating applications using React.js
- Working knowledge of cloud technologies (e.g. Docker, Kubernetes, Jaeger, Prometheus)
- Pride in writing good clean code; performing peer code reviews and architecture reviews
- A bachelor's degree in Engineering or a related field

Job location: Hyderabad/Bangalore/Mumbai/Pune/Chennai/Noida
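The requirements above include designing database schemas, index design, and query tuning. As a toy illustration of why index design matters (SQLite stands in for the actual database, and the orders table is invented for the example), a composite index lets the planner search the index instead of scanning the whole table:

```python
import sqlite3

# Invented schema for illustration
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, status TEXT)"
)
# Composite index matching the query's WHERE clause, left-to-right
conn.execute(
    "CREATE INDEX idx_orders_customer_status ON orders (customer_id, status)"
)

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM orders WHERE customer_id = ? AND status = ?",
    (7, "OPEN"),
).fetchall()
# The plan's detail column reports an index search rather than a full scan
print(plan)
```

The same discipline (inspecting the plan, matching index column order to predicates) applies to PostgreSQL's `EXPLAIN` or MySQL's `EXPLAIN` in a real deployment.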
Posted 1 month ago
6.0 - 9.0 years
8 - 11 Lacs
Gurugram
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role:
- Design, develop, test, and maintain full-stack applications using Python and C#.
- Build and optimize back-end services, APIs, and integrations.
- Design and manage relational database schemas and queries, specifically in SQL Server.
- Design and deploy applications in Azure, leveraging Azure services for scalability and reliability.
- Implement Azure DevOps for CI/CD pipelines and source control.
- Integrate Azure AI services into applications to enhance functionality (e.g., Azure OpenAI, Cognitive Services).

Your Profile:
- 8-12 years of experience in full-stack application development.
- Collaborate with cross-functional teams, including product managers, designers, and QA engineers.
- Translate business requirements into technical specifications and solutions.
- Follow best practices in code design, testing, and documentation.
- Ensure the security, scalability, and performance of the applications.

What you will love about working here: We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society.
It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 1 month ago
6.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Project description:
We've been engaged by a large European bank to provide resources to their Markets Program development team, working on a wide range of projects such as CTB changes, risk and regulatory projects, version upgrades, etc. We require an experienced Java developer to work within the existing team.

Responsibilities:
- Lead the design, development, and implementation of Murex Connectivity 2.0 solutions.
- Act as the primary point of contact for all Murex Connectivity 2.0-related queries, providing guidance and support to the team.
- Collaborate with cross-functional teams to ensure the successful delivery of Murex Connectivity 2.0 solutions, including integration with other systems.
- Provide technical leadership and mentorship to the team, ensuring adherence to best practices and standards in Murex Connectivity 2.0 development.
- Stay updated with the latest advancements in Murex Connectivity 2.0 and related technologies, integrating innovative approaches for sustained competitive advantage.

Skills (must have):
- 4+ years of experience in Murex Connectivity 2.0.
- In-depth knowledge of Murex C2 modules and their integration within a financial environment.
- Experience with Murex Back Office Workflows.
- Understanding of financial market instruments and trading workflows.
- Strong understanding of Murex Connectivity 2.0 architecture and related technologies.
- Experience in designing and implementing Murex Connectivity 2.0 solutions.
- Experience in leading and mentoring teams in Murex Connectivity 2.0 development.
- Solid grasp of integration with other systems and related technologies.

Nice to have:
- Docker, CI/CD, Jenkins, and TeamCity
- JSON/XML schema understanding

Other:
Languages: English: B2 Upper Intermediate
Seniority: Senior
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project description:
Development and maintenance of enterprise-level applications for a leading food corporation.

Responsibilities:
- Power Platform pipelines for Copilot Studio agents (this also benefits other Power Platform environments):
  - Develop automated deployment pipelines for Copilot Studio agents (https://learn.microsoft.com/en-us/power-platform/alm/pipelines)
  - Implement ALM best practices, including version control, testing, and release management.
  - Create articles for the Power Platform Hub.
  - Identify what is needed for end-to-end automation, from the environment request to the enablement of automated pipelines.
- Copilot Studio operations:
  - Create SOPs for the vendor team to perform Copilot Studio operations (environment creation, changes to DLPs, etc.)
- Assess and implement the Copilot Studio Kit:
  - The kit helps makers develop and test custom agents, use large language models to validate AI-generated content, optimize prompts, and track aggregated key performance indicators of their custom agents.
  - Identify and explore how to automate testing for Copilot Studio agents.
- Help validate Copilot Studio requirements from approved AI use cases.
- Define and implement automated controls to validate AI governance processes within Copilot Studio use cases.
- Implement automated ISMS controls.

Skills (must have):
- 5+ years of experience with Power Platform
- Experience with Copilot Studio agents
- Strong expertise in Power Platform governance and security
- Experience designing systems and participating in architecture discussions
- Experience with the Power Platform Admin Center and DLP policies
- Extensive experience building Power Apps (canvas and model-driven apps)
- Proficiency with Power Automate (flows) for process automation
- Strong understanding of Dataverse (Common Data Service) schemas, relationships, and data management
- Experience designing and optimizing Dataverse entities and tables
- Familiarity with Power BI for data visualization and reporting
- Proficiency in JavaScript, TypeScript, and Power Fx
- Knowledge of plugins, custom workflows, and PCF (PowerApps Component Framework) development
- Ability to use APIs and connectors for integration with external systems
- Experience with role-based security in Power Platform applications
- Experience with Microsoft 365 (SharePoint, Teams, Excel) integration
- Proficiency in integrating with third-party services using custom connectors
- Familiarity with Agile or Scrum methodologies
- Ability to perform requirements gathering, solution design, development, and deployment
- Skilled in diagnosing and resolving performance and functionality issues in Power Platform solutions

Nice to have:
- Microsoft Power Platform Functional Consultant or Developer certification
- Experience with Azure services like Azure Logic Apps, Azure Functions, or Azure API Management
- Familiarity with Azure DevOps for CI/CD pipelines for Power Platform applications
- Advanced understanding of SQL and data modeling principles
- Knowledge of ETL processes and tools
- Exposure to other low-code platforms like Appian, Mendix, or OutSystems
- Experience with advanced Power BI features, such as R and Python integration or DAX optimization
- Knowledge of UI/UX best practices for creating intuitive and accessible applications
- Familiarity with tools like Figma or Adobe XD for prototyping
- Domain knowledge in sectors such as finance, healthcare, or manufacturing
- Experience mentoring junior developers or leading a Power Platform team
- Strong stakeholder communication and collaboration skills
- Additional certifications such as Azure Developer Associate or Microsoft Dynamics certifications

Other:
Languages: English: C1 Advanced
Seniority: Lead
Posted 1 month ago
5.0 - 10.0 years
4 - 7 Lacs
Mumbai
Hybrid
PF detection is mandatory.
1. Minimum 5 years of experience in database development and ETL tools.
2. Strong expertise in SQL and database platforms (e.g. SQL Server, Oracle, PostgreSQL).
3. Proficiency in ETL tools (e.g. Informatica, SSIS, Talend, DataStage) and scripting languages (e.g. Python, Shell).
4. Experience with data modeling and schema design.
5. Familiarity with cloud databases and ETL tools (e.g. AWS Glue, Azure Data Factory, Snowflake).
6. Understanding of data warehousing concepts and best practices.
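A minimal sketch of the extract-transform-load flow this role describes, using Python and SQLite from the standard library; the sample rows and the sales schema are invented for illustration:

```python
import sqlite3

# Extract: raw rows as they might arrive from a source file (invented data)
raw_rows = [
    ("  Alice ", "2024-01-05", "1200"),
    ("Bob", "2024-01-06", "950"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer TEXT, sale_date TEXT, amount INTEGER)")

# Transform: trim whitespace from names and cast amounts to integers
cleaned = [(name.strip(), day, int(amount)) for name, day, amount in raw_rows]

# Load: bulk insert into the target table
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 2150
```

Tools like Informatica, SSIS, or AWS Glue express the same three stages declaratively, with connectors replacing the hand-written extract and load steps.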
Posted 1 month ago
2.0 - 7.0 years
4 - 5 Lacs
Gurugram
Work from Office
We are looking for a talented and experienced MERN stack developer with a strong focus on backend development and API integrations. The ideal candidate will have at least 2 years of hands-on experience building and managing scalable web applications using the MERN stack. This role requires a detail-oriented individual who can develop robust backends, ensure seamless API integrations, and contribute to creating high-performance applications.

Key responsibilities:
- Develop, test, and maintain scalable server-side applications using Node.js and Express.js.
- Design and implement RESTful APIs and integrate third-party APIs into the platform.
- Work with MongoDB for database design, queries, and optimization.
- Collaborate with front-end developers to integrate the backend with React.js.
- Write clean, maintainable, and reusable code following industry-standard practices.
- Debug and resolve issues, ensuring high performance and responsiveness.
- Perform unit testing and participate in code reviews to maintain code quality.
- Collaborate with the team to define project requirements and timelines.

Required skills:
- Proficiency in Node.js, Express.js, and backend architecture design.
- Strong experience with MongoDB, including schema design and query optimization.
- Knowledge of RESTful API development and third-party API integrations.
- Familiarity with front-end technologies like React.js for basic troubleshooting.
- Hands-on experience with Git/GitHub for version control.
- Understanding of deployment processes on cloud platforms such as AWS, Heroku, or similar.
- Strong problem-solving and analytical skills.
- Knowledge of Agile methodologies and development practices.

Nice-to-have skills:
- Knowledge of WebSockets for real-time communication.
- Experience with GraphQL APIs.
- Basic knowledge of containerization tools like Docker.

What we offer:
- Competitive salary package in the range of 4-5 LPA.
- Opportunity to work on exciting, challenging projects.
- A collaborative and growth-oriented work environment.
- Flexible work hours and remote work opportunities.
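The stack here is Node.js/Express, but the core backend idea, dispatching (HTTP method, path) pairs to handler functions, is language-neutral. A plain-Python sketch of that routing pattern; the routes and payloads are invented for illustration:

```python
# Registry mapping (HTTP method, path) pairs to handler functions,
# mimicking Express-style app.get(...)/app.post(...) registration.
routes = {}

def route(method, path):
    def register(handler):
        routes[(method, path)] = handler
        return handler
    return register

@route("GET", "/api/users")
def list_users():
    return 200, [{"id": 1, "name": "Asha"}]

@route("POST", "/api/users")
def create_user():
    return 201, {"id": 2, "name": "Ravi"}

def dispatch(method, path):
    # Unknown routes fall through to a 404, as a framework would do
    handler = routes.get((method, path))
    return handler() if handler else (404, {"error": "not found"})

print(dispatch("GET", "/api/users"))
```

In Express the equivalent is `app.get('/api/users', handler)` with the framework performing the dispatch; the sketch just makes that table explicit.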
Posted 1 month ago
3.0 - 6.0 years
8 - 12 Lacs
Mumbai
Work from Office
For more than 40 years, Accelya has been the industry's partner for change, simplifying airline financial and commercial processes and empowering the air transport community to take better control of the future. Whether partnering with IATA on industry-wide initiatives or enabling digital transformation to simplify airline processes, Accelya drives the airline industry forward and proudly puts control back in the hands of airlines so they can move further, faster.

Required knowledge:
- Good knowledge of UNIX/Linux commands and shell scripting
- Minimum L2 Cassandra support
- Knowledge of installation, upgrades, migration, backup and restore, data recovery, performance tuning, and Cassandra schema design, with experience loading data into Cassandra clusters
- Very good English

Nice to have:
- ITIL
- Security best practices
- Sharding for scalability
- Knowledge of and experience with clustering
- Knowledge of and experience with cloud technologies
- OS performance tuning for NoSQL
- Able to produce documentation through Confluence pages

Years of experience: 7+ years, depending on budget and skill level.

What does the future of the air transport industry look like to you? Whether you're an industry veteran or someone with experience from other industries, we want to make your ambitions a reality!
Posted 1 month ago
0.0 - 1.0 years
2 - 5 Lacs
Coimbatore
Work from Office
Responsibilities:
- Keyword research and analysis: Conduct keyword research to identify high-ranking local keywords. Develop and execute local SEO strategies to improve rankings on Google Maps and local search results. Analyze keyword competitiveness and search volume.
- Google My Business management: Optimize and manage Google My Business listings for all locations. Ensure accuracy of business information, including NAP (Name, Address, Phone). Implement GMB best practices and utilize all available features to enhance our online presence.
- Local citations and directories: Build and manage local citations on relevant directories. Ensure consistency of NAP across all platforms.
- Competitive analysis: Monitor and assess the local search presence of competitors. Identify opportunities to outperform competitors in local search rankings.
- On-page SEO: Optimize website content for local keywords and location-specific information. Ensure proper implementation of title tags, meta descriptions, headers, and schema markup.
- Local content creation: Develop location-specific content, including blog posts, landing pages, and service pages. Ensure content is optimized for local search.
- Review management: Monitor and respond to online reviews on various platforms (Google, Yelp, etc.). Implement strategies to improve online reputation.
- Local link building: Build relationships with local influencers, bloggers, and businesses for backlink opportunities. Engage in outreach campaigns for local partnerships.
- Reporting and analysis: Analyze and report on key performance indicators (KPIs) related to local search using tools like Google Analytics, Moz, or SEMrush. Use data insights to make informed recommendations for improving local search visibility. Generate reports to showcase progress and areas for improvement.
- Stay updated on local SEO trends: Stay informed about industry trends, algorithm updates, and best practices in local SEO.
Qualifications:
- Bachelor's degree in Marketing, Digital Marketing, or a related field.
- Proven experience in local SEO optimization.
- Proficiency with SEO tools like Google Analytics, Moz, SEMrush, etc.
- Strong understanding of Google My Business and other local search platforms.
- Excellent written and verbal communication skills.
- Ability to work independently and in a team.

Preferred qualifications:
- Google Analytics or SEO certification.
- Experience with HTML/CSS and website CMS platforms.
- Knowledge of local business directories and citation management.
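The schema markup and NAP-consistency duties above usually come together as schema.org LocalBusiness markup embedded in the page as JSON-LD. A sketch with made-up business details; the `@context`, `@type`, and `PostalAddress` fields are standard schema.org vocabulary:

```python
import json

# Single source of truth for NAP (Name, Address, Phone); values are invented
nap = {"name": "Coimbatore Wheels", "phone": "+91-422-0000000"}

markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": nap["name"],
    "telephone": nap["phone"],
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Coimbatore",
        "addressCountry": "IN",
    },
}

# This JSON would be emitted inside a <script type="application/ld+json"> tag
json_ld = json.dumps(markup, indent=2)
print(json_ld)
```

Generating the markup from one NAP record, rather than hand-editing each page, is one simple way to keep name, address, and phone consistent across listings.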
Posted 1 month ago
10.0 - 12.0 years
12 - 14 Lacs
Noida
Work from Office
Req ID: 327426
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a System Integration Specialist to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

NTT DATA Services is hiring!

Position overview:
At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here. NTT DATA, Inc. currently seeks a Java API Developer to join our team in Bangalore.

General duties and tasks. In this role you will be responsible to:
- Design, develop, and maintain RESTful and GraphQL APIs using Java (8+) and Spring Boot.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, testable, and efficient code following test-driven development (TDD) practices.
- Contribute to architectural decisions and participate in code reviews.
- Optimize database interactions and manage schema design, indexing, and migrations using PostgreSQL or Oracle.
- Utilize Docker and Kubernetes for containerization and orchestration of microservices.
- Support CI/CD pipelines using Jenkins and Rancher for Kubernetes management.
- Ensure code quality and maintainability using tools like JUnit and SonarQube.
- Work in an Agile environment, collaborating closely with team members and stakeholders.

Requirements for this role include:
- Proficiency in Java 8+ (Java 17 preferred).
- Strong experience with Spring Boot and backend API development.
- Hands-on experience with RESTful and GraphQL APIs.
- Familiarity with JUnit, SonarQube, and TDD methodologies.
- Experience with Docker, Kubernetes, and microservices architecture.
- Knowledge of Rancher and Jenkins for CI/CD.
- Strong database skills in PostgreSQL or Oracle, including performance tuning and schema design.
- Proficient with Git and version control best practices.
- Excellent problem-solving skills and ability to work collaboratively in Agile teams.

Preferences: Required schedule availability for this position is Monday-Friday (12:00 PM to 10:00 PM IST). The shift timings can be changed as per client requirements. Additionally, resources may have to do overtime and work on weekends based on business requirements.
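Both REST and GraphQL appear throughout these requirements. On the wire, a GraphQL call is simply an HTTP POST whose JSON body carries the query text and its variables; a sketch of constructing that payload (the GetUser operation and its fields are hypothetical, not from any real schema):

```python
import json

# Hypothetical GraphQL operation; $id is bound via the variables map
query = """
query GetUser($id: ID!) {
  user(id: $id) { id name }
}
"""

# The standard request body shape accepted by GraphQL servers
payload = json.dumps({"query": query, "variables": {"id": "42"}})
print(payload)
```

A Spring Boot service exposing GraphQL would receive exactly this body at its `/graphql` endpoint and resolve the `user` field from its schema.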
Posted 1 month ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Req ID: 324434
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN).

Job Title: API Developer (Java | Spring Boot | REST & GraphQL)

About the role:
We are looking for a passionate and experienced API developer to join our backend engineering team. In this role, you will play a critical part in designing and implementing robust APIs and microservices. You will work with a cutting-edge tech stack including Spring Boot, Java 8+ (Java 17 preferred), GraphQL, Docker and Kubernetes. You'll be responsible for writing clean, testable and efficient code, contributing to architecture decisions and supporting full lifecycle development from concept to deployment.

Required skills:
- Java 8+ (Java 17 preferred)
- Strong backend development experience using Spring Boot, specifically building APIs that integrate with relational databases
- Proficient in building and consuming RESTful and GraphQL APIs
- Experience with JUnit, SonarQube and test-driven development
- Knowledge of Docker and Kubernetes for microservice architecture
- Experience with Rancher for Kubernetes and Jenkins for CI/CD is preferred
- Hands-on expertise in PostgreSQL/Oracle, including schema design, indexing, query optimization and migration
- Familiarity with version control (Git)
- Strong problem-solving skills and the ability to work collaboratively in Agile teams

Nice to have:
- Experience with Apache Kafka
- Understanding of API security protocols (OAuth2, JWT, etc.)
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Req ID: 327427
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

NTT DATA Services is hiring!

Position overview:
At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here. NTT DATA, Inc. currently seeks a Java API Developer to join our team in Bangalore.

General duties and tasks. In this role you will be responsible to:
- Design, develop, and maintain RESTful and GraphQL APIs using Java (8+) and Spring Boot.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, testable, and efficient code following test-driven development (TDD) practices.
- Contribute to architectural decisions and participate in code reviews.
- Optimize database interactions and manage schema design, indexing, and migrations using PostgreSQL or Oracle.
- Utilize Docker and Kubernetes for containerization and orchestration of microservices.
- Support CI/CD pipelines using Jenkins and Rancher for Kubernetes management.
- Ensure code quality and maintainability using tools like JUnit and SonarQube.
- Work in an Agile environment, collaborating closely with team members and stakeholders.

Requirements for this role include:
- Proficiency in Java 8+ (Java 17 preferred).
- Strong experience with Spring Boot and backend API development.
- Hands-on experience with RESTful and GraphQL APIs.
- Familiarity with JUnit, SonarQube, and TDD methodologies.
- Experience with Docker, Kubernetes, and microservices architecture.
- Knowledge of Rancher and Jenkins for CI/CD.
- Strong database skills in PostgreSQL or Oracle, including performance tuning and schema design.
- Proficient with Git and version control best practices.
- Excellent problem-solving skills and ability to work collaboratively in Agile teams.

Preferences: Required schedule availability for this position is Monday-Friday (12:00 PM to 10:00 PM IST). The shift timings can be changed as per client requirements. Additionally, resources may have to do overtime and work on weekends based on business requirements.
Posted 1 month ago
8.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Req ID: 324436
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN).

About the role:
We are looking for a passionate and experienced API developer to join our backend engineering team. In this role, you will play a critical part in designing and implementing robust APIs and microservices. You will work with a cutting-edge tech stack including Spring Boot, Java 8+ (Java 17 preferred), GraphQL, Docker and Kubernetes. You'll be responsible for writing clean, testable and efficient code, contributing to architecture decisions and supporting full lifecycle development from concept to deployment.

Required skills:
- Java 8+ (Java 17 preferred)
- Strong backend development experience using Spring Boot, specifically building APIs that integrate with relational databases
- Proficient in building and consuming RESTful and GraphQL APIs
- Experience with JUnit, SonarQube and test-driven development
- Knowledge of Docker and Kubernetes for microservice architecture
- Experience with Rancher for Kubernetes and Jenkins for CI/CD is preferred
- Hands-on expertise in PostgreSQL/Oracle, including schema design, indexing, query optimization and migration
- Familiarity with version control (Git)
- Strong problem-solving skills and the ability to work collaboratively in Agile teams

Nice to have:
- Experience with Apache Kafka
- Understanding of API security protocols (OAuth2, JWT, etc.)
Posted 1 month ago
3.0 - 5.0 years
5 - 6 Lacs
Bengaluru
Work from Office
Req ID: 329417
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java API Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Key responsibilities:
- Design, develop, and maintain RESTful and GraphQL APIs using Java (8+) and Spring Boot.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, testable, and efficient code following test-driven development (TDD) practices.
- Contribute to architectural decisions and participate in code reviews.
- Optimize database interactions and manage schema design, indexing, and migrations using PostgreSQL or Oracle.
- Utilize Docker and Kubernetes for containerization and orchestration of microservices.
- Support CI/CD pipelines using Jenkins and Rancher for Kubernetes management.
- Ensure code quality and maintainability using tools like JUnit and SonarQube.
- Work in an Agile environment, collaborating closely with team members and stakeholders.

Required skills:
- Proficiency in Java 8+ (Java 17 preferred).
- Strong experience with Spring Boot and backend API development.
- Hands-on experience with RESTful and GraphQL APIs.
- Familiarity with JUnit, SonarQube, and TDD methodologies.
- Experience with Docker, Kubernetes, and microservices architecture.
- Knowledge of Rancher and Jenkins for CI/CD.
- Strong database skills in PostgreSQL or Oracle, including performance tuning and schema design.
- Proficient with Git and version control best practices.
- Excellent problem-solving skills and ability to work collaboratively in Agile teams.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Data Engineer - Palantir
Role: Palantir Data Engineer
No. of years of experience: 5+ years total (3+ years of relevant Palantir experience)
Mandatory Skills: Palantir
Work Location: India - Hybrid
Job Title: Palantir Foundry Data Engineer
The Role
As a Palantir Foundry Data Engineer, you will be at the forefront of transforming raw data into actionable intelligence. You'll work directly with customers and internal teams to design and implement robust data pipelines, ensuring that data is clean, structured, and ready for analysis in Palantir's platforms. This role is deeply technical and highly collaborative, requiring both engineering expertise and a strong understanding of real-world data challenges.
What You'll Do
Data Integration & Modeling: Ingest and transform data from diverse sources (e.g., APIs, databases, flat files, streaming systems) into structured, queryable formats using Palantir's tools and custom code.
Pipeline Development: Build scalable, fault-tolerant ETL/ELT pipelines using technologies such as PySpark, SQL, and Palantir's proprietary tools like Foundry's Code Repositories and Pipeline Builder.
Customer Collaboration: Work closely with clients to understand their data landscape, identify integration challenges, and deliver tailored solutions that meet their operational and analytical needs.
Data Quality & Governance: Implement validation checks, monitoring, and documentation to ensure data integrity, traceability, and compliance with security and privacy standards.
Cross-functional Impact: Partner with deployment strategists, software engineers, and product designers to shape how data is used across applications and decision-making workflows.
Object Management: Create the Ontology objects, links, and actions that allow downstream applications to maintain the data.
Palantir Tool Knowledge: Hands-on experience with Code Repositories, Pipeline Builder, Code Workbook, Data Health, Data Connections, egress policies, Ontology management, Contour, Solution Designer, and Data Lineage.
What We're Looking For
Technical Proficiency: Strong coding skills in Python, Java, or Scala, and deep experience with SQL. Familiarity with distributed data processing frameworks (e.g., Spark, Flink) is a plus.
Data Architecture Knowledge: Understanding of data warehousing, data lakes, schema design, and performance optimization.
Problem Solving: Ability to break down complex data problems and design elegant, scalable solutions.
Communication Skills: Comfortable engaging with both technical and non-technical stakeholders. You can explain data concepts clearly and advocate for best practices.
Adaptability: Thrive in fast-paced, ambiguous environments. You're comfortable learning new tools and technologies on the fly.
Preferred Qualifications
Experience with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes).
Familiarity with Palantir Foundry.
Background in a domain such as oil & gas, finance, healthcare, logistics, or defence is a plus.
Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.
Location: PAN India
Years of Experience: 5+
Posted 1 month ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
4+ years of backend development experience with Node.js. Strong command of TypeScript and JavaScript (ES6+). Hands-on experience with RESTful API design and GraphQL schema development. Good understanding of Git workflows and CI/CD tools like Jenkins and GitHub Actions. Experience with MongoDB, PostgreSQL, or MySQL. Mandatory Skills: Node.js. Experience: 3-5 years.
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AWS Data Engineer Lead to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Data Engineer Lead
Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git, and CI/CD pipelines (mandatory)
Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink
Experience with software support for applications written in Python and SQL
Administration, configuration, and maintenance of Snowflake and DBT
Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform, and GitHub
Debugging issues, root cause analysis, and applying fixes
Management and maintenance of ETL processes (bug fixing and batch job monitoring)
Training & Certification
Apache Kafka Administration
Snowflake Fundamentals/Advanced Training
Experience
8 years of experience in a technical role working with AWS
At least 2 years in a leadership or management role
About NTT DATA
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here.
If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 1 month ago
3.0 - 8.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Req ID: 325273 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Adobe AEP Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).
Experience in developing digital marketing / digital analytics solutions using Adobe products
Experience with Adobe Experience Cloud products, including a minimum of 3+ years of recent experience with Adobe Experience Platform or a similar CDP
Good knowledge of Data Science Workspace and building intelligent services on AEP
Strong knowledge of datasets in Adobe Experience Platform; loading data into Platform through data source connectors, APIs, and streaming ingestion connectors
Experience in creating all required Adobe XDM (Experience Data Model) schemas in JSON, based on the approved data model, for all loaded data files
Knowledge of utilizing the Adobe Experience Platform (AEP) UI and Postman to automate all customer schema data lake and profile design setups within each sandbox environment
Experience in configuring all necessary identities and privacy settings within Adobe Experience Platform, and creating new segments within AEP to meet customer use cases; testing/validating the segments with the required destinations
Managing customer data using Real-Time Customer Data Platform (RTCDP) and analyzing customer data using Customer Journey Analytics (CJA)
Experience with creating connections, data views, and dashboards in CJA
Hands-on experience in configuration and integration of Adobe Marketing Cloud modules like Audience Manager, Analytics, Campaign, and Target
Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable
Proven ability to communicate both verbally and in writing in a high-performance, collaborative environment
Experience with data analysis, modeling, and mapping to coordinate closely with Data Architect(s)
Build the necessary schemas and workflows to ingest customer data, transform the data, and load the data into AEP successfully
Build audiences (segmentations) and create the necessary pipeline for destination activation
Posted 1 month ago
6.0 - 10.0 years
7 - 11 Lacs
Pune
Work from Office
Req ID: 323909 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Ingest Engineer to join our team in Pune, Maharashtra (IN-MH), India (IN).
Job Duties: The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. This is a position within the Ingestion team of the DRIFT data ecosystem. The focus is on ingesting data in a timely, complete, and comprehensive fashion while using the latest technology available to Citi. We leverage new and creative methods for repeatable data ingestion from a variety of data sources, while always asking "is this the best way to solve this problem?" and "am I providing the highest-quality data to my downstream partners?"
Responsibilities:
Partner with multiple management teams to ensure appropriate integration of functions to meet goals, as well as identify and define necessary system enhancements to deploy new products and process improvements
Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint
Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
Develop comprehensive knowledge of how areas of the business, such as architecture and infrastructure, integrate to accomplish business goals
Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
Minimum Skills Required:
6-10 years of relevant experience in an applications development or systems analysis role
Extensive experience in systems analysis and programming of software applications
Application development using Java, Scala, and Spark
Familiarity with event-driven applications and streaming data
Experience with Confluent Kafka, HDFS, Hive, and structured and unstructured database systems (SQL and NoSQL)
Experience with various schemas and data types, such as JSON, Avro, Parquet, etc.
Experience with various ELT methodologies and formats, such as JDBC, ODBC, API, webhook, SFTP, etc.
Experience working with Agile methodology and version control tool sets (JIRA, Bitbucket, Git, etc.)
Ability to adjust priorities quickly as circumstances dictate
Demonstrated leadership and project management skills
Consistently demonstrates clear and concise written and verbal communication
Posted 1 month ago
5.0 - 10.0 years
4 - 5 Lacs
Bengaluru
Work from Office
Req ID: 329922 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Cloud Data Support Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Basic Qualifications (skills required for this job, with minimum years of experience for each):
Must have 8-10 years of experience in Cloud Data Engineering Support roles, preferably in Microsoft Azure
Must have several years of Cloud Data Engineering Support experience with a Microsoft Azure data warehouse
Experience in a cloud-environment front-line support role handling break/fix recovery
Must have Cloud Data Engineering Support experience working with large datasets and complex data environments, processes, and associated solutions
Incident and problem management using ServiceNow
Knowledge of and experience with data warehouse star schemas and ETL concepts
Basic SQL skills and experience
5-6 years of experience and knowledge of streaming support with Event Hub, Databricks clusters, Cosmos DB, Scala, Azure SQL DB, and Azure App Insights
Knowledge of APIs along with Azure Functions, Scala, Python, Azure SQL DB, and Azure App Insights
Knowledge of batch support with Databricks, Python, Microsoft Data Factory, Azure App Insights, and IWS job scheduling
Knowledge of automated and scheduled batch job tools such as IBM IWS and Control-M
Must have good oral and written communication skills to effectively communicate with various IT teams and business representatives
Must be able to both collaborate in a team-oriented environment and work independently with direction
Nice to Have (but not a must):
Knowledge of data product hygiene, and the ability to champion this in your work
Support Security and Audit Compliance initiatives related to the Microsoft Azure environment
Experience with Big Data solutions and NoSQL DBMSs
Experience with Azure, AWS, or other public cloud environments
Experience with Microsoft Azure platform technologies like Databricks applications, Event Hub, Data Factory, Scala, Synapse Analytics, Delta Lake, Cosmos DB, or DevOps
Bachelor's degree in Computer Science, Computer Information Systems, Management Information Systems, or a related field preferred
Posted 1 month ago
2.0 - 6.0 years
4 - 7 Lacs
Hyderabad
Work from Office
We are looking to hire SAP HR ABAP professionals in the following areas:
Key Responsibilities:
Develop, enhance, and maintain ABAP programs, reports, interfaces, conversions, enhancements, and forms (RICEF) in SAP HCM
Work on HR-specific ABAP, including Logical Databases (LDBs), infotypes, macros, payroll schemas, PCRs, and BAdIs
Design and optimize custom workflows, ALV reports, Smart Forms, SAPscripts, and Adobe Forms for HR processes
Develop OData services, CDS views, and Fiori apps for HCM modules
Integrate SAP HCM with SuccessFactors, ESS/MSS, and other third-party systems
Debug and troubleshoot complex HR-PY, PA, TM, and OM issues
Collaborate with functional teams to understand business requirements and provide technical solutions
Perform performance tuning and code optimization for large-scale HR data processing
Follow SAP best practices and ensure compliance with coding standards
Working knowledge of CPI, or willingness to learn it
Required Skills & Qualifications:
10+ years of hands-on ABAP development experience, with at least 5+ years in SAP HCM
Strong expertise in HR ABAP, including: infotype programming, the PNP logical database, payroll and time schema modifications; HR BAdIs, enhancement spots, and user exits; Workforce Analytics (HR-SF), ESS/MSS, and Portal integration
Proficient in object-oriented ABAP, BAPIs, BDCs, RFCs, IDocs, and Web Dynpro
Experience with SAP Fiori and OData
Knowledge of SAP HCM integration with SuccessFactors is a plus
Strong debugging and performance optimization skills
SAP S/4HANA experience is desirable
Excellent communication and client-facing skills
Our Hyperlearning workplace is grounded upon four principles:
Flexible work arrangements, free spirit, and emotional positivity
Agile self-determination, trust, transparency, and open collaboration
All support needed for the realization of business goals
Stable employment with a great atmosphere and ethical corporate culture
Posted 1 month ago
8.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Work from Office
We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data.
Key Responsibilities:
Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks
Develop and manage data models and data warehouse architecture within Snowflake
Create and maintain DBT models for transformation, lineage tracking, and documentation
Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation
Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements
Ensure data integrity, consistency, and governance across all stages of the data lifecycle
Monitor pipeline performance and implement optimization strategies for queries and storage
Follow best practices for data engineering, including version control (Git), testing, and CI/CD integration
Required Skills and Qualifications:
8+ years of experience in Data Engineering or related roles
Deep expertise in Snowflake: schema design, performance tuning, security, and access controls
Proficiency in Python, particularly for scripting, data transformation, and workflow automation
Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization)
Proven experience with DBT for building modular, tested, and documented data pipelines
Familiarity with ETL/ELT tools and orchestration platforms like Apache Airflow or Prefect
Advanced SQL skills with experience handling large and complex data sets
Exposure to cloud platforms such as AWS, Azure, or GCP and their data services
Preferred Qualifications:
Experience implementing data quality checks and governance frameworks
Understanding of the modern data stack and CI/CD pipelines for data workflows
Contributions to data engineering best practices, open-source projects, or thought leadership
Posted 1 month ago