11.0 - 15.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges, and will spend most of their time designing and coding. The role involves end-to-end contribution to technology-oriented development projects; providing solutions with minimum system requirements, in Agile mode; collaborating with fellow Power Programmers, the open-source community, and tech user groups; and custom development of new platforms and solutions. Opportunities: Work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies on cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to open source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional Requirements: Cloud architecture and design; cloud optimization and automation; innovation and thought leadership; extensive experience with AWS, Azure, or GCP cloud platforms; deep understanding of cloud computing concepts, including IaaS, PaaS, and SaaS; strong experience with infrastructure as code (IaC) and DevOps practices; experience with containerization and orchestration (Docker, Kubernetes); strong knowledge of cloud security best practices and compliance standards; industry certifications. Preferred Skills: Technology-Cloud Platform-Cloud Platform - ALL; Technology-Container Platform-Docker; Technology-Container Platform-Kubernetes; Technology-Cloud Platform-Google Cloud - Architecture
Posted 4 weeks ago
5.0 - 8.0 years
15 - 18 Lacs
Bengaluru
Work from Office
Looking for immediate joiners. Roles: Design, develop, and maintain scalable and robust Java applications. Collaborate with product managers, designers, and other engineers to gather requirements and translate them into technical specifications. Write clean, maintainable, and efficient code following best practices and coding standards. Conduct code reviews and provide constructive feedback to team members. Troubleshoot and debug applications to optimize performance and resolve issues. Participate in the full software development lifecycle, including planning, development, testing, and deployment. Stay up-to-date with emerging technologies and industry trends to continuously improve skills and knowledge. Mentor and guide junior engineers, fostering a culture of learning and collaboration within the team. Qualification: Bachelor's degree in Computer Science, Engineering, or a related field. 5 to 10 years of professional experience in software development, with a strong focus on one or more programming languages such as Java. Experience with RESTful APIs and microservices architecture. Proficiency in Java frameworks such as Spring, Hibernate, or similar. Knowledge of database systems (e.g., MySQL, PostgreSQL, Oracle) and SQL. Experience working with event-driven architectures. Understanding of software development methodologies (Agile, Scrum, etc.). Strong problem-solving skills and the ability to work independently as well as in a team environment. Familiarity with front-end technologies (e.g., HTML, CSS, JavaScript) is a plus. Excellent communication and interpersonal skills. Preferred qualifications: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Familiarity with containerization technologies (e.g., Docker, Kubernetes). Knowledge of CI/CD tools and practices. Experience with test-driven development (TDD) and automated testing frameworks.
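The posting above asks for experience with event-driven architectures. As a rough, hedged illustration of the pattern (an in-process sketch only, not any particular message broker or the employer's stack), the core publish/subscribe idea looks like this in Python:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process event bus: handlers subscribe by topic name.
    Illustrative only; production systems would use Kafka, RabbitMQ, etc."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._handlers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver the event to every handler registered for this topic.
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("order.created", lambda e: received.append(e["order_id"]))
bus.publish("order.created", {"order_id": 42})
```

The decoupling shown here (publishers never reference subscribers directly) is the property event-driven designs scale on.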
Posted 4 weeks ago
8.0 - 12.0 years
25 - 27 Lacs
Bengaluru
Work from Office
About the Role: We are seeking a skilled RPA Developer with expertise in UiPath and strong backend development experience in Java, Python, and .NET. The ideal candidate will have a solid understanding of robotic process automation (RPA) development, along with the technical expertise to integrate backend technologies to enhance automation processes. You will be responsible for developing, designing, and maintaining RPA workflows that integrate with multiple systems using backend technologies. Key Responsibilities: RPA Development: Design, develop, and deploy RPA solutions using UiPath, automating repetitive processes, improving operational efficiency, and ensuring accuracy. Backend Integration: Collaborate with backend development teams to integrate RPA solutions with existing backend technologies such as Java, Python, and .NET to improve system automation and data flow. Process Analysis: Analyze business processes, identify automation opportunities, and create detailed workflows to automate tasks while ensuring the scalability and security of the automation. Solution Design & Architecture: Design and implement RPA solutions in alignment with business goals and IT strategy, ensuring seamless integration with enterprise systems and applications. Coding & Scripting: Write and maintain backend code in Java, Python, and .NET to support RPA functionality, including building custom APIs, handling data transformations, and ensuring smooth communication between systems. Testing & Debugging: Ensure thorough testing, troubleshooting, and debugging of RPA solutions and backend integrations to ensure stability and reliability. Maintenance & Support: Provide ongoing support and maintenance for existing RPA workflows, including enhancements, bug fixes, and optimizations. Collaboration & Documentation: Collaborate with cross-functional teams (Business Analysts, IT, and Operations) to gather requirements and translate them into technical solutions.
Document RPA workflows, backend integration details, and system architecture. Key Skills & Qualifications: RPA Development: Hands-on experience with the UiPath RPA platform, including development, deployment, and maintenance of automation processes. Programming Skills: Strong proficiency in Java, Python, and .NET for backend integration and development of custom components. Experience with RESTful APIs, web services, and database connectivity (SQL, NoSQL) to facilitate backend integration. Experience with Automation Tools: Familiarity with additional RPA tools is a plus (e.g., Automation Anywhere, Blue Prism). Backend Technologies: Deep understanding of backend technologies and frameworks in Java, Python, and .NET to build integrations, perform data transformations, and optimize system performance. Education & Experience: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). Minimum 5-8 years of experience working with RPA (UiPath), Java, Python, and .NET. Proven experience in developing backend solutions and integrating with RPA tools. Desired Skills (Optional): Certifications in UiPath RPA. Knowledge of cloud platforms like AWS, Azure, or Google Cloud is a plus. Familiarity with DevOps practices and CI/CD pipelines.
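The backend-integration duty above mentions "handling data transformations" between systems. A minimal sketch of what such a mapping step often looks like, in plain Python (all field names here are invented for illustration, not from any real integration):

```python
import json

def transform_invoice(raw: dict) -> dict:
    """Map a hypothetical source payload onto a hypothetical target schema,
    normalizing types on the way (string id, rounded float amount, int qty).
    Field names are illustrative only."""
    return {
        "invoiceId": str(raw["id"]),
        "amount": round(float(raw["total"]), 2),
        "currency": raw.get("currency", "INR"),
        "lines": [{"sku": l["sku"], "qty": int(l["qty"])} for l in raw["items"]],
    }

source = json.loads('{"id": 1001, "total": "2499.504", "items": [{"sku": "A1", "qty": "3"}]}')
target = transform_invoice(source)
```

Keeping the mapping in one pure function like this makes it easy to unit-test independently of the RPA workflow that calls it.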
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Here at UKG, our purpose is people™. Our HR, payroll, and workforce management solutions help organizations unlock happier outcomes for all. And our U Krewers, who build those solutions and support our business, are talented, collaborative, and innovative problem-solvers. We strive to create a culture of belonging and an employee experience that empowers our people – both at work and at home. Our benefits show that we care about the whole you, from adoption and surrogacy assistance to tuition reimbursement and wellness programs. Our employee resource groups provide a welcoming place to land, learn, and connect with those who share your passions and interests. What are you waiting for? Learn more at www.ukg.com/careers. #WeAreUKG Description & Qualifications: Site Reliability Engineers at UKG are team members that have a breadth of knowledge encompassing all aspects of service delivery. They develop software solutions to enhance, harden, and support our service delivery processes. This can include building and managing CI/CD deployment pipelines, automated testing, capacity planning, performance analysis, monitoring, alerting, chaos engineering, and auto-remediation. Site Reliability Engineers must have a passion for learning and evolving with current technology trends. They strive to innovate and are relentless in their pursuit of a flawless customer experience. They have an "automate everything" mindset, helping us bring value to our customers by deploying services with incredible speed, consistency, and availability.
Primary/Essential Duties and Key Responsibilities: Engage in and improve the lifecycle of services from conception to EOL, including system design consulting and capacity planning. Define and implement standards and best practices related to system architecture, service delivery, metrics, and the automation of operational tasks. Support services, product, and engineering teams by providing common tooling and frameworks to deliver increased availability and improved incident response. Improve system performance, application delivery, and efficiency through automation, process refinement, postmortem reviews, and in-depth configuration analysis. Collaborate closely with engineering professionals within the organization to deliver reliable services. Identify and eliminate operational toil by treating operational challenges as a software engineering problem. Actively participate in incident response, including on-call responsibilities. Qualifications: Engineering degree, or a related technical discipline, or equivalent work experience. Experience coding in higher-level languages (e.g., Python, JavaScript, C++, or Java). Knowledge of cloud-based applications and containerization technologies. Demonstrated understanding of best practices in metric generation and collection, log aggregation pipelines, time-series databases, and distributed tracing. Ability to analyze the technology and engineering practices currently utilized within the company and develop steps and processes to improve and expand upon them. Working experience with industry standards like Terraform and Ansible. (Experience, Education, Certification, License and Training) Must have at least 5 years of hands-on experience working within Engineering or Cloud. Minimum 2 years' experience with public cloud platforms (e.g., GCP, AWS, Azure). Experience in configuration and maintenance of applications and systems infrastructure.
Experience with distributed system design and architecture. Experience building and managing CI/CD pipelines. EEO Statement: Equal Opportunity Employer. Ultimate Kronos Group is proud to be an equal opportunity employer and is committed to maintaining a diverse and inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, disability, marital status, familial status, sexual orientation, pregnancy, genetic information, gender identity, gender expression, national origin, ancestry, citizenship status, veteran status, and any other legally protected status under federal, state, or local anti-discrimination laws. View the EEO Know Your Rights poster and its supplement. View the Pay Transparency Nondiscrimination Provision. UKG participates in E-Verify. View the E-Verify posters here.
Posted 4 weeks ago
6.0 - 10.0 years
6 - 10 Lacs
Greater Noida
Work from Office
SQL DEVELOPER: Design and implement relational database structures optimized for performance and scalability. Develop and maintain complex SQL queries, stored procedures, triggers, and functions. Optimize database performance through indexing, query tuning, and regular maintenance. Ensure data integrity, consistency, and security across multiple environments. Collaborate with cross-functional teams to integrate SQL databases with applications and reporting tools. Develop and manage ETL (Extract, Transform, Load) processes for data ingestion and transformation. Monitor and troubleshoot database performance issues. Automate routine database tasks using scripts and tools. Document database architecture, processes, and procedures for future reference. Stay updated with the latest SQL best practices and database technologies. Data Retrieval: SQL Developers must be able to query large and complex databases to extract relevant data for analysis or reporting. Data Transformation: They often clean, join, and reshape data using SQL to prepare it for downstream processes like analytics or machine learning. Performance Optimization: Writing queries that run efficiently is key, especially when dealing with big data or real-time systems. Understanding of Database Schemas: Knowing how tables relate and how to navigate normalized or denormalized structures is essential. QE: Design, develop, and execute test plans and test cases for data pipelines, ETL processes, and data platforms. Validate data quality, integrity, and consistency across various data sources and destinations. Automate data validation and testing using tools such as PyTest, Great Expectations, or custom Python/SQL scripts. Collaborate with data engineers, analysts, and product managers to understand data requirements and ensure test coverage. Monitor data pipelines and proactively identify data quality issues or anomalies. Contribute to the development of data quality frameworks and best practices.
Participate in code reviews and provide feedback on data quality and testability. Strong SQL skills and experience with large-scale data sets. Proficiency in Python or another scripting language for test automation. Experience with data testing tools. Familiarity with cloud platforms and data warehousing solutions.
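The SQL Developer duties above single out "indexing" and "query tuning" for performance. A small, self-contained demonstration of why, using Python's built-in sqlite3 (the table and data are invented; the same principle holds in MySQL, PostgreSQL, or Oracle):

```python
import sqlite3

# In-memory database to show how an index changes the query plan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id is a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 7"
).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, SQLite seeks directly to the matching rows.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 7"
).fetchall()

total = conn.execute("SELECT SUM(amount) FROM orders WHERE customer_id = 7").fetchone()[0]
```

Reading the query plan before and after adding an index is the everyday form of the tuning work the posting describes.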
Posted 4 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Primary Skills: IBM Sterling B2B Integrator Expertise: Proficient in designing, developing, and maintaining integration solutions using IBM Sterling B2B Integrator, including business process modeling, service configuration, and partner onboarding. Business Process Modeling (BPM): Skilled in creating and managing complex business processes using BPML (Business Process Modeling Language), including custom services, adapters, and error handling logic. EDI and B2B Integration: Strong understanding of Electronic Data Interchange (EDI) standards such as X12, EDIFACT, and TRADACOMS, and experience in implementing B2B integrations with trading partners. Map Development and Data Transformation: Experience in developing and maintaining maps using Sterling Map Editor for data transformation between different formats (EDI, XML, flat files, JSON). Communication Protocols and Adapters: Hands-on experience with communication protocols like FTP, SFTP, HTTP, AS2, and adapters such as File System, FTP, HTTP, and JMS for secure and reliable data exchange. Partner Onboarding and Management: Ability to configure trading partner profiles, certificates, and communication channels, ensuring smooth onboarding and secure data transactions. System Monitoring and Troubleshooting: Proficient in using Sterling dashboards, logs, and monitoring tools to identify, troubleshoot, and resolve integration issues and performance bottlenecks. Deployment and Environment Management: Familiarity with deploying solutions across development, QA, and production environments, including packaging, versioning, and rollback strategies.
Secondary Skills Knowledge of IBM Sterling File Gateway and Control Center Experience with scripting languages (Shell, Python) for automation and custom utilities Familiarity with database systems (Oracle, DB2, SQL Server) and writing SQL queries Understanding of security standards (SSL/TLS, PGP, digital certificates) in B2B contexts Exposure to Agile methodologies and tools like Jira, Confluence, or ServiceNow Basic knowledge of cloud platforms and hybrid integration scenarios Strong documentation and communication skills for working with business and technical teams
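The map-development skill above is about translating between formats such as EDI, flat files, and JSON. As a rough stdlib-Python illustration of the underlying idea only (this is not Sterling's Map Editor, and the record layout is invented), a fixed-width flat-file record can be cut into named fields like so:

```python
def parse_fixed_width(line: str, layout: list) -> dict:
    """Cut a fixed-width record into named fields.
    `layout` is a list of (name, start, end) column slices, standing in
    for what a Sterling map definition would specify."""
    return {name: line[start:end].strip() for name, start, end in layout}

# Invented layout: 2-char record type, 8-char partner id, 8-char amount.
LAYOUT = [("record_type", 0, 2), ("partner_id", 2, 10), ("amount", 10, 18)]
record = parse_fixed_width("IN ACME01   123.45", LAYOUT)
```

Once the record is a dict, serializing it onward as JSON or XML is the "target side" of the transformation a map performs.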
Posted 4 weeks ago
6.0 - 8.0 years
0 Lacs
Mohali, Punjab, India
On-site
We are seeking an experienced Lead Full Stack Developer with a strong background in the MERN stack and serverless technologies. The ideal candidate will have a minimum of 6 years of experience in software development, with significant expertise in both relational (RDBMS) and NoSQL databases. You will lead a team in designing and developing scalable, high-quality solutions while ensuring best practices are followed throughout the development lifecycle. Responsibilities: Develop and maintain scalable backend services using Node.js and serverless technologies such as AWS Lambda, Google Cloud Functions, or Azure Functions. Architect and implement front-end solutions using React.js, ensuring seamless integration with backend services. Design, implement, and optimise both relational (PostgreSQL, MySQL) and NoSQL (MongoDB) databases. Translate business requirements into robust, efficient technical solutions. Conduct code reviews to maintain high standards of code quality, performance, and maintainability. Stay updated with the latest trends and advancements in the MERN stack, serverless architectures, and cloud platforms. Lead and mentor a team of developers, providing technical guidance and fostering a collaborative environment. Requirements: Minimum of 6 years of professional experience in software development, with at least 6 years specifically in the MERN stack. Proven leadership experience in managing and mentoring development teams. Proficiency in Node.js, Express.js, React.js, and MongoDB. Hands-on experience with serverless architectures, including AWS Lambda, Google Cloud Functions, or Azure Functions. Strong understanding of RDBMS (PostgreSQL, MySQL) and NoSQL databases (MongoDB). Solid understanding of RESTful APIs, microservices architecture, and cloud platforms. Excellent problem-solving, debugging, and collaboration skills. Additional Skills: Experience with GraphQL, Docker, and Kubernetes is a plus.
Contributions to open-source projects are highly valued. Understanding of DevOps practices and CI/CD pipelines.
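The serverless requirement above centers on function-as-a-service handlers such as AWS Lambda. A minimal handler sketch in Python, runnable locally with no cloud account (the event and response shapes mimic API Gateway proxy integration, but are assumptions for illustration; this posting's stack is Node.js):

```python
import json

def handler(event: dict, context=None) -> dict:
    """Lambda-style handler: parse the request body, validate it, and
    return an HTTP-shaped response dict. Shapes are illustrative."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}
    name = body.get("name")
    if not name:
        return {"statusCode": 422, "body": json.dumps({"error": "name is required"})}
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}"})}

# Invoked locally, exactly as a unit test would:
resp = handler({"body": '{"name": "Asha"}'})
```

Keeping the handler a pure function of its event makes it trivially testable before any deployment tooling is involved, which is the main design win of the serverless style.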
Posted 4 weeks ago
1.0 - 2.0 years
4 - 8 Lacs
Mumbai
Work from Office
* Strong understanding of JavaScript/TypeScript. * Experience with databases like PostgreSQL or MySQL. * REST API design and integration. * Experience with ReactJS, NestJS, or Next.js API routes. * Exposure to AWS, Firebase, or cloud platforms.
Posted 4 weeks ago
7.0 - 12.0 years
17 - 32 Lacs
Navi Mumbai, Pune
Work from Office
Hi, we are looking to hire an IDP Solutions Engineer with experience in IDP platforms/technologies, data extraction, AWS, OCR, etc. Job Location: Pune / Vashi, Navi Mumbai. Experience: 7-12 years. Work Mode: Work from office. Job Profile: We are hiring a Solutions Engineer with hands-on experience in IDP/OCR, digitization, data extraction, and solutioning. The addition of a Solution Engineer is critical to support our growing pipeline of complex opportunities that require in-depth technical evaluation, solution design, and client alignment prior to contract closure. As our product and service portfolio becomes more sophisticated, clients increasingly expect tailored demonstrations, rapid prototyping, and detailed technical responses to RFPs and RFIs. The Presales Solution Engineer will work closely with sales, product, and implementation teams to ensure proposals are technically sound, feasible, and aligned with client needs. Their involvement directly impacts win rates, accelerates the sales cycle, and ensures a smoother transition to delivery teams post-sale. Increased engagement of a dedicated SE will help: improve technical accuracy and credibility during client conversations; enhance solution quality and reduce post-sale rework; shorten response times to RFPs and technical queries; increase overall win rates and deal value. This role is essential for ensuring our offerings are positioned competitively and with confidence in high-stakes deals, especially in enterprise and public sector pursuits. The Presales Solution Engineer plays a critical role in the sales process by working closely with account executives, prospects, and customers to understand requirements and propose tailored technical solutions. This individual bridges the gap between customer needs and the company's products or services, delivering compelling solution demonstrations and technical insights.
Key Responsibilities: Collaborate with the sales team to identify customer needs and define technical requirements. Prepare and deliver high-impact product demonstrations, presentations, and proofs of concept. Design tailored solutions that align with client objectives and leverage the organization's offerings. Respond to RFPs, RFIs, and technical questionnaires in coordination with proposal teams. Act as a technical advisor during the sales cycle, addressing questions around architecture, security, scalability, and integration. Coordinate with product and engineering teams to ensure proposed solutions are feasible and aligned with the roadmap. Support pilots, solution workshops, and stakeholder discussions. Maintain up-to-date knowledge of product features, market trends, and competitor solutions. Requirements: Bachelor's degree in Engineering, Computer Science, Information Systems, or a related field. 7+ years of experience in presales, solutions engineering, or a related technical role. Strong understanding of enterprise software/SaaS, integrations, APIs, and cloud platforms (e.g., AWS, Azure). Proven experience with Intelligent Document Processing (IDP) platforms and technologies. Experience working with cross-functional teams, including sales, product, and implementation. Ability to manage multiple opportunities and deadlines in a fast-paced environment.
Posted 4 weeks ago
10.0 - 14.0 years
1 - 10 Lacs
Bengaluru
Work from Office
Responsibilities: * Design enterprise architectures for AI deployments, data lakes & warehouses. * Lead legacy system modernization initiatives. * Ensure compliance with NIST, ISO & GDPR standards. * Align AI, cloud, and security with business goals.
Posted 4 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking an experienced AI Tester with 3-6 years of expertise in software testing and AI/ML systems validation. The ideal candidate will focus on ensuring the performance, accuracy, and ethical compliance of AI models while delivering high-quality results. Key responsibilities include developing and executing test plans for AI/ML models, validating data integrity and preprocessing pipelines, performing functional and performance testing, and testing AI integrations within end-to-end systems. The role involves building and maintaining automation frameworks using tools like TensorFlow Testing Library and PyTest, analyzing metrics such as accuracy and latency, and collaborating with cross-functional teams to enhance model quality and deployment processes. Candidates should also document test cases, scenarios, and results while reporting issues using tools like Jira. Proficiency in Python, AI/ML frameworks (e.g., TensorFlow, PyTorch), and cloud platforms is essential. Experience in MLOps, ethical AI testing, and tools like SHAP or LIME is preferred. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote
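The AI Tester role above involves asserting on metrics such as accuracy and latency in automated tests. As a hedged, self-contained sketch (the "model", the samples, and the thresholds are all invented stand-ins; a real suite would load a trained model and use PyTest's test discovery):

```python
import time

def stub_model(x: float) -> int:
    """Stand-in for a trained classifier: predicts 1 for non-negative inputs."""
    return 1 if x >= 0 else 0

def evaluate(model, samples):
    """Compute accuracy and worst-case single-prediction latency."""
    correct, worst_latency = 0, 0.0
    for x, label in samples:
        start = time.perf_counter()
        pred = model(x)
        worst_latency = max(worst_latency, time.perf_counter() - start)
        correct += int(pred == label)
    return correct / len(samples), worst_latency

# One label is deliberately wrong, so accuracy lands at exactly 0.8.
SAMPLES = [(0.5, 1), (-1.2, 0), (3.0, 1), (-0.1, 0), (2.2, 0)]
accuracy, latency = evaluate(stub_model, SAMPLES)

# PyTest-style threshold assertions, the shape a regression gate takes:
assert accuracy >= 0.8, f"accuracy regression: {accuracy:.2f}"
assert latency < 0.1, f"latency regression: {latency:.4f}s"
```

Encoding thresholds as assertions is what lets a CI pipeline fail a model deployment the same way it fails a unit test.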
Posted 4 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking an experienced AI Tester with 3-6 years of expertise in software testing and AI/ML systems validation. The ideal candidate will focus on ensuring the performance, accuracy, and ethical compliance of AI models while delivering high-quality results. Key responsibilities include developing and executing test plans for AI/ML models, validating data integrity and preprocessing pipelines, performing functional and performance testing, and testing AI integrations within end-to-end systems. The role involves building and maintaining automation frameworks using tools like TensorFlow Testing Library and PyTest, analyzing metrics such as accuracy and latency, and collaborating with cross-functional teams to enhance model quality and deployment processes. Candidates should also document test cases, scenarios, and results while reporting issues using tools like Jira. Proficiency in Python, AI/ML frameworks (e.g., TensorFlow, PyTorch), and cloud platforms is essential. Experience in MLOps, ethical AI testing, and tools like SHAP or LIME is preferred. Locations : Mumbai, Delhi / NCR, Bengaluru , Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 4 weeks ago
5.0 - 10.0 years
12 - 15 Lacs
Gurugram, Ahmedabad
Work from Office
We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.
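The role above covers real-time processing frameworks built on Kafka and Spark. The core micro-batch idea those tools implement can be sketched broker-free in stdlib Python (this is an illustration of the logic only, not Kafka or Spark APIs; the event shape is invented):

```python
from collections import defaultdict
from itertools import islice

def micro_batches(stream, batch_size):
    """Group an event stream into fixed-size micro-batches (Python 3.8+)."""
    it = iter(stream)
    while batch := list(islice(it, batch_size)):
        yield batch

def aggregate(batch):
    """Per-batch reduction: total amount per user, the kind of work a
    streaming job performs on each trigger."""
    totals = defaultdict(float)
    for event in batch:
        totals[event["user"]] += event["amount"]
    return dict(totals)

events = [{"user": "u1", "amount": 10.0}, {"user": "u2", "amount": 5.0},
          {"user": "u1", "amount": 2.5}, {"user": "u2", "amount": 1.0}]
results = [aggregate(b) for b in micro_batches(events, 2)]
```

In a real pipeline the `events` iterable would be a Kafka consumer and the per-batch results would be merged into running state; the batching-then-reducing structure is the same.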
Posted 4 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Noida
Work from Office
Description: We are seeking a seasoned Manager – Data Engineering with strong experience in Databricks or the Apache data stack to lead complex data platform implementations. You will be responsible for leading high-impact data engineering engagements for global clients, delivering scalable solutions, and driving digital transformation. Requirements: Required Skills & Experience: • 12–18 years of total experience in data engineering, including 3–5 years in a leadership/managerial role. • Hands-on experience in Databricks OR core Apache stack – Spark, Kafka, Hive, Airflow, NiFi, etc. • Expertise in one or more cloud platforms: AWS, Azure, or GCP – ideally with Databricks on cloud. • Strong programming skills in Python, Scala, and SQL. • Experience in building scalable data architectures, delta lakehouses, and distributed data processing. • Familiarity with modern data governance, cataloging, and data observability tools. • Proven experience managing delivery in an onshore-offshore or hybrid model. • Strong communication, stakeholder management, and team mentoring capabilities. Job Responsibilities: Key Responsibilities: • Lead the architecture, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. • Design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks. • Own delivery accountability for data engineering programs across BFSI, telecom, healthcare, or manufacturing clients. • Collaborate with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes. • Ensure best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance. • Manage and mentor a team of 10–25 engineers, conducting performance reviews, capability building, and coaching. 
• Support presales activities including solutioning, technical proposals, and client workshops. What We Offer: Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with your colleagues over a game of table tennis, plus discounts for popular stores and restaurants!
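The Databricks/Delta Lake work described above leans heavily on upsert-style writes (Delta's MERGE INTO). The matched-update/unmatched-insert logic at its core can be sketched in plain Python as an illustration of the semantics only, not the Spark or Delta APIs:

```python
def merge_upsert(target: dict, updates: list) -> dict:
    """MERGE-INTO-style upsert keyed by `id`: rows matching an existing key
    are updated; rows with a new key are inserted. Pure-Python sketch of
    the semantics; real pipelines do this with spark/delta APIs."""
    merged = dict(target)
    for row in updates:
        merged[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return merged

target = {1: {"name": "alice", "tier": "gold"},
          2: {"name": "bob", "tier": "silver"}}
updates = [{"id": 2, "name": "bob", "tier": "gold"},      # matched -> update
           {"id": 3, "name": "carol", "tier": "bronze"}]  # unmatched -> insert
state = merge_upsert(target, updates)
```

Because the merge is keyed, replaying the same update batch is idempotent, which is the property that makes upserts the backbone of reliable incremental pipelines.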
Posted 4 weeks ago
5.0 - 8.0 years
9 - 13 Lacs
Pune
Work from Office
Role Purpose: vROps / VMware. JD: Knowledge of virtualization platforms and associated technologies, including (but not limited to) the full VMware stack (ESXi, vCenter, vSAN, NSX, and the vRealize Suite). Should have good hands-on experience in designing, solutioning, and implementing vSphere, vCenter, vRA, vRO, VCS, and vROps. Should have good hands-on experience with Cloud Builder. Understanding of configuration management and software-defined data center (SDDC) infrastructure. Must have a good understanding of cloud platforms such as AWS and Azure, and the ability to design and deliver solutions around them. Knowledge of embedded solutions on AWS, Azure, etc. (for example, VMC on AWS, VMware Horizon on Azure). Ability to develop/build IT solutions to meet business requirements. Ability to implement and troubleshoot. Documentation, organization, and time management skills. Takes initiative to improve processes, procedures, skills, and technical knowledge. Excellent communication and presentation skills. Do: 1. Lead automation roadmaps and strategies for various BUs. Ensure complete understanding of the requirements needed to implement automation across accounts in BUs. Shortlist accounts depending on the size of the account and their ability to accommodate maturity. Drive automation maturity in the shortlisted accounts to the next level. Align customers by showcasing Wipro's capability to drive automation and ROI achievement for the customer. Ensure contractual commitments for a particular project are met by understanding the scope and requirements. Ensure the software and infrastructure required for automation are implemented. Ensure scalable standards of dashboards and process support systems for active automation monitoring. Develop and review the Account Automation Plans for each account in consultation with the account partners and delivery teams. Review the automation projects' progress and resolve complex escalations related to operations, production, quality control, schedules, and maintenance. Come up with solutions to persistent automation problems in the development unit. Manage the appropriate level of access control to protect export-controlled, proprietary, and sensitive project information. Periodically review project completion status against the project plan and ensure successful rollout. Review upcoming automation trends, technologies, and ways of working, and identify capability gaps within the team. Responsible for maintaining customer relationships and deriving maximum customer references to ensure business continuity. Receive feedback from customers and align resources internally to close all gaps. Deliver (No. / Performance Parameter / Measure): 1. Client Engagement: CSAT, customer references, customer solutions, mean time to resolve customer issues (MTTR reducing trend). 2. BOT implementation: work done by the bots for platform/non-platform accounts; number of platform accounts per DU. Mandatory Skills: VMware vRealize. Experience: 5-8 years.
Posted 4 weeks ago
7.0 - 12.0 years
12 - 22 Lacs
Mumbai
Work from Office
Job Name: (Digital Banking) Associate Data Analyst
Location: Mumbai
Grade: Senior Manager / AVP

Looking for a Business Analyst working in an RBI-regulated sector (bank, lending NBFC) or a consulting firm working on banking data, with experience in business credit risk.

Predominant Skills:
- Data quality and remediation processes (databases, SQL, and Python)
- Data visualization (dashboards, Tableau, Power BI)
- Informatica Data Quality
- Basic understanding of data lakes and cloud environments

Job Purpose:
HDFC Bank has a huge volume of data, both structured and unstructured, and we are focused on creating assets out of data and deriving the best value from it for the Bank. The Data Remediation and DaaS specialist will be responsible for improving customer data quality through various internal data remediation methodologies. This role will also focus on designing, implementing, and maintaining global and local data marts on the Bank's Data Lake to support business, marketing, analytics, regulatory, and other functional use cases. The role is crucial in ensuring high-quality customer data while enabling business functions with reliable, well-structured data marts. The ideal candidate will have a passion for data quality, strong technical skills, and a strategic mindset to drive data-driven decision-making across the Bank.

Role & Responsibilities:
Customer Data Quality Management
- Analyze and assess data quality issues in customer records
- Implement data cleansing, standardization, and deduplication strategies
- Monitor and improve the accuracy, completeness, and consistency of customer data
Formulate Data Remediation Strategies
- Conduct root cause analysis to identify sources of poor data quality
- Coordinate with internal stakeholders to drive data improvement initiatives
Data Mart Development & Maintenance
- Engage with business, product, credit, risk, analytics, marketing, finance, BIU, and other stakeholders to discover data mart requirements and the current challenges faced
- Provide inputs and recommendations for continuous improvement of policies, procedures, processes, standards, and controls pertaining to data marts
- Quantify the business-value impact (revenue/cost/loss) of launching global and local data marts

Experience Required:
- 5-7 years of total work experience in data quality / data product creation
- 5+ years of experience in banking and financial services
- Experience working in a large, multi-functional, matrix organization
- Strong technical and functional understanding of data remediation and data products, including staging, mapping, cleanse functions, match rules, validation, trust scores, remediation techniques, and mart-creation methodologies and best practices
- Experience with industry-leading master data / metadata / data quality suites, such as Informatica Data Quality
- Exposure to working in a cloud environment is an added advantage
Posted 4 weeks ago
11.0 - 15.0 years
15 - 25 Lacs
Pune
Hybrid
Sr Specialist Software Engineer

What's the role all about?
You will be a key contributor to developing a multi-region, multi-tenant SaaS product. You will collaborate with the core R&D team, using technologies like React, .NET/C#, and AWS to build scalable, high-performance products within a cloud-first, microservices-driven environment.

How will you make an impact?
- Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
- Ensure that architectural concepts are consistently implemented across the product.
- Act as a product expert within R&D, understanding the product's requirements and its market positioning.
- Work closely with cross-functional teams to ensure successful product delivery.

Key Responsibilities:
- Lead the design and implementation of software features in alignment with product specifications, adhering to High-Level Design (HLD) and Low-Level Design (LLD) standards.
- Lead the development of scalable, multi-tenant SaaS solutions.
- Collaborate with Product Management, R&D, UX, and DevOps teams to deliver seamless, end-to-end solutions.
- Advocate for and implement Continuous Integration and Delivery (CI/CD) practices to improve development efficiency and product quality.
- Mentor junior engineers, share knowledge, and promote best practices within the team.
- Help solve complex technical problems and enhance product functionality through innovative solutions.
- Conduct code reviews to ensure adherence to design principles and maintain high quality standards.
- Plan and execute unit testing to verify functionality and ensure automation coverage.
- Contribute to the ongoing support of software features, ensuring complete quality coverage and responsiveness to issues throughout the software lifecycle.

Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science, Electronics Engineering, or a related field from a reputed institute.
- More than 11 years of experience in software development with a strong focus on backend technologies and a track record of delivering complex projects.
- Expertise in React, JavaScript, and TypeScript for front-end development.
- Experience working with public cloud platforms like AWS (mandatory).
- Hands-on experience with Continuous Integration and Delivery (CI/CD) practices using tools like Docker, Kubernetes, and other modern pipelines.
- Experience in .NET is good to have but not mandatory.
- Experience in developing high-performance, highly available, and scalable systems.
- Working knowledge of RESTful APIs.
- Solid understanding of scalable and microservices architectures, performance optimization, and secure coding practices.
- Exceptional problem-solving skills and the ability to work on multiple concurrent projects.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Posted 4 weeks ago
15.0 - 20.0 years
25 - 40 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Hi Candidate,

Job Title: Solution Architect
Location: Airoli
Experience: 15+ years

Job details:
- Proven experience as a Solution Architect with hands-on expertise in the latest technologies and cloud platforms.
- Extensive experience in designing and deploying cloud-native applications and microservices architectures using Docker, Kubernetes, and serverless platforms.
- Hands-on experience with modern back-end technologies such as Node.js, Go, Rust, Spring Boot, Django, and frameworks like ASP.NET Core.
- Familiarity with API gateways, GraphQL, and event-driven architectures.
- Expertise in AWS, Azure, and Google Cloud, with experience deploying and managing applications in these environments.
- Proficiency in using serverless frameworks like AWS Lambda, Azure Functions, and Google Cloud Functions.
- Expertise in SQL (e.g., PostgreSQL, MySQL), NoSQL (e.g., MongoDB, Cassandra, DynamoDB), and Oracle databases.
- Experience with distributed databases, event sourcing, and CQRS patterns.
- Strong experience with Jenkins, Docker, Kubernetes, OpenShift, and Helm.
- Guide the adoption of front-end and back-end technologies, including:
  Front-End: React, Vue.js, Next.js, Svelte for building modern and responsive UIs.
  Back-End: Node.js, Go, Rust, Spring Boot, Django, ASP.NET Core, Flask.
  Databases: PostgreSQL, MongoDB, CockroachDB, Redis, Cassandra, and GraphQL for API management.
  Cloud Platforms: Expertise in AWS, Azure, and Google Cloud with serverless technologies like AWS Lambda, Azure Functions, and Google Cloud Functions.

Education: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field (Master's degree preferred).

Note: Interested candidates may apply at vishakha.gangurde@kiya.ai

Regards,
Vishakha Gangurde
Posted 1 month ago
2.0 - 5.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Roles & Responsibilities

1. Technical Leadership
- Oversee the design and development of scalable applications using .NET Core and React.
- Provide hands-on support in coding, architecture decisions, and full-stack integration.
- Ensure implementation of best practices, design patterns, and code quality standards.

2. Project Planning & Execution
- Break down business requirements into technical tasks and allocate work across the team.
- Lead sprint planning, track progress, and ensure timely delivery of features.
- Collaborate with product managers and QA for release planning and bug fixing.

3. Code Review & Quality Assurance
- Review code for performance, security, and maintainability.
- Guide the team in writing clean, reusable, and testable code.
- Establish coding standards and ensure adherence through regular reviews.

4. Team Management & Mentoring
- Mentor junior developers and help them improve their technical and soft skills.
- Foster a positive team environment that encourages innovation and accountability.
- Conduct regular 1-on-1s, performance feedback, and team knowledge-sharing sessions.

5. Communication & Collaboration
- Act as a bridge between developers, UI/UX, QA, and management teams.
- Provide regular updates to stakeholders on project status, risks, and progress.
- Facilitate smooth communication within the team and across departments.

6. Deployment & DevOps Involvement
- Assist in setting up CI/CD pipelines for deployment.
- Work with DevOps teams to manage cloud hosting (Azure/AWS) and infrastructure.
- Ensure application health, performance monitoring, and uptime management.
Posted 1 month ago
10.0 - 17.0 years
22 - 37 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Hi Candidate,

Job Title: Solution Architect
Location: Airoli
Experience: 10+ years

Job details:
- Proven experience as a Solution Architect with hands-on expertise in the latest technologies and cloud platforms.
- Extensive experience in designing and deploying cloud-native applications and microservices architectures using Docker, Kubernetes, and serverless platforms.
- Hands-on experience with modern back-end technologies such as Node.js, Go, Rust, Spring Boot, Django, and frameworks like ASP.NET Core.
- Familiarity with API gateways, GraphQL, and event-driven architectures.
- Expertise in AWS, Azure, and Google Cloud, with experience deploying and managing applications in these environments.
- Proficiency in using serverless frameworks like AWS Lambda, Azure Functions, and Google Cloud Functions.
- Expertise in SQL (e.g., PostgreSQL, MySQL), NoSQL (e.g., MongoDB, Cassandra, DynamoDB), and Oracle databases.
- Experience with distributed databases, event sourcing, and CQRS patterns.
- Strong experience with Jenkins, Docker, Kubernetes, OpenShift, and Helm.
- Guide the adoption of front-end and back-end technologies, including:
  Front-End: React, Vue.js, Next.js, Svelte for building modern and responsive UIs.
  Back-End: Node.js, Go, Rust, Spring Boot, Django, ASP.NET Core, Flask.
  Databases: PostgreSQL, MongoDB, CockroachDB, Redis, Cassandra, and GraphQL for API management.
  Cloud Platforms: Expertise in AWS, Azure, and Google Cloud with serverless technologies like AWS Lambda, Azure Functions, and Google Cloud Functions.

Education: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field (Master's degree preferred).

Note: Interested candidates may apply at vishakha.gangurde@kiya.ai

Regards,
Vishakha Gangurde
Posted 1 month ago
7.0 - 10.0 years
7 - 10 Lacs
Delhi, India
On-site
Key Responsibilities:

Design and Development:
- Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.).
- Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP).
- Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery, and Azure Synapse.
- Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others.
- Develop and optimize data processing jobs using Spark and Scala.

Data Integration and Management:
- Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems, into the data lake.
- Ensure data quality and integrity through rigorous testing and validation.
- Perform data extraction from SAP or ERP systems when necessary.

Performance Optimization:
- Monitor and optimize the performance of data pipelines and ETL processes.
- Implement best practices for data management, including data governance, security, and compliance.

Collaboration and Communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Collaborate with cross-functional teams to design and implement data solutions that meet business needs.

Documentation and Maintenance:
- Document technical solutions, processes, and workflows.
- Maintain and troubleshoot existing ETL pipelines and data integrations.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus.

Experience:
- 7+ years of experience as a Data Engineer or in a similar role.
- Proven experience with cloud platforms: AWS, Azure, and GCP.
- Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.
- Experience with other ETL tools like Informatica, SAP Data Intelligence, etc.
- Experience in building and managing data lakes and data warehouses.
- Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse.
- Experience with data extraction from SAP or ERP systems is a plus.
- Strong experience with Spark and Scala for data processing.

Skills:
- Strong programming skills in Python, Java, or Scala.
- Proficiency in SQL and query optimization techniques.
- Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts.
- Knowledge of data governance, security, and compliance best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Experience with other data tools and technologies such as Apache Spark or Hadoop.
- Certifications in cloud platforms (AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate).
- Experience with CI/CD pipelines and DevOps practices for data engineering.

The selected applicant will be subject to a background investigation, which will be conducted, and the results of which will be used, in compliance with applicable law.
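The extract-transform-load duties described above can be sketched in a minimal, dependency-free Python form. This is an illustrative pattern only, not a description of any specific tool named in the posting: the function names, field names, and sample records are all assumptions, and a real pipeline would run on Spark, Glue, Dataflow, or similar.

```python
# Minimal ETL sketch: extract raw records, standardize and validate them,
# and load the clean ones into an in-memory stand-in for a warehouse table.
# Records failing validation are quarantined for data-quality review.

def extract():
    # Stand-in for reading from an API, database, or landing file
    return [
        {"id": "1", "amount": " 100.5 ", "country": "in"},
        {"id": "2", "amount": "bad", "country": "US"},
    ]

def transform(records):
    clean, rejected = [], []
    for r in records:
        try:
            clean.append({
                "id": int(r["id"]),
                "amount": float(r["amount"].strip()),  # standardize numeric field
                "country": r["country"].upper(),       # normalize casing
            })
        except (ValueError, KeyError):
            rejected.append(r)                         # quarantine bad records
    return clean, rejected

def load(clean, warehouse):
    for r in clean:
        warehouse[r["id"]] = r                         # idempotent upsert by key
    return warehouse

raw = extract()
clean, rejected = transform(raw)
warehouse = load(clean, {})
```

The same three-stage shape (with the quarantine path feeding a data-quality dashboard) scales up directly to distributed engines; only the extract/load endpoints and the execution runtime change.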
Posted 1 month ago
3.0 - 8.0 years
3 - 8 Lacs
Bengaluru, Karnataka, India
On-site
Roles and Responsibilities

What will you do?
- Collaborate Across Teams: Work closely with cross-functional teams to strategically shape and define system requirements, ensuring alignment with overall business goals.
- Design and Drive Projects: Lead the design and execution of high-priority, high-visibility cloud platform projects, ensuring timely delivery and exceptional quality.
- Own Feature Development: Take full ownership of the feature development lifecycle, from refining requirements through to successful production deployment, ensuring seamless integration and functionality.
- Build a Secure and Scalable Backend: Develop and maintain a secure, scalable backend infrastructure that serves as the backbone for other development teams, supporting their needs and enhancing overall system performance.
- Ensure Efficiency and Performance: Focus on developing efficient solutions that guarantee backend reliability, scalability, and optimal performance, leveraging best practices and cutting-edge technologies.
- Conduct Code Reviews: Provide constructive feedback during code reviews, fostering a culture of continuous improvement and high-quality code standards.
- Troubleshoot Complex Issues: Quickly identify, diagnose, and resolve complex technical issues, minimizing downtime and ensuring smooth operations.

What skills and knowledge should you bring?
Our technology stack is diverse and cutting-edge, designed to support robust and scalable solutions. Here's a snapshot of the key technologies we use:
- Front-End: TypeScript, React, Nx, REST & GraphQL APIs (for internal and external UI applications).
- Primary Backend: Java with the Spring framework family (Boot, Web, WebFlux, Cloud) for developing resilient backend services.
- Build & Database Interaction Tools: Gradle, jOOQ.
- Secondary Backend Languages: Python, Golang (for specific components and scripting needs).
- Data Streaming: Kafka for real-time data streaming.
- Databases: PostgreSQL, ClickHouse, Redis for robust and efficient data storage and retrieval.
- Containerization & Orchestration: Docker, Helm, Kubernetes for scalable and manageable deployments.
- Infrastructure as Code: Terraform for consistent and repeatable deployments.
- Cloud Platforms: AWS, GCP for scalable and reliable infrastructure.
- CI/CD: GitHub Actions, ArgoCD for continuous integration and deployment, facilitating rapid and reliable delivery of features.
- Monitoring & Observability: Prometheus, Grafana for ensuring system health and performance.
Posted 1 month ago
3.0 - 8.0 years
3 - 8 Lacs
Hyderabad, Telangana, India
On-site
Roles and Responsibilities

What will you do?
- Collaborate Across Teams: Work closely with cross-functional teams to strategically shape and define system requirements, ensuring alignment with overall business goals.
- Design and Drive Projects: Lead the design and execution of high-priority, high-visibility cloud platform projects, ensuring timely delivery and exceptional quality.
- Own Feature Development: Take full ownership of the feature development lifecycle, from refining requirements through to successful production deployment, ensuring seamless integration and functionality.
- Build a Secure and Scalable Backend: Develop and maintain a secure, scalable backend infrastructure that serves as the backbone for other development teams, supporting their needs and enhancing overall system performance.
- Ensure Efficiency and Performance: Focus on developing efficient solutions that guarantee backend reliability, scalability, and optimal performance, leveraging best practices and cutting-edge technologies.
- Conduct Code Reviews: Provide constructive feedback during code reviews, fostering a culture of continuous improvement and high-quality code standards.
- Troubleshoot Complex Issues: Quickly identify, diagnose, and resolve complex technical issues, minimizing downtime and ensuring smooth operations.

What skills and knowledge should you bring?
Our technology stack is diverse and cutting-edge, designed to support robust and scalable solutions. Here's a snapshot of the key technologies we use:
- Front-End: TypeScript, React, Nx, REST & GraphQL APIs (for internal and external UI applications).
- Primary Backend: Java with the Spring framework family (Boot, Web, WebFlux, Cloud) for developing resilient backend services.
- Build & Database Interaction Tools: Gradle, jOOQ.
- Secondary Backend Languages: Python, Golang (for specific components and scripting needs).
- Data Streaming: Kafka for real-time data streaming.
- Databases: PostgreSQL, ClickHouse, Redis for robust and efficient data storage and retrieval.
- Containerization & Orchestration: Docker, Helm, Kubernetes for scalable and manageable deployments.
- Infrastructure as Code: Terraform for consistent and repeatable deployments.
- Cloud Platforms: AWS, GCP for scalable and reliable infrastructure.
- CI/CD: GitHub Actions, ArgoCD for continuous integration and deployment, facilitating rapid and reliable delivery of features.
- Monitoring & Observability: Prometheus, Grafana for ensuring system health and performance.
Posted 1 month ago
3.0 - 8.0 years
3 - 8 Lacs
Delhi, India
On-site
Roles and Responsibilities

What will you do?
- Collaborate Across Teams: Work closely with cross-functional teams to strategically shape and define system requirements, ensuring alignment with overall business goals.
- Design and Drive Projects: Lead the design and execution of high-priority, high-visibility cloud platform projects, ensuring timely delivery and exceptional quality.
- Own Feature Development: Take full ownership of the feature development lifecycle, from refining requirements through to successful production deployment, ensuring seamless integration and functionality.
- Build a Secure and Scalable Backend: Develop and maintain a secure, scalable backend infrastructure that serves as the backbone for other development teams, supporting their needs and enhancing overall system performance.
- Ensure Efficiency and Performance: Focus on developing efficient solutions that guarantee backend reliability, scalability, and optimal performance, leveraging best practices and cutting-edge technologies.
- Conduct Code Reviews: Provide constructive feedback during code reviews, fostering a culture of continuous improvement and high-quality code standards.
- Troubleshoot Complex Issues: Quickly identify, diagnose, and resolve complex technical issues, minimizing downtime and ensuring smooth operations.

What skills and knowledge should you bring?
Our technology stack is diverse and cutting-edge, designed to support robust and scalable solutions. Here's a snapshot of the key technologies we use:
- Front-End: TypeScript, React, Nx, REST & GraphQL APIs (for internal and external UI applications).
- Primary Backend: Java with the Spring framework family (Boot, Web, WebFlux, Cloud) for developing resilient backend services.
- Build & Database Interaction Tools: Gradle, jOOQ.
- Secondary Backend Languages: Python, Golang (for specific components and scripting needs).
- Data Streaming: Kafka for real-time data streaming.
- Databases: PostgreSQL, ClickHouse, Redis for robust and efficient data storage and retrieval.
- Containerization & Orchestration: Docker, Helm, Kubernetes for scalable and manageable deployments.
- Infrastructure as Code: Terraform for consistent and repeatable deployments.
- Cloud Platforms: AWS, GCP for scalable and reliable infrastructure.
- CI/CD: GitHub Actions, ArgoCD for continuous integration and deployment, facilitating rapid and reliable delivery of features.
- Monitoring & Observability: Prometheus, Grafana for ensuring system health and performance.
Posted 1 month ago
8.0 - 13.0 years
8 - 18 Lacs
Chennai
Hybrid
- Strong technical candidate with in-depth knowledge of the MEAN / LAMP stack and at least one JavaScript framework (Angular 2+)
- Strong knowledge of, and working experience with, ORMs, REST/GraphQL APIs, message queues, caching, and performance optimization is required
- Strong technical expertise in using cloud services on AWS / GCP
- Excellent problem-solving, critical-thinking, and communication skills
- Experience handling a team
- Highly organized and self-motivated
- Strong knowledge of RDBMS and OOP is required
- Good knowledge of standards and design patterns is required
- Any certifications or technical writing on the related technologies is a value add

Interested candidates can share an updated resume with katharbee.kalimulla@aspiresys.com

Thanks & Regards,
Katharbee Kalimulla | HR - Talent Acquisition
Mobile: +91-7708198176
Website: www.aspiresys.com | Blog: http://blog.aspiresys.com
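The caching requirement above refers to a common pattern regardless of stack. As a language-agnostic illustration (shown in Python; the class name, key format, and TTL value are assumptions, not anything specified by the posting), a minimal time-to-live cache with lazy eviction looks like this:

```python
# Minimal TTL cache sketch: entries expire after a fixed time-to-live and
# are lazily evicted on the next read. Production systems would typically
# use Redis or Memcached instead of an in-process dict.
import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() >= expires:
            del self._store[key]  # lazy eviction of the stale entry
            return None
        return value

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

cache = TTLCache(ttl_seconds=0.05)
cache.put("user:42", {"name": "A"})
hit = cache.get("user:42")        # fresh entry -> value returned
time.sleep(0.06)
miss = cache.get("user:42")       # past TTL -> evicted, None returned
```

The same get/put-with-expiry contract is what Redis exposes via SET with EX and GET, which is why the in-process sketch transfers directly to a distributed cache.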
Posted 1 month ago