
6417 Cloud Jobs - Page 23

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Position Title: JAVA FSD - 68470 - 39302 - GR - Sr Software Engineer (IND). Job Family: IFT > Engineering/Dev. Job Description: Design, build, and maintain efficient, reusable, and reliable Java code. Develop and implement highly responsive user-interface components using Java technologies. Design and develop server-side logic using Java frameworks. Ensure the best possible performance, quality, and responsiveness of applications. Identify bottlenecks and bugs, and devise solutions to these problems. Collaborate with other team members and stakeholders to develop high-quality software. Maintain code integrity and organization. Implement security and data protection measures. Stay up-to-date with industry developments and new technologies. Design, develop, and maintain RESTful APIs and web services. Ensure seamless integration of front-end and back-end functionalities through APIs. Document API specifications and provide support for API consumers. Monitor and optimize API performance and scalability. Experience with cloud platforms such as AWS and Snowflake. Experience in DevOps practices and tools. Familiarity with Agile development methodologies. Job Type: Full time

Posted 1 week ago

Apply

4.0 - 10.0 years

6 - 12 Lacs

Bengaluru

Work from Office

MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere, on premises or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications. We are looking to speak to candidates who are based in Bengaluru for our hybrid working model. To drive the personal growth and business impact of our employees, we're committed to developing a supportive and enriching culture for everyone. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it's like to work at MongoDB, and help us make an impact on the world! MongoDB is an equal opportunities employer. Req ID - 425449

Posted 1 week ago

Apply

4.0 - 10.0 years

6 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are looking for a strong Oracle PeopleSoft technical data migration consultant (with ERP Cloud experience) who thrives on solving complex business problems in the reporting and data migration track. The ideal candidate should: Be able to operate independently to provide quality work products; perform varied and complex duties and tasks that need independent judgment. Have excellent communication skills, both written and verbal. Have good interpersonal skills with the ability to build rapport with all stakeholders. Have the ability to present ideas and solutions in a clear and concise manner. Be self-motivated with a lot of energy and drive. Have the ability and willingness to learn. The ideal candidate should hold a Bachelor of Engineering/Bachelor of Technology or Master of Computer Applications degree with experience ranging from 4 to 10 years and should: Have hands-on experience in the data model of Oracle ERP Cloud and PeopleSoft (PSFT) applications (Financials, Distribution, Manufacturing). Have experience (in-depth understanding of the data model, business process functionality, and related data flow) in Oracle ERP Cloud applications (Finance or Supply Chain). Have experience in SaaS technical components, namely FBDI etc. Have experience in writing efficient and optimized code and an understanding of performance tuning techniques. Have experience in data migration from PeopleSoft to Oracle Cloud. Career Level - IC2. Your Responsibilities: As an integral part of the Oracle ERP Cloud Implementation team, you will be responsible for the following: Working with remote and geographically distributed teams to enable building the right products, using the right building blocks and making them consumable by other products easily. Be very technically hands-on and own/drive key end-to-end products/services. Ensure customer success, including delivering fixes/patches as needed. Help build a high-performance organization, including referring and interviewing top talent for Oracle. Design and development of reports and data migration for the customer implementation. Translate business processes and requirements into technical requirements and designs. Participate proactively in organization initiatives.

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Kochi, Chennai, Thiruvananthapuram

Work from Office

" Aws,Python,Cloud ","description":" Job Title: Cloud AI Tech Lead \u2013 AWS\/Python\/NLP Experience: 5 to 7 years Location: Chennai \/ Thiruvananthapuram (TVM) \/ Kochi Job Type: Full-time Domain: Cloud AI, NLP Mandatory Skills: Cloud Platforms: Strong hands-on experience with AWS (EC2, Lambda, S3, SageMaker, etc.) Programming: Expert in Python AI\/ML\/NLP: Proven experience with Natural Language Processing (NLP) tools and frameworks (e.g., spaCy, NLTK, Hugging Face Transformers) Experience deploying AI\/ML models in production environments Architecture & Leadership: Ability to design end-to-end cloud-based AI systems Prior experience leading small to mid-sized AI\/ML teams Strong communication and stakeholder management skills Nice to Have: Exposure to MLOps tools (e.g., MLflow, Kubeflow) Experience with Docker\/Kubernetes Familiarity with data pipelines and ETL frameworks Knowledge of GCP or Azure is a plus Responsibilities: Lead the development and deployment of AI\/ML\/NLP solutions on AWS Collaborate with data scientists, engineers, and product teams Ensure scalable, reliable, and secure cloud infrastructure for AI projects Mentor junior developers and ensure code quality and best practices ","

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Noida

Work from Office

This is a role for a print industry ecosystem solution and product specialist in Adobe's print business development team, with expertise in Enterprise and Production Print workflows and deployments (both on premise and in the cloud). The role involves: Identifying customers' needs and business objectives through a detailed discovery of their current business processes and print workflow services. Actively driving and leading the technology evaluation part of the sales lifecycle, working in conjunction with sales as the key technical advocate for our solutions by pointing out the value of Adobe's Print Services SDK and application. Collaborating closely with internal collaborators across a matrixed internal organization: Product Management, Engineering, Sales, Marketing, Customer Success. Articulating technology and product positioning to both business and technical users, creatively handling objections using documentation, presentations, and demonstrations. Establishing and maintaining strong customer relationships throughout the sales cycle. Providing a strong defence against competing products and services, and handling objections and questions fielded by the competition in a sales opportunity. Responding to functional and technical elements of RFIs/RFPs. The role is global in scope, hence a deep understanding of business ethics in different geographies is a must. What you need to succeed: Presales or consulting experience for an enterprise SaaS platform-based organization with well-defined print workflows is a must. A minimum of X+ years of experience in a similar role. Hands-on experience with several of the following: Enterprise Cloud Print Service deployment, enterprise digital and print workflows or intelligent document processing, pre-press workflows, etc. Expert in building and presenting customer points of view (POV) and value proposition decks. Ability to identify the critical business issues customers face and provide solutions. Strong software demonstration and value-selling experience. Excellent customer-facing skills with experience addressing and achieving agreement from senior executives. Must be a self-starter and comfortable working in a fast-paced, innovative environment with high levels of collaboration and teamwork. Bachelor's degree or equivalent experience in computer science or a related technical field; master's degree or equivalent experience in business studies.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Embedded C - Xpetize Technology Solutions PVT LTD. Experience Level: 7 to 10 Yrs. Experience: 5+ Years. Location: Bangalore. Joining: Immediate. Mode of Work: Work from office. Key Responsibilities: Embedded C, Linux, Python; domain: IoT only. Xpetize is a technology solutions company, supporting customers in IoT, application and engineering services, data services, cybersecurity, cloud and social services. We are headquartered in Trivandrum with offices in Bengaluru, Pune, the USA and Japan. We have worked tirelessly to help our customers across the globe since 2011 and are relentlessly trying to grow our expertise across geographies. Our flagship Industry 4.0 product, XPETICS, is a fully managed IIoT platform that lets customers securely connect and process IoT data at scale. We have a flexible and open work culture, lots of fun, flexi work hours, upskilling programs, medical insurance for family and parents, and a focus on work-life balance.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

eProductivity Software (ePS) is a leading global provider of industry-specific business and production software technology for the packaging and print industries. eProductivity's integrated and automated software offerings and point solutions are designed to enable revenue growth and drive operating and production efficiencies. With several offices worldwide, including in Bangalore, and over thirty years dedicated to delivering best-in-class technology to the packaging and printing industries, it is our deeply held philosophy that eProductivity Software succeeds when our customers thrive. For more information, visit us at . Position Overview: We are seeking a skilled Cloud Engineer to design and implement scalable, high-performance, and cost-optimized infrastructure solutions to support the global delivery of our packaging software solutions. This role is pivotal in advancing our cloud-first strategy utilizing AWS and on-premises infrastructure, with a strong emphasis on automation, container orchestration, and infrastructure-as-code (IaC) using Terraform and related technologies. Your mission is to engineer and orchestrate the cloud foundation that powers our global customer experiences. Position Description: Design and architect secure, scalable, and cost-efficient multi-cloud infrastructure solutions across AWS and on-premises colocation environments. Build and maintain infrastructure-as-code (IaC) using Terraform, ensuring versioned, modular, and reusable code structures. Implement automation strategies to support rapid provisioning, configuration, and scaling of infrastructure resources. Utilize container orchestration platforms (e.g., Kubernetes, ECS, AKS) to support highly available and performant deployment environments. Define and enforce infrastructure governance, security, and compliance best practices across environments. Collaborate with software architects and cloud operations to align infrastructure with product delivery requirements and SLAs. Monitor infrastructure performance and cost metrics; make recommendations and implement optimizations. Stay abreast of evolving cloud technologies and recommend innovations to improve agility and efficiency. Position Requirements: 5+ years of hands-on experience designing and implementing cloud infrastructure in multi-cloud environments (AWS, Azure, and hybrid/on-premises setups). Strong expertise with Terraform and infrastructure-as-code principles. Proficiency with container orchestration tools (e.g., Kubernetes, Helm, ECS, AKS). Deep understanding of cloud networking, compute, storage, and security architectures. Experience with automation tools and frameworks (e.g., Ansible, Python, or Bash scripting). Strong troubleshooting skills in complex cloud environments. Ability to evaluate performance and cost trade-offs in cloud resource design. Preferred Skills: Certifications in AWS and/or Azure cloud architecture (e.g., AWS Solutions Architect, Azure Solutions Architect Expert). Familiarity with hybrid cloud integration patterns and network design. Experience working in regulated or compliance-sensitive environments (e.g., SOC2, HIPAA, ISO 27001). Exposure to GitOps, policy-as-code, or FinOps practices is a plus. At ePS, we are a global team that solves unique business challenges for our customers worldwide. We believe in and are committed to fostering an inclusive workplace where our rich diversity fuels continuous innovation and success, valuing everyone's expertise and unique perspective.
Our commitment to our customers and to an inclusive culture will be evidenced through our actions, outcomes, and the quality of our products and services. ePS - Empowering Packaging and Print

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

":" Job Title: Principal Engineer/ Architect Location: Pune, India Experience: 5+ Years Employment Type: Full-Time Function: Engineering & Architecture Role Overview We are looking for a PrincipalEngineer / Architect who combines deep technical expertise with strategicthinking to design and implement scalable, secure, and modern digital systems.This is a senior technical leadership position that requires hands-onarchitecture experience, a solid command of cloud-native development, and aproven track record of leading teams through complex solution delivery. You will collaborate withcross-functional teams\u2014including engineering, product, DevOps, and businessstakeholders\u2014to define technical roadmaps, ensure alignment with enterprisearchitecture principles, and guide platform evolution. Requirements Key Responsibilities Architecture & Design Lead the design of modular, microservices-based, and secure architecture for scalable digital platforms. Define and enforce cloud-native architectural best practices using Azure, AWS, or GCP. Prepare high-level design artefacts, interface contracts, data flow diagrams, and service blueprints. Cloud Engineering & DevOps Drive infrastructure design and automation using Terraform or CloudFormation. Support Kubernetes-based container orchestration and efficient CI/CD pipelines. Optimize for performance, availability, cost, and security using modern observability stacks and metrics. Data & API Strategy Architect systems that handle structured and unstructured data with performance and reliability. Design APIs with reusability, governance, and lifecycle management in mind. Guide caching, query optimization, and stream/batch data pipelines across the stack. Technical Leadership Act as a hands-on mentor to engineering teams, leading by example and resolving architectural blockers. Review technical designs, codebases, and DevOps pipelines to uphold engineering excellence. Translate strategic business goals into scalable technology solutions with pragmatic trade-offs. Key Requirements Must Have: 5+ years in software architecture or principal engineering roles with real-world system ownership. Strong experience in cloud-native architecture with AWS, Azure, or GCP (certification preferred). Programming experience with Java , Python , or Node.js , and frameworks like Flask , FastAPI , Celery . Proficiency with PostgreSQL , MongoDB , Redis , and scalable data design patterns. Expertise in Kubernetes , containerization, and GitOps-style CI/CD workflows. Strong foundation in Infrastructure as Code (Terraform, CloudFormation). Excellent verbal and written communication; proven ability to work across technical and business stakeholders. Nice to Have: Experience in MLOps pipelines , observability stacks (ELK, Prometheus/Grafana), and tools like MLflow , Langfuse . Familiarity with Generative AI frameworks (LangChain, LlamaIndex), Vector Databases (Milvus, ChromaDB). Understanding of event-driven , serverless , and agentic AI architecture models. Python libraries such as pandas , NumPy , PySpark and support for multi-component pipelines (MCP). Preferred: Prior experience leading technical teams in regulated domains (finance, healthcare, govtech). Cloud security, cost optimization, and compliance-oriented architectural mindset. What YouGain Work on mission-critical projects using the latest cloud, data, and AI technologies. Collaborate with a world-class, cross-disciplinary team. Opportunities to contribute to open architecture , reusable frameworks, and technical IP. 
Career advancement via leadership , innovation labs , and enterprise architecture pathways . Competitive compensation , flexibility , and a culture that values innovation and impact. ","
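Purely for illustration (not part of the posting), a minimal sketch of the kind of cloud-native service the role describes, using FastAPI, one of the frameworks it names; the endpoint names and payload model are assumptions.

```python
# Minimal illustrative FastAPI microservice, reflecting the frameworks named in the posting.
# Endpoint names and the payload model are assumptions, not a real system design.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-orders-service")

class Order(BaseModel):
    order_id: str
    amount: float

@app.get("/health")
def health() -> dict:
    """Liveness probe endpoint, typically wired into Kubernetes health checks."""
    return {"status": "ok"}

@app.post("/orders")
def create_order(order: Order) -> dict:
    """Accept an order payload; a real service would persist it and emit an event."""
    return {"accepted": True, "order_id": order.order_id}

# Run locally with: uvicorn app:app --reload  (assumes this file is named app.py)
```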

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Job Title: Test Data Management (TDM). Experience: 5+ Years. Location: Bengaluru, Chennai, Hyderabad, Pune, Vadodara. Notice Period: Immediate Joiners Only. Job Overview: We are looking for a skilled Test Data Management (TDM) professional with hands-on expertise in data de-identification, masking, and synthetic data generation. The ideal candidate will have solid experience using Delphix for data masking and virtualization and should be comfortable aligning with consumer roadmaps for faster test data provisioning. Exposure to scripting languages like Python or .NET and experience in cloud-based environments and CI/CD integration is a plus. Key Responsibilities: Design, implement, and manage Test Data Management solutions to support software testing across multiple environments. Leverage Delphix for data de-identification, data masking, and test data provisioning. Build and maintain synthetic test data generation capabilities for varied testing scenarios. Collaborate with QA, DevOps, and development teams to ensure efficient test data availability aligned with sprint and release cycles. Support TDM strategies that adhere to data privacy and compliance standards. Understand and integrate TDM needs with CI/CD pipelines and cloud-hosted platforms, where applicable. Maintain documentation and best practices related to test data provisioning, masking, and management. Required Skills & Experience: Minimum 5 years of experience with Test Data Management tools and techniques. At least 3 years of hands-on experience with Delphix, especially in data masking and de-identification. Minimum 2 years of experience in synthetic data generation methodologies. Familiarity with scripting in Python or .NET is a plus. Good understanding of test data governance, compliance, and security principles. Exposure to CI/CD pipelines and cloud-hosted environments (optional but preferred). Strong communication and collaboration skills to align with cross-functional teams.
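As a rough illustration of the synthetic test data generation mentioned above (not part of the posting, and independent of Delphix), a minimal sketch using the Faker library; the record schema is invented.

```python
# Minimal sketch of synthetic test data generation, as referenced in the posting above.
# Requires the `faker` package; the record schema below is invented for illustration.
from faker import Faker

def generate_customers(n=10, seed=42):
    """Produce n fake but realistic-looking customer records for test environments."""
    fake = Faker()
    Faker.seed(seed)  # deterministic output so test runs are repeatable
    return [
        {
            "customer_id": fake.uuid4(),
            "name": fake.name(),
            "email": fake.email(),
            "signup_date": fake.date_this_decade().isoformat(),
        }
        for _ in range(n)
    ]

if __name__ == "__main__":
    for row in generate_customers(3):
        print(row)
```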

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Noida

Work from Office

Summary: Looking for experienced Databricks resident consultants with prior working experience in Databricks data engineering. The candidate will be responsible for setting up Databricks workspaces, creating Delta Lakes, writing declarative ingestion pipelines, and utilizing Databricks data quality frameworks (constraints) to deliver high-quality data pipelines. The role will be customer facing and will require unique problem solving, stakeholder management, and thought leadership qualities. Roles and Responsibilities: Proficient in data engineering, data platforms, and analytics with a strong track record of successful projects and in-depth knowledge of industry best practices. Should have worked on at least 2 data warehouse migration engagements in the past (preferably one on Databricks, though non-Databricks migrations also count). Should have prior experience of dimensional data modelling on tools like Erwin Data Modeler etc. 5+ years of hands-on experience in data engineering, with at least 2+ years in Databricks architecture and Apache Spark. Proficient in cluster management and workspace organisation. Proficient in the Delta Lake framework, Databricks Workspaces, and Unity Catalog. Proficient in writing ETL pipelines on Databricks using Delta Live Tables for streaming, batch, and incremental batch ingestion. Comfortable architecting and writing code in Python and SQL. Should have an excellent understanding of Apache Spark job performance tuning, partition optimization, etc. Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP) with expertise in at least one. Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals. Familiarity with CI/CD for production deployments. Working knowledge of MLOps. Design and deployment of performant end-to-end data architectures. Experience with technical project delivery, managing scope and timelines. Build skills in technical areas which support the deployment and integration of Databricks-based solutions to complete customer projects. Certifications: Databricks Certified Associate Developer for Apache Spark. Databricks Certified Professional Data Engineer. Cloud certifications (AWS Certified Solutions Architect, Azure Solutions Architect Expert).
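For illustration only (not from the posting), a minimal Delta Live Tables pipeline in Python with a data quality expectation, of the kind the role describes; the source path, table names, and columns are assumptions.

```python
# Minimal illustrative Delta Live Tables (DLT) pipeline with a data quality expectation.
# Only runs inside a Databricks DLT pipeline, where `spark` and the `dlt` module are provided
# by the runtime; the source path and column names are assumptions.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested incrementally from cloud storage")
def orders_raw():
    # Auto Loader (cloudFiles) incrementally picks up newly arrived JSON files
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")
    )

@dlt.table(comment="Orders that pass basic quality checks")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative constraint: drop failing rows
def orders_clean():
    return dlt.read_stream("orders_raw").withColumn("ingested_at", F.current_timestamp())
```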

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

About the Opportunity In the dynamic financial technology and cloud-based ERP consulting sector, our client is a leading player dedicated to digital transformation for financial operations. Specializing in Oracle Cloud solutions, this organization partners with global enterprises to streamline revenue management and receivables processes. This on-site role in India is designed for professionals ready to drive innovation in cloud financial solutions. Role: Oracle Cloud Revenue Management Receivables Specialist Role & Responsibilities Implement and configure Oracle Cloud Revenue Management solutions with a focus on Receivables (AR) modules. Collaborate with clients and cross-functional teams to design and deliver tailored AR strategies and solutions. Customize setups for revenue recognition, invoicing, and receivables processes within the Oracle Cloud environment. Troubleshoot and resolve system issues to ensure seamless AR operations and data integrity. Provide training and post-implementation support to foster efficient user adoption and system optimization. Continuously assess and refine processes to enhance the overall financial workflow and system performance. Skills & Qualifications Must-Have: Proven experience with Oracle Cloud Revenue Management, particularly in the Receivables (AR) domain. Strong skills in system configuration, implementation, and problem resolution. Solid understanding of financial processes, including revenue recognition and receivables management. Excellent client engagement abilities with a track record of delivering on-site solutions. Preferred: Relevant Oracle certifications or equivalent cloud technology credentials. Experience in ERP consulting or related financial technology sectors. Strong interpersonal and communication skills to effectively collaborate in dynamic teams. Benefits & Culture Highlights Competitive compensation package and comprehensive benefits. A collaborative, innovative work culture that values continuous learning and professional growth. An inclusive, technology-forward environment that supports career advancement and skills development.

Posted 1 week ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Kolkata

Work from Office

":" We are looking for a Salesforce Developer with 6-8 years of experience, including at least 5 years hands-on with Salesforce across both declarative and programmatic areas. The ideal candidate should be proficient in Sales and Service Cloud, Experience Cloud, and ideally Salesforce Industries (Vlocity) including Omniscripts, EPC, and Integration Procedures. Experience working with Git or similar version control systems and in Agile/SCRUM environments is essential. The role is based in Kolkata and offers a stimulating opportunity to work across multiple business verticals within a high-performing team delivering mission-critical solutions. Key Responsibilities Participate in refining and scoping upcoming sprint work. Assist solution architects with technical design and breaking down complex tasks. Accountable for timely delivery of assigned tickets, meeting acceptance criteria. Conduct spikes/investigations into innovative technologies for future project viability. Ensure teammates work meets code quality standards through reviews. Coordinate with other teams for integrations, ensuring alignment of tasks and APIs. Work with the QA team to investigate and resolve issues. Mentor junior team members, providing assistance with tasks and problem-solving. Requirements Requirements Minimum 5 years of Salesforce experience across declarative and programmatic areas Experience with Sales and Service Cloud Experience with Experience Cloud Knowledge of Salesforce Industries (Vlocity) including Omniscripts, EPC, and Integration Procedures is an advantage Proficiency with Git or similar version control systems Experience in Agile/SCRUM methodology Strong problem-solving and mentoring abilities Ability to manage tasks independently and work cross-functionally Benefits Competitive compensation Training and mentorship programme Exposure to cutting-edge Salesforce projects in a high-impact environment " , "Job_Opening_ID":"ZR_3137_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Salesforce Developer","State":"West Bengal" , "Currency":"EUR" , "Country":"India" , "Zip_Code":"700001" , "id":"40099000030203299" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-07-22"}]);

Posted 1 week ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

About the Opportunity Operating in the cutting-edge travel technology and expense management sector, our firm is a high-growth leader dedicated to streamlining corporate travel booking and expense processes. This role is based on-site in India and focuses on integrating and supporting enterprise-grade travel solutions using Spotnana and Concur. Join our dynamic team and be a part of revolutionizing travel and expense management for global clients. Role & Responsibilities Manage the end-to-end implementation and integration of Spotnana and Concur travel and expense platforms in alignment with customer requirements. Troubleshoot, diagnose, and resolve technical issues to ensure seamless performance of travel and expense systems. Collaborate closely with cross-functional teams to design, develop, and optimize integration workflows between travel booking and expense management modules. Maintain and customize system configurations to meet evolving enterprise demands and enhance user experience. Develop and document technical guides and best practices for ongoing system maintenance and future upgrades. Ensure compliance with industry security and operational standards while managing platform integrations. Skills & Qualifications Must-Have: Proven experience in integrating and supporting platforms like Spotnana and Concur or similar travel & expense management solutions. Must-Have: Strong technical background in API integrations, system troubleshooting, and performance optimization. Must-Have: Demonstrable expertise in managing on-site technical support, working collaboratively with cross-functional teams. Preferred: Familiarity with enterprise-level travel booking systems and financial compliance standards. Preferred: Experience with cloud-based environments and modern software development practices. Preferred: Excellent communication skills with a proven track record in delivering technical presentations and documentation. Benefits & Culture Highlights Work in a collaborative and innovative environment with opportunities for professional growth. Engage with cutting-edge technologies and impactful enterprise projects in the travel tech sphere. Enjoy a supportive, on-site work setting that encourages skill development and cross-team collaboration.

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Pune, Chennai

Work from Office

Position Overview: We are seeking a Senior Data Scientist Engineer with experience bringing highly scalable enterprise SaaS applications to market. This is a uniquely impactful opportunity to help drive our business forward and directly contribute to long-term growth at Virtana. If you thrive in a fast-paced environment, take initiative, embrace proactivity and collaboration, and you're seeking an environment for continuous learning and improvement, we'd love to hear from you! Virtana is a remote-first work environment, so you'll be able to work from the comfort of your home while collaborating with teammates on a variety of connectivity tools and technologies. Role Responsibilities: Research and test machine learning approaches for analyzing large-scale distributed computing applications. Develop production-ready implementations of proposed solutions across different AI and ML models and algorithms, including testing on live customer data to improve accuracy, efficacy, and robustness. Work closely with other functional teams to integrate implemented systems into the SaaS platform. Suggest innovative and creative concepts and ideas that would improve the overall platform. Job Location: Pune, Chennai or Remote. Qualifications: The ideal candidate must have the following qualifications: 6+ years of experience in practical implementation and deployment of large customer-facing ML based systems. MS or M Tech (preferred) in applied mathematics/statistics; CS or Engineering disciplines are acceptable but must come with strong quantitative and applied mathematical skills. In-depth working familiarity, beyond coursework, with classical and current ML techniques, both supervised and unsupervised learning techniques and algorithms. Implementation experience and deep knowledge of classification, time series analysis, pattern recognition, reinforcement learning, deep learning, dynamic programming, and optimization. Experience in working on modeling graph structures related to spatiotemporal systems. Programming skills in Python are a must. Experience in understanding and usage of LLM models and prompt engineering is preferred. Experience in developing and deploying on cloud (AWS, Google, or Azure). Good verbal and written communication skills. Familiarity with well-known ML frameworks such as Pandas, Keras, TensorFlow.
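Purely as an illustration of the classification work described above (not part of the posting, and using scikit-learn rather than the frameworks it names), a toy sketch on synthetic infrastructure metrics; the feature names and labelling rule are invented.

```python
# Toy illustrative classifier on synthetic infrastructure metrics; not from the posting.
# Feature names, thresholds, and the "degraded" label rule are invented for this example.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
latency = rng.normal(20, 5, 1000)   # synthetic latency metric (ms)
iops = rng.normal(500, 100, 1000)   # synthetic IOPS metric
X = np.column_stack([latency, iops])
y = ((latency > 28) & (iops < 450)).astype(int)  # toy "degraded" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```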

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Position: SAP ABAP - Build Code / Build App / AI. Location: Anywhere in India. Years of SAP Experience: 6+. Responsibilities: Design, build, and configure applications using SAP Build Apps, leveraging its low-code capabilities. Strong understanding of low-code development platforms like SAP Build Apps. Experience with SAP BTP (Business Technology Platform). Familiarity with cloud-native technologies. Experience with SAP ABAP development (may be required for certain roles). Knowledge of APIs and integrations. Enables AI-powered extensions of existing SAP applications.

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Noida, Nagpur, Hyderabad

Work from Office

Xpetize Technology Solutions Private Limited is looking for a Java Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Job Title: Senior Generative AI Engineer. Experience: 7-12 Years. Location: Bengaluru, Chennai, Hyderabad, Pune, Vadodara. Work Mode: Hybrid. Notice Period: Immediate Joiners Only. Job Overview: We are looking for a highly skilled Senior Generative AI Engineer to design, develop, and deploy enterprise-scale AI solutions using Microsoft Azure and cutting-edge Generative AI technologies. You will be at the forefront of building intelligent, cloud-native applications leveraging Large Language Models (LLMs), Retrieval-Augmented Generation (RAG) pipelines, OCR, and agent-based frameworks. Key Responsibilities: Architect and implement scalable Generative AI solutions using Azure AI services. Develop and integrate RAG pipelines, OCR models, and agent-based architectures into enterprise applications. Leverage Azure OpenAI, Azure Machine Learning Studio, Azure Cognitive Search, and Azure Document Intelligence for various AI-driven use cases. Utilize GenAI frameworks and tools like LangChain, LlamaIndex, and Semantic Kernel for solution design and rapid prototyping. Collaborate with cross-functional teams including data scientists, cloud architects, and business stakeholders to translate requirements into AI capabilities. Ensure performance optimization, model monitoring, and governance of deployed AI systems. Stay updated with the latest GenAI and Azure AI advancements and recommend strategic improvements. Required Skills & Qualifications: 7-12 years of hands-on experience in AI/ML development, with a strong focus on Generative AI. Proficiency with the Azure AI stack including Azure OpenAI, Azure ML Studio, Cognitive Search, and Document Intelligence. Practical experience with LLMs and RAG architectures in enterprise contexts. Expertise in using GenAI tools like LangChain, LlamaIndex, and Semantic Kernel. Strong programming skills in Python and experience with ML libraries. Excellent problem-solving abilities and communication skills.
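For illustration only (not part of the posting), a bare-bones sketch of the retrieve-then-generate flow behind a RAG pipeline; `embed` and `generate` are hypothetical stand-ins for an embedding model and an LLM call (e.g., via Azure OpenAI), not real SDK functions.

```python
# Minimal skeleton of a Retrieval-Augmented Generation (RAG) flow, for illustration only.
# `embed` and `generate` are hypothetical callables supplied by the caller; they stand in for
# an embedding model and an LLM and are not real SDK functions.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec, docs, doc_vecs, k=3):
    """Rank documents by similarity to the query embedding and keep the top k."""
    scored = sorted(zip(docs, doc_vecs), key=lambda dv: cosine(query_vec, dv[1]), reverse=True)
    return [doc for doc, _ in scored[:k]]

def answer(question, docs, doc_vecs, embed, generate):
    """Retrieve supporting context, then prompt the LLM with question plus context."""
    context = "\n".join(retrieve(embed(question), docs, doc_vecs))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
```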

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Overview: At Prolifics, we are currently implementing multiple solutions in software development, and we are looking to hire a talented Senior Oracle Consultant for our development centre in India. This position would be based out of Hyderabad and is a permanent position. If you are looking for a high-growth company with rock-solid stability, and if you thrive in the energetic atmosphere of high-profile projects, we want to talk to you today! Let's connect and explore the possibilities of having you onboard the Prolifics team! Job Title: Senior Oracle Consultant. Primary skills: Strong Oracle Cloud Infrastructure (OCI) and integration tools knowledge. Secondary skills: Experience with 3+ full-cycle Oracle implementations. Location: Hyderabad (Mindspace#12B). Educational Qualification: B.Tech/BE/M.Tech/MCA/M.Sc. Experience: 7+. Job Description: Key Responsibilities: Lead Oracle Cloud Success Navigator implementations and configuration. Configure dashboards, metrics, and success measurement frameworks. Execute end-to-end Oracle EBS and Fusion implementation projects. Conduct business process analysis and system integrations. Lead client workshops and provide strategic consulting guidance. Mentor team members and ensure quality deliverables. Required Qualifications: 7+ years Oracle EBS (R12) and Fusion Cloud Applications experience. Proven Oracle Cloud Success Navigator implementation experience. Strong Oracle Cloud Infrastructure (OCI) and integration tools knowledge. Experience with 3+ full-cycle Oracle implementations. Deep understanding of Finance, Supply Chain, and HR modules. Excellent client-facing and leadership skills. Oracle certifications preferred. Preferred Experience: Oracle Cloud Success Navigator certification. Big 4 or Oracle Partner consulting background. About us: Prolifics Corporation Limited is a global technology solutions provider with a presence across North America (USA and Canada), Europe (UK and Germany), the Middle East, and Asia. In India, we have offshore development centres: two in Hyderabad and one in Pune. For more than 40 years, Prolifics has transformed enterprises of all sizes, including over 100 Fortune 1000 companies, by solving their complex IT challenges. Our clients include Fortune 50 and Fortune 100 companies across a broad range of industries including Financial Services, Insurance, Government, Healthcare, Telecommunications, Manufacturing and Retail. We rank consistently in the Dream Companies to Work For and Dream Employer of the Year rankings from the World HRD Congress, ranked 7 in 2019. We encourage you to visit us at www.prolifics.com or follow us on Twitter, LinkedIn, Facebook, YouTube and other social media to know more about us.

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Pune

Work from Office

Overview: Senior Full Stack Developer. Years of Experience: 10+ years. Required skills: Node, JavaScript, TypeScript, Python, React, Angular, Git, GitHub, Jenkins, Terraform, CloudFormation, DynamoDB, AWS (EC2, SNS, Step Functions, DynamoDB, Redshift, Athena, Snowflake, S3). Note: the must-have skills are Node or Python plus the AWS technologies listed. We expect engineers to be able to read and understand code written in other languages (C#, TypeScript, Python) even if they aren't fluent or extremely capable in them.

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Pune

Work from Office

Job Title: SAP Techno-Functional Architect. Experience: 10+ Years. Location: Pune. Notice Period: Immediate Joiners Only. Position Overview: We are looking for an accomplished SAP Techno-Functional Architect to lead the architecture and integration design for our upcoming SAP S/4HANA implementation program at TBEL. This is a high-impact role responsible for driving end-to-end solution design, ensuring seamless integration, and delivering scalable, secure, and future-ready SAP solutions that align with our business objectives. Key Responsibilities: Lead the technical architecture, design, and implementation of SAP S/4HANA solutions aligned with business goals. Collaborate with business, IT, and senior stakeholders to understand requirements and validate solution designs. Own and review functional and technical specifications provided by the system integrator (SI). Design robust architecture focused on scalability, performance, security, and user experience (UX). Guide integration using technologies such as SAP CPI and SAP BTP, and coordinate data flow across applications. Oversee the delivery and quality of work from SI partners and external vendors. Define and maintain architectural standards, technical roadmaps, and support documentation throughout the TBEL SAP program. Provide strategic direction on platform adoption, innovations, and best-fit SAP solutions. Actively participate in Architecture Review Boards (DAB/ARB) and technical governance forums. Translate business needs into scalable technical solutions using your strong techno-functional expertise. Required Qualifications & Skills: Bachelor's or master's degree in computer science, information technology, or a related field. 10+ years of experience in SAP platform architecture. Proven track record of leading at least 4 global SAP S/4HANA implementations end-to-end. Deep expertise in SAP system landscape design (on-prem, cloud, hybrid). Strong knowledge of SAP integration technologies; CPI and BTP are mandatory. Experience working with SAP Analytics Cloud (SAC), SAP Datasphere, and SAP Public Cloud is an advantage. SAP certifications such as SAP Certified Application Architect are highly preferred. Ability to work hands-on with both business and technical teams, translating requirements into practical solutions. Excellent analytical, communication, and stakeholder management skills. Exposure to cross-functional global teams and diverse business environments is desirable. Note: While the preference is for a techno-functional architect, candidates with strong functional SA backgrounds will also be considered.

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

In this role, you will lead the design, development, and optimisation of large-scale, distributed data systems that power OCI's critical services. You'll work closely with cross-functional teams to build scalable data infrastructure, real-time analytics pipelines, and intuitive data visualisation tools. We're looking for a highly technical leader with deep expertise in Analytics, Business Intelligence, Data Visualisation, Java, and Microservices. Key Responsibilities: Architect and implement scalable, reliable data solutions using modern cloud-native technologies. Collaborate with engineering, product, and operations teams to define and deliver key initiatives. Lead the development of intuitive and interactive data visualisations that help users explore and understand complex datasets. Mentor and guide junior engineers, promoting best practices in data engineering and system design. Lead architecture reviews, design discussions, and implementation strategies. Preferred Qualifications: 10+ years of experience in software engineering, with at least 5 years in data/analytics-focused roles. Strong proficiency in Java, with solid experience in building and deploying microservices in production environments. Deep understanding of data modeling, ETL/ELT processes, and analytics best practices. Experience with BI tools such as Oracle Analytics Cloud, Tableau, Power BI. Experience building data platforms in a cloud environment (preferably OCI, AWS, Azure, or GCP). Proven ability to lead complex technical projects end-to-end. Strong understanding of RESTful APIs, containerization (Docker/Kubernetes), and DevOps practices. Excellent communication and collaboration skills, with a passion for mentoring and leading technical teams. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. In this role, you'll lead the design and development of scalable, cloud-native data systems and pipelines that power critical OCI services. We're looking for someone with deep experience in distributed systems and big data technologies and a strong background in building data platforms in the cloud. If you're passionate about building high-performance data infrastructure at scale, we'd love to hear from you.

Posted 1 week ago

Apply

4.0 - 7.0 years

18 - 30 Lacs

Noida, Pune

Hybrid

Minimum Qualifications:
• Bachelor's or master's degree in computer science, engineering, or a related technical field.
• 4-7 years of professional software development experience.
• Deep expertise in one or more programming languages and stacks such as .NET / .NET Core and Microservices, WebAPI, C#, and databases.
• Extensive experience with software architecture and design patterns, including the ability to design and implement scalable, reliable systems in a DevOps model.
• Proficiency with cloud technologies like Azure, AWS, GCP, and version control systems like GitHub.
• Strong problem-solving skills and attention to detail, with a commitment to delivering high-quality software solutions.
• Proficiency in building telemetry or observability as part of the development process.
• Strong leadership, communication, and interpersonal skills, with the ability to influence and drive technical decisions across the organization.

Posted 1 week ago

Apply

3.0 - 5.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Job Summary: As a Software Engineer, you will be responsible for coming up with a test design and strategy for qualifying a feature and for developing, automating, and executing tests using NetApp's automation framework. This requirement is for the System Test Engineering group, a centralized organization that tests and qualifies various features of the ONTAP product line. Job Requirements: The role is a "Software Development in Testing" (SDIT) role that demands strong automation skills and hands-on experience to test and qualify a system under test. Contribute to test methodologies, plans, and automation. A good understanding of the test automation framework and scripting languages is needed to carry out automated tests and to triage and report issues. Design, deploy, and maintain automated system tests to replicate real-world scenarios. Understand functional specifications and system architecture to properly design feature test plans. Review bug descriptions, functional requirements, and design documents, and incorporate them into test plans and cases. Possess strong communication skills and the ability to articulate well in order to work with an extended team consisting of developers and partner teams. Work with the support team to troubleshoot and reproduce customer-impacting issues. Be proactive, have a positive attitude, be a good team player, and be self-motivated and flexible. Be willing to work with partners across geographical locations and collaborate to deliver on commitments. Technical Skills: A strong automation skillset with hands-on experience in Python is a must. Basic knowledge of storage, storage protocols, the cloud domain, Kubernetes, and data protection techniques is preferable. Basic knowledge of networking protocols (TCP/IP, UDP/IP, VLAN, etc.) is preferable. Experience in QA/testing. Familiarity with automation/testing frameworks. Sound knowledge of test design and test strategies. Education: A Bachelor of Science degree in Engineering or Computer Science with 2 years of experience, or a Master's degree, or equivalent experience is required.
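Illustrative only (not from the posting): a minimal pytest-style automated test of the kind the SDIT role describes; `FakeVolumeService` and its API are invented stand-ins for a real system under test.

```python
# Minimal sketch of an automated system test in pytest, illustrating the SDIT role above.
# `FakeVolumeService` stands in for a real system under test; its API is invented for this example.
import pytest

class FakeVolumeService:
    def __init__(self):
        self._volumes = {}

    def create_volume(self, name, size_gb):
        if size_gb <= 0:
            raise ValueError("size_gb must be positive")
        self._volumes[name] = size_gb
        return name

    def get_size(self, name):
        return self._volumes[name]

@pytest.fixture
def service():
    return FakeVolumeService()

def test_create_volume_roundtrip(service):
    service.create_volume("vol1", 100)
    assert service.get_size("vol1") == 100

def test_rejects_invalid_size(service):
    with pytest.raises(ValueError):
        service.create_volume("vol2", 0)
```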

Posted 1 week ago

Apply

12.0 - 18.0 years

30 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Skills required to contribute: 11+ years total experience and 3+ years of architecture and design experience. Strong modernization consulting experience: assessment of legacy technologies, solution development for future-state technologies, estimation, and roadmap development. Multiple project implementation experience of application modernization to cloud projects. Strong pre-sales expertise in leading RFI/RFP responses, including solution architecture, estimation, implementation plan development, and working with internal stakeholders and external partners. Excellent communication (written and oral) and interpersonal skills. Excellent leadership and management skills. Microsoft .NET Core, C#, ASP.NET, MVC, WCF, Web/REST APIs, Angular, React. Azure APIM, Azure Functions, Azure Service Bus, Azure Logic Apps, Azure AD B2C, Blob Storage, Cosmos DB, Azure Data Factory, Azure Redis Cache; hands-on experience working with Azure SQL Database. Azure and its various services, Azure Kubernetes Service, Service Fabric, microservices. Service Fabric applications/services implementation. Hands-on experience with Docker, containerization, and Elasticsearch. Entity Framework, MS SQL Server, NoSQL. SOA and microservices design patterns, SOLID principles, API integration. Azure DevOps, Git, GitHub, SonarQube, Terraform, ARM, Agile methodology. Strong problem-solving, design, and solution skills. Excellent communication skills and an influencer.

Posted 1 week ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Job Summary: As a Software Engineer at NetApp India's R&D division, you will be responsible for the design, development, and validation of software for Big Data Engineering across both cloud and on-premises environments. You will be part of a highly skilled technical team named NetApp Active IQ. The Active IQ DataHub platform processes over 10 trillion data points per month that feed a multi-petabyte data lake. The platform is built using Kafka, a serverless platform running on Kubernetes, Spark, and various NoSQL databases. This platform enables the use of advanced AI and ML techniques to uncover opportunities to proactively protect and optimize NetApp storage, and then provides the insights and actions to make it happen. We call this "actionable intelligence".
Job Requirements
• Design and build our Big Data Platform, and understand scale, performance, and fault-tolerance.
• Interact with Active IQ engineering teams across geographies to leverage expertise and contribute to the tech community.
• Identify the right tools to deliver product features by performing research, POCs, and interacting with various open-source forums.
• Work on technologies related to NoSQL, SQL, and in-memory databases.
• Conduct code reviews to ensure code quality, consistency, and adherence to best practices.
Technical Skills
• Big Data hands-on development experience is required.
• Demonstrate up-to-date expertise in data engineering and complex data pipeline development.
• Design, develop, implement, and tune distributed data processing pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built.
• Awareness of Data Governance (data quality, metadata management, security, etc.).
• Experience with one or more of Python/Java/Scala.
• Knowledge of and experience with Kafka, Storm, Druid, Cassandra, or Presto is an added advantage.
Education
• A minimum of 5 years of experience is required; 5-8 years of experience is preferred.
• A Bachelor of Science degree in Electrical Engineering or Computer Science, or a Master's degree, or equivalent experience is required.
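For illustration only (not from the posting), a minimal PySpark Structured Streaming sketch of the Kafka-to-data-lake style pipeline the role describes; the broker address, topic name, and output paths are assumptions.

```python
# Minimal illustrative PySpark Structured Streaming job reading telemetry from Kafka.
# Requires the spark-sql-kafka package on the classpath; broker, topic, and paths are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker address
    .option("subscribe", "telemetry-events")               # assumed topic name
    .load()
)

# Kafka delivers bytes; cast the value to string and tag each record with arrival time.
events = raw.select(
    F.col("value").cast("string").alias("payload"),
    F.current_timestamp().alias("ingested_at"),
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/tmp/telemetry/")                 # assumed data lake landing path
    .option("checkpointLocation", "/tmp/telemetry_ckpt/")
    .start()
)
query.awaitTermination()
```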

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies