Jobs
Interviews

1170 Cloud Platform Jobs - Page 8

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 - 4.0 years

0 - 0 Lacs

bangalore, kochi, chennai

On-site

Role Overview
A fresher DevOps Engineer supports the automation and optimisation of software development and deployment, assists with system monitoring, and helps troubleshoot issues, with a strong emphasis on continuous learning.

Key Responsibilities
Learn and assist in setting up and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI). Support operating and monitoring cloud infrastructure (AWS, Azure, GCP). Help automate deployment, testing, and release processes. Collaborate with senior developers, IT, and QA teams to streamline software delivery. Document process steps and contribute to the operational knowledge base. Apply basic troubleshooting to resolve minor technical and configuration issues. Assist with infrastructure as code, learning tools such as Terraform or CloudFormation. Take part in incident management and basic system maintenance when needed. Follow security best practices in all assigned tasks.

Required Skills
Basic understanding of Linux operating systems. Familiarity with scripting languages (e.g., Bash, Python). Foundation in version control (Git) and basic software development concepts. Eagerness to learn containerisation (Docker), cloud basics, and automation tools. Good communication, problem-solving, and adaptability skills.
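For the scripting requirement above, a minimal sketch of the kind of log-triage helper a junior DevOps engineer might write; the log lines and function name are hypothetical:

```python
import re
from collections import Counter

def triage_log(lines):
    """Count occurrences of each log level so the noisiest failures surface first."""
    level_re = re.compile(r"\b(DEBUG|INFO|WARN|ERROR|FATAL)\b")
    counts = Counter()
    for line in lines:
        m = level_re.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Toy sample of a deployment log.
sample = [
    "2024-05-01 10:00:01 INFO  deploy started",
    "2024-05-01 10:00:02 ERROR connection refused by db:5432",
    "2024-05-01 10:00:03 ERROR connection refused by db:5432",
    "2024-05-01 10:00:04 WARN  retrying in 5s",
]
print(triage_log(sample).most_common(1))  # → [('ERROR', 2)]
```

In practice the same loop would read from a file or a container's stdout rather than an in-memory list.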

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

hyderabad

Hybrid

Bazel-based build systems for large-scale, polyglot codebases. Initiatives to migrate existing build systems to Bazel and to optimize build times through incremental builds and artifact caching. Experience managing and optimizing a distributed Bazel build system.
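The artifact-caching levers this listing alludes to are typically configured in a workspace's `.bazelrc`; a hypothetical fragment (the cache endpoint is illustrative, not a real host):

```
# .bazelrc -- hypothetical cache configuration
build --disk_cache=~/.cache/bazel-disk                  # reuse artifacts across local builds
build --remote_cache=grpcs://bazel-cache.internal:443   # shared team-wide cache (illustrative host)
build --remote_timeout=60                               # fail fast if the cache is unreachable
```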

Posted 1 week ago

Apply

5.0 - 9.0 years

8 - 16 Lacs

hyderabad

Hybrid

Location: Hyderabad. Kindly send your resume to harisha.r@kryonknowledgeworks.com / +91 93619 12010.

Job Description: Strong knowledge of Core Java, the Spring Framework (Spring Boot, MVC), and JPA. Experience with RESTful APIs, microservices, and SQL/NoSQL databases. Familiarity with version control systems (e.g., Git). Understanding of Agile/Scrum methodologies. Excellent problem-solving and communication skills. Experience with cloud platforms (GCP). Knowledge of CI/CD pipelines and DevOps practices.

Mandatory Skills: the same as the job description above.

Posted 1 week ago

Apply

11.0 - 15.0 years

20 - 25 Lacs

mumbai

Work from Office

About The Role
Enterprise Architects go beyond designing IT systems to deliver business change, which may also be supported and enabled by IT. They define services from a business perspective, both with and without automation, and make an initial grouping of services into components, using principles based on business objectives and constraints. They have deep business and industry expertise, are familiar with industry standards, and can work at boardroom and senior-management level. Enterprise Architects define and ensure a comprehensive and coherent view across Business, Information, Systems, and Technology.

About The Role - Grade Specific
Managing Enterprise Architect - Design, deliver, and manage complete enterprise solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Are seen as a role model in their (local) community. Certification: preferably Capgemini Architects certification level 2 or above, IAF and TOGAF 9 or equivalent.
Skills (competencies): (SDLC) Methodology, Active Listening, Adaptability, Agile (Software Development Framework), Analytical Thinking, APIs, Automation (Frameworks), AWS (Cloud Platform), AWS Architecture, Business Acumen, Business Analysis, Business Transformation, C#, Capgemini Integrated Architecture Framework (IAF), Cassandra (Relational Database), Change Management, Cloud Architecture, Coaching, Collaboration, Commercial Awareness, Confluence, Delegation, DevOps, Docker, ETL Tools, Executive Presence, Financial Awareness, GitHub, Google Cloud Platform (GCP), IAF (Framework), Influencing, Innovation, Java (Programming Language), Jira, Kubernetes, Managing Difficult Conversations, Microsoft Azure DevOps, Negotiation, Network Architecture, Oracle (Relational Database), Problem Solving, Project Governance, Python, Relationship-Building, Risk Assessment, Risk Management, SAFe, Sales Strategy, Salesforce (Integration), SAP (Integration), SharePoint, Slack, SQL Server (Relational Database), Stakeholder Management, Storage Architecture, Storytelling, Strategic Planning, Strategic Thinking, Sustainability Awareness, Teamwork, Technical Governance, Time Management, TOGAF (Framework), Verbal Communication, Written Communication

Posted 1 week ago

Apply

6.0 - 9.0 years

12 - 16 Lacs

hyderabad

Hybrid

Location: Hyderabad, Telangana. Mode of work: Hybrid. Kindly send your resume to 9361912009.

Full Stack with Data Exposure for Catalog

We are seeking an experienced Full Stack Developer with strong expertise in Databricks, SQL, and modern web technologies. The ideal candidate will be responsible for designing, developing, and maintaining scalable applications that integrate data engineering, analytics, and interactive user interfaces. This role requires a blend of front-end and back-end development skills, coupled with hands-on experience in Databricks and large-scale data management.

Key Responsibilities: Design and develop full-stack applications using [React/Angular/Vue] for the front end and [Node.js/Java/.NET] for the back end. Build and optimize data pipelines, transformations, and workflows in Databricks. Write efficient, scalable, and maintainable SQL queries for analytics, reporting, and data integration. Collaborate with data engineers, analysts, and business stakeholders to translate requirements into technical solutions. Integrate APIs and services to support real-time and batch data-driven applications. Ensure code quality, security, and performance through testing, code reviews, and best practices. Deploy applications in cloud environments (Azure/AWS/GCP) with CI/CD pipelines.

Required Skills & Experience: 6+ years of experience as a Full Stack Developer. Strong experience with Databricks (data pipelines, notebooks, Delta Lake, Spark SQL). Proficiency in SQL (complex queries, performance tuning, stored procedures). Hands-on experience with at least one modern front-end framework (React, Angular, or Vue). Back-end expertise with Node.js and Java. Strong understanding of REST APIs, microservices, and integration patterns. Experience with cloud platforms (Azure preferred; AWS/GCP acceptable). Familiarity with CI/CD tools (GitHub Actions, Jenkins, Azure DevOps, etc.). Excellent problem-solving and communication skills.
Preferred Qualifications: Experience with data visualization tools (Power BI, Tableau, or similar). Knowledge of Python for data-related tasks. Exposure to big data technologies (Apache Spark, Kafka, Delta Lake). Familiarity with containerization (Docker, Kubernetes).
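As an illustration of the SQL side of this role, a reporting-style aggregation; sqlite3 stands in here for Spark SQL over Delta tables, and the `orders` table is hypothetical:

```python
import sqlite3

# Stand-in for a warehouse table; in Databricks this would be Spark SQL over Delta Lake.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("south", 120.0), ("south", 80.0), ("north", 50.0)])

# Aggregate revenue per region, largest first -- the shape of a typical reporting query.
rows = conn.execute("""
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # → [('south', 200.0), ('north', 50.0)]
```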

Posted 1 week ago

Apply

10.0 - 15.0 years

0 Lacs

bengaluru

Remote

Project Role: SAP ECS - Enterprise Cloud Services Database Administration (contractual). Remote: Bengaluru, India. Start date: immediate. Duration: 1 year. Effort: 5 days per week. Skill: HANA.

Long description: ECS CAE HANA Architecture position, APJ / AMEX / US. Urgency: very high. Location: India / US / Canada / AMEX. On-call / weekend / out-of-hours work required: no; reasonable timezone overlaps may be needed. Remote / partial onsite: remote. Duration of the engagement / long-term position: 6 months, with extensions to follow.

Summary: We are looking for an SAP HANA expert with a strong focus on databases and data management topics, experienced in SAP HANA best practices, cloud service engineering processes, and transformation and innovation services, with a strong background in technical/functional end-to-end data environment management. The candidate should provide functional and technical expertise across multiple technologies and business domains. Key activities will include supporting POCs, architectures, migrations/upgrades/patching (where applicable), automation, performance and tuning, on-demand expertise, and optimizing existing processes through automation and AI/ML.

Experience and Role Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or similar, with 10+ years of related experience. Strong experience with SAP HANA internals and SAP HANA database management/administration, monitoring, and maintenance (including patching and upgrades). Strong experience in HANA performance and tuning, root cause analysis, HANA system replication, HANA scale-out systems, and extension options such as Native Storage Extension, extension nodes, and dynamic tiering. Strong experience in SAP business continuity/resiliency topics, including HA/DR.
Strong experience in SAP HANA SQLScript and one or more of Python/Shell/Perl. Integration experience in one or more areas: SAP Data Intelligence, SAP Cloud Platform Integration, Enterprise Information Management (SDI, SDA, SDS, BODS, Information Steward, SRS). Strong experience in designing, implementing, and onboarding cloud services and solutions into the solutions portfolio. Strong SAP architecture knowledge is desirable, to comprehend customer needs and solution appropriately. Experience in enterprise applications, preferably SAP S/4HANA or SAP BW/4HANA and/or SAP Basis and NetWeaver. Experienced in driving critical projects and bringing them to a successful conclusion. Experience in triaging, troubleshooting, and resolving technical and functional incidents and problems pertaining to SAP applications. Hands-on experience with hyperscalers, preferably Azure and/or AWS and/or GCP. Demonstrates a growth mindset and stays current with product and industry advancements. Supports customer escalations to help troubleshoot and resolve incidents and problems on time. Excellent analytical skills and face-to-face and remote communication skills, with a proven ability to bridge technology and business goals to provide productive solutions. Able to continuously learn and upskill to maximize contributions to a fast-evolving organization. Supports defining, implementing, and documenting processes for new solutions and services.

Required Skills: SAP solution knowledge including S/4HANA or SAP BW/4HANA. SAP HANA scale-up and scale-out solutions. SAP HANA NSE and extension nodes. SAP HANA XS, SAP HANA XSA, SAP HANA Cloud. SAP HANA security experience. Strong technical/solution architecting skills. Strong scripting/programming skills preferred. 5+ years of broad experience, including processes, portfolio, solution/service design, and onboarding, in a leading managed cloud services organization.

Good to have: Architect-level certification with one or more hyperscalers (AWS, Azure, or GCP).
Familiarity with Pacemaker and HA/DR solutions. Functional expertise in one or more application components: Master Data Management/Governance, Information Lifecycle Management, data aging (pertaining to SAP S/4HANA). Familiarity with storage architectures and one or more SAN technologies, e.g. EMC, Hitachi, NetApp. Familiarity with SAP Analytics Cloud (SAC) and/or SAP Business Technology Platform (BTP) and other SAP SaaS solutions. Familiarity with other SAP database products: SAP ASE, SAP IQ, SAP Replication Server. Familiarity with RISE with SAP offerings. Familiarity with artificial intelligence / machine learning and the ability to apply that knowledge to existing processes or delivery.
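The monitoring and performance work described above often starts from the SYS monitoring views; a sketch of a typical query (verify column availability against your HANA revision):

```sql
-- Per-service memory usage, largest consumers first (HANA SYS monitoring view)
SELECT host, service_name,
       ROUND(total_memory_used_size / 1024 / 1024 / 1024, 2) AS used_gb
FROM   m_service_memory
ORDER  BY total_memory_used_size DESC;
```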

Posted 1 week ago

Apply

3.0 - 4.0 years

5 - 8 Lacs

bengaluru

Work from Office

Build and deploy machine learning models for forecasting, route optimization, anomaly detection, and other analyses. Hands-on experience in Python, SQL, cloud platforms (AWS/Azure/GCP), and cloud-native ML tools such as SageMaker, Azure ML, and Vertex AI.

Required Candidate profile: Proficient in Python libraries, CI/CD pipeline tools, and Git. Exposure to generative AI frameworks or large language models (e.g., OpenAI GPT, Hugging Face Transformers).

Posted 1 week ago

Apply

3.0 - 4.0 years

5 - 8 Lacs

gurugram

Work from Office

Build and deploy machine learning models for forecasting, route optimization, anomaly detection, and other analyses. Hands-on experience in Python, SQL, cloud platforms (AWS/Azure/GCP), and cloud-native ML tools such as SageMaker, Azure ML, and Vertex AI.

Required Candidate profile: Proficient in Python libraries, CI/CD pipeline tools, and Git. Exposure to generative AI frameworks or large language models (e.g., OpenAI GPT, Hugging Face Transformers).

Posted 1 week ago

Apply

5.0 - 8.0 years

25 - 40 Lacs

pune, gurugram, bengaluru

Hybrid

Salary: 25 to 40 LPA. Experience: 5 to 10 years. Location: Gurgaon/Pune/Bengaluru. Notice: immediate to 30 days.

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, and ETL/ELT development, and at ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities: Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs. Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions. Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes. Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments. Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics. Integrate seamlessly with cloud platforms (Azure) to build and deploy end-to-end data pipelines in the cloud. Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must have: Client engagement experience and collaboration with cross-functional teams. Data engineering background in Databricks. Capable of working effectively as an individual contributor or in collaborative team environments. Effective communication and thought leadership with a proven record.

Candidate Profile: Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas. 3+ years of experience, which must be in data engineering. Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure. Prior experience managing and delivering end-to-end projects. Outstanding written and verbal communication skills. Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges. Able to understand cross-cultural differences and work with clients across the globe.
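The validation and cleansing step this listing describes can be sketched as a small transform; the utilities-style schema (`meter_id`, `reading`) is hypothetical:

```python
def clean_records(raw):
    """Normalise raw rows into a structured schema, dropping rows missing a key."""
    out = []
    for row in raw:
        if not row.get("meter_id"):
            continue  # reject rows without a primary key
        out.append({
            "meter_id": row["meter_id"].strip().upper(),   # canonical ID form
            "reading_kwh": float(row.get("reading", 0) or 0),  # coerce missing readings to 0.0
        })
    return out

raw = [
    {"meter_id": " m-001 ", "reading": "12.5"},
    {"meter_id": "", "reading": "3.0"},      # dropped: no key
    {"meter_id": "M-002", "reading": None},  # missing reading coerced to 0.0
]
print(clean_records(raw))
```

In a real pipeline this logic would run inside a Databricks notebook or an ETL task, with rejects routed to a quarantine table instead of being silently dropped.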

Posted 1 week ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

bengaluru

Work from Office

Your Role and Responsibilities

Who you are: We are firmware professionals working on Z systems, and we build the most secure systems for our customers to deploy their enterprise applications. We provide utmost security and enable IBM LinuxONE enterprise customers with an on-premises, customer-managed container solution through Secure Service Containers, which provides data protection as well as protection from insider threats. We develop new features for Dynamic Partition Manager (DPM), a quicker and easier way to deploy Linux servers on a Z system (mainframe). DPM is a configuration manager designed for setting up and managing Linux servers that run on a mainframe system.

What you'll do: Build the world's most secure systems via Z systems, which help customers deploy their applications; provide a platform for deploying cloud workloads that defends against threats such as cyber attacks; and make the enablement experience quicker and easier.

How we'll help you grow: You'll have access to all the technical and management training courses you need to become the expert you want to be. You'll learn directly from expert developers in the field; our team leads love to mentor. You'll have the opportunity to work in many different areas to figure out what really excites you.

Required education: Bachelor's degree. Preferred education: Bachelor's degree.

Required technical and professional expertise: 3+ years of relevant industry experience. Strong programming skills in Java, JavaScript, Python, and ReactJS. Expertise in OS concepts, container technologies, security, virtualization management, REST APIs, application development on cloud platforms, and DevOps (continuous integration). Demonstrated execution experience with cloud application development and container-technology DevOps. Good communication skills and the ability to work effectively in a global team environment.

Preferred technical and professional experience: Z Systems experience, private cloud knowledge, exposure to catalog services and microservices. Prior experience or exposure to storage, networking, and security would be an added advantage.

Posted 1 week ago

Apply

4.0 - 8.0 years

14 - 18 Lacs

pune, mumbai (all areas)

Hybrid

Role description: Senior DevOps Team Lead - GCP Specialist About the Role: We are seeking an experienced DevOps Team Lead with deep expertise in Google Cloud Platform (GCP) to help drive our cloud infrastructure strategy and operations. This role requires hands-on technical leadership with a focus on GCP services, particularly GKE, networking, and optimization strategies. Key Responsibilities - Lead the design, implementation, and management of our GCP-based infrastructure - Architect and optimize GKE clusters for performance, reliability, and cost - Design and implement secure VPC architectures, subnets, and firewall rules in GCP - Establish comprehensive logging and monitoring systems using GCP's native tooling - Drive cost optimization initiatives across our GCP resources - Mentor team members on GCP best practices and implementation patterns - Collaborate with development teams to implement CI/CD pipelines in GCP Required Technical Skills GCP Expertise (Primary Focus): - Extensive experience with Google Kubernetes Engine (GKE), including advanced configurations, scaling, and upgrades - Deep knowledge of GCP networking: VPC design, subnet architecture, and firewall configuration - Proven experience implementing comprehensive logging and monitoring solutions with GCP's Cloud Logging and Monitoring - Demonstrated success in optimizing GCP costs through resource management and architecture decisions - Experience with GCP IAM and security best practices General DevOps Skills: - Infrastructure as Code (Terraform, Helm preferred) - CI/CD pipeline design and implementation - Container orchestration and microservices architecture - Production troubleshooting and performance optimization - Incident management and response Qualifications - 5+ years of DevOps experience with at least 3 years focusing primarily on GCP environments - GCP Professional certifications preferred (Cloud Architect, DevOps Engineer) - Experience leading technical teams and mentoring engineers - 
Strong understanding of cloud security principles and best practices - Experience scaling infrastructure for high-traffic applications Note: While experience with other cloud platforms is valuable, this role requires demonstrated expertise and hands-on experience specifically with GCP as your primary cloud platform. Job Application Link: https://trampolinetech.ripplehire.com/candidate/?token=3JqyMuJLt1DX1AYD19fJ&ref=LI03⟨=en#detail/job/781848
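The VPC firewall design work named above can be sketched with the standard library's ipaddress module; the allowed source ranges here are hypothetical, not a recommended policy:

```python
import ipaddress

# Hypothetical allow-list mirroring VPC firewall source ranges.
ALLOWED_SOURCES = [ipaddress.ip_network(c) for c in ("10.0.0.0/8", "192.168.10.0/24")]

def is_source_allowed(addr: str) -> bool:
    """True if the address falls inside any allowed source range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in ALLOWED_SOURCES)

print(is_source_allowed("10.42.0.7"))    # → True
print(is_source_allowed("203.0.113.9"))  # → False
```

In GCP the equivalent check is expressed declaratively in firewall-rule source ranges; a helper like this is useful in tests that verify a planned rule set before it is applied.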

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 27 Lacs

pune

Remote

Hybrid work, Hinjewadi, Pune. Dialogflow CX: flows, playbooks, productionising voice and chat agents, test cases and evaluations. Python (Cloud Functions, Colab), SQL (BigQuery), analytical skills.

Required Candidate profile: Hybrid work, Hinjewadi, Pune; test cases and evaluations; Python (Cloud Functions, Colab); SQL (BigQuery); analytical skills.
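The "test cases and evaluations" part of this role can be sketched as a tiny accuracy harness; `fake_agent` is a hypothetical stand-in for calls to a deployed Dialogflow CX agent's detect-intent endpoint:

```python
def intent_accuracy(cases, predict):
    """Fraction of test utterances routed to the expected intent."""
    hits = sum(1 for utterance, expected in cases if predict(utterance) == expected)
    return hits / len(cases)

# Stand-in for the deployed agent; a real harness would call the CX API instead.
def fake_agent(utterance):
    return "book_flight" if "flight" in utterance else "fallback"

cases = [
    ("book me a flight to pune", "book_flight"),
    ("flight tomorrow morning", "book_flight"),
    ("what's the weather", "small_talk"),
]
print(round(intent_accuracy(cases, fake_agent), 2))  # → 0.67
```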

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

bengaluru

Hybrid

Please find below details about the position for your reference and provide confirmation/acceptance for this job role: CAA (Cloud Architect Advisory).

This is an SAP Cloud Architect position, an elevated role on the career path for an SAP Basis consultant. The architect's primary responsibility is to plan, design, and guide customers on their SAP landscape; for this position the expectation is that the selected candidate will perform similar responsibilities, guiding SAP customers about their landscape on the SAP RISE platform. The SAP RISE platform offers all the latest SAP technologies (products), with options for the customer to select their own hyperscaler (AWS, Azure, or GCP). A CAA team member works very closely with the SAP presales team to understand the customer's landscape requirements and, based on their SAP product/infrastructure selection, creates their to-be landscape architecture/design. The CAA team member is responsible for discussing the landscape architecture (setup and build process, dependencies, etc.) with customer leaders and their technical teams. (Refer to the detailed roles and responsibilities below.) This role is customer facing and requires joining meetings, answering technical queries from the customer team, and presenting assessments. Since it is an architect role, there will not be any hands-on work.

Scope for the career: Scope to learn new technologies across SAP products (S/4HANA, HANA Database, MDG), SaaS products (Ariba, Salesforce, etc.), and cloud (AWS, Azure, or GCP). Access to all SAP internal training and learning materials to gain knowledge of the latest technologies. Exposure to all the latest build architectures in a single position, which will be valuable in future; generally this exposure is not available in a regular SAP Basis position. Will not be assigned to one single customer, but will engage with multiple customers in various industries to gain knowledge of different SAP landscape setups, delivery processes, and challenges.

Roles and Responsibilities: As a Technical Architect, the roles and responsibilities are as below; proper KT and training will be provided once onboarded in the account.

Bill of Material review and analysis: Review the technical accuracy of the Bill of Material to validate that no obsolete product version (ECC, etc.) is selected to build on the SAP RISE platform. Provide feedback to the sales team on the Bill of Material to confirm that the selected product and version are available to build. Perform technical validation of the customer landscape to confirm product-version build viability. Analyze EarlyWatch reports and technical data to validate hardware selection/sizing. Work closely with the sales team on the architecture, the landscape, and the to-be design. Size the to-be landscape for migrations.

Customer deal support, pre-signature: Prepare technical assessments to discuss with the customer and explain their to-be landscape. Conduct technical assessments with customers. Perform workshops with the customer (networking, HA concepts, RACI reviews).

Customer deal support, post-signature: Onboarding questionnaire support (assist the customer with questions on onboarding). Answer customer questions relating to delivery. Provide delivery with key information during the sales cycle. Provide turnover to delivery of key items during the sales cycle. Answer delivery-specific questions from the delivery team.

Work location: Remote (as of now).

Technical skill and experience required: 6+ years (CAA role) of relevant experience: SAP Basis, SAP HANA, and S/4HANA skills; migrations/upgrades; and experience on any cloud (AWS, MS Azure, GCP). Good communication skills, able to articulate, with customer-facing experience.

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

pune, gurugram, bengaluru

Hybrid

Salary: 20 to 35 LPA. Experience: 5 to 8 years. Location: Gurgaon/Pune/Bangalore. Notice: immediate to 30 days.

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile: 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering. Strong understanding of GCP platform administration, including Compute Engine, Dataproc, Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
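Pipeline work of the kind described here ultimately reduces to ordering tasks by their dependencies; a sketch with the standard library (the task names are hypothetical, and an Airflow DAG expresses the same shape declaratively):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each task maps to the set of tasks it waits on -- the shape of an ETL DAG.
deps = {
    "load_bigquery":  {"transform"},
    "transform":      {"extract_gcs"},
    "quality_checks": {"transform"},
    "extract_gcs":    set(),
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # extract_gcs runs first; load/quality run only after transform
```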

Posted 1 week ago

Apply

6.0 - 11.0 years

10 - 17 Lacs

pune

Work from Office

SUMMARY

Job Title: DevOps Application Lead. Location: Pune. Experience Required: 6+ years. Must-Have: The candidate should possess at least 5 years of relevant experience in DevOps with any cloud platform (AWS, Azure, GCP). Education: Completion of 15 years of full-time education.

Job Summary: We are in search of a DevOps Application Lead who will be responsible for the design, development, and management of applications, while also leading the team to deliver high-quality solutions. This role involves close collaboration with cross-functional teams, overseeing project timelines, and ensuring the implementation of best practices in DevOps processes. The chosen candidate will serve as the primary point of contact for the team, providing guidance in problem-solving and driving continuous improvement.

Roles & Responsibilities: Lead and oversee a team in the implementation of DevOps practices. Take ownership of team decisions and project outcomes. Collaborate with multiple teams and contribute to key decisions. Design and execute CI/CD pipelines and automation solutions. Offer solutions to technical issues and mentor junior team members. Continuously enhance processes to improve efficiency and performance.

Professional & Technical Skills (Must Have): Strong expertise in DevOps tools and practices. Proficiency in CI/CD, automation, and monitoring. Hands-on experience with cloud platforms (AWS, Azure, GCP). Knowledge of Docker, Kubernetes, and containerization. Scripting experience in Python, Bash, or similar languages.

Why Join Us? Opportunity to lead impactful DevOps projects. Work with cutting-edge tools and cloud technologies. Collaborative and growth-focused environment.

Requirements: At least 5 years of relevant experience in DevOps. Completion of 15 years of full-time education.

Posted 1 week ago

Apply

3.0 - 8.0 years

19 - 30 Lacs

pune

Hybrid

Note: We are only seeking candidates with airline domain experience, including NDC and GDS/Amadeus. Must have 3+ years of experience in a Product Owner role.

Responsibilities: Work with stakeholders, architects, engineering managers, cross-functional product teams, and developers to build high-value features that help improve overall code quality and efficiency. Align stakeholders around the product vision and drive teams to deliver on quarterly roadmap goals. Work with product leaders to build and maintain the product roadmap. Own KPI monitoring and reporting. Align product teams to identify opportunities for improvement in security/CI-CD/quality/performance practices and design reusable components. Establish product requirements and set priorities and user expectations with scrum teams. Attend all agile ceremonies as the product SME. Translate requirements into well-articulated epics and stories for technical teams. Determine acceptance criteria and work with QA to build test cases. Report on team output and continuously optimize processes to ensure productivity. Identify internal dependencies and risks, and pivot between high-priority tasks.

Desired Skills and Experience: 3+ years of well-rounded experience in a product role. Bachelor's degree in computer science or an MBA preferred. Knowledge of software architecture, REST, cloud platforms, platform engineering, microservices, and third-party integrations. Experience working as a hands-on Quality Engineer or Application Developer (an asset). Travel industry experience, with experience in Web or API connections (an asset). Experience in building UI interfaces/wireframing (an asset). Strategic thinker with strong experience in data-backed decision-making. Strong project management and organizational skills. Output- and delivery-driven. Resilient, with the ability to work with a level of ambiguity. Proficient in communication, expressing and articulating ideas across technical teams, product teams, and business stakeholders. Passion for problem solving, outside-of-the-box thinking, and root cause analysis. Leadership skills: the ability to bring teams together for a common purpose and communicate ideas across multiple teams and stakeholders. Highly accountable team player.

Contact: Sam - 7982371791. Email: Sam@hiresquad.in

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

You will be joining our team as Assistant Vice President, Applications Development Senior Programmer Analyst (C12) based in Pune, India. The XVA and Cross Asset Margin technology teams focus on developing and enhancing strategic systems and services that manage risk in Citi Capital Markets. This includes calculating margin requirements for OTC bilateral derivatives and optimizing margin requirements for clients using models like VaR, SIMM, and Credit Stress. You will support various stakeholders such as traders, salespeople, risk managers, financial controllers, and operations staff. As an Applications Development Senior Programmer Analyst, your role involves participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. You will contribute to applications systems analysis and programming activities with a primary focus on UI platform development. Responsibilities: - Provide expertise in applications programming and ensure application design aligns with the overall architecture blueprint. - Develop standards for coding, testing, debugging, and implementation based on advanced system flow knowledge. - Gain comprehensive knowledge of business areas to facilitate integration and achievement of business goals. - Guide the team in setting design standards to enhance development efficiency and product rollout. - Take ownership of planning and executing smooth Production releases. - Analyze existing operations, identify risks and redundancies, and develop solutions. - Collaborate with team members and management to ensure projects meet application development and governance standards. - Stay updated on industry trends and developments. - Resolve high-impact problems/projects through thorough evaluation of complex business processes and industry standards. - Assess risks appropriately when making business decisions to safeguard Citigroup and its clients. 
Qualifications:
- 7+ years of relevant experience in an applications development role.
- Hands-on coding experience, particularly in Java (Spring-Web, Spring-Data, Hibernate) and modern UI technologies.
- Good knowledge of UI/UX design patterns and microservices architecture.
- Experience building applications on cloud platforms.
- Strong analytical, troubleshooting, and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to work independently and as part of a team.
- Proficiency in multitasking and task prioritization.
- Preferred: experience building public-facing web portals, UX design patterns, and business knowledge of Margin, CVA, XVA, and regulatory stress testing.

Education:
- Bachelor's degree in Computer Science, Mathematics, or equivalent.
- Master's degree preferred.

Citi is an equal opportunity and affirmative action employer, encouraging all qualified applicants to apply for career opportunities. If you require a reasonable accommodation due to a disability, please review Accessibility at Citi.
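The listing mentions margin models such as VaR. Purely as an illustration (not Citi's actual methodology), historical-simulation VaR can be sketched as a loss quantile over P&L scenarios; the function name and scenario data below are hypothetical:

```python
import math

def historical_var(pnl_scenarios, confidence=0.99):
    """Historical-simulation VaR: the loss level that the portfolio's
    P&L is not expected to exceed with the given confidence."""
    if not pnl_scenarios:
        raise ValueError("need at least one scenario")
    losses = sorted(-p for p in pnl_scenarios)  # positive number = loss
    # 0-based index of the confidence-quantile loss
    idx = min(len(losses) - 1, math.ceil(confidence * len(losses)) - 1)
    return losses[idx]

# 100 hypothetical daily P&L scenarios with losses of 1..100
scenarios = [-i for i in range(1, 101)]
print(historical_var(scenarios, confidence=0.99))  # 99
```

Production margin engines (SIMM, Credit Stress) are far more involved, but the quantile-over-scenarios idea above is the core of the historical approach.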

Posted 1 week ago

Apply

7.0 - 12.0 years

18 - 24 Lacs

bengaluru

Work from Office

Responsibilities:
* Deploy, configure, and manage AWS Databricks clusters
* Optimize performance through tuning and scaling
* Collaborate with cross-functional teams on data engineering projects
* Ensure security compliance and access control

Work from home
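Since cluster setup is a core duty here, a minimal sketch of building an autoscaling cluster definition for the Databricks Clusters REST API (`POST /api/2.0/clusters/create`) may help; the field names follow the public API, while the name, Spark version, and node-type defaults are placeholder assumptions:

```python
import json

def autoscaling_cluster_payload(name, min_workers, max_workers,
                                spark_version="13.3.x-scala2.12",
                                node_type="i3.xlarge"):
    """Build a request body for creating an autoscaling Databricks
    cluster. Defaults are illustrative placeholders, not recommendations."""
    if min_workers < 1 or max_workers < min_workers:
        raise ValueError("invalid autoscale bounds")
    return {
        "cluster_name": name,
        "spark_version": spark_version,
        "node_type_id": node_type,
        "autoscale": {"min_workers": min_workers, "max_workers": max_workers},
    }

print(json.dumps(autoscaling_cluster_payload("etl-prod", 2, 8), indent=2))
```

Letting Databricks scale between a worker floor and ceiling, rather than pinning a fixed size, is the usual first lever for the tuning-and-cost part of the role.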

Posted 1 week ago

Apply

6.0 - 10.0 years

25 - 35 Lacs

gurugram, mumbai (all areas)

Work from Office

6+ years of IT experience, including at least 3-4 years hands-on with Splunk ITSI in complex environments. Splunk ITSI or Splunk Architect certification is mandatory; otherwise, the candidate should have equivalent hands-on experience. Required candidate profile: strong proficiency in Splunk SPL scripting and dashboard creation, and experience integrating Splunk ITSI with tools and technologies across networks, cloud platforms, on-premises data centers, and IoT systems.

Posted 1 week ago

Apply

12.0 - 20.0 years

15 - 25 Lacs

pune

Work from Office

We are seeking a highly experienced and visionary Vice President of Technology to lead our technology strategy, oversee product development, and ensure our engineering and IT teams deliver scalable, secure, and innovative solutions. The VP of Technology will play a key role in aligning business goals with technical initiatives, driving digital transformation, and ensuring that technology remains a competitive advantage.

Role & responsibilities

Strategic Leadership
- Define and execute the organization's long-term technology strategy in alignment with business objectives.
- Evaluate emerging technologies and trends to identify opportunities for innovation and competitive differentiation.
- Collaborate with executive leadership to influence company strategy and growth.

Technology & Product Development
- Oversee end-to-end software/product architecture, design, development, testing, and deployment.
- Ensure scalable, secure, and high-performing technology platforms.
- Partner with Product Management to balance customer needs, technical feasibility, and business priorities.

Team Leadership & Culture
- Build, mentor, and inspire a high-performing engineering, data, and IT team.
- Foster a culture of innovation, accountability, collaboration, and continuous improvement.
- Define career paths, set performance metrics, and develop leadership within the technology organization.

Operations & Governance
- Establish best practices in software development, DevOps, data security, and IT compliance.
- Own technology budgets, vendor management, and resource planning.
- Ensure compliance with relevant regulations, cybersecurity, and data privacy standards.

Stakeholder Engagement
- Serve as the primary technology spokesperson to internal and external stakeholders.
- Build partnerships with customers, vendors, and technology partners to strengthen business outcomes.
- Communicate complex technical concepts to non-technical audiences, including board members.
Qualifications
Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field. An MBA is a plus.
Experience: 12+ years of progressive experience in software development, engineering, or IT, with at least 5+ years in senior leadership roles. Proven track record of scaling technology in high-growth organizations.

Preferred candidate profile
- Strong leadership and people management skills.
- Excellent knowledge of software engineering, cybersecurity, data management, and IT operations.
- Ability to align technology with business priorities.

Posted 1 week ago

Apply

2.0 - 4.0 years

9 - 12 Lacs

noida

Work from Office

Job Responsibilities
- Design, build, and maintain scalable data pipelines for ingestion, processing, and storage.
- Collaborate with data scientists, analysts, and product teams to deliver high-quality data solutions.
- Optimize data systems for performance, reliability, scalability, and cost-efficiency.
- Implement data quality checks ensuring accuracy, completeness, and consistency.
- Work with structured and unstructured data from diverse sources.
- Develop and maintain data models, metadata, and documentation.
- Automate and monitor workflows using tools like Apache Airflow (or similar).
- Ensure data governance and security best practices are followed.

Required Skills & Qualifications
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering, ETL development, or backend data systems.
- Proficiency in SQL and Python/Scala.
- Experience with big data tools (Spark, Hadoop, Kafka, etc.).
- Hands-on with cloud data platforms (AWS Redshift, GCP BigQuery, Azure Data Lake).
- Familiarity with orchestration tools (Airflow, Luigi, etc.).
- Experience with data warehousing and data modeling.
- Strong analytical and problem-solving skills; ability to work independently and in teams.

Preferred Qualifications
- Experience with containerization (Docker, Kubernetes).
- Knowledge of CI/CD processes and Git version control.
- Understanding of data privacy regulations (GDPR, CCPA, etc.).
- Exposure to machine learning pipelines / MLOps is a plus.
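The data quality checks called out in this role (accuracy, completeness, consistency) can be sketched as a small rule-based validator; the field names and rules below are hypothetical examples, not a specific framework:

```python
def run_quality_checks(rows, required_fields):
    """Flag completeness problems (missing required fields) and a simple
    consistency problem (duplicate ids) in a batch of records."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        rid = row.get("id")
        if rid in seen_ids:
            issues.append((i, f"duplicate id {rid}"))
        seen_ids.add(rid)
    return issues

rows = [{"id": 1, "name": "a"}, {"id": 1, "name": ""}]
print(run_quality_checks(rows, required_fields=["name"]))
```

In practice such checks typically run as an orchestrator task (e.g. in Airflow) so that failures block downstream loads instead of silently propagating bad data.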

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

mumbai, bengaluru, delhi / ncr

Hybrid

We are Hiring: Data Engineers – Enterprise Data Platforms & Analytics (50+ Openings | 3-20 Yrs Exp | Multiple Levels)

Location: Mumbai / Pune / Chandigarh / Bengaluru / Gurugram / Noida / New Delhi / Kolkata / Hyderabad / Chennai (remote and hybrid options are available for some roles)
Industry: Fortune 500 Client Projects (Staffing via Hatchtra Innotech Pvt. Ltd.)
Employment Type: Full-Time / Contract (C2H)

About us: Hatchtra is a leading staffing and workforce solutions company, trusted by Fortune 500 organizations and global enterprises to build their technology teams. When you join us, you'll work directly with our Fortune 500 client teams that power mission-critical systems worldwide.

About the Role
We are looking for Data Engineers (3-20 years) to design, build, and maintain scalable data pipelines and enterprise data platforms. This role is ideal for engineers passionate about big data, cloud data platforms, ETL/ELT processes, and analytics engineering to support global enterprises in their digital transformation journey. From data ingestion to advanced analytics enablement, you will play a critical role in turning raw data into actionable insights.

Open Positions & Designations
- Associate Data Engineer (3-5 Yrs)
- Data Engineer / Senior Data Engineer (5-10 Yrs)
- Lead Data Engineer / Analytics Engineer (8-12 Yrs)
- Data Engineering Manager / Solutions Architect (10-15 Yrs)
- Director – Data Platforms & Engineering (15-20 Yrs)

Key Responsibilities (Scale with Level)

Professional (3-5 Yrs)
- Design and implement ETL/ELT pipelines to integrate data from various sources.
- Build and optimize data warehouses and data lakes on cloud or on-prem platforms.
- Work with SQL, Python, or Scala for data processing and transformation.
- Implement data quality checks and validation frameworks.
- Support analytics teams by making clean, reliable data available.
- Develop basic automation scripts for data ingestion and reporting workflows.
Mid-Level (5-10 Yrs)
- Lead data integration projects across structured, semi-structured, and unstructured data.
- Optimize pipelines for performance, scalability, and cost-effectiveness.
- Implement data modeling best practices for analytics and BI.
- Collaborate with data scientists, analysts, and product teams to enable ML/AI use cases.
- Deploy streaming data pipelines using Kafka, Kinesis, or Azure Event Hub.
- Manage and improve data governance, lineage, and metadata management.

Senior/Leadership (10-20 Yrs)
- Define enterprise data engineering strategy and architecture standards.
- Lead multi-cloud data platform modernization initiatives (AWS, Azure, GCP).
- Build and manage global data engineering teams and delivery models.
- Partner with executives to deliver data-driven transformation programs.
- Oversee compliance, data security, and privacy frameworks (GDPR, HIPAA, SOC2).
- Drive innovation in real-time analytics, serverless data processing, and AI-driven data engineering.

Skills & Tools
- Core Expertise: ETL/ELT Development, Data Modeling, Pipeline Automation, Data Warehousing
- Programming: SQL, Python, Scala, Java
- Data Platforms: Snowflake, BigQuery, Azure Synapse, Amazon Redshift, Databricks
- Orchestration: Apache Airflow, AWS Step Functions, Azure Data Factory, dbt
- Streaming: Kafka, Kinesis, Azure Event Hubs, Spark Streaming
- Big Data Tools: Apache Spark, Hadoop, Hive
- Cloud Expertise: AWS, Azure, GCP (multi-cloud deployments)
- DevOps for Data: Docker, Kubernetes, Terraform, CI/CD for DataOps

Preferred Certifications
- Google Professional Data Engineer
- AWS Certified Data Analytics – Specialty
- Microsoft Certified: Azure Data Engineer Associate

Qualifications
- Bachelor's/Master's in Computer Science, Data Engineering, or a related field.
- 3+ years of experience in data engineering, ETL/ELT, or data platform development.
- Strong proficiency in SQL, Python/Scala, and cloud-native data services.
- Experience in building scalable, secure, and automated pipelines.
- Proven leadership and strategic architecture experience for senior roles.

Why Join Us?
- Contribute to enterprise-scale data transformation projects.
- Collaborate with Fortune 500 companies and global engineering teams.
- Exposure to cutting-edge cloud data platforms and big data technologies.
- Career growth, certifications, and flexible work options.

How to Apply
For quick consideration, please email your resume and include the desired position and experience level (e.g., "Data Engineer – Mid-Level") in the subject line.
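The extract-transform-load flow these roles revolve around can be sketched end to end with plain functions; the source, sink, and field names here are hypothetical stand-ins for real connectors (S3/Kafka on the way in, a warehouse table on the way out):

```python
def extract(source_rows):
    """Extract: yield raw records (in-memory stand-in for a real source)."""
    yield from source_rows

def transform(record):
    """Transform: normalize names/types; return None to drop bad records."""
    try:
        return {"user": record["user"].strip().lower(),
                "amount": float(record["amount"])}
    except (KeyError, ValueError):
        return None

def load(records, sink):
    """Load: append cleaned records to the sink; return count loaded."""
    loaded = 0
    for r in records:
        if r is not None:
            sink.append(r)
            loaded += 1
    return loaded

def run_pipeline(source_rows, sink):
    """Compose the stages lazily so large batches stream through."""
    return load((transform(r) for r in extract(source_rows)), sink)

sink = []
n = run_pipeline([{"user": " Ana ", "amount": "10.5"}, {"user": "bo"}], sink)
print(n, sink)  # 1 [{'user': 'ana', 'amount': 10.5}]
```

Orchestrators like Airflow or dbt wrap exactly these stages as scheduled, monitored tasks; the generator-based composition keeps memory flat regardless of batch size.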

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

hyderabad, pune, bengaluru

Work from Office

Automation Tester – Selenium

YOU MUST have:
- 5+ years in test automation
- Good experience creating or extending automation frameworks using Selenium
- Good experience developing Selenium Java automation
- Experience with APIs and cloud platforms
- Experience defining and reviewing automation test packs to improve software quality
- Excellent team collaboration and the ability to proactively research and solve issues
- Experience in continuous integration/deployment using Jenkins
- Excellent communication and leadership skills
- Ability to contribute to planning, design, and implementation of test automation frameworks using best-practice techniques and principles
- Ability to develop well-tested, maintainable code
- Experience in the financial sector
- Experience in Agile Scrum/TDD/BDD methodologies
- Experience working with Jira/Confluence

If this position sounds of interest, please apply and a member of the resource team will be in contact to proceed with your application.
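Framework work like this usually starts from the page-object pattern: each page's locators and actions live in one class so tests stay readable and a locator change touches one file. A minimal sketch, with a fake driver standing in for a real Selenium WebDriver (the URL and locators are hypothetical):

```python
class FakeDriver:
    """Stand-in for a Selenium WebDriver so this sketch runs anywhere;
    a real framework would inject webdriver.Chrome() instead."""
    def __init__(self):
        self.url = None
        self.fields = {}

    def get(self, url):
        self.url = url

    def type_into(self, locator, text):
        self.fields[locator] = text

class LoginPage:
    """Page object: owns this page's locators and user actions."""
    URL = "https://example.test/login"  # hypothetical URL
    USER, PASSWORD = "#user", "#password"

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, user, password):
        self.driver.type_into(self.USER, user)
        self.driver.type_into(self.PASSWORD, password)
        return self

driver = FakeDriver()
LoginPage(driver).open().login("qa", "secret")
print(driver.url, driver.fields)
```

Returning `self` from each action allows the fluent chaining shown at the bottom, which keeps test bodies close to plain English.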

Posted 1 week ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

pune

Work from Office

Functional Modules: > SF Time Management (Con & S.Con) > SF EC (Con & S.Con) > SF/SAP Payroll (Con & S.Con) > SF Onboarding 2.0 (Con & S.Con) > SF Compensation (Con) Technical Modules: > SAP CPI (Con) > ABAP HR (Con) > SAP BTP (Con & S.Con)

Posted 1 week ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

bengaluru

Work from Office

Functional Modules: > SF Time Management (Con & S.Con) > SF EC (Con & S.Con) > SF/SAP Payroll (Con & S.Con) > SF Onboarding 2.0 (Con & S.Con) > SF Compensation (Con) Technical Modules: > SAP CPI (Con) > ABAP HR (Con) > SAP BTP (Con & S.Con)

Posted 1 week ago

Apply
