
19,748 GCP Jobs - Page 12

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description
As an Architect within Ford Credit IT, you will play a pivotal role in architecting and delivering end-to-end digital solutions across web, mobile, and backend platforms. Your focus will extend beyond AEM development, incorporating backend API integrations, security frameworks, and GraphQL-based data access layers. You will interface closely with product leadership and diverse stakeholders to align technology strategies with business goals, ensuring scalable and secure architecture that supports our digital product portfolio.

Responsibilities:
Lead the architecture and technical strategy for AEM-based frontends as well as backend integrations, emphasizing security, scalability, and performance. Design and develop modern AEM SPA applications using ReactJS, with deep integration into backend services via REST and GraphQL APIs. Architect secure solutions including API authentication, authorization, data encryption, and compliance with organizational and regulatory standards. Drive comprehensive architecture documentation describing solution design, integration points, security considerations, and deployment models for management review. Collaborate with cross-functional teams and stakeholders to gather requirements, define technical solutions, and ensure alignment with organizational goals. Provide technical leadership and mentorship, enabling engineering teams to deliver high-quality solutions. Be prepared to deep-dive into codebases as necessary while maintaining strategic oversight. Support migration and modernization initiatives by gradually evolving legacy applications towards modern cloud-native architectures on Google Cloud. Lead discussions around reusable components, micro-frontend architecture, and interoperability strategies. Stay current with evolving AEM capabilities, the GraphQL ecosystem, web security best practices, and emerging cloud technologies. Demonstrate excellent stakeholder management skills, managing expectations and fostering alignment across technical and business teams.

Qualifications / Required Skills:
Proven experience designing and implementing large-scale AEM architectures, including custom components, templates, workflows, and SPA development with ReactJS. Strong expertise in backend integration patterns, including RESTful APIs and GraphQL, with a solid understanding of API design, versioning, security protocols (OAuth, JWT), and performance optimization. Deep understanding of web application security principles, including secure authentication flows, data protection, and vulnerability mitigation. Experience with cloud platforms (preferably GCP) encompassing deployment, monitoring, and securing applications. Ability to create clear, comprehensive architecture and design documentation tailored for diverse audiences ranging from developers to senior management. Demonstrated experience in stakeholder engagement, coordinating across product, engineering, and business teams to align technical solutions with organizational objectives. Hands-on coding skills and willingness to inspect and contribute to code as needed, bridging the gap between strategic vision and implementation. Familiarity with Agile methodologies, DevOps practices, and continuous integration/continuous delivery (CI/CD) pipelines. Consulting experience is a distinct advantage, bringing strong client engagement, problem-solving, and solution delivery capabilities.
Technical Skills:
Adobe Experience Manager (6+) with hands-on SPA application development
ReactJS, Node.js, Java, HTML5, CSS3, and JavaScript frameworks
Basic understanding of backend integration with Java or similar languages
GraphQL APIs: integration and security
Cloud platforms: Google Cloud Platform preferred; Azure/AWS awareness
Experience with Adobe Target, Adobe Launch, and Micro Frontend (MFE) architectures is a plus
Knowledge of security standards like OAuth 2.0, OpenID Connect, and secure API gateways
Primary: Minimum 6 years in Adobe Experience Manager architecture and SPA development; proficient with ReactJS, Node.js, Java, HTML5, and CSS3
Secondary: Familiarity with the Single-SPA framework and Angular; experience with Adobe Target & Launch for digital marketing optimization; Micro Frontend architecture development
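
This posting centres on AEM SPA frontends backed by GraphQL APIs secured with OAuth/JWT. As a rough illustration of that integration pattern (the endpoint, query shape, and token below are placeholder assumptions, not the employer's actual API), a client-side call might look like the following sketch:

```python
import requests

# Illustrative only: the endpoint, query, and token are placeholders, not a real Ford Credit API.
GRAPHQL_ENDPOINT = "https://example.com/content/graphql/endpoint.json"
ACCESS_TOKEN = "<jwt-obtained-via-oauth>"

query = """
query ArticleList($limit: Int!) {
  articleList(limit: $limit) {
    items { title path }
  }
}
"""

response = requests.post(
    GRAPHQL_ENDPOINT,
    json={"query": query, "variables": {"limit": 5}},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},  # OAuth/JWT bearer authentication
    timeout=10,
)
response.raise_for_status()
for item in response.json()["data"]["articleList"]["items"]:
    print(item["title"], item["path"])
```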

Posted 1 day ago

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Greetings! Please find the JD below; if you are interested, please share your updated profile as soon as possible to nitin.1@zensar.com. Note: Looking for immediate joiners only.

Key Responsibilities:
Develop and maintain test automation frameworks for frontend and backend systems using Java and related tools. Write and execute automated test scripts for UI, API, and backend services. Test containerized applications in Docker and Kubernetes environments, and on cloud platforms (AWS, Azure, GCP). Collaborate with developers and QA team members to identify test requirements and ensure test coverage. Integrate automated tests into CI/CD pipelines (e.g., Jenkins, GitLab CI). Analyze test results, identify bugs, and work with the development team to resolve issues. Maintain and enhance test environments and test data management, and write complex SQL queries. Participate in Agile/Scrum ceremonies and contribute to sprint planning and retrospectives.

Required Skills and Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field. 3+ years of experience in test automation using Java. Strong programming skills in Java, JavaScript, or Python. Strong knowledge of Selenium WebDriver and TestNG/JUnit. Experience with REST API testing using Postman or Rest Assured, and JSON/XML. Familiarity with version control systems (e.g., Git). Experience with CI/CD tools like Jenkins, Maven, or Gradle. Solid understanding of software testing principles, including functional, regression, integration, and performance testing. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Familiarity with microservices and container orchestration. Strong debugging and analytical skills.

Nice-to-Haves:
Experience with AI/ML testing tools. Experience with BDD frameworks like Cucumber. Knowledge of cloud platforms (AWS, Azure, GCP). Knowledge of Infrastructure as Code (Terraform, Ansible). Familiarity with containerization tools like Docker and Kubernetes. Exposure to performance testing tools like JMeter, Gatling, or K6. Exposure to mobile testing (Appium) and Playwright. Experience conducting security testing using OWASP ZAP or Burp Suite.
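
The role combines REST API testing with CI-friendly automation. The posting is Java-centric (Rest Assured, TestNG), but since Python is also listed as an accepted language, here is a minimal pytest-style sketch of an automated API check; the public demo API below is used purely for illustration and is not part of the posting:

```python
import pytest
import requests

BASE_URL = "https://jsonplaceholder.typicode.com"  # public demo API, used only for illustration


def test_get_user_returns_expected_fields():
    # Functional check of a REST endpoint: status code, content type, and payload shape.
    resp = requests.get(f"{BASE_URL}/users/1", timeout=10)
    assert resp.status_code == 200
    assert "application/json" in resp.headers["Content-Type"]
    body = resp.json()
    assert body["id"] == 1
    assert "email" in body


@pytest.mark.parametrize("user_id", [1, 2, 3])
def test_users_exist(user_id):
    # Data-driven check, the kind of case that would run inside a Jenkins/GitLab CI stage.
    resp = requests.get(f"{BASE_URL}/users/{user_id}", timeout=10)
    assert resp.status_code == 200
```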

Posted 1 day ago

0 years

0 Lacs

Himachal Pradesh, India

On-site


Job Posting for Automation Developer (Healthcare Domain). Location: Parwanoo (Himachal Pradesh)

Job Description:
Design end-to-end IoT solutions, including hardware, software, and cloud integration. Select appropriate IoT platforms (AWS IoT, Azure IoT, Google Cloud IoT). Develop system architecture for scalability, security, and reliability. Collaborate with engineers and stakeholders to implement IoT projects. Ensure compliance with data privacy and security standards. Develop firmware for IoT devices (microcontrollers, sensors, gateways). Write backend services for data processing and analytics. Implement communication protocols (MQTT, WebSockets, REST APIs). Optimize device performance and power consumption. Debug and troubleshoot IoT systems. Design and prototype IoT hardware (PCBs, sensors, edge devices). Select components (microcontrollers, wireless modules, power systems). Work with manufacturers for production and testing. Ensure compliance with industry standards (FCC, CE). Optimize for low-power and wireless connectivity (Wi-Fi, Cellular, LoRa). Develop data pipelines for IoT sensor data. Implement real-time analytics and machine learning models. Optimize databases (SQL, NoSQL, time-series DBs like InfluxDB). Visualize data using dashboards (Grafana, Tableau, Power BI).

Required Skills:
Strong knowledge of IoT protocols (MQTT, CoAP, HTTP). Experience with cloud platforms (AWS, Azure, GCP). Familiarity with embedded systems and wireless communication (BLE, LoRa, Zigbee). Programming skills (Python, Java, C/C++). Understanding of cybersecurity for IoT. Python, SQL, and data processing frameworks. Experience with cloud data services (AWS IoT Analytics, Azure Stream Analytics). Knowledge of machine learning for predictive maintenance.
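
MQTT is the main device-to-cloud protocol named above. Below is a minimal sketch of a device publishing telemetry with the paho-mqtt client; the broker, topic, and payload fields are illustrative assumptions, and a production healthcare deployment would use TLS and an authenticated cloud IoT endpoint rather than a public test broker:

```python
import json
import time

import paho.mqtt.client as mqtt

# Placeholder broker and topic for illustration only.
BROKER_HOST = "test.mosquitto.org"
TOPIC = "demo/healthcare/device-001/telemetry"

# paho-mqtt 2.x requires an explicit callback API version; drop the argument on 1.x.
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER_HOST, 1883, keepalive=60)
client.loop_start()

for _ in range(3):
    payload = json.dumps({"device_id": "device-001", "temp_c": 36.7, "ts": time.time()})
    client.publish(TOPIC, payload, qos=1)  # QoS 1: at-least-once delivery
    time.sleep(1)

client.loop_stop()
client.disconnect()
```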

Posted 1 day ago

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are a Rakuten Group company, providing global B2B/B2C services for the mobile telco industry and enabling next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan's newest mobile network, we are now taking our mobile offering global! To support our ambitions to provide an innovative cloud-native telco platform for our customers, we are looking to recruit and develop top talent in Digital Product Management. Let's build the future of mobile telecommunications together!

Role: Technical Program Manager
You will independently lead cross-organisation programs, influencing the roadmap priorities and technical direction across teams. You will work with stakeholders across the organisation and own the communication of all aspects of the program, including surfacing risks and progress towards the goal. You will guide the team towards technical solutions and make trade-off decisions. You will drive program management best practices across the organisation. The role requires working closely with multiple functional teams (including but not limited to Business, Architects, Engineering, and Operations support) to build and maintain program delivery timelines, unblock teams, define and streamline cross-functional dependencies, and increase the efficiency and velocity of project execution. You will likely spend most of your days in Agile, Kanban, or other project planning tools and in meetings with relevant stakeholders to make sure projects keep moving forward, delivering a program execution strategy and timeline as well as regular reporting of project health to stakeholders throughout a project's life cycle.

Team: RBSS Delivery organization

Skills and Qualification:
Up to 15 years of hands-on technical project/program management experience, with at least 10 years of program managing/working in Scrums. Must have a telecom background with exposure to working with telecom operators/ISPs (B2B, B2C customer solutions) in software delivery/integration, with at least 5 years in the BSS domain. Technology stack: managed complex data migration projects involving technologies such as cloud (AWS, GCP, or compatible), microservices, various DB solutions (Oracle, MySQL, Couchbase, Elastic DB, Camunda, etc.), data streaming technologies (such as Kafka), and tools associated with the technology stack. Excellent knowledge of project management methodology and software development life cycles, including Agile, with excellent client-facing and internal communication skills. Ability to plan, organize, prioritize, and deliver multiple projects simultaneously. In-depth knowledge and understanding of Telecom BSS business needs, with the ability to establish and maintain a high level of customer trust and confidence, plus solid organizational skills including attention to detail and multitasking.
Good understanding of the challenges associated with the BSS business and of high-level modules (CRM, Order Management, Revenue Management, and Billing services). Excellent verbal, written, and presentation skills to effectively communicate complex technical and business issues (and solutions) to diverse audiences. Strong analytical, planning, and organizational skills with an ability to manage competing demands. Always curious, with a passion to learn continuously in a fast-moving environment. Strong working knowledge of Microsoft Office, Confluence, JIRA, etc. Good to have: Project Management Professional (PMP) / Certified Scrum Master certification. Good to have: knowledge of external solutions integrated with ETL software, billing, and warehouse/supply-chain-related migration projects.

Key job responsibilities:
Manage and streamline program planning by evaluating incoming project demand across multiple channels against available capacity. Regularly define and review KPIs, proactively seeking out new and improved mechanisms for visibility to ensure your program stays aligned with organization objectives. Develop and maintain Kanban boards and workstream dashboards. Work with stakeholders during the entire life cycle of the program: execute project requirements, prepare detailed project plans, identify risks, manage vendors and vendor resources, measure program metrics, and take corrective and preventive actions. The ability to adopt Agile best practices (such as estimation techniques) and to define and optimize processes is essential. Coordinate with the Product Management team to plan features and stories into sprints, understand business priorities, and align required stakeholders to make sure the team is able to deliver the expected outcome. Manage technology improvements and other enhancements from conceptualization to delivery, with a deep understanding of their impact and pros/cons, working through required detail and collaborating with all stakeholders until they are successfully deployed in production. Manage and deliver planned RBSS releases by working with customers. Work with Scrum Masters, plan Scrum capacity, and manage the productivity of the teams, monitoring the progress of the software developed by the Scrum teams and the quality of the deliverables. Work with engineering and product teams to scope product delivery, define solution strategies, and understand development alternatives, as well as provide support; ensure availability to the team to answer questions and deliver direction. Work across multiple teams and vendors (cross-cutting across programs, business/engineering teams, and/or technologies) to drive delivery strategy and dependency management, ensuring active delivery and proactive communications. Forecast and manage infrastructure and resourcing demand against the operational growth of the platform in collaboration with engineering teams. Deliver Agile projects that offer outstanding business value to users. Support stakeholders in implementing an effective project governance system.

"Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well.
In recruiting for our team, we welcome the unique contributions that you can bring in terms of education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran status, color, religion, disability, sexual orientation, and beliefs."

Posted 1 day ago

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion and a global workforce of 234,054, is listed on NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, in cities including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

· Job Title: Java Full Stack with React.js
· Location: Hyderabad
· Experience: 6+ Years
· Job Type: Contract to hire
· Notice Period: Immediate joiners
· Payroll: People Prime
· Client: MNC Client

Mandatory Skills: Java, React, AWS, cloud, Kafka, SQL DB
Skill set (with hands-on experience): Java full stack with React; AWS and GCP cloud; Kafka; SQL DB

Posted 1 day ago

0.0 - 2.0 years

0 Lacs

Mohali, Punjab

On-site


Zoptal Solutions Pvt. Ltd. is a technology-driven company building innovative digital solutions across web, mobile, and AI platforms. As part of our growth journey, we're seeking an experienced and passionate AI Developer to join our core tech team and work on cutting-edge AI-powered products.

Key Responsibilities: Develop, fine-tune, and deploy AI/ML models with a focus on NLP and Generative AI. Design and implement Retrieval-Augmented Generation (RAG) pipelines using tools like Hugging Face. Build robust, scalable, and high-performance AI solutions using Python. Integrate and manage vector databases like Astra DB, and work with PostgreSQL for structured data. Collaborate with product managers and developers to define AI project requirements. Continuously improve model performance and accuracy through experimentation and optimization. Stay up to date with the latest research and advancements in AI/ML technologies.

Required Skills and Qualifications: Strong programming skills in Python. Experience working with Hugging Face Transformers and Datasets. Understanding of RAG pipelines and retrieval-based systems. Proficiency in vector databases (e.g., Astra DB, FAISS) and PostgreSQL. Good understanding of machine learning fundamentals and model evaluation techniques. Ability to write clean, maintainable, and well-documented code.

Preferred Skills: Experience with LangChain, LLMs, or OpenAI APIs. Background in building production-ready AI/NLP solutions. Knowledge of cloud platforms (AWS, GCP, Azure) and deployment tools.

Experience required: 2 years or above
Location: 8B, Mohali
Work Type: Work from office, 5 days working
How to Apply: Send your updated resume to our email address with the subject "Applying for AI Developer".

Best Regards,
Kavita Rai
HR Manager

Job Type: Full-time
Pay: ₹25,000.00 - ₹60,000.00 per month
Location Type: In-person
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Mohali, Punjab: Reliably commute or planning to relocate before starting work (Required)
Experience: AI Development: 2 years (Required); Hugging Face: 2 years (Required); Astra DB / vector DB / PostgreSQL: 2 years (Required); RAG Pipeline: 2 years (Required)
Location: Mohali, Punjab (Preferred)
Work Location: In person
Speak with the employer: +91 9518295576
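
For context on the RAG pipelines mentioned above, here is a minimal retrieval sketch using a Hugging Face sentence-transformer and a FAISS index; the corpus, model choice, and query are illustrative assumptions, and a production system would store embeddings in a vector database such as Astra DB and pass the retrieved passages to an LLM for answer generation:

```python
import faiss
from sentence_transformers import SentenceTransformer

# Toy corpus purely for illustration.
docs = [
    "Our API rate limit is 100 requests per minute.",
    "Refunds are processed within 5 business days.",
    "The mobile app supports Android 10 and above.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small Hugging Face embedding model (assumed choice)
doc_vectors = model.encode(docs, normalize_embeddings=True)

index = faiss.IndexFlatIP(doc_vectors.shape[1])  # inner product == cosine on normalized vectors
index.add(doc_vectors)

query = "How long do refunds take?"
query_vec = model.encode([query], normalize_embeddings=True)
scores, ids = index.search(query_vec, 2)

retrieved = [docs[i] for i in ids[0]]
print(retrieved)  # passages that would be placed in the LLM prompt as context
```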

Posted 1 day ago

3.0 years

0 Lacs

India

Remote


Title: Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory. Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable. Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.

Tech Stack: Azure | Databricks | PySpark | SQL

What We're Looking For
3+ years of experience in data engineering or analytics engineering. Hands-on experience with cloud data platforms and large-scale data processing. Strong problem-solving mindset and a passion for clean, efficient data design.

Job Description: Minimum 3 years of experience in modern data engineering / data warehousing / data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc.; Azure experience is preferred over other cloud platforms. 5 years of proven experience with SQL, schema design, and dimensional data modelling. Solid knowledge of data warehouse best practices, development standards, and methodologies. Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc. Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL. Be an independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment. Excellent communication and teamwork abilities.

Nice-to-Have Skills: Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB knowledge. SAP ECC / S/4 and HANA knowledge. Intermediate knowledge of Power BI. Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
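
As a small illustration of the pipeline work described above, the following PySpark sketch reads a raw dataset, cleans it, and writes a curated aggregate; the paths, schema, and column names are assumptions, and on the stack above this would typically run as a Databricks or ADF-orchestrated job rather than a local script:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

# Assumed raw-zone location and schema (order_id, order_ts, country, amount).
orders = spark.read.parquet("/data/raw/orders")

cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)

daily_revenue = (
    cleaned.groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Curated output partitioned by date for downstream BI / reporting.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_revenue")
```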

Posted 1 day ago

4.0 years

0 Lacs

Kochi, Kerala

On-site


Job Summary: We are looking for a dynamic and results-driven Cloud & Networking Sales Executive to join our high-performance sales team. The ideal candidate will have a strong foundation in cloud and IT solution sales , exceptional business development capabilities , and a proven track record of meeting and exceeding revenue targets . This is a strategic role focused on generating leads, closing deals, and building long-term enterprise relationships across global markets. Key Responsibilities: Develop and execute effective sales strategies to drive growth in Cloud, Networking, Hosting, and Tech Support services. Identify, qualify, and convert new business opportunities through market research, networking, and direct outreach. Build and nurture strategic client relationships with enterprise decision-makers to create long-term partnerships. Consistently meet or exceed sales targets and KPIs by maintaining a healthy pipeline and strong closing skills. Lead international sales efforts , expanding into new regions and sectors. Collaborate with technical teams to present tailored solutions that address client needs. Provide market feedback to product and leadership teams to refine offerings and drive innovation. What We’re Looking For: ✅ Minimum 4 years of IT sales experience , with a strong focus on cloud and networking solutions . ✅ Demonstrated success in enterprise-level sales , including CXO-level interactions. ✅ Strong understanding of cloud platforms (AWS, Azure, GCP), networking solutions, and managed services. ✅ Expertise in lead generation , consultative selling, and closing enterprise deals. ✅ Excellent communication, negotiation, and presentation skills. ✅ Self-motivated, target-driven, and able to work independently in a fast-paced environment. ✅ Experience in international sales is highly desirable. Job Type: Full-time Pay: ₹30,000.00 - ₹100,000.00 per month Benefits: Provident Fund Ability to commute/relocate: Ernakulam, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Work Location: In person

Posted 1 day ago

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description "Architect to lead our cloud infrastructure and automation initiatives on Google Cloud Platform (GCP). This pivotal role will be responsible for designing, implementing, and maintaining a robust, secure, and scalable platform that empowers our development teams to deliver high-quality software efficiently. You will be instrumental in driving our DevOps and DevSecOps practices, ensuring a seamless and secure software delivery lifecycle. Responsibilities: Platform Architecture and Design: Define and document the target state architecture for our GCP platform, considering scalability, reliability, security, and cost-effectiveness. Design and implement infrastructure-as-code (IaC) solutions using tools like Terraform or Cloud Deployment Manager. Architect and implement CI/CD pipelines leveraging GCP services and industry best practices. Develop and maintain platform standards, policies, and guidelines. Evaluate and recommend new GCP services and technologies to enhance our platform capabilities. DevOps Leadership and Implementation: Champion and drive the adoption of DevOps principles and practices across development, operations, and security teams. Establish and optimize automated build, test, and deployment processes. Implement robust monitoring, logging, and alerting solutions to ensure platform health and performance. Foster a culture of collaboration, automation, and continuous improvement. DevSecOps Integration: Lead the integration of security practices throughout the software development lifecycle (SDLC). Define and implement security controls within the infrastructure and CI/CD pipelines. Automate security testing and vulnerability management processes. Ensure compliance with relevant security standards and regulations. GCP Infrastructure Build and Management: Lead the provisioning and management of GCP resources, including compute, storage, networking, and databases. Optimize infrastructure for performance, availability, and cost efficiency. Implement disaster recovery and business continuity plans on GCP. Troubleshoot and resolve complex platform and infrastructure issues. Collaboration and Communication: Collaborate effectively with development teams, security engineers, and other stakeholders to understand their needs and provide platform solutions. Communicate technical concepts and solutions clearly and effectively to both technical and non-technical audiences. Provide guidance and mentorship to junior team members. Participate in architectural reviews and provide constructive feedback." Show more Show less

Posted 1 day ago

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About Us: Atyeti is a US-based IT services provider with global offices in Singapore, Malaysia, Hong Kong, the Philippines, Raleigh (North Carolina), and India (Pune, Hyderabad, Bangalore, Chennai, and Trivandrum). We work for enterprise clients such as JP Morgan, Credit Suisse, HSBC, Citi, BOA, Dun & Bradstreet, McKinsey, BlackRock, and FactSet, to name a few. We are also an implementation partner of HashiCorp, Finastra, GCP, and Snowflake. Please visit www.atyeti.com.

Role: Automation Engineer
Location: Mumbai
Experience: 4+ Years
Mode: Full-Time

We are looking for an experienced Automation Engineer to join our automation development team. The ideal candidate will have a strong background in designing and building end-to-end automation solutions using UiPath and Microsoft Power Platform tools. The role involves working closely with business stakeholders to assess, design, develop, and deliver scalable automation solutions that enhance operational efficiency and business value.

Responsibilities:
Perform independent feasibility assessments to determine automation potential. Collaborate with business stakeholders to analyse existing processes and identify automation opportunities. Conduct requirement-gathering sessions through workshops, interviews, and walkthroughs. Create clear documentation such as Process Design Documents (PDD), user stories, and process maps. Design automation solutions aligned with business goals and prepare detailed Solution Design Documents. Develop and maintain automation workflows using UiPath and Power Platform. Ensure code quality by following development standards and conducting peer reviews. Create and execute test plans, including unit testing, integration testing, and UAT. Participate in continuous improvement efforts by identifying process optimization opportunities. Coordinate with IT and infrastructure teams to manage environment setup and deployment activities. Ensure timely delivery of assigned tasks and compliance with organizational standards and policies. Explore and propose the integration of cognitive elements like OCR, AI/ML, and image recognition into automation solutions.

Required Skills:
3+ years of hands-on experience in UiPath RPA development. Strong experience with Microsoft Power Platform (Power Automate, Power Apps). UiPath certification (Advanced RPA Developer or Business Analyst) preferred. Proven experience in end-to-end automation delivery, including requirement analysis, design, development, and testing. Strong understanding of SDLC and Agile methodologies (Scrum/Kanban). Excellent communication, analytical thinking, and stakeholder management skills. Proficiency in MS Office tools.

Desired Skills:
Experience with Python scripting, Azure AI, Azure Apps, or VBA. Exposure to cognitive technologies like OCR, image recognition, and AI/ML integration. Familiarity with project management and collaboration tools like JIRA, Confluence, ServiceNow, and Azure DevOps. Prior experience working with IT operations or support teams is a plus.

Posted 1 day ago

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins. Why join Coupa? 🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend. 🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence. 🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other. Learn more on Life at Coupa blog and hear from our employees about their experiences working at Coupa. The Impact of a Lead Software Engineer – Data to Coupa: The Lead Data Engineer plays a critical role in shaping Coupa’s data infrastructure, driving the design and implementation of scalable, high-performance data solutions. Collaborating with teams across engineering, data science, and product, this role ensures the integrity, security, and efficiency of our data systems. Beyond technical execution, the Lead Data Engineer provides mentorship and defines best practices, supporting a culture of excellence. Their expertise will directly support Coupa’s ability to deliver innovative, data-driven solutions, enabling business growth and reinforcing our leadership in cloud-based spend management. What You’ll Do: Lead and drive the development and optimization of scalable data architectures and pipelines. Design and implement best-in-class ETL/ELT solutions for real-time and batch data processing. Optimize data analysis and computation for performance, reliability, and cost efficiency, implementing monitoring solutions to identify bottlenecks. Architect and maintain cloud-based data infrastructure leveraging AWS, Azure, or GCP services. Ensure data security and governance, enforcing compliance with industry standards and regulations. Develop and promote best practices for data modeling, processing, and analytics.Mentor and guide a team of data engineers, fostering a culture of innovation and technical excellence Collaborate with stakeholders, including Product, Engineering, and Data Science teams, to support data-driven decision-making Automate and streamline data ingestion, transformation, and analytics processes to enhance efficiency. Develop real-time and batch data processing solutions, integrating structured and unstructured data sources What you will bring to Coupa: We are looking for a candidate with 10+ years of experience in Data Engineering and Application development with at least 3+ years in a Technical Lead role. They must have a graduate degree in Computer Science or a related field of study. They must have experience with programming languages such as Python and Java. Expertise in Python is a must Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases Expertise in processing and analyzing large data workloads. 
Experience in designing and implementing scalable data warehouse solutions to support analytical and reporting needs. Experience with API development and design with REST or GraphQL. Experience building and optimizing 'big data' data pipelines, architectures, and data sets. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores. Strong project management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment. Experience with big data tools: Spark, Kafka, etc. Experience with relational SQL and NoSQL databases. Experience with data pipeline and workflow management tools. Experience with AWS cloud services.

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees. Please be advised that inquiries or resumes from recruiters will not be accepted.

By submitting your application, you acknowledge that you have read Coupa's Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
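
Given the emphasis above on Kafka, message queuing, and stream processing alongside batch ETL, here is a minimal consumer sketch using the kafka-python client; the broker address, topic, and message schema are assumptions for illustration only, not Coupa's actual pipeline:

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # assumed topic name
    bootstrap_servers="localhost:9092",         # assumed broker address
    group_id="revenue-aggregator",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

running_total = 0.0
for message in consumer:  # blocks, yielding one record at a time
    event = message.value
    running_total += float(event.get("amount", 0))
    print(f"partition={message.partition} offset={message.offset} total={running_total:.2f}")
```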

Posted 1 day ago

0.0 - 4.0 years

0 Lacs

Raipur, Chhattisgarh

On-site


Job Title: Backend Developer Experience: 3–4 Years Location: Raipur (On-site / Hybrid as applicable) Employment Type: Full-Time Job Summary: We are seeking a skilled and motivated Backend Developer with 3–4 years of hands-on experience in designing, developing, and maintaining robust backend systems. The ideal candidate will work closely with our frontend, DevOps, and product teams to build scalable and high-performance applications. Key Responsibilities: Develop and maintain secure, scalable backend services and APIs. Design database schemas, write optimized queries, and ensure data integrity. Integrate with third-party services and external APIs. Collaborate with frontend and DevOps teams to ensure seamless deployments. Troubleshoot and debug backend issues and implement effective solutions. Participate in code reviews, sprint planning, and architecture discussions. Write clean, maintainable, and well-documented code. Ensure best practices in security, performance, and scalability. Required Skills & Qualifications: 3–4 years of professional experience in backend development. Strong proficiency in one or more backend languages/frameworks: Node.js / Express.js Python / Django / Flask Java / Spring Boot Experience with RESTful APIs and microservice architecture. Strong understanding of relational and NoSQL databases (e.g., PostgreSQL, MongoDB). Experience with version control systems (Git). Familiarity with CI/CD pipelines and cloud platforms (AWS/GCP/Azure) is a plus. Solid understanding of software engineering principles and design patterns. Education: Bachelor’s or Master’s degree in Computer Science, IT, or a related field. Salary: As per industry standards and experience. Job Types: Full-time, Permanent Benefits: Paid sick time Location Type: In-person Schedule: Day shift Morning shift Education: Bachelor's (Required) Language: Hindi, English (Required) Location: Raipur, Chhattisgarh (Required) Work Location: In person
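
Since the posting lists Flask among the acceptable backend frameworks, a minimal sketch of the kind of REST endpoint described above might look like this; the resource model and in-memory store are illustrative stand-ins for a real PostgreSQL or MongoDB backend:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
tasks = {1: {"id": 1, "title": "Set up CI pipeline", "done": False}}  # in-memory stand-in for a DB


@app.get("/api/tasks")
def list_tasks():
    # Return all tasks as JSON.
    return jsonify(list(tasks.values()))


@app.post("/api/tasks")
def create_task():
    # Create a task from the JSON request body and return it with a 201 status.
    payload = request.get_json(force=True)
    task_id = max(tasks, default=0) + 1
    tasks[task_id] = {"id": task_id, "title": payload["title"], "done": False}
    return jsonify(tasks[task_id]), 201


if __name__ == "__main__":
    app.run(debug=True)
```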

Posted 1 day ago

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Overview
DevBytes is your ultimate go-to app for the latest content from the tech, startup, and dev worlds. With just a tap, you can dive into the freshest trends in AI, ML, cloud, AR/VR, cybersecurity, NLP, data science, DevOps, and everything coding. It is your one-stop platform for tech news on the fly, delivering trending updates daily from industry giants like Google, OpenAI, Apple, Meta, Amazon, X, Netflix, Tesla, Microsoft, SpaceX, and beyond. Stay in the loop on the stories that matter to you.

Role and Responsibilities
Design, develop, and optimize machine learning algorithms and models. Collaborate with cross-functional teams to integrate AI solutions into existing systems. Analyze large datasets to extract actionable insights and identify patterns. Research and implement state-of-the-art AI techniques and frameworks. Develop APIs or services to deploy models for real-world applications. Monitor and maintain the performance and accuracy of AI models in production. Ensure AI systems adhere to ethical guidelines, fairness, and privacy standards.

Qualifications
Completed a Bachelor's/Master's in Computer Science or IT, with 2+ years of development experience. Experience with data preprocessing, feature engineering, and model evaluation. Solid understanding of probability, statistics, and data analysis. Hands-on experience with Python. Proven ability to write clean, efficient, and modular code. Deep understanding of machine learning fundamentals. Familiarity with cloud platforms (AWS, Azure, GCP) and containerization tools (Docker, Kubernetes).
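
As a generic illustration of the preprocessing, training, and model-evaluation workflow referenced above (not DevBytes' actual models or data), a scikit-learn sketch on a bundled toy dataset:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Bundled toy dataset used purely for illustration.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Preprocessing (scaling) and the model are chained so the same steps apply at inference time.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("accuracy:", round(accuracy_score(y_test, preds), 3))
print("f1:", round(f1_score(y_test, preds), 3))
```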

Posted 1 day ago

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greeting from Infosys BPM Ltd., We are hiring for Content and Technical writer, ETL DB Testing, ETL Testing Automation, .NET, Python Developer skills. Please walk-in for interview on 18th & 19th June 2025 at Chennai location Note: Please carry copy of this email to the venue and make sure you register your application before attending the walk-in. Please use below link to apply and register your application. Please mention Candidate ID on top of the Resume *** https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140 Interview details Interview Date: 18th & 19th June 2025 Interview Time: 10 AM till 1 PM Interview Venue:TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, TamilNadu Please find below Job Description for your reference: Work from Office*** Rotational Shifts Min 2 years of experience on project is mandate*** Job Description: Content and Technical writer Develop high-quality technical documents, including user manuals, guides, and release notes. Collaborate with cross-functional teams to gather requirements and create accurate documentation. Conduct functional testing and manual testing to ensure compliance with FDA regulations. Ensure adherence to ISO standards and maintain a clean, organized document management system. Strong understanding of Infra domain Technical writer that can convert complex technical concepts into easy to consume documents for the targeted audience. In addition, will also be a mentor to the team with technical writing. Job Description: ETL DB Testing Strong experience in ETL testing, data warehousing, and business intelligence. Strong proficiency in SQL. Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory). Solid understanding of Data Warehousing concepts, Database Systems and Quality Assurance. Experience with test planning, test case development, and test execution. Experience writing complex SQL Queries and using SQL tools is a must, exposure to various data analytical functions. Familiarity with defect tracking tools (e.g., Jira). Experience with cloud platforms like AWS, Azure, or GCP is a plus. Experience with Python or other scripting languages for test automation is a plus. Experience with data quality tools is a plus. Experience in testing of large datasets. Experience in agile development is must Understanding of Oracle Database and UNIX/VMC systems is a must Job Description: ETL Testing Automation Strong experience in ETL testing and automation. Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server). Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark). Hands-on experience in developing and maintaining test automation frameworks. Proficiency in at least one programming language (e.g., Python, Java). Experience with test automation tools (e.g., Selenium, PyTest, JUnit). Strong understanding of data warehousing concepts and methodologies. Experience with CI/CD pipelines and version control systems (e.g., Git). Experience with cloud-based data warehouses like Snowflake, Redshift, BigQuery is a plus. Experience with data quality tools is a plus. 
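To make the ETL reconciliation idea in the JDs above concrete, here is a self-contained pytest sketch using SQLite purely for illustration; a real test would run equivalent row-count and aggregate checks against the actual source system and warehouse (Oracle, Snowflake, etc.) rather than an in-memory database:

```python
import sqlite3


def test_row_counts_and_totals_match_after_load():
    # In-memory stand-ins for a source table and its loaded warehouse counterpart.
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE src_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5);
        INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5);
        """
    )

    # Row-count reconciliation between source and target.
    src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt_count = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
    assert src_count == tgt_count

    # Column-level reconciliation: the total amount must survive the load unchanged.
    src_sum = conn.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
    tgt_sum = conn.execute("SELECT SUM(amount) FROM dw_orders").fetchone()[0]
    assert abs(src_sum - tgt_sum) < 1e-6
```
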
Job Description: .Net Should have worked on .Net development/implementation/Support project Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, Azure Must have experience in Web services, Web API, REST services, HTML, CSS3 Understand Architecture Requirements and ensure effective Design, Development, Validation and Support activities. REGISTRATION PROCESS: The Candidate ID & SHL Test(AMCAT ID) is mandatory to attend the interview. Please follow the below instructions to successfully complete the registration. (Talents without registration & assessment will not be allowed for the Interview). Candidate ID Registration process: STEP 1: Visit: https://career.infosys.com/joblist STEP 2: Click on "Register" and provide the required details and submit. STEP 3: Once submitted, Your Candidate ID(100XXXXXXXX) will be generated. STEP 4: The candidate ID will be shared to the registered Email ID. SHL Test(AMCAT ID) Registration process: This assessment is proctored, and talent gets evaluated on Basic analytics, English Comprehension and writex (email writing). STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0 STEP 2: Click on "Start new test" and follow the instructions to complete the assessment. STEP 3: Once completed, please make a note of the AMCAT ID( Access you Amcat id by clicking 3 dots on top right corner of screen). NOTE: During registration, you'll be asked to provide the following information: Personal Details: Name, Email Address, Mobile Number, PAN number. Availability: Acknowledgement of work schedule preferences (Shifts, Work from Office, Rotational Weekends, 24/7 availability, Transport Boundary) and reason for career change. Employment Details: Current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example). Candidate Information: 10-digit candidate ID starting with 100XXXXXXX, Gender, Source (e.g., Vendor name, Naukri/LinkedIn/Found it, or Direct), and Location Interview Mode: Walk-in Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. 5 or above toggles, multi face detected, face not detected, or any malpractice will be considered rejected Once you've finished, submit the assessment and make a note of the AMCAT ID (15 Digit) used for the assessment. Documents to Carry: Please have a note of Candidate ID & AMCAT ID along with registered Email ID. 
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions. Please carry 2 sets of your updated Resume/CV (hard copy). Please carry original ID proof for security clearance. Please carry individual headphones/Bluetooth for the interview.

Pointers to note: An original Government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.

Posted 1 day ago

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greeting from Infosys BPM Ltd., Exclusive Women's Walkin drive We are hiring for Content and Technical writer, ETL DB Testing, ETL Testing Automation, .NET, Python Developer skills. Please walk-in for interview on 20th June 2025 at Chennai location Note: Please carry copy of this email to the venue and make sure you register your application before attending the walk-in. Please use below link to apply and register your application. Please mention Candidate ID on top of the Resume *** https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140 Interview details Interview Date: 20th June 2025 Interview Time: 10 AM till 1 PM Interview Venue:TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, TamilNadu Please find below Job Description for your reference: Work from Office*** Rotational Shifts Min 2 years of experience on project is mandate*** Job Description: Content and Technical writer Develop high-quality technical documents, including user manuals, guides, and release notes. Collaborate with cross-functional teams to gather requirements and create accurate documentation. Conduct functional testing and manual testing to ensure compliance with FDA regulations. Ensure adherence to ISO standards and maintain a clean, organized document management system. Strong understanding of Infra domain Technical writer that can convert complex technical concepts into easy to consume documents for the targeted audience. In addition, will also be a mentor to the team with technical writing. Job Description: ETL DB Testing Strong experience in ETL testing, data warehousing, and business intelligence. Strong proficiency in SQL. Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory). Solid understanding of Data Warehousing concepts, Database Systems and Quality Assurance. Experience with test planning, test case development, and test execution. Experience writing complex SQL Queries and using SQL tools is a must, exposure to various data analytical functions. Familiarity with defect tracking tools (e.g., Jira). Experience with cloud platforms like AWS, Azure, or GCP is a plus. Experience with Python or other scripting languages for test automation is a plus. Experience with data quality tools is a plus. Experience in testing of large datasets. Experience in agile development is must Understanding of Oracle Database and UNIX/VMC systems is a must Job Description: ETL Testing Automation Strong experience in ETL testing and automation. Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server). Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark). Hands-on experience in developing and maintaining test automation frameworks. Proficiency in at least one programming language (e.g., Python, Java). Experience with test automation tools (e.g., Selenium, PyTest, JUnit). Strong understanding of data warehousing concepts and methodologies. Experience with CI/CD pipelines and version control systems (e.g., Git). Experience with cloud-based data warehouses like Snowflake, Redshift, BigQuery is a plus. Experience with data quality tools is a plus. 
Job Description: .Net Should have worked on .Net development/implementation/Support project Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, Azure Must have experience in Web services, Web API, REST services, HTML, CSS3 Understand Architecture Requirements and ensure effective Design, Development, Validation and Support activities. REGISTRATION PROCESS: The Candidate ID & SHL Test(AMCAT ID) is mandatory to attend the interview. Please follow the below instructions to successfully complete the registration. (Talents without registration & assessment will not be allowed for the Interview). Candidate ID Registration process: STEP 1: Visit: https://career.infosys.com/joblist STEP 2: Click on "Register" and provide the required details and submit. STEP 3: Once submitted, Your Candidate ID(100XXXXXXXX) will be generated. STEP 4: The candidate ID will be shared to the registered Email ID. SHL Test(AMCAT ID) Registration process: This assessment is proctored, and talent gets evaluated on Basic analytics, English Comprehension and writex (email writing). STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0 STEP 2: Click on "Start new test" and follow the instructions to complete the assessment. STEP 3: Once completed, please make a note of the AMCAT ID( Access you Amcat id by clicking 3 dots on top right corner of screen). NOTE: During registration, you'll be asked to provide the following information: Personal Details: Name, Email Address, Mobile Number, PAN number. Availability: Acknowledgement of work schedule preferences (Shifts, Work from Office, Rotational Weekends, 24/7 availability, Transport Boundary) and reason for career change. Employment Details: Current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example). Candidate Information: 10-digit candidate ID starting with 100XXXXXXX, Gender, Source (e.g., Vendor name, Naukri/LinkedIn/Found it, or Direct), and Location Interview Mode: Walk-in Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. 5 or above toggles, multi face detected, face not detected, or any malpractice will be considered rejected Once you've finished, submit the assessment and make a note of the AMCAT ID (15 Digit) used for the assessment. Documents to Carry: Please have a note of Candidate ID & AMCAT ID along with registered Email ID. 
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions. Please carry 2 sets of your updated Resume/CV (hard copy). Please carry original ID proof for security clearance. Please carry individual headphones/Bluetooth for the interview.

Pointers to note: An original Government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.

Posted 1 day ago

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion and a global workforce of 234,054, is listed on NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, in cities including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: Java Full Stack with React
Location: Pan India (Hybrid)
Experience: 6+ Years
Job Type: Contract To Hire (C2H)
Notice Period: Immediate Joiners
Payroll: People Prime World Wide Pvt Ltd
Client: MNC Client
Mandatory Skills: Java Full Stack, Java, React.js, AWS / GCP

Job Title: Java Full Stack Developer (React)
Work Type: Contract to Hire (Long-Term Opportunity)
Work Mode: Hybrid
Location: Pan India
Shift Timing: 11:00 AM – 10:00 PM IST
Note: Must be flexible to attend evening client calls

Key Requirements:
Must-Have Skills: Java, Spring Boot, React.js, AWS or GCP (any one is mandatory), REST APIs, SQL/NoSQL, Git, CI/CD basics

Responsibilities:
Develop and maintain full stack web applications using Java and React. Work on cloud deployment (AWS or GCP). Collaborate with team members and clients in a hybrid setup. Ensure timely delivery and attend evening client meetings as required.

Experience: 6–8 years preferred
Employment Type: Contract to Hire (C2H)

Posted 1 day ago

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Full Stack Developer (MERN + Python)
Experience: 2–3 Years
Location: Madhapur, Hyderabad (Onsite)
Employment Type: Full-Time
Joining Requirement: Immediate Joiners Preferred
Working Days: 6 Days from Office
Company: Multiplier AI

About Us: Multiplier AI is a leading AI company in India, focusing on healthcare. We help top pharmaceutical companies like Abbott, Cipla, Sun Pharma, Glenmark, and Galderma optimize operations and enhance decision-making with AI-driven products.

Job Summary: We are looking for a skilled Full Stack Developer with 2–3 years of hands-on experience in the MERN stack (MongoDB, Express.js, React.js, Node.js) along with strong proficiency in Python. You will be responsible for building scalable web applications, developing RESTful APIs, and integrating backend logic with front-end functionality. This role is ideal for developers who are passionate about technology, eager to take ownership, and ready to contribute from day one.

Key Responsibilities: Design, develop, test, and deploy full-stack web applications using the MERN stack. Develop robust and scalable REST APIs using Node.js and Python. Integrate front-end UI with server-side logic using React.js and Express. Optimize applications for speed, scalability, and reliability. Collaborate with cross-functional teams including designers, product managers, and other developers. Write clean, maintainable, and well-documented code. Troubleshoot and debug application issues. Participate in code reviews and continuous improvement processes.

Requirements: 2–3 years of professional experience in full stack development. Strong proficiency in the MERN stack: MongoDB, Express.js, React.js, Node.js. Proficiency in Python (especially for backend scripting or automation). Experience with version control systems like Git. Solid understanding of RESTful API design and integration. Familiarity with deployment processes, cloud platforms (e.g., AWS, GCP), and CI/CD pipelines is a plus. Ability to work in a fast-paced, agile development environment. Excellent problem-solving and communication skills.

Nice to Have: Experience with front-end libraries like Redux or Tailwind CSS. Knowledge of microservices architecture. Familiarity with testing frameworks like Jest or Mocha. Exposure to AI/ML projects or data engineering pipelines.

What We Offer: Competitive salary and performance-based incentives. Opportunity to work on impactful, cutting-edge projects. A collaborative and innovation-driven work culture. Flexible work environment.

Ready to build the future with us? If you're a proactive developer looking for an exciting role where your contributions will matter from day one, and you're available to join immediately, we'd love to hear from you!

Posted 1 day ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site


🚀 We’re Hiring: Senior Full Stack Developer (ASP.NET & C#) – Kochi 🚀

Are you a code wizard who thrives on building world-class products? Ready to own your work, shape the future of fintech, and see your ideas in action? This is YOUR moment! Our client, a global leader in investment management tech, is on the hunt for a creative, driven, and passionate Full Stack Developer to join their powerhouse team in Kakkanad, Kochi.

✨ What’s in it for you?
Build the next-gen cloud platform trusted to manage BILLIONS in assets
Work with cutting-edge tech: Akka.NET, Docker, Kubernetes, and more
Collaborate with smart, supportive teammates who love to innovate
See your code go live—fast! (We do TDD, CI, and automated deployments)
Deep dive into the fascinating world of investment management

What you’ll bring:
5+ years’ experience with ASP.NET Web API & C#
Modern JavaScript framework skills (React, Angular, Vue, Aurelia, etc.)
Solid HTML5, CSS3, AJAX, REST API know-how
Passion for building secure, high-performance apps
A love for learning, sharing, and making an impact

Bonus points if you know: Cloud platforms (AWS/Azure/GCP), TypeScript, Webpack, DevExtreme/Kendo UI, or have an interest in investing!

💡 Ready to code your legacy? Drop your CV at m.neethu@ssconsult.in or DM me. Let’s build the future of fintech—together!

#FullStackDeveloper #KochiJobs #FintechCareers #ASPNet #CSharp #React #Angular #Vue #HiringNow

Posted 1 day ago

Apply

0.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh

On-site


Noida, Uttar Pradesh, India | Job ID 764288

Join our Team

About this opportunity:
Join Ericsson as an Oracle Database Administrator and play a key role in managing and optimizing our critical database infrastructure. As an Oracle DBA, you will be responsible for installing, configuring, upgrading, and maintaining Oracle databases, ensuring high availability, performance, and security. You’ll work closely with cross-functional teams to support business-critical applications, troubleshoot issues, and implement database upgrades and patches. This role offers a dynamic and collaborative environment where you can leverage your expertise to drive automation, improve efficiency, and contribute to innovative database solutions.

What you will do:
Oracle, PostgreSQL, MySQL, and/or MariaDB database administration in production environments.
Work with Container Databases (CDBs) and Pluggable Databases (PDBs) for better resource utilization and simplified management.
Configure high availability using Oracle Data Guard, PostgreSQL or MySQL replication, and/or MariaDB Galera clusters.
Administer Oracle Enterprise Manager, including alarm integration.
Use Linux tooling such as iotop, vmstat, nmap, OpenSSL, grep, ping, find, df, ssh, and dnf.
Familiarity with Oracle SQL Developer, Oracle Data Modeler, pgAdmin, Toad, phpMyAdmin, and MySQL Workbench is a plus.
Familiarity with NoSQL databases such as MongoDB is a plus.
Knowledge of middleware such as GoldenGate, for both Oracle-to-Oracle and Oracle-to-Big Data replication.
Conduct detailed performance analysis and fine-tuning of SQL queries and stored procedures (see the sketch after this posting for the kind of check involved).
Analyze AWR and ADDM reports to identify and resolve performance bottlenecks.
Implement and manage backup strategies using RMAN and other industry-standard tools.
Perform pre-patch validation using opatch and datapatch.
Test patches in a non-production environment to identify potential issues before applying them to production.
Apply Oracle quarterly patches and security updates.

The skills you bring:
Bachelor of Engineering or equivalent experience, with at least 2 to 3 years in the field of IT.
Experience handling operations in a customer service delivery organization.
Thorough understanding of the basic framework of Telecom/IT processes.
Willingness to work in a 24x7 operational environment with rotating shifts, including weekends and holidays, to support critical infrastructure and ensure minimal downtime.
Strong understanding of Linux systems and networking fundamentals.
Knowledge of cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes) is a plus.
Oracle Certified Professional (OCP) is preferred.

Why join Ericsson?
At Ericsson, you’ll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what’s possible. To build solutions never seen before to some of the world’s toughest problems. You’ll be challenged, but you won’t be alone. You’ll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
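As a hedged illustration of the performance-analysis work mentioned above, the sketch below pulls the top SQL statements by elapsed time using the python-oracledb client, the sort of quick check that feeds AWR/ADDM follow-up. The connection details, credentials, and 10-row cutoff are assumptions for the example only.

```python
# Illustrative sketch: top SQL by elapsed time via python-oracledb.
import oracledb

def top_sql_by_elapsed_time(dsn: str, user: str, password: str, limit: int = 10):
    query = """
        SELECT sql_id, executions, ROUND(elapsed_time / 1e6, 2) AS elapsed_s
        FROM   v$sql
        ORDER  BY elapsed_time DESC
        FETCH FIRST :limit ROWS ONLY
    """
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(query, limit=limit)
            return cur.fetchall()

if __name__ == "__main__":
    # Placeholder DSN and credentials; replace with real connection details.
    for sql_id, executions, elapsed_s in top_sql_by_elapsed_time(
        dsn="localhost/ORCLPDB1", user="system", password="example"
    ):
        print(sql_id, executions, elapsed_s)
```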

Posted 1 day ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana

On-site


Job Information:
Date Opened: 06/17/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500081

About DATAECONOMY:
We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description:
Job Title: Technical Project Manager
Location: Hyderabad
Employment Type: Full-time
Experience: 10+ years
Domain: Banking and Insurance

We are seeking a Technical Project Manager to lead and coordinate the delivery of data-centric projects. This role bridges the gap between engineering teams and business stakeholders, ensuring the successful execution of technical initiatives, particularly in data infrastructure, pipelines, analytics, and platform integration.

Responsibilities:
Lead end-to-end project management for data-driven initiatives, including planning, execution, delivery, and stakeholder communication.
Work closely with data engineers, analysts, and software developers to ensure technical accuracy and timely delivery of projects.
Translate business requirements into technical specifications and work plans.
Manage project timelines, risks, resources, and dependencies using Agile, Scrum, or Kanban methodologies.
Drive the development and maintenance of scalable ETL pipelines, data models, and data integration workflows.
Oversee code reviews and ensure adherence to data engineering best practices.
Provide hands-on support, when necessary, in Python-based development or debugging.
Collaborate with cross-functional teams including Product, Data Science, DevOps, and QA.
Track project metrics and prepare progress reports for stakeholders.

Required Qualifications:
Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or a related field.
10+ years of experience in project management or technical leadership roles.
Strong understanding of modern data architectures (e.g., data lakes, warehousing, streaming).
Experience working with cloud platforms like AWS, GCP, or Azure.
Familiarity with tools such as JIRA, Confluence, Git, and CI/CD pipelines.
Strong communication and stakeholder management skills.

Benefits:
Company standard benefits.

Posted 1 day ago

Apply

3.0 years

0 Lacs

New Delhi, Delhi, India

Remote


At AlgoSec, What you do matters!

Over 2,200 of the world’s leading organizations trust AlgoSec to help secure their most critical workloads across public cloud, private cloud, containers, and on-premises networks. Join our global team, securing application connectivity, anywhere.

We are hiring a QA Automation Developer to join our global team working in an agile environment.

Reporting to: Automation Team Leader
Location: Gurgaon, India (Hybrid/Remote)
Direct employment

Responsibilities:
Plan, write, and execute E2E automated tests for complex features using Java and Selenium (an illustrative sketch follows this posting).
Perform testing for AlgoSec's new SaaS product, working with multiple cloud vendors such as AWS, Azure, and GCP.
Run tests in a CI/CD environment.

Requirements:
BSc in Computer Science/Engineering.
At least 3 years of experience in object-oriented programming: Java.
At least 2 years of experience in developing complex automation tests using TestNG and RestAssured (Java).
Experience in manual QA testing (ability to write your own test before automation).
Experience working with at least one cloud provider (AWS/Azure/GCP).
Multitasking and problem-solving abilities, context switching, and "out-of-the-box" thinking.
Team player, pleasant, and with a high level of integrity.
Very organized, thorough, and devoted.
Bright, fast learner, independent.
Good written and verbal communication skills in English.

Advantages:
Experience in QA of network security software products.
Experience in developing complex automation tests using Selenium (Java).
Experience in testing SaaS applications.

AlgoSec is an Equal Opportunity Employer (EEO), committed to creating a friendly, inclusive environment that is a pleasure to work in, and where there is an unbiased acceptance of others. AlgoSec believes that diversity and an inclusive company culture are key drivers of creativity, innovation and performance. Furthermore, a diverse workforce and the maintenance of an atmosphere that welcomes versatile perspectives will enhance our ability to fulfill our vision.
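A minimal E2E-test sketch for flavor. The role's stated stack is Java with Selenium and TestNG; this example uses Python's Selenium bindings only to keep the code samples on this page in one language, and the URL, element locators, and expected title are all placeholders, not details from the posting.

```python
# Illustrative E2E login test (placeholder site and locators).
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_flow():
    driver = webdriver.Chrome()  # Selenium Manager resolves the driver binary
    try:
        driver.get("https://example.com/login")           # placeholder URL
        driver.find_element(By.ID, "username").send_keys("qa-user")
        driver.find_element(By.ID, "password").send_keys("not-a-real-password")
        driver.find_element(By.ID, "submit").click()
        assert "Dashboard" in driver.title                # assumed post-login title
    finally:
        driver.quit()

if __name__ == "__main__":
    test_login_flow()
    print("E2E sketch passed")
```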

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Cloud4C:
Cloud4C is the fastest growing Cloud Infrastructure and Managed Services Provider, addressing mission-critical workloads of 3,000+ enterprises, including 15 of the Global Fortune 100 companies, spread across 30+ countries. Cloud4C is a niche managed services provider, and the only one offering a single SLA up to the application layer, delivered through 18 Centers of Excellence. Cloud4C specializes in multi-cloud requirements, addressing the complex needs of large enterprises across hyperscale public cloud platforms (Azure, GCP, AWS & Alibaba) as well as private cloud environments. The company offers integrated cloud security services through 26 security tools and 40+ security controls to ensure data is protected and backed by industry compliances such as PCI DSS, GxP, HIPAA, IRAP, and MAS. We are currently present in the USA, Canada, Mexico, Colombia, Turkey, the UK, the Netherlands, Switzerland, Portugal, Japan, South Korea, Australia, New Zealand, South East Asia, Sri Lanka, the Middle East, and India. We work with leading technology companies in providing community clouds such as HANA Enterprise Cloud along with SAP, Banking Community Cloud along with Fidelity, and G-Cloud along with Infosys, to name a few.

Role: Assistant Manager / Manager – FP&A

Job Description:
We are searching for an experienced Financial Planning & Analysis (FP&A) Assistant Manager / Manager. An FP&A professional oversees all corporate projections and provides analysis of every technical, administrative, and significant impact. A strong applicant has a strong analytical approach, tactical awareness, and excellent interpersonal skills.

A) Financial Reporting
Prepare and review monthly, quarterly, and annual financial statements in compliance with GAAP/IFRS.
Cross-border financial consolidation experience is a must.
Ensure accuracy and timeliness of internal and external financial reports.
Coordinate with auditors for annual audits and address audit findings.
Maintain internal controls and compliance with regulatory requirements.
Oversee consolidation processes and intercompany transactions.
Develop and implement reporting systems and automation tools.

B) Financial Planning & Analysis (FP&A)
Lead budgeting, forecasting, and long-term financial planning activities.
Perform variance analysis (actual vs. budget/forecast) and explain key drivers.
Provide insightful financial analysis to support strategic initiatives and operational efficiency.
Partner with department heads to align financial planning with business goals.
Monitor KPIs and develop dashboards for management reporting.
Conduct scenario modeling and financial risk analysis.

Requirements:
A qualified CA and at least five years of experience in finance, accounting, or a related field.
A strong analytical toolkit that makes use of business intelligence and reporting software.
Capability to multitask and adapt to a constantly changing, quick environment.
Outstanding communication skills and relationship-building abilities.
Capability to lead projects throughout an organization.
Advanced computer program skills, such as the ability to write macros in Excel and other financial packages.

(ref: iimjobs.com)

Posted 1 day ago

Apply

5.0 years

0 Lacs

New Delhi, Delhi, India

On-site


At AlgoSec, What you do matters!

Over 2,200 of the world’s leading organizations trust AlgoSec to help secure their most critical workloads across public cloud, private cloud, containers, and on-premises networks. Join our global team, securing application connectivity, anywhere.

AlgoSec is seeking a Site Reliability Engineer for the SRE team in India.

Reporting to: Head of SRE
Location: Gurgaon, India
Direct employment

Responsibilities:
Ensure the reliability, scalability, and performance of the company's production environment, including a complex architecture with multiple servers, deployments, and various cloud technologies.
Collaborate with cross-functional teams, work independently, and prioritize effectively in a fast-paced environment.
Oversee and enhance monitoring capabilities for the production environment and ensure optimal performance and functionality across the technology stack (see the sketch after this posting for the flavor of such checks).
Support our 24/7 operations with flexibility, and participate in on-call rotations to ensure timely incident response and resolution.
Address and resolve unexpected service issues, while also creating and implementing tools and automation to proactively mitigate the likelihood of future problems.

Requirements:
Minimum 5 years of experience in an SRE/DevOps position for SaaS-based products.
Experience in managing mission-critical production environments.
Experience with version control tools like Git, Bitbucket, etc.
Experience in establishing CI/CD procedures with Jenkins.
Working knowledge of databases.
Experience in effectively managing AWS infrastructure, with proficiency across multiple AWS Cloud services including networking, EC2, VPC, EKS, ELB/NLB, API Gateway, Cognito, and more.
Experience with monitoring tools like Datadog, ELK, Prometheus, Grafana, etc.
Experience in understanding and managing Linux infrastructure.
Experience in Bash or Python.
Experience with IaC tools such as CloudFormation, CDK, or Terraform.
Experience in Kubernetes and container management.
Excellent written and verbal communication skills in English, allowing for effective and articulate correspondence.
Strong teamwork, a positive demeanor, and a high level of integrity.
Exceptional organizational abilities, thorough attention to detail, and a high level of commitment.
Sharp intellect, adeptness at picking up new information quickly, and strong self-motivation.

Advantages:
Additional cloud services knowledge (Azure, GCP, etc.).
Understanding of Java, Maven, and NodeJS based applications.
Experience in serverless architecture.

AlgoSec is an Equal Opportunity Employer (EEO), committed to creating a friendly, inclusive environment that is a pleasure to work in, and where there is an unbiased acceptance of others. AlgoSec believes that diversity and an inclusive company culture are key drivers of creativity, innovation and performance. Furthermore, a diverse workforce and the maintenance of an atmosphere that welcomes versatile perspectives will enhance our ability to fulfill our vision.
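A minimal monitoring-style sketch in Python with boto3, purely to illustrate the kind of AWS health check the responsibilities describe. It assumes default AWS credentials are configured; the region choice and the "flag anything not 'ok'" rule are assumptions for the example.

```python
# Illustrative sketch: list EC2 instances with failing status checks.
import boto3

def unhealthy_instances(region: str = "ap-south-1") -> list[str]:
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    paginator = ec2.get_paginator("describe_instance_status")
    for page in paginator.paginate(IncludeAllInstances=True):
        for status in page["InstanceStatuses"]:
            instance_ok = status["InstanceStatus"]["Status"] == "ok"
            system_ok = status["SystemStatus"]["Status"] == "ok"
            if not (instance_ok and system_ok):
                flagged.append(status["InstanceId"])
    return flagged

if __name__ == "__main__":
    print("Instances needing attention:", unhealthy_instances())
```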

Posted 1 day ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Title: AI Engineer
Location: Chennai, Tamil Nadu (5 days WFO)
Experience: Minimum 2 Years
Job Type: Full-time

Job Summary:
We are looking for a passionate and skilled AI Engineer to join our team in Chennai. The ideal candidate should have at least 2 years of hands-on experience in designing, developing, and deploying AI/ML models. You will work on innovative projects involving natural language processing (NLP), computer vision, recommendation systems, and more, contributing to real-world applications that drive business outcomes.

Key Responsibilities:
Design, develop, and deploy machine learning and deep learning models (see the sketch after this posting).
Analyze large datasets to extract actionable insights.
Collaborate with data engineers and software developers to integrate models into production systems.
Optimize algorithms for performance, scalability, and accuracy.
Research and experiment with the latest advancements in AI and ML.
Conduct model testing, validation, and fine-tuning.
Document processes, models, and results for internal and client use.

Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
2+ years of hands-on experience in AI/ML model development.
Proficiency in Python and ML libraries such as TensorFlow, PyTorch, scikit-learn, or Keras.
Experience with data preprocessing, feature engineering, and model evaluation.
Knowledge of NLP, computer vision, or time series modeling.
Familiarity with tools like Jupyter, Git, Docker, and cloud platforms (AWS/GCP/Azure).
Strong problem-solving and analytical skills.
Excellent communication and collaboration abilities.

Preferred Qualifications:
Experience with deploying models using Flask, FastAPI, or similar frameworks.
Exposure to MLOps tools and practices.
Contributions to open-source AI/ML projects or participation in Kaggle competitions.
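A minimal sketch of the train/evaluate loop this role describes, using scikit-learn's bundled digits dataset so it runs as-is. The model choice, hyperparameters, and train/test split ratio are illustrative assumptions, not requirements from the posting.

```python
# Illustrative model training and evaluation with scikit-learn.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small built-in dataset standing in for real project data.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, predictions):.3f}")
```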

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
Bachelor's degree in Computer Science or equivalent practical experience.
Experience in automating infrastructure provisioning, Developer Operations (DevOps), integration, or delivery.
Experience in networking, compute infrastructure (e.g., servers, databases, firewalls, load balancers) and architecting, developing, or maintaining cloud solutions in virtualized environments.
Experience in scripting with Terraform, and in Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or Site Reliability Engineering.

Preferred qualifications:
Certification in Cloud with experience in Kubernetes, Google Kubernetes Engine, or similar.
Experience with customer-facing migration, including service discovery, assessment, planning, execution, and operations.
Experience with IT security practices like identity and access management, data protection, encryption, and certificate and key management.
Experience with Google Cloud Platform (GCP) techniques like prompt engineering, dual encoders, and embedding vectors.
Experience in building prototypes or applications.
Experience in one or more of the following disciplines: software development, managing operating system environments (Linux or related), network design and deployment, databases, storage systems.

About The Job:
The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google’s global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.

Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
Provide domain expertise in cloud platforms and infrastructure to solve cloud platform challenges.
Work with customers to design and implement cloud-based technical architectures, migration approaches, and application optimizations that enable business objectives (a small scripting sketch follows this posting).
Be a technical advisor and perform troubleshooting to resolve technical challenges for customers.
Create and deliver best practice recommendations, tutorials, blog articles, and sample code.
Travel up to 30% in-region for meetings, technical reviews, and onsite delivery activities.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law.

If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
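Purely illustrative sketch of scripting against GCP from Python, the kind of small automation such consulting work involves. It assumes the google-cloud-storage client library is installed and Application Default Credentials are set up (for example via `gcloud auth application-default login`); the bucket summary shown is an assumption, not something the role description specifies.

```python
# Illustrative GCP scripting: summarize Cloud Storage buckets in the default project.
from google.cloud import storage

def summarize_buckets() -> None:
    client = storage.Client()  # picks up the project from Application Default Credentials
    for bucket in client.list_buckets():
        print(f"{bucket.name}: location={bucket.location}, class={bucket.storage_class}")

if __name__ == "__main__":
    summarize_buckets()
```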

Posted 1 day ago

Apply

Exploring GCP Jobs in India

The job market for Google Cloud Platform (GCP) professionals in India is growing rapidly as more companies move to cloud-based solutions. GCP offers a wide range of services and tools that help businesses manage their infrastructure, data, and applications in the cloud, which has created high demand for skilled professionals who can work with GCP effectively.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for GCP professionals in India varies based on experience and job role. Entry-level positions can expect a salary range of INR 5-8 lakhs per annum, while experienced professionals can earn anywhere from INR 12-25 lakhs per annum.

Career Path

Typically, a career in GCP progresses from a Junior Developer to a Senior Developer, then to a Tech Lead position. As professionals gain more experience and expertise in GCP, they can move into roles such as Cloud Architect, Cloud Consultant, or Cloud Engineer.

Related Skills

In addition to GCP, professionals in this field are often expected to have skills in:

  • Cloud computing concepts
  • Programming languages such as Python, Java, or Go
  • DevOps tools and practices
  • Networking and security concepts
  • Data analytics and machine learning
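To make the Python-plus-GCP pairing above concrete, here is a minimal, hedged sketch of publishing a message to Cloud Pub/Sub with the google-cloud-pubsub client library. The project and topic IDs are placeholders, and the topic is assumed to already exist.

```python
# Illustrative Pub/Sub publish with the google-cloud-pubsub client.
from google.cloud import pubsub_v1

def publish_event(project_id: str, topic_id: str, payload: str) -> str:
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    # publish() returns a future; result() blocks until the server acknowledges
    # the message and returns its message ID.
    future = publisher.publish(topic_path, payload.encode("utf-8"))
    return future.result()

if __name__ == "__main__":
    # "my-project" and "demo-topic" are placeholders.
    message_id = publish_event("my-project", "demo-topic", "hello from python")
    print("Published message", message_id)
```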

Interview Questions

  • What is Google Cloud Platform and its key services? (basic)
  • Explain the difference between Google Cloud Storage and Google Cloud Bigtable. (medium)
  • How would you optimize costs in Google Cloud Platform? (medium)
  • Describe a project where you implemented CI/CD pipelines in GCP. (advanced)
  • How does Google Cloud Pub/Sub work and when would you use it? (medium)
  • What is Cloud Spanner and how is it different from other database services in GCP? (advanced)
  • Explain the concept of IAM and how it is implemented in GCP. (medium)
  • How would you securely transfer data between different regions in GCP? (advanced)
  • What is Google Kubernetes Engine (GKE) and how does it simplify container management? (medium)
  • Describe a scenario where you used Google Cloud Functions in a project. (advanced)
  • How do you monitor performance and troubleshoot issues in GCP? (medium)
  • What is Google Cloud SQL and when would you choose it over other database options? (medium)
  • Explain the concept of VPC (Virtual Private Cloud) in GCP. (basic)
  • How do you ensure data security and compliance in GCP? (medium)
  • Describe a project where you integrated Google Cloud AI services. (advanced)
  • What is the difference between Google Cloud CDN and Google Cloud Load Balancing? (medium)
  • How do you handle disaster recovery and backups in GCP? (medium)
  • Explain the concept of auto-scaling in GCP and when it is useful. (medium)
  • How would you set up a multi-region deployment in GCP for high availability? (advanced)
  • Describe a project where you used Google Cloud Dataflow for data processing. (advanced)
  • What are the best practices for optimizing performance in Google Cloud Platform? (medium)
  • How do you manage access control and permissions in GCP? (medium)
  • Explain the concept of serverless computing and how it is implemented in GCP. (medium)
  • What is the difference between Google Cloud Identity and Access Management (IAM) and AWS IAM? (advanced)
  • How do you ensure data encryption at rest and in transit in GCP? (medium)
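Several of the questions above touch on serverless computing in GCP. As one hedged illustration, a minimal HTTP-triggered Cloud Function written against the Functions Framework might look like the sketch below; the function name and response shape are assumptions for the example only.

```python
# Illustrative HTTP Cloud Function using the Functions Framework.
# Run locally (illustrative): functions-framework --target hello_gcp
import functions_framework

@functions_framework.http
def hello_gcp(request):
    """Responds to an HTTP request; `request` is a flask.Request."""
    name = request.args.get("name", "world")
    return {"message": f"Hello, {name}!"}
```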

Closing Remark

As the demand for GCP professionals continues to rise in India, now is the perfect time to upskill and pursue a career in this field. By mastering GCP and related skills, you can unlock numerous opportunities and build a successful career in cloud computing. Prepare well, showcase your expertise confidently, and land your dream job in the thriving GCP job market in India.


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.



Featured Companies