
15,888 GCP Jobs - Page 29

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


DevOps Engineer L1

Role Overview: As a DevOps Engineer (L1), you will assist in automating processes, managing deployments, and maintaining the infrastructure. This role is ideal for someone with foundational knowledge of DevOps principles who is eager to grow in a fast-paced environment.

Key Responsibilities:
- Support and maintain CI/CD pipelines to streamline deployments.
- Monitor application performance and troubleshoot issues.
- Perform routine tasks such as server monitoring, log analysis, and backup management (see the sketch after this listing).
- Collaborate with development teams to ensure smooth releases.
- Maintain and optimize cloud infrastructure (e.g., AWS, Azure, or GCP).
- Ensure basic security measures, including firewall configuration and patch management.

Qualifications:
- Bachelor's degree in Computer Science, IT, or a related field.
- Experience with CI/CD tools like Jenkins, GitLab, or CircleCI.
- Basic knowledge of cloud platforms (AWS, Azure, or GCP).
- Familiarity with Linux/Unix systems and scripting languages (e.g., Bash, Python).
- Understanding of containerization (Docker) and orchestration tools (Kubernetes is a plus).
- Good problem-solving skills and a willingness to learn.
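Illustrative only (not part of the posting): a minimal Python sketch of the routine server-monitoring and log-analysis tasks this role describes. The log path and alert threshold are assumptions.

```python
#!/usr/bin/env python3
"""Minimal L1-style monitoring sketch: scan a log for errors and report
disk usage. Log path and threshold are hypothetical."""

import re
import shutil
from pathlib import Path

LOG_PATH = Path("/var/log/app/app.log")  # hypothetical application log
DISK_ALERT_THRESHOLD = 0.90              # alert when the disk is 90% full

def count_errors(log_path: Path) -> int:
    """Count lines that look like errors (simple pattern match)."""
    if not log_path.exists():
        return 0
    pattern = re.compile(r"\b(ERROR|CRITICAL)\b")
    with log_path.open() as fh:
        return sum(1 for line in fh if pattern.search(line))

def disk_usage_ratio(mount: str = "/") -> float:
    """Fraction of the filesystem in use."""
    usage = shutil.disk_usage(mount)
    return usage.used / usage.total

if __name__ == "__main__":
    print(f"errors in log: {count_errors(LOG_PATH)}")
    ratio = disk_usage_ratio()
    print(f"disk usage: {ratio:.0%}")
    if ratio > DISK_ALERT_THRESHOLD:
        print("ALERT: disk nearly full")
```

In practice a script like this would run from cron or a CI/CD job and push its output to a monitoring or alerting channel.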

Posted 1 day ago

Apply

6.0 - 10.0 years

10 - 16 Lacs

Bengaluru

Hybrid


Role & Responsibilities: Full Stack (Backend heavy), Java, React, Kafka

Key Responsibilities:
- Design and develop scalable backend services using Java (Spring Boot preferred).
- Build responsive frontend components using React.js.
- Integrate and manage real-time data pipelines using Apache Kafka (see the sketch after this listing).
- Collaborate with cross-functional teams to deliver high-performance, reliable systems.
- Write clean, maintainable, and testable code for both frontend and backend.
- Optimize application performance and troubleshoot production issues.

Requirements:
- Strong experience in Java backend development (Spring Boot, REST APIs).
- Solid knowledge of React.js and modern JavaScript/TypeScript.
- Hands-on experience with Apache Kafka for messaging and streaming.
- Familiarity with SQL/NoSQL databases (e.g., PostgreSQL, MongoDB).
- Experience with CI/CD, Git, and containerization tools (Docker; Kubernetes a plus).
- Ability to work in an Agile environment and write unit/integration tests.

Nice to Have:
- Experience with microservices architecture.
- Cloud platform exposure (AWS, GCP, or Azure).
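Illustrative only: the role is Java-focused, but the Kafka produce/consume pattern it centres on looks like this in a Python sketch (kafka-python library). The broker address and topic name are assumptions.

```python
"""Minimal sketch of the Kafka messaging pattern described above.
Assumes a local broker and a hypothetical 'orders' topic."""

import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"   # assumed local broker
TOPIC = "orders"            # hypothetical topic

# Producer: serialize events as JSON and publish them.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 1, "status": "created"})
producer.flush()

# Consumer: read from the beginning of the topic and process each event.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop when idle so the sketch terminates
)
for message in consumer:
    print(message.value)
```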

Posted 1 day ago

Apply

14.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Department: Technology
Location: Pune

Description

Are you passionate about building test automation that accelerates product excellence? Do you believe that smart QA practices empower developers and elevate user experiences? Join Scan-IT as a Software Testing Manager!

We're seeking a detail-oriented and forward-thinking Software Testing Manager to lead our QA efforts with a strong focus on test automation, especially using tools like Testim.io. This is a unique opportunity to scale a robust quality engineering culture across our global software teams.

We're a technology company with global reach, active in 35+ countries across 3 continents. From Barcelona to Singapore, our digital solutions support the logistics networks that keep the world moving. Backed by a strong financial foundation and a culture built on trust, innovation, and opportunity, we offer the stability of a well-established business with the energy of a growing international tech team.

Bring your leadership, strategy, and hands-on experience, and help us raise the bar for quality across all touchpoints.

What You'll Do:
- Own QA Strategy: Define and evolve the company-wide testing and QA automation strategy.
- Lead Automation Implementation: Drive the adoption and optimization of automation tools, especially Testim.io, across web and interface testing pipelines (see the sketch after this listing).
- Build and Mentor QA Teams: Grow and mentor a global team of 25+ QA engineers, instilling strong testing practices and a quality-first mindset.
- Ensure High Coverage: Define test plans and manage execution across integration, regression, and performance testing.
- Collaborate Cross-Functionally: Partner with DevOps, Engineering, and Product teams to ensure test coverage and quality gates are built into the CI/CD pipeline.
- Champion Tools & Standards: Promote scalable test frameworks, reusable components, and automated scripts.
- Monitor and Report: Analyze test metrics, identify gaps, and continuously improve QA processes.
- Documentation & Training: Maintain comprehensive documentation using tools like Document360 and deliver internal training on test methodologies and tooling.

What You'll Need:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 14+ years of professional experience in software quality assurance or engineering.
- 8+ years of experience leading QA teams or managing automation initiatives.
- Deep knowledge of automation tools; hands-on experience with Testim.io is required.
- Familiarity with scripting languages like JavaScript or Python for custom test scenarios.
- Understanding of testing strategies across APIs, microservices, and UI.
- Experience with CI/CD tools like Jenkins, GitHub Actions, or GitLab CI.
- Familiarity with Agile development and project management tools (e.g., JIRA, Confluence).
- Strong analytical mindset, problem-solving skills, and effective communication abilities.
- Experience with cloud platforms (AWS, Azure, or GCP) is a plus.

Here's What We Offer:
At Scan-IT, we pride ourselves on our vibrant and supportive culture. Join our dynamic, international team and take on meaningful responsibilities from day one.
- Innovative Environment: Explore new technologies in the transportation and logistics industry.
- Collaborative Culture: Work with some of the industry's best in an open and creative environment.
- Professional Growth: Benefit from continuous learning, mentorship, and career advancement.
- Impactful Work: Enhance efficiency and drive global success.
- Inclusive Workplace: Enjoy hybrid work opportunities and a supportive, diverse culture.
- Competitive Compensation: Receive a salary that reflects your expertise.
- Growth Opportunities: Achieve your full potential with ample professional and personal development opportunities.

Join Scan-IT and be part of a team that's shaping the future of the transportation and logistics industry. Visit www.scan-it.com.sg and follow us on LinkedIn, Facebook and X.
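Illustrative only: Testim.io is the tool named in the posting; since its specifics are not given here, this generic pytest + requests sketch just shows the kind of automated API regression check such a pipeline runs. The endpoint URL and expected fields are assumptions.

```python
"""Minimal sketch of automated API regression tests of the kind this
role oversees. Service URL and response fields are hypothetical."""

import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def test_health_endpoint_returns_ok():
    resp = requests.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200

def test_user_payload_has_required_fields():
    resp = requests.get(f"{BASE_URL}/users/1", timeout=5)
    assert resp.status_code == 200
    body = resp.json()
    # Contract check: the fields downstream consumers depend on.
    for field in ("id", "name", "email"):
        assert field in body
```

Run with `pytest test_api.py`; in a CI/CD quality gate, a non-zero exit fails the build.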

Posted 1 day ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office


We are seeking a highly skilled Senior Snowflake Developer with expertise in Python, SQL, and ETL tools to join our dynamic team. The ideal candidate will have a proven track record of designing and implementing robust data solutions on the Snowflake platform, along with strong programming skills and experience with ETL processes.

Key Responsibilities:
- Designing and developing scalable data solutions on the Snowflake platform to support business needs and analytics requirements.
- Leading the end-to-end development lifecycle of data pipelines, including data ingestion, transformation, and loading processes.
- Writing efficient SQL queries and stored procedures to perform complex data manipulations and transformations within Snowflake (see the sketch after this listing).
- Implementing automation scripts and tools using Python to streamline data workflows and improve efficiency.
- Collaborating with cross-functional teams to gather requirements, design data models, and deliver high-quality solutions.
- Performance tuning and optimization of Snowflake databases and queries to ensure optimal performance and scalability.
- Implementing best practices for data governance, security, and compliance within Snowflake environments.
- Mentoring junior team members and providing technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience working with the Snowflake data warehouse.
- Strong proficiency in SQL with the ability to write complex queries and optimize performance.
- Extensive experience developing data pipelines and ETL processes using Python and ETL tools such as Apache Airflow, Informatica, or Talend.
- Strong Python coding experience (minimum 2 years).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving and analytical skills with keen attention to detail.
- Strong communication and collaboration skills with the ability to work effectively in a team environment.
- Any relevant certifications in Snowflake or related technologies would be a plus.
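Illustrative only: a minimal sketch of a Python-driven Snowflake transformation step using the snowflake-connector-python library. The account, credentials, and table names are placeholders.

```python
"""Minimal sketch of a load-and-transform step in Snowflake driven from
Python. All identifiers below are hypothetical placeholders."""

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder credentials
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Stage raw rows into a reporting table with a simple aggregation.
    cur.execute("""
        INSERT INTO reporting.daily_sales (sale_date, region, total_amount)
        SELECT sale_date, UPPER(region), SUM(amount)
        FROM staging.raw_sales
        GROUP BY sale_date, UPPER(region)
    """)
    print(f"rows inserted: {cur.rowcount}")
finally:
    conn.close()
```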

Posted 1 day ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


This is a key position supporting a client organization with strong analytics and data science capabilities. There is significant revenue and future opportunity associated with this role.

Job Description:
- Develop and maintain data tables (management, extraction, harmonizing, etc.) using GCP/SQL/Snowflake. This involves designing, implementing, and writing optimized code; maintaining complex SQL queries to extract, transform, and load (ETL) data from various tables/sources; and ensuring data integrity and accuracy throughout the data pipeline process.
- Create and manage data visualizations using Tableau/Power BI. This involves designing and developing interactive dashboards and reports, ensuring visualizations are user-friendly, insightful, and aligned with business requirements, and regularly updating and maintaining dashboards to reflect the latest data and insights.
- Generate insights and reports to support business decision-making. This includes analyzing data trends and patterns to provide actionable insights, preparing comprehensive reports that summarize key findings and recommendations, and presenting data-driven insights to stakeholders to inform strategic decisions.
- Handle ad-hoc data requests and provide timely solutions. This involves responding to urgent data requests from various departments; quickly gathering, analyzing, and delivering accurate data to meet immediate business needs; and ensuring ad-hoc solutions are scalable and reusable for future requests.
- Collaborate with stakeholders to understand and solve open-ended questions. This includes engaging with business users to identify their data needs and challenges, working closely with cross-functional teams to develop solutions for complex, open-ended problems, and translating business questions into analytical tasks to deliver meaningful results.
- Design and create wireframes and mockups for data visualization projects. This involves developing wireframes and mockups to plan and communicate visualization ideas, collaborating with stakeholders to refine and finalize visualization designs, and ensuring that wireframes and mockups align with user requirements and best practices.
- Communicate findings and insights effectively to both technical and non-technical audiences. This includes preparing clear and concise presentations to share insights with diverse audiences, tailoring communication styles to suit the technical proficiency of the audience, and using storytelling techniques to make data insights more engaging and understandable.
- Perform data manipulation and analysis using Python (see the sketch after this listing). This includes utilizing Python libraries such as Pandas, NumPy, and SciPy for data cleaning, transformation, and analysis; developing scripts and automation tools to streamline data processing tasks; and conducting statistical analysis to generate insights from large datasets.
- Implement basic machine learning models using Python. This involves developing and applying basic machine learning models to enhance data analysis, using libraries such as scikit-learn and TensorFlow for model development and evaluation, and interpreting and communicating the results of machine learning models to stakeholders.
- Automate data processes using Python. This includes creating automation scripts to streamline repetitive data tasks, implementing scheduling and monitoring of automated processes to ensure reliability, and continuously improving automation workflows to increase efficiency.

Requirements:
- 3 to 5 years of experience in data analysis, reporting, and visualization, including a proven track record of working on data projects and delivering impactful results, and experience in a similar role within a fast-paced environment.
- Proficiency in GCP/SQL/Snowflake/Python for data manipulation, including strong knowledge of GCP/SQL/Snowflake services and tools, advanced SQL skills for complex query writing and optimization, and expertise in Python for data analysis and automation.
- Strong experience with Tableau/Power BI/Looker Studio for data visualization, including a demonstrated ability to create compelling and informative dashboards and familiarity with best practices in data visualization and user experience design.
- Excellent communication skills, with the ability to articulate complex information clearly, including strong written and verbal communication skills and the ability to explain technical concepts to non-technical stakeholders.
- Proven ability to solve open-ended questions and handle ad-hoc requests, including creative problem-solving skills, a proactive approach to challenges, and flexibility to adapt to changing priorities and urgent requests.
- Strong problem-solving skills and attention to detail, including a keen eye for detail and accuracy in data analysis and reporting, and the ability to identify and resolve data quality issues.
- Experience in creating wireframes and mockups, including proficiency in design tools and effectively translating ideas into visual representations.
- Ability to work independently and as part of a team, including being self-motivated, able to manage multiple tasks simultaneously, and having a collaborative mindset and willingness to support team members.

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
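Illustrative only: a minimal pandas sketch of the ad-hoc analysis step described above, producing an extract a Tableau or Power BI dashboard could consume. The CSV path and column names are assumptions.

```python
"""Minimal sketch: load, clean, and summarize a trend with pandas.
File and column names are hypothetical."""

import pandas as pd

# Hypothetical export of campaign-level data.
df = pd.read_csv("campaign_metrics.csv", parse_dates=["date"])

# Basic cleaning: drop rows missing the key measure.
df = df.dropna(subset=["revenue"])

# Weekly revenue trend per channel, ready for a BI extract.
weekly = (
    df.set_index("date")
      .groupby("channel")
      .resample("W")["revenue"]
      .sum()
      .reset_index()
)

print(weekly.head())
weekly.to_csv("weekly_revenue_by_channel.csv", index=False)
```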

Posted 1 day ago

Apply

5.0 - 10.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site


Job Title: Java Backend Developer
Location: Bangalore
Experience: 5-10 Years

Job Description:
- 4 to 8 years of hands-on experience in Java (8/11/17) and Spring Boot frameworks.
- Strong understanding of Object-Oriented Programming (OOP) and design patterns.
- Experience in developing RESTful APIs and integrating with external systems.
- Proficient in Spring Core, Spring MVC, Spring Data JPA, and Spring Security.
- Experience with RDBMS (MySQL/PostgreSQL/Oracle) and ORM frameworks (Hibernate/JPA).
- Knowledge of microservices architecture and related best practices.
- Familiarity with CI/CD tools like Jenkins, GitHub Actions, or GitLab CI.
- Experience with build tools like Maven or Gradle.
- Exposure to containerization technologies (Docker, Kubernetes) is a plus.
- Experience with cloud platforms (AWS, Azure, GCP) is desirable.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Company Overview:
Games24x7 is India's leading and most valuable multi-gaming unicorn. We're a full-stack gaming company, offering awesome game-playing experiences to over 100 million players through our products: Rummy Circle, India's first and largest online rummy platform, and My11Circle, the country's fastest-growing fantasy sports platform. A pioneer in the online skill gaming industry in India, Games24x7 was founded in 2006 when two New York University-trained economists, Bhavin Pandya and Trivikraman Thampy, met at the computer lab and discovered their shared passion for online games. We've always been a technology company at heart, and over the last decade and a half, we've built the organisation on a strong foundation of 'the science of gaming', leveraging behavioural science, artificial intelligence, and machine learning to provide immersive and hyper-personalised gaming experiences to each of our players. Backed by marquee investors including Tiger Global Management, The Raine Group, and Malabar Investment Advisors, Games24x7 is leading the charge in India's gaming revolution, constantly innovating and offering novel entertainment to players! Our 800+ passionate teammates create their magic from our offices in Mumbai, Bengaluru, New Delhi, and Miami. For more information and career opportunities you may visit www.games24x7.com.

Role Overview:
Games24x7 is seeking a highly experienced and results-oriented Associate Director - Analytics to lead the analytics function specifically for our flagship fantasy sports product, My11Circle. This critical leadership role will be responsible for developing and executing the data strategy, building and mentoring a high-performing analytics team, and driving data-informed decisions across all aspects of the My11Circle product lifecycle, from user acquisition and engagement to monetization and retention. The ideal candidate will possess a strong analytical background, a deep understanding of product analytics principles, proven experience in leading analytics teams, and a passion for the online gaming or fantasy sports domain. You will be a strategic thinker with a hands-on approach, capable of translating complex data into actionable insights that directly impact the success of My11Circle.

Responsibilities:

Strategic Leadership:
- Develop and champion the overall data and analytics strategy for the My11Circle product, aligning with business objectives and the product roadmap.
- Define key performance indicators (KPIs) and establish robust reporting frameworks to track product performance and user behavior.
- Proactively identify opportunities for leveraging data to drive product innovation, user growth, and revenue optimization.
- Collaborate with product management, engineering, marketing, and other stakeholders to understand their data needs and provide actionable insights.
- Stay abreast of the latest trends and technologies in data analytics, GenAI, and the gaming industry.

Team Leadership & Development:
- Build, mentor, and lead a team of talented data analysts and scientists dedicated to supporting the My11Circle product.
- Foster a data-driven culture within the team and across the broader organization.
- Define team roles and responsibilities, set clear performance expectations, and provide regular feedback and coaching.
- Promote professional development and continuous learning within the analytics team.

Product Analytics & Insights Generation:
- Oversee the design, development, and execution of in-depth analysis on user acquisition, engagement, retention, monetization, and gameplay patterns within My11Circle.
- Utilize various analytical techniques (e.g., cohort analysis, segmentation, regression, A/B testing analysis) to uncover key insights and trends.
- Develop and maintain dashboards and reports that provide clear and actionable insights to stakeholders.
- Proactively identify areas of friction or opportunity within the user journey and provide data-backed recommendations for improvement.
- Drive the adoption of self-service analytics capabilities within the My11Circle team.

Experimentation & Optimization:
- Partner with the product team to design and analyze A/B tests and other experiments to optimize product features, user flows, and marketing campaigns (see the sketch after this listing).
- Establish best practices for experimentation and ensure rigorous statistical analysis of results.
- Translate experiment findings into actionable recommendations and drive their implementation.

Data Infrastructure & Governance:
- Collaborate with data engineering teams to ensure the availability, accuracy, and reliability of data required for My11Circle analytics.
- Advocate for and contribute to the development of a scalable and efficient data infrastructure.
- Ensure compliance with data governance policies and best practices.
- Explore and evaluate new data analytics tools and technologies to enhance the team's capabilities, including potential applications of GenAI for insights generation and automation.

Qualifications:
- Bachelor's or Master's degree in a quantitative field such as Statistics, Mathematics, Computer Science, Economics, or a related discipline.
- 8+ years of progressive experience in data analytics, with a significant focus on product analytics.
- 4+ years of experience leading and managing analytics teams.
- Deep understanding of the online gaming or fantasy sports industry is highly preferred.
- Strong proficiency in SQL and experience working with large datasets.
- Expertise in at least one data visualization tool (e.g., Tableau, Power BI, Looker).
- Solid understanding of statistical analysis, experimental design, and causal inference.
- Experience with programming languages for data analysis (e.g., Python, R) is mandatory.
- Excellent communication, presentation, and storytelling skills, with the ability to translate complex data into clear and actionable insights for both technical and non-technical audiences.
- Proven ability to collaborate effectively with cross-functional teams.
- Strong problem-solving skills and a data-driven mindset.
- Experience with cloud-based data platforms (e.g., AWS, GCP, Azure) is a plus.
- Familiarity with GenAI tools and their potential applications in data analysis is a plus.

Personal Attributes:
- Passion for data and its ability to drive business decisions.
- Strong leadership qualities with the ability to inspire and motivate a team.
- Strategic thinker with a hands-on approach.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Ability to thrive in a fast-paced and dynamic environment.
- Proactive and results-oriented.

If you are a passionate and experienced analytics leader with a deep understanding of product analytics and a desire to make a significant impact on a leading fantasy sports platform, we encourage you to apply!
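Illustrative only: a minimal sketch of the A/B-test analysis this role oversees, using a two-proportion z-test from statsmodels. The conversion counts are made-up numbers.

```python
"""Minimal A/B-test analysis sketch: two-proportion z-test on
conversion counts. The numbers are illustrative, not real data."""

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment results: conversions and users per variant.
conversions = [420, 470]    # control, treatment
users = [10_000, 10_000]

stat, p_value = proportions_ztest(count=conversions, nobs=users)
print(f"z = {stat:.2f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Statistically significant difference between variants.")
else:
    print("No significant difference detected at the 5% level.")
```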

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Research Engineer, Applied Research (Biotech AI - Drug Discovery)

About the Company:
Quantiphi is an award-winning AI-first digital engineering company, driven by a deep desire to solve transformational problems at the heart of businesses. Our signature approach combines groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed. Quantiphi has seen 2.5x year-over-year growth since its inception in 2013, to 3500+ team members globally. For more details, please visit our website or LinkedIn page.

About the Applied Research Unit:
Applied Research is an R&D practice at Quantiphi focused on advancing the frontiers of AI technologies with Applied Machine Learning at its core. We ideate and build novel solutions to high-impact, cutting-edge challenges, with a focus on advanced prototyping and scalable proofs of concept. Within this unit, the AI-Accelerated Drug Discovery practice is a key pillar that aims to apply state-of-the-art AI methodologies to revolutionize the way new therapeutics are discovered and developed. We are committed to driving meaningful scientific breakthroughs by combining strong AI research with deep cross-disciplinary collaboration.

Job Description:
Role Level: Research Engineer
Work Location: India
Resource Count: 2

The Role:
This is a unique opportunity to work on scientifically impactful problems at the intersection of AI and biotechnology within Quantiphi's Applied Research team. In this role, you will work on the development of core AI models and algorithms aimed at accelerating the drug discovery process. The position focuses on advancing foundational AI techniques such as generative modeling, optimization, and reinforcement learning, applied to molecular and bio-pharmaceutical data. The position involves working with a diverse, lively, and proactive group of nerds who are constantly raising the bar on translating the latest AI research in Healthcare and Life Sciences into tangible reusable assets for the community. Hence, this requires a high level of conceptual understanding, attention to detail, and agility in adapting to new technologies. While prior experience in the biotech or life sciences domain is highly valued and will elevate a candidate's profile, we are equally open to exceptional AI/ML researchers from other domains who are excited to explore and learn the nuances of this rapidly growing field.

Please note: This is a core AI research role, not a software engineering or system integration position. We are particularly keen to engage with candidates focused on scientific AI innovation rather than application development or LLM/GenAI-centric workflows.

Responsibilities:
- Stay ahead of the AI research curve, focusing on foundational AI methodologies applicable to drug discovery and molecular design.
- Build rapid prototypes, conduct detailed experimental studies, and develop advanced AI models in areas such as generative modeling, reinforcement learning, graph-based learning, and molecular property prediction (see the sketch after this listing).
- Work closely with interdisciplinary teams including biologists, chemists, and life science domain experts to design scientifically sound AI approaches.
- Contribute to Quantiphi's IP portfolio through the development of novel algorithms, proofs of concept, and potential publications.
- Drive thought leadership through documentation, knowledge dissemination, and participation in conferences, blogs, webinars, and publications.
- Publish research papers in prestigious conferences and journals.

Requirements

Must Have:
- Master's degree, PhD, or equivalent experience in Computer Science, Artificial Intelligence, Machine Learning, Applied Mathematics, or related fields.
- Minimum work experience: from new graduates to 3+ years of post-graduation research experience in ML.
- Strong foundation in AI/ML concepts with hands-on experience in model development, experimental design, and large-scale data analysis.
- Excellent in-depth understanding of ML concepts and the underlying mathematics.
- Working knowledge of using NLP with biological sequences.
- Solid research mindset with a track record of working on complex AI problems; experience with drug discovery datasets is a plus but not a prerequisite.
- Excellent programming skills in Python, with experience using AI/ML frameworks like PyTorch or TensorFlow.
- Hands-on experience in developing and deploying models with various deep learning architectures across multiple ML areas such as computer vision, NLP, and statistics.
- Ability to independently learn new scientific domains and apply AI techniques to novel bio-pharmaceutical problems.
- Strong communication skills with the ability to present complex ideas in an accessible format across audiences.
- Ability to translate abstract highlights into understandable insights across multiple knowledge-dissemination formats, such as blogs, presentations, paper publications, tutorials, and webinars.

Good to Have:
- Prior exposure to molecular datasets, cheminformatics, bioinformatics, or life sciences.
- Hands-on experience with in silico techniques in drug discovery.
- Hands-on experience with HPC workflows on genome datasets.
- Familiarity with generative chemistry models, graph neural networks, reinforcement learning, or multi-objective optimization.
- Demonstrated industry research experience is an additional bonus.
- Research publications in AI/ML conferences such as NeurIPS, ICML, ICLR, or relevant bioinformatics journals.
- Experience with cloud environments like GCP or AWS and scalable model training.
- A strong classical education in math, physics, mechanics, CS, or engineering concepts is also an advantage.

Why Join Us?
- Opportunity to work at the cutting edge of AI and biotechnology, solving problems with real-world scientific impact.
- Exposure to interdisciplinary teams and a culture that encourages continuous learning and exploration.
- Contribute to an R&D environment that values curiosity, innovation, and the advancement of AI for good.
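Illustrative only: a minimal RDKit sketch of molecular featurization, the first step of the molecular property prediction work mentioned above. The SMILES strings are standard textbook examples.

```python
"""Minimal sketch: compute baseline physico-chemical descriptors for a
few molecules with RDKit, as inputs to a property-prediction model."""

from rdkit import Chem
from rdkit.Chem import Descriptors

# Example molecules as SMILES strings.
smiles = {
    "ethanol": "CCO",
    "aspirin": "CC(=O)OC1=CC=CC=C1C(=O)O",
    "caffeine": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
}

for name, smi in smiles.items():
    mol = Chem.MolFromSmiles(smi)
    # Simple descriptors often used as baseline ML features.
    print(
        f"{name}: MolWt={Descriptors.MolWt(mol):.1f}, "
        f"LogP={Descriptors.MolLogP(mol):.2f}, "
        f"HBD={Descriptors.NumHDonors(mol)}"
    )
```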

Posted 1 day ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Description

Skills & Qualifications:
- 4+ years of experience as a Python developer with strong client communication skills and team leadership experience.
- In-depth knowledge of Python frameworks such as Django, Flask, or FastAPI (see the sketch after this listing).
- Strong expertise in cloud technologies (AWS, Azure, GCP).
- Deep understanding of microservices architecture, multi-tenant architecture, and best practices in Python development.
- Familiarity with serverless architecture and frameworks like AWS Lambda or Azure Functions.
- Experience with deployment using Docker, Nginx, Gunicorn, Uvicorn, and Supervisor.
- Hands-on experience with SQL and NoSQL databases such as PostgreSQL and AWS DynamoDB.
- Proficiency with Object Relational Mappers (ORMs) like SQLAlchemy and Django ORM.
- Demonstrated ability to handle multiple API integrations and write modular, reusable code.
- Experience with frontend technologies such as React, Vue, HTML, CSS, and JavaScript to enhance full-stack development capabilities.
- Strong knowledge of user authentication and authorization mechanisms across multiple systems and environments.
- Familiarity with scalable application design principles and event-driven programming in Python.
- Solid experience in unit testing, debugging, and code optimization.
- Hands-on experience with modern software development methodologies, including Agile and Scrum.
- Familiarity with container orchestration tools like Kubernetes.
- Understanding of data processing frameworks such as Apache Kafka and Spark (good to have).
- Experience with CI/CD pipelines and automation tools like Jenkins, GitLab CI, or CircleCI.

(ref:hirist.tech)
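Illustrative only: a minimal FastAPI sketch of the framework work this role lists. The resource name and fields are assumptions; an in-memory dict stands in for a real ORM layer.

```python
"""Minimal FastAPI service sketch. Run with: uvicorn app:app --reload
Item model and routes are hypothetical."""

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    id: int
    name: str
    price: float

# In-memory store standing in for a real database/ORM layer.
ITEMS: dict[int, Item] = {}

@app.post("/items", status_code=201)
def create_item(item: Item) -> Item:
    ITEMS[item.id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]
```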

Posted 1 day ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Title: Backend Developer Intern (Paid | Pre-Placement Offer Possible)
Location: Ahmedabad (In-Office)
Duration: 3-month internship (Rs. 10,000) | Full-time offer based on performance
Start Date: Preferably immediate joiners

About Us:
We are an early-stage startup building scalable backend systems using modern cloud-native technologies. Join a focused team working on real-world problems.

What You'll Work On:
- Developing backend services using Flask (mandatory)
- Designing and implementing RESTful APIs with authentication (JWT/OAuth2), rate limiting, pagination, and webhook integrations
- Working with Firebase Firestore for real-time NoSQL operations, including subcollections, batched writes, and efficient data modeling
- Managing data using both NoSQL (Firestore/MongoDB) and SQL (PostgreSQL/MySQL), with a focus on performance optimization
- Implementing background task workflows with Celery and Redis, including scheduled and recurring jobs (see the sketch after this listing)
- Using Redis or Valkey for caching strategies and for managing concurrency with semaphores, async, or threading approaches
- Deploying microservices to Google Cloud Platform (GCP) using Docker, Cloud Run, and Compute Engine
- Managing CI/CD pipelines and Docker-based workflows
- Integrating with the Firebase Admin SDK and Firebase Cloud Messaging (FCM)
- Using tools like Git, Postman, and Swagger/OpenAPI for version control, API testing, and documentation

Requirements:
- Experience with Flask (mandatory), Redis, and Firestore
- Familiarity with Docker, JWT, and CI/CD workflows
- Proactive, eager to learn, and able to work in a fast-paced environment

Perks:
- Stipend provided
- Full-time offer for top performers
- Learn and grow in a startup environment
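Illustrative only: a minimal sketch of the Celery + Redis background-task pattern this internship lists. The broker URL assumes a local Redis instance, and the task body is a placeholder.

```python
"""Minimal Celery + Redis background task sketch (tasks.py).
Broker/backend URLs assume a local Redis instance."""

from celery import Celery

app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@app.task
def send_welcome_email(user_id: int) -> str:
    # Placeholder for a real integration (e.g., SMTP or FCM push).
    return f"welcome email queued for user {user_id}"

# Enqueue from a Flask view (or anywhere):
#   send_welcome_email.delay(42)
# Run a worker with:
#   celery -A tasks worker --loglevel=info
```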

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description:
- Must have a minimum of 6+ years of overall professional experience in development using Java.
- Proficiency in coding using technologies in GCP.
- Very good hands-on experience in Spring Boot and microservices.
- Strong knowledge and experience in developing and consuming REST APIs.
- Good to have knowledge of databases like MySQL.
- Must have experience with I/O, multi-threading, and RESTful web services.
- Should have experience with performance tuning, memory leaks, crash analysis, multi-threading, and algorithms.
- Proficiency in coding using Java, Spring Boot, Maven, JDBC, JavaScript, Hibernate, and Kafka.

Skills Required:
Role: Java Developer with GCP
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education:
Employment Type: Full Time, Permanent

Key Skills: JAVA, SPRING, SPRINGBOOT, HIBERNATE

Other Information:
Job Code: GO/JC/033/2025
Recruiter Name:

Posted 1 day ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Title:
Candidate Specification: Minimum 10 to 18 years of experience.

Job Description:
- Overall 10+ years of experience designing and developing applications in C#, WPF, WCF, and SQL Server.
- Design, develop, and maintain WPF and WCF applications using C#.
- Write and execute test cases in C# for WPF and WCF applications.
- Experience with migration of .NET Framework and WCF services to modern architectures.
- Good to have experience in video management systems.

Skills Required:
Role: Front-End Developer Principal
Industry Type: IT Services & Consulting
Functional Area:
Required Education: BE, Bachelor Degree, B Tech
Employment Type: Full Time, Permanent

Key Skills: .NET, .NET CORE, GCP, C#/WCF, WCF, WPF

Other Information:
Job Code: GO/JC/341/2025
Recruiter Name: Christopher

Posted 1 day ago

Apply

30.0 years

0 Lacs

India

Remote


About CoreAIVideo:
At CoreAIVideo, we specialize in rapid, AI-powered video production, ideal for personal branding and marketing at scale. While the AI video market is projected to reach $42B by 2030, many businesses find traditional production both expensive and time-consuming. By cutting 80% of the typical cost and effort, we're making premium-quality short-form video truly accessible.

Why Join Now:
- Equity Partner: Step into an executive role in a company backed by nearly 30 years of tech expertise at Outsourcing4work GmbH in Germany (www.outsourcing4work.de).
- 100% Remote: Work from anywhere, helping shape a world-class tech product for a global audience.
- Upcoming Fundraising: We're preparing our first investment round; we'll share our pitch deck if you're the right fit.

For a deeper look at what we do, visit www.coreaivideo.com and our CEO's LinkedIn profile (https://www.linkedin.com/in/knadeemarif/).

Responsibilities:
- Partner with the CTO to build the product end-to-end; hands-on development is key. (Our MVP is already in progress.)
- Architect and develop the initial SaaS platform with a focus on scalability, performance, and security.
- Integrate AI video workflows and orchestrate APIs and agents for scripting, audio & video cloning, editing, and rendering (see the sketch after this listing).
- Create key modules: user auth, video editing/rendering interface, content library, user dashboards, and billing integration.
- Choose and implement the tech stack; you'll help define the foundation.
- Set up cloud infrastructure, CI/CD pipelines, deployment, and monitoring.
- Collaborate on the technical vision and contribute to the product roadmap.

Ideal Profile:
- A full-stack developer with experience building SaaS platforms from MVP to production.
- Proficient in React (front-end) and Supabase (backend, database, and auth).
- Fluent in writing scalable code using modern frameworks: Node.js, Python, and TypeScript.
- Strong cloud skills, preferably with AWS, GCP, or similar.
- Solid command of version control and team collaboration via GitHub.
- Ideally has experience integrating AI/ML APIs and building or orchestrating AI agent-based systems, including LLMs and AI video generation engines.
- Experience with automation platforms like n8n, and the ability to integrate them to streamline workflows, is a strong plus.
- A builder's mindset: you take ownership, work independently, and want to co-create something meaningful.

The Collaboration & Financial Setup:
- Sweat Equity: Initially, your compensation will be in the form of sweat equity.
- Post-Funding Salary: Once our first fundraising round is complete, we'll shift you to a salary model.

Ready to Elevate Video Creation?
If you're excited about shaping the future of AI-powered video creation, we'd love to hear from you. Send your CV, a short note about your SaaS development journey, and tell us why you're passionate about CoreAIVideo. Let's redefine how people create premium video content, together.
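Illustrative only: one small piece of the AI-API orchestration this role involves could look like the following sketch, which calls an LLM to draft a video script using the openai Python client (v1+). The model name and prompts are assumptions; the real product would chain several such services.

```python
"""Minimal LLM-call sketch for script generation.
Reads OPENAI_API_KEY from the environment; model choice is assumed."""

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model choice
    messages=[
        {"role": "system",
         "content": "You write 30-second marketing video scripts."},
        {"role": "user",
         "content": "Draft a script introducing a personal-branding tool."},
    ],
)

print(response.choices[0].message.content)
```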

Posted 1 day ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Noida, Chennai

Hybrid


- Deployment, configuration, and maintenance of Databricks clusters and workspaces
- Security and access control
- Automate administrative tasks using tools like Python, PowerShell, and Terraform (see the sketch after this listing)
- Integrations with Azure Data Lake and Key Vault; implement CI/CD pipelines

Required Candidate Profile:
- Azure, AWS, or GCP; Azure experience is preferred
- Strong skills in Python, PySpark, PowerShell, and SQL
- Experience with Terraform, ETL processes, data pipelines, and big data technologies
- Security and compliance
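Illustrative only: a minimal sketch of automating a Databricks administrative check with the official databricks-sdk for Python. Authentication is assumed to come from environment variables or a config profile, and the flagging rule is an example policy, not a product feature.

```python
"""Minimal Databricks admin-automation sketch: list clusters and flag
running ones without auto-termination. Credentials come from the
environment or ~/.databrickscfg."""

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

for cluster in w.clusters.list():
    state = cluster.state.value if cluster.state else "UNKNOWN"
    print(f"{cluster.cluster_name}: {state}")
    # Example policy: report long-lived interactive clusters for review.
    if state == "RUNNING" and cluster.autotermination_minutes == 0:
        print(f"  -> {cluster.cluster_name} has no auto-termination set")
```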

Posted 1 day ago

Apply

6.0 years

0 Lacs

India

Remote


Job Role: Data Scientist
Job Type: Full-Time
Mode of Work: Remote
Experience: 6+ Years

Key Responsibilities:
- Analyze complex datasets to uncover insights using AI techniques.
- Build and deploy machine learning, deep learning, and NLP models (see the sketch after this listing).
- Collaborate with teams to understand client needs and deliver AI-powered solutions.
- Communicate insights clearly to various stakeholders.
- Continuously refine models for performance and accuracy.
- Stay current on AI advancements and apply them in real-world scenarios.
- Contribute to AI tools and infrastructure development.

Qualifications:
- Bachelor's/Master's in Statistics, CS, Engineering, or a related field.
- 6+ years of experience in IT and 3+ years of hands-on experience in Data Science/AI.
- Strong expertise in Python (Pandas, NumPy, scikit-learn, Streamlit) and SQL.
- Experience with ML/DL frameworks (e.g., TensorFlow, PyTorch).
- Familiarity with Agile methodologies and cloud platforms (AWS, Azure, GCP).
- Solid communication and problem-solving skills.
- Knowledge of RAG implementation and AI agents is a plus.
- Interest or background in corporate finance is an added advantage.
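Illustrative only: a minimal scikit-learn sketch of the model-building workflow listed above, trained on scikit-learn's built-in breast-cancer dataset rather than client data.

```python
"""Minimal model-building sketch: preprocessing + classifier pipeline,
trained and evaluated on a built-in toy dataset."""

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Pipeline keeps scaling and the model together for clean deployment.
model = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=42))
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"accuracy: {accuracy_score(y_test, preds):.3f}")
```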

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Engineer
Location: Hyderabad, Kochi, Trivandrum
Experience Required: 10-19 years
Skills: Primary - Scala, PySpark, Python / Secondary - ETL, SQL, Azure

Role Proficiency:
The role demands expertise in building robust, scalable data pipelines that support ingestion, wrangling, transformation, and integration of data from multiple sources (see the sketch after this listing). The ideal candidate should have hands-on experience with ETL tools (e.g., Informatica, AWS Glue, Databricks, GCP Dataproc) and strong programming skills in Python, PySpark, SQL, and optionally Scala. Proficiency across various data domains and familiarity with modern data warehouse and lakehouse architectures (Snowflake, BigQuery, Delta Lake, Lakehouse) is essential. A solid understanding of DevOps and infrastructure cost optimization is required.

Key Responsibilities & Outcomes:

Technical Development:
- Develop high-performance data pipelines and applications.
- Optimize development using design patterns and reusable solutions.
- Create and tune code using best practices for performance and scalability.
- Develop schemas, data models, and data storage solutions (SQL/NoSQL/Delta Lake).
- Perform debugging, testing, and validation to ensure solution quality.

Documentation & Design:
- Produce high-level and low-level design (HLD, LLD, SAD) and architecture documentation.
- Prepare infra costing, source-target mappings, and business requirement documentation.
- Contribute to and govern documentation standards/templates/checklists.

Project & Team Management:
- Support the Project Manager in planning, delivery, and sprint execution.
- Estimate effort and provide input on resource planning.
- Lead and mentor junior team members, define goals, and monitor progress.
- Monitor and manage the defect lifecycle, including RCA and proactive quality improvements.

Customer Interaction:
- Gather and clarify requirements with customers and architects.
- Present design alternatives and conduct product demos.
- Ensure alignment with customer expectations and solution architecture.

Testing & Release:
- Design and review unit/integration test cases and execution strategies.
- Provide support during system/integration testing and UAT.
- Oversee and execute release cycles and configurations.

Knowledge Management & Compliance:
- Maintain compliance with configuration management plans.
- Contribute to internal knowledge repositories and reusable assets.
- Stay updated and certified on relevant technologies/domains.

Measures of Success (KPIs):
- Adherence to engineering processes and delivery schedules.
- Number of post-delivery defects and non-compliance issues.
- Reduction in recurring defects and faster resolution of production bugs.
- Timeliness in detecting, responding to, and resolving pipeline/data issues.
- Improvements in pipeline efficiency (e.g., runtime, resource utilization).
- Team engagement and upskilling; completion of relevant certifications.
- Zero or minimal data security/compliance breaches.

Expected Deliverables:
- Code: High-quality data transformation scripts and pipelines; peer-reviewed, optimized, and reusable code.
- Documentation: Design documents, technical specifications, test plans, and infra cost estimations.
- Configuration & Testing: Configuration management plans and test execution results.
- Knowledge Sharing: Contributions to SharePoint, internal wikis, and client university platforms.

Skill Requirements:

Mandatory Technical Skills:
- Languages: Python, PySpark, Scala
- ETL Tools: Apache Airflow, Talend, Informatica, AWS Glue, Databricks, Dataproc
- Cloud Platforms: AWS, GCP, Azure (esp. BigQuery, Dataflow, ADF, ADLS)
- Data Warehousing: Snowflake, BigQuery, Delta Lake, Lakehouse architecture
- Performance Tuning: for large-scale distributed systems and pipelines

Additional Skills:
- Experience in data model design and optimization.
- Good understanding of data schemas, window functions, and data partitioning strategies.
- Awareness of data governance, security standards, and compliance.
- Familiarity with DevOps, CI/CD, and infrastructure cost estimation.

Certifications (Preferred):
- Cloud certifications (e.g., AWS Data Analytics, GCP Data Engineer)
- Informatica or Databricks certification
- Domain-specific certifications based on project/client need

Soft Skills:
- Strong analytical and problem-solving capabilities
- Excellent communication and documentation skills
- Ability to work independently and collaboratively in cross-functional teams
- Stakeholder management and customer interaction
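Illustrative only: a minimal PySpark sketch of the ingest-wrangle-write pipeline step this role describes. File paths and column names are assumptions.

```python
"""Minimal PySpark ingestion-and-transform sketch.
Landing/curated paths and columns are hypothetical."""

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_ingest").getOrCreate()

# Ingest raw CSV from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("/data/landing/orders.csv")

# Wrangle: type the amount column, drop bad rows, add a partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("ingest_date", F.current_date())
)

# Write to a partitioned columnar layout for downstream consumers.
clean.write.mode("overwrite").partitionBy("ingest_date").parquet(
    "/data/curated/orders"
)

spark.stop()
```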

Posted 1 day ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world's leading brands, including HSBC, Chipotle, and Virgin Media, use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences.

At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success, and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about.

Overview:
LivePerson is experiencing rapid growth, and we're evolving our database infrastructure to scale faster than ever. We are building a team dedicated to optimizing data storage, accessibility, and performance across our applications. As a Senior Database Engineer, you will be a key contributor, driving innovation in cloud database solutions and automation.

You Will:
- Partner with cross-functional teams to define database requirements and architectural strategies.
- Design, implement, and maintain highly scalable, on-prem and cloud-based database systems on Google Cloud Platform (GCP).
- Develop automation solutions using Terraform, Ansible, and Python to streamline database provisioning and management (see the sketch after this listing).
- Ensure robust version control of infrastructure configurations for seamless deployments.
- Monitor, troubleshoot, and optimize database performance, addressing bottlenecks proactively.
- Establish and enforce backup, recovery, and disaster recovery protocols to protect data integrity.
- Collaborate with security teams to implement compliance and data protection measures.
- Lead incident resolution, analyzing root causes and driving long-term solutions.
- Stay ahead of industry trends in DevOps, cloud computing, and database technologies.
- Participate in on-call rotations, ensuring 24x7 support for mission-critical systems.

You Have:
- 8+ years of experience managing large-scale production database systems handling terabytes of data.
- Expertise in MySQL administration and replication.
- Experience with any of Elasticsearch, Kafka, Hadoop, or Vertica is a plus.
- Strong background in Google Cloud Platform (GCP) or AWS database deployments.
- Proficiency in Infrastructure as Code (IaC) using Terraform and Ansible.
- Skilled in Python and Bash scripting for automation.
- Hands-on experience with Liquibase or Flyway for database automation.
- Knowledge of monitoring tools like Prometheus, Grafana, PMM (Percona Monitoring and Management), and the ELK stack (Elasticsearch, Logstash, and Kibana).
- Strong problem-solving skills with a proactive approach to troubleshooting complex issues.
- Solid foundation in database architecture, optimization, and CI/CD concepts.
- Excellent collaboration and communication skills in a dynamic team environment.
- Highly accountable with a results-driven mindset.
- Able to create documentation and work on changes, incidents, and Jira tickets.
- Relevant certifications (AWS, GCP) are a plus.

Benefits:
- Health: medical, dental, and vision
- Time away: vacation and holidays
- Equal opportunity employer

Why You'll Love Working Here:
As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace.

Belonging at LivePerson:
We are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations, and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law. We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection.

The talent acquisition team at LivePerson has recently been notified of a phishing scam targeting candidates applying for our open roles. Scammers have been posing as hiring managers and recruiters in an effort to access candidates' personal and financial information. This phishing scam is not isolated to LivePerson and has been documented in news articles and media outlets. Please note that any communication from our hiring teams at LivePerson regarding a job opportunity will only be made by a LivePerson employee with an @liveperson.com email address. LivePerson does not ask for personal or financial information as part of our interview process, including but not limited to your social security number, online account passwords, credit card numbers, passport information, and other related banking information. If you have any questions or concerns, please feel free to contact recruiting-lp@liveperson.com.
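Illustrative only: a minimal Python sketch of the database-automation work this role describes, checking MySQL replication health with mysql-connector-python. Host and credentials are placeholders; the replica-status fields shown are the MySQL 8.0.22+ names.

```python
"""Minimal MySQL replication health check. Host, credentials, and the
60-second lag threshold are hypothetical."""

import mysql.connector

conn = mysql.connector.connect(
    host="replica-1.example.internal",  # placeholder replica host
    user="monitor",
    password="***",
)

cur = conn.cursor(dictionary=True)
cur.execute("SHOW REPLICA STATUS")
status = cur.fetchone()

if status is None:
    print("Not configured as a replica.")
else:
    lag = status.get("Seconds_Behind_Source")
    io_ok = status.get("Replica_IO_Running") == "Yes"
    sql_ok = status.get("Replica_SQL_Running") == "Yes"
    print(f"IO thread: {io_ok}, SQL thread: {sql_ok}, lag: {lag}s")
    if not (io_ok and sql_ok) or (lag or 0) > 60:
        print("ALERT: replication unhealthy")

conn.close()
```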

Posted 1 day ago

Apply

3.0 - 5.0 years

15 - 30 Lacs

Bengaluru

Work from Office


Position Summary:
We are seeking a Senior Software Development Engineer - Data Engineering with 3-5 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.

Key Responsibilities:
- Work with cloud-based data solutions (Azure, AWS, GCP).
- Implement data modeling and warehousing solutions.
- Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Design and optimize data storage solutions, including data warehouses and data lakes.
- Ensure data quality and integrity through data validation, cleansing, and error handling (see the sketch after this listing).
- Collaborate with data analysts, data architects, and software engineers to understand data requirements and deliver relevant data sets (e.g., for business intelligence).
- Implement data security measures and access controls to protect sensitive information.
- Monitor and troubleshoot issues in data pipelines, notebooks, and SQL queries to ensure seamless data processing.
- Develop and maintain Power BI dashboards and reports.
- Work with DAX and Power Query to manipulate and transform data.

Basic Qualifications:
- Bachelor's or Master's degree in computer science or data science.
- 3-5 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Proficient in SQL, Python, or Scala for data manipulation and processing.
- Proficient in developing data pipelines using Azure Synapse, Azure Data Factory, and Microsoft Fabric.
- Experience with Apache Spark, Databricks, and Snowflake is highly beneficial for handling big data and cloud-based analytics solutions.

Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience with BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Contributions to open-source data engineering projects.
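Illustrative only: a minimal pandas sketch of the data-validation step mentioned above, applying simple quality rules before a batch is loaded. The file, column names, and rules are assumptions.

```python
"""Minimal batch data-quality gate. File, columns, and rules are
hypothetical examples of pre-load validation."""

import pandas as pd

df = pd.read_parquet("batch.parquet")  # hypothetical staged batch

errors = []

# Rule 1: primary key must be non-null and unique.
if df["order_id"].isna().any() or df["order_id"].duplicated().any():
    errors.append("order_id fails null/uniqueness checks")

# Rule 2: amounts must be non-negative.
if (df["amount"] < 0).any():
    errors.append("negative amounts found")

# Rule 3: dates must fall within the expected load window.
if not df["order_date"].between("2024-01-01", "2030-12-31").all():
    errors.append("order_date outside expected range")

if errors:
    raise ValueError("data quality failed: " + "; ".join(errors))
print(f"batch OK: {len(df)} rows")
```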

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Matillion is The Data Productivity Cloud. We are on a mission to power the data productivity of our customers and the world, by helping teams get data business ready, faster. Our technology allows customers to load, transform, sync, and orchestrate their data. We are looking for passionate, high-integrity individuals to help us scale up our growing business. Together, we can make a dent in the universe bigger than ourselves.

With offices in the UK, US, and Spain, we are now thrilled to announce the opening of our new office in Hyderabad, India. This marks an exciting milestone in our global expansion, and we are now looking for talented professionals to join us as part of our founding team.

About the Role:
We're on the hunt for a Senior Quality Engineer to help us push the boundaries of SaaS quality assurance while integrating cutting-edge Generative AI (GenAI) technologies. At the heart of our engineering culture is a commitment to delivering high-quality, scalable, and secure software, and we believe quality should be a shared responsibility across the team.

As a Senior Quality Engineer, you'll play a pivotal role in shaping and maintaining quality throughout the SDLC, combining deep QA expertise with a passion for emerging AI technologies. You'll work hands-on with cross-functional teams, take ownership of our testing strategies, and guide the adoption of best practices across automation, GenAI evaluations, and testing frameworks.

What You'll Be Doing:
- Champion a shift-left testing culture by embedding quality across every phase of our software development lifecycle.
- Design, build, and maintain robust test automation frameworks for functional, integration, performance, security, and accessibility testing.
- Collaborate closely with product managers, developers, and AI engineers to test and evaluate LLM-integrated features using both traditional and GenAI-specific testing methods.
- Implement tools and techniques to test LLM-based functionality, including prompt evaluation, RAG-enhanced responses, and API behavior (see the sketch after this listing).
- Drive the use of modern CI/CD pipelines, ensuring testing is efficient, scalable, and automated wherever possible.
- Analyze, interpret, and share meaningful insights from test results, especially when comparing LLM model responses, prompt outcomes, and system behavior.
- Mentor team members in test strategies, automation practices, and GenAI testing methodologies.
- Continuously assess and improve the quality landscape, identifying opportunities to increase speed, accuracy, and innovation in how we test.

What We're Looking For:

Core QA Expertise:
- Strong experience with end-to-end testing of distributed systems and SaaS applications.
- Deep understanding of testing methodologies, including exploratory, regression, risk-based, performance, and security testing.
- Skilled in writing automated tests using tools like Cypress, Pact, or similar frameworks, with coding experience in Java/JavaScript.
- Solid understanding of the Test Pyramid, CI/CD pipelines, and agile methodologies (Scrum, Kanban).
- Experience working with cloud platforms like AWS, GCP, or Azure, plus comfort with SQL and database testing.
- A team player with a QA-first mindset and a drive to elevate engineering standards across the board.

GenAI/LLM Knowledge (Essential):
- Hands-on experience working with LLM APIs like OpenAI, Anthropic, or HuggingFace.
- Familiarity with prompt engineering and fine-tuning prompts for desired outcomes.
- Practical experience using Retrieval-Augmented Generation (RAG) to enrich LLM responses.
- Understanding of key GenAI concepts like system/user prompts, tokens, embeddings, temperature, top-p, context windows, and stop sequences.

Nice-to-Have:
- Experience with fine-tuning LLMs or training smaller models for specific tasks.
- Familiarity with prompt compression techniques and advanced GenAI patterns like prompt chaining, agentic workflows, or LLM routing.
- Exposure to abstraction libraries such as Spring AI or similar.
- Experience with LLM eval frameworks to benchmark model performance and prompt effectiveness.

Matillion has fostered a culture that is collaborative, fast-paced, ambitious, and transparent, and an environment where people genuinely care about their colleagues and communities. Our 6 core values guide how we work together and with our customers and partners. We operate a truly flexible and hybrid working culture that promotes work-life balance, and are proud to be able to offer the following benefits:
- Company equity
- 27 days paid time off
- 12 days of company holiday
- 5 days paid volunteering leave
- Group Mediclaim (GMC)
- Enhanced parental leave policies
- MacBook Pro
- Access to various tools to aid your career development

More about Matillion:
Thousands of enterprises, including Cisco, DocuSign, Slack, and TUI, trust Matillion technology to load, transform, sync, and orchestrate their data for a wide range of use cases, from insights and operational analytics to data science, machine learning, and AI. With over $300M raised from top Silicon Valley investors, we are on a mission to power the data productivity of our customers and the world. We are passionate about doing things in a smart, considerate way. We're honoured to be named a great place to work for several years running by multiple industry research firms. We are dual headquartered in Manchester, UK, and Denver, Colorado.

We are keen to hear from prospective Matillioners, so even if you don't feel you match all the criteria, please apply and a member of our Talent Acquisition team will be in touch. Alternatively, if you are interested in Matillion but don't see a suitable role, please email talent@matillion.com.

Matillion is an equal opportunity employer. We celebrate diversity and we are committed to creating an inclusive environment for all of our team. Matillion prohibits discrimination and harassment of any type. Matillion does not discriminate on the basis of race, colour, religion, age, sex, national origin, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by law.
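Illustrative only: a minimal sketch of the prompt-evaluation work described above, sending the same prompt at two temperatures and applying a toy automatic check. It uses the openai Python client (v1+); the model name and the assertion rule are assumptions, and a real evaluation would use a proper eval framework and many cases.

```python
"""Minimal LLM prompt-evaluation sketch. Reads OPENAI_API_KEY from the
environment; model and pass/fail rule are hypothetical."""

from openai import OpenAI

client = OpenAI()
PROMPT = "Summarize what an ETL pipeline does in one sentence."

def ask(temperature: float) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model
        messages=[{"role": "user", "content": PROMPT}],
        temperature=temperature,
    )
    return resp.choices[0].message.content

for temp in (0.0, 1.0):
    answer = ask(temp)
    # Toy evaluation rule: the summary should mention the core verbs.
    passed = all(word in answer.lower() for word in ("extract", "load"))
    print(f"temperature={temp}: passed={passed}\n{answer}\n")
```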

Posted 2 days ago

Apply

2.0 years

0 Lacs

Kochi, Kerala, India

On-site


Junior Azure Cloud Engineer

Responsibilities:
- Implement scalable, secure, and highly available Azure infrastructure.
- Manage and optimize Azure services including VMs, Azure Functions, App Services, Azure SQL, Storage Accounts, and VNets.
- Monitor and maintain cloud environments using Azure Monitor, Log Analytics, and Application Insights.
- Migrate workloads between Azure environments, from on-premises to Azure, and from other clouds to Azure.
- Troubleshoot and resolve infrastructure and application-related issues in a cloud environment.
- Work closely with sales and customer teams during the requirement-gathering and deployment phases.

Requirements:
- 2+ years of IT experience, with at least 1 year in Azure cloud.
- Understanding of Azure IaaS, PaaS, and networking services.
- Understanding of M365 and O365.
- Experience migrating workloads and deploying infrastructure in Azure.
- Experience with monitoring, alerting, and performance tuning in Azure.

What we expect from you:
- University degree or equivalent.
- Azure certifications.
- Experience with hybrid cloud environments.
- Knowledge of cost optimization strategies in Azure.
- Exposure to other cloud platforms (AWS, GCP) is a plus.

What you've got:
- Strong communication and documentation skills.
- Ability to work independently and in a team.
- Analytical mindset with a problem-solving attitude.

Posted 2 days ago

Apply

1.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What you'll do
This position is at the forefront of Equifax's post-cloud transformation, focusing on developing and enhancing Java applications within the Google Cloud Platform (GCP) environment. The ideal candidate will combine strong Java development skills with cloud expertise to drive innovation and improve existing systems.

Key Responsibilities
- Design, develop, test, deploy, maintain, and improve software applications on GCP
- Enhance existing applications and contribute to new initiatives leveraging cloud-native technologies
- Implement best practices in serverless computing, microservices, and cloud architecture
- Collaborate with cross-functional teams to translate functional and technical requirements into detailed architecture and design
- Participate in code reviews and maintain high development and security standards
- Provide technical oversight and direction for Java and GCP implementations

What Experience You Need
- Bachelor's or Master's degree in Computer Science or equivalent experience
- 1+ years of IT experience with a strong focus on Java development
- Experience in modern Java development and cloud computing concepts
- Familiarity with agile methodologies and test-driven development (TDD)
- Strong understanding of software development best practices, including continuous integration and automated testing

What could set you apart
- Experience with GCP or other cloud platforms (AWS, Azure)
- Active cloud certifications (e.g., Google Cloud Professional certifications)
- Experience with big data technologies (Spark, Kafka, Hadoop) and NoSQL databases
- Knowledge of containerization and orchestration tools (Docker, Kubernetes)
- Familiarity with the financial services industry
- Experience with open-source frameworks (Spring, Ruby, Apache Struts, etc.)
- Experience with Python

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.

We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


About Beyond Key
We are a Microsoft Gold Partner and a Great Place to Work-certified company. "Happy Team Members, Happy Clients" is a principle we hold dear. We are an international IT consulting and software services firm committed to providing cutting-edge services and products that satisfy our clients' global needs. Our company was established in 2005, and since then we've grown to more than 350 talented, skilled software professionals. Our clients come from the United States, Canada, Europe, Australia, the Middle East, and India, and we create and design IT solutions for them. For more details, visit https://www.beyondkey.com/about.

Job Description
We are looking for a highly skilled and detail-oriented Senior QA Engineer with 5+ years of experience in software product testing. The ideal candidate will have a deep understanding of QA methodologies, strong analytical skills, and experience working in an Agile environment. You will be responsible for ensuring the quality of our product through the development and execution of test plans, automation scripts, and continuous collaboration with cross-functional teams.

Key Responsibilities
- Design, develop, and execute manual test cases for web and mobile applications.
- Collaborate with Product Managers, Developers, and UX/UI Designers to ensure product quality and usability.
- Participate in all phases of the software development life cycle (SDLC) and Agile/Scrum ceremonies.
- Identify, document, and track defects and issues using bug tracking tools (e.g., DevOps).
- Conduct regression, smoke, and performance testing to validate software stability and performance.
- Maintain test documentation, including test plans, test cases, and test reports.
- Contribute to the continuous improvement of QA processes and best practices.
- Mentor junior QA team members when needed.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in QA testing of enterprise software products or SaaS applications.
- Proficient in manual testing with strong analytical and problem-solving skills.
- Familiarity with REST APIs and tools like Postman for API testing.
- Solid understanding of Agile methodologies and tools like Jira, Confluence, and DevOps.
- Knowledge of SQL and the ability to validate data against the database.
- Excellent communication and documentation skills.

Preferred Qualifications
- Experience with CI/CD pipelines and tools such as Jenkins and GitLab CI.
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Experience with performance testing tools like JMeter or LoadRunner.
- ISTQB or equivalent certification is a plus.

Posted 2 days ago

Apply

4.0 - 6.0 years

11 - 18 Lacs

Bengaluru

Hybrid


Key Skills: .NET Core, Java, C#, Automated Testing, GCP, PostgreSQL

Roles and Responsibilities:
- Lead the design of complex software development features and ensure solutions are scalable, effective, and maintainable.
- Collaborate with solution managers, designers, and other teams to gather requirements, translate them into technical specifications, and ensure alignment with priorities and project goals.
- Analyze and solve complex technical problems, identify bottlenecks, and prepare technical documentation to optimize system performance.
- Facilitate code reviews, provide constructive feedback, and lead by example in code quality, development best practices, and problem-solving approaches.
- Ensure code meets functional and performance requirements, advocate for high-quality software, and enforce rigorous testing processes, including automated unit tests, integration tests, and other testing frameworks.
- Leverage common GenAI tools for AI-assisted development and understand the basics of prompt engineering.
- Perform other job duties as assigned.

The majority of the work is focused on individual contributor objectives but may include formal oversight or mentorship of other individual contributors. If directly responsible for oversight of other individual contributors, additional management responsibilities may include hiring, training, and communication related to performance management and compensation decisions. This job is responsible for all stages of the software development lifecycle, using a variety of technologies and tools to build impactful software solutions. The scope of this job includes building and optimizing comprehensive solutions that prioritize end-user efficiency and experience.

Experience Requirements:
- Proven experience leading the design and implementation of scalable, maintainable, and effective software solutions.
- Strong background in collaborating across cross-functional teams to gather and translate requirements into well-defined technical specifications.
- Demonstrated ability to solve complex technical challenges and optimize system performance.
- Experience conducting thorough code reviews and promoting best practices in software development.
- Proficiency in testing methodologies including unit, integration, and system testing using automation frameworks.
- Familiarity with GenAI tools and a foundational understanding of prompt engineering.
- Previous experience mentoring or providing oversight to other engineers is a plus.
- Hands-on experience across the full software development lifecycle, using modern tools and technologies to create user-centric solutions.

Qualifications: B.Tech in any software branch.

Posted 2 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Java Developer - Software Engineer
Experience: 4-9 Years
Location: Chennai (Hybrid)
Interview: Face-to-face
Mandatory: Java, Spring Boot, Microservices, React JS, AWS Cloud, DevOps; Node.js is an added advantage

Job Description:
- Overall 4+ years of experience in Java development projects
- 3+ years of development experience with React
- 2+ years of experience in AWS Cloud and DevOps
- Microservices development using Spring Boot
- Technical stack: Core Java, Java, J2EE, Spring, MongoDB, GKE, Terraform, GitHub, GCP, Kubernetes, Scala, Kafka
- Technical tools: Confluence/Jira/Bitbucket or Git, CI/CD (Maven, Git, Jenkins), Eclipse or IntelliJ IDEA
- Experience in event-driven architectures (CQRS and SAGA patterns)
- Experience in design patterns
- Build tools (Gulp, Webpack), Jenkins, Docker, Automation, Bash, Redis, Elasticsearch, Kibana
- Technical stack (UI): JavaScript, React JS, CSS/SCSS, HTML5, Git

Posted 2 days ago

Apply

40.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Iamneo
Founded in 2016 and now part of the NIIT family, iamneo is a fast-growing, profitable B2B EdTech SaaS company that's transforming how tech talent is upskilled, evaluated, and deployed. Our AI-powered learning and assessment platforms help enterprises and educational institutions build future-ready talent at scale. We specialize in Talent Upskilling, Assessment, and Workforce Transformation across sectors like ITeS, BFSI, and Higher Education. Our solutions are trusted by top corporates such as Wipro, HCLTech, LTIMindtree, Virtusa, Tech Mahindra, and Hexaware, and by over 150 leading institutions including BITS Pilani, VIT, SRM, LPU, and Manipal.

As an NIIT Venture, we're backed by NIIT's 40+ years of legacy in learning and talent development, combining their global reputation and deep domain expertise with our AI-first, product-driven approach to modern upskilling. If you are passionate about innovation, growth, and redefining the future of tech learning, iamneo is the place for you.

About The Role
We're looking for a Senior DevOps & Cloud Operations Engineer who can take end-to-end ownership of our cloud infrastructure and DevOps practices, with proven expertise in both Google Cloud Platform (GCP) and Microsoft Azure. This role is critical to driving scalable, secure, and high-performance deployment environments for our applications. If you thrive in a multi-cloud, automation-first environment and enjoy building robust systems that scale, we'd love to hear from you.

🔧 What You'll Do
- Architect, deploy, and manage scalable, secure, and highly available cloud infrastructure
- Lead infrastructure optimization initiatives including performance tuning, cost control, and capacity planning
- Design and implement CI/CD pipelines using tools like Jenkins, GitHub Actions, Cloud Build, or similar
- Automate infrastructure provisioning and configuration using Terraform, Ansible, or similar tools
- Manage containerized environments using Docker and Kubernetes, with best practices for orchestration and lifecycle management
- Work with microservice-based architectures and support seamless deployment workflows
- Implement configuration management using tools such as Terraform, Ansible, or others
- Set up and maintain monitoring, alerting, and logging systems (e.g., Prometheus, Grafana, Azure Monitor, Sentry, New Relic)
- Write automation and operational scripts in Bash, Python, or equivalent scripting languages
- Ensure security controls, compliance, and DevSecOps practices are implemented across environments
- Conduct regular infrastructure audits, backups, and disaster recovery drills
- Troubleshoot and resolve infrastructure-related issues proactively
- Collaborate with product and development teams to align infrastructure with application and business needs
- Support platform transitions, version upgrades, and cloud migration efforts
- Mentor junior engineers and promote DevOps best practices across teams

✅ What We're Looking For
- 5+ years of hands-on experience in DevOps, cloud infrastructure, and system reliability
- Strong experience across cloud platforms, ideally with exposure to both GCP and Azure
- Proven expertise in CI/CD, infrastructure-as-code, and container orchestration
- Proficiency in scripting using Bash, Python, or similar languages
- Solid understanding of cloud-native and microservices architectures
- Strong problem-solving, documentation, and communication skills
- High ownership mindset and ability to work in fast-paced environments

🌟 Bonus Points For
- GCP and/or Azure certifications
- Experience with Agile and DevOps cultural practices
- Prior experience deploying Node.js, Python, or similar web applications

Skills: Azure Monitor, Bash, Python, GCP, Jenkins, Ansible, Sentry, Kubernetes, New Relic, Grafana, infrastructure, CI/CD, Microsoft Azure, DevOps, Docker, Cloud Build, cloud, Azure, Prometheus, Terraform, Google Cloud Platform (GCP), GitHub Actions

Posted 2 days ago

Apply

Exploring GCP Jobs in India

The job market for Google Cloud Platform (GCP) professionals in India is growing rapidly as more companies move toward cloud-based solutions. GCP offers a wide range of services and tools that help businesses manage their infrastructure, data, and applications in the cloud, which has created strong demand for skilled professionals who can work with GCP effectively.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for GCP professionals in India varies based on experience and job role. Entry-level positions can expect a salary range of INR 5-8 lakhs per annum, while experienced professionals can earn anywhere from INR 12-25 lakhs per annum.

Career Path

Typically, a career in GCP progresses from a Junior Developer to a Senior Developer, then to a Tech Lead position. As professionals gain more experience and expertise in GCP, they can move into roles such as Cloud Architect, Cloud Consultant, or Cloud Engineer.

Related Skills

In addition to GCP, professionals in this field are often expected to have skills in:

  • Cloud computing concepts
  • Programming languages such as Python, Java, or Go (see the example below)
  • DevOps tools and practices
  • Networking and security concepts
  • Data analytics and machine learning
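
To make the "GCP plus a programming language" combination concrete, here is a minimal Python sketch using the official google-cloud-storage client library to list the objects in a Cloud Storage bucket. Treat it as an illustration only: "my-example-bucket" is a placeholder, and the snippet assumes you have installed the library (pip install google-cloud-storage) and configured Application Default Credentials (for example, via gcloud auth application-default login).

    # List the objects in a Cloud Storage bucket.
    # Assumes Application Default Credentials are configured and that
    # "my-example-bucket" (a placeholder) is a bucket you can read.
    from google.cloud import storage

    def list_bucket_objects(bucket_name: str) -> list[str]:
        # The client picks up the project and credentials from the environment.
        client = storage.Client()
        return [blob.name for blob in client.list_blobs(bucket_name)]

    if __name__ == "__main__":
        for name in list_bucket_objects("my-example-bucket"):
            print(name)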

Interview Questions

  • What is Google Cloud Platform and its key services? (basic)
  • Explain the difference between Google Cloud Storage and Google Cloud Bigtable. (medium)
  • How would you optimize costs in Google Cloud Platform? (medium)
  • Describe a project where you implemented CI/CD pipelines in GCP. (advanced)
  • How does Google Cloud Pub/Sub work and when would you use it? (medium) — see the sketch after this list
  • What is Cloud Spanner and how is it different from other database services in GCP? (advanced)
  • Explain the concept of IAM and how it is implemented in GCP. (medium)
  • How would you securely transfer data between different regions in GCP? (advanced)
  • What is Google Kubernetes Engine (GKE) and how does it simplify container management? (medium)
  • Describe a scenario where you used Google Cloud Functions in a project. (advanced)
  • How do you monitor performance and troubleshoot issues in GCP? (medium)
  • What is Google Cloud SQL and when would you choose it over other database options? (medium)
  • Explain the concept of VPC (Virtual Private Cloud) in GCP. (basic)
  • How do you ensure data security and compliance in GCP? (medium)
  • Describe a project where you integrated Google Cloud AI services. (advanced)
  • What is the difference between Google Cloud CDN and Google Cloud Load Balancing? (medium)
  • How do you handle disaster recovery and backups in GCP? (medium)
  • Explain the concept of auto-scaling in GCP and when it is useful. (medium)
  • How would you set up a multi-region deployment in GCP for high availability? (advanced)
  • Describe a project where you used Google Cloud Dataflow for data processing. (advanced)
  • What are the best practices for optimizing performance in Google Cloud Platform? (medium)
  • How do you manage access control and permissions in GCP? (medium)
  • Explain the concept of serverless computing and how it is implemented in GCP. (medium)
  • What is the difference between Google Cloud Identity and Access Management (IAM) and AWS IAM? (advanced)
  • How do you ensure data encryption at rest and in transit in GCP? (medium)
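
Several of these questions are easiest to answer with a small example. As one illustration for the Pub/Sub question above, here is a hedged Python sketch using the official google-cloud-pubsub client: a publisher sends a message to a topic, and a subscriber callback receives and acknowledges it. The project, topic, and subscription IDs are placeholders, and the sketch assumes the topic and subscription already exist and that credentials are configured.

    # Publish one message, then pull messages for a few seconds.
    # Placeholders: "my-project", "my-topic", "my-subscription".
    from concurrent.futures import TimeoutError
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "my-topic")
    # publish() returns a future; result() blocks until the server
    # assigns a message ID.
    message_id = publisher.publish(topic_path, b"hello from GCP").result()
    print("Published message", message_id)

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-project", "my-subscription")

    def callback(message):
        print("Received:", message.data)
        message.ack()  # unacknowledged messages are redelivered

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=10)  # run the demo for ten seconds
    except TimeoutError:
        streaming_pull.cancel()

The point to stress in an interview is the decoupling: publishers and subscribers never talk to each other directly, and Pub/Sub buffers, retries, and fans out messages until each subscription acknowledges them.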

Closing Remark

As the demand for GCP professionals continues to rise in India, now is the perfect time to upskill and pursue a career in this field. By mastering GCP and related skills, you can unlock numerous opportunities and build a successful career in cloud computing. Prepare well, showcase your expertise confidently, and land your dream job in the thriving GCP job market in India.


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ portals in one click.

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
