
66 Cloud Functions Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a member of the Platform Observability Engineering team within Ford's Data Platforms and Engineering (DP&E) organization, you will contribute to building and maintaining a top-tier platform for monitoring and observability. This platform focuses on the four golden signals (latency, traffic, errors, and saturation), providing essential data to support operations, root cause analysis, continuous improvement, and cost optimization. You will collaborate with platform architects to help design, develop, and maintain a scalable and reliable platform, ensuring smooth integration with systems used across various teams. Your contributions will be key in improving MTTR and MTTX through increased visibility into system performance. Working with stakeholders, you will integrate observability data into their workflows, develop insightful dashboards and reports, continuously improve platform performance and reliability, optimize costs, and stay updated with industry best practices and technologies. The role focuses on building and maintaining a robust platform rather than developing individual monitoring tools, creating a centralized, reliable source of observability data that empowers data-driven decisions and accelerates incident response across the organization.

Responsibilities:
- Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices supporting real-time and batch processing on GCP.
- Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures for modular, flexible, and maintainable data solutions.
- Full-Stack Integration: Contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
- Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring standardized and optimized data for analytics.
- GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms meeting business needs.
- Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP's native security features.
- Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaboration and Best Practices: Define best practices, design patterns, and frameworks for cloud data engineering by closely working with data architects, software engineers, and cross-functional teams.
- Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.

Qualifications:
- Technical Skills: Proficiency in Java, Angular, or any JavaScript technology, with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
- Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies.
- Full-Stack Development: Knowledge of front-end and back-end technologies (e.g., React, Node.js) enabling collaboration on data access and visualization layers.
- Design and develop RESTful APIs for seamless integration across platform services.
- Implement robust unit and functional tests to maintain high standards of test coverage and quality.
- Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
- Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
- CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks.
- Manage code changes with GitHub and troubleshoot and resolve application defects efficiently.
- Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases.
- Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.

Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud
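The posting is a list of requirements rather than sample code, but to make the Pub/Sub-to-BigQuery pipeline idea concrete, here is a minimal, hypothetical Python sketch of a Pub/Sub-triggered Cloud Function that ingests one golden-signal sample. The project, topic, and table names are invented placeholders, not part of the posting.

```python
import base64
import json

import functions_framework
from google.cloud import bigquery

# Hypothetical destination table: project.dataset.table (placeholder).
METRICS_TABLE = "example-project.observability.golden_signals"

bq_client = bigquery.Client()


@functions_framework.cloud_event
def ingest_metric(cloud_event):
    """Decode one Pub/Sub metric sample and stream it into BigQuery."""
    raw = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    sample = json.loads(raw)  # e.g. {"service": "checkout", "signal": "latency", "value_ms": 182}

    errors = bq_client.insert_rows_json(METRICS_TABLE, [sample])
    if errors:
        # Raising lets Pub/Sub redeliver the message for another attempt.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```

Deployed with a Pub/Sub trigger, each published metric message lands as one BigQuery row that dashboards and reports can query.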

Posted 12 hours ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a Principal Engineer at Walmart's Enterprise Business Services, you will play a pivotal role in shaping the engineering direction, driving architectural decisions, and ensuring the delivery of scalable, secure, and high-performing solutions across the platform. Your responsibilities will include leading the design and development of full stack applications, architecting complex cloud-native systems on Google Cloud Platform (GCP), defining best practices, and guiding engineering excellence. You will have the opportunity to work on crafting frontend experiences, building robust backend APIs, designing cloud infrastructure, and influencing the technical vision of the organization. Collaboration with product, design, and data teams to translate business requirements into scalable tech solutions will be a key aspect of your role. Additionally, you will champion CI/CD pipelines, Infrastructure as Code (IaC), and drive code quality through rigorous design reviews and automated testing. To be successful in this role, you are expected to bring 10+ years of experience in full stack development, with at least 2+ years in a technical leadership or principal engineering role. Proficiency in JavaScript/TypeScript, Python, or Go, along with expertise in modern frontend frameworks like React, is essential. Strong experience in cloud-native systems on GCP, microservices architecture, Docker, Kubernetes, and event-driven systems is required. Your role will also involve managing production-grade cloud systems, working with SQL and NoSQL databases, and staying ahead of industry trends by evaluating new tools and frameworks. Exceptional communication, leadership, and collaboration skills are crucial, along with a GCP Professional Certification and experience with serverless platforms and observability tools. Joining Walmart Global Tech means being part of a team that makes a significant impact on millions of people's lives through innovative technology solutions. You will have the opportunity to work in a flexible, hybrid environment that promotes collaboration and personal development. In addition to a competitive compensation package, Walmart offers various benefits and a culture that values diversity, inclusion, and belonging for all associates. As an Equal Opportunity Employer, Walmart fosters a workplace where unique styles, experiences, and identities are respected and valued, creating a welcoming environment for all.,

Posted 2 days ago

Apply

15.0 - 22.0 years

0 Lacs

karnataka

On-site

You will be responsible for solution design, client engagement, and delivery oversight, along with senior stakeholder management. Your role will involve leading and driving Google Cloud solutioning for customer requirements, RFPs, proposals, and delivery. You will establish governance frameworks, delivery methodologies, and reusable assets to scale the practice. It is essential to have the ability to take initiative and deliver in challenging engagements spread across multiple geographies. Additionally, you will lead the development of differentiated capabilities and offerings in areas such as application modernization & migration, cloud-native development, and AI agents. Collaboration with sales and pre-sales teams to shape solutions and win strategic deals, including large-scale application modernization and migrations will be a key aspect of your role. You will spearhead Google Cloud's latest products & services like AgentSpace, AI Agent development using GCP-native tools such as ADK, A2A Protocol, and Model Context Protocol. Building and mentoring a high-performing team of cloud architects, engineers, and consultants will also be part of your responsibilities. Driving internal certifications, specialization audits, and partner assessments to maintain Google Cloud Partner status is crucial. Representing the organization in partner forums, webinars, and industry/customer events is also expected from you. To qualify for this role, you should have 15+ years of experience in IT, with at least 3 years in Google Cloud applications architecture, design, and solutioning. Additionally, a minimum of 5 years of experience in designing and developing Java applications/platforms is required. Deep expertise in GCP services including Compute, Storage, BigQuery, Cloud Functions, Anthos, and Vertex AI is essential. Proven experience in leading Google Cloud transformation programs, strong solution architecture, and implementation experience for Google Cloud modernization & migration programs are important qualifications. Strong experience in stakeholder and client management is also necessary. Being Google Cloud certified with the Google Cloud Professional architect certification, self-motivated to quickly learn new technologies and platforms, and possessing excellent presentation and communication skills are crucial for this role. Preferred qualifications include Google Cloud Professional certifications (Cloud Architect), experience with partner ecosystems, co-selling with Google, and managing joint GTM motions, as well as exposure to regulated industries (e.g., BFSI, Healthcare) and global delivery models.,

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

You will be responsible for designing and building sophisticated and highly scalable apps using Flutter. Your role will involve building custom packages in Flutter by utilizing functionalities and APIs available in native Android and iOS. It will be your responsibility to translate designs and wireframes into high-quality responsive UI code. You will explore feasible architectures for implementing new features and ensure that best practices are followed throughout the development process. Keeping everything structured and well-documented will be crucial. Managing the code and project on Git is essential to keep in sync with other team members and managers. You will be required to communicate with the Project Manager regarding project status and suggest appropriate deadlines for new functionalities. Ensuring that security guidelines are always followed while developing the app is of utmost importance. Your role will involve maintaining software through the product lifecycle, including design, development, verification, and bug fixes. Performing time profiling and memory leak assessment will be part of your responsibilities. Your expertise in Flutter will be utilized to build cross-platform mobile apps for Android, iOS, and Web. This will include creating responsive UIs to efficiently query data and manage state in an optimized manner. Experience with Firebase, specifically Cloud Firestore, Push Notifications, Cloud Functions, and Analytics, will be required. Proficiency in Adobe XD is necessary to utilize design files and build the app accordingly. Git will be used to manage and collaborate on different projects with the rest of the team. Experience with continuous integration and deploying apps to cloud platforms such as AWS, Azure, or others will be beneficial for this role.

Posted 3 days ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

To develop a secure, low-latency Google Home integration system that connects voice commands to Firebase Realtime Database operations, enabling users to control smart devices like the Neon SmartPlug through natural speech.

Scope of Work:

1. Google Assistant Integration: Create an Actions on Google project (using Dialogflow or the latest Actions SDK / Google Smart Home platform). Enable account linking via OAuth2 / Firebase Authentication OTP verification. Implement voice intents for turning devices ON/OFF, setting schedules or timers, fetching the status of a device, and custom interactions (e.g., "Is my plug on?").

2. Firebase Realtime Database Integration: Sync device states in the Firebase Realtime Database. Set up a secure and cost-efficient data structure. Implement optimized Cloud Functions for intent fulfillment, updating device state, fetching real-time status, and logging user actions (optional analytics).

3. Cloud Functions (Node.js / TypeScript): Write backend code to parse and respond to Assistant requests, validate user sessions (via uid and linked identity), prevent race conditions with concurrent writes, and handle fallback or unknown commands.

4. Firebase Security & User Validation: Define Firebase Rules to restrict read/write based on user uid, device ownership, and action scope. Ensure cross-user access is completely blocked. Implement access token validation.

5. Multi-user & Multi-device Support: Support simultaneous sessions. Structure DB nodes for each user with isolation: /users/uid/devices/device_id/status

6. Testing & Validation: Unit test Cloud Functions. Test integration with multiple Google Accounts and with Google Home and Android devices.
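The posting specifies Node.js / TypeScript Cloud Functions for the actual fulfillment; purely to illustrate the per-user node layout (/users/{uid}/devices/{device_id}/status) and ownership isolation described above, here is a small Python sketch using the firebase-admin SDK. The database URL is a placeholder assumption.

```python
import firebase_admin
from firebase_admin import credentials, db

# Placeholder database URL; the real one comes from the Firebase console.
firebase_admin.initialize_app(
    credentials.ApplicationDefault(),
    {"databaseURL": "https://example-project-default-rtdb.firebaseio.com"},
)


def set_plug_state(uid: str, device_id: str, on: bool) -> None:
    """Write state only under the caller's own node: /users/{uid}/devices/{device_id}/status."""
    db.reference(f"users/{uid}/devices/{device_id}/status").update({"on": on})


def get_plug_state(uid: str, device_id: str) -> dict:
    """Read the same node back; security rules must still block cross-user paths server-side."""
    return db.reference(f"users/{uid}/devices/{device_id}/status").get() or {}
```

In the real system the uid would come from the verified access token on each Assistant request, never from the request body, so one user can never address another user's node.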

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

noida, uttar pradesh

On-site

You should have 6-10 years of experience in development, specifically in Java/J2EE, with strong knowledge of core Java. Additionally, you must be proficient in Spring frameworks, particularly Spring MVC, Spring Boot, and JPA + Hibernate. It is essential to have hands-on experience with microservice technology, including development of RESTful and SOAP web services. A good understanding of Oracle DB is required. Your communication skills, especially when interacting with clients, should be excellent. Experience with build tools like Maven, deployment, and troubleshooting issues is necessary. Knowledge of CI/CD tools such as Jenkins and experience with Git or similar source control tools is expected. You should also be familiar with Agile/Scrum software development methodologies using tools like Jira, Confluence, and Bitbucket, and have experience performing requirement analysis. It would be beneficial to have knowledge of frontend stacks like React or Angular, as well as frontend and backend API integration. Experience with AWS, CI/CD best practices, and designing security reference architectures for AWS infrastructure applications is advantageous. You should possess good verbal and written communication skills, the ability to multitask in a fast-paced environment, and be highly organized and detail-oriented. Awareness of common information security principles and practices is required. TELUS International is committed to creating a diverse and inclusive workplace and is an equal opportunity employer. All employment decisions are based on qualifications, merits, competence, and performance without regard to any characteristic related to diversity.

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

kolkata, west bengal

On-site

As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the Market Research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. Moreover, you will be actively involved in project management and ensuring timely delivery of projects. To excel in this role, you should have a minimum of 5 years of experience in software development, out of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential for this role. Your technical skills should encompass a wide range of areas including Cloud & Infrastructure (Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN), Development Stack (C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration), Data & Integration (SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration), CI/CD & IaC (Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing), Security & Compliance (TLS/SSL certificate management, API gateway policies, encryption standards), and Monitoring & Performance (Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load testing tools). Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, PMP or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience in building low-code/no-code integration platforms or automation engines is also beneficial. Exposure to alternative clouds like AWS or on-prem virtualization platforms like VMware and OpenShift will be a plus. Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.,

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply. Your responsibilities will include: - Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform. - Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components. - Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools. - Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence. - Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions. - Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments. - Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications. - Participating in on-call rotations for critical incident response and timely resolution of production issues. - Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures. - Keeping up-to-date with new GCP services, features, and industry best practices to propose and implement improvements. - Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization. We require you to have: - A Bachelors or Masters degree in Computer Science, Software Engineering, or a related field. - 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server. - 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications. - 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code. - Strong experience with Javascript and a modern Javascript framework, with VueJS preferred. - Proven leadership and mentoring skills with development teams. - Strong understanding of microservices architecture and serverless computing. - Experience with relational databases like SQL Server and PostgreSQL. - Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience. What can make you stand out: - GCP Cloud Certification. - UI development experience with HTML, JavaScript, Angular, and Bootstrap. - Agile environment experience with Scrum, XP. - Relational database experience with SQL Server, PostgreSQL. - Proficiency in Atlassian tools like JIRA, Confluence, and Github. - Working knowledge of Python and exceptional problem-solving and analytical abilities, along with strong teamwork skills.,

Posted 1 week ago

Apply

5.0 - 13.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP. You will leverage various GCP services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to ensure high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential. You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on Google Cloud Platform (GCP). In-depth expertise in GCP services mentioned above is required. Strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred. Familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus. Your role as a GCP Cloud Architect/Engineer will contribute to ensuring system reliability, backup, and disaster recovery strategies. This hybrid role is based out of Pune and requires a total of 10 to 13 years of relevant experience.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

punjab

On-site

As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also be designing and building ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud composer, Cloud BigQuery, Cloud Fusion, Cloud PubSub, Cloud storage, and Cloud Functions. Additionally, you will utilize Cloud Native GCP CLI/gsutil for operations and scripting languages like Python and SQL to enhance data processing efficiencies. Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will utilize GCP tools such as Cloud Data Catalog and GCP KMS tools to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.,

Posted 1 week ago

Apply

10.0 - 20.0 years

10 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

- Google Cloud certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration – Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging – Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

As an excellent hands-on engineer, you will be working on developing next-generation media server software as a part of a core team dedicated to revolutionizing video sharing technology over the internet. Your role will involve contributing significantly to the development of server-side components, providing a learning opportunity in the video streaming space. You should have good hands-on experience with AWS, solid programming skills in C/C++ and Python, along with knowledge of AWS services like Lambda, EFS, auto-scaling, and load balancing. Experience in building and provisioning dockerized applications is highly preferable, along with a good understanding of the HTTP protocol. Familiarity with Web Servers (Apache, Nginx), Databases (MySQL, Redis, MongoDB, Firebase), Python frameworks (Django, Flask), Source Control (Git), REST APIs, and strong understanding of memory management, file I/O, network I/O, concurrency, and multithreading is expected. Your specific responsibilities will include working on scalable video deployments, extending the Mobile Application Backend for customer-specific features, maintaining and extending existing software components in the Media Server software, and fostering a multi-paradigm engineering culture with a cross-functional team. To excel in this role, you should bring strong coding skills and experience with Python and cloud functions, at least 1-2 years of experience with AWS services and GitHub, 6 to 12 months of experience in S3 or other storage/CDN services, exposure to NoSQL databases for developing mobile backends, and proficiency in Agile and Jira tools. A BS or equivalent in Computer Science or Engineering is preferred. If you are ready to take on this exciting opportunity, please send your CV to careers@crunchmediaworks.com.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

About GlobalLogic GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide in designing and developing innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in various industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company. Requirements Leadership & Strategy As a part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures. Leadership Experience With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management, resource planning, and strong presentation and communication skills for executive-level reporting are essential for this role. Certifications (Preferred) Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications. Technical Excellence You should have over 10 years of experience in designing and implementing enterprise-scale Cloud Solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures. Job Responsibilities Technical Skills Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role. Cross-functional Collaboration Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role. 
What We Offer At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.,

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Senior Data Engineer - Big Data, ETL & Java
Experience Level: 5+ Years
Employment Type: Full-time

About the Role
EXL is seeking a Senior Software Engineer with a strong foundation in Java, along with expertise in Big Data technologies and ETL development. In this role, you'll design and implement scalable, high-performance data and backend systems for clients in retail, media, and other data-driven industries. You'll work across cloud platforms such as AWS and GCP to build end-to-end data and application pipelines.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL workflows using Apache Spark, Apache Airflow, and cloud platforms (AWS/GCP).
- Build and support Java-based backend components, services, or APIs as part of end-to-end data solutions.
- Work with large-scale datasets to support transformation, integration, and real-time analytics.
- Optimize Spark, SQL, and Java processes for performance, scalability, and reliability.
- Collaborate with cross-functional teams to understand business requirements and deliver robust solutions.
- Follow engineering best practices in coding, testing, version control, and deployment.

Required Qualifications
- 5+ years of hands-on experience in software or data engineering.
- Proven experience in developing ETL pipelines using Java and Spark.
- Strong programming experience in Java (preferably with frameworks such as Spring or Spring Boot).
- Experience in Big Data tools including Apache Spark, Apache Airflow, and cloud services such as AWS EMR, Glue, S3, Lambda or GCP BigQuery, Dataflow, Cloud Functions.
- Proficiency in SQL and experience with performance tuning for large datasets.
- Familiarity with data modeling, warehousing, and distributed systems.
- Experience working in Agile development environments.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills.

Preferred Qualifications
- Experience building and integrating RESTful APIs or microservices using Java.
- Exposure to data platforms like Snowflake, Databricks, or Kafka.
- Background in retail, merchandising, or media domains is a plus.
- Familiarity with CI/CD pipelines, DevOps tools, and cloud-based development workflows.
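As a rough illustration of the Airflow-orchestrated ETL style this role describes (assuming Airflow 2.x; the DAG, task, and step names are placeholders, not EXL code), a minimal daily pipeline wiring an extract step to a transform-and-load step might look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the day's raw events from object storage (e.g. S3/GCS).
    print("extracting raw events for", context["ds"])


def transform_and_load(**context):
    # Placeholder: submit a Spark job or run SQL that writes to the warehouse.
    print("transforming and loading partition", context["ds"])


with DAG(
    dag_id="daily_events_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)

    extract_task >> load_task
```

In practice the Python callables would hand off to Spark jobs or warehouse SQL; the DAG only expresses ordering, scheduling, and retries.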

Posted 1 week ago

Apply

10.0 - 18.0 years

12 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

- Google Cloud certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration – Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging – Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills.

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Developed cross-platform apps using Flutter and Dart, integrated Firebase for auth and Firestore for data, implemented Razorpay QR and RazorpayX Escrow for automated payments, and built secure, scalable, real-time features.

Posted 1 week ago

Apply

3.0 - 8.0 years

14 - 20 Lacs

Noida

Work from Office

Location: Noida (Onsite)
Department: Engineering
Employment Type: Full-time

Role Overview
We are looking for a Senior App Developer with at least 3 years of experience in building high-quality mobile applications using Flutter. You'll be responsible for designing and delivering scalable, performant, and responsive mobile apps for Android and iOS that support critical event-tech workflows. This is an onsite role based in Noida, ideal for someone who thrives in collaborative product-engineering environments and has a keen eye for user experience.

Key Responsibilities
- Build, maintain, and optimize mobile apps using Flutter & Dart for both Android and iOS platforms
- Implement state management solutions (Provider, Riverpod, Bloc, etc.) to manage shared app logic
- Integrate with REST APIs, Firebase services, and other backend platforms
- Ensure responsive UI across various screen sizes and mobile devices
- Manage authentication workflows including JWT-based login, session handling, and role-based access
- Handle app deployments via Play Console and App Store Connect, including signing, release builds, and updates
- Collaborate closely with product, design, and backend teams to deliver smooth, user-friendly features
- Write clean, maintainable, and well-documented code and participate in peer code reviews

Required Skills & Experience
- 3+ years of mobile app development experience, with at least 2 published Flutter apps
- Strong command of Flutter & Dart, along with state management using Provider
- Experience with Firebase services: Authentication, Firestore/Realtime Database, Cloud Functions, and Push Notifications
- Proficient in REST API integration, responsive UI/UX, and common mobile design patterns
- Familiar with JWT-based authentication flows and secure mobile app practices
- Hands-on experience with deployment processes for Android and iOS platforms
- Solid understanding of Git workflows and working in Agile/Scrum teams

Good to Have
- Experience with Riverpod, Bloc, or GetX for state management
- Knowledge of Clean Architecture or MVVM in Flutter
- Exposure to Firebase Analytics, Crashlytics, and Remote Config
- Familiarity with CI/CD tools (e.g., Codemagic, Bitrise) for automated builds and deployment
- Basic knowledge of native code (Swift/Kotlin) for platform-specific functionality
- Experience with app performance optimization using DevTools, isolates, etc.
- Understanding of unit and widget testing using Flutter's test framework
- Familiarity with design tools like Figma or Zeplin for UI handoff
- Experience integrating features like QR code rendering, PDF generation, or badge scanning systems

Annual CTC: INR 14 to 20 LPA

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As a member of the JM Financial team, you will be part of a culture that values recognition and rewards for the hard work and dedication of its employees. We believe that a motivated workforce is essential for the growth of our organization. Our management team acknowledges and appreciates the efforts of our personnel through promotions, bonuses, awards, and public recognition. By fostering an atmosphere of success, we celebrate achievements such as successful deals, good client ratings, and customer reviews. Nurturing talent is a key focus at JM Financial. We aim to prepare our employees for future leadership roles by creating succession plans and encouraging direct interactions with clients. Knowledge sharing and cross-functional interactions are integral to our business environment, fostering inclusivity and growth opportunities for our team members. Attracting and managing top talent is a priority for JM Financial. We have successfully built a diverse talent pool with expertise, new perspectives, and enthusiasm. Our strong brand presence in the market enables us to leverage the expertise of our business partners to attract the best talent. Trust is fundamental to our organization, binding our programs, people, and clients together. We prioritize transparency, two-way communication, and trust across all levels of the organization. Opportunities for growth and development are abundant at JM Financial. We believe in growing alongside our employees and providing them with opportunities to advance their careers. Our commitment to nurturing talent has led to the appointment of promising employees to leadership positions within the organization. With a focus on employee retention and a supportive environment for skill development, we aim to create a strong future leadership team. Emphasizing teamwork, we value both individual performance and collaborative group efforts. In a fast-paced corporate environment, teamwork is essential for achieving our common vision. By fostering open communication channels and facilitating information sharing, we ensure that every member of our team contributes to delivering value to our clients. As a Java Developer at JM Financial, your responsibilities will include designing, modeling, and building services to support new features and products. You will work on an integrated central platform to power various web applications, developing a robust backend framework and implementing features across different products using a combination of technologies. Researching and implementing new technologies to enhance our services will be a key part of your role. To excel in this position, you should have a BTech Degree in Computer Science or equivalent experience, with at least 3 years of experience building Java-based web applications in Linux/Unix environments. Proficiency in scripting languages such as JavaScript, Ruby, or Python, along with compiled languages like Java or C/C++, is required. Experience with Google Cloud Platform services, knowledge of design methodologies for backend services, and building scalable infrastructure are essential skills for this role. Our technology stack includes JavaScript, Angular, React, NextJS, HTML5/CSS3/Bootstrap, Windows/Linux/OSX Bash, Kookoo telephony, SMS Gupshup, Sendgrid, Optimizely, Mixpanel, Google Analytics, Firebase, Git, Bash, NPM, Browser Dev Console, NoSQL, Google Cloud Datastore, Google Cloud Platform (App Engine, PubSub, Cloud Functions, Bigtable, Cloud Endpoints). 
If you are passionate about technology and innovation, and thrive in a collaborative environment, we welcome you to join our team at JM Financial.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an AI/ML Engineer, you will be responsible for identifying, defining, and delivering AI/ML and GenAI use cases in collaboration with business and technical stakeholders. Your role will involve designing, developing, and deploying models using Google Cloud's Vertex AI platform. You will be tasked with fine-tuning and evaluating Large Language Models (LLMs) for domain-specific applications and ensuring responsible AI practices and governance in solution delivery. Collaboration with data engineers and architects is essential to ensure robust and scalable pipelines. It will be your responsibility to document workflows and experiments for reproducibility and handover readiness. Your expertise in supervised, unsupervised, and reinforcement learning will be applied to develop solutions using Vertex AI features including AutoML, Pipelines, Model Registry, and Generative AI Studio. In this role, you will work on GenAI workflows, which includes prompt engineering, fine-tuning, and model evaluation. Proficiency in Python is required for developing in ML frameworks such as TensorFlow, PyTorch, scikit-learn, and Hugging Face Transformers. Effective communication and collaboration across product, data, and business teams are crucial for the success of the projects. The ideal candidate should have hands-on experience with Vertex AI on GCP for model training, deployment, endpoint management, and MLOps. Practical knowledge of PaLM, Gemini, or other LLMs via Vertex AI or open-source tools is preferred. Proficiency in Python for ML pipeline scripting, data preprocessing, and evaluation is necessary. Expertise in ML/GenAI libraries like scikit-learn, TensorFlow, PyTorch, Hugging Face, and LangChain is expected. Experience with CI/CD for ML, containerization using Docker/Kubernetes, and familiarity with GCP services like BigQuery, Cloud Functions, and Cloud Storage are advantageous. Knowledge of media datasets and real-world ML applications in OTT, DTH, and Web platforms will be beneficial in this role. Qualifications required for this position include a Bachelors or Masters degree in Computer Science, Artificial Intelligence, Data Science, or related fields. The candidate should have at least 3 years of hands-on experience in ML/AI or GenAI projects. Any relevant certifications in ML, GCP, or GenAI technologies are considered a plus.,
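For illustration only (not code from this employer), a minimal Vertex AI generative call from Python might look like the sketch below. The project ID, region, and model name are assumptions, and the import path assumes a recent google-cloud-aiplatform release that ships the vertexai package.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region; real values come from the GCP project in use.
vertexai.init(project="example-project", location="us-central1")

# Assumed model name, used here purely as an example.
model = GenerativeModel("gemini-1.5-flash")

response = model.generate_content(
    "Summarize last week's OTT watch-time anomalies in two sentences."
)
print(response.text)
```

A production pipeline would wrap this call with prompt templates, evaluation, and logging, which is where the MLOps and responsible-AI responsibilities in the posting come in.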

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

Beinex is seeking a skilled and motivated Google Cloud Consultant to join our dynamic team. As a Google Cloud Consultant, you will play a pivotal role in assisting our clients in harnessing the power of Google Cloud technologies to drive innovation and transformation. If you are passionate about cloud solutions, client collaboration, and cutting-edge technology, we invite you to join our journey. Responsibilities - Collaborate with clients to understand their business objectives and technology needs, translating them into effective Google Cloud solutions - Design, implement, and manage Google Cloud Platform (GCP) architectures, ensuring scalability, security, and performance - Provide technical expertise and guidance to clients on GCP services, best practices, and cloud-native solutions and adopt an Infrastructure as Code (IaC) approach to establish an advanced infrastructure for both internal and external stakeholders - Conduct cloud assessments and create migration strategies for clients looking to transition their applications and workloads to GCP - Work with cross-functional teams to plan, execute, and optimise cloud migrations, deployments, and upgrades - Assist clients in optimising their GCP usage by analysing resource utilisation, recommending cost-saving measures, and enhancing overall efficiency - Collaborate with development teams to integrate cloud-native technologies and solutions into application design and development processes - Stay updated with the latest trends, features, and updates in the Google Cloud ecosystem and provide thought leadership to clients - Troubleshoot and resolve technical issues related to GCP services and configurations - Create and maintain documentation for GCP architectures, solutions, and best practices - Conduct training sessions and workshops for clients to enhance their understanding of GCP technologies and usage Key Skills Requirements - Profound expertise in Google Cloud Platform services, including but not limited to Compute Engine, App Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, VPC, IAM, and Cloud Security - Strong understanding of GCP networking concepts, including VPC peering, firewall rules, VPN, and hybrid cloud configurations - Experience with Infrastructure as Code (IaC) tools such as Terraform, Deployment Manager, or Google Cloud Deployment Manager - Hands-on experience with containerisation technologies like Docker and Kubernetes - Proficiency in scripting languages such as Python and Bash - Familiarity with cloud monitoring, logging, and observability tools and practices - Knowledge of DevOps principles and practices, including CI/CD pipelines and automation - Strong problem-solving skills and the ability to troubleshoot complex technical issues - Excellent communication skills to interact effectively with clients, team members, and stakeholders - Previous consulting or client-facing experience is a plus - Relevant Google Cloud certifications are highly desirable Perks: Careers at Beinex - Comprehensive Health Plans - Learning and development - Workation and outdoor training - Hybrid working environment - On-site travel Opportunity - Beinex Branded Merchandise,

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses. In this role, you should have developed/worked on at least one Gen AI project and have experience in data pipeline implementation with cloud providers such as AWS, Azure, or GCP. You should also be familiar with cloud storage, cloud database, cloud data warehousing, and Data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, and S3. Additionally, a good understanding of cloud compute services, load balancing, identity management, authentication, and authorization in the cloud is essential. Your profile should include a good knowledge of infrastructure capacity sizing, costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling. You should be able to contribute to making architectural choices using various cloud services and solution methodologies. Proficiency in programming using Python is required along with expertise in cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on the cloud. Understanding networking, security, design principles, and best practices in the cloud is also important. At Capgemini, we value flexible work arrangements to provide support for maintaining a healthy work-life balance. You will have opportunities for career growth through various career growth programs and diverse professions tailored to support you in exploring a world of opportunities. Additionally, you can equip yourself with valuable certifications in the latest technologies such as Generative AI. Capgemini is a global business and technology transformation partner with a rich heritage of over 55 years. We have a diverse team of 340,000 members in more than 50 countries, working together to accelerate the dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. Trusted by clients to unlock the value of technology, we deliver end-to-end services and solutions leveraging strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and partner ecosystem. Our global revenues in 2023 were reported at 22.5 billion.,

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

- Work with the team in the capacity of GCP Data Engineer on day-to-day activities
- Solve problems at hand with utmost clarity and speed
- Train and coach other team members
- Ability to turn around quickly
- Work with data analysts and architects to help them solve any specific issues with tooling/processes
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd parties: Python/Java/React.js, Airflow ETL skills, GCP services (BigQuery, Dataflow, Cloud SQL, Cloud Functions, Data Lake)
- Design and build production data pipelines from ingestion to consumption within a big data architecture
- GCP BQ modeling and performance tuning techniques
- RDBMS and NoSQL database experience
- Knowledge of orchestrating workloads on cloud
- Implement data warehouse and big/small data designs and data lake solutions with very good data quality capabilities
- Understanding and knowledge of deployment strategies and CI/CD.
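As a sketch of the "ingestion to consumption" step listed above (the bucket, project, and table names are placeholders, not from the posting), a simple batch load from Cloud Storage into BigQuery in Python could look like this:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical source files and destination table.
SOURCE_URI = "gs://example-raw-zone/events/2024-01-01/*.json"
TABLE_ID = "example-project.analytics.events"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # schema inference for the sketch; real pipelines pin schemas explicitly
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # wait for completion; raises on failure

table = client.get_table(TABLE_ID)
print(f"Loaded {table.num_rows} rows into {TABLE_ID}")
```

In a production setting this load would typically be one task in an orchestrated workflow (Airflow/Composer) with partitioned tables and explicit schemas rather than autodetect.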

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Cloud Engineering Team Leader at GlobalLogic, you will be responsible for providing technical guidance and career development support to a team of cloud engineers. You will define cloud architecture standards and best practices across the organization, collaborating with senior leadership to develop a cloud strategy aligned with business objectives. Your role will involve driving technical decision-making for complex cloud infrastructure projects, establishing and maintaining cloud governance frameworks, and operational procedures. With a background in technical leadership roles managing engineering teams, you will have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management, resource planning, and strong presentation and communication skills for executive-level reporting are essential. Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications. You will leverage your 10+ years of experience in designing and implementing enterprise-scale Cloud Solutions using GCP services to architect sophisticated cloud solutions using Python and advanced GCP services. Leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services will be part of your responsibilities. Ensuring optimal performance and scalability of complex integrations with multiple data sources and systems, implementing security best practices and compliance frameworks, and troubleshooting and resolving technical issues will be key aspects of your role. Your technical skills will include expert-level proficiency in Python with experience in additional languages, deep expertise with GCP services such as Dataflow, Compute Engine, BigQuery, Cloud Functions, and others, advanced knowledge of Docker, Kubernetes, and container orchestration patterns, extensive experience in cloud security, proficiency in Infrastructure as Code tools like Terraform, Cloud Deployment Manager, and CI/CD experience with advanced deployment pipelines and GitOps practices. As part of the GlobalLogic team, you will benefit from a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility in work arrangements, and being part of a high-trust organization. You will have the chance to work on impactful projects, engage with collaborative teammates and supportive leaders, and contribute to shaping cutting-edge solutions in the digital engineering domain.,

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

ahmedabad, gujarat

On-site

You are an experienced Senior MEAN Stack Developer with 2-4 years of hands-on experience in designing, developing, and maintaining scalable web applications. Your expertise lies in MongoDB, Express.js, Angular, and Node.js (MEAN stack), with strong problem-solving abilities and leadership skills.

Your responsibilities will include designing, developing, and deploying full-stack web applications using the MEAN stack. You will architect and optimize scalable, high-performance web applications, develop RESTful APIs and GraphQL services for seamless integration with frontend applications, and implement authentication and authorization mechanisms such as JWT, OAuth, and Role-Based Access Control. Additionally, you will optimize database queries and performance in MongoDB using Mongoose. In this role, you will mentor and guide junior developers, conduct code reviews and technical discussions, and integrate third-party APIs, cloud services, and DevOps solutions for automation and deployment. You will also implement CI/CD pipelines, ensure best practices for software development and deployment, troubleshoot complex issues, debug applications, and improve code quality while staying updated with emerging technologies and contributing to the continuous improvement of development.

To excel in this position, you should possess 3-5 years of experience in MEAN stack development, strong proficiency in Angular 15+ and frontend optimization techniques, and advanced knowledge of Node.js and Express.js, including asynchronous programming and event-driven architecture. Expertise in MongoDB, MySQL & PostgreSQL, building microservices-based architectures, Docker, Kubernetes, CI/CD pipelines, and proficiency in Git, GitHub, or GitLab for version control is essential. Experience with message queues, WebSockets, real-time data processing, caching strategies, unit testing, integration testing, TDD, analytical and debugging skills, and performance optimization, as well as excellent communication and leadership skills, are required.

Skills & Qualifications:
- Strong proficiency in Angular 15+ and frontend optimization techniques
- Advanced knowledge of Node.js and Express.js
- Expertise in MongoDB, MySQL & PostgreSQL
- Experience in building microservices-based architectures
- Proficiency in Docker, Kubernetes, CI/CD pipelines
- Proficiency in Git, GitHub, or GitLab
- Experience with message queues (Redis, RabbitMQ, Kafka)
- Understanding of WebSockets, real-time data processing, caching strategies
- Hands-on experience in unit testing, integration testing, TDD
- Strong analytical and debugging skills
- Experience in performance optimization
- Excellent communication and leadership skills

Additional Skills:
- Experience with GraphQL API development
- Familiarity with AWS, Azure, Google Cloud Platform
- Knowledge of serverless architecture, cloud functions
- Knowledge of Next.js, React.js
- Experience in Angular Universal (Server-Side Rendering, SSR)
- Knowledge of Nginx, PM2, load balancing strategies
- Exposure to AI/ML-based applications using Node.js
- Utilization of AI tools like ChatGPT
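The stack here is Node.js/Express, so production code would typically use jsonwebtoken middleware; purely as a language-neutral sketch of the JWT plus role-based access idea described above (the secret and role names are hypothetical), the same flow in Python with PyJWT looks like:

```python
import jwt  # PyJWT

SECRET = "replace-with-a-real-secret"  # hypothetical; load from an env var or secret manager


def issue_token(user_id: str, role: str) -> str:
    """Sign a short payload; the Express equivalent would call jsonwebtoken.sign()."""
    return jwt.encode({"sub": user_id, "role": role}, SECRET, algorithm="HS256")


def require_role(token: str, allowed_roles: set[str]) -> dict:
    """Verify the signature, then enforce role-based access before handling the request."""
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    if claims.get("role") not in allowed_roles:
        raise PermissionError("role not permitted for this endpoint")
    return claims


# Example: only admins may reach a hypothetical /reports endpoint.
token = issue_token("user-42", "admin")
print(require_role(token, {"admin"}))
```

The same gate sits in Express middleware in a MEAN application: verify the token once per request, then check the role claim before the route handler runs.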

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

NTT DATA is a global company striving to hire exceptional, innovative, and passionate individuals to grow with the organization. If you wish to be part of an inclusive, adaptable, and forward-thinking team, this opportunity is for you. Currently, we are looking for a GCP BigQuery Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN). As a Senior Application Developer in GCP, you must have ETL experience and strong skills in Google Cloud Platform BigQuery, SQL, and Linux. Experience with Cloud Run and Cloud Functions would be desirable. We are seeking a Senior ETL Development professional with strong hands-on experience in Linux and SQL; while optional, hands-on experience with, or at least a solid conceptual understanding of, GCP BigQuery is preferred. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are dedicated to assisting clients in innovating, optimizing, and transforming for long-term success. As a Global Top Employer, we have a diverse team of experts in over 50 countries and a robust partner ecosystem. Our services range from business and technology consulting to data and artificial intelligence solutions, industry-specific services, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is at the forefront of digital and AI infrastructure globally and is part of the NTT Group, which invests over $3.6 billion annually in R&D to facilitate a confident and sustainable transition into the digital future. Visit us at us.nttdata.com.

Posted 2 weeks ago

Apply