
181 Pubsub Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

We are searching for enthusiastic technologists who are eager to lead client engagements and take responsibility for delivering intricate technical projects.

Responsibilities:
- Utilizing CSS/SCSS with BEM class naming
- Front-end builds using Gulp and Webpack
- Version control with Git (SVN experience preferred)
- Deep expertise in JavaScript frameworks such as Angular and React
- Building multi-tier SaaS applications with exposure to microservices, caching, pub-sub, and messaging technologies
- Familiarity with MVC and MVVM design patterns
- Knowledge of ES6/ES7, UI component libraries (such as Material-UI or Bootstrap), and RESTful APIs
- Experience with popular Vue.js workflows (such as Vuex)

Qualifications:
- Minimum of 3 years of experience
- Strong written and verbal communication skills in English, along with strong analytical skills and work ethic
- Proficiency in HTML, CSS, and SCSS
- Skilled in hand-coding HTML5 and/or XHTML 1.0 markup
- Proficient in CSS3 and responsive web coding
- Experience developing cross-browser and cross-platform compatible solutions
- Proficiency in creating visually appealing front ends for web-based applications
- A Bachelor's degree in Computer Science (CS), Software Engineering (SE), or a related technical field is preferred.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a candidate for the role at Wind Pioneers, your main responsibility will be to ensure that our flagship software products are driven by cutting-edge science and methodologies. You will play a crucial role in implementing and refining analyses related to wind data analysis, wind resource assessments, site design, wake modeling, and other intermediary analyses. The enhancements you make will need to be seamlessly integrated into our codebase and thoroughly validated. This position is ideal for a detail-oriented scientist or engineer who is passionate about advancing the state of the art in wind farm design and development.

At Wind Pioneers, our vision is to lead the world in designing and evaluating new wind farm sites. We are dedicated to creating a set of tools, approaches, and processes that elevate the technical management of wind farm development to a significantly higher level of sophistication than standard industry practice. The company relies on its own software as a testing ground for innovative and advanced methodologies, giving you a unique opportunity to be at the forefront of wind farm design and development. Your role will involve driving improvements to our software from conceptualization to commercial deployment. This requires a deep understanding of scientific and engineering principles to implement new analytical approaches within our software stack and conduct comprehensive validation studies.

Key Responsibilities cover two main areas:

A. Creating Scientific Services:
- Using research findings to improve the accuracy and efficiency of wind resource assessment processes by incorporating new technologies, methodologies, and data sources.
- Producing detailed technical reports, documentation, and presentations to communicate research findings, tool developments, and project outcomes.
- Conducting research and development tasks, including validation studies.

B. Software Engineering:
- Assisting the development team in creating high-quality web applications for wind farm design.
- Data engineering with technologies such as Postgres, BigQuery, Pub/Sub, and Terraform to build event-driven systems and data lakes, particularly for geospatial data.
- Leveraging Python, and optionally Rust, to develop and maintain performance analysis tools for designing and optimizing multi-GW scale wind farms.

Candidate Requirements:
- Enthusiasm for wind resources and the role of renewable energy in addressing climate change.
- Bachelor's or master's degree in a scientific or engineering discipline from a reputable institution. PhD holders are also encouraged to apply.
- 3-5 years of relevant experience, demonstrating independent work and initiative.
- Wind industry experience is preferred but not mandatory.
- Proficiency with Git and Git Flow is beneficial.
- Basic knowledge of software development and Python is advantageous.
- Excellent written English skills.
- International experience is desirable.
- A self-directed and proactive work approach.
- Excitement for working in a dynamic, high-growth startup environment.
- A positive attitude and passion for wind energy.

Wind Pioneers Offering:
- Join a focused team with a clear vision dedicated to revolutionizing wind farm project discovery and evaluation.
- Utilize Wind Pioneers' advanced in-house tools to design top-tier wind farms.
- Contribute to the development of Wind Pioneers' flagship tool while benefiting from using it as an end user.
- Learn from and collaborate closely with our Product Architect and Senior Engineer.
- A friendly and relaxed office atmosphere and team culture.
- Flexible working conditions.
- Competitive salary with the opportunity for a six-monthly bonus through Wind Pioneers' revenue share scheme.

Posted 2 days ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Must Have:
- 6+ years of experience in software engineering (backend)
- 3+ years of experience using cloud platforms such as GCP, AWS, Azure, or similar technologies
- 2+ years of hands-on experience managing and optimizing Big Data ecosystems such as Spark, Hadoop/MapReduce, and Kafka
- Excellent communication and collaboration skills and experience working with a remote, global team
- A passion for building large-scale distributed systems and comfort writing maintainable, extensible, scalable, and high-performance code
- Proficiency in one or more compiled languages, preferably Java, Scala, or Go
- 1+ years of experience in observability, including distributed tracing, service level indicators (SLIs), service level objectives (SLOs), and service level agreements (SLAs)
- Bachelor's or Master's of Science in Computer Science, Information Systems, or a related degree
- Familiarity with OOD and OOA
- A track record of leading and delivering large-scale, cross-team, cross-functional projects
- Enjoyment of collaborating on an agile development team and empathy for teammates and customers
- Comfort evaluating and adapting to the latest tools and technologies

Must Have:
- Experience designing and implementing standard/common event systems
- Experience designing and deploying high-performance systems with reliable monitoring and logging practices
- Strong technical knowledge of cloud infrastructure, distributed systems, and reliability practices
- Experience with a real-time distributed database such as SingleStore (in-memory database)
- Experience with GCP products such as BigTable, BigQuery, Dataproc, and Pub/Sub, or other data tools like Spark, Kafka, Hive, or Apache Beam
- Experience with development tools such as Terraform, Kubernetes, Helm, or Gradle
- Experience designing and implementing interfaces and infrastructure for large-scale systems and RESTful APIs
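Several listings on this page center on "event systems" built around the publish/subscribe pattern (GCP Pub/Sub, Kafka). As a minimal, in-process sketch of that pattern only, assuming nothing about any real broker's API (the `EventBus` name and methods here are illustrative, not GCP Pub/Sub or Kafka calls):

```python
# Toy in-process publish/subscribe: topics fan out messages to every
# registered handler. Real brokers add durability, ordering, and retries.
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> int:
        # Deliver to each subscriber of the topic; return the fan-out count.
        handlers = self._subscribers.get(topic, [])
        for handler in handlers:
            handler(message)
        return len(handlers)

bus = EventBus()
received: list[str] = []
bus.subscribe("orders", received.append)
bus.subscribe("orders", lambda m: received.append(m.upper()))
delivered = bus.publish("orders", "order-42")
```

The decoupling shown here — publishers know topics, never subscribers — is the property these roles exercise at scale with managed brokers.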

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will play a crucial role as a Tech Lead in our team, where you will be responsible for driving the design, development, and deployment of our full-stack ERP SaaS application. Your primary focus will be leading a cross-functional team to ensure the delivery of scalable, secure, and high-performance solutions.

Your key responsibilities will include designing and developing a cloud-native, multi-tenant ERP SaaS platform. You will lead a team of engineers working on frontend, backend, and DevOps aspects, while also overseeing seamless Azure-based deployments with CI/CD pipelines. Additionally, you will be responsible for integrating the system with financial, procurement, and field service applications, optimizing system performance, security, and scalability, and guiding best practices in DevOps, microservices, and API development. Collaboration with stakeholders to align technical decisions with business goals will also be a crucial part of your role.

To excel in this position, you should have at least 5 years of experience in full-stack development, with a strong background in leadership. Expertise in Azure, DevOps, and microservices architecture is essential, along with proficiency in Java, React, TypeScript, Node.js/Python, PostgreSQL, and MongoDB. Experience with ERP systems, financial integrations, and construction SaaS solutions will be highly beneficial, along with a proven track record of delivering scalable, multi-tenant SaaS applications. Strong problem-solving and communication skills are also key requirements for this role.

Preferred qualifications include experience with Playwright for automated testing, familiarity with GraphQL, Docker, Pub/Sub, and event-driven architectures, and a background in AI/ML-driven analytics for construction technology.
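The multi-tenant isolation that this role calls for can be sketched as a data-access layer that scopes every read to one tenant. This is a hypothetical toy (the `TenantRepository` name and in-memory storage are illustrative); real platforms enforce the same rule with row-level security or per-tenant schemas:

```python
# Toy tenant-scoped repository: every row carries a tenant_id, and every
# read filters on it, so one tenant can never see another tenant's data.
class TenantRepository:
    def __init__(self) -> None:
        self._rows: list[dict] = []

    def insert(self, tenant_id: str, record: dict) -> None:
        self._rows.append({"tenant_id": tenant_id, **record})

    def find_all(self, tenant_id: str) -> list[dict]:
        # Scoping lives in the data layer, not in each caller.
        return [r for r in self._rows if r["tenant_id"] == tenant_id]

repo = TenantRepository()
repo.insert("acme", {"invoice": 1001})
repo.insert("globex", {"invoice": 2001})
acme_rows = repo.find_all("acme")
```

Centralizing the tenant filter in one place is the design point: forgetting the filter in ad hoc queries is the classic multi-tenant data leak.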

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem. Our solutions power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and many other platforms, connecting people with moments that matter.

Sabre is seeking a talented Senior Data Science Engineer for the SabreMosaic team. In this role, you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.

Role and Responsibilities:
- Develop, code, test, and debug new complex data-driven software solutions or enhancements to existing products.
- Design, plan, develop, and improve applications using advanced cloud-native technology.
- Work on issues requiring in-depth knowledge of organizational objectives and implement strategic policies in selecting methods and techniques.
- Encourage high coding standards, best practices, and high-quality output.
- Interact regularly with subordinate supervisors, architects, product managers, HR, and others on project or team performance matters.
- Provide technical mentorship and cultural/competency-based guidance to teams.
- Offer larger business/product context and mentor on specific tech stacks/technologies.

Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer.
- Expertise in Data Engineering/DW projects with Google Cloud-based solutions.
- Experience designing and developing enterprise data solutions on the GCP cloud platform.
- Experience with relational and NoSQL databases such as Oracle, Spanner, and BigQuery.
- Expert-level SQL skills for data manipulation and validation.
- Experience designing data models, data warehouses, data lakes, and analytics platforms on GCP.
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses.
- Strong experience designing star and snowflake schemas, with knowledge of dimensional data modeling.
- Collaboration with data scientists, data teams, and engineering teams on the Google Cloud platform for data analysis and data modeling.
- Familiarity with integrating datasets from multiple sources for data modeling for analytical and AI/ML models.
- Understanding of and experience with Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, and Docker.
- Expertise in Java Spring Boot, Python, or other programming languages used for data engineering and integration projects.
- Strong problem-solving and analytical skills.
- Exposure to AI/ML, MLOps, and Vertex AI is an advantage.
- Familiarity with DevOps practices such as CI/CD pipelines.
- Airline domain experience is a plus.
- Excellent spoken and written communication skills.
- GCP Professional Cloud Data Engineer certification is a plus.

We will carefully consider your application and review your details against the position criteria. Only candidates who meet the minimum criteria for the role will proceed in the selection process.

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

At TELUS Digital, you will play a crucial role in enabling customer experience innovation by fostering spirited teamwork, embracing agile thinking, and embodying a caring culture that prioritizes customers. As the global arm of TELUS Corporation, a leading telecommunications service provider in Canada, we specialize in delivering contact center and business process outsourcing solutions to major corporations across various sectors such as consumer electronics, finance, telecommunications, and utilities. With our extensive global call center capabilities, we offer secure infrastructure, competitive pricing, skilled resources, and exceptional customer service, all supported by TELUS, our multi-billion dollar parent company. In this role, you will leverage your expertise in Data Engineering, backed by a minimum of 4 years of industry experience, to drive the success of our projects. Proficiency in Google Cloud Platform (GCP) services including Dataflow, BigQuery, Cloud Storage, and Pub/Sub is essential for effectively managing data pipelines and ETL processes. Your strong command over the Python programming language will be instrumental in performing data processing tasks efficiently. You will be responsible for optimizing data pipeline architectures, enhancing performance, and ensuring reliability through your software engineering skills. Your ability to troubleshoot and resolve complex pipeline issues, automate repetitive tasks, and monitor data pipelines for efficiency and reliability will be critical in maintaining operational excellence. Additionally, your familiarity with SQL, relational databases, and version control systems like Git will be beneficial in streamlining data management processes. As part of the team, you will collaborate closely with stakeholders to analyze, test, and enhance the reliability of GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs. 
Your commitment to continuous improvement, SLA adherence, and post-incident reviews will drive the evolution of our data pipeline systems. Excellent communication, problem-solving, and analytical skills are essential for effectively documenting processes, providing insights, and ensuring seamless operations. This role offers a dynamic environment where you will have the opportunity to work in a 24x7 shift, contributing to the success of our global operations and making a meaningful impact on customer experience.,
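The pipeline-reliability duties described above (troubleshooting failures, automating retries, routing bad records) can be sketched as a retry loop with a dead-letter list. This is a pure-Python illustration under assumed names (`run_batch`, `flaky` are hypothetical), not the GCP Dataflow or Pub/Sub API:

```python
# Process a batch of messages: retry each up to max_attempts on a
# transient error, and move repeated failures to a dead-letter list
# instead of blocking the rest of the batch.
from typing import Callable

def run_batch(messages: list[str],
              handler: Callable[[str], None],
              max_attempts: int = 3) -> tuple[list[str], list[str]]:
    processed, dead_letter = [], []
    for msg in messages:
        for attempt in range(1, max_attempts + 1):
            try:
                handler(msg)
                processed.append(msg)
                break
            except ValueError:
                if attempt == max_attempts:
                    dead_letter.append(msg)
    return processed, dead_letter

attempts: dict[str, int] = {}
def flaky(msg: str) -> None:
    # Simulated handler: "bad" always fails; "slow" succeeds on retry 2.
    attempts[msg] = attempts.get(msg, 0) + 1
    if msg == "bad" or (msg == "slow" and attempts[msg] < 2):
        raise ValueError(msg)

ok, dlq = run_batch(["fast", "slow", "bad"], flaky)
```

Managed services formalize the same split: bounded retries for transient faults, and a dead-letter topic so poison messages surface for post-incident review rather than looping forever.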

Posted 3 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Open Position: Data Engineer (GCP) Technology

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.

Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements
- Develop the software and systems needed for end-to-end execution on large projects
- Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
- Build the knowledge base required to deliver increasingly complex technology projects

Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience

Desired Technical Skills:
- Data engineering and analytics on Google Cloud Platform: basic cloud computing concepts; BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI
- Python, Google Cloud Python SDK, SQL
- Experience working with any NoSQL/columnar/MPP database
- Experience working with any ETL tool (Informatica, DataStage, Talend, Pentaho, etc.)
- Strong knowledge of database concepts, data modeling in RDBMS vs. NoSQL, OLTP vs. OLAP, and MPP architecture

Other Desired Skills:
- Excellent communication and coordination skills
- Problem understanding, articulation, and solutioning
- Quick learner, adaptable to new technologies
- Ability to research and solve technical issues

Responsibilities:
- Developing data pipelines (batch/streaming)
- Developing complex data transformations
- ETL orchestration
- Data migration
- Developing and maintaining data warehouses / data lakes

Good to Have:
- Experience working with Apache Spark / Kafka
- Machine learning concepts
- Google Cloud Professional Data Engineer certification

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that match your interests!

Posted 3 days ago

Apply

4.0 - 9.0 years

13 - 23 Lacs

Mangaluru, Bengaluru

Work from Office

Position Overview: We are looking for a Full-Stack Developer with hands-on experience in React, Node.js, and cloud platforms like AWS or Azure. You'll drive the development of scalable, high-performance systems using modern architectures, collaborate on migration strategies, and build robust APIs. Strong knowledge of cloud services, containerization, and IoT technologies is essential.

Job Role: Full Stack (MERN) Developer
Job Type: Full Time
Experience: Minimum 5+ years
Job Location: Bangalore / Mangalore
Technical Skills: AWS Cloud, Azure Cloud, TypeScript, Node, React

About Us: We are a multi-award-winning creative engineering company. Since 2011, we have worked with our customers as a design and technology enablement partner, helping them on their digital transformation journey.

Roles and Responsibilities:
- Evaluate existing systems and propose enhancements to improve efficiency, security, and scalability.
- Create technical documentation and architectural guidelines for the development team.
- Develop software platforms using event-driven architecture.
- Develop high-performance, high-throughput systems.
- Define, track, and deliver items to schedule.
- Collaborate with cross-functional teams to define migration strategies, timelines, and milestones.

Technical Skills:
- Hands-on experience in React and Node
- Hands-on experience with at least one cloud provider, such as AWS, GCP, or Azure
- Proficiency with multiple databases, including SQL and NoSQL
- Highly skilled at facilitating and documenting requirements
- Experience developing REST APIs with JSON and XML for data transfer
- Ability to develop both internal-facing and external-facing APIs using JWT and OAuth 2.0
- Good understanding of cloud technologies such as Docker, Kubernetes, MQTT, EKS, Lambda, IoT Core, and Kafka
- Good understanding of messaging systems like SQS and Pub/Sub
- Ability to establish priorities and proceed with objectives without supervision
- Familiarity with HA/DR, scalability, performance, and code optimization
- Good organizational skills and the ability to work on more than one project at a time
- Exceptional attention to detail and good communication skills
- Experience with Amazon Web Services, JIRA, Confluence, Git, and Bitbucket

Other Skills:
- Experience working with Go and Python
- Good understanding of IoT systems
- Exposure to or knowledge of the energy industry

What we offer:
- A competitive salary and comprehensive benefits package
- The opportunity to work on international projects and cutting-edge technology
- A dynamic work environment that promotes professional growth, continuous learning, and mentorship

If you are passionate about working in a collaborative and challenging environment, apply now!
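The JWT-secured APIs this posting mentions can be illustrated with a stdlib-only HS256 sketch. This is a teaching toy under the assumption of HS256 and a shared secret; production services should use a maintained JWT library rather than hand-rolled signing:

```python
# Minimal HS256 JWT: base64url-encode header and payload, sign
# "header.payload" with HMAC-SHA256, and verify with a constant-time
# comparison before trusting the claims.
import base64
import hashlib
import hmac
import json

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes):
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: reject the token
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-1"}, b"secret")
claims = verify_jwt(token, b"secret")
```

Note the order of operations: the signature is checked before the payload is decoded and trusted, which is the core discipline behind token-based API authentication.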

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Java Developer, you will be responsible for analyzing, designing, programming, debugging, and modifying software enhancements and/or new products used in various computer programs. Your expertise in Java, Spring MVC, Spring Boot, database design, and query handling will be utilized to write code, complete programming, and perform testing and debugging of applications. You will work on local, networked, cloud-based, or Internet-related computer programs, ensuring the code meets the necessary standards for commercial or end-user applications such as materials management, financial management, HRIS, mobile apps, or desktop products.

Your role will involve working with RESTful web services/microservices for JSON creation, data parsing/processing in batch and stream mode, and messaging platforms such as Kafka, Pub/Sub, and ActiveMQ. Proficiency with operating systems, Linux, virtual machines, and open source tools/platforms is crucial for successful implementation. Additionally, you will be expected to understand data modeling and storage with NoSQL or relational DBs, and to have experience with Jenkins, containerized microservices deployment in cloud environments, and Big Data development (Spark, Hive, Impala, time-series DBs).

To excel in this role, you should have a solid understanding of building microservices/web services using Java frameworks, REST API standards and practices, and object-oriented analysis and design patterns. Experience with cloud technologies like Azure, AWS, and GCP will be advantageous. A candidate with telecom domain experience and familiarity with protocols such as TCP, UDP, SNMP, SSH, FTP, SFTP, CORBA, and SOAP will be preferred. Additionally, being enthusiastic about work, passionate about coding, a self-starter, and proactive will be key qualities for success in this position. Strong communication, analytical, and problem-solving skills are essential, along with the ability to write quality, testable, modular code.
Experience in Big Data platforms, participation in Agile Development methodologies, and working in a start-up environment will be beneficial. Team leading experience is an added advantage, and immediate joiners will be given special priority. If you possess the necessary skills and experience, have a keen interest in software development, and are ready to contribute to a dynamic team environment, we encourage you to apply for this role.,

Posted 5 days ago

Apply

5.0 - 10.0 years

0 Lacs

Maharashtra

On-site

You are a highly skilled and motivated Lead Data Scientist / Machine Learning Engineer joining a team pivotal to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. Your role will focus on data engineering, the ML model lifecycle, and cloud-native technologies.

You will be responsible for designing, building, and maintaining scalable ELT pipelines while ensuring high data quality, integrity, and governance. You will develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experimenting with different algorithms and leveraging various models will be crucial in driving insights and recommendations. You will also deploy and monitor ML models in production and implement CI/CD pipelines for seamless updates and retraining. You will work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translating complex model insights into actionable business recommendations and presenting findings to stakeholders will also be part of your responsibilities.

Qualifications & Skills:

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field.
- Google Cloud certifications (Professional Data Engineer, ML Engineer) are a plus.

Must-Have Skills:
- Experience: 5-10 years with the mentioned skill set and relevant hands-on experience.
- Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer).
- ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP.
- Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing.
- Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms.
- MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools).
- Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing.

Nice-to-Have Skills:
- Experience with graph ML, reinforcement learning, or causal inference modeling.
- Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards.
- Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies.
- Experience with distributed computing frameworks (Spark, Dask, Ray).

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
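The model-validation work described above can be sketched with a time-ordered split and a naive baseline. This is a minimal illustration with made-up numbers, assuming a univariate series; for time series the split must respect order (no shuffling), and a naive last-value forecast is the standard baseline a real model has to beat:

```python
# Time-ordered train/test split, naive last-value forecast, and mean
# absolute error (MAE) as the evaluation metric.
def train_test_split_ts(series: list[float], test_size: int):
    return series[:-test_size], series[-test_size:]

def naive_forecast(train: list[float], horizon: int) -> list[float]:
    # Predict the last observed value for every future step.
    return [train[-1]] * horizon

def mae(actual: list[float], predicted: list[float]) -> float:
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

series = [10.0, 12.0, 13.0, 15.0, 14.0, 16.0]
train, test = train_test_split_ts(series, test_size=2)
preds = naive_forecast(train, horizon=len(test))
error = mae(test, preds)
```

Reporting a candidate model's MAE alongside this baseline's is what turns "we trained a forecaster" into a validated claim of improvement.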

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the market research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. You will also be actively involved in project management and ensuring timely delivery of projects.

To excel in this role, you should have a minimum of 5 years of experience in software development, of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential.

Your technical skills should cover:
- Cloud & Infrastructure: Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN
- Development Stack: C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration
- Data & Integration: SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration
- CI/CD & IaC: Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing
- Security & Compliance: TLS/SSL certificate management, API gateway policies, encryption standards
- Monitoring & Performance: Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load testing tools

Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, and PMP or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience building low-code/no-code integration platforms or automation engines is also beneficial. Exposure to alternative clouds like AWS or on-prem virtualization platforms like VMware and OpenShift will be a plus.

Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.

Posted 5 days ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

About KPMG in India: KPMG entities in India are professional services firm(s) affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused, and technology-enabled services that reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

We are seeking an experienced and highly skilled Senior Google Cloud Analytics & Vertex AI Specialist for the position of Associate Director, with 12-15 years of experience and a specific focus on Google Vertex AI. The ideal candidate will have a deep understanding of Google Cloud Platform (GCP) and extensive hands-on experience with Google Cloud analytics services and Vertex AI. The role involves leading projects, designing scalable data solutions, driving the adoption of AI and machine learning practices within the organization, and supporting pre-sales activities. A minimum of 2 years of hands-on experience with Vertex AI is required.

Key Responsibilities:
- Architect and Implement: Design and implement end-to-end data analytics solutions using Google Cloud services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Vertex AI Development: Develop, train, and deploy machine learning models using Vertex AI. Utilize Vertex AI's integrated tools for model monitoring, versioning, and CI/CD pipelines. Implement custom machine learning pipelines using Vertex AI Pipelines. Utilize Vertex AI Feature Store for feature management and Vertex AI Model Registry for model tracking.
- Data Integration: Integrate data from various sources, ensuring data quality and consistency across different systems.
- Performance Optimization: Optimize data pipelines and analytics processes for maximum efficiency and performance.
- Leadership and Mentorship: Lead and mentor a team of data engineers and data scientists, providing guidance and support on best practices in GCP and AI/ML.
- Collaboration: Work closely with stakeholders to understand business requirements and translate them into technical solutions.
- Innovation: Stay updated with the latest trends and advancements in Google Cloud services and AI technologies, advocating for their adoption when beneficial.
- Pre-Sales Support: Collaborate cross-functionally to understand client requirements, design tailored solutions, prepare and deliver technical presentations and product demonstrations, and assist in proposal and RFP responses.
- Project Delivery: Manage and oversee the delivery of data analytics and AI/ML projects, ensuring completion on time and within budget while coordinating with cross-functional teams.

Qualifications:
- Experience: 12-15 years in data engineering, data analytics, and AI/ML with a focus on Google Cloud Platform.
- Technical Skills: Proficiency in Google Cloud services (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI); strong programming skills in Python and SQL; experience with machine learning frameworks (TensorFlow, PyTorch) and data visualization tools (Looker, Data Studio).
- Pre-Sales and Delivery Skills: Experience supporting pre-sales activities and managing and delivering complex data analytics and AI/ML projects.
- Certifications: Google Cloud Professional Data Engineer or Professional Machine Learning Engineer certification is a plus.
- Soft Skills: Excellent problem-solving, communication, and leadership skills.
- Education: B.E./B.Tech/Post Graduate

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with Vue.js preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP Cloud Certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum, XP.
- Relational database experience with SQL Server, PostgreSQL.
- Proficiency in Atlassian tools like JIRA and Confluence, and with GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.
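The Infrastructure as Code approach this role centers on boils down to declaring a desired state and letting a tool reconcile the actual state toward it. A minimal, tool-agnostic sketch of that reconciliation step in Python (the `reconcile` helper and resource names are illustrative, not Terraform's API; Terraform's `plan` performs this comparison against real provider state):

```python
def reconcile(desired: dict, actual: dict) -> dict:
    """Compute the plan an IaC tool would apply:
    create what is missing, update what has drifted, delete what is unmanaged."""
    plan = {"create": [], "update": [], "delete": []}
    for name, spec in desired.items():
        if name not in actual:
            plan["create"].append(name)
        elif actual[name] != spec:
            plan["update"].append(name)
    for name in actual:
        if name not in desired:
            plan["delete"].append(name)
    return plan

desired_state = {
    "vm-web": {"machine_type": "e2-medium", "zone": "us-central1-a"},
    "bucket-logs": {"location": "US"},
}
actual_state = {
    "vm-web": {"machine_type": "e2-small", "zone": "us-central1-a"},  # drifted
    "vm-legacy": {"machine_type": "n1-standard-1", "zone": "us-east1-b"},
}
plan = reconcile(desired_state, actual_state)
```

Because the declaration is the source of truth, running the same configuration twice produces an empty plan, which is the idempotency property that makes IaC safe to automate in CI/CD.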

Posted 6 days ago

Apply

5.0 - 13.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP. You will leverage various GCP services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to ensure high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential.

You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on Google Cloud Platform (GCP), and in-depth expertise in the GCP services mentioned above. A strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred, and familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus.

As a GCP Cloud Architect/Engineer you will also contribute to system reliability, backup, and disaster recovery strategies. This hybrid role is based out of Pune and requires a total of 10 to 13 years of relevant experience.

Posted 6 days ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Pune

Work from Office

We are looking for a skilled Java + GCP Developer with experience in shell scripting, Python, Java, Spring Boot, and BigQuery. The ideal candidate should have hands-on experience in Java, Spring Boot, and Google Cloud Platform (GCP).

Posted 6 days ago

Apply

5.0 - 7.0 years

5 - 14 Lacs

Pune, Gurugram, Bengaluru

Work from Office

• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
• Building data pipelines for large volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts

Posted 6 days ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Skills desired:
- Strong at SQL (multi-pyramid SQL joins across several tables)
- Python skills (FastAPI or Flask framework)
- PySpark
- Commitment to work in overlapping hours
- GCP knowledge (BigQuery, Dataproc, and Dataflow)
- Amex experience preferred (not mandatory)
- Power BI preferred (not mandatory)
Keywords: Flask, PySpark, Python, SQL
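Multi-table joins of the kind this role tests for can be sketched with SQLite from the Python standard library (the three-table schema below, customers/orders/items, is illustrative; BigQuery's join syntax is essentially the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE items (order_id INTEGER, product TEXT, price REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1), (11, 2);
    INSERT INTO items VALUES (10, 'keyboard', 30.0), (10, 'mouse', 10.0),
                             (11, 'monitor', 120.0);
""")

# Chain two joins (items -> orders -> customers), then aggregate:
# total spend per customer, highest spender first.
rows = conn.execute("""
    SELECT c.name, SUM(i.price) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    JOIN items  i ON i.order_id   = o.id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
```

The same join chain extends naturally to more levels; each additional `JOIN ... ON` clause walks one more foreign-key hop.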

Posted 6 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

We are seeking a talented and experienced Full-Stack Application Developer (React JS, Node JS, Express JS, and Java) to join our team. The ideal candidate will have at least 5+ years of experience in both front-end and back-end application development, utilizing technologies such as React JS, Node JS, Express JS, and Java services, as well as experience with Firebase or other NoSQL databases. This role involves designing, developing, and maintaining applications with a focus on performance, scalability, and reliability.

Must-have tech skills: NestJS, gRPC, Pub/Sub, Microservices

Key Responsibilities:
- Design and develop both front-end and back-end components of web applications using React JS, Node JS, Express JS, and Java services.
- Develop and manage REST APIs to support front-end functionality.
- Work with Firebase or other NoSQL databases for data storage and retrieval.
- Implement microservices architecture to improve application modularity and scalability.
- Develop and deploy cloud functions, preferably in GCP/Firebase, to support various application functionalities.
- Optimize application performance, threading, and scalability to ensure smooth user experiences.
- Troubleshoot and debug application issues to maintain high reliability and performance.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic.
- Follow best practices for code versioning, using tools such as Git and Gerrit.
- Participate in agile development processes, including sprint planning and task tracking using Jira and Scrum methodologies.
- Ensure continuous integration and continuous deployment (CI/CD) processes are in place and maintained.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- At least 5+ years of proven experience in full-stack application development using React JS, Node JS, Express JS, and Java services.
- Hands-on experience with Firebase or other NoSQL databases.
- Strong knowledge of microservices development and cloud function development, preferably in GCP/Firebase.
- Experience with performance tuning, threading, and scalability optimization.
- Excellent debugging skills and ability to troubleshoot complex issues.
- Proficient understanding of code versioning tools, such as Git.
- Familiarity with continuous integration and continuous deployment (CI/CD) practices.
- Experience with agile development methodologies, including Jira and Scrum practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

You are an experienced software engineer joining our growing software engineering product team at Ford Motor Company in Chennai. In this role, you will support finished vehicle logistics by developing and maintaining a global logistics data warehouse solution on the GCP platform, providing visibility into the shipment of finished vehicles from the plant to dealers. Your main responsibilities will include working with Java, full-stack Java development, Spring Boot, GCP, BigQuery, GCP Cloud Run, microservices, REST APIs, Pub/Sub, Kafka, AI, and Terraform. Your expertise in these areas will be crucial to the successful development and maintenance of the logistics data warehouse solution. To be successful in this role, you should have a minimum of 8 years of experience as a Java and Spring full-stack engineer. Experience with AI agents is preferred but not mandatory. If you are passionate about software engineering, have a strong background in Java and Spring technologies, and want to work on cutting-edge solutions in the automotive industry, we encourage you to apply for this position. Immediate joiners will be given preference.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

You would be working on:
- Developing and implementing Generative AI / AI solutions on Google Cloud Platform
- Working with cross-functional teams to design and deliver AI-powered products and services
- Developing, versioning, and executing Python code
- Deploying models as endpoints in a Dev environment
- Utilizing deep learning frameworks such as TensorFlow, PyTorch, or JAX
- Working on natural language processing (NLP) and machine learning (ML)
- Utilizing Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc.
- Providing Generative AI support in Vertex AI, with hands-on experience of Generative AI models such as Gemini and Vertex AI Search

Your profile should include:
- A solid understanding of Python
- Experience in Generative AI development with Google Cloud Platform
- Experience in delivering an AI solution on the Vertex AI platform
- Experience in developing and deploying AI solutions with ML

What you'll love about working here:
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
- You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with architects and business analysts, especially for our US clients, you will translate data requirements into effective technical solutions.

Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing various GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, willingness to work UK shift timings, and openness to giving and receiving feedback will contribute to your success in this role.
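The pipeline work described above follows the extract-transform-load shape, which can be sketched as composable generator stages in plain Python (the stage functions and record fields are illustrative; a production pipeline would run equivalent stages on Dataflow/Apache Beam or as dbt models):

```python
def extract(rows):
    """Source stage: yield raw records (stand-in for a BigQuery or GCS read)."""
    yield from rows

def transform(records):
    """Clean and enrich: drop incomplete rows, normalize casing, derive a flag."""
    for r in records:
        if r.get("city") is None:
            continue  # data-quality filter: skip records missing a city
        yield {
            **r,
            "city": r["city"].title(),
            "is_metro": r["city"].lower() in {"pune", "bengaluru"},
        }

def load(records, sink):
    """Sink stage: append to an in-memory list (stand-in for a warehouse write)."""
    for r in records:
        sink.append(r)
    return len(sink)

raw = [{"id": 1, "city": "pune"}, {"id": 2, "city": None}, {"id": 3, "city": "kochi"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

Because each stage is a generator, records stream through one at a time rather than being materialized in full, the same property that lets Dataflow pipelines scale to unbounded inputs.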

Posted 1 week ago

Apply

12.0 - 15.0 years

35 - 60 Lacs

Chennai, Bengaluru

Hybrid

Job Description:
Job Title: GCP Solution Architect
Location: Chennai | Bangalore
Experience: 12-15 years in IT

Key Responsibilities:
- Architect and lead GCP-native data and AI solutions tailored to AdTech use cases such as real-time bidding, campaign analytics, customer segmentation, and lookalike modeling.
- Design high-throughput data pipelines, audience data lakes, and analytics platforms leveraging GCP services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI, etc.
- Collaborate with ad operations, marketing teams, and digital product owners to understand business goals and translate them into scalable and performant solutions.
- Integrate with third-party AdTech and MarTech platforms, including DSPs, SSPs, CDPs, DMPs, ad exchanges, and identity resolution systems.
- Ensure architectural alignment with data privacy regulations (GDPR, CCPA) and support consent management and data anonymization strategies.
- Drive technical leadership across multi-disciplinary teams (Data Engineering, MLOps, Analytics) and enforce best practices in data governance, model deployment, and cloud optimization.
- Lead discovery workshops, solution assessments, and architecture reviews during pre-sales and delivery cycles.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- GCP services: BigQuery, Cloud Pub/Sub, Dataflow, Dataproc, Cloud Composer (Airflow), Vertex AI, AI Platform, AutoML, Cloud Functions, Cloud Run, Looker, Apigee, Dataplex, GKE.
- Deep understanding of programmatic advertising (RTB, OpenRTB), cookieless identity frameworks, and AdTech/MarTech data flows.
- Experience integrating or building components such as Data Management Platforms (DMPs), Customer Data Platforms (CDPs), Demand-Side Platforms (DSPs), ad servers, attribution engines, and real-time bidding pipelines.
- Event-driven and microservices architecture using APIs, streaming pipelines, and edge delivery networks.
- Integration with platforms like Google Marketing Platform, Google Ads Data Hub, Snowplow, Segment, or similar.
- Strong understanding of IAM, data encryption, PII anonymization, and regulatory compliance (GDPR, CCPA, HIPAA if applicable).
- Experience with CI/CD pipelines (Cloud Build), Infrastructure as Code (Terraform), and MLOps pipelines using Vertex AI or Kubeflow.
- Strong experience in Python and SQL; familiarity with Scala or Java is a plus.
- Experience with version control (Git), Agile delivery, and architectural documentation tools.

If you know someone suitable, feel free to forward their resume to aarthi.murali@zucisystems.com.

Regards,
Aarthi Murali

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Gurugram

Work from Office

Job Description:
- The candidate should have extensive production experience (2+ years) in GCP.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles & Responsibilities:
- 4-10 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions.
- Strong experience in Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Gurugram

Work from Office

As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.

In your role, you may be responsible for:
- Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
- Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
- Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
- Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
- Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- SQL authoring, query, and cost optimisation, primarily on BigQuery.
- Python as an object-oriented scripting language.
- Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming.
- Version control system: Git. Knowledge of Infrastructure as Code (Terraform) is preferable.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions.

Preferred technical and professional experience:
- Experience building and optimising data pipelines, architectures, and data sets.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.

We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.

Posted 1 week ago

Apply

6.0 - 10.0 years

20 - 30 Lacs

Bengaluru

Work from Office

We are looking for a skilled, hands-on QA professional with over 6 years of experience in testing and automating APIs, microservices, and web applications in Agile environments. Proficient in Java, Rest Assured, Karate, and TestNG, with a strong foundation in Linux/Unix, SQL/NoSQL, and Kafka/pub-sub technologies. Experienced in building robust test automation frameworks, writing detailed functional/integration test plans, and debugging complex issues across the tech stack, from UI to backend services and server logs.

Core Competencies:
- API and microservices testing & automation using Java, Rest Assured/Karate
- Test framework development with TestNG, Cucumber BDD, Selenium
- Strong database testing with SQL and NoSQL; hands-on with Kafka/pub-sub
- CI/CD integration using Git, Jenkins, Azure DevOps
- Writing and maintaining modular, scalable test code
- Experience with Agile/Scrum methodologies and QA best practices
- Solid exposure to Unix/Linux-based environments

Key Contributions:
- Collaborated with product managers and developers to review requirements and define detailed, edge-case-rich test cases.
- Executed test plans, published test reports, and ensured stakeholder alignment on QA metrics.
- Developed mock/stub components for isolated testing of microservices.
- Provided comprehensive bug reports with log/DB analysis to accelerate resolution.
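Isolating a microservice from its dependencies with mocks and stubs, as in the contributions above, looks roughly like this sketch using Python's stdlib unittest.mock (the `checkout` function and its pricing dependency are illustrative; the posting's Java stack would reach for Mockito or WireMock instead):

```python
from unittest.mock import Mock

def checkout(cart: list, pricing_client) -> dict:
    """Service under test: totals a cart using a remote pricing dependency."""
    total = sum(pricing_client.get_price(sku) for sku in cart)
    return {"items": len(cart), "total": total}

# Stub the remote dependency so the test never touches the network,
# and so failures point at checkout logic rather than the pricing service.
pricing_stub = Mock()
pricing_stub.get_price.side_effect = lambda sku: {"A1": 10.0, "B2": 5.5}[sku]

result = checkout(["A1", "B2", "A1"], pricing_stub)
```

The stub also records its calls, so a test can assert on the interaction (how many times and with what arguments the dependency was invoked), not just on the returned value.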

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
