
20 Cloud Pubsub Jobs


1.0 - 5.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Jitterbit, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle. Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks.

To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as well as hands-on experience with data ingestion, transformation, and loading tools like Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial. The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, as well as the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role.

At Synoptek, we value employees who embody our core DNA behaviors, including clarity, integrity, innovation, accountability, and a results-focused mindset. We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.
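The pipeline work this role describes leans on Cloud Pub/Sub's topic/subscription model: every subscription receives its own copy of each message, and a message stays pending until the subscriber acknowledges it. As a rough, dependency-free sketch of that pattern (the subscription names and event fields are invented for illustration, and this is not the real Google client API):

```python
from collections import defaultdict, deque

class ToyPubSub:
    """In-memory sketch of the topic/subscription model Cloud Pub/Sub uses:
    each subscription gets its own copy of every published message, and a
    message stays pending until the subscriber acknowledges it."""

    def __init__(self):
        self.subscriptions = defaultdict(deque)  # subscription name -> queue

    def subscribe(self, name):
        self.subscriptions[name]  # defaultdict access creates the queue

    def publish(self, message):
        # Fan out: every subscription receives its own copy.
        for queue in self.subscriptions.values():
            queue.append(message)

    def pull(self, name):
        queue = self.subscriptions[name]
        return queue[0] if queue else None  # peek: pending until acked

    def ack(self, name):
        self.subscriptions[name].popleft()  # acknowledged: remove from queue

bus = ToyPubSub()
bus.subscribe("bigquery-loader")
bus.subscribe("audit-log")
bus.publish({"event": "page_view", "user": 42})

msg = bus.pull("bigquery-loader")
bus.ack("bigquery-loader")  # loader is done; audit-log copy is still pending
```

The key design point the sketch shows is decoupling: the publisher never knows how many consumers exist, which is why new pipelines can subscribe to existing event streams without touching upstream code.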

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Punjab

On-site

As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also be designing and building ETL pipelines for data lake and data warehouse solutions on GCP.

In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, Cloud BigQuery, Cloud Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will use the cloud-native GCP CLI/gsutil for operations, and scripting languages like Python and SQL to enhance data processing efficiency.

Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will use GCP tools such as Cloud Data Catalog and GCP KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
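The ETL pipelines this role describes follow the classic extract-transform-load shape: pull raw records from a legacy source, normalize them, and append them to a warehouse table. A minimal, dependency-free sketch of that shape (the field names and the in-memory lists standing in for an Oracle export and a BigQuery table are invented for illustration, not actual GCP APIs):

```python
def extract(rows):
    # "Extract": yield raw records from a legacy source
    # (a list stands in for a SQL/Oracle export here).
    yield from rows

def transform(records):
    # "Transform": clean and normalize each record's types.
    for r in records:
        yield {"id": int(r["id"]), "amount": round(float(r["amount"]), 2)}

def load(records, warehouse):
    # "Load": append into the target table
    # (a list stands in for a warehouse table here).
    for r in records:
        warehouse.append(r)

warehouse = []
raw = [{"id": "1", "amount": "19.994"}, {"id": "2", "amount": "5"}]
load(transform(extract(raw)), warehouse)
```

Because each stage is a generator, records stream through one at a time rather than being materialized between stages, which is the same property that makes frameworks like Apache Beam and Dataflow scale to large inputs.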

Posted 6 days ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) 
You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting.

"Sometimes you gotta run before you can walk." - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data!)

Our Philosophy:

We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time.

"Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be." - Soothsayer

What Makes You Special:

We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives.

"Inner peace... Inner peace... Inner peace..." - Po (because we know data engineering can be challenging, but we've got your back!)

Key Responsibilities

Data Pipeline Development
- Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes
- Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)
- Implement data transformation and cleaning processes to ensure data quality and consistency
- Optimize data pipeline performance and reliability

Data Infrastructure Management
- Design and implement data warehouse architectures
- Manage and optimize database systems (SQL and NoSQL)
- Implement data lake solutions and data governance frameworks
- Ensure data security, privacy, and compliance with regulatory requirements

Data Modeling and Architecture
- Design and implement data models for analytics and reporting
- Create and maintain data dictionaries and documentation
- Develop data schemas and database structures
- Implement data versioning and lineage tracking

Data Quality, Security, and Compliance
- Ensure data quality, integrity, and consistency across all marketing data systems
- Implement and monitor data security measures to protect sensitive information
- Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA)
- Develop and enforce data governance policies and best practices

Collaboration and Support
- Work closely with Data Scientists, Analysts, and Business stakeholders
- Provide technical support for data-related issues and queries

Monitoring and Maintenance
- Implement monitoring and alerting systems for data pipelines
- Perform regular maintenance and optimization of data systems
- Troubleshoot and resolve data pipeline issues
- Conduct performance tuning and capacity planning

Required Qualifications

Experience
- 2+ years of experience in data engineering or related roles
- Proven experience with ETL/ELT pipeline development
- Experience with a cloud data platform (GCP)
- Experience with big data technologies

Technical Skills
- Programming Languages: Python, SQL, Golang (preferred)
- Databases: PostgreSQL, MySQL, Redis
- Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform
- Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.)
- Data Warehousing: Google BigQuery
- Data Visualization: Superset, Looker, Metabase, Tableau
- Version Control: Git, GitHub
- Containerization: Docker

Soft Skills
- Strong problem-solving and analytical thinking
- Excellent communication and collaboration skills
- Ability to work independently and in team environments
- Strong attention to detail and data quality
- Continuous learning mindset

Preferred Qualifications

Additional Experience
- Experience with real-time data processing and streaming
- Knowledge of machine learning pipelines and MLOps
- Experience with data governance and data catalog tools
- Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)
- Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design (we believe in running with machines, not against them)

Interview Process
1. Initial Screening: Phone/video call with HR
2. Technical Interview: Deep dive into data engineering concepts
3. Final Interview: Discussion with senior leadership

Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications.

Our Team Culture

"We are Groot." - We work together, we grow together, we succeed together.

We believe in:
- Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible
- Team Over Individual - Like the Avengers, we're stronger together than apart
- Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving
- Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!)
Growth Journey

"There is no charge for awesomeness... or attractiveness." - Po

Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior:
- Level 1: Master the basics of our data infrastructure
- Level 2: Build and optimize data pipelines
- Level 3: Lead complex data projects and mentor others
- Level 4: Become a data engineering legend (with your own theme music!)

What We Promise

"I am Iron Man." - We promise you'll feel like a superhero every day!
- Work that matters - Every pipeline you build helps real marketers succeed
- Growth opportunities - Learn new technologies and advance your career
- Supportive team - We've got your back, just like the Avengers
- Work-life balance - Because even superheroes need rest!
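The data-quality responsibilities this listing emphasizes boil down to gating records before they reach the warehouse: split incoming rows into valid records and rejects, and keep the rejection reason for monitoring. A toy sketch of such a gate (the event fields and required-field list are invented for illustration):

```python
def check_quality(records, required_fields):
    """Split records into valid rows and rejects - the basic shape of a
    data-quality gate in a marketing pipeline. Each reject is paired with
    the list of fields that failed, so alerts can say why."""
    valid, rejects = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        if missing:
            rejects.append((rec, missing))
        else:
            valid.append(rec)
    return valid, rejects

events = [
    {"campaign": "spring_sale", "clicks": 120},
    {"campaign": "", "clicks": 7},  # missing campaign name -> reject
]
good, bad = check_quality(events, ["campaign", "clicks"])
```

Keeping rejects (with reasons) instead of silently dropping them is what makes the monitoring-and-alerting side of the role possible: the reject stream itself becomes a metric.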

Posted 1 week ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI

Experience: 7+ years of experience in Data Visualization with Sigma BI, Power BI, Tableau, or Looker

Job Summary: Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively.

Key Responsibilities:
- Lead the design, development, and deployment of Sigma BI dashboards and reports tailored for various business functions.
- Translate complex data sets into clear, actionable insights using advanced visualization techniques.
- Collaborate with business stakeholders to understand goals, KPIs, and data requirements.
- Build data stories that communicate key business metrics, trends, and anomalies.
- Serve as a subject matter expert in Sigma BI and guide junior team members on best practices.
- Ensure visualizations follow design standards, accessibility guidelines, and performance optimization.
- Partner with data engineering and analytics teams to source and structure data effectively.
- Conduct workshops and training sessions to enable business users to consume and interact with dashboards.
- Drive the adoption of self-service BI tools and foster a data-driven decision-making culture.

Required Skills & Experience:
- 7+ years of hands-on experience in Business Intelligence, with at least 2 years using Sigma BI.
- Proven ability to build end-to-end dashboard solutions that tell a story and influence decisions.
- Strong understanding of data modeling, SQL, and cloud data platforms (Snowflake, BigQuery, etc.).
- Demonstrated experience working with business users, gathering requirements, and delivering user-friendly outputs.
- Proficiency in data storytelling, UX design principles, and visualization best practices.
- Experience integrating Sigma BI with modern data stacks and APIs is a plus.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Experience with other BI tools (Tableau, Power BI, Looker) is a plus.
- Familiarity with AWS cloud data ecosystems (AWS Databricks).
- Background in Data Analysis, Statistics, or Business Analytics.

Working Hours: 2 PM - 11 PM IST (~4:30 AM - 1:30 PM ET)

Communication skills: Good

Mandatory Competencies:
- BI and Reporting Tools - Power BI
- BI and Reporting Tools - Tableau
- Database - Database Programming - SQL
- Cloud - GCP - Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable
- Data Science and Machine Learning - Databricks
- Cloud - AWS - ECS
- DMS - Data Analysis Skills
- Beh - Communication and collaboration

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Pune

Work from Office

Educational Qualifications: BSc, BCA, Master of Engineering, MSc, MCA, MTech, BTech, Bachelor of Engineering

Service Line: Application Development and Maintenance

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Preferred Locations: Bangalore, Hyderabad, Chennai, Pune
- Experience Required: 3 to 5 years - pure hands-on expertise in the skill, able to deliver without any support
- Experience Required: 5 to 9 years - design knowledge, estimation techniques, leading and guiding the team on the technical solution
- Experience Required: 9 to 13 years - architecture, solutioning, (optional) proposals
- Containerization and microservice development on AWS/Azure/GCP is preferred
- In-depth knowledge of design issues and best practices
- Solid understanding of object-oriented programming
- Familiarity with various design and architectural patterns and the software development process
- Implementing automated testing platforms and unit tests
- Strong experience in building and developing applications using technologies like Java/J2EE, Spring Boot, Microservices
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP
- Good understanding of application development design patterns

Technical and Professional Skills:
- Primary Skill: Core Java, Spring Boot, Microservices, (optional) databases like SQL
- Secondary Skills: AWS/Azure/GCP

Preferred Skills: Technology-Java-Springboot

Generic Skills: Technology-Cloud Platform-Azure Development & Solution Architecting, Technology-Cloud Platform-Azure Devops-data on cloud-aws, Technology-Cloud Platform-GCP Devops

Posted 4 weeks ago

Apply

5.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Educational Qualifications: Bachelor of Engineering, BTech, Bachelor of Technology, BCA, BSc, MTech, MCA

Service Line: Application Development and Maintenance

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Preferred Locations: Bangalore, Hyderabad, Chennai, Pune
- Experience Required: 3 to 5 years - pure hands-on expertise in the skill, able to deliver without any support
- Experience Required: 5 to 9 years - design knowledge, estimation techniques, leading and guiding the team on the technical solution
- Experience Required: 9 to 13 years - architecture, solutioning, (optional) proposals
- Containerization and microservice development on AWS/Azure/GCP is preferred
- In-depth knowledge of design issues and best practices
- Solid understanding of object-oriented programming
- Familiarity with various design and architectural patterns and the software development process
- Implementing automated testing platforms and unit tests
- Strong experience in building and developing applications using technologies like Python
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP
- Good understanding of application development design patterns

Technical and Professional Skills:
- Primary Skill: Python
- Secondary Skills: AWS/Azure/GCP

Preferred Skills: Technology-Machine Learning-Python

Generic Skills: Technology-Cloud Platform-AWS App Development, Technology-Cloud Platform-Azure Development & Solution Architecting, Technology-Cloud Platform-GCP Devops

Posted 4 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Educational Qualifications: Bachelor of Engineering, Bachelor of Technology, MSc, MTech, MCA

Service Line: Application Development and Maintenance

Responsibilities:
- Application migration to the AWS cloud
- Gathering user requirements, envisioning system features and functionality
- Identifying bottlenecks and bugs, and recommending system solutions by comparing the advantages and disadvantages of custom development
- Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Understanding architecture and ensuring effective design, development, validation, and support activities
- Understanding and analyzing client requirements, refactoring systems for workload migration/modernization to the cloud (AWS)
- End-to-end feature development and resolving challenges faced in the implementation
- Creating detailed design artifacts, working on development, performing code reviews, and implementing validation and support activities
- Contributing to thought leadership within the area of technology specialization

Additional Responsibilities:
Skills:
- Containerization and microservice development on AWS is preferred
- In-depth knowledge of design issues and best practices
- Solid understanding of object-oriented programming
- Familiarity with various design and architectural patterns and the software development process
- Implementing automated testing platforms and unit tests
- Strong experience in building and developing applications using technologies like Java/J2EE, Springboot/Python
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS
- Good understanding of application development design patterns

Competencies:
- Good verbal and written communication skills
- Ability to communicate with remote teams effectively
- High flexibility to travel
- Ability to work both independently and in a multi-disciplinary team environment

Technical and Professional Skills: AWS - ELB/RDS/EC2/S3/IAM, Java/J2EE, Springboot/Python

Preferred Skills: Technology-Cloud Security-AWS - Infrastructure Security-AWS Systems Manager

Posted 4 weeks ago

Apply

5.0 - 7.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Educational Qualifications: Bachelor of Engineering, BTech, Bachelor of Technology, BCA, BSc, MTech, MCA

Service Line: Application Development and Maintenance

Responsibilities:
A day in the life of an Infoscion:
- Application migration to AWS/Azure/GCP cloud
- Gathering user requirements, envisioning system features and functionality
- Identifying bottlenecks and bugs, and recommending system solutions by comparing the advantages and disadvantages of custom development
- Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Understanding architecture and ensuring effective design, development, validation, and support activities
- Understanding and analyzing client requirements, refactoring systems for workload migration/modernization to the cloud (AWS, Azure, GCP)
- End-to-end feature development and resolving challenges faced in the implementation
- Creating detailed design artifacts, working on development, performing code reviews, and implementing validation and support activities
- Contributing to thought leadership within the area of technology specialization

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Preferred Locations: Bangalore, Hyderabad, Chennai, Pune
- Experience Required: 3 to 5 years - pure hands-on expertise in the skill, able to deliver without any support
- Experience Required: 5 to 9 years - design knowledge, estimation techniques, leading and guiding the team on the technical solution
- Experience Required: 9 to 13 years - architecture, solutioning, (optional) proposals
- Containerization and microservice development on AWS/Azure/GCP is preferred
- In-depth knowledge of design issues and best practices
- Solid understanding of object-oriented programming
- Familiarity with various design and architectural patterns and the software development process
- Implementing automated testing platforms and unit tests
- Strong experience in building and developing applications using technologies like DevOps and Terraform
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP
- Good understanding of application development design patterns

Technical and Professional Skills:
- Primary Skill: AWS/Azure/GCP + DevOps + Terraform

Preferred Skills: Technology-Cloud Platform-Azure App Development-Azure API Management, Technology-Cloud Platform-GCP App Development

Generic Skills: Technology-Cloud Platform-AWS App Development, Technology-Cloud Platform-Azure Devops, Technology-Cloud Platform-GCP Devops

Posted 4 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Chennai

Work from Office

Educational Qualifications: Bachelor of Engineering, MTech, MCA

Service Line: Application Development and Maintenance

Responsibilities:
- Application migration to the AWS cloud
- Gathering user requirements, envisioning system features and functionality
- Identifying bottlenecks and bugs, and recommending system solutions by comparing the advantages and disadvantages of custom development
- Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Understanding architecture and ensuring effective design, development, validation, and support activities
- Understanding and analyzing client requirements, refactoring systems for workload migration/modernization to the cloud (AWS)
- End-to-end feature development and resolving challenges faced in the implementation
- Creating detailed design artifacts, working on development, performing code reviews, and implementing validation and support activities
- Contributing to thought leadership within the area of technology specialization

Additional Responsibilities:
Skills:
- Containerization and microservice development on AWS is preferred
- In-depth knowledge of design issues and best practices
- Solid understanding of object-oriented programming
- Familiarity with various design and architectural patterns and the software development process
- Implementing automated testing platforms and unit tests
- Strong experience in building and developing applications using technologies like Java/J2EE, Springboot/Python
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS
- Good understanding of application development design patterns

Technical and Professional Skills: AWS - ELB/RDS/EC2/S3/IAM, Java/J2EE, Springboot/Python

Preferred Skills: Technology-Data On Cloud - Platform-AWS

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Noida

Hybrid

Data Engineer (SaaS-Based)

Immediate Joiners Preferred. Shift: 3 PM to 12 AM IST. Good to have: GCP Certified Data Engineer.

Overview of the Role:
As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough for the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.

Required Skills:
- 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
- Extensive experience in requirement discovery, analysis, and data pipeline solution design.
- Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
- Building modular code for reusable pipelines or complex ingestion frameworks that ease loading data into the data lake or data warehouse from multiple sources.
- Work closely with analysts and business process owners to translate business requirements into technical solutions.
- Coding experience in scripting languages (Python, SQL, PySpark).
- Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM).
- Exposure to Google Dataproc and Dataflow.
- Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
- Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Experience with SAS/SQL Server/SSIS is an added advantage.

Qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to other engineering teams and business audiences.
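The "modular code for a reusable ingestion framework" requirement above is essentially a registry of source-specific readers behind one entry point: adding a new source means registering one parser, not writing a new pipeline. A minimal, dependency-free sketch (the source formats, names, and the list standing in for a data lake are all hypothetical):

```python
# Registry of source-specific readers; the ingest() entry point stays fixed
# while new sources plug in via the @reader decorator.
READERS = {}

def reader(name):
    def register(fn):
        READERS[name] = fn
        return fn
    return register

@reader("csv")
def read_csv(payload):
    # Parse simple header-first CSV text into dict records.
    header, *rows = [line.split(",") for line in payload.strip().splitlines()]
    return [dict(zip(header, row)) for row in rows]

@reader("kv")
def read_kv(payload):
    # Parse "key=value;key=value" lines into dict records.
    return [dict(pair.split("=") for pair in line.split(";"))
            for line in payload.strip().splitlines()]

def ingest(source_type, payload, lake):
    # Single entry point: dispatch to whichever reader is registered.
    lake.extend(READERS[source_type](payload))

lake = []
ingest("csv", "id,name\n1,alpha\n2,beta", lake)
ingest("kv", "id=3;name=gamma", lake)
```

The same dispatch-by-registration idea scales up to real ingestion frameworks, where the registry keys might be connection configs and the readers might be Dataflow or Dataproc jobs.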

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Pune, Chennai, Bengaluru

Hybrid

Project Role: Cloud Platform Architect

Project Role Description: Oversee application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. This can include cloud adoption plans, cloud application design, and cloud management and monitoring.

Must-have skills: Google Cloud Platform Architecture

Summary: As a Cloud Platform Architect, you will be responsible for overseeing application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. Your typical day will involve designing cloud adoption plans, managing and monitoring cloud applications, and ensuring cloud application design meets business requirements.

Roles & Responsibilities:
- Design and implement cloud adoption plans, including public cloud, private cloud, and hybrid cloud environments.
- Oversee cloud application design, ensuring it meets business requirements and aligns with industry best practices.
- Manage and monitor cloud applications, ensuring they are secure, scalable, and highly available.
- Collaborate with cross-functional teams to ensure cloud applications are integrated with other systems and services.
- Stay up to date with the latest advancements in cloud technology, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-Have Skills: Strong experience in Google Cloud Platform Architecture.
- Good-to-Have Skills: Experience with other cloud platforms such as AWS or Azure.
- Experience in designing and implementing cloud adoption plans.
- Strong understanding of cloud application design and architecture.
- Experience in managing and monitoring cloud applications.
- Solid grasp of cloud security, scalability, and availability best practices.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Remote

Canterr is looking for talented and passionate professionals for exciting opportunities with a US-based MNC product company! You will be employed permanently by Canterr and deployed to a top-tier global tech client. Key Responsibilities: Design and develop data pipelines and ETL processes to ingest, process, and store large volumes of data. Implement and manage big data technologies such as Kafka, Dataflow, BigQuery, CloudSQL, and Pub/Sub. Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions. Monitor and troubleshoot data pipeline issues and implement solutions to prevent recurrence. Required Skills and Experience: Generally, Google Cloud Platform (GCP) is used for all software deployed at Wayfair. Data Storage and Processing: BigQuery, CloudSQL, PostgreSQL, DataProc, Pub/Sub. Data Modeling: breaking business requirements (KPIs) down into data points and building a scalable data model. ETL Tools: DBT, SQL. Data Orchestration and ETL: Dataflow, Cloud Composer. Infrastructure and Deployment: Kubernetes, Helm. Data Access and Management: Looker, Terraform. Ideal Business Domain Experience: Supply chain or warehousing experience. The project is focused on building a normalized data layer which ingests information from multiple Warehouse Management Systems (WMS) and projects it for back-office analysis.
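The normalized data layer described above can be sketched in a few lines: records from two source WMS feeds are mapped onto one shared schema before aggregation. This is a minimal illustration only; the field names and the two source formats are invented for the example.

```python
# Minimal sketch of normalizing records from two hypothetical WMS feeds
# into one shared schema, illustrating the "normalized data layer"
# described above. All field names are invented for the example.

def normalize(record: dict, source: str) -> dict:
    """Map a source-specific WMS record onto a shared schema."""
    if source == "wms_a":
        return {
            "sku": record["item_code"],
            "qty_on_hand": int(record["quantity"]),
            "warehouse": record["site"],
        }
    if source == "wms_b":
        return {
            "sku": record["sku"],
            "qty_on_hand": int(record["on_hand"]),
            "warehouse": record["facility_id"],
        }
    raise ValueError(f"unknown source: {source}")

rows = [
    normalize({"item_code": "A-1", "quantity": "12", "site": "BLR1"}, "wms_a"),
    normalize({"sku": "A-1", "on_hand": "3", "facility_id": "PNQ2"}, "wms_b"),
]
# Once normalized, back-office aggregation is trivial:
total = sum(r["qty_on_hand"] for r in rows)
```

In a real pipeline the same mapping step would run inside Dataflow or DBT rather than plain Python, but the design choice is the same: push source-specific quirks to the edge so downstream analysis sees one schema.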

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Job Summary: This position provides leadership in full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. He/She directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. He/She guides teams to ensure effective communication and achievement of objectives. This position researches and supports the integration of emerging technologies. He/She provides knowledge and support for applications development, integration, and maintenance. This position leads junior team members with project-related activities and tasks. He/She guides and influences department and project teams. This position facilitates collaboration with stakeholders. Responsibilities: GCP services (such as BigQuery, GKE, Spanner, Cloud Run, Dataflow, etc.), Angular, Java (REST API), SQL, Python, Terraform, Azure DevOps CI/CD pipelines. Leads systems analysis and design. Leads design and development of applications. Develops and ensures creation of application documents. Defines and produces integration builds. Monitors emerging technology trends. Leads maintenance and support. Primary Skills: Minimum 5 years Java-Spring Boot/J2EE (Full Stack Developer). Minimum 2 years on the GCP platform (Cloud Pub/Sub, GKE, BigQuery); experience in Bigtable and Spanner is a plus. Working in an Agile environment; CI/CD experience. Qualifications: Bachelor's Degree or international equivalent; Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field preferred.
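Several of the postings on this page ask for Cloud Pub/Sub experience. The pattern it implements, a topic fanning each message out to every attached subscription, can be sketched with a toy in-memory broker. This is a conceptual illustration only, not the google-cloud-pubsub client API; the topic name is invented.

```python
# Toy in-memory publish/subscribe broker illustrating the fan-out
# pattern behind Cloud Pub/Sub: each subscription to a topic receives
# its own copy of every published message.
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of subscriber queues

    def subscribe(self, topic: str) -> list:
        """Attach a new subscription; returns its message queue."""
        queue = []
        self._subs[topic].append(queue)
        return queue

    def publish(self, topic: str, message: str) -> None:
        for queue in self._subs[topic]:
            queue.append(message)  # independent copy per subscription

broker = Broker()
orders = broker.subscribe("orders")   # e.g. an order-processing consumer
audit = broker.subscribe("orders")    # e.g. an audit-logging consumer
broker.publish("orders", "order-42")
```

The decoupling shown here is the reason Pub/Sub appears in so many of these stacks: publishers need no knowledge of how many consumers exist or what they do with the messages.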

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Job Summary: This position provides leadership in full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. He/She directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. He/She guides teams to ensure effective communication and achievement of objectives. This position researches and supports the integration of emerging technologies. He/She provides knowledge and support for applications development, integration, and maintenance. This position leads junior team members with project-related activities and tasks. He/She guides and influences department and project teams. This position facilitates collaboration with stakeholders. Responsibilities: Leads systems analysis and design. Leads design and development of applications. Develops and ensures creation of application documents. Defines and produces integration builds. Monitors emerging technology trends. Leads maintenance and support. Primary Skills: Minimum 5 years Java-Spring Boot/J2EE (Full Stack Developer). Minimum 2 years in Angular and the GCP platform (Cloud Pub/Sub, GKE, BigQuery). Working in an Agile environment; CI/CD experience. Qualifications: Bachelor's Degree or international equivalent; Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field preferred.

Posted 1 month ago

Apply

8.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: 8+ years of experience in the IT industry. Strong in programming languages like Python. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Managing environments (e.g. venv, pip, poetry, etc.). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch or another framework. Experience with model monitoring. Advanced SQL knowledge. Awareness of streaming concepts like windowing, late arrival, triggers, etc. Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Schedule: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket+Jenkins / GitLab. Infrastructure as Code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided along progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who are we? We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
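The streaming concepts this posting lists (windowing, late arrival, triggers) can be illustrated with a small sketch: events are assigned to 10-second tumbling windows, and anything trailing the watermark by more than an allowed lateness is dropped. The timestamps, window size, and lateness value below are invented for the example; real pipelines would rely on Dataflow/Beam windowing rather than hand-rolled logic.

```python
# Sketch of tumbling windows with late-arrival handling: each event
# carries an (timestamp, value) pair; events older than
# watermark - ALLOWED_LATENESS are discarded as too late.

WINDOW = 10           # tumbling window size, in seconds
ALLOWED_LATENESS = 5  # how far an event may trail the watermark

def window_start(ts: int) -> int:
    """Start of the tumbling window that contains timestamp ts."""
    return ts - ts % WINDOW

def assign(events, watermark: int):
    windows, dropped = {}, []
    for ts, value in events:
        if ts < watermark - ALLOWED_LATENESS:
            dropped.append(value)  # beyond allowed lateness: discard
            continue
        windows.setdefault(window_start(ts), []).append(value)
    return windows, dropped

events = [(8, "a"), (12, "b"), (14, "c"), (2, "d")]
windows, dropped = assign(events, watermark=11)
# "a" is late (ts=8 < watermark) but within the allowed lateness, so it
# still lands in window [0, 10); "d" (ts=2) is too late and is dropped.
```

A trigger, in the same vocabulary, is simply the rule for when a window's accumulated values are emitted (e.g. when the watermark passes the window's end).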

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office

What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: Strong in programming languages like Python and Java. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Managing environments (e.g. venv, pip, poetry, etc.). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch or another framework. Experience with model monitoring. Advanced SQL knowledge. Awareness of streaming concepts like windowing, late arrival, triggers, etc. Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Schedule: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket+Jenkins / GitLab. Infrastructure as Code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided along progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who are we? We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Job Title: .Net Core with AWS/Azure/GCP _ Q1 Hiring. Responsibilities (a day in the life of an Infoscion): Application migration to AWS/Azure/GCP cloud. Gathering user requirements, envisioning system features and functionality. Identifying bottlenecks and bugs, and recommending system solutions by comparing the advantages and disadvantages of custom development. Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms. Understanding architecture and ensuring effective design, development, validation and support activities. Understanding and analyzing client requirements, refactoring systems for workload migration/modernization to cloud (AWS, Azure, GCP). End-to-end feature development and resolving challenges faced in the implementation. Creating detailed design artifacts, working on development, performing code reviews, and implementing validation and support activities. Contributing to thought leadership within the area of technology specialization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional: Primary Skill: .Net Core. Secondary Skills: AWS/Azure/GCP. Preferred Skills: Technology-Cloud Platform-Azure App Development, .Net-.Net Core. Generic Skills: Technology-Cloud Platform-AWS App Development, Technology-Cloud Platform-GCP App Development. Additional Responsibilities: Preferred Locations: Bangalore, Hyderabad, Chennai, Pune. Experience Required: 3 to 5 years of experience - purely hands-on expertise on the skill, able to deliver without any support; 5 - 9 years of experience - design knowledge, estimation techniques, leading and guiding the team on the technical solution; 9 - 13 years of experience - architecture, solutioning, and (optionally) proposals. Containerization and microservice development on AWS/Azure/GCP is preferred.
In-depth knowledge of design issues and best practices. Solid understanding of object-oriented programming. Familiarity with various design and architectural patterns and the software development process. Implementing automated testing platforms and unit tests. Strong experience in building and developing applications using technologies like .Net Core. Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs. Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP. Good understanding of application development design patterns. Educational: Master of Engineering, MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech. Service Line: Application Development and Maintenance. *Location of posting is subject to business requirements.

Posted 1 month ago

Apply

8.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

What you’ll be doing: Assist in developing machine learning models based on project requirements. Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality. Perform statistical analysis and fine-tuning using test results. Support training and retraining of ML systems as needed. Help build data pipelines for collecting and processing data efficiently. Follow coding and quality standards while developing AI/ML solutions. Contribute to frameworks that help operationalize AI models. What we seek in you: 8+ years of experience in the IT industry. Strong in programming languages like Python. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Managing environments (e.g. venv, pip, poetry, etc.). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch or another framework. Experience with model monitoring. Advanced SQL knowledge. Awareness of streaming concepts like windowing, late arrival, triggers, etc. Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases. Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices. Schedule: Cloud Composer, Airflow. Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink. CI/CD: Bitbucket+Jenkins / GitLab. Infrastructure as Code: Terraform. Life at Next: At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress.
Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth. Perks of working with us: Clear objectives to ensure alignment with our mission, fostering your meaningful contribution. Abundant opportunities for engagement with customers, product managers, and leadership. You'll be guided along progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions. Cultivate and leverage robust connections within diverse communities of interest. Choose your mentor to navigate your current endeavors and steer your future trajectory. Embrace continuous learning and upskilling opportunities through Nexversity. Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies. Embrace a hybrid work model promoting work-life balance. Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones. Embark on accelerated career paths to actualize your professional aspirations. Who are we? We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Cloud Operations Engineer to manage and optimize cloud-based environments. Ideal for engineers passionate about automation, monitoring, and cloud-native technologies. Key Responsibilities: Maintain cloud infrastructure (AWS, Azure, GCP). Automate deployments and system monitoring. Ensure availability, performance, and cost optimization. Troubleshoot incidents and resolve system issues. Required Skills & Qualifications: Hands-on experience with cloud platforms and DevOps tools. Proficiency in scripting (Python, Bash) and IaC (Terraform, CloudFormation). Familiarity with logging/monitoring tools (CloudWatch, Datadog, etc.). Bonus: Experience with Kubernetes or serverless architectures. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa, Delivery Manager, Integra Technologies
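The automation side of this role often comes down to small scripts like the sketch below: retrying a flaky health probe with exponential backoff before paging anyone. This is a minimal illustration; fake_check is a hypothetical stand-in for a real probe (an HTTP call, or a gcloud/aws CLI invocation).

```python
# Retry a health check with exponential backoff, a common pattern in
# cloud-operations scripting. The probe here is faked so the sketch is
# self-contained; real scripts would hit an endpoint or CLI instead.
import time

def retry_with_backoff(check, attempts: int = 4, base_delay: float = 0.01):
    """Run check() until it returns True; raise if attempts run out."""
    for attempt in range(attempts):
        if check():
            return attempt + 1                 # how many tries it took
        time.sleep(base_delay * 2 ** attempt)  # 0.01s, 0.02s, 0.04s, ...
    raise RuntimeError("service unhealthy after all retries")

calls = {"n": 0}
def fake_check() -> bool:
    calls["n"] += 1
    return calls["n"] >= 3  # simulated probe that succeeds on try three

tries = retry_with_backoff(fake_check)
```

Exponential backoff (rather than a fixed sleep) is the usual choice because it gives a briefly-degraded service room to recover without hammering it.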

Posted 2 months ago

Apply

5 - 10 years

9 - 19 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Google BigQuery. Location: Pan India. Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making. 1: DataProc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (no flex). 2: Proven track record of delivering data integration and data warehousing solutions. 3: Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in shell scripting and Python (no flex). 4: Experience with data integration and migration projects; Oracle SQL. Technical Experience: Google BigQuery. 1: Expert in Python (no flex); strong hands-on knowledge of SQL (no flex); Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage. 2: Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (no flex). 3: Proficiency with tools to automate Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence.
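The "strong SQL and hands-on proficiency in BigQuery SQL" asked for above boils down to writing queries like the aggregation below. SQLite stands in for BigQuery here so the sketch is runnable; the sales table and its columns are invented for the example, and BigQuery's dialect differs from SQLite's in places.

```python
# A GROUP BY aggregation of the kind BigQuery work centers on, run
# against an in-memory SQLite database as a stand-in. The table and
# data are fabricated for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 100), ("south", 250), ("west", 80)],
)
# Total sales per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
conn.close()
```

In BigQuery the query text would be near-identical; what changes is the client (`google-cloud-bigquery` instead of `sqlite3`) and the scale the engine is built for.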

Posted 2 months ago

Apply