
3 Cloud Build Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Role & responsibilities

Key Skills:
- 3 years of experience building modern applications using GCP services such as Cloud Build, Cloud Functions/Cloud Run, GKE, Logging, GCS, Cloud SQL, and IAM.
- Primary proficiency in Python and experience with a secondary language such as Golang or Java.
- In-depth knowledge of and hands-on experience with GKE/Kubernetes.
- A high emphasis on software engineering fundamentals such as code and configuration management, CI/CD/automation, and automated testing.
- Working with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable solutions.
- Working on and delivering solutions in a complex enterprise environment.
- Proficiency in designing and developing scalable, decoupled microservices, and in implementing event-driven architecture to ensure seamless, responsive service interactions.
- Proficiency in designing scalable and robust solutions leveraging cloud-native technologies and architectures.
- Expertise in managing diverse stakeholder expectations and prioritizing tasks to align with strategic objectives and deliver optimal outcomes.

Good-to-have knowledge, skills and experience (KSE):
- Ability to integrate Kafka to handle real-time data.
- Proficiency in monitoring tools.
- Experience using Robot Framework for automated UAT is highly desirable.
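For illustration only (not part of the posting): a minimal sketch of the event-driven, decoupled service pattern this role describes, written as a small Python service that could run on Cloud Run and receive Pub/Sub push messages. The payload fields ("order_id", "status") and handler logic are assumptions made for the example; Flask is assumed to be installed.

```python
# Minimal sketch: a Cloud Run-style HTTP service that handles Pub/Sub push messages.
# The payload fields ("order_id", "status") are hypothetical.
import base64
import json

from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_pubsub_push():
    envelope = request.get_json(silent=True)
    if not envelope or "message" not in envelope:
        # Pub/Sub push delivery always wraps the event in a "message" object.
        return "Bad Request: no Pub/Sub message received", 400

    message = envelope["message"]
    payload = json.loads(
        base64.b64decode(message.get("data", b"")).decode("utf-8") or "{}"
    )

    # Hypothetical business logic: acknowledge an order-status event.
    print(f"Processing order {payload.get('order_id')} with status {payload.get('status')}")

    # Returning 2xx acknowledges the message; a non-2xx response triggers redelivery.
    return "", 204

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Deploying a handler like this behind a Pub/Sub push subscription on Cloud Run is one common way to realize the seamless, responsive service interactions the posting emphasizes.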

Posted 1 day ago

Apply

2.0 - 3.0 years

4 - 7 Lacs

Hyderabad, Gachibowli

Work from Office


Job Summary
Synechron is seeking a highly motivated and skilled Senior Cloud Data Engineer (GCP) to join our cloud solutions team. In this role, you will collaborate closely with clients and internal stakeholders to design, implement, and manage scalable, secure, and high-performance cloud-based data solutions on Google Cloud Platform (GCP). You will leverage your technical expertise to ensure the integrity, security, and efficiency of cloud data architectures, enabling the organization to derive maximum value from cloud data assets. This role contributes directly to our mission of delivering innovative digital transformation solutions and supports the organization's strategic objectives of scalable and sustainable cloud infrastructure.

Software Requirements
Required Skills:
- Proficiency with Google Cloud Platform (GCP) services (Compute Engine, Cloud Storage, BigQuery, Cloud Pub/Sub, Dataflow, etc.)
- Basic scripting skills with Python, Bash, or similar languages
- Familiarity with virtualization and cloud networking concepts
- Understanding of cloud security best practices and compliance standards
- Experience with infrastructure-as-code tools (e.g., Terraform, Deployment Manager)
- Strong knowledge of data management, data pipelines, and ETL processes

Preferred Skills:
- Experience with other cloud platforms (AWS, Azure)
- Knowledge of SQL and NoSQL databases
- Familiarity with containerization (Docker, GKE)
- Experience with data visualization tools

Overall Responsibilities
- Design, implement, and operate cloud data solutions that are secure, scalable, and optimized for performance
- Collaborate with clients and internal teams to identify infrastructure and data architecture requirements
- Manage and monitor cloud infrastructure and ensure operational reliability
- Resolve technical issues related to cloud data workflows and storage solutions
- Participate in project planning, timelines, and technical documentation
- Contribute to best practices and continuous improvement initiatives within the organization
- Educate and support clients in adopting cloud data services and best practices

Technical Skills (By Category)
- Programming Languages: Essential: Python, Bash scripts. Preferred: SQL, Java, or other data processing languages
- Databases & Data Management: Essential: BigQuery, Cloud SQL, Cloud Spanner, Cloud Storage. Preferred: NoSQL databases such as Firestore, MongoDB
- Cloud Technologies: Essential: Google Cloud Platform core services (Compute, Storage, BigQuery, Dataflow, Pub/Sub). Preferred: Cloud monitoring, logging, and security tools
- Frameworks & Libraries: Essential: data pipeline frameworks, Cloud SDKs, APIs. Preferred: Apache Beam, Data Studio
- Development Tools & Methodologies: Essential: Infrastructure as Code (Terraform, Deployment Manager). Preferred: CI/CD tools (Jenkins, Cloud Build)
- Security Protocols: Essential: IAM policies, data encryption, network security best practices. Preferred: compliance frameworks such as GDPR, HIPAA

Experience Requirements
- 2-3 years of experience in cloud data engineering, cloud infrastructure, or related roles
- Hands-on experience with GCP is preferred; experience with AWS or Azure is a plus
- Background in designing and managing cloud data pipelines, storage, and security solutions
- Proven ability to deliver scalable data solutions in cloud environments
- Experience working with cross-functional teams on cloud deployments
- Alternative experience pathways: academic projects, certifications, or relevant internships demonstrating cloud data skills

Day-to-Day Activities
- Develop and deploy cloud data pipelines, databases, and analytics solutions
- Collaborate with clients and team members to plan and implement infrastructure architecture
- Perform routine monitoring, maintenance, and performance tuning of cloud data systems
- Troubleshoot technical issues affecting data workflows and resolve performance bottlenecks
- Document system configurations, processes, and best practices
- Engage in continuous learning on new cloud features and data management tools
- Participate in project meetings, code reviews, and knowledge-sharing sessions

Qualifications
- Bachelor's or Master's degree in computer science, engineering, information technology, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, Cloud Architect) are preferred
- Training in cloud security, data management, or infrastructure design is advantageous
- Commitment to professional development and staying updated with emerging cloud technologies

Professional Competencies
- Critical thinking and problem-solving skills to resolve complex cloud architecture challenges
- Ability to work collaboratively with multidisciplinary teams and clients
- Strong communication skills for technical documentation and stakeholder engagement
- Adaptability to evolving cloud technologies and project priorities
- Organized, with a focus on quality and detail-oriented delivery
- Proactive learner with a passion for innovation in cloud data solutions
- Ability to manage multiple tasks effectively and prioritize in a fast-paced environment
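For illustration only (not part of the posting): a minimal sketch of one routine task named above, batch-loading files from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, bucket, dataset, and table names are hypothetical, and credentials are assumed to come from Application Default Credentials in the environment.

```python
# Minimal sketch: batch-load CSV files from Cloud Storage into BigQuery.
# Assumes google-cloud-bigquery is installed and Application Default Credentials are set.
# Project, bucket, dataset, and table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.analytics.daily_sales"          # hypothetical destination table
source_uri = "gs://my-ingest-bucket/sales/2024-*.csv"  # hypothetical source files

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row in each file
    autodetect=True,       # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and block until it completes.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()

table = client.get_table(table_id)
print(f"Loaded table now has {table.num_rows} rows: {table_id}")
```

In practice a load like this would usually sit inside a Dataflow pipeline or an orchestrated workflow rather than run as a standalone script, but the building blocks are the same ones the posting lists.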

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Key Responsibilities:

1. ETL Pipeline Development:
- Design, develop, and maintain scalable ETL processes to extract, transform, and load data from various structured and unstructured sources into GCP-based data warehouses (BigQuery, Cloud SQL, Cloud Storage, etc.).
- Develop efficient SQL queries and scripts to support data transformation, aggregation, and validation.
- Optimize ETL workflows to ensure low-latency data processing and high performance.

2. Google Cloud Dataform & Data Transformation:
- Use Google Cloud Dataform to implement SQL-based data transformations in BigQuery, following best practices in data modeling, version control, and dependency management.
- Develop modular SQL workflows using Dataform to simplify transformation logic and enhance reusability.
- Integrate Dataform into existing ETL/ELT pipelines to streamline data engineering and analytics workflows.
- Leverage Dataform's automated testing, scheduling, and Git-based version control for collaborative development and data quality assurance.

3. Data Integration & Management:
- Work with diverse data sources (databases, APIs, streaming data, and cloud storage) to integrate data into centralized repositories.
- Ensure data consistency, integrity, and accuracy through rigorous testing and validation.
- Implement incremental data loads, change data capture (CDC), and batch/real-time ETL strategies.
- Leverage GCP services like Dataflow, Dataproc, Cloud Functions, and Pub/Sub to handle data ingestion and transformation.

4. Database & SQL Development:
- Write complex SQL queries, stored procedures, and functions to support analytical and operational data needs.
- Optimize SQL queries for performance tuning and cost efficiency in BigQuery, Cloud SQL, and other relational databases.
- Ensure proper indexing, partitioning, and clustering strategies for optimal query performance.

5. Cloud & DevOps Integration:
- Deploy and monitor ETL workflows using GCP-native tools (Cloud Composer/Airflow, Dataform, Dataflow, Dataprep, etc.).
- Implement CI/CD pipelines for ETL jobs using Terraform, Cloud Build, GitHub Actions, or Jenkins.
- Work with infrastructure and DevOps teams to ensure secure and reliable deployment of ETL solutions in a cloud environment.

6. Data Quality & Governance:
- Implement data validation, data cleansing, and error-handling mechanisms in ETL pipelines.
- Monitor data pipeline performance and ensure timely resolution of issues and failures.
- Work with stakeholders to define data governance policies, metadata management, and access controls.

7. Documentation & Collaboration:
- Maintain comprehensive documentation for ETL workflows, data transformations, and technical design.
- Collaborate with data engineers, data analysts, and business teams to understand data needs and optimize data processing workflows.
- Conduct code reviews and provide mentorship to junior developers when necessary.

Required Skills & Qualifications:

1. Technical Skills:
- ETL Development: Hands-on experience designing and implementing ETL pipelines; proficiency in ETL tools such as Apache Airflow (Cloud Composer), Dataflow, or Informatica.
- SQL & Database Management: Strong expertise in SQL (DDL, DML, performance tuning, indexing, partitioning, stored procedures, etc.); experience with relational (Cloud SQL, PostgreSQL, MySQL) and NoSQL databases (Bigtable, Firestore, MongoDB, etc.).
- Cloud (GCP) Expertise: Strong hands-on experience with Google Cloud Platform (GCP) services: BigQuery (data warehousing & analytics), Cloud Storage (data lake storage), Cloud Composer/Apache Airflow (workflow orchestration), Cloud Functions (serverless ETL tasks), Cloud Dataflow (Apache Beam-based data processing), Pub/Sub (real-time streaming), Dataproc (Hadoop/Spark-based processing), and Google Cloud Dataform (SQL-based transformations for BigQuery).
- Programming & Scripting: Experience with Python, SQL scripting, and shell scripting for ETL automation; knowledge of PySpark or Apache Beam is a plus.
- CI/CD & DevOps: Experience deploying ETL workflows using Terraform, Cloud Build, or Jenkins; familiarity with Git/GitHub for version control.
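For illustration only (not part of the posting): a minimal sketch of the orchestration pattern described above, a Cloud Composer/Airflow DAG that runs an incremental MERGE from a staging table into a partitioned warehouse table in BigQuery. The project, dataset, table, and column names are assumptions for the example, and the apache-airflow-providers-google package is assumed to be available.

```python
# Minimal sketch: an Airflow (Cloud Composer) DAG that runs a daily incremental
# MERGE into a date-partitioned BigQuery table. All project, dataset, table, and
# column names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical incremental-load statement: upsert yesterday's staged rows
# into the warehouse table.
MERGE_SQL = """
MERGE `my-project.warehouse.orders` AS target
USING (
  SELECT * FROM `my-project.staging.orders`
  WHERE ingest_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
) AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN UPDATE SET status = source.status
WHEN NOT MATCHED THEN INSERT ROW
"""

with DAG(
    dag_id="daily_orders_incremental_load",
    schedule="@daily",                # run once per day after the staging load lands
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["etl", "bigquery"],
) as dag:
    merge_orders = BigQueryInsertJobOperator(
        task_id="merge_orders_into_warehouse",
        configuration={"query": {"query": MERGE_SQL, "useLegacySql": False}},
    )
```

A Dataform SQLX workflow or a Dataflow pipeline could fill the same slot; the DAG above simply shows the orchestration-plus-SQL shape the posting describes.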

Posted 3 weeks ago

Apply