Posted: 3 hours ago
Hybrid | Full Time
Key Responsibilities
- Design, build, and maintain end-to-end GCP data pipelines (batch and streaming).
- Ensure data platform uptime and performance in line with defined SLAs/SLOs.
- Develop and optimize ETL/ELT workflows using Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
- Manage and enhance data lakes and warehouses using BigQuery and Cloud Storage.
- Implement streaming data solutions using Pub/Sub, Dataflow, or Kafka.
- Build data APIs and microservices for data consumption using Cloud Run, Cloud Functions, or App Engine.
- Define and enforce data quality, governance, and lineage using Data Catalog and Cloud Data Quality tools.
- Collaborate with DevOps to build CI/CD pipelines, infrastructure as code, and automated monitoring for data workflows.
- Participate in incident management, root cause analysis (RCA), and change control processes following ITIL best practices.
- Proactively identify optimization opportunities to improve cost efficiency, latency, and reliability.
- Mentor junior engineers and ensure adherence to engineering best practices.
Technical Skills & Experience
- Programming: Python (advanced), SQL (expert), shell scripting.
- GCP Data Stack:
  - Data Storage: BigQuery, Cloud Storage
  - Data Processing: Dataflow (Apache Beam), Dataproc (Spark), Cloud Composer (Airflow)
  - Streaming: Pub/Sub, Kafka (optional)
  - Orchestration: Cloud Composer / Airflow
  - APIs & Services: Cloud Run, Cloud Functions, API Gateway
  - Monitoring: Cloud Logging and Cloud Monitoring (formerly Stackdriver), or Prometheus/Grafana
- DevOps: Terraform, Git, Docker, Kubernetes, Cloud Build / Jenkins.
- Data Visualization: Looker Studio, Power BI, or Tableau.
- Version Control & CI/CD: GitHub, GitLab, or Bitbucket pipelines.
Key Competencies
- Proven experience in managed service delivery under strict SLAs (availability, latency, and resolution timelines).
- Strong understanding of GCP networking, IAM, and cost management.
- Strong knowledge of incident, change, and problem management using ITSM tools (e.g., ServiceNow, Jira).
- Ability to lead on-call operations in a 16/5 support model, coordinating with global teams across time zones.
- Understanding of data security, governance, and compliance.
- Excellent communication and client-handling skills.
Good to Have
- Certifications:
  - Google Professional Data Engineer
  - Google Professional Cloud Architect
- Experience with data observability tools (Monte Carlo, Databand, or similar).
- Exposure to MLOps pipelines on GCP (Vertex AI, AI Platform).
Niveus Solutions