3.0 - 7.0 years
0 Lacs
Madurai, Tamil Nadu
On-site
You are a highly skilled GCP Data Engineer/Lead/Architect specializing in real-time streaming data architectures. Your primary responsibility will be to design, develop, and optimize data pipelines within the Google Cloud Platform (GCP) ecosystem. You must bring a strong architectural vision and be willing to work hands-on, building scalable, low-latency streaming data pipelines with tools such as Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities:
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery (a minimal pipeline sketch follows this posting).
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets.
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring (formerly Stackdriver) and Error Reporting.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Requirements:
- 5+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery, Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience with Terraform or an equivalent infrastructure-as-code tool.
- Familiarity with CI/CD pipelines for data workflows.

Join us at TechMango as a GCP Data Engineer and be part of an innovative team that drives data excellence through cutting-edge technology. Benefits include a healthy work-life balance, a badminton ground, free accommodation, a cab facility for female employees, insurance, a gym, subsidized food, awards and recognition, and medical checkups. Apply now for this exciting career opportunity!
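For context on the Pub/Sub → Dataflow → BigQuery stack this posting describes, here is a minimal Apache Beam (Python SDK) streaming sketch. The project, subscription, and table names are hypothetical placeholders; a production pipeline would add windowing, error handling, and dead-letter routing.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True marks the pipeline as unbounded, which Pub/Sub reads require.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw message bytes from a (hypothetical) Pub/Sub subscription.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            # Decode and parse each message as a JSON event.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Stream rows into a (hypothetical) existing BigQuery table; the JSON
            # field names must match the table schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

Run locally with the default DirectRunner, or submit to Dataflow by passing `--runner=DataflowRunner` with the usual project, region, and temp-location options.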
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should have hands-on experience deploying and managing large-scale dataflow products such as Cribl, Logstash, or Apache NiFi, and be proficient at integrating data pipelines with cloud platforms (AWS, Azure, Google Cloud) as well as on-premises systems. Experience developing and validating field extraction using regular expressions is essential (a minimal sketch follows this posting). A strong understanding of operating systems and networking concepts is required, including Linux/Unix system administration, HTTP, and encryption. You should also know the version control, deployment, and build tools used in DevOps SDLC practices, such as Git, Jenkins, and Jira. Strong analytical and troubleshooting skills are crucial for this role, along with excellent verbal and written communication skills and an appreciation of Agile methodologies, specifically Kanban.

Desirable skills include enterprise experience with a distributed event streaming platform such as Apache Kafka, AWS Kinesis, Google Pub/Sub, or MQ; experience in infrastructure automation and integration, preferably using Python and Ansible; familiarity with cybersecurity concepts, event types, and monitoring requirements; and experience parsing and normalizing data in Elasticsearch using the Elastic Common Schema (ECS).
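As a small illustration of the regex-based field extraction this posting calls for, here is a minimal Python sketch. The pattern and sample line assume a generic access-log format and are not tied to Cribl, Logstash, or NiFi configuration syntax.

```python
import re

# Named groups become the extracted fields; \S+ skips unneeded columns.
ACCESS_LOG = re.compile(
    r'(?P<client_ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+)'
)

line = '203.0.113.7 - - [12/Mar/2024:10:15:32 +0000] "GET /health HTTP/1.1" 200 512'

match = ACCESS_LOG.match(line)
if match:
    fields = match.groupdict()
    print(fields["client_ip"], fields["status"])  # 203.0.113.7 200
else:
    # Validation step: lines that fail to match signal the pattern needs work
    # before it ships as part of a pipeline.
    print("line did not match; extend the pattern")
```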
Posted 2 weeks ago