Data Engineer | 4+ | Bangalore | C2H-TCS | Walkin | 28th Mar

4 - 6 years

6.0 - 12.0 Lacs P.A.

Bengaluru

Posted: 2 months ago | Platform: Naukri


Skills Required

Data Engineering, GCP, PySpark, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

Must Have: GCP, PySpark

Roles and Responsibilities:
- Design, develop, and maintain large-scale data pipelines using Python and GCP.
- Collaborate with cross-functional teams to identify business requirements and design solutions.
- Develop scalable data architectures on Google Cloud Platform (GCP) using PySpark, BigQuery, Pub/Sub, etc.
- Ensure high-quality data processing by implementing robust testing for ETL processes, including unit tests and integration tests.
- Participate in code reviews to ensure adherence to coding standards and best practices.

Desired Candidate Profile:
- 4-6 years of experience in Data Engineering with expertise in the Python programming language.
- Strong understanding of big data technologies such as the Hadoop ecosystem (HDFS), Spark (PySpark), and Kafka (Pub/Sub).
- Experience working with cloud-based platforms such as GCP (BigQuery, Pub/Sub) is essential.
- Proficiency in writing efficient SQL queries for querying large datasets.
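For illustration only (not part of the original posting): a minimal sketch of the kind of PySpark-on-GCP pipeline these responsibilities describe, reading a table from BigQuery, applying a transformation, and writing the result back. The project, dataset, table, and bucket names are placeholder assumptions, and the BigQuery Spark connector is assumed to be available on the cluster (as it is on Dataproc images).

    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    def clean_orders(df: DataFrame) -> DataFrame:
        # Example transformation: drop rows with a null order ID and add a
        # load timestamp. Kept as a pure function so it can be unit tested.
        return (
            df.dropna(subset=["order_id"])
              .withColumn("load_ts", F.current_timestamp())
        )

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

        # Read the source table from BigQuery (placeholder table name).
        raw = (
            spark.read.format("bigquery")
            .option("table", "my-project.raw_dataset.orders")
            .load()
        )

        cleaned = clean_orders(raw)

        # Write the cleaned data back to BigQuery, staging through a GCS
        # bucket (placeholder bucket name).
        (
            cleaned.write.format("bigquery")
            .option("table", "my-project.curated_dataset.orders")
            .option("temporaryGcsBucket", "my-temp-bucket")
            .mode("overwrite")
            .save()
        )

        spark.stop()

Keeping the transformation as a standalone function is one way to support the unit and integration testing the role calls for, since it can be exercised against small in-memory DataFrames without touching BigQuery.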

