GCP Data Engineer / GCP DevOps Engineer - Contract to Hire Role

5 - 10 years

12 - 17 Lacs

Posted: 1 week ago | Platform: Naukri

Work Mode

Hybrid

Job Type

Full Time

Job Description

Role 1 - GCP Data Engineer. Mandatory skills: GCP, BigQuery, Dataflow, Cloud Composer
Role 2 - Big Data Engineer. Mandatory skills: Big Data, PySpark, Scala, Python
Role 3 - GCP DevOps Engineer. Mandatory skills: GCP DevOps

Experience range: 5+ years
Location: Pune and Hyderabad only. If you are applying from outside Pune or Hyderabad, you must relocate to one of the two.
Work mode: Hybrid; a minimum of 2 days' work from home is mandatory.
Salary: 12 - 16 LPA

Points to remember:
- Please fill in the Candidate Summary Sheet (template below).
- Notice periods longer than 30 days will not be considered.

Highlights of this role:
- It is a long-term role.
- High possibility of conversion within 6 months, or after 6 months, if you perform well.
- Interview process: two rounds in total (both virtual). One face-to-face meeting at any of these locations - Pune, Hyderabad, Bangalore, or Chennai - is also mandatory; without it, we cannot onboard you.
- Background check: UAN verification will be performed, and any overlap in past employment will disqualify you. Continuous PF deduction for the last 4 years is mandatory.

Client company: one of the leading technology consulting firms.
Payroll company: one of the leading IT services and staffing companies, with a presence in India, the UK, Europe, Australia, New Zealand, the US, Canada, Singapore, Indonesia, and the Middle East.

How to apply: do not change the subject line or create a new email when applying for this position; please reply on this email thread only.

Role 1 - GCP Data Engineer
Mandatory skills: GCP, BigQuery, Dataflow, Cloud Composer

About the Role:
We are seeking a highly skilled and passionate GCP Data Engineer to join our growing data team. In this role, you will be instrumental in designing, building, and maintaining scalable and robust data pipelines and solutions on Google Cloud Platform (GCP). You will work closely with data scientists, analysts, and other stakeholders to translate business requirements into efficient data architectures, enabling data-driven decision-making across the organization.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
- Minimum 5 years of experience as a Data Engineer, with a strong focus on Google Cloud Platform (GCP).
- Mandatory hands-on experience with core GCP data services (see the pipeline sketch after this list):
  - BigQuery (advanced SQL, data modeling, query optimization)
  - Dataflow (Apache Beam, Python/Java SDK)
  - Cloud Composer / Apache Airflow for workflow orchestration
  - Cloud Storage (GCS)
  - Cloud Pub/Sub for messaging/streaming
- Strong programming skills in Python (preferred) or Java/Scala for data manipulation and pipeline development.
- Proficiency in SQL and experience with relational and NoSQL databases.
- Experience with data warehousing concepts, ETL/ELT processes, and data modeling techniques.
- Understanding of distributed systems and big data technologies (e.g., Spark, Hadoop concepts, Kafka).
- Familiarity with CI/CD practices and tools.
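To give a concrete picture of the Dataflow work named above, here is a minimal sketch of an Apache Beam pipeline in Python that reads JSON events from Cloud Storage and appends valid rows to BigQuery. This is an illustration only, not part of the posting: the bucket, project, dataset, and table names are placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# DirectRunner runs locally; switch to "DataflowRunner" (plus project,
# region, and temp_location options) to execute on GCP Dataflow.
options = PipelineOptions(runner="DirectRunner")

with beam.Pipeline(options=options) as p:
    (
        p
        # Placeholder GCS path for raw newline-delimited JSON events.
        | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        # Drop records without a user_id before loading.
        | "KeepValid" >> beam.Filter(lambda row: row.get("user_id"))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",  # placeholder table
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )

In a Cloud Composer deployment, an Airflow DAG would typically trigger a pipeline like this on a schedule and route failures to alerting, which is where the workflow orchestration experience listed above comes in.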
Role 2 - Big Data Engineer
Mandatory skills: Big Data, PySpark, Scala, Python

About the Role:
We are looking for an experienced and passionate Big Data Engineer to join our dynamic team. In this role, you will be responsible for designing, building, and maintaining scalable, high-performance data processing systems and pipelines capable of handling vast volumes of structured and unstructured data. You will play a crucial role in enabling our data scientists, analysts, and business teams to derive actionable insights from complex datasets.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Minimum 5 years of proven experience as a Big Data Engineer or in a similar role.
- Extensive hands-on experience with Apache Spark (PySpark, Scala) for data processing (see the PySpark sketch after Role 3 below).
- Strong expertise in the Hadoop ecosystem (HDFS, Hive, MapReduce).
- Proficiency in Python and/or Scala/Java.
- Solid SQL skills and experience with relational databases.
- Experience designing and building complex ETL/ELT pipelines.
- Familiarity with data warehousing concepts and data modeling techniques (star schema, snowflake, data vault).
- Understanding of distributed computing principles.
- Excellent problem-solving, analytical, and communication skills.

Role 3 - GCP DevOps Engineer
Mandatory skills: GCP DevOps

About the Role:
We are seeking a highly motivated and experienced GCP DevOps Engineer to join our innovative engineering team. You will be responsible for designing, implementing, and maintaining robust, scalable, and secure cloud infrastructure and automation pipelines on Google Cloud Platform (GCP). This role involves working closely with development, operations, and QA teams to streamline the software delivery lifecycle, enhance system reliability, and promote a culture of continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 5 years of experience in a DevOps or SRE role, with significant hands-on experience on Google Cloud Platform (GCP).
- Strong expertise in core GCP services relevant to DevOps: Compute Engine, GKE, Cloud SQL, Cloud Storage, VPC, Cloud Load Balancing, IAM.
- Proficiency with Infrastructure as Code (IaC) tools, especially Terraform.
- Extensive experience designing and implementing CI/CD pipelines using tools like Cloud Build, Jenkins, or GitLab CI.
- Hands-on experience with containerization (Docker) and container orchestration (Kubernetes/GKE).
- Strong scripting skills in Python and Bash/Shell (see the scripting sketch below).
- Experience with monitoring and logging tools (Cloud Monitoring, Prometheus, Grafana, ELK stack).
- Solid understanding of networking concepts (TCP/IP, DNS, load balancers, VPNs) in a cloud environment.
- Familiarity with database concepts and experience managing cloud databases (e.g., Cloud SQL, Firestore).
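For Role 2, the PySpark sketch referenced above: a minimal ETL job that reads raw JSON events, filters out incomplete records, aggregates events per user per day, and writes partitioned Parquet. The paths and column names are hypothetical and only illustrate the kind of pipeline the role describes.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

# Placeholder HDFS path for raw newline-delimited JSON events.
events = spark.read.json("hdfs:///data/raw/events/")

# Clean and aggregate: count events per user per calendar day.
daily_counts = (
    events
    .filter(F.col("user_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .count()
)

# Partitioned Parquet keeps downstream Hive/Spark reads cheap.
(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///data/curated/daily_event_counts/")
)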
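For Role 3, a small sketch of the Python/shell scripting named in the qualifications: a script that flags Compute Engine instances that are not running. It assumes an installed and authenticated gcloud CLI, and the project ID is a placeholder.

#!/usr/bin/env python3
"""Flag Compute Engine instances that are not in the RUNNING state."""
import json
import subprocess

PROJECT = "example-project"  # placeholder project ID


def list_instances(project):
    # gcloud prints the full instance list as JSON with --format=json.
    result = subprocess.run(
        ["gcloud", "compute", "instances", "list",
         "--project", project, "--format=json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)


def main():
    for inst in list_instances(PROJECT):
        status = inst.get("status", "UNKNOWN")
        if status != "RUNNING":
            print(f"ALERT: {inst['name']} is {status}")


if __name__ == "__main__":
    main()

In practice a check like this would run on a schedule (e.g., Cloud Scheduler or cron) and feed an alerting channel rather than stdout.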
*** Mandatory: Candidate Summary Sheet ***

Interested candidates can share their resume at shant@harel-consulting.com along with the details below.

- Applying for which role (please mention the role name):
- Your name:
- Contact number:
- Email ID:
- Do you have a valid passport:
- Total experience:

Role 1:
- Experience in GCP:
- Experience in BigQuery:
- Experience in Dataflow:
- Experience in Cloud Composer:
- Experience in Apache Airflow:
- Experience in Python, Java, or Scala (which, and how much):

Role 2:
- Experience in Big Data:
- Experience in Hive:
- Experience in Python, Java, or Scala (which, and how much):
- Experience in PySpark:

Role 3:
- Experience in GCP DevOps:
- Experience in Python:

General:
- Current CTC:
- Expected CTC:
- Notice period in your current company:
- Are you currently working? If not, when did you leave your last company:
- Current location:
- Preferred location:
- This is a Contract to Hire (C2H) role; are you OK with that:
- Highest qualification:
- Current employer (payroll company name):
- Previous employer (payroll company name):
- 2nd previous employer (payroll company name):
- 3rd previous employer (payroll company name):
- Are you holding any offer:
- Are you expecting any offer:
- Is PF deduction happening at your current company:
- Did PF deduction happen at your 2nd-last employer:
- Did PF deduction happen at your 3rd-last employer:
- Latest photo:

Note: If you are working for a company with fewer than 2,000 employees, it is mandatory to share your UAN service history.

BR,
Shantpriya
Harel Consulting
shant@harel-consulting.com
9582718094
