Data Engineer (GCP & Spark Expertise)

6-8 years

10-15 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Job Summary

Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth.

Software Requirements

Required:

  • Proficiency with data engineering tools and frameworks such as Hive, Apache Spark, and Python (version 3.x)
  • Extensive experience working with Google Cloud Platform (GCP) offerings including Dataflow, BigQuery, Cloud Storage, and Pub/Sub
  • Familiarity with Git, Jira, and Confluence for version control and collaboration

Preferred:

  • Experience with additional GCP services such as Dataproc, Data Studio, or Cloud Composer
  • Exposure to other programming languages such as Java or Scala
  • Knowledge of data security best practices and tools

Overall Responsibilities

  • Design, develop, and optimize scalable data pipelines on GCP to support analytics and reporting needs
  • Collaborate with cross-functional teams to translate business requirements into technical solutions
  • Build and maintain data models, ensuring data quality, integrity, and security
  • Participate actively in code reviews, adhering to best practices and standards
  • Develop automated and efficient data workflows to improve system performance
  • Stay updated with emerging data engineering trends and continuously improve technical skills
  • Provide technical guidance and support to team members, fostering a collaborative environment
  • Ensure timely delivery of deliverables aligned with project milestones

Technical Skills (By Category)

Programming Languages:

  • Essential: Python
  • Preferred: Java, Scala

Data Management & Databases:

  • Experience with Hive, BigQuery, and relational databases
  • Knowledge of data warehousing concepts and SQL proficiency

Cloud Technologies:

  • Extensive hands-on experience with GCP services including Dataflow, BigQuery, Cloud Storage, Pub/Sub, and Composer
  • Ability to build and optimize data pipelines leveraging GCP offerings

Frameworks & Libraries:

  • Apache Spark (PySpark preferred); Hadoop ecosystem experience is advantageous

Development Tools & Methodologies:

  • Agile/Scrum methodologies, version control with Git, project tracking via Jira, documentation in Confluence

Security Protocols:

  • Understanding of data security, privacy, and compliance standards

Experience Requirements

  • 6-8 years of experience in data or software engineering roles, with a focus on data pipeline development
  • Proven experience in designing and implementing data solutions on cloud platforms, particularly GCP
  • Prior experience working in agile teams, participating in code reviews, and delivering end-to-end data projects
  • Experience working with cross-disciplinary teams and understanding varied stakeholder requirements
  • Exposure to industry best practices for data security, governance, and quality assurance is desired

Day-to-Day Activities

  • Attend daily stand-up meetings and contribute to project planning sessions
  • Collaborate with business analysts, data scientists, and other stakeholders to understand data needs
  • Develop, test, and deploy scalable data pipelines, ensuring efficiency and reliability
  • Perform regular code reviews, provide constructive feedback, and uphold coding standards
  • Document technical solutions and maintain clear records of data workflows
  • Troubleshoot and resolve technical issues in data processing environments
  • Participate in continuous learning initiatives to stay abreast of technological developments
  • Support team members by sharing knowledge and resolving technical challenges

Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
  • Relevant professional certifications in GCP (such as Google Cloud Professional Data Engineer) are preferred but not mandatory
  • Demonstrable experience in data engineering and cloud technologies

Professional Competencies

  • Strong analytical and problem-solving skills, with a focus on outcome-driven solutions
  • Excellent communication and interpersonal skills to effectively collaborate within teams and with stakeholders
  • Ability to work independently with minimal supervision and manage multiple priorities effectively
  • Adaptability to evolving technologies and project requirements
  • Demonstrated initiative in driving tasks forward and continuous improvement mindset
  • Strong organizational skills with a focus on quality and attention to detail

Synechron

Information Technology and Services

New York