Python/PySpark, BigQuery with GCP, Apache Iceberg

Experience: 5 years

Salary: 0 Lacs

Posted: 5 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

TCS Hiring!! Virtual Drive

TCS - Hyderabad

12 PM to 1 PM

Immediate Joiners

Role: Python/PySpark, BigQuery with GCP, Apache Iceberg

Exp: 5 to 7 years

NOTE: If your skills/profile match and you are interested, please reply to this email with your latest updated CV attached, along with the details below:

Name:
Contact Number:
Email ID:
Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc./etc.)
Current Organization Name:
Total IT Experience:
Location: Hyderabad
Current CTC:
Expected CTC:
Notice Period: Immediate
Whether worked with TCS (Y/N):


Must-Have


  • Strong proficiency in Python programming.
  • Hands-on experience with PySpark and Apache Spark.
  • Knowledge of Big Data technologies (Hadoop, Hive, Kafka, etc.).
  • Experience with SQL and relational/non-relational databases.
  • Familiarity with distributed computing and parallel processing.
  • Understanding of data engineering best practices.
  • Experience with REST APIs, JSON/XML, and data serialization.
  • Exposure to cloud computing environments.
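
As a rough, non-authoritative illustration of several of the skills listed above (Python, PySpark, Spark SQL, and JSON handling), here is a minimal sketch; the input path and column names (events.json, event_ts) are hypothetical and a local Spark session is assumed:

# Illustrative sketch only; file and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("skills-sketch").getOrCreate()

# Read semi-structured JSON data into a DataFrame.
events = spark.read.json("events.json")  # hypothetical input file

# The same daily-count aggregation expressed two ways: DataFrame API and SQL.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))  # hypothetical column
    .groupBy("event_date")
    .count()
)

events.createOrReplaceTempView("events")
daily_counts_sql = spark.sql(
    "SELECT to_date(event_ts) AS event_date, COUNT(*) AS cnt "
    "FROM events GROUP BY to_date(event_ts)"
)

daily_counts.show()
spark.stop()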

Good-to-Have

  • 5+ years of experience in Python and PySpark development.
  • Experience with data warehousing and data lakes.
  • Knowledge of machine learning libraries (e.g., MLlib) is a plus (see the brief sketch after this list).
  • Strong problem-solving and debugging skills.
  • Excellent communication and collaboration abilities.
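
For the machine-learning point above, a small hedged example of Spark's DataFrame-based MLlib API; the feature columns and training data are made up for illustration:

# Illustrative sketch only; data and column names are made up.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("mllib-sketch").getOrCreate()

# Made-up training data: two features and a label.
df = spark.createDataFrame(
    [(1.0, 2.0, 3.5), (2.0, 1.0, 4.0), (3.0, 0.5, 5.5)],
    ["f1", "f2", "label"],
)

# Assemble feature columns into the single vector column MLlib expects.
assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
model = LinearRegression(featuresCol="features", labelCol="label").fit(
    assembler.transform(df)
)
print(model.coefficients)
spark.stop()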


Responsibility of / Expectations from the Role

  • Develop and maintain scalable data pipelines using Python and PySpark (see the ETL sketch after this list).
  • Design and implement ETL (Extract, Transform, Load) processes.
  • Optimize and troubleshoot existing PySpark applications for performance.
  • Collaborate with cross-functional teams to understand data requirements.
  • Write clean, efficient, and well-documented code.
  • Conduct code reviews and participate in design discussions.
  • Ensure data integrity and quality across the data lifecycle.
  • Integrate with cloud platforms like AWS, Azure, or GCP.
  • Implement data storage solutions and manage large-scale datasets.
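
Purely as a sketch of the first two responsibilities above (a PySpark pipeline with an extract/transform/load shape, loading to BigQuery on GCP as the role's title suggests), assuming the spark-bigquery connector is available on the cluster; the bucket, table, and column names are hypothetical:

# Illustrative ETL sketch only; paths, tables, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV files from a (hypothetical) Cloud Storage bucket.
raw = (
    spark.read
    .option("header", "true")
    .csv("gs://example-bucket/raw/orders/")
)

# Transform: basic cleanup and typing; a real pipeline would add validation
# and data-quality checks here.
orders = (
    raw
    .filter(F.col("order_id").isNotNull())
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write to a BigQuery table via the spark-bigquery connector
# (connector and temporary bucket configuration assumed).
(
    orders.write
    .format("bigquery")
    .option("table", "analytics.orders_clean")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)

spark.stop()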

Tata Consultancy Services

Information Technology and Consulting

Thane
