PySpark - Hyderabad, Chennai, Kolkata

4 years

0 Lacs

Posted: 3 days ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

Greetings from TCS!

TCS Hiring for PySpark

Job Location: Chennai, Hyderabad, Kolkata

Experience Range: 4-8 Years

• 5–6 years of total experience in data engineering or big data development.

• 2–3 years hands-on experience with Apache Spark.

• Strong programming skills in PySpark, Python, and Scala.

• 2+ years of experience in Scala backend development.

• Proficient in Scala, including both object-oriented and functional programming concepts.

• Deep understanding and application of advanced functional programming concepts like category theory, monads, applicatives, and type classes.

• Hands-on experience with Scala Typelevel libraries such as Cats and Shapeless, used for building applications with strong typing and efficient concurrency.

• Solid understanding of data lakes, lakehouses, and Delta Lake concepts (a brief PySpark/Delta sketch follows this list).

• Experience in SQL development and performance tuning.

• Proficient in cloud services (e.g., AWS S3, Glue, Lambda, EMR, Redshift, CloudWatch, IAM).

• Familiarity with Airflow, dbt, or similar orchestration tools is a plus.

• Experience with CI/CD tools such as Jenkins, GitHub Actions, or AWS CodePipeline.

• Knowledge of data security, governance, and compliance frameworks.
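
As a rough illustration of the PySpark and Delta Lake skills listed above, here is a minimal sketch; all paths, table names, and columns are hypothetical, and a Delta-enabled Spark runtime (for example, Databricks) is assumed.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: read a bronze Delta table of orders, aggregate daily
# revenue, and write the result back as a partitioned silver Delta table.
spark = SparkSession.builder.appName("daily-revenue-sketch").getOrCreate()

orders = spark.read.format("delta").load("s3://example-bucket/bronze/orders")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

(
    daily_revenue.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-bucket/silver/daily_revenue")
)
```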

Responsibilities:

• Develop and maintain scalable data pipelines using Apache Spark on Databricks.

• Build end-to-end ETL/ELT pipelines on AWS/GCP/Azure using services like S3, Glue, Lambda, EMR, and Step Functions.

• Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.

• Design and implement data models, schemas, and lakehouse architecture in Databricks/Snowflake.

• Optimize and tune Spark jobs for performance and cost-efficiency (see the tuning sketch after this list).

• Integrate data from multiple structured and unstructured data sources.

• Monitor and manage data workflows, ensuring data quality, consistency, and security.

• Follow best practices for CI/CD, code versioning (Git), and DevOps in data applications.

• Write clean, reusable, well-documented code using Python / PySpark / Scala.
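
As one hedged sketch of the "optimize and tune Spark jobs" responsibility above, the example below broadcasts a small dimension table to avoid shuffling the large side of a join and repartitions by the write key before a partitioned write; all dataset names and paths are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl-tuning-sketch").getOrCreate()

# Hypothetical inputs: a large fact table of events and a small reference table.
events = spark.read.parquet("s3://example-bucket/raw/events")
countries = spark.read.parquet("s3://example-bucket/ref/countries")

enriched = (
    events
    # Broadcasting the small dimension table avoids a shuffle on the large side.
    .join(broadcast(countries), on="country_code", how="left")
    .withColumn("event_date", F.to_date("event_ts"))
)

(
    enriched
    # Repartition by the write key so output files per partition stay evenly sized.
    .repartition("event_date")
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/events_enriched")
)
```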

Tata Consultancy Services

Information Technology and Consulting

Thane
