Posted: 1 day ago | On-site | Contractual
Our client is a French multinational information technology (IT) services and consulting company headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations.
The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organisations accelerate their transition to a digital and sustainable world.
It provides a variety of services, including consulting, technology, professional, and outsourcing services.
Location(s): Pune, Chennai
Mode of Work: Hybrid
Notice Period: Immediate joiners
Experience: 6-8 years
Type of Hire: Contract to Hire
Grade(s): C1/C2
Must-have skills: PySpark and Python
Good to have: Familiarity with orchestration tools like Apache Airflow
"Design, develop, and maintain scalable and efficient data processing pipelines using PySpark and Python.
Build and implement ETL (Extract, Transform, Load) processes to ingest data from various sources and load it into target destinations.
Optimize PySpark applications for performance and troubleshoot existing code.
Ensure data integrity and quality throughout the data lifecycle.
Collaborate with cross-functional teams, including data engineers and data scientists, to understand and fulfill data needs.
Provide technical leadership, conduct code reviews, and mentor junior team members.
Translate business requirements into technical solutions and contribute to architectural discussions.
Stay current with the latest industry trends in big data and distributed computing.
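The responsibilities above centre on the extract-transform-load (ETL) pattern. A toy sketch of that pattern in plain Python follows; a production pipeline of the kind this role describes would use PySpark DataFrames and real sources and sinks, and the stage names and sample data here are illustrative assumptions, not the client's actual pipeline.

```python
# Toy sketch of the extract-transform-load (ETL) pattern in plain Python.
# Stage names and sample data are hypothetical; a real pipeline for this
# role would use PySpark DataFrames against actual sources and sinks.

def extract(raw_rows):
    """Extract: parse raw CSV-like lines into records."""
    for line in raw_rows:
        user_id, amount = line.split(",")
        yield {"user_id": user_id.strip(), "amount": float(amount)}

def transform(records):
    """Transform: keep positive amounts and aggregate per user."""
    totals = {}
    for rec in records:
        if rec["amount"] > 0:
            totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + rec["amount"]
    return totals

def load(totals, sink):
    """Load: write aggregated rows to the target destination (a list here)."""
    for user_id, total in sorted(totals.items()):
        sink.append((user_id, total))
    return sink

raw = ["u1, 10.0", "u2, 5.5", "u1, 2.5", "u2, -1.0"]
sink = load(transform(extract(raw)), [])
print(sink)  # [('u1', 12.5), ('u2', 5.5)]
```

The same three stages map one-to-one onto a PySpark job: `extract` becomes a `spark.read`, `transform` a chain of DataFrame operations, and `load` a `df.write` to the target destination.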
Mandatory skills and experience
PySpark and Python: Advanced proficiency and extensive experience in building data processing applications.
Distributed Computing: In-depth understanding of principles, including performance tuning.
Big Data Ecosystem: Experience with technologies such as Hadoop, Hive, Sqoop, and Spark.
Cloud Services: Hands-on experience with cloud platforms, particularly AWS services such as Glue, Lambda, and Kinesis.
Database and SQL: Strong knowledge of SQL and experience with relational databases and data warehousing.
Software Development: Experience with software development best practices, including version control (Git), unit testing, and code reviews.
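The "Database and SQL" requirement implies comfort with aggregation and warehousing-style queries. A minimal, self-contained illustration using the stdlib sqlite3 module follows; the orders table and its data are invented for the example.

```python
# Hypothetical illustration of the SQL skills the role calls for, using
# the stdlib sqlite3 module. The orders table and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
)

# Aggregate revenue per customer, highest first -- a typical
# warehousing-style GROUP BY query.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS revenue "
    "FROM orders GROUP BY customer ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('acme', 150.0), ('globex', 75.0)]
```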
Desired or "nice-to-have" skills
Familiarity with orchestration tools like Apache Airflow.
Experience with other data processing tools such as Kafka or Pandas/NumPy.
Knowledge of API development and creating RESTful services.
Experience with data file formats like Parquet, ORC, and Avro.
Experience with Agile development methodologies.
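Regarding the Airflow nice-to-have: Airflow models a pipeline as a directed acyclic graph (DAG) of tasks and runs them in dependency order. This stdlib-only sketch (graphlib, Python 3.9+) shows that core idea with hypothetical task names; a real Airflow DAG would declare operators and wire dependencies with the `>>` operator instead.

```python
# Core idea behind Airflow orchestration: tasks form a DAG and run in
# dependency order. Task names here are hypothetical; this uses the
# stdlib graphlib module rather than Airflow itself.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']
```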
People Prime Worldwide