Posted: 3 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Contractual

Job Description

About the Client:

Our client is a French multinational information technology (IT) services and consulting company headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations.

The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organisations accelerate their transition to a digital and sustainable world.

They provide consulting, technology, professional, and outsourcing services.


Job Details:

Location: Pune, Chennai

Mode of Work: Hybrid

Notice Period: Immediate joiners

Experience: 6-15 years

Type of Hire: Contract to hire



Responsibilities:

  • Design, develop, and maintain scalable data pipelines using Python and modern data engineering tools.
  • Apply an understanding of banking data models, financial reporting, and regulatory compliance requirements.
  • Integrate data from legacy banking platforms into modern data lakes and warehouses.
  • Collaborate with business analysts, architects, and operations teams to understand core banking workflows and data requirements.
  • Design, build, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse (Azure SQL DW), and Azure Data Lake.
  • Perform data analysis and transformation using PySpark on Azure Databricks or Apache Spark.
  • Design and implement data models optimized for storage and various query patterns.
  • Work with structured, semi-structured, and unstructured data.
  • Utilize various database technologies including Traditional RDBMS, MPP, and NoSQL.
  • Ensure compliant handling and management of data.
  • Build end-to-end data pipelines in a cloud environment.
  • Mentor junior engineers and contribute to the development of best practices and reusable frameworks.
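In production, the pipeline work above would run as PySpark jobs on Azure Databricks; as a minimal sketch, the extract-transform-load shape can be shown with stdlib-only Python (the record fields, account IDs, and the "completed transactions" filter rule below are hypothetical examples, not part of the role's actual data model):

```python
from collections import defaultdict

def transform(records):
    """Keep completed transactions and total amounts per account."""
    totals = defaultdict(float)
    for rec in records:
        if rec["status"] == "COMPLETED":
            totals[rec["account_id"]] += rec["amount"]
    return dict(totals)

# Extract: in a real pipeline these rows would come from a legacy
# banking platform via Azure Data Factory; here they are inlined.
raw = [
    {"account_id": "A1", "amount": 100.0, "status": "COMPLETED"},
    {"account_id": "A1", "amount": 50.0,  "status": "PENDING"},
    {"account_id": "A2", "amount": 75.0,  "status": "COMPLETED"},
]

# Transform + load: aggregate, then hand off to the warehouse layer.
print(transform(raw))  # → {'A1': 100.0, 'A2': 75.0}
```

A PySpark version would replace the loop with a `filter` plus `groupBy().sum()` over a DataFrame, but the staging of extract, transform, and load is the same.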

Skills Expected:

  • 8+ years of relevant IT experience in the BI/DW domain with hands-on experience on the Azure modern data platform.
  • Strong knowledge of and experience with the Python programming language.
  • Experience using version control systems such as Git and Bitbucket.
  • Experience in data analysis and transformation using PySpark on Azure Databricks or Apache Spark.
  • Good knowledge of Distributed Processing using Databricks or Apache Spark.
  • Experience in creating data structures optimized for storage and various query patterns.
  • Experience integrating Unity Catalog with Databricks and registering data assets.
  • Understanding of CI/CD pipelines.
  • Knowledge of Agile/Scrum methodologies.
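Registering a data asset in Unity Catalog (listed above) is typically a short SQL statement run on a Databricks cluster. A hedged sketch that only builds such a statement, so it can be inspected without a cluster; the catalog, schema, table, and storage path are all hypothetical:

```python
def register_table_sql(catalog, schema, table, location):
    """Build a Databricks SQL statement that registers an existing
    Delta table at `location` as `catalog.schema.table` in Unity Catalog."""
    return (
        f"CREATE TABLE IF NOT EXISTS {catalog}.{schema}.{table} "
        f"USING DELTA LOCATION '{location}'"
    )

# Hypothetical names; on Databricks you would pass this to spark.sql(...).
sql = register_table_sql(
    "finance", "core_banking", "transactions",
    "abfss://datalake@storageacct.dfs.core.windows.net/banking/transactions",
)
print(sql)
```

On a real workspace the statement would be executed via `spark.sql(sql)`, after which the table is discoverable and governable through Unity Catalog.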
