Risk and Finance Python with ADB Developer

Experience: 4 - 9 years

Salary: 6 - 11 Lacs

Posted: 2 months ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

About The Role

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

As a Python Developer with Databricks, you will be responsible for developing and maintaining scalable data pipelines, managing cloud environments on Azure, and ensuring smooth integration with APIs. The ideal candidate is proficient in Python, Databricks (PySpark), and Azure DevOps, with a strong understanding of cloud services, DevOps practices, and API testing.

Notice Period: 30 to 90 days

Key Responsibilities

• Develop and Maintain Data Pipelines: Design, develop, and maintain scalable data pipelines using Python and Databricks (PySpark).
• Data Processing: Apply strong proficiency in Python and advanced Python concepts to process and manipulate large datasets effectively.
• API Ingestion: Ingest data from APIs in JSON format to integrate and automate data workflows.
• Cloud Management: Use the Azure Portal to manage cloud environments and services.
• Databricks and PySpark: Work with Databricks and PySpark to build distributed data processing applications.
• DevOps & Agile Methodology: Implement DevOps best practices and work within a Scrum framework to maintain continuous integration and continuous delivery (CI/CD) pipelines.
• API Testing & Automation: Use Postman to test and automate APIs for robust integration and data workflows.
• Collaboration: Work closely with cross-functional teams to implement solutions aligned with business objectives and technical requirements.

Primary Skills / Required Qualifications

• Programming Skills: Strong proficiency in Python, with experience in data processing libraries (e.g., Pandas, NumPy).
• Databricks Experience: Hands-on experience with Databricks (PySpark) for data processing and analysis.
• Cloud Platform: Experience using the Azure Portal to manage cloud environments and services.
• API Handling: Expertise in working with APIs, specifically data ingestion and integration in JSON format.
• DevOps Methodology: Familiarity with DevOps practices and experience working in Agile/Scrum environments.
• API Testing Tools: Proficiency with Postman for API testing and automation.
• Version Control: Experience using Visual Studio Code and version control systems such as Git.

Preferred Qualifications

• Familiarity with Azure DevOps for building and deploying CI/CD pipelines.
• Experience with large-scale data processing frameworks such as Apache Spark or Hadoop.
• Azure certifications (e.g., Azure Data Engineer, Azure Developer) are a plus.

Skills & Attributes

• Excellent problem-solving and troubleshooting skills.
• Strong communication and collaboration abilities.
• Ability to manage multiple priorities and meet deadlines in a fast-paced environment.
• A proactive mindset focused on continuous improvement and automation.
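As a rough illustration of the kind of pipeline work described above, the sketch below shows a minimal PySpark job that ingests JSON records from a REST API and appends them to a Delta table. The endpoint URL, storage path, and schema handling are hypothetical assumptions for illustration only, not part of this role's actual codebase.

```python
# Illustrative sketch only: ingest JSON from a (hypothetical) REST endpoint
# into a Delta table on Databricks using Python and PySpark.
import requests
from pyspark.sql import SparkSession, Row

spark = SparkSession.builder.appName("json-api-ingestion").getOrCreate()

# Hypothetical API endpoint; assumed to return a JSON array of flat records.
response = requests.get("https://example.com/api/risk-positions", timeout=30)
response.raise_for_status()
records = response.json()

# Convert each JSON object to a Row so Spark can infer column names and types;
# a production pipeline would normally supply an explicit schema instead.
df = spark.createDataFrame([Row(**r) for r in records])

# Basic cleanup, then persist for downstream consumers (path is illustrative).
df = df.dropDuplicates()
df.write.mode("append").format("delta").save("/mnt/datalake/risk_positions")
```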

About Capgemini

Capgemini

IT Services and IT Consulting

Paris, France

10001 Employees

5131 Jobs

Key People

• Aiman Ezzat, Chief Executive Officer
• Carole Ferrand, Group Chief Financial Officer
