Posted: 2 months ago
Work from Office | Full Time
About The Role

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

As a Python Developer with Databricks, you will be responsible for developing and maintaining scalable data pipelines, managing cloud environments on Azure, and ensuring smooth integration with APIs. The ideal candidate will be proficient in Python, Databricks (PySpark), and Azure DevOps, with a strong understanding of cloud services, DevOps practices, and API testing.

Notice Period: 30 to 90 days

Key Responsibilities

- Develop and Maintain Data Pipelines: Design, develop, and maintain scalable data pipelines using Python and Databricks (PySpark).
- Data Processing: Apply strong proficiency in Python and advanced Python concepts to process and manipulate large datasets effectively.
- API Ingestion: Ingest data from APIs, working with JSON to integrate and automate data workflows.
- Cloud Management: Use the Azure Portal to manage cloud environments and services.
- Databricks PySpark: Work with Databricks and PySpark to build distributed data processing applications.
- DevOps & Agile Methodology: Implement DevOps best practices and work within a Scrum framework to deliver continuous integration and continuous delivery (CI/CD) pipelines.
- API Testing & Automation: Use Postman to test and automate APIs for robust integration and data workflows.
- Collaboration: Work closely with cross-functional teams to implement solutions aligned with business objectives and technical requirements.

Required Qualifications

- Programming Skills: Strong proficiency in Python with experience in data processing libraries (e.g., Pandas, NumPy).
- Databricks Experience: Hands-on experience with Databricks (PySpark) for data processing and analysis.
- Cloud Platform: Experience using the Azure Portal to manage cloud environments and services.
- API Handling: Expertise in working with APIs, specifically data ingestion and integration in JSON format.
- DevOps Methodology: Familiarity with DevOps practices and experience working in Agile/Scrum environments.
- API Testing Tools: Proficiency with Postman for API testing and automation.
- Version Control: Experience using Visual Studio Code and version control systems such as Git.

Preferred Qualifications

- Familiarity with Azure DevOps for building and deploying CI/CD pipelines.
- Experience working with large-scale data processing frameworks such as Apache Spark or Hadoop.
- Azure certifications (e.g., Azure Data Engineer, Azure Developer) are a plus.

Skills & Attributes

- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration abilities.
- Ability to manage multiple priorities and meet deadlines in a fast-paced environment.
- A proactive mindset focused on continuous improvement and automation.