
9 Python Coding Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

18 - 30 Lacs

Hyderabad

Hybrid

Greetings from Euclid Innovations! We are hiring a Python Developer for one of our MNC clients in Hyderabad.

Position: Python Developer
Experience: 8+ years
Location: Hyderabad
Work Mode: Hybrid
Interview Mode: Client round, face-to-face in Hyderabad
Notice Period: Immediate to 20 days max

Job Description:
We are seeking a skilled Python Developer to join our team and contribute to designing, developing, and maintaining high-performance applications. The ideal candidate has strong experience in Python, along with expertise in web frameworks such as Flask or Django, database management, and API development.

Required Skills & Qualifications:
- Strong proficiency in Python (3.x) and knowledge of OOP principles.
- Experience with Flask or Django for web application development.
- Proficiency in working with databases (SQL and NoSQL).
- Hands-on experience with RESTful API development and integration.
- Familiarity with version control tools such as Git, GitHub, or GitLab.
- Experience with cloud platforms (AWS, Azure, or Google Cloud) is a plus.
- Knowledge of containerization tools such as Docker and Kubernetes is an advantage.
- Strong debugging, testing, and problem-solving skills.
- Experience with CI/CD pipelines is a plus.
- Ability to work independently and collaboratively in an agile environment.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience with asynchronous programming (Celery, RabbitMQ) is a plus.
- Knowledge of data processing, analytics, or AI/ML frameworks is beneficial.

Posted 5 days ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Chennai

Work from Office

Position: Senior Python Developer
Experience: 6-8 Years
Location: Chennai

About the Role:
We are looking for someone who is passionate about writing clean, high-performance code and is excited by data-driven applications. The ideal candidate will bring strong expertise in Python and modern data processing libraries, with a proven track record of building and scaling backend systems. This role is perfect for someone who is proactive, quick to learn, and thrives in a fast-paced environment.

Key Responsibilities:
- Design, develop, and maintain backend systems and APIs using Python and FastAPI.
- Work with data-centric libraries such as pandas, polars, and numpy to build scalable data pipelines and services.
- Implement and manage asynchronous task scheduling using FastAPI schedulers or similar tools.
- Contribute to architectural decisions and mentor junior developers.
- Participate in code reviews, testing, and troubleshooting.
- Work closely with data teams and business stakeholders to understand requirements and deliver effective solutions.
- Continuously learn and adapt to new technologies and best practices.

Must-Have Qualifications:
- 6-8 years of professional experience with Python.
- Proficiency in pandas, polars, and numpy.
- Strong experience building APIs with FastAPI or similar frameworks.
- Hands-on experience with task scheduling and background job management.
- Excellent problem-solving and communication skills.
- A passion for learning and the ability to pick up new technologies quickly.

Good to Have:
- Experience with C# in enterprise environments.
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, SQL Server).
- Familiarity with Docker, data versioning, job orchestration tools, or cloud platforms.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Hyderabad

Work from Office

We're Hiring: Python Automation Test Engineer / Automation Developer
Location: Hyderabad
Experience: 2-4 years

Are you a Python enthusiast with a passion for automation? We're looking for a skilled professional who can take ownership, innovate, and drive automation initiatives.

Must-Have Skills:
- Strong Python experience (2-3 years).
- Ability to create automation scripts and build flows.
- Direct collaboration with developers.
- Quick learner, especially with UI-based tools.

Good to Have:
- Experience with Power Automate or UI automation tools (part of RPA).
- SQL knowledge.
- Ability to troubleshoot and improve existing flows.
- Capability to create new use cases and deliver individual projects.

We're exploring new tools and looking for someone who can adapt and grow with us. If you're ready to be part of a dynamic environment and contribute hands-on to automation innovations, we'd love to connect!
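"Automation scripts and flows" in roles like this often start as small glue steps over exported data. A hedged stdlib-only sketch (the CSV schema with an `id` and `status` column is invented for the example):

```python
import csv
import io


def summarize_report(csv_text: str) -> dict:
    """Toy automation step: parse a CSV export and count rows per status.

    In an RPA flow this result might feed a Power Automate action or an
    email notification; here it just returns the counts.
    """
    counts: dict[str, int] = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        counts[row["status"]] = counts.get(row["status"], 0) + 1
    return counts
```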

Posted 3 weeks ago

Apply

10 - 13 years

20 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Position: SDET with Python coding
Location: Bangalore / Chennai / Hyderabad
Notice Period: Immediate joiners
Experience: 10+ years required

Requirements:
- 10+ years of experience in quality processes and test automation.
- Python coding skills.
- Good knowledge of framework development.
- Strong on SQL queries and stored procedures.
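The SDET combination of test automation and SQL can be sketched with an in-memory SQLite fixture: a setup step builds known test data, and an assertion-style check exercises a query. The `orders` schema and the `fetch_overdue` helper are invented for illustration:

```python
import sqlite3


def fetch_overdue(conn: sqlite3.Connection, days: int) -> list:
    """Query under test: names of orders older than `days` (schema is illustrative)."""
    cur = conn.execute(
        "SELECT name FROM orders WHERE age_days > ? ORDER BY name", (days,)
    )
    return [row[0] for row in cur.fetchall()]


def make_test_db() -> sqlite3.Connection:
    """Build an in-memory fixture DB, as a test-framework setup step would."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (name TEXT, age_days INTEGER)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)", [("a", 5), ("b", 40), ("c", 90)]
    )
    return conn
```

In a pytest-based framework, `make_test_db` would typically become a fixture and each query check a parametrized test case.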

Posted 4 weeks ago

Apply

9 - 12 years

20 - 30 Lacs

Pune

Work from Office

Role & Responsibilities:
- Advise on, manage, and support an enterprise-class Splunk environment.
- Support system administration activities on Linux, Splunk Enterprise, and related applications.
- Participate in production support activities for Splunk.
- Design the Splunk system to meet growth while maintaining a balance between performance/stability and agility.
- Develop advanced scripts for manipulating multiple data repositories to support analyst requirements.
- Onboard and normalize new security and privacy event data into Splunk.
- Develop advanced reports, dashboards, or alerts to meet the requirements of critical initiatives.
- Develop scalable security management tools and processes.
- Develop automation supporting Splunk application and data management.
- Create customized searches and applications using programming/development skills such as Java, Python, shell scripting, and regular expressions.
- Automate deployment, integration, and testing of enterprise systems and services.
- Communicate clearly to technical and business audiences.
- Be well organized, with a healthy sense of urgency and the ability to set, communicate, and meet aggressive deadlines and milestones.
- Be self-motivated: learn quickly and deliver results with minimal supervision.
- Quickly understand and interpret customer problems and navigate complex organizations.
- Represent the group in a friendly, courteous, and professional manner.

Preferred Candidate Profile:
- Splunk Core Certified Consultant
- Splunk Enterprise Certified Architect

Additional Benefits:
- Free transport (pickup & drop)
- Relocation benefits provided

Working Model: WFO 5 days initially, hybrid later
Shift: 3:00 PM IST to 12:00 AM IST
On-call: One person on-call outside shift hours on weekdays and weekends, ensuring 24/7 coverage
Shift Allowance: NA
Interview Process: 3-4 rounds
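The "customized searches using Python and regular expressions" part of this role amounts to field extraction from event data. A minimal sketch, with an invented log-line format (not an actual Splunk event schema):

```python
import re

# Illustrative log line format: ip - user "action"
LOG_PATTERN = re.compile(
    r'(?P<ip>\d+\.\d+\.\d+\.\d+) - (?P<user>\w+) "(?P<action>\w+)"'
)


def parse_event(line: str):
    """Extract named fields from a log line, the way a Splunk field
    extraction (rex / props.conf stanza) would. Returns None on no match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None
```

In Splunk itself the equivalent named-group regex would live in a `rex` search command or a props/transforms configuration rather than Python, but the capture-group mechanics are the same.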

Posted 1 month ago

Apply

3 - 4 years

6 - 12 Lacs

Chennai

Work from Office

Position: Python Developer
Experience: 3-4 Years
Location: Chennai

About the Role:
We are seeking a motivated Python programmer with 2-3 years of professional experience to join our dynamic team. The ideal candidate will possess strong problem-solving skills and practical experience with Python libraries, API development, and SQL querying. This role offers an excellent opportunity for individuals eager to expand their skills by learning and working with PowerFX for app development in PowerApps and Power Automate.

Key Responsibilities:
- Develop, test, and maintain robust Python-based solutions.
- Utilize Pandas and NumPy for data analysis and manipulation.
- Build and optimize APIs using FastAPI or other web frameworks.
- Write efficient SQL queries to extract and manipulate data from relational databases.
- Debug and troubleshoot code effectively to ensure high-quality deliverables.
- Collaborate with cross-functional teams to understand project requirements and deliver solutions.
- Actively learn and implement PowerFX for developing applications using PowerApps and Power Automate.

Required Skills and Qualifications:
- Programming skills: strong problem-solving abilities in Python.
- Libraries: hands-on experience with Pandas and NumPy.
- API development: proficiency in FastAPI or similar web frameworks for building APIs.
- Debugging: excellent debugging and troubleshooting skills.
- Database knowledge: proficient in SQL querying and optimization.
- Willingness to learn: enthusiastic about learning and working with PowerFX for app development in PowerApps and Power Automate.
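The "Pandas and NumPy for data analysis and manipulation" requirement typically means deriving new columns from existing ones. A small sketch (the z-score column and the column names are invented for the example):

```python
import pandas as pd


def add_zscore(df: pd.DataFrame, col: str) -> pd.DataFrame:
    """Typical pandas/NumPy manipulation: derive a standardized (z-score)
    column from a numeric column, without mutating the input frame."""
    out = df.copy()
    values = out[col].to_numpy(dtype=float)  # drop into NumPy for the math
    out[f"{col}_z"] = (values - values.mean()) / values.std()
    return out
```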

Posted 2 months ago

Apply

10 - 20 years

45 - 50 Lacs

Mumbai

Work from Office

Model Risk Management's mission is to manage model risk globally, independently and actively, in line with the bank's risk appetite, with responsibility for:
- Performing robust independent model validation.
- Ensuring early and proactive identification of model risks.
- Designing and recommending model risk appetite.
- Effectively managing and mitigating model risks.
- Establishing model risk metrics.
- Designing and implementing a strong model risk management and governance framework.
- Creating bank-wide market risk policies.

IMM (Internal Model Method) is a risk management approach used by banks to calculate counterparty credit risk (CCR) exposure for derivatives, securities financing transactions (SFTs), and other financial instruments. The IMM Model Validation team, as part of MoRM, is responsible for independent review and analysis of all IMM forward pricing models used to calculate the key components of IMM, i.e. Potential Future Exposure (PFE), Expected Exposure (EE), and Effective Expected Positive Exposure (EEPE).

Your Key Responsibilities:
- Independently review and validate IMM forward pricing models.
- As a quantitative analyst in Mumbai, work closely with the validation teams in Berlin and London to produce, analyse, and document validation testing.
- Review and analysis require a good understanding of derivative pricing models, implementation methods, derivative products, and associated risks.
- The outcome of this independent review and analysis will form the basis of discussion with key stakeholders, including front-office quants, market risk managers, and finance controllers.

Your Skills and Experience:
- Excellent mathematical ability, with an understanding of stochastic calculus, partial differential equations, Longstaff-Schwartz Monte Carlo, and numerical algorithms.
- Strong understanding of financial markets (especially derivative pricing), demonstrated by qualifications and experience.
- Strong understanding of the key IMM metrics: PFE, EE, EEPE, CVA, WWR.
- Experience in model validation.
- Proficiency in Python coding.
- Excellent communication skills, both written and oral.

Education/Qualifications:
- Academic degree in a quantitative discipline (e.g. Mathematical Finance, Mathematics, Physics, Financial Engineering).
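To give a flavour of the Python-plus-quant work here: Expected Exposure at a horizon t is the mean of the positive part of the simulated mark-to-market. A deliberately simplified Monte Carlo sketch under geometric Brownian motion for a single forward-style payoff; real IMM models cover full portfolios, netting, and collateral, none of which appears here:

```python
import numpy as np


def expected_exposure(s0, strike, vol, rate, t, n_paths=100_000, seed=0):
    """Toy Monte Carlo Expected Exposure (EE) at time t.

    Simulates terminal prices under GBM and averages the positive part of
    (S_t - strike). A stand-in for the EE/PFE/EEPE profiles that IMM
    validation examines; all parameters here are illustrative.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # GBM terminal value: S_t = S_0 * exp((r - v^2/2) t + v sqrt(t) Z)
    st = s0 * np.exp((rate - 0.5 * vol**2) * t + vol * np.sqrt(t) * z)
    exposure = np.maximum(st - strike, 0.0)  # only positive MtM is an exposure
    return float(exposure.mean())
```

PFE would instead take a high quantile (e.g. the 95th or 97.5th percentile) of the same simulated exposure distribution, and EEPE averages a running maximum of EE over time.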

Posted 3 months ago

Apply

6 - 9 years

35 Lacs

Bengaluru

Work from Office

We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role in which you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools such as Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.

Key Responsibilities:
- Design, build, and maintain ETL pipelines: develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
- Data transformation with dbt: use dbt to build modular, reusable transformation workflows that align with the principles of data products.
- Cloud expertise: leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
- Data quality & governance: enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
- Performance optimization: continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
- Collaboration & ownership: work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations; take full ownership of your deliverables.
- Documentation & standards: maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
- Troubleshooting & issue resolution: proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.

Required Skills & Experience:
- 10+ years (Lead) or 7+ years (Developer) of hands-on experience designing and implementing ETL workflows in large-scale environments.
- Advanced proficiency in Python for scripting, automation, and data processing.
- Expert-level knowledge of SQL for querying large datasets, with performance optimization techniques.
- Deep experience with modern transformation tools such as dbt in production environments.
- Strong expertise in cloud platforms such as Google Cloud Platform (GCP), with hands-on BigQuery experience.
- Familiarity with Data Mesh principles and distributed data architectures is mandatory.
- Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
- Exceptional problem-solving skills with a strong focus on delivering results.

What We Expect:
This is a demanding role that requires:
1. A proactive mindset: you take initiative without waiting for instructions.
2. A commitment to excellence: no shortcuts or compromises on quality.
3. Accountability: you own your work end-to-end and deliver on time.
4. Attention to detail: precision matters; mistakes are not acceptable.

Location: Pan India
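The "validation checks" half of the data-quality responsibility can be sketched as a pre-load gate that inspects a batch and reports rule violations. The column names (`id`, `amount`) and the three rules are invented for the example; in a dbt project the equivalent checks would usually be declared as schema tests:

```python
import pandas as pd


def validate_batch(df: pd.DataFrame) -> list:
    """Minimal data-quality gate of the kind an ETL pipeline runs before
    loading a batch. Returns a list of human-readable rule violations;
    an empty list means the batch passes."""
    problems = []
    if df["id"].duplicated().any():
        problems.append("duplicate ids")
    if df["amount"].lt(0).any():
        problems.append("negative amounts")
    if df["id"].isna().any():
        problems.append("null ids")
    return problems
```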

Posted 3 months ago

Apply

8 - 10 years

25 - 30 Lacs

Pune

Work from Office

Responsibilities:
- Write Python to deliver a wide variety of machine learning and data science solutions.
- Implement models and algorithms on Databricks and H2O.
- Build tools to accelerate feature experimentation and exploration, with lineage, data privacy protection, and easier model performance debugging.
- Collaborate with product owners to apply Workday's agile processes, and be responsible for the initiation, delivery, and communication of projects for the stakeholder organizations.
- Build ML-as-a-service, with the purpose of taking experiments to production quickly.
- Share learnings and project findings with the wider Workday Machine Learning Community.
- Willingness to work across multiple time zones.

What You Will Bring:
- Proficient experience in Python coding, SQL, and Spark, including developing models and other data science work in Python.
- 3-4+ years of experience implementing models or machine learning algorithms in production.
- Experience on any of these cloud platforms: AWS, Azure, GCP.
- Primary skills: Python, Pandas, NumPy, R, Go, Julia, Jupyter Notebooks.
- ML libraries: scikit-learn, XGBoost, H2O, AutoML.
- Deep learning libraries: PyTorch, TensorFlow/Keras, GNN, EdgeAI.
- NLP libraries: NLTK, Gensim, spaCy, Sentence Transformers.
- Computer vision: OpenCV, PIL.
- Evaluation and explainability: SHAP, LIME.
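"Implementing models in production" with the scikit-learn stack listed above usually starts from a Pipeline so that preprocessing and the model are trained and shipped as one object. A minimal sketch (the toy 1-D data and the choice of logistic regression are illustrative, not from the posting):

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def train(X, y):
    """Fit a scaling + classifier pipeline: the shape most production
    scikit-learn model code starts from, since the fitted pipeline can be
    pickled and deployed as a single artifact."""
    model = make_pipeline(StandardScaler(), LogisticRegression())
    model.fit(X, y)
    return model
```

On Databricks or in an ML-as-a-service setting, the same fitted pipeline would typically be logged to a model registry (e.g. MLflow) rather than pickled by hand.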

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
