
24 Advanced Python Jobs

JobPe aggregates listings for convenient browsing, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Analytics Lead at Cummins Inc., you will be responsible for facilitating data, compliance, and environment governance processes for the assigned domain. Your role includes:
- Leading analytics projects to provide insights for the business, integrating data analysis findings into governance solutions, and ingesting key data into the data lake while ensuring the creation and maintenance of relevant metadata and data profiles.
- Coaching team members, business teams, and stakeholders to find necessary and relevant data, contributing to communities of practice that promote responsible analytics use, and developing the capability of peers and team members within the Analytics Ecosystem.
- Mentoring and reviewing the work of less experienced team members, integrating data from various source systems to build models for business use, and cleansing data to ensure accuracy and reduce redundancy.
- Leading the preparation of communications to leaders and stakeholders, designing and implementing data/statistical models, collaborating with stakeholders on analytics initiatives, and automating complex workflows and processes using tools like Power Automate and Power Apps.
- Managing version control and collaboration using GitLab, utilizing SharePoint for project management and data collaboration, and providing regular updates on work progress to stakeholders via JIRA/Meets.

Qualifications:
- College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required.
- This position may require licensing for compliance with export controls or sanctions regulations.

Competencies: Balancing stakeholders - Collaborating effectively - Communicating clearly and effectively - Customer focus - Managing ambiguity - Organizational savvy - Data Analytics - Data Mining - Data Modeling - Data Communication and Visualization - Data Literacy - Data Profiling - Data Quality - Project Management - Valuing differences

Technical Skills: Advanced Python - Databricks, PySpark - Advanced SQL, ETL tools - Power Automate - Power Apps - SharePoint - GitLab - Power BI - Jira - Mendix - Statistics

Soft Skills: Strong problem-solving and analytical abilities - Excellent communication and stakeholder management skills - Proven ability to lead a team - Strategic thinking - Advanced project management

Experience: Intermediate level of relevant work experience required. This is a hybrid role.

Join Cummins Inc. and be part of a dynamic team where you can use your technical and soft skills to make a significant impact in the field of data analytics.

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are seeking a skilled Data Science Trainer to join our team.

Salary: ₹35,000 - ₹60,000 per month
Experience: 5 to 10 years
Job Type: Full-time, Permanent, Fresher

Skills Required:
- Expertise in Machine Learning
- Proficiency in Power BI
- Strong knowledge of Artificial Intelligence
- In-depth knowledge of Core Python and Advanced Python
- Proficiency in libraries such as NumPy, Pandas, and Matplotlib
- Excellent hands-on English to deliver lessons across video channels as well as offline and online trainings (experience delivering training to both freshers and experienced candidates)

Qualification: Graduate degree (B.Tech, BCA, or MCA preferred); any graduate with good knowledge may also apply.

Requirements:
- Excellent communication skills in English
- Ability to deliver engaging and practical training sessions
- Experience in creating training materials and curriculum development

Job Type: Full-time, Work from Office
Location: G-13, 2nd Floor, Sector 3, Near Sec 16 Metro Station, Noida

If interested, please send your CV to [HIDDEN TEXT] or contact us at +91-8448085414.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You will be joining our engineering team in Ahmedabad as a Software Engineer. Your main responsibility will be designing and developing enterprise software for our Global Fortune 500 clients in the Data Analytics, Security, and Cloud segments. Expertise in Core and Advanced Python, with experience developing REST APIs using any framework, will be crucial for this role.

Your responsibilities will include defining, integrating, and upgrading a comprehensive architecture to support Java applications in line with organizational goals. You will provide expertise across the software development life cycle, lead and mentor a small team, ensure code reviews and development best practices are followed, and actively engage in regular client communication. Estimating effort, identifying risks, providing technical support, and effective people and task management will be key aspects of the role. You must also be able to multitask, re-prioritize responsibilities as requirements change, and work with minimal supervision.

To be successful in this role, you should have at least 2 years of experience in software architecture, system design, and development, along with extensive software development experience in Python. Experience developing RESTful web services using any framework, strong computer science fundamentals in object-oriented design and data structures, and familiarity with Linux programming are essential; expertise in Big Data, networking, storage, or virtualization is a plus. Working knowledge of Agile software development methodology, excellent oral and written communication skills, problem-solving abilities, and analytical skills are also required. A minimum qualification of a BE in Computer Science or equivalent is required.

Posted 4 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities:
- Implementing various development, testing, and automation tools and IT infrastructure
- Excellent understanding of Ruby, Perl, Python, APIs, Java, OOP concepts, Advanced Python, Jenkins, and services experience
- Automation framework experience with Python, Selenium WebDriver, TestNG, and Robotium
- Experience working on Linux-based infrastructure
- Experience in UI test case automation is a must
- Experience in backend API test automation is a must
- Working knowledge of Python and DevOps tools such as Git and GitHub
- Experience in network, server, and application-status monitoring
- Build and release management using the Jenkins continuous integration tool
- Configuring and managing databases such as Postgres, including replication
- Excellent troubleshooting skills
- Working knowledge of various tools, open-source technologies, and cloud services
- Awareness of critical DevOps concepts and Agile principles
- Working on ways to automate and improve development and release processes
- Monitoring processes across the entire lifecycle for adherence, and updating or creating processes to drive improvement and minimize waste
- Incident management and root cause analysis
- Striving for continuous improvement and building a continuous integration / continuous delivery (CI/CD) pipeline
- Working knowledge of databases and SQL (Structured Query Language)
- Expertise in code deployment tools (Ansible)
- Experience with GitHub; Perforce is preferred

Technical Skills: Ruby, Perl, Python, API, Java, OOP Concepts, Advanced Python, Jenkins, Selenium WebDriver, TestNG, Robotium, Linux, UI Test Automation, Backend API Test Automation, Git, GitHub, Network Monitoring, Server Monitoring

Posted 5 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chandigarh, India

On-site

Responsibilities:
- Implementing various development, testing, and automation tools and IT infrastructure
- Excellent understanding of Ruby, Perl, Python, APIs, Java, OOP concepts, Advanced Python, Jenkins, and services experience
- Automation framework experience with Python, Selenium WebDriver, TestNG, and Robotium
- Experience working on Linux-based infrastructure
- Experience in UI test case automation is a must
- Experience in backend API test automation is a must
- Working knowledge of Python and DevOps tools such as Git and GitHub
- Experience in network, server, and application-status monitoring
- Build and release management using the Jenkins continuous integration tool
- Configuring and managing databases such as Postgres, including replication
- Excellent troubleshooting skills
- Working knowledge of various tools, open-source technologies, and cloud services
- Awareness of critical DevOps concepts and Agile principles
- Working on ways to automate and improve development and release processes
- Monitoring processes across the entire lifecycle for adherence, and updating or creating processes to drive improvement and minimize waste
- Incident management and root cause analysis
- Striving for continuous improvement and building a continuous integration / continuous delivery (CI/CD) pipeline
- Working knowledge of databases and SQL (Structured Query Language)
- Expertise in code deployment tools (Ansible)
- Experience with GitHub; Perforce is preferred

Technical Skills: Ruby, Perl, Python, API, Java, OOP Concepts, Advanced Python, Jenkins, Selenium WebDriver, TestNG, Robotium, Linux, UI Test Automation, Backend API Test Automation, Git, GitHub, Network Monitoring, Server Monitoring

Posted 5 days ago

Apply

5.0 - 7.0 years

20 - 30 Lacs

Bengaluru

Work from Office

Role & responsibilities

Key Responsibilities:
- Design, build, and maintain Python plugins that encapsulate business rules and transformation logic such as custom aggregation/disaggregation, rule-based data enrichment, and time-grain manipulations.
- Implement data-cleaning, validation, mapping, and merging routines using Pandas, NumPy, and other Python libraries to prepare inputs for downstream analytics.
- Define clear interfaces and configuration schemes for plugins (for example, picklists for output grains and rule codes) and package them for easy consumption by implementation teams.
- Profile and optimize Python code paths, leveraging vectorized operations, efficient merges, and in-memory transformations, to handle large datasets with low latency.
- Establish and enforce coding standards, write comprehensive unit and integration tests with pytest (or similar), and ensure high coverage for all new components.
- Triage and resolve issues in legacy scripts, refactor complex routines for readability and extensibility, and manage versioning across releases.
- Collaborate with data engineers and analysts to capture requirements, and document plugin behaviors, configuration parameters, and usage examples in code repositories or internal wikis.

Requirements (Must-Have Skills):
- 5-7 years of hands-on experience writing well-structured Python code, with deep familiarity with both object-oriented and functional programming paradigms.
- Expert mastery of Pandas and NumPy for transformations (group-by aggregations, merges, column operations), plus strong comfort with Python's datetime and copy modules.
- Proven experience designing modular Python packages, exposing configuration through parameters or picklists, and managing versioned releases.
- Ability to translate complex, rule-driven requirements (such as disaggregation rules, external-table merges, and priority ranking) into clean Python functions and classes.
- Proficiency with pytest (or equivalent), mocking, and integrating with CI pipelines (e.g., GitHub Actions, Jenkins) for automated testing.
- Skilled use of Python's logging module to instrument scripts, manage log levels, and capture diagnostic information.
- Strong Git workflow experience, including branching strategies, pull requests, code reviews, and merge management.
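The disaggregation logic this role describes (splitting a coarse-grain value across a finer time grain by rule) can be sketched in a few lines; this is a minimal stdlib-only illustration with invented names, not the employer's actual plugin API, and in production the same step would typically be a vectorized Pandas group-by:

```python
def disaggregate(totals, weights):
    """Split each coarse-grain total across its fine-grain members
    in proportion to the supplied weights, e.g. monthly -> weekly."""
    out = {}
    for key, total in totals.items():
        w = weights[key]            # weight per fine-grain member
        s = sum(w.values())         # normalize so shares sum to 1
        for member, share in w.items():
            out[(key, member)] = total * share / s
    return out

# Toy example: split a monthly total of 100 across four weeks, with
# week 4 weighted double.
monthly = {"2024-01": 100.0}
week_weights = {"2024-01": {"W1": 1, "W2": 1, "W3": 1, "W4": 2}}
weekly = disaggregate(monthly, week_weights)
# weekly[("2024-01", "W4")] == 40.0; the weekly values sum back to 100.0
```

A plugin would expose the weight source (a picklist or external table) as configuration rather than hard-coding it.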

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced FS Python trainer at Naresh IT's KPHB and Ameerpet branches, you will be responsible for delivering high-quality classroom training in Core and Advanced Python as well as Django. We are looking for a candidate with a strong understanding of these technologies and a rich background in software training. This is a full-time position with a day shift schedule.

The ideal candidate should have at least 4 years of experience in software training, with a focus on Python. The ability to communicate complex concepts effectively and engage with students is essential for this role.

The work location is Hyderabad, Telangana; the successful candidate will deliver in-person training sessions at our branches. If you are passionate about Python and have a proven track record in training, we encourage you to apply for this exciting opportunity.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will be joining a pioneering AI team responsible for designing and deploying cutting-edge deep learning solutions for computer vision and audio analysis. Your main tasks will include designing, developing, and optimizing deep learning models for image/video analysis (object detection, segmentation) and audio classification tasks. You will work with CNN architectures, Vision Transformers (ViT, Swin), and attention mechanisms (SE, CBAM, self/cross-attention) to address complex real-world challenges.

In this role, you will process multi-modal data, including video and audio. For video, you will apply spatiotemporal modeling (3D CNNs, temporal attention); for audio, you will extract features (spectrograms, MFCCs) and build classification pipelines. You will also utilize pretrained models through transfer learning and multi-task learning frameworks, and optimize models for accuracy, speed, and robustness using PyTorch/TensorFlow. Collaboration with MLOps teams to deploy solutions into production is a key aspect of this role.

Required skills include advanced programming in Python (PyTorch/TensorFlow); expertise in computer vision concepts such as Vision Transformers, object detection techniques (YOLO, SSD, Faster R-CNN, DETR), and video analysis methods including temporal modeling; and experience in audio processing, attention mechanisms, transfer learning, and training strategies. Experience handling large-scale datasets and building data pipelines is also essential.

Preferred qualifications include exposure to multi-modal learning, familiarity with R for statistical analysis, and a background in publications or projects related to computer vision or machine learning conferences such as CVPR, NeurIPS, and ICML.

Please note that this position is for a client of Hubnex Labs; selected candidates will work directly with the client's AI team while representing Hubnex.
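As a rough illustration of the audio feature-extraction step mentioned above: spectrograms are built by slicing the signal into overlapping windowed frames and taking a per-frame spectrum. The sketch below is stdlib-only with a naive DFT for clarity; a real pipeline would use FFTs via numpy or librosa, and all names here are illustrative:

```python
import cmath
import math

def frame_signal(signal, frame_len, hop):
    """Split a 1-D signal into overlapping frames (spectrogram step 1)."""
    return [signal[start:start + frame_len]
            for start in range(0, len(signal) - frame_len + 1, hop)]

def magnitude_spectrum(frame):
    """Hann-windowed naive DFT magnitude of one frame (spectrogram step 2)."""
    n = len(frame)
    windowed = [x * (0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)))
                for i, x in enumerate(frame)]
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(windowed)))
            for k in range(n // 2 + 1)]

# A 100 Hz tone sampled at 800 Hz; with 32-sample frames, the energy
# should peak at DFT bin k = 100 * 32 / 800 = 4.
sr, frame_len = 800, 32
tone = [math.sin(2 * math.pi * 100 * t / sr) for t in range(64)]
frames = frame_signal(tone, frame_len, hop=16)
spec = magnitude_spectrum(frames[0])
# spec.index(max(spec)) == 4
```

Stacking `magnitude_spectrum` over all frames gives the spectrogram; MFCCs then apply a mel filterbank, a log, and a DCT on top of it.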

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The role requires advanced-level skills and very good working knowledge of Python and Dataiku (an analytics tool), including developing Dataiku dashboards and reports. You will use your knowledge of application development procedures, concepts, and other technical aspects to identify and define requirements that enhance the system; identify and analyze issues, make recommendations, and implement solutions; and apply your knowledge of business processes, system processes, and industry standards to solve complex issues. You should have experience with the SDLC deployment cycle and Agile methodology, along with good conceptual and working knowledge of DevOps tools such as Jira, Bitbucket, and Jenkins. You will be expected to provide release support during weekends and should be willing to upskill in new tech stacks and skills as projects require.

As a Dataiku Developer, you should have at least 2 years of relevant experience with strong analytics query-writing skills. Strong knowledge of basic and advanced Python and experience with the Dataiku analytics tool are essential, as is experience in programming and debugging for business applications. Working knowledge of industry practices and standards is required, together with comprehensive knowledge of the specific business area for application development, working knowledge of programming languages, and analytical, logical, and problem-solving capabilities. Consistently demonstrating clear and concise written and verbal communication is crucial. Knowledge of Excel/SQL is good to have.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. Citi is an equal opportunity employer. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Financial Services Office (FSO) is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional capability and product knowledge. The FSO practice offers integrated advisory services to financial institutions and other capital markets participants, including commercial banks, investment banks, broker-dealers, asset managers, insurance and energy trading companies, and Corporate Treasury functions of leading Fortune 500 companies. The service offerings include market, credit, and operational risk management, regulatory advisory, quantitative advisory, technology enablement, and more.

Within EY's FSO Advisory Practice, the Financial Services Risk Management (FSRM) group provides solutions to help clients identify, measure, manage, and monitor market, credit, operational, and regulatory risks associated with trading, asset-liability management, and capital markets activities. The Credit Risk (CR) team within FSRM assists clients in designing and implementing strategic and functional changes across risk management within banking book portfolios of large domestic and global financial institutions.

Key Responsibilities:
- Demonstrate deep technical capabilities and industry knowledge of financial products, particularly lending products.
- Stay informed about market trends and demands in the financial services sector and issues faced by clients.
- Monitor progress, manage risk, and communicate effectively with key stakeholders.
- Mentor junior consultants and review tasks completed by them.
- Work on projects involving model audits, validation, and development activities.

Qualifications, Certifications, and Education:

Must-have:
- Postgraduate degree in accounting, finance, economics, statistics, or a related field, with at least 3 years of related work experience.
- Understanding of climate risk models, ECL, stress testing, and regulatory requirements related to credit risk.
- Knowledge of Credit Risk and Risk Analytics techniques.
- Hands-on experience in data preparation, manipulation, and consolidation.
- Strong documentation skills and the ability to summarize key details effectively.
- Proficiency in statistics and econometrics, and technical skills in Advanced Python, SAS, SQL, R, and Excel.

Good-to-have:
- Certifications such as FRM, CFA, PRM, or SCR.
- Experience in Data/Business Intelligence Reporting and knowledge of Machine Learning models.
- Willingness to travel and previous project management experience.

EY exists to build a better working world, creating long-term value for clients, people, and society while building trust in capital markets. EY teams across 150 countries provide trust through assurance and help clients grow, transform, and operate across assurance, consulting, law, strategy, tax, and transactions, addressing complex issues globally.

Posted 1 week ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

Mumbai, Maharashtra, India

On-site

About the job

Canonical is a leading provider of open source software and operating systems to the global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation, and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in 75+ countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution. The company is founder-led, profitable, and growing.

We are hiring a Junior Cloud Field Engineer to help global companies embrace the latest private cloud infrastructure, Linux and cloud native operations, and open source applications. Our team applies expert insights to real-world customer problems, enabling the enterprise adoption of Linux Ubuntu, OpenStack, Kubernetes, and a wide range of associated technology.

This role has very diverse responsibilities. The team members are Linux and cloud solutions architects for our customers, designing private and public cloud solutions fitting their workload needs. They are the cloud consultants who work hands-on with the technologies by deploying, testing, and handing over the solution to our support or managed services team at the end of a project. They are also software engineers who use Python to develop Kubernetes operators and Linux open source infrastructure-as-code. The people who love this role are developers who like to solve customer problems through architecture, presentations, and training.

Location: This role will be home based.

What your day will look like:
- Work across the entire Linux stack, from kernel, networking, and storage to applications
- Work in Python to design and deliver open source code
- Architect cloud infrastructure solutions like OpenStack, Kubernetes, Ceph, Hadoop, and Spark, either on-premises or in public cloud (AWS, Azure, Google Cloud)
- Coach and develop your colleagues where you have insights
- Grow a healthy, collaborative engineering culture in line with the company values
- Work from the comfort of your home
- Global travel up to 25% of the time for internal and external events

What we are looking for in you:
- University degree in Computer Science or related software engineering expertise
- Sound knowledge of cloud computing concepts and technologies, such as Kubernetes, OpenStack, AWS, GCP, Azure, and Ceph
- Practical knowledge of Linux and networking
- Intermediate to advanced Python programming skills
- A dynamic person who loves to jump into new projects and interact with people
- A demonstrated drive for continual learning
- Excellent communication and presentation skills (English)
- Great organisational skills and reliable follow-up on commitments
- (Optional) A second language

What you'll learn:
- OpenStack and Kubernetes infrastructure
- Linux Ubuntu and networking knowledge
- A wide range of open source applications and skills
- Working directly with customers in a range of different businesses
- Real-life, hands-on exposure to a wide range of emerging technologies and tools

What we offer colleagues:
We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget of USD 2,000 per year
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Team Member Assistance Program & Wellness Platform
- Opportunity to travel to new locations to meet colleagues
- Priority Pass and travel upgrades for long-haul company events

Posted 1 week ago

Apply

2.0 - 6.0 years

3 - 5 Lacs

Vijayawada, Hyderabad

Work from Office

Job Title: Advanced Python Programming Offline Trainer
Location: Andhra Pradesh & Telangana
Job Type: Full-Time | Offline
Experience Required: Minimum 2 Years (Training/Teaching Experience)

Job Summary:
We are seeking an experienced and passionate Python Trainer to deliver Advanced Python Programming sessions in an offline, classroom-based environment. The ideal candidate must have a strong command of Python, proven training/teaching experience, and a hands-on approach to real-time, project-based instruction.

Key Responsibilities:
- Deliver interactive and engaging classroom training on Advanced Python Programming to students and professionals.
- Design, structure, and update the course curriculum and learning resources in alignment with industry standards.
- Explain complex programming concepts in a simple, relatable, and practical manner.
- Guide students through hands-on projects, coding exercises, and real-time assignments.
- Assess student progress through quizzes, coding tests, and evaluations.
- Provide constructive feedback, mentorship, and technical guidance throughout the learning journey.
- Stay updated with the latest Python trends, libraries (e.g., NumPy, Pandas, Flask, Django), and industry use cases.
- Maintain high levels of energy, clarity, and professionalism in the classroom.

Required Skills & Qualifications:
- Minimum 2 years of experience teaching/training Python to students (academic or professional).
- Strong expertise in Core and Advanced Python, including OOP, file handling, error handling, decorators, generators, multithreading, DSA, libraries, etc.
- Practical experience with Python libraries and frameworks such as NumPy, Pandas, Matplotlib, and Flask/Django is a must.
- Excellent communication and presentation skills in English and/or local languages.
- Prior experience with curriculum development, LMS platforms, or conducting bootcamps is an added advantage.
- Bachelor's degree in Computer Science/Engineering or a relevant field.

Preferred Qualities:
- Passion for teaching and mentoring young minds.
- Ability to handle doubts, explain code line by line, and simplify real-world problems.
- Professional, punctual, and committed to delivering outcomes.
- Self-driven with a continuous learning mindset.

Why Join Us?
- Work with a purpose-driven EdTech brand impacting thousands of learners.
- Opportunity to shape and lead high-quality offline training programs.
- Be part of a growing and collaborative teaching community.
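For the decorators-and-generators part of a syllabus like the one above, a classroom-sized sketch (the names here are illustrative, not taken from any particular curriculum) might look like:

```python
import functools
import time

def timed(func):
    """Decorator: wrap a function and record how long each call takes."""
    @functools.wraps(func)          # preserve the wrapped function's name/docs
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.last_elapsed = time.perf_counter() - start
        return result
    return wrapper

@timed
def squares_upto(n):
    """Generator expression: sum squares lazily, no list built in memory."""
    return sum(x * x for x in range(n))

total = squares_upto(10)
# total == 285, and squares_upto.last_elapsed holds the call's duration
```

The same example lets a trainer demonstrate `functools.wraps`, closure state on the wrapper, and why a generator expression beats building an intermediate list.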

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

The Machine Learning Engineer (Azure Databricks) position is available in Navi Mumbai, Bengaluru, Pune, Gurugram, and Chandigarh. We are looking for a skilled individual with at least 3 years of relevant experience in Python, data science, ETL, advanced Python, machine learning, ML frameworks, Azure Cloud, and Databricks. Familiarity with deep learning, NLP, computer vision, and Python for image processing is also desired.

As a Machine Learning Engineer, you will lead machine learning projects, develop and optimize algorithms, prepare and transform datasets, evaluate model performance, and deploy models in production environments. Collaboration with cross-functional teams and a basic understanding of DevOps practices are essential for success in this role. The ideal candidate will have hands-on experience in Python and advanced Python coding, machine learning model development, ML frameworks such as scikit-learn or TensorFlow, Azure cloud services, Databricks, and basic DevOps knowledge. An understanding of deep learning principles, NLP, computer vision, and image processing libraries is beneficial.

If you join our team, you can expect opportunities for learning and certification, comprehensive medical coverage, a flexible work environment, and a fun, collaborative, and innovative workplace culture. We are committed to your professional growth and well-being, offering a supportive and dynamic environment to thrive in the field of AI and ML.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Software Developer at UST in Trivandrum, you will be responsible for developing applications, contributing to maintenance, and optimizing performance by employing design patterns and reusing proven solutions. You will also oversee the developmental activities of junior team members with some guidance from the Product Engineer I. Your key responsibilities will include understanding product requirements and user stories from the customer discovery process, ensuring requirements coverage of complex features with unit test cases, troubleshooting development and production problems across multiple environments and operating platforms, and providing technical guidance to the team to resolve challenging programming and design problems. You will be required to create effort estimations and ETAs for the deliverables assigned to ensure adherence to timelines/SLAs wherever applicable, participate actively in code reviews to ensure code coverage and code quality, and adhere to best practices and standards with provision of periodic status updates. Additionally, you will execute test cases, prepare release documents in line with the requirements, and influence and improve customer satisfaction. In this role, you will be expected to adhere to engineering processes and standards with minimal or no code review comments, adhere to project schedule timelines and effort estimation, uncover technical issues during project execution, address defects in the code, track defects post-delivery, and manage noncompliance issues. Furthermore, you will be required to complete mandatory domain/technical certifications on a quarterly/timely basis. 
Your outputs are expected to include understanding functional/non-functional requirements gathered from stakeholders for enhancement, contributing to product development, following SLA and delivery timelines, creating POCs to identify the feasibility of new technologies/products, providing technical inputs for product research, design, analysis, testing, process improvement, and complex troubleshooting for critical and large projects. You will also participate in code reviews, eliminate implementation problems early in the development cycle, support the client in user acceptance testing if required, ensure code quality and 100% code coverage, review test cases/test plans, conduct integration testing, resolve defects/bugs, provide inputs to technical publications, review documentation of key features, and resolve existing issues for product sign-offs. As a Software Developer at UST, you are expected to upskill regularly, complete mandatory domain/technical certifications on time, and possess skills such as using domain/industry knowledge to understand and capture business requirements, product design knowledge to design/implement business and non-functional requirements, knowledge of product features/functionality to understand technical dependencies and apply best practices, ability to design, install, configure, and troubleshoot CI/CD pipelines, software design & development knowledge to develop code per requirement specifications/user stories, and UX knowledge to enhance product/system usability. You should also have knowledge of domain/industry processes, product design, product features/functionality, configuration/build/deploy processes and tools, IAAS-cloud providers, application development lifecycle, quality assurance processes, quality automation processes and tools, and user experience knowledge. 
In summary, as a Software Developer at UST, you will design, develop, test, and deliver offerings using leading-edge and/or proven technologies in an Agile, collaborative environment. You will work closely with stakeholders to understand requirements, design and test innovative software solutions, and ensure the implemented solutions are unit tested and ready for integration into the product. Additionally, you will mentor and provide technical guidance to other team members, complete software development activities related to existing and new product development, and perform verification and validation activities according to the Watson Health Standard Operating Procedures.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 18 Lacs

Nashik, Pune, Mumbai (All Areas)

Work from Office

Job title: AI Expert
Total experience: 5+ years
Location: Pune and Nashik
Job description:
• In-depth experience with the Eliza framework and its agent coordination capabilities
• In-depth experience with agentic AI
• Practical implementation experience with vector databases (Pinecone, Weaviate, Milvus, or Chroma)
• Hands-on experience with embedding models (e.g., OpenAI, Cohere, or open-source alternatives)
• Deep knowledge of LangChain/LlamaIndex for agent memory and tool integration
• Experience designing and implementing knowledge graphs at scale
• Strong background in semantic search optimization and efficient RAG architectures
• Experience with Model Control Plane (MCP) for both LLM orchestration and enterprise system integration
• Advanced Python development with expertise in async patterns and API design
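The retrieval side of a RAG stack like the one described above can be sketched with plain cosine similarity. The vectors and document names below are made-up stand-ins for real embedding-model output (e.g. from OpenAI or Cohere); a vector database such as Pinecone or Weaviate would replace the brute-force scan with an approximate-nearest-neighbour index.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings": in a real pipeline these come from an embedding
# model and live in a vector DB rather than a Python dict.
docs = {
    "returns policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
    "warranty terms": [0.8, 0.2, 0.1],
}

def retrieve(query_vec, k=2):
    # Rank every document by similarity to the query vector, keep top-k.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([1.0, 0.0, 0.1]))
```

The top-k documents would then be stuffed into the LLM prompt as context, which is the "retrieval-augmented" part of RAG.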

Posted 3 weeks ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Pune

Work from Office

5-8 years of experience in automation testing with Python/Advanced Python. Proficiency in web application and REST API testing. ISTQB Foundation Level certification is a plus.
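A minimal illustration of the API test automation this role calls for, using a stubbed client in place of a live HTTP call; in a real suite, pytest with requests/httpx would hit the actual endpoint, and the function and data here are hypothetical:

```python
# Stubbed API client: returns the same {status, body} shape a real
# REST client would, so the test structure carries over unchanged.
def get_user(user_id):
    fake_db = {1: {"id": 1, "name": "asha", "active": True}}
    if user_id not in fake_db:
        return {"status": 404, "body": None}
    return {"status": 200, "body": fake_db[user_id]}

def test_existing_user_returns_200():
    resp = get_user(1)
    assert resp["status"] == 200
    assert resp["body"]["name"] == "asha"

def test_missing_user_returns_404():
    assert get_user(99)["status"] == 404

# pytest would discover these automatically; called directly here.
test_existing_user_returns_200()
test_missing_user_returns_404()
print("all checks passed")
```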

Posted 1 month ago

Apply

4.0 - 7.0 years

5 - 7 Lacs

Kolkata, Bhiwani, Raipur

Work from Office

Joining Location: Raipur, Chhattisgarh (Relocation Required)
Experience: Minimum 4+ Years
Job Type: Full-Time
Accommodation: Provided by the Company
Job Description: We are looking for an experienced and passionate Python trainer to join our training division. The selected candidate will be responsible for delivering Python training sessions to university/college students or corporate employees at assigned locations. This is an exciting opportunity for individuals who are enthusiastic about teaching and have a strong command of Python programming.
Key Responsibilities: Deliver structured and engaging training sessions on Python programming to university students. Develop, update, and maintain training content, assignments, and assessments. Evaluate students' performance through assessments, quizzes, and practical projects. Ensure training objectives are met within the given timelines. Assist in resolving students' doubts and provide additional mentoring when needed. Relocate to various training locations as per project requirements (initial joining at Raipur). Provide feedback to the internal team on course content and student engagement.
Required Skills & Qualifications: Minimum 4+ years of experience in Python development and/or training. Strong understanding of core Python concepts, libraries, and frameworks. Good communication and classroom management skills. Prior experience training college/university students or corporate employees is a plus. Willingness to relocate and stay at different locations during training assignments lasting a minimum of one semester (6 months). Flexibility to adapt to dynamic project needs and schedules.
Location & Travel: The initial joining location is Raipur, Chhattisgarh. Trainers can be based anywhere in India but must be willing to relocate as per the university location. Accommodation will be provided by the company at the training site or university.

Posted 1 month ago

Apply

1.0 - 3.0 years

8 - 10 Lacs

Mysore, Karnataka, India

On-site

Technical Skills Required:
- ETL Concepts: Strong understanding of Extract, Transform, Load (ETL) processes, with the ability to design, develop, and maintain robust ETL pipelines.
- Database Fundamentals: Proficiency with relational databases (e.g., MySQL, PostgreSQL, Oracle, or MS SQL Server); knowledge of database design and optimization techniques.
- Basic Data Visualization: Ability to create simple dashboards or reports using visualization tools (e.g., Tableau, Power BI, or similar).
- Query Optimization: Expertise in writing efficient, optimized queries to handle large datasets.
- Testing and Documentation: Experience validating data accuracy and integrity through rigorous testing; ability to document data workflows, processes, and technical specifications clearly.
Key Responsibilities:
- Data Engineering: Design, develop, and implement scalable data pipelines to support business needs. Ensure data quality and integrity through testing and monitoring. Optimize ETL processes for performance and reliability.
- Database Management: Manage and maintain databases, ensuring high availability and security. Troubleshoot database-related issues and optimize performance.
- Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand and deliver on data requirements. Provide support for data-related technical issues and propose solutions.
- Documentation and Reporting: Create and maintain comprehensive documentation for data workflows and technical processes. Develop simple reports or dashboards to visualize key metrics and trends.
- Learning and Adapting: Stay updated on new tools, technologies, and methodologies in data engineering. Adapt quickly to new challenges and project requirements.
Additional Requirements: Strong communication skills, both written and verbal. An analytical mindset with the ability to solve complex data problems. A quick learner, willing to adopt new tools and technologies as needed. Flexibility to work in shifts, if required.
Preferred Skills (Not Mandatory): Experience with cloud platforms (e.g., AWS, Azure, or GCP). Familiarity with big data technologies such as Hadoop or Spark. Basic understanding of machine learning concepts and data science workflows.
Mandatory Key Skills: Python Programming, ETL Concepts, Database Management, Query Optimization, Data Visualization, Cloud Platforms (AWS, Azure, GCP), Advanced Python, Tableau, Power BI, SQL
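As a rough sketch of the ETL concepts listed above, the snippet below extracts hypothetical raw rows, transforms them (trim, cast, deduplicate), and loads them into an in-memory SQLite table. Production pipelines would target a real warehouse, but the extract-transform-load shape is the same:

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system.
raw = [
    {"name": " Alice ", "amount": "120.50"},
    {"name": "bob", "amount": "80"},
    {"name": " Alice ", "amount": "120.50"},  # duplicate to be dropped
]

# Transform: normalise names, cast amounts to floats, deduplicate.
seen, clean = set(), []
for row in raw:
    rec = (row["name"].strip().title(), float(row["amount"]))
    if rec not in seen:
        seen.add(rec)
        clean.append(rec)

# Load: write the cleaned rows into a relational table and verify.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (name TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 200.5
```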

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Navi Mumbai, Pune, Mumbai (All Areas)

Hybrid

Job Description:
Job Overview: We are seeking a highly skilled Data Engineer with expertise in SQL, Python, data warehousing, AWS, Airflow, ETL, and data modeling. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines, ensuring efficient data processing and integration across various platforms. This role requires strong problem-solving skills, an analytical mindset, and a deep understanding of modern data engineering frameworks.
Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and ETL processes to support business intelligence, analytics, and operational data needs.
- Build and maintain data models (conceptual, logical, and physical) to improve the efficiency of data storage, retrieval, and transformation.
- Develop, test, and optimize complex SQL queries for efficient data extraction, transformation, and loading (ETL).
- Implement and manage data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) for structured and unstructured data storage.
- Work with AWS, Azure, and cloud-based data solutions to build high-performance data ecosystems.
- Use Apache Airflow to orchestrate workflows and automate data pipeline execution.
- Collaborate with cross-functional teams to understand business data requirements and ensure alignment with data strategies.
- Ensure data integrity, security, and compliance with governance policies and best practices.
- Monitor, troubleshoot, and improve the performance of existing data systems for scalability and reliability.
- Stay updated on emerging data engineering technologies, frameworks, and best practices to drive continuous improvement.
Required Skills & Qualifications:
- Proficiency in SQL for query development, performance tuning, and optimization.
- Strong Python programming skills for data processing, automation, and scripting.
- Hands-on experience with ETL development, data integration, and transformation workflows.
- Expertise in data modeling for efficient database and data warehouse design.
- Experience with cloud platforms such as AWS (S3, Redshift, Lambda), Azure, or GCP.
- Working knowledge of Airflow or similar workflow orchestration tools.
- Familiarity with big data frameworks like Hadoop or Spark (preferred but not mandatory).
- Strong problem-solving skills and the ability to work in a fast-paced, dynamic environment.
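Airflow's core idea, running tasks in dependency order, can be illustrated with the standard library alone. The task names below are hypothetical; in a real deployment each would be an Airflow operator, and the scheduler (not a hand-written loop) would resolve the ordering:

```python
from graphlib import TopologicalSorter

# Each key depends on the tasks in its value set, mirroring how an
# Airflow DAG declares upstream dependencies between operators.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks only after all their dependencies,
# which is exactly the guarantee a workflow scheduler provides.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)  # ['extract', 'transform', 'load', 'report']
```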

Posted 1 month ago

Apply

5.0 - 7.0 years

25 - 35 Lacs

Bengaluru

Work from Office

Senior Software Engineer - Backend (Python)
Experience: 5 - 7 Years
Salary: INR 25-35 Lacs per annum
Preferred Notice Period: Within 30 Days
Shift: 09:00 AM to 06:00 PM IST
Opportunity Type: Onsite (Bengaluru)
Placement Type: Contractual
Contract Duration: Full-Time, Indefinite Period
(Note: This is a requirement for one of Uplers' clients.)
Must-have skills: Advanced Python, FastAPI, API & microservices architecture, cloud infrastructure (AWS), Docker/Kubernetes, database management (PostgreSQL/MySQL/MongoDB/Redis), integration with ML/video systems, Flask/Django
Good-to-have skills: Asynchronous programming, security best practices, stream processing & messaging, domain knowledge in AI/computer vision
Role Overview: RadiusAI (one of Uplers' clients) is looking for a Senior Software Engineer (Python) who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. You will build and optimize the backend infrastructure that drives RadiusAI's real-time AI products. This is a hands-on role ideal for an engineer who has a deep understanding of backend architecture, API design, and distributed systems and can scale systems to support intensive machine learning and video processing workloads. You will be a key part of a cross-functional team building robust, scalable, and secure platforms for AI deployment.
Key Responsibilities:
- Design and implement backend services, APIs, and data pipelines to support AI and CV platforms.
- Build scalable microservices and RESTful APIs using Python (FastAPI, Flask, or Django).
- Integrate with computer vision systems and ML inference engines via APIs or streaming data interfaces.
- Optimize system performance for real-time or near-real-time processing, especially in video-based environments.
- Work with cloud services (AWS, GCP, or Azure) for deployment, scaling, and observability.
- Implement robust logging, monitoring, and alerting across backend services.
- Collaborate closely with ML engineers, DevOps, and frontend teams to deliver full-stack features.
- Own the entire software development lifecycle: architecture, development, testing, deployment, and maintenance.
- Write clean, testable, scalable, and maintainable code.
- Participate in code reviews, mentoring, and setting engineering best practices.
Required Qualifications:
- 5+ years of experience in backend development, with Python as the primary language.
- Strong experience with Python web frameworks such as FastAPI, Django, or Flask.
- Expertise in designing and building RESTful APIs and microservices architectures.
- Solid understanding of software architecture, design patterns, and scalability principles.
- Experience working with databases (PostgreSQL, MySQL, MongoDB, Redis, etc.).
- Proficiency with Docker and Kubernetes, and experience containerizing applications for local and cloud deployment.
- Hands-on experience working with cloud platforms.
- Experience integrating with machine learning models and handling high-throughput data (image/video or time-series is a plus).
- Familiarity with CI/CD practices, Git, unit testing, and agile methodologies.
- Strong problem-solving skills and a collaborative mindset.
Preferred Qualifications:
- Experience with asynchronous programming (e.g., asyncio, aiohttp, FastAPI with async).
- Familiarity with message queues and stream processing (Kafka, RabbitMQ, Redis Streams, etc.).
- Exposure to real-time data processing systems, especially involving video or IoT sensor data.
- Knowledge of security best practices in backend systems (authentication, authorization, rate limiting).
- Prior experience in computer vision or AI-focused products is a strong plus.
- Contributions to open-source Python projects or backend infrastructure tooling.
Interview rounds: 1st - technical screening; 2nd - live coding; 3rd - technical & cultural discussion.
How to apply for this opportunity (easy 3-step process): 1. Click on Apply and register or log in on our portal. 2. Upload an updated resume and complete the screening form. 3. Increase your chances of being shortlisted and meet the client for the interview!
About Our Client: RadiusAI is a pioneering computer vision analytics company revolutionizing retail operations with advanced, human-centric AI solutions. We offer the world's most advanced VisionAI checkout, and we provide real-time data to improve operational efficiency across the entire retail industry, focusing on enterprise-level customers and secure edge integration.
About Uplers: Uplers is the #1 hiring platform for SaaS companies, designed to help you hire top product and engineering talent quickly and efficiently. Our end-to-end AI-powered platform combines artificial intelligence with human expertise to connect you with the best engineering talent from India. With over 1M deeply vetted professionals, Uplers streamlines the hiring process, reducing lengthy screening times and ensuring you find the perfect fit. Companies like GitLab, Twilio, TripAdvisor, and Airbnb trust Uplers to scale their tech and digital teams effectively and cost-efficiently. Experience a simpler, faster, and more reliable hiring process with Uplers today.
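The asynchronous-programming qualification above is essentially about overlapping I/O waits. A minimal asyncio sketch, with made-up call names standing in for DB queries or downstream API requests in a FastAPI-style service:

```python
import asyncio

# Simulated I/O-bound call: the sleep stands in for network latency.
async def fetch(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    # gather runs the awaitables concurrently, so total wall time is
    # roughly the slowest call, not the sum of all three. Results come
    # back in argument order regardless of completion order.
    return await asyncio.gather(
        fetch("users", 0.03),
        fetch("orders", 0.01),
        fetch("stock", 0.02),
    )

print(asyncio.run(main()))  # ['users', 'orders', 'stock']
```

FastAPI endpoints declared `async def` run on this same event loop, which is why non-blocking calls matter for throughput.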

Posted 1 month ago

Apply

4.0 - 7.0 years

4 - 9 Lacs

Pune

Hybrid

Role Overview: This hybrid role sits within the Distribution Data Stewardship Team and combines operational and technical responsibilities to ensure data accuracy, integrity, and process optimization across sales reporting functions.
Key Responsibilities:
- Support sales reporting inquiries from sales staff at all levels.
- Reconcile omnibus activity with sales reporting systems.
- Analyze data flows to assess their impact on commissions and reporting.
- Perform data audits and updates to ensure integrity.
- Lead process optimization and automation initiatives.
- Manage wholesaler commission processes, including adjustments and manual submissions.
- Oversee manual data integration from intermediaries.
- Execute territory alignment changes to meet business objectives.
- Contribute to team initiatives and other responsibilities as assigned.
Growth Opportunities: Exposure to all facets of sales reporting and commission processes. Opportunities to develop project and relationship management skills. Potential to explore leadership or technical specialist roles within the firm.
Qualifications: Bachelor's degree in Computer Engineering or a related field. 4-7 years of experience with Python programming and automation. A strong background in SQL and data analysis. Experience in relationship/customer management and leading teams. Experience working with Salesforce is a plus.
Required Skills: Technical proficiency in Python and SQL. Strong communication skills and stakeholder engagement. High attention to data integrity and detail. Self-directed with excellent time management. Project coordination and documentation skills. Proficiency in MS Office, especially Excel.
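Reconciling omnibus activity against a sales reporting system boils down to diffing two keyed datasets. A toy sketch with invented trade IDs and amounts (real feeds would come from SQL extracts or files):

```python
# Two hypothetical feeds keyed by trade id: omnibus activity vs. what
# the sales reporting system recorded.
omnibus = {"T1": 1000.0, "T2": 250.0, "T3": 75.0}
reporting = {"T1": 1000.0, "T2": 200.0, "T4": 40.0}

# A "break" is any trade where the two sides disagree; None means the
# trade is missing from that side entirely.
breaks = {}
for trade_id in omnibus.keys() | reporting.keys():
    a, b = omnibus.get(trade_id), reporting.get(trade_id)
    if a != b:
        breaks[trade_id] = (a, b)

for tid in sorted(breaks):
    print(tid, breaks[tid])
```

Matched trades (T1 here) drop out automatically, leaving only the exceptions an analyst needs to chase.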

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Bengaluru

Work from Office

Johnson & Johnson MedTech is seeking a Senior Engineer - Data Engineering for its Digital Surgery Platform (DSP) in Bangalore, India. Johnson & Johnson (J&J) stands as the world's leading manufacturer of healthcare products and a service provider in the pharmaceutical and medical device sectors. At Johnson & Johnson MedTech's Digital Surgery Platform, we are shaping the future of healthcare by harnessing the power of people and technology, transitioning to a digital-first MedTech enterprise. With a focus on innovation and an ambitious strategic vision, we are integrating robotic-assisted surgery platforms, connected medical devices, surgical instruments, medical imaging, surgical efficiency solutions, and OR workflow into the next-generation MedTech platform. This initiative will also foster new surgical insights, improve supply chain innovation, use cloud infrastructure, incorporate cybersecurity, collaborate with hospital EMRs, and elevate our digital solutions. We are a diverse and growing team that nurtures creativity, a deep understanding of data processing techniques, and the use of sophisticated analytics technologies to deliver results.
Overview: As a Senior Engineer - Data Engineering for the J&J MedTech Digital Surgery Platform (DSP), you will play a pivotal role in building the modern cloud data platform by demonstrating in-depth technical expertise and interpersonal skills. In this role, you will focus on accelerating digital product development as part of the multifunctional and fast-paced DSP data platform team, and you will contribute to the digital transformation through innovative data solutions. A key success criterion for this role is ensuring the quality of DSP software solutions while collaborating effectively with the core infrastructure and other engineering teams and working closely with the DSP security and technical quality partners.
Responsibilities:
- Work with platform data engineering, core platform, security, and technical quality teams to design, implement, and deploy data engineering solutions.
- Develop pipelines for ingestion, transformation, orchestration, and consumption of various types of data.
- Design and deploy data layering pipelines that use modern Spark-based data processing technologies such as Databricks and Delta Live Tables (DLT).
- Integrate data engineering solutions with Azure data governance components, including Purview and Databricks Unity Catalog.
- Implement and support security monitoring solutions within the Azure Databricks ecosystem.
- Design, implement, and support data monitoring solutions in data analytical workspaces.
- Configure and deploy Databricks analytical workspaces in Azure with IaC (Terraform, Databricks API) using J&J DevOps automation tools within the JPM/Xena framework.
- Implement automated CI/CD processes for data processing pipelines.
- Support DataOps for the distributed DSP data architecture.
- Function as a data engineering SME within the data platform.
- Manage the authoring and execution of automated test scripts.
- Build effective partnerships with DSP architecture, core infrastructure, and other domains to design and deploy data engineering solutions.
- Work closely with DSP Product Managers to understand business needs, translate them into system requirements, and demonstrate an in-depth understanding of use cases for building prototypes and solutions for data processing pipelines.
- Operate by SAFe Agile DevOps principles and methodology in building quality DSP technical solutions.
- Author and implement automated test scripts as mandated by DSP quality requirements.
Qualifications - Required:
- Bachelor's degree or equivalent experience in software, computer science, or data engineering.
- 8+ years of overall IT experience, including 5-7 years of experience in cloud computing and data systems.
- Advanced Python programming skills.
- Expert-level knowledge of Azure Databricks Spark technology and data engineering (Python), including Delta Live Tables (DLT).
- Experience in the design and implementation of secure Azure data solutions.
- In-depth knowledge of data architecture, infrastructure, network components, and data processing.
- Proficiency in building data pipelines in Azure Databricks, and in the configuration and administration of Azure Databricks workspaces and Databricks Unity Catalog.
- Deep understanding of the principles of the modern data lakehouse.
- Deep understanding of Azure system capabilities and data services, and the ability to implement security controls.
- Proficiency with enterprise DevOps tools, including Bitbucket, Jenkins, and Artifactory.
- Experience with DataOps and with quality software systems.
- Deep understanding of and experience in SAFe Agile; understanding of the SDLC.
Qualifications - Preferred:
- Master's degree or equivalent. Proven healthcare experience. Azure Databricks certification.
- Ability to analyze use cases, translate them into system requirements, and make data-driven decisions.
- Experience with DevOps automation tools within the JPM/Xena framework.
- Expertise in automated testing. Experience in AI and ML.
- Excellent verbal and written communication skills.
- Ability to travel domestically up to 10% required.
Johnson & Johnson is an Affirmative Action and Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, or protected veteran status, and will not be discriminated against on the basis of disability.

Posted 1 month ago

Apply

1.0 - 3.0 years

7 - 10 Lacs

Mysuru

Work from Office

Technical Skills Required:
- ETL Concepts: Strong understanding of Extract, Transform, Load (ETL) processes, with the ability to design, develop, and maintain robust ETL pipelines.
- Database Fundamentals: Proficiency with relational databases (e.g., MySQL, PostgreSQL, Oracle, or MS SQL Server); knowledge of database design and optimization techniques.
- Basic Data Visualization: Ability to create simple dashboards or reports using visualization tools (e.g., Tableau, Power BI, or similar).
- Query Optimization: Expertise in writing efficient, optimized queries to handle large datasets.
- Testing and Documentation: Experience validating data accuracy and integrity through rigorous testing; ability to document data workflows, processes, and technical specifications clearly.
Key Responsibilities:
- Data Engineering: Design, develop, and implement scalable data pipelines to support business needs. Ensure data quality and integrity through testing and monitoring. Optimize ETL processes for performance and reliability.
- Database Management: Manage and maintain databases, ensuring high availability and security. Troubleshoot database-related issues and optimize performance.
- Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand and deliver on data requirements. Provide support for data-related technical issues and propose solutions.
- Documentation and Reporting: Create and maintain comprehensive documentation for data workflows and technical processes. Develop simple reports or dashboards to visualize key metrics and trends.
- Learning and Adapting: Stay updated on new tools, technologies, and methodologies in data engineering. Adapt quickly to new challenges and project requirements.
Additional Requirements: Strong communication skills, both written and verbal. An analytical mindset with the ability to solve complex data problems. A quick learner, willing to adopt new tools and technologies as needed. Flexibility to work in shifts, if required.
Preferred Skills (Not Mandatory): Experience with cloud platforms (e.g., AWS, Azure, or GCP). Familiarity with big data technologies such as Hadoop or Spark. Basic understanding of machine learning concepts and data science workflows.
Mandatory Key Skills: Python Programming, ETL Concepts, Database Management, Query Optimization, Data Visualization, Cloud Platforms (AWS, Azure, GCP), Advanced Python, Tableau, Power BI, SQL

Posted 2 months ago

Apply

- 1 years

4 - 6 Lacs

Pune

Work from Office

Role & Responsibilities: Python Developers are software engineers who specialize in using the Python programming language to develop and maintain software applications. As a Python Developer, you'll be responsible for writing, testing, and debugging Python code to create applications that run on various platforms, such as web browsers or mobile devices. The role includes working closely with other developers, designers, and project managers to deliver software projects on time and within budget constraints. You'll need a strong understanding of the Python programming language, as well as knowledge of software development methodologies such as Agile or Waterfall. To excel in this role, you must also have excellent problem-solving skills and strong attention to detail, as even the smallest error could cause problems in the software. Furthermore, Python Developers must have strong communication skills to work in teams or with clients to discuss project requirements.
Your Tasks: Create large-scale data processing pipelines to help developers build and train novel machine learning algorithms. Participate in code reviews, ensure code quality, and identify areas for improvement to implement practical solutions. Debug code when required and troubleshoot any Python-related queries. Keep up to date with emerging trends and technologies in Python development.
Required Skills and Qualifications: 3+ years of experience as a Python Developer with a strong portfolio of projects. Bachelor's degree in Computer Science, Software Engineering, or a related field. In-depth understanding of the Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, pandas, Dask, spaCy, NLTK, scikit-learn, and PyTorch. Experience with front-end development using HTML, CSS, and JavaScript. Familiarity with database technologies such as SQL and NoSQL. Excellent problem-solving ability with solid communication and collaboration skills.
Preferred Skills and Qualifications: Experience with popular Python frameworks such as Django, Flask, or Pyramid. Knowledge of data science and machine learning concepts and tools. A working understanding of cloud platforms such as AWS, Google Cloud, or Azure. Contributions to open-source Python projects or active involvement in the Python community.
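The large-scale data processing pipelines mentioned above are often built from lazy generators in plain Python, since each stage pulls from the previous one and arbitrarily large inputs stream through in constant memory. A toy sketch with made-up stages:

```python
# Stage 1: normalise raw input lines.
def read_records(lines):
    for line in lines:
        yield line.strip()

# Stage 2: drop blanks and parse to integers.
def parse(records):
    for rec in records:
        if rec:
            yield int(rec)

# Stage 3: keep only even values.
def only_even(numbers):
    for n in numbers:
        if n % 2 == 0:
            yield n

raw = ["10", " 7 ", "", "4", "3"]
result = list(only_even(parse(read_records(raw))))
print(result)  # [10, 4]
```

The same composition pattern scales from an in-memory list to a file handle or network stream without changing the stages.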

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies