
919 Pandas Jobs - Page 29

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2 - 4 years

4 - 6 Lacs

Mumbai, Goregaon

Work from Office

Source: Naukri

We are seeking a talented and experienced Python Developer + Data Scientist with a strong background in Flask to join our dynamic team. The ideal candidate will have a passion for leveraging data to drive insights and create impactful solutions, along with proficiency in Python development, particularly with Flask.

Responsibilities: Develop and maintain Python-based applications, with a focus on Flask for web development. Collaborate with cross-functional teams to understand project requirements and translate them into technical solutions. Design, implement, and maintain data pipelines for collecting, processing, and analysing large datasets. Perform exploratory data analysis to identify trends, patterns, and insights. Build machine learning models and algorithms to solve business problems and optimize processes. Deploy and monitor data science solutions in production environments. Conduct code reviews, testing, and debugging to ensure the quality and reliability of software applications. Stay updated with the latest trends and advancements in Python development, data science, and machine learning.

Requirements: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 2+ years of professional experience in Python development and data science. Strong proficiency in the Python programming language with the Flask framework and familiarity with relational databases (e.g., MySQL). Proficiency in handling and manipulating various types of data, including structured and unstructured data, using Python libraries such as Pandas, NumPy, and Beautiful Soup. Ability to apply machine-learning techniques to analyse and extract insights from large text datasets, including social media data, customer feedback, and user interactions, to inform business decisions and strategy. Knowledge of machine learning techniques and libraries (e.g., scikit-learn, TensorFlow). Familiarity with creating and managing projects involving language models such as OpenAI's GPT (Generative Pre-trained Transformer) series, including ChatGPT and other prompt engineering tasks. Experience using LLMs and related models to enhance chatbots, virtual assistants, and other conversational AI applications, improving natural language understanding, conversation flow, and response generation. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform. Experience with version control systems (e.g., Git). Excellent problem-solving skills and attention to detail. Strong communication and collaboration abilities.

Posted 2 months ago

Apply

3 - 4 years

7 - 10 Lacs

Mumbai

Work from Office

Source: Naukri

Responsibilities: Collaborate with cross-functional teams to understand project requirements and objectives. Assist in data collection, preprocessing, and cleaning for training ML models. Develop, train, and evaluate machine learning models in NLP, speech-to-text, and text-to-speech. Implement and deploy ML models into production systems. Automate processes and integrate with third-party platforms, with a focus on financial services platforms. Monitor and optimize the performance of deployed models. Conduct experiments to fine-tune algorithms and improve model accuracy. Stay updated with the latest trends, tools, and technologies in the ML space.

Minimum Qualifications and Experience: Bachelor's degree in Computer Science, Data Science, Engineering, Mathematics, or a related field with a minimum of 3-4 years of experience.

Required Expertise: Strong understanding of fundamental ML concepts, algorithms, and techniques (e.g., supervised and unsupervised learning). Proficiency in programming languages such as Python or R. Familiarity with ML libraries/frameworks like TensorFlow, PyTorch, or scikit-learn. Knowledge of data manipulation and analysis using tools like Pandas and NumPy. Familiarity with cloud platforms (e.g., AWS, Google Cloud, or Azure). Understanding of database systems (SQL or NoSQL). Experience with version control systems like Git. Basic knowledge of software engineering principles. Problem-solving and critical thinking. Strong communication and collaboration skills. Eagerness to learn and adapt in a fast-paced environment.

Other terms: The position is contractual and full time in nature, subject to periodic performance reviews.

Posted 2 months ago

Apply

6 - 10 years

8 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

About The Role / Roles & Responsibilities: This is a hands-on role. It will involve design, coding, testing, working with product owners/scrum masters for scrum planning, estimation, and demos, and leading/guiding junior developers as needed. Years of experience: 6 to 10 years. Mandatory tech skills: Python, NumPy, Pandas, strong data engineering and analysis, SQL Server. Good at writing Python code, with hands-on experience with the pandas and NumPy stack; able to perform data cleanup and summarization using NumPy/pandas. SQL knowledge is essential. Must have expertise in REST API development. Cloud experience is preferred (Azure). Uses pertinent data and facts to identify and solve a range of problems within area of expertise. Investigates non-standard requests and problems, with some assistance from others. Experience level: 5 to 9 years of relevant experience.

Posted 2 months ago

Apply

6 - 11 years

10 - 20 Lacs

Pune

Work from Office

Source: Naukri

Position: Python Developer. Exp: 7+ years. Shift timings: 1.30pm to 10.30pm. Work from the office is a must.

Primary Responsibilities: We are looking for a skilled Python Developer who has experience in building custom processes and components in Python. The ideal candidate will have a strong understanding of the ETL lifecycle, including changelog creation, event processing, and data streaming. The candidate should be proficient in working with pandas and NumPy data frames, CSV, and Parquet files. Experience in linear optimization and familiarity with Gurobi (optional) or Google OR-Tools is a plus. Additional experience in parallel processing, MVC architecture, and advanced Python packages for data engineering (such as NumPy and graph databases) will be highly valued. Knowledge of trucking and supply chain management is a bonus.

Key Responsibilities: Design, develop, and maintain scalable web applications and services using FastAPI and Flask. Should have good knowledge of Jupyter notebooks and IntelliJ IDE/VS Code. Architect RESTful APIs and integrate them with front-end applications and third-party services. Write clean, efficient, and maintainable code that adheres to best practices and coding standards. Optimize application performance and scalability to handle high volumes of traffic. Implement security best practices including authentication, authorization, and data protection. Collaborate with DevOps to ensure seamless deployment and integration with CI/CD pipelines. Participate in code reviews, debugging, and unit testing (Pytest) to ensure code quality. Work with the team to design database schemas, ensure proper indexing, and optimize queries for performance. Contribute to the continuous improvement of development processes and methodologies. Design, develop, and maintain custom ETL processes and components in Python. Manage the entire ETL lifecycle, including changelog creation, event processing, and data streaming. Work extensively with pandas and DataFrames for data manipulation and transformation. Handle various data formats including CSV and Parquet files. Good to have: experience implementing linear optimization solutions using tools like Gurobi or Google OR-Tools. Perform parallel processing to optimize data handling and transformation tasks. Apply MVC architecture principles in the development of ETL components. Utilize advanced Python packages for data engineering, including NumPy. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optional: apply knowledge of trucking and supply chain management to enhance data processes. Documentation knowledge is required.

Skills and Qualifications: Problem-solving and analytical skills. Proven experience in building custom ETL processes and components using Python. Strong expertise in pandas and DataFrames. Proficiency in handling CSV and Parquet file formats. Experience with linear optimization and familiarity with tools like Gurobi or Google OR-Tools. Solid understanding of parallel processing techniques. Knowledge of MVC architecture and its application in ETL processes. Experience with advanced Python packages for data engineering, such as NumPy and TigerGraph. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Bonus: experience in trucking and supply chain management.

Posted 2 months ago

Apply

4 - 9 years

6 - 11 Lacs

Pune

Work from Office

Source: Naukri

About The Role: Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses. As a Python Developer with Databricks, you will be responsible for developing and maintaining scalable data pipelines, managing cloud environments on Azure, and ensuring smooth integration with APIs. The ideal candidate will be proficient in Python, Databricks (PySpark), and Azure DevOps, with a strong understanding of cloud services, DevOps practices, and API testing. Notice period: 30 to 90 days.

Key Responsibilities: Develop and maintain data pipelines: design, develop, and maintain scalable data pipelines using Python and Databricks (PySpark). Data processing: apply strong proficiency in Python and advanced Python concepts to process and manipulate large datasets effectively. API ingestion: work with data in JSON format to integrate and automate data workflows. Cloud management: use the Azure Portal for managing cloud environments and services. Databricks PySpark: work with Databricks and PySpark to build distributed data processing applications. DevOps & Agile methodology: implement DevOps best practices and work within a Scrum framework to ensure continuous integration and continuous delivery (CI/CD) pipelines. API testing & automation: use Postman to test and automate APIs for robust integration and data workflows. Collaboration: work closely with cross-functional teams to implement solutions aligned with business objectives and technical requirements.

Required Qualifications: Programming skills: strong proficiency in Python with experience in data processing libraries (e.g., Pandas, NumPy). Databricks experience: hands-on experience with Databricks (PySpark) for data processing and analysis. Cloud platform: experience using the Azure Portal to manage cloud environments and services. API handling: expertise in working with APIs, specifically data ingestion and integration in JSON format. DevOps methodology: familiarity with DevOps practices and experience working in Agile/Scrum environments. API testing tools: proficiency with Postman for API testing and automation. Version control: experience using Visual Studio Code and version control systems like Git.

Preferred Qualifications: Familiarity with Azure DevOps for building and deploying CI/CD pipelines. Experience working with large-scale data processing frameworks such as Apache Spark or Hadoop. Azure certifications (e.g., Azure Data Engineer, Azure Developer) are a plus.

Skills & Attributes: Excellent problem-solving and troubleshooting skills. Strong communication and collaboration abilities. Ability to manage multiple priorities and meet deadlines in a fast-paced environment. A proactive mindset focused on continuous improvement and automation.

Posted 2 months ago

Apply

3 - 6 years

0 Lacs

Pune, Bengaluru, Hyderabad

Work from Office

Source: Naukri

Mandatory skills: Python/Pandas, SQL, Azure. Spark SQL is acceptable, with basic knowledge of Spark core concepts.

Posted 2 months ago

Apply

4 - 7 years

20 - 30 Lacs

Gurgaon

Work from Office

Source: Naukri

Role: Senior Data Scientist - Fintech / Management Consulting Firm.

Role & Responsibilities: Support business decision-making by deriving actionable insights from structured and unstructured data. Conduct end-to-end machine learning and data analytics activities, including data wrangling, feature engineering, and building and evaluating ML models. Research and implement novel machine learning approaches to enable informed business decisions. Collaborate with business and engineering teams to analyze, extract, normalize, and label relevant data. Perform data engineering, feature engineering, and data preprocessing to train and tune ML models. Utilize BI tools to prepare reports and to analyze and visualize data. Implement MLOps practices such as CI/CD and continuous training (CT) for deploying and maintaining ML models in production environments.

Educational Qualification / Work Experience & Skills: PG/Graduate. 4 to 7 years of experience in data analytics and machine learning. Hands-on experience with ML frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, Pandas and NumPy. Proficiency in writing production-quality code in Python, with a strong understanding of object-oriented programming principles. Proven experience in analyzing and working with large datasets. Hands-on experience with various BI tools and SQL. Expertise in applying both supervised and unsupervised learning techniques.

Posted 2 months ago

Apply

3 - 7 years

7 - 11 Lacs

Gurgaon

Work from Office

Source: Naukri

About the Role: We are based in Gurgaon and looking for a Senior Computer Vision Engineer to join our team and help improve and create new technologies. You'll work on projects that make online assessments more secure and cheat-proof. If you're a seasoned computer vision expert with a passion for innovation and a track record of delivering impactful solutions, we would be happy to meet you. Role: Senior Computer Vision Engineer. Functional Area: AI. Educational Qualification: BTech/MS/MTech/PhD in Computer Science/Computer Vision/Signal Processing/Deep Learning or equivalent. Should have worked in an academic or professional setting in the field of computer vision/signal processing. Experience: 2-5 years. Location: Gurgaon.

Key Responsibilities: Develop and optimize advanced computer vision algorithms for image and video analysis tasks. Design, implement and train deep learning models for object detection, face processing, activity recognition and other related tasks. Test and refine models and systems based on real-world data and feedback. Evaluate project requirements, plan and manage the roadmap of a project. Present findings and insights in a clear and concise manner to stakeholders. Collaborate and help integrate and deploy computer vision systems into the broader product architecture. Conduct research to stay updated on emerging computer vision technologies and trends. Automate data preprocessing and annotation processes to streamline workflow efficiency. Maintain comprehensive documentation for algorithms, implementations, and evaluations. Mentor junior engineers and provide strategic guidance on project development.

Requirements and skills: Proficiency in Python; knowledge of C++, Java and JS is a plus. Solid understanding of neural networks, especially convolutional neural networks (CNNs). Knowledge of R-CNNs and vision transformers. Proficient in understanding, designing and implementing deep learning models using frameworks such as TensorFlow, PyTorch and Keras. Understanding of fundamental image processing techniques like image filtering, edge detection, image segmentation and image augmentation. Experience in evaluating computer vision models using relevant metrics and performance indicators. Familiarity with GPUs and related technologies used for improved computational efficiency, such as CUDA, cuDNN, TensorRT, etc. Familiarity with Python libraries such as OpenCV, NumPy, Pandas and scikit-learn. Basic knowledge of linear algebra, calculus, and statistics. Strong critical thinking, analytical, and problem-solving skills. Self-motivated, quick learner and strong team player with the ability to work with minimal supervision.

Posted 2 months ago

Apply

2 - 5 years

14 - 17 Lacs

Hyderabad

Work from Office

Source: Naukri

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products, platform, and customer-facing systems.

Posted 2 months ago

Apply

4 - 7 years

5 - 13 Lacs

Chennai, Hyderabad

Hybrid

Source: Naukri

Python Developer. Skills: Python development, Django, Pandas, SQL, REST APIs. Interview process: virtual assessment, then a face-to-face interview (mandatory), Technical Round 1. Apply only if interested. Looking for candidates who are immediately available or on up to a 30-day notice period.

Posted 2 months ago

Apply

5 - 8 years

12 - 20 Lacs

Pune, Hyderabad, Gurgaon

Work from Office

Source: Naukri

About Client: Hiring for one of our multinational corporations! Job Title: Data Analyst. Qualification: Any Graduate or above. Relevant Experience: 5 to 8 years. Must-Have Skills: Python, SQL, AWS.

Roles and Responsibilities: Collect, clean, and process large datasets from various data sources. Analyze complex datasets using SQL queries to extract meaningful insights and trends. Develop and maintain dashboards and reports to monitor business KPIs. Work with Python to automate data processing workflows and develop custom analytics tools. Use AWS services (like Redshift, S3, Lambda, etc.) to handle cloud-based data storage and processing tasks. Collaborate with cross-functional teams to understand business needs and translate them into data requirements. Provide actionable insights and recommendations to stakeholders through detailed reports and presentations. Ensure data quality and integrity throughout the data lifecycle. Continuously improve data analysis processes, tools, and techniques.

Location: Chennai, Bangalore, Hyderabad, Pune, Noida & Gurgaon. CTC Range: up to 24 LPA (lakhs per annum). Notice period: 90 days. Mode of Interview: Virtual. Mode of Work: Work from Office.

Thanks & Regards, Shrividya, Black and White Business Solutions Pvt. Ltd., Bangalore, Karnataka, India. Direct number: 08067432486. shrividya@blackwhite.in | www.blackwhite.in

Posted 2 months ago

Apply

8 - 12 years

9 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Be part of transparent and responsible AI innovations. Join the Software Labs to build GenAI solutions that solve complex business problems and get an opportunity to influence the future of AI. As Lead Full Stack Developer, you will: Design, develop, and support scalable, secure, and high-performance systems for cloud and on-premises environments that meet world-class standards. Collaborate with architects and stakeholders to design and build systems that solve complex business problems and deliver exceptional user experiences. Use modern architectural principles and best-in-class tools and technologies. Lead technical initiatives that align with business goals and objectives. Mentor, coach and provide technical leadership. Work in an agile, collaborative, distributed, fast-paced and exciting environment. Stay current with industry trends, best practices, and emerging technologies to drive business value, solve complex problems, and innovate solutions that meet evolving needs. Champion an entrepreneurial mindset and deliver unique value by leveraging industry and technology expertise to drive innovative strategies to accelerate clients' success.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience in designing and developing large-scale enterprise systems. Twelve years of overall experience. At least eight years of experience as a Full Stack Software Developer. At least five years of experience in Python and other programming languages. Deep understanding of Large Language Models (LLMs), including fine-tuning, prompt engineering, and proficiency in frameworks like LangChain, Hugging Face, and related tools. Working knowledge of data science libraries: Pandas, NumPy, scikit-learn and NetworkX. Experience building microservices & REST architectures. Experience in NoSQL databases such as MongoDB and CouchDB. Knowledgeable in front-end and back-end development programming languages and design frameworks. Knowledgeable in Docker containerization. A self-starter with the ability to work effectively in a team environment; a quick learner. Experience in Agile SDLC (design, development, test and deploy). Exceptional knowledge of data structures, algorithms, enterprise systems, asynchronous architectures, and object-oriented programming.

Preferred technical and professional experience: Working knowledge of DevOps practices. Working knowledge of test automation tools. Design thinking experience. Working experience with knowledge graphs and associated libraries such as neo4j. Experience with Carbon, React, and Angular design systems. Experience with VS Code extension development.

Posted 3 months ago

Apply

2 - 5 years

14 - 17 Lacs

Pune

Work from Office

Source: Naukri

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products, platform, and customer-facing systems.

Posted 3 months ago

Apply

0 - 2 years

2 - 4 Lacs

Mumbai

Work from Office

Source: Naukri

The Basics. Team: Founders Office.

Introduction: CashFlo is a one-of-a-kind AP automation and supply chain financing platform. Made in India. Made for India. Our mission? To unleash the untapped financial potential of millions of Indian businesses. Our AP and payments automation suite is transforming the way large businesses across the country handle their payments. Founded by illustrious alumni from BCG, ISB, and IIM, CashFlo is fortified by the trust and backing of prominent investors such as Elevation Capital and General Catalyst. Recognized by Nasscom as the Best Supply Chain Finance solution from 2019 to 2021, CashFlo's platform creates a harmonious ecosystem for buyers, suppliers, and financiers. Our integrated AP & financing platform has become a key driver of growth for over 100 large enterprises and 200,000 mid-market and SME companies across more than ten sectors. We proudly serve a clientele that includes renowned brands like Mosaic Wellness, The Souled Store, Durian, Cona Electricals, as well as industry giants like ITC, IFB, Crompton, Zydus Healthcare, Lupin, Murugappa Group, among others. We combine in-depth financial and technological insights to deliver exceptional results. We take immense pride in our work and firmly believe in the exponential effect of a job well done, every day. We are on the lookout for individuals who share this ethos and are prepared to help us take CashFlo to unparalleled heights.

You'll Excel If You Possess: Data analysis tools: proficiency in data analysis tools such as Python (with libraries like Pandas, NumPy, Matplotlib, and Seaborn) and R. Data visualization: experience with data visualization tools such as Tableau, Power BI, or Matplotlib for creating meaningful visualizations. Statistical analysis: a good understanding of statistical concepts and techniques for hypothesis testing, regression analysis, and other statistical modelling. SQL database management: strong SQL skills for data extraction, manipulation, and querying from relational databases like MySQL, PostgreSQL, or Microsoft SQL Server. Problem-solving skills: strong problem-solving skills to identify business problems, formulate hypotheses, and use data to derive solutions. Communication skills: effective communication skills to convey complex data-driven insights to non-technical stakeholders.

Joining CashFlo: Why It's a Great Choice. Uniquely positioned for success: CashFlo sits at the unique intersection of Payments, Lending, and SaaS, three of the fastest-growing and most lucrative spaces globally and in India. As a part of our team, you will be a key player in an industry-defining company. An opportunity to create wealth: at CashFlo, we understand that our success is deeply linked with the success of our employees. That's why we offer the potential to create exponential wealth through equity in our rapidly growing early-stage company. You will not only contribute to our growth story, but also share in the rewards. A collaborative and driven team: we pride ourselves on fostering a culture that encourages kindness, collaboration, and a shared commitment to quality. Our team members are always there to help each other, and we believe in lifting each other up. Your growth is our growth, and we succeed as a team. Direct impact on company success: at CashFlo, every role is crucial. Your work will have a real, tangible impact on our success. You'll see the results of your hard work in real time. Fast-track your career: we invest in our employees' professional growth through comprehensive training programs, mentoring opportunities, and clear growth paths. Whether you aspire to grow as an individual contributor or on a management track, we provide the resources and support you need to accelerate your career. Competitive compensation and benefits: we offer competitive salaries, comprehensive benefits, and recognition programs. We value the work you do, and our compensation package reflects our commitment to attracting and retaining the best talent. Unwavering commitment to excellence: we are seeking individuals ready to dive into challenging work, individuals who are excited about going above and beyond to drive their own growth and the company's. If you are motivated by ambitious goals and are ready to make a significant impact, CashFlo is the place for you.

Posted 3 months ago

Apply

4 - 7 years

6 - 9 Lacs

Ahmedabad

Work from Office

Source: Naukri

Job Title: Data Science Engineer. Location: Remote (India; Ahmedabad preferred). Shift: UK shift. We are looking for a Data Science Engineer with 6+ years of experience in developing, deploying, and scaling machine learning models in production environments. This hybrid role combines expertise in data science and machine learning engineering to deliver impactful data-driven solutions at scale.

Key Responsibilities: Design and develop machine learning models for various business problems. Implement data preprocessing, feature engineering, and model training in Python and Databricks. Work on model deployment, monitoring, and scaling in cloud or on-premise environments. Develop and optimize data pipelines to support model building and execution. Collaborate with cross-functional teams to ensure model alignment with business needs. Ensure the scalability, performance, and reliability of deployed models. Implement MLOps practices for automated testing, continuous integration, and model versioning. Monitor model performance in production and refine models as needed.

Requirements: 6+ years of experience in Data Science, Machine Learning, and Model Deployment. Strong expertise in Python, Databricks, and machine learning frameworks (e.g., scikit-learn, TensorFlow, PyTorch). Experience with data pipeline development and cloud platforms. Knowledge of MLOps practices for scalable and automated model deployment. Strong SQL skills and experience working with large datasets. Familiarity with version control systems (e.g., Git) and containerization (e.g., Docker). Ability to work in a UK shift and collaborate with global teams. Experience: Python, 4 years (required); Databricks, 3 years (required). Work location: Remote. (ref:hirist.tech)

Posted 3 months ago

Apply

4 - 7 years

6 - 9 Lacs

Kolkata

Work from Office

Source: Naukri

Role: Gen AI Engineer. Location: Remote. Experience: 7 years. Note: immediate joiner.

Technical Expertise (Must Have): Proficient in Python programming, with experience in agentic platforms from pro-code (e.g., AutoGen, Semantic Kernel, LangGraph) to low-code (e.g., Crew.ai, EMA.ai). Hands-on experience with GCP AI services such as Google Vertex AI, AutoML, and AI Platform. Fluent in GenAI packages like LlamaIndex and LangChain.

Soft Skills: Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across business and technical teams. Strong problem-solving and analytical skills. Attention to detail. Ability to work with teams in a dynamic, fast-paced environment.

Experience: 7 to 10 years of experience in software development, with 3+ years in AI/ML or Generative AI projects. Demonstrated experience in deploying and managing AI applications in production environments.

Key Responsibilities: Write efficient, clean, and maintainable Python code for AI applications. Design, develop, and implement complex Generative AI solutions that are highly accurate and address challenging, intricate applications. Utilize agentic platforms from pro-code (e.g., AutoGen, Semantic Kernel, LangGraph) to low-code (e.g., Crew.ai, EMA.ai). Leverage the Google Vertex AI and Azure OpenAI ecosystems and tooling, including training models, advanced prompting, the Assistant API, and agent curation. Develop and deploy RESTful APIs using frameworks like Flask or Django for model integration and consumption. Fine-tune and optimize AI models for business use cases. (ref:hirist.tech)

Posted 3 months ago

Apply

0 - 2 years

2 - 4 Lacs

Itanagar

Work from Office

Source: Naukri

Job Title: Data Science Engineer. Location: Remote (India; Ahmedabad preferred). Shift: UK shift. We are looking for a Data Science Engineer with 6+ years of experience in developing, deploying, and scaling machine learning models in production environments. This hybrid role combines expertise in data science and machine learning engineering to deliver impactful data-driven solutions at scale.

Key Responsibilities: Design and develop machine learning models for various business problems. Implement data preprocessing, feature engineering, and model training in Python and Databricks. Work on model deployment, monitoring, and scaling in cloud or on-premise environments. Develop and optimize data pipelines to support model building and execution. Collaborate with cross-functional teams to ensure model alignment with business needs. Ensure the scalability, performance, and reliability of deployed models. Implement MLOps practices for automated testing, continuous integration, and model versioning. Monitor model performance in production and refine models as needed.

Requirements: 6+ years of experience in Data Science, Machine Learning, and Model Deployment. Strong expertise in Python, Databricks, and machine learning frameworks (e.g., scikit-learn, TensorFlow, PyTorch). Experience with data pipeline development and cloud platforms. Knowledge of MLOps practices for scalable and automated model deployment. Strong SQL skills and experience working with large datasets. Familiarity with version control systems (e.g., Git) and containerization (e.g., Docker). Ability to work in a UK shift and collaborate with global teams.

Job Type: Full-time. Schedule: UK shift. Application question(s): Please provide your current CTC, expected CTC and notice period along with the application; applications without this information may not be considered. Education: Bachelor's (required). Experience: Python, 4 years (required); Databricks, 3 years (required). Work location: Remote. (ref:hirist.tech)

Posted 3 months ago

Apply

2 - 5 years

35 - 100 Lacs

Bengaluru

Work from Office

Source: Naukri

AI/ML Developer. Req number: R4983. Employment type: Full time. Worksite flexibility: Onsite.

Who we are: CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right, whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary: We are looking for a motivated AI/ML Developer ready to take us to the next level! If you have strong expertise in back-end development with Python/Django and negotiation skills and are looking for your next career move, apply now.

Job Description: We are looking for an AI/ML Developer to develop and maintain back-end systems using Python/Django. This position will be a 6-month contract, hybrid in Pune.

What You'll Do: Develop and maintain back-end systems using Python/Django. Work extensively with LLMs and transformer models, including BERT. Design and implement search or recommendation systems, data classification models, or Learning to Rank (LTR) models. Build cutting-edge AI/ML applications from conception to deployment. Maintain exceptional attention to detail and demonstrate a relentless willingness to learn and adapt to new technologies.

What You'll Need (Required): 2+ years of experience in AI/ML engineering or a related field. Proven expertise in back-end development with Python/Django. Extensive experience with LLMs and transformer models. Strong understanding of search or recommendation systems, data classification, or LTR models. A commitment to continuous learning and a proactive approach to problem-solving.

Physical Demands: Ability to safely and successfully perform the essential job functions. Sedentary work that involves sitting or remaining stationary most of the time, with an occasional need to move around the office to attend meetings, etc. Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard, and monitor.

Reasonable accommodation statement: If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824-8111.

Posted 3 months ago

Apply

8 - 13 years

20 - 35 Lacs

Bengaluru

Hybrid

Source: Naukri

About G2 Risk Solutions (G2RS) G2 Risk Solutions is the definitive expert in risk and compliance business intelligence for financial institutions and online platforms. We are industry pioneers providing market-leading solutions for merchant risk, digital commerce risk, bankruptcy risk, and credit risk and regulatory reporting. We are driving innovation and shaping the future of risk management through unprecedented data, technology, and global compliance and risk expertise, providing the financial services and digital commerce ecosystems with the tools needed to navigate complex and ever-changing regulatory requirements and mitigate risk. To learn more, visit g2risksolutions.com. Job Overview: As a Sr. Data Analyst, you will leverage data to uncover trends, generate insights, and support strategic decision-making across the organization. You will work closely with cross-functional teams to analyze datasets, optimize processes, and drive data-driven business solutions. Key responsibilities: Analyze datasets to uncover trends, patterns, and actionable insights, utilizing Python libraries like Pandas, NumPy, and Scikit-learn. Develop predictive models and statistical analyses to solve business problems and improve processes. Translate business objectives into data-driven strategies, identifying key questions, designing experiments, and providing actionable recommendations. Partner with cross-functional teams (Product, Marketing, Sales, Engineering) to define objectives, design experiments, and guide innovation through data. Conduct A/B testing to evaluate product features, marketing campaigns, or business initiatives, and provide recommendations for optimization. Collaborate with stakeholders to understand data needs, define project goals, and deliver scalable, impactful solutions. Build automated systems for data collection, analysis, and reporting to improve efficiency and decision-making. Ensure data quality, consistency, and integrity across multiple sources and support data governance efforts. Identify opportunities, forecast challenges, and influence product development with data-backed insights. Provide guidance through comprehensive analysis, helping teams address key questions and make informed decisions. Use analytical frameworks and methodologies to test hypotheses and deliver evidence-based evaluations. Qualifications: Bachelor's or master's degree in computer science, Data Science, Statistics, Engineering, or a related field. 8+ years of experience in data analysis, data science, or data analytics roles, leveraging data-driven insights to solve complex business challenges. Proficient in Python and SQL for modeling projects. Proficient in data visualization tools (e.g., Tableau, Power BI, Looker) and Python libraries (e.g., Matplotlib, Seaborn, Plotly) to develop dashboards and visual reports. Experienced analyzing datasets to uncover trends, patterns, and actionable insights. Experienced developing statistical analyses and predictive models to solve business challenges and enhance processes. Experienced building trust and credibility through clear communication, transparent methodologies, and consistent processes. Soft Skills: Strong analytical thinking and problem-solving skills. Demonstrated ability to independently manage technical tasks and effectively define project scope. Exceptional communication and collaboration skills, with a proven ability to engage and influence both technical and non-technical stakeholders effectively. 
A passion for using data to drive innovation and business success.

Posted 3 months ago

Apply

3 - 8 years

3 - 8 Lacs

Bengaluru, Hyderabad, Kolkata

Work from Office

Source: Naukri

Work Mode: Hybrid. Location: Kolkata, Bangalore, Hyderabad, Chennai, Kerala, Pune, Noida. Experience Required: 3 to 9 years.

To qualify for the role you must have: 3+ years of working experience with large-scale AI/ML models and data science. Deep understanding of statistics, AI/ML algorithms, and predictive modeling. Proficiency in AI/ML programming languages like Python, R, SQL. Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras. Expertise in machine learning algorithms and data mining techniques (like SVM, decision trees, and deep learning algorithms). Implement monitoring and logging tools to ensure AI model performance and reliability. Strong programming skills in Python, including libraries for machine learning such as Scikit-learn, Pandas, NumPy, etc. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse, ML, AI, ADF, etc. Should have a good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps & SharePoint. Strong understanding of large enterprise applications like SAP, Oracle ERP & Microsoft Dynamics.

Posted 3 months ago

Apply

7 - 12 years

15 - 30 Lacs

Pune, Bengaluru

Work from Office

Source: Naukri

Sr Solutions Engineer, Aligned Automation. Based in the Pune, Maharashtra, India office.

About the job: A "Better Together" philosophy towards building a better world. Aligned Automation is a strategic service provider that partners with Fortune 500 leaders to digitize enterprise operations and enable business strategies. We believe we can create positive, lasting change in the way our clients work while advancing the global impact of their business solutions for a more optimistic and better world. We are passionate about building and sustaining an inclusive and equitable workplace where all people can develop and thrive. Enriched by our 4Cs (Care, Courage, Curiosity, and Collaboration), our culture supports solutions that empower the possible. Mid-level position based out of Pune (6-12 years).

Senior Solutions Engineer Job Description: We are looking for a developer strong in Python and Spark, along with ETL, complex SQL, and cloud. Primary skills: Python, databases, Spark. Secondary: Azure/AWS, APIs.

In this role, you will: Develop and maintain scalable and efficient backend systems, ensuring high performance and responsiveness to requests from the front end. Design and implement cloud-based solutions, primarily on Microsoft Azure. Manage and optimize CI/CD pipelines for rapid and reliable deployment of software updates. Collaborate with frontend developers and other team members to establish objectives and design more functional, cohesive code to enhance the user experience. Develop and maintain databases and server-side applications. Ensure the security of the backend infrastructure.

Preferred Qualifications: 5-7 years' experience developing with Python. Experience in automation using Python. Experience building REST APIs. Experience with mobile application development is advantageous. Experience working within a cloud environment. Bachelor's degree in computer science or a related field, or related experience. Experience with CI/CD tools like Jenkins, GitLab CI, or Azure DevOps. In-depth understanding of database technologies (SQL and NoSQL) and web server technologies. Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

Posted 3 months ago

Apply

3 - 5 years

3 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Primary/must-have skills: AWS Lambda, S3, CloudWatch and the Python CDK for AWS. Good programming skills in Python using the Django framework. Python Pandas and asyncio. PostgreSQL database knowledge is a must. Good to have: experience with version control systems like Git; experience in bug tracking and issue tracking using Jira/VersionOne. Total experience expected: 4-6 years.

Posted 3 months ago

Apply

6 - 11 years

0 - 2 Lacs

Chennai, Pune, Bengaluru

Hybrid

Source: Naukri

5-8 years' experience. Ensure the reliability and effectiveness of AI/ML software applications or systems. Develop and implement testing strategies for AI/ML models, including functional, performance, and scalability testing. Identify and document defects and verify fixes in AI/ML systems. Work closely with data scientists, data engineers and product leads to understand the application's expected behavior and performance. Use metrics and statistical tools to assess model performance and validate that the AI/ML models meet the requirements and adhere to established standards and regulations. Contribute to improving QA methodologies related to AI/ML testing and validation. Good hands-on experience as an SDET in Python with relevant frameworks: Python SDET with Pytest, AI/ML, SQL.

Posted 3 months ago

Apply

6 - 10 years

15 - 30 Lacs

Pune, Delhi NCR, Hyderabad

Work from Office

Source: Naukri

Company: Cigniti Technologies Limited. Experience: 6-10 years. Location: Hyderabad/Noida/Pune/Mumbai/Gurugram [WFO - Hybrid]. Interview Mode: Virtual.

Required Skills and Experience: 6-8 years of extensive experience with Selenium WebDriver using Python and the Pandas framework. Proficiency in Python programming; must be able to write code from scratch. Strong experience with the Pandas framework, with the ability to build a framework from scratch. Strong collaboration and communication skills. Exceptional attention to detail and client-facing skills.

Posted 3 months ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Role & responsibilities: We are seeking a highly skilled and motivated candidate with expertise in programming, problem-solving, and Machine Learning (ML) and Artificial Intelligence (AI). The ideal candidate will have strong programming skills, with a particular focus on Python, and experience using key data manipulation libraries such as Pandas and NumPy. The candidate must be experienced in fine-tuning/optimizing open-source models like Whisper to run on mobile platforms (which have limited memory and CPU/GPU) without much guidance or supervision. We are looking for people who have already tuned Gen AI models for a specific use case. The candidate should have a solid background in Machine Learning, particularly in classification and retrieval tasks, as well as hands-on experience with Large Language Models (LLMs), including fine-tuning for specific use cases, Retrieval-Augmented Generation (RAG), foundation model APIs, and prompt engineering. Familiarity with Faiss (Facebook AI Similarity Search) for efficient nearest-neighbor search in retrieval tasks is highly desirable. A strong understanding of Natural Language Processing (NLP) is preferred, with experience in audio/voice manipulation tasks, specifically text-to-speech (TTS) and speech-to-text (STT). Familiarity with the Hugging Face Transformers and Datasets libraries is highly desirable. Proficiency in using tools such as Hydra for configuration management and Weights & Biases for tracking experiments is essential. In addition to technical expertise, the ideal candidate will have experience with Git and GitHub for version control and a proven ability to collaborate effectively in a team environment, particularly when working on shared codebases and remote projects. Strong data management and manipulation skills are crucial, as is experience working on remote servers to develop and deploy machine learning models.

Posted 3 months ago

Apply

Exploring Pandas Jobs in India

The job market for pandas professionals in India is on the rise as more companies recognize the importance of data analysis and manipulation in making informed business decisions. Proficiency in pandas, a popular Python library for data manipulation and analysis, is sought after by organizations across many industries in India.
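To make concrete what day-to-day pandas work looks like, here is a minimal, self-contained sketch; the column names and values are invented purely for illustration.

```python
import pandas as pd

# Illustrative job-postings data (made-up values)
jobs = pd.DataFrame({
    "city": ["Bengaluru", "Mumbai", "Pune", "Bengaluru"],
    "min_lpa": [8, 4, 10, 15],   # lower end of the advertised salary, in lakhs per annum
    "max_lpa": [14, 6, 20, 30],  # upper end of the advertised salary
})

# Inspect the first rows and compute a simple per-city aggregate
print(jobs.head())
print(jobs.groupby("city")["max_lpa"].mean())
```

This load-inspect-aggregate pattern is the core workflow behind most of the listings above.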

Top Hiring Locations in India

Here are 5 major cities in India actively hiring for pandas roles:

1. Bangalore
2. Mumbai
3. Delhi
4. Hyderabad
5. Pune

Average Salary Range

The average salary range for pandas professionals in India varies based on experience levels. Entry-level positions can expect a salary ranging from ₹4-6 lakhs per annum, while experienced professionals can earn upwards of ₹12-18 lakhs per annum.

Career Path

Career progression in the pandas domain typically involves moving from roles such as Junior Data Analyst or Junior Data Scientist to Senior Data Analyst or Senior Data Scientist, and eventually to roles like Tech Lead or Data Science Manager.

Related Skills

In addition to pandas, professionals in this field are often expected to have knowledge or experience in the following areas:

- Python programming
- Data visualization tools like Matplotlib or Seaborn
- Statistical analysis
- Machine learning algorithms
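As a rough illustration of how these skills combine, the short sketch below plots a pandas DataFrame with Matplotlib; it assumes pandas and matplotlib are installed, and the salary figures are invented for the example.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical salary-by-experience data (illustrative only)
salaries = pd.DataFrame({
    "experience_years": [1, 2, 3, 5, 8, 10],
    "ctc_lpa": [4, 6, 8, 12, 18, 25],
})

# A DataFrame plots directly through Matplotlib and returns the Axes it drew on
ax = salaries.plot(x="experience_years", y="ctc_lpa", marker="o",
                   title="Illustrative CTC vs. experience")
ax.set_xlabel("Years of experience")
ax.set_ylabel("CTC (lakhs per annum)")
plt.tight_layout()
plt.show()
```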

Interview Questions

Here are 25 interview questions for pandas roles:

- What is pandas in Python? (basic)
- Explain the difference between Series and DataFrame in pandas. (basic)
- How do you handle missing data in pandas? (basic)
- What are the different ways to create a DataFrame in pandas? (medium)
- Explain groupby() in pandas with an example. (medium)
- What is the purpose of pivot_table() in pandas? (medium)
- How do you merge two DataFrames in pandas? (medium)
- What is the significance of the inplace parameter in pandas functions? (medium)
- What are the advantages of using pandas over Excel for data analysis? (advanced)
- Explain the apply() function in pandas with an example. (advanced)
- How do you optimize performance in pandas operations for large datasets? (advanced)
- What is method chaining in pandas? (advanced)
- Explain the working of the cut() function in pandas. (medium)
- How do you handle duplicate values in a DataFrame using pandas? (medium)
- What is the purpose of the nunique() function in pandas? (medium)
- How can you handle time series data in pandas? (advanced)
- Explain the concept of multi-indexing in pandas. (advanced)
- How do you filter rows in a DataFrame based on a condition in pandas? (medium)
- What is the role of the read_csv() function in pandas? (basic)
- How can you export a DataFrame to a CSV file using pandas? (basic)
- What is the purpose of the describe() function in pandas? (basic)
- How do you handle categorical data in pandas? (medium)
- Explain the role of the loc and iloc functions in pandas. (medium)
- How do you perform text data analysis using pandas? (advanced)
- What is the significance of the to_datetime() function in pandas? (medium)
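To ground a few of the basic and medium questions above, here is a small, self-contained sketch covering missing data, groupby(), merging, loc/iloc selection, and a CSV round trip; every name and value is invented for illustration.

```python
import pandas as pd
import numpy as np

# Two small, made-up DataFrames
employees = pd.DataFrame({
    "emp_id": [1, 2, 3, 4],
    "team": ["data", "data", "web", "web"],
    "salary_lpa": [12.0, np.nan, 9.5, 15.0],
})
cities = pd.DataFrame({
    "emp_id": [1, 2, 3],
    "city": ["Pune", "Mumbai", "Bengaluru"],
})

# Handling missing data: fill the missing salary with the column median
employees["salary_lpa"] = employees["salary_lpa"].fillna(employees["salary_lpa"].median())

# groupby(): average salary per team
print(employees.groupby("team")["salary_lpa"].mean())

# Merging two DataFrames: a left join keeps every employee, even without a matching city
merged = employees.merge(cities, on="emp_id", how="left")

# loc is label/boolean based, iloc is purely positional
print(merged.loc[merged["team"] == "data", ["emp_id", "city"]])
print(merged.iloc[0, :2])

# Exporting to and reading back from CSV
merged.to_csv("merged_example.csv", index=False)
print(pd.read_csv("merged_example.csv").head())
```

Practicing variations of this snippet (different join types, groupby aggregations, or fill strategies) covers a large share of the entry-level questions listed above.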

Prepare and Apply Confidently

As you explore pandas jobs in India, remember to enhance your skills, stay updated with industry trends, and practice answering interview questions to increase your chances of securing a rewarding career in data analysis. Best of luck on your job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
