
758 SciPy Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 years

1 - 3 Lacs

Pitampura

On-site

Immediate Hiring: Data Science / Data Analyst Trainer (Onsite – Delhi)

About the Role: Veridical Technologies is looking for passionate and skilled professionals to join as Trainers in Data Science and Data Analytics at our Delhi center. This is a hardcore training profile – only those serious about building careers in teaching and mentoring should apply.

Who Can Apply:
- Experienced trainers with 2–3 years of proven training experience
- Freshers/trainees with strong technical knowledge and a passion for teaching
- Only Delhi-based candidates (or those ready to relocate immediately)

Key Training Areas:
- Machine Learning & Deep Learning
- Mathematics: Statistics, Probability, Calculus, Linear Algebra
- Programming: Python, R, SAS, SQL, SciPy Stack (NumPy, Pandas, Matplotlib)
- Data Visualization: Tableau, Power BI, MS Excel
- Databases: MySQL, MongoDB, Oracle

Requirements:
- Qualification: M.Tech (IT/CS) / MCA / B.Tech (IT/CS)
- Excellent communication & presentation skills
- Passion for mentoring students in IT & Data Science fields

Salary: ₹14,000 – ₹30,000 (based on experience)
Location: Veridical Technologies, Aggarwal Prestige Mall, 512, 5th Floor, Rani Bagh, Pitampura, Delhi – 110034 (Landmark: M2K Pitampura | Nearest Metro: Kohat Enclave / Shakurpur)
Contact: +91 93195 93915 | www.veridicaltechnologies.com
Job Type: Full-time
Pay: ₹14,000.00 - ₹30,000.00 per month
Work Location: In person

Posted 18 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
Are you a highly skilled Technical Manager with a passion for innovation and a proven track record in designing and developing cutting-edge products and services? If you excel in Java 17, Spring/Hibernate, Kafka/ZK, SQL, and advanced Java troubleshooting, and possess experience with Kubernetes and cloud technologies, we invite you to apply. Familiarity with Linux is essential, and knowledge of the Optics domain or NMS is a significant plus. This role demands a proactive individual who thrives in a collaborative environment, is eager to learn and explore new technologies, and can mentor team members while designing innovative features and applications. Apply now and contribute your expertise to our dynamic team!

How You Will Contribute And What You Will Learn
As part of the team, you will:
- Perform high-level design using Java 17, Spring/Hibernate, Kafka/ZK, SQL, and advanced Java troubleshooting.
- Utilize Kubernetes and cloud technologies.
- Have familiarity with Linux.
- Possess knowledge of the Optics domain, which is a value add.
- Have knowledge of NMS, which is also a value add.
- Be ready to learn and explore new technologies.
- Demonstrate the ability to design new features and applications.
- Mentor team members technically.

When you come onboard with Nokia's Network Automation organization as a Principal Technical Specialist - Java Architect, you are guaranteed to have an enriching experience both in terms of technologies and applications. Moreover, you will have the opportunity to work with people who are always ready to share and co-learn from each other.

Key Skills And Experience
You have:
- 8-12 years as a programmer and 5-8 years as an architect
- Architectural patterns
- Languages: Java, Python/Shell (hands-on experience); Golang is a plus
- Frameworks: Spring, Guice, or Micronaut; Pandas/Keras/PyTorch/SciPy/NumPy; Kafka/Spark
- Databases: SQL (Postgres/Oracle), NoSQL (Elastic/Prometheus/Mongo/Cassandra/Redis/Pinot)
- Domain: NMS/EMS for the telecom domain; if not, OLTP/real-time system management in enterprise software; large-scale and small-scale development experience
- Modeling: Ability to model the domain
- Data Engineering: Understanding of statistical analytics and of ML big-data and small-data models
- AI experience: Understanding of AI/ML techniques (supervised/unsupervised learning, linear regression, neural networks)

About Us
Come create the technology that helps the world act together. Nokia is committed to innovation and technology leadership across mobile, fixed and cloud networks. Your career here will have a positive impact on people's lives and will help us build the capabilities needed for a more productive, sustainable, and inclusive world. We challenge ourselves to create an inclusive way of working where we are open to new ideas, empowered to take risks and fearless to bring our authentic selves to work.

What we offer
Nokia offers continuous learning opportunities, well-being programs to support you mentally and physically, opportunities to join and get supported by employee resource groups, mentoring programs and highly diverse teams with an inclusive culture where people thrive and are empowered. Nokia is committed to inclusion and is an equal opportunity employer. Nokia has received the following recognitions for its commitment to inclusion & equality:
- One of the World's Most Ethical Companies by Ethisphere
- Gender-Equality Index by Bloomberg
- Workplace Pride Global Benchmark

At Nokia, we act inclusively and respect the uniqueness of people. Nokia's employment decisions are made regardless of race, color, national or ethnic origin, religion, gender, sexual orientation, gender identity or expression, age, marital status, disability, protected veteran status or other characteristics protected by law. We are committed to a culture of inclusion built upon our core value of respect. Join us and be part of a company where you will feel included and empowered to succeed.

About The Team
The pandemic has highlighted how important telecoms networks are to society. Nokia's Network Infrastructure group is at the heart of a revolution to bring more and faster network capacity to people worldwide through our ambition, innovation, and technical expertise.

Posted 19 hours ago

Apply

7.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
As a Senior Technical Specialist, you will play a key role in developing and maintaining network management applications for optical transport technologies, ensuring efficient configuration, fault supervision, and performance monitoring. You will leverage your expertise in data science and machine learning to enhance network management capabilities, applying advanced models and feature engineering techniques. With strong proficiency in Python, Java, GoLang, and cloud-based technologies, you will drive innovation, optimize performance, and enable seamless deployments through CI/CD pipelines. Your adaptability and leadership will help mentor junior developers, foster collaboration, and navigate the fast-paced, evolving landscape of optical network management. This role offers the opportunity to make a significant impact on global deployments while continuously learning and growing in a cutting-edge technological environment.

How You Will Contribute And What You Will Learn
Network management of Optics Division products, including Photonic/WDM, Optical Transport, and SDH/SONET. Optics Network Management applications provide users with control over the network, including configuration (infrastructure, end-to-end services), fault supervision, and performance monitoring. These applications interface with various Network Elements, provide a user-friendly graphical interface, and implement algorithms and functions to facilitate network management and reduce OPEX. Optics Network Management applications are deployed worldwide in hundreds of installations, serving both large enterprises and small customers. The Software Development Engineer for Optics Network Management will be part of the development team, contributing to new developments and maintaining applications to enhance functionality and customer satisfaction.

Key Skills And Experience
You have:
- Bachelor's degree or equivalent, with 7 to 12 years of experience as a Data Scientist.
- Ability to analyze data and apply appropriate models (univariate, multivariate).
- Expertise in various learning models, including supervised, unsupervised, and different neural network variations.
- Proficiency in applying probabilistic and stochastic models to different datasets and performing feature engineering for various models.
- Up-to-date knowledge of the capabilities and limitations of different models, and the ability to distinguish between real-time and batch processing needs.
- Strong domain knowledge or the ability to learn the Optical Transport domain for effective feature engineering.

It would be nice if you also had:
- Proficiency in NumPy, Pandas, PyTorch, SciPy, Keras, TensorFlow, GoLang, and corresponding packages for Java and Python, as well as various time-series databases.
- Ability to differentiate between proof-of-concept mode and deployment for small- and large-scale data using CI/CD pipelines.
- Adaptability to rapidly evolving technology and tools, with a proven ability to learn quickly.

About Us
Come create the technology that helps the world act together. Nokia is committed to innovation and technology leadership across mobile, fixed and cloud networks. Your career here will have a positive impact on people's lives and will help us build the capabilities needed for a more productive, sustainable, and inclusive world. We challenge ourselves to create an inclusive way of working where we are open to new ideas, empowered to take risks and fearless to bring our authentic selves to work.

What we offer
Nokia offers continuous learning opportunities, well-being programs to support you mentally and physically, opportunities to join and get supported by employee resource groups, mentoring programs and highly diverse teams with an inclusive culture where people thrive and are empowered. Nokia is committed to inclusion and is an equal opportunity employer. Nokia has received the following recognitions for its commitment to inclusion & equality:
- One of the World's Most Ethical Companies by Ethisphere
- Gender-Equality Index by Bloomberg
- Workplace Pride Global Benchmark

At Nokia, we act inclusively and respect the uniqueness of people. Nokia's employment decisions are made regardless of race, color, national or ethnic origin, religion, gender, sexual orientation, gender identity or expression, age, marital status, disability, protected veteran status or other characteristics protected by law. We are committed to a culture of inclusion built upon our core value of respect. Join us and be part of a company where you will feel included and empowered to succeed.
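The posting itself contains no code; purely as an illustrative sketch of the kind of univariate feature engineering and anomaly flagging on network telemetry it describes, here is a minimal example on synthetic data. All signal names, window sizes, and thresholds are hypothetical, not anything from Nokia's actual pipeline.

```python
# Minimal sketch: rolling statistical features and a simple univariate anomaly
# flag on a synthetic "optical receive power" time series. All data, column
# names, and thresholds are hypothetical placeholders.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

# One day of minute-level telemetry with a slow drift and a few injected dips.
t = pd.date_range("2024-01-01", periods=1_440, freq="min")
power_dbm = -5.0 + 0.001 * np.arange(t.size) + rng.normal(0, 0.05, t.size)
power_dbm[[300, 700, 1100]] -= 1.5  # injected fault-like events

df = pd.DataFrame({"rx_power_dbm": power_dbm}, index=t)

# Rolling features -- a common starting point for supervised or unsupervised
# models on equipment telemetry.
window = 30
df["rolling_mean"] = df["rx_power_dbm"].rolling(window).mean()
df["rolling_std"] = df["rx_power_dbm"].rolling(window).std()
df["zscore"] = (df["rx_power_dbm"] - df["rolling_mean"]) / df["rolling_std"]

# Flag points far outside the rolling distribution.
df["anomaly"] = df["zscore"].abs() > 4

print(df.loc[df["anomaly"], ["rx_power_dbm", "zscore"]])
print("overall skewness:", stats.skew(df["rx_power_dbm"]))
```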

Posted 20 hours ago

Apply

8.0 - 15.0 years

0 Lacs

Pune, Maharashtra

On-site

Role Overview:
As an Applications Development Technology Lead Analyst at our company, you will be a senior member responsible for establishing and implementing new or revised application systems and programs. Your main objective will be to lead applications systems analysis and programming activities in coordination with the Technology team.

Key Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions, identify necessary system enhancements, and deploy new products and process improvements
- Resolve high-impact problems/projects through in-depth evaluation of complex business processes and system processes
- Provide expertise in your area and advanced knowledge of applications programming, ensuring application design aligns with the overall architecture blueprint
- Develop standards for coding, testing, debugging, and implementation, utilizing advanced knowledge of system flow
- Gain comprehensive knowledge of how different areas of the business integrate to achieve business goals
- Offer in-depth analysis and interpretive thinking to define issues and develop innovative solutions
- Serve as an advisor or coach to mid-level developers and analysts, allocating work as necessary
- Assess risk appropriately in business decisions, safeguarding the firm's reputation and complying with laws and regulations

Qualifications:
- 9-15 years of relevant experience in an Apps Development or systems analysis role
- Highly experienced senior core Python developer with 9+ years of experience in software building and platform engineering
- Extensive development expertise in building highly scaled and performant software platforms for data computation and processing
- Expert-level knowledge of core Python concepts and libraries such as pandas, NumPy, and SciPy, with proficiency in OOP concepts and design patterns
- Strong computer science fundamentals in data structures, algorithms, databases, and operating systems
- Highly experienced with Unix-based operating systems
- Strong analytical and logical skills
- Hands-on experience in writing SQL queries
- Experience with source code management tools such as Bitbucket, Git, etc.
- Experience working with banking domains like pricing, risk, etc. is a plus
- CFA/FRM certification is a plus
- Extensive experience in system analysis and programming of software applications
- Experience in managing and implementing successful projects
- Subject Matter Expert (SME) in at least one area of Applications Development
- Ability to adjust priorities quickly as circumstances dictate
- Demonstrated leadership and project management skills
- Clear and concise written and verbal communication skills

Company Additional Details:
The company is seeking a Python developer with experience in developing Risk and Finance analytical applications involving large-volume data computation, analysis, and data loads. The ideal candidate must have 8-12 years of software development experience in core Python with expert-level skills. They should be able to design, review, suggest improvements, guide junior team members, and deliver in an agile environment. Expert-level knowledge of core Python libraries such as pandas, NumPy, and SciPy, along with databases, is required. Experience working with high data volumes, computation, and processing, plus good analytical and logical skills, is essential. Hands-on experience in writing SQL queries and familiarity with source code management tools like GitHub, SVN, etc. are also expected.

(Note: The above job description is a summary; other job-related duties may be assigned as required.)
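As a hedged illustration of the "large-volume data computation" theme in the qualifications (not Citi code; the column names and the present-value formula below are made up), here is a minimal pandas/NumPy sketch of a vectorised computation over a large synthetic trade set.

```python
# Illustrative sketch: vectorised, large-volume computation with pandas/NumPy,
# replacing a per-row Python loop. All column names and the toy PV formula
# are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 1_000_000

trades = pd.DataFrame({
    "notional": rng.uniform(1e5, 1e7, n),
    "rate": rng.uniform(0.01, 0.08, n),
    "years_to_maturity": rng.uniform(0.1, 30.0, n),
})

# Vectorised present-value style calculation over the whole frame at once;
# typically orders of magnitude faster than iterating row by row.
trades["pv"] = trades["notional"] / (1.0 + trades["rate"]) ** trades["years_to_maturity"]

# Aggregation of the kind a risk/finance report might need.
trades["bucket"] = pd.cut(trades["years_to_maturity"], bins=[0, 1, 5, 10, 30])
print(trades.groupby("bucket", observed=True)["pv"].agg(["count", "sum", "mean"]))
```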

Posted 1 day ago

Apply

0.0 - 3.0 years

0 - 0 Lacs

Pitampura, Delhi

On-site

Immediate Hiring: Data Science / Data Analyst Trainer (Onsite – Delhi)

About the Role: Veridical Technologies is looking for passionate and skilled professionals to join as Trainers in Data Science and Data Analytics at our Delhi center. This is a hardcore training profile – only those serious about building careers in teaching and mentoring should apply.

Who Can Apply:
- Experienced trainers with 2–3 years of proven training experience
- Freshers/trainees with strong technical knowledge and a passion for teaching
- Only Delhi-based candidates (or those ready to relocate immediately)

Key Training Areas:
- Machine Learning & Deep Learning
- Mathematics: Statistics, Probability, Calculus, Linear Algebra
- Programming: Python, R, SAS, SQL, SciPy Stack (NumPy, Pandas, Matplotlib)
- Data Visualization: Tableau, Power BI, MS Excel
- Databases: MySQL, MongoDB, Oracle

Requirements:
- Qualification: M.Tech (IT/CS) / MCA / B.Tech (IT/CS)
- Excellent communication & presentation skills
- Passion for mentoring students in IT & Data Science fields

Salary: ₹14,000 – ₹30,000 (based on experience)
Location: Veridical Technologies, Aggarwal Prestige Mall, 512, 5th Floor, Rani Bagh, Pitampura, Delhi – 110034 (Landmark: M2K Pitampura | Nearest Metro: Kohat Enclave / Shakurpur)
Contact: +91 93195 93915 | www.veridicaltechnologies.com
Job Type: Full-time
Pay: ₹14,000.00 - ₹30,000.00 per month
Work Location: In person

Posted 1 day ago

Apply

5.0 - 7.0 years

12 - 14 Lacs

Noida

On-site

Job Title: Python Developer
Experience Required: 5 to 7 Years
Location: Noida / Gurgaon (Onsite – 3 Days a Week)
Employment Type: Full-Time

Position Summary:
We are seeking an experienced and results-driven Python Developer to join our technology team. The ideal candidate will have a strong foundation in Python programming, object-oriented design, data structures, and algorithms. This role requires hands-on experience with data science libraries, modern development tools, and containerization workflows. The position is onsite, requiring physical presence at our Noida or Gurgaon office five days a week.

Key Responsibilities:
- Design, develop, and maintain high-performance, scalable Python-based applications.
- Apply object-oriented programming and algorithmic thinking to solve complex problems.
- Develop data processing and analytical workflows using libraries such as NumPy, Pandas, SciPy, and Scikit-learn.
- Collaborate with DevOps teams to integrate applications into CI/CD pipelines and containerize solutions using Docker.
- Manage version control using Git, ensuring clean and efficient codebase management.
- Design and optimize relational databases, particularly MySQL, for performance and reliability.
- Work closely with cross-functional teams to gather requirements and deliver high-quality solutions within deadlines.

Required Skills:
- Advanced Python programming and object-oriented programming (OOP)
- Strong understanding of data structures & algorithms
- Proficiency in NumPy, Pandas, SciPy, Scikit-learn
- Hands-on experience with Git
- Proficiency with Docker and CI/CD
- Working experience with MySQL, with solid experience in database design and queries

Candidate Attributes:
- Excellent problem-solving and analytical abilities.
- Strong verbal and written communication skills.
- Ability to work independently in a fast-paced, collaborative environment.
- Detail-oriented with a commitment to code quality and documentation.

Job Types: Full-time, Permanent
Pay: ₹100,000.00 - ₹120,000.00 per month
Benefits: Paid time off
Work Location: In person
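As a minimal, hedged sketch of the NumPy/Pandas/Scikit-learn workflow listed in the required skills (the dataset, column names, and model choice are synthetic placeholders, not anything from the hiring company), the following shows a small end-to-end training and evaluation pipeline.

```python
# Self-contained sketch of a data-science workflow: synthetic data -> pipeline
# with scaling and a classifier -> holdout evaluation.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 5_000
df = pd.DataFrame({
    "feature_a": rng.normal(size=n),
    "feature_b": rng.exponential(size=n),
})
# Synthetic target loosely dependent on the features.
df["target"] = (df["feature_a"] + 0.5 * df["feature_b"] + rng.normal(0, 0.5, n) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["feature_a", "feature_b"]], df["target"], test_size=0.2, random_state=0
)

# A Pipeline keeps preprocessing and the estimator versioned and tested together.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```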

Posted 1 day ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Medical Signal Processing Software Engineer
Location: Chennai, India
Job Type: Full-Time
Experience Level: 7–10 years

About the Role
We are seeking an experienced Medical Signal Processing Software Engineer to join our team in Chennai. In this role, you will develop robust pipelines to process biomedical signals, decode proprietary file formats, and contribute to the integration of clinical systems and analytics platforms. This is a hands-on engineering role ideal for someone with a deep understanding of physiological signal data and a passion for building scalable, high-performance tools for medical applications.

Key Responsibilities
· Design and develop software solutions for biomedical signal processing using Python, C/C++, or Java
· Reverse-engineer and decode proprietary binary files from medical monitoring devices
· Work with diverse data encoding formats including IEEE floating point, PCM, and BCD
· Build and optimize signal processing algorithms using libraries such as NumPy, SciPy, wfdb, BioSPPy, and neurokit2
· Collaborate with MATLAB-based research teams for algorithm development and integration
· Convert and standardize physiological waveform data across formats such as SCP, EDF+, HL7 aECG, DICOM waveform, and MIT-BIH
· Analyze long-duration cardiac monitoring recordings, including signal segmentation and annotation
· Utilize tools like the PhysioNet WFDB Toolkit and OpenECG Tools for efficient signal file handling
· Inspect and manipulate binary data using hex editors and Python modules like struct and bitstring
· Document processing workflows, file formats, and code for maintainability and compliance
· Integrate third-party APIs and plug-ins into medical signal processing pipelines

Qualifications & Skills
· Proficiency in Python; experience with C/C++ or Java is highly desirable
· Proven expertise in reverse-engineering binary data formats
· Strong understanding of biomedical data standards, especially those used in cardiac monitoring
· Solid background in signal processing using Python and MATLAB
· Familiarity with binary inspection tools (e.g., HxD, 010 Editor) and relevant Python libraries (struct, bitstring)
· Strong problem-solving ability and the capacity to work independently and within cross-functional teams
· Experience integrating APIs and third-party plug-ins
· Exposure to AI/ML concepts is a plus

Preferred Background
· Bachelor's or Master's degree in Biomedical Engineering, Computer Science, Electrical Engineering, or a related field
· 7–10 years of experience in medical device software development
· Familiarity with cloud platforms and healthcare data pipelines is advantageous
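To make the binary-decoding requirement concrete, here is a hedged sketch using the standard-library struct module and NumPy. The record layout below (a small header followed by little-endian 16-bit PCM samples) is entirely invented for illustration; real device formats have to be reverse-engineered per vendor.

```python
# Hedged sketch: packing and decoding a hypothetical binary waveform record
# (struct header + 16-bit PCM payload). The layout and gain are made up.
import struct

import numpy as np

# Hypothetical record: header = uint32 sample_count + float64 sample rate (Hz),
# followed by little-endian int16 PCM samples.
sample_rate = 250.0
samples = (np.sin(2 * np.pi * 1.0 * np.arange(1000) / sample_rate) * 3000).astype(np.int16)
blob = struct.pack("<Id", samples.size, sample_rate) + samples.tobytes()

# Decoding side: unpack the header, then bulk-convert the payload.
count, rate = struct.unpack_from("<Id", blob, 0)
payload = blob[struct.calcsize("<Id"):]
decoded = np.frombuffer(payload, dtype="<i2", count=count)

# Scale raw PCM counts to physical units (the gain factor here is hypothetical).
millivolts = decoded.astype(np.float64) * 0.001
print(count, rate, millivolts[:5])
```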

Posted 1 day ago

Apply

4.0 - 7.0 years

5 - 9 Lacs

Noida

Work from Office

Broad Function:
We are seeking a Gen AI Developer with proven experience in integrating Generative AI tools into daily development, driving innovation and productivity. The ideal candidate will have hands-on experience using tools like GitHub Copilot, ChatGPT, Amazon CodeWhisperer, or similar platforms for 50% or more of their daily development tasks.

Roles and Responsibilities:
- Interpret briefs and create high-quality, GenAI-assisted code that meets business objectives.
- Lead the design and development of Java-based applications with microservices architecture.
- Use GenAI tools in at least 50% of your development workflow to enhance productivity, code quality, and innovation.
- Collaborate with the Product Manager/Architect to define deliverables and technical strategy.
- Drive the integration of AI/ML capabilities into products using libraries like TensorFlow, PyTorch, or Hugging Face.
- Guide teams through refactoring and codebase optimization using AI-assisted insights.
- Conduct peer code reviews, ensuring usage of GenAI-generated suggestions where appropriate.
- Mentor and motivate junior developers in both traditional and GenAI-enhanced coding practices.
- Collaborate in agile development environments and CI/CD pipelines.
- Implement NLP capabilities with Hugging Face Transformers and spaCy.
- Integrate AI/ML pipelines with CI/CD tools like Jenkins, GitLab CI, and CircleCI.

Desired Candidate Profile:
- Java 8+, Spring Boot, Hibernate, JPA, Microservices, MongoDB, Redis, Kafka
- Docker, Kubernetes, AWS, GitLab, JIRA, Confluence
- Strong understanding of cloud-native and distributed systems
- Proficiency in integrating AI tools (e.g., GitHub Copilot, ChatGPT, Bard, etc.) into coding workflows
- Familiarity with AI/ML libraries and large language model APIs
- Experience with GenAI-based code generation, documentation creation, and API development
- Proficient in AI/ML libraries such as TensorFlow, PyTorch, Keras, and Scikit-learn
- Experienced in MLOps frameworks such as MLflow, Kubeflow, and DataRobot
- Knowledge of data analysis frameworks like Pandas, NumPy, and SciPy for model development

Benefits
The company offers a range of employee benefits, including:
- Cashless medical insurance for employees, spouse and children
- Accidental insurance coverage
- Life insurance coverage
- Complimentary lunch coupons
- Company-paid transportation
- Access to online learning platforms such as Udemy and LinkedIn Learning
- Retirement benefits including Provident Fund (PF) and Gratuity
- Paternity & Maternity Leave Benefit

Posted 2 days ago

Apply

5.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Python Developer
Experience Required: 5 to 7 Years
Location: Noida / Gurgaon (Onsite – 3 Days a Week)
Employment Type: Full-Time

Position Summary:
We are seeking an experienced and results-driven Python Developer to join our technology team. The ideal candidate will have a strong foundation in Python programming, object-oriented design, data structures, and algorithms. This role requires hands-on experience with data science libraries, modern development tools, and containerization workflows. The position is onsite, requiring physical presence at our Noida or Gurgaon office five days a week.

Key Responsibilities:
- Design, develop, and maintain high-performance, scalable Python-based applications.
- Apply object-oriented programming and algorithmic thinking to solve complex problems.
- Develop data processing and analytical workflows using libraries such as NumPy, Pandas, SciPy, and Scikit-learn.
- Collaborate with DevOps teams to integrate applications into CI/CD pipelines and containerize solutions using Docker.
- Manage version control using Git, ensuring clean and efficient codebase management.
- Design and optimize relational databases, particularly MySQL, for performance and reliability.
- Work closely with cross-functional teams to gather requirements and deliver high-quality solutions within deadlines.

Required Skills:
- Advanced Python programming and object-oriented programming (OOP)
- Strong understanding of data structures & algorithms
- Proficiency in NumPy, Pandas, SciPy, Scikit-learn
- Hands-on experience with Git
- Proficiency with Docker and CI/CD
- Working experience with MySQL, with solid experience in database design and queries

Candidate Attributes:
- Excellent problem-solving and analytical abilities.
- Strong verbal and written communication skills.
- Ability to work independently in a fast-paced, collaborative environment.
- Detail-oriented with a commitment to code quality and documentation.

Posted 2 days ago

Apply

3.0 years

10 - 11 Lacs

Gurgaon

On-site

Job Description
We aim to bring about a new paradigm in medical image diagnostics, providing intelligent, holistic, ethical, explainable, and patient-centered care. We are looking for innovative problem solvers: people who can empathize with the consumer, understand business problems, and design and deliver intelligent products, and who are looking to extend artificial intelligence into unexplored areas. Your primary focus will be applying deep learning and artificial intelligence techniques to the domain of medical image analysis.

Responsibilities
- Select features and build and optimize classifier engines using deep learning techniques.
- Understand the problem and apply suitable image processing techniques.
- Use techniques from artificial intelligence/deep learning to solve supervised and unsupervised learning problems.
- Understand and design solutions for complex problems related to medical image analysis using deep learning, object detection, and image segmentation.
- Recommend and implement best practices around the application of statistical modeling.
- Create, train, test, and deploy various neural networks to solve complex problems.
- Develop and implement solutions to fit business problems, which may include applying algorithms from a standard statistical tool, deep learning, or custom algorithm development.
- Understand the requirements and design solutions and architecture in accordance with them.
- Participate in code reviews, sprint planning, and Agile ceremonies to drive high-quality deliverables.
- Design and implement scalable data science architectures for training, inference, and deployment pipelines.
- Ensure code quality, readability, and maintainability by enforcing software engineering best practices within the data science team.
- Optimize models for production, including quantization, pruning, and latency reduction for real-time inference.
- Drive the adoption of versioning strategies for models, datasets, and experiments (e.g., using MLflow, DVC).
- Contribute to the architectural design of data platforms to support large-scale experimentation and production workloads.

Skills and Qualifications
- Strong software engineering skills in Python (or other languages used in data science) with an emphasis on clean code, modularity, and testability.
- Excellent understanding of, and hands-on experience with, deep learning techniques such as ANN, CNN, RNN, LSTM, Transformers, VAEs, etc.
- Experience with the TensorFlow or PyTorch framework in building, training, testing, and deploying neural networks (required).
- Experience solving problems in the domain of computer vision.
- Knowledge of data, data augmentation, data curation, and synthetic data generation.
- Ability to understand the complete problem and design the solution that best fits all the constraints.
- Knowledge of common data science and deep learning libraries and toolkits such as Keras, Pandas, Scikit-learn, NumPy, SciPy, OpenCV, etc.
- Good applied statistical skills, such as distributions, statistical testing, regression, etc.
- Exposure to Agile/Scrum methodologies and collaborative development practices.
- Experience with the development of RESTful APIs; knowledge of libraries like FastAPI and the ability to apply them to deep learning architectures is essential.
- Excellent analytical and problem-solving skills with a good attitude and keenness to adapt to evolving technologies.
- Experience with medical image analysis will be an advantage.
- Experience designing and building ML architecture components (e.g., feature stores, model registries, inference servers).
- Solid understanding of software design patterns, microservices, and cloud-native architectures.
- Expertise in model optimization techniques (e.g., ONNX conversion, TensorRT, model distillation).

Education: BE/B.Tech; MS/M.Tech is a bonus
Experience: 3+ years
Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,100,000.00 per year

Application Question(s):
- Do you have strong software engineering skills in Python (or other languages used in data science) with an emphasis on clean code, modularity, and testability?
- Do you have an excellent understanding of, and hands-on experience with, deep learning techniques such as ANN, CNN, RNN, LSTM, Transformers, VAEs, etc.?
- Do you have knowledge of data, data augmentation, data curation, and synthetic data generation?
- Do you have knowledge of common data science and deep learning libraries and toolkits such as Keras, Pandas, Scikit-learn, NumPy, SciPy, OpenCV, etc.?
- Do you have experience with the development of RESTful APIs?

Experience:
- Data science: 3 years (Required)
- AI/ML: 3 years (Required)

Work Location: In person
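Since the posting pairs deep learning with RESTful APIs and FastAPI, here is a minimal, hedged sketch of serving a toy PyTorch model behind a FastAPI endpoint. The model, route, and payload schema are illustrative only and stand in for a trained medical-imaging model.

```python
# Minimal sketch: a toy PyTorch classifier exposed via FastAPI. A real system
# would load trained weights and run proper image preprocessing instead of
# accepting a flat 16-value feature vector.
from typing import List

import torch
import torch.nn as nn
from fastapi import FastAPI
from pydantic import BaseModel

# Tiny stand-in model expecting a 16-element input vector.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
model.eval()

app = FastAPI(title="toy-inference-service")


class Features(BaseModel):
    # Preprocessed image features would arrive here in a real pipeline.
    values: List[float]


@app.post("/predict")
def predict(payload: Features) -> dict:
    x = torch.tensor(payload.values, dtype=torch.float32).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1).squeeze(0)
    return {"class_probabilities": probs.tolist()}

# Run with: uvicorn this_module:app --reload   (module name is hypothetical)
```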

Posted 2 days ago

Apply

3.0 years

0 Lacs

India

Remote

Join a leading U.S.-based company as a Sr. Python Developer, where your expertise will drive cutting-edge solutions in technology. Leverage your Python skills to solve complex challenges, optimize processes, and contribute to impactful projects alongside global experts. If you have a passion for innovation and a proven track record in Python development, this role offers the perfect opportunity to elevate your career.

Job Responsibilities
- Develop scalable and efficient Python applications.
- Analyze large datasets to extract business insights.
- Write clean, optimized code in Jupyter Notebooks or similar platforms.
- Collaborate with researchers and engineers on data-driven solutions.
- Maintain clear documentation for all code.
- Use datasets from Kaggle, the UN, and U.S. government sources for business insights.

Job Requirements
- 3+ years of Python development experience.
- Bachelor's/Master's in Computer Science, Engineering, or a related field.
- Strong Python skills with experience in Pandas, NumPy, SciPy, etc.
- Experience with databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills in a remote setting.

Perks
- Work with top industry experts worldwide.
- Fully remote flexibility.
- Competitive salary aligned with global standards.
- Be part of cutting-edge, high-impact projects.

Selection Process
Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the assignment.

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

Role Overview:
You will be responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. Your main objective will be to lead applications systems analysis and programming activities. The project you will be working on is for the Market Risk department of Citi, involving risk measurement using various methodologies and risk regulations. Your role will include connecting to Centralized/FO data providers and downloading required data items for trades, trade pricing, and risk calculation. Accuracy and performance are crucial to the success of these projects, so each implementation must be optimal, performant, scalable, and built on the best possible tech stack to meet business needs. Regular interactions with Quants, Risk Analytics, Risk Managers, and FO IT teams will be required, integrating pricing libraries, performing calculations, and reporting activities involving millions of daily priced trades.

Key Responsibilities:
- Partner with multiple management teams for integration of functions, defining necessary system enhancements, and deploying new products and process improvements
- Resolve high-impact problems/projects through evaluation of complex business processes and system processes
- Provide expertise in applications programming, ensuring application design adheres to the overall architecture blueprint
- Utilize advanced system flow knowledge, develop coding standards, and ensure testing, debugging, and implementation
- Develop comprehensive knowledge of how business areas integrate to accomplish goals, providing in-depth analysis and innovative solutions
- Serve as an advisor or coach to mid-level developers and analysts, allocating work as necessary
- Assess risk in business decisions, safeguarding Citigroup, its clients, and assets, by driving compliance with laws, rules, and regulations

Qualifications:
- Highly experienced senior Python developer with data science understanding (8-12 years of experience)
- Extensive development expertise in building highly scaled and performant software platforms for data computation and processing
- Expert-level knowledge of core Python concepts and libraries such as pandas, NumPy, and SciPy, with proficiency in OOP concepts and design patterns
- Strong fundamentals in data structures, algorithms, databases, and operating systems
- Experience with Unix-based operating systems, strong analytical and logical skills, and hands-on experience in writing SQL queries
- Familiarity with source code management tools like Bitbucket and Git; experience in banking domains like pricing and risk is a plus
- CFA/FRM certification is a plus
- Extensive experience in system analysis and programming of software applications, and in managing and implementing successful projects
- Subject Matter Expert (SME) in at least one area of Applications Development
- Demonstrated leadership, project management skills, and clear communication

Education:
- Bachelor's degree/University degree or equivalent experience required
- Master's degree preferred
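Purely as an illustrative aside on the risk-measurement theme (a toy example, not Citi's methodology or any regulatory approach; the P&L history below is synthetic), a one-day value-at-risk number can be computed with NumPy and SciPy as follows.

```python
# Toy parametric and historical VaR on synthetic daily P&L. Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic daily P&L history for a small portfolio (currency units).
pnl = rng.normal(loc=0.0, scale=250_000.0, size=500)

confidence = 0.99
mu, sigma = pnl.mean(), pnl.std(ddof=1)

# Parametric (normal) VaR: the loss threshold exceeded with 1% probability.
var_parametric = -(mu + sigma * stats.norm.ppf(1.0 - confidence))

# Historical-simulation VaR from the empirical distribution, for comparison.
var_historical = -np.percentile(pnl, (1.0 - confidence) * 100.0)

print(f"1-day 99% parametric VaR: {var_parametric:,.0f}")
print(f"1-day 99% historical VaR: {var_historical:,.0f}")
```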

Posted 3 days ago

Apply

5.0 years

0 Lacs

Sarupathar, Assam, India

Remote

About PFF
PFF is a world leader in the collection, analysis, and application of sports data. With clients spanning the majority of the professional and college football landscapes, as well as many media entities and other sports fans at large, our employees have the opportunity to drive changes in the way sports are played and consumed by coaches, athletes and consumers. We are looking for a Computer Vision (CV) Engineer/Scientist to join our foundational team and help develop high-performance object detection and tracking models tailored for football sports footage. You will be responsible for deployment of the current architectures within our framework and for further improving the accuracy of the models with training on the go. You'll work closely with a cross-functional team of engineers, analysts, and football experts to push the boundaries of what's possible in sports tech.

What You'll Do
- Maintain a CI/CD pipeline for CV models and workflows
- Handle training and fine-tuning of CV models for object detection, tracking, homography, multi-modal analysis, etc.
- Proactively research the latest CV developments and bring proof-of-concept (POC) projects
- Build scalable data engines for evaluation of models and their integration into the existing framework
- Contribute towards development of custom datasets for training and validation
- Collaborate cross-functionally with data scientists, football analysts, and engineers to deploy CV systems in production
- Evaluate models with custom datasets
- Use internal and external APIs, including the AWS platform
- Maintain efficient documentation in Git and Confluence

Minimum Qualifications
- 5+ years of experience in Computer Vision or Applied Machine Learning roles
- Hands-on experience with modern detection and tracking
- Strong understanding of projective geometry, camera calibration, and video homography
- Proficiency in Python and CV/ML tools such as PyTorch, OpenCV, MMDetection, Ultralytics and relevant platforms
- Knowledge of hybrid inference strategies (e.g., cascade models, frame-skipping, multi-stage inference)
- Experience deploying models in real-time systems or stream-processing frameworks
- Strong Python and CV tooling background: OpenCV, MMDetection, Ultralytics, NumPy, SciPy
- Comfort designing evaluation pipelines tailored to real-world use cases and edge cases
- Experience with large-scale video datasets and temporal training techniques
- Experience in AWS for training and deploying models

Preferred Qualifications
- B.E/B.Tech/M.Sc/M.Tech in a relevant field
- Prior experience with sports video, broadcast, or ball/player tracking
- Experience in training and deploying models using AWS

The Pay Range For This Role Is
190,000 - 220,000 USD per year (Remote (United States))
280,000 - 330,000 CAD per year (Remote (Canada))
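To illustrate the projective-geometry side of the role, here is a hedged OpenCV sketch: estimating a homography from frame-to-field point correspondences and mapping a detected player position. All coordinates are made up, and the field dimensions are only a rough stand-in.

```python
# Sketch: homography from broadcast-frame pixels to field coordinates (yards),
# then projecting a detected player centre. All correspondences are invented.
import cv2
import numpy as np

# Four correspondences: pixel coordinates in the frame vs. yards on a
# hypothetical field coordinate system.
frame_pts = np.array([[120, 540], [1180, 530], [400, 220], [900, 215]], dtype=np.float32)
field_pts = np.array([[0, 0], [53.3, 0], [0, 50], [53.3, 50]], dtype=np.float32)

H, mask = cv2.findHomography(frame_pts, field_pts, cv2.RANSAC)

# Project a detected player centre (pixel coords) onto field coordinates.
player_px = np.array([[[640.0, 400.0]]], dtype=np.float32)   # shape (1, 1, 2)
player_field = cv2.perspectiveTransform(player_px, H)
print("player position on field (yards):", player_field.ravel())
```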

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site

Role: This is an exciting opportunity for an experienced environmental modeller with strong programming expertise to join our growing team. Working alongside our Principal Soil Modeller, you will be responsible for developing, implementing, and maintaining components of the Agricarbon Ecosystem Model (AEM) using Python. Your advanced programming skills will be crucial in translating complex modelling concepts into robust, production-ready code that enhances our ability to make accurate predictions of soil carbon levels and agricultural system interactions. You will need to be adaptable – capable of working independently and as a key member of a multi-disciplinary team spanning engineering, GIS, soil science, quality management, data systems, and the commercial team, as well as collaborating effectively with external partners.

Key responsibilities:

Model Components & Integration:
- Working with agricultural ecosystem models (AEM), including plant growth models (LINTUL-5, LINGRA), soil organic carbon models (RothPC, RothPC-N), soil water models, mineral nitrogen models, and grazing models
- Model Integration: Implementing and maintaining the integration between different AEM components, ensuring seamless data flow between plant growth, soil carbon, water, nitrogen, and livestock models within the Bayesian data assimilation framework

Technical Development:
- Bayesian Framework Development: Contributing to the development and maintenance of the Bayesian data assimilation framework that underpins the AEM, ensuring robust uncertainty quantification and model calibration
- Model Development: Configuring, running, and extending existing model components such as LINTUL-5 (arable crops), LINGRA (grass), and RothPC-N (soil organic carbon and nitrogen), developing Python implementations that maximise the benefit of our access to the world's largest soil carbon database
- Machine Learning Integration: Evaluating and implementing machine learning and statistical models using Python libraries to enhance overall accuracy and predictive power, potentially as part of ensemble modelling approaches

Code Quality & Collaboration:
- Code Quality and Maintenance: Ensuring all modelling code meets high standards for reliability, performance, and maintainability, with comprehensive testing and documentation
- Technical Collaboration: Working closely with our Principal Soil Modeller to translate scientific requirements into robust technical solutions, providing programming expertise to support complex modelling challenges

Data & Validation:
- Model Validation: Designing and implementing automated testing frameworks to validate and improve model performance, ensuring statistical rigour in all implementations

Communication & Documentation:
- Technical Documentation: Producing comprehensive technical documentation, code comments, and user guides for all modelling implementations
- Research Support: Supporting collaborative research initiatives by providing technical implementation of novel modelling approaches and contributing to peer-reviewed publications
- Stakeholder Communication: Communicating technical modelling concepts and results to both technical and non-technical audiences, including partners and stakeholders

Skills and experience:

Must have:
- Advanced Programming Skills: Extensive experience in Python programming for data science and environmental modelling, including proficiency with scientific libraries (NumPy, SciPy, Pandas, scikit-learn, GeoPandas) and Bayesian statistical libraries (PyMC or similar)
- Environmental Modelling Experience: Proven experience developing and working with ecosystem models or related areas
- Data Science Proficiency: Extensive experience with machine learning techniques and their application to environmental data, including model validation and statistical analysis
- Code Quality Focus: Experience with software development best practices, including version control (Git), testing frameworks, and code documentation
- Problem-Solving Skills: Excellent analytical and problem-solving abilities with extreme attention to detail and a rigorous approach to model development
- Educational Background: Master's degree or PhD in Data Science, Environmental Science, Computer Science, or a related field with a strong focus on modelling and programming

Nice to have:
- Experience with Bayesian methods and data assimilation frameworks
- Familiarity with soil carbon models (e.g. RothC) and crop growth models (e.g. LINTUL, WOFOST, DSSAT, APSIM) or grassland models (e.g. LINGRA), and/or integrated agricultural system models
- Knowledge of nitrogen cycling and soil-plant-atmosphere interactions
- Familiarity with data assimilation using satellite-derived data (e.g. leaf area index, canopy cover)
- Experience with cloud computing platforms for large-scale data processing (AWS, Azure, GCP)
- Track record of peer-reviewed publications in relevant fields
- Geospatial data handling experience (e.g., GeoPandas, DuckDB, etc.)
- Familiarity with containerisation and deployment technologies (Docker)
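As a deliberately simplified, hedged illustration of the model-calibration work described above (a toy single-pool decay curve, not the AEM, RothC, or a full Bayesian treatment), here is a SciPy sketch of fitting model parameters to noisy soil-carbon observations.

```python
# Toy calibration of a single-pool soil-carbon decay model with SciPy.
# The model form, parameter values, and observations are all made up.
import numpy as np
from scipy.optimize import curve_fit


def carbon_stock(t_years, c0, k):
    """Single-pool exponential decay: C(t) = C0 * exp(-k * t)."""
    return c0 * np.exp(-k * t_years)


rng = np.random.default_rng(3)
t_obs = np.arange(0, 30, 2.0)
true_c0, true_k = 60.0, 0.035            # t C/ha and 1/year (hypothetical)
c_obs = carbon_stock(t_obs, true_c0, true_k) + rng.normal(0, 1.0, t_obs.size)

# Least-squares calibration; a Bayesian treatment (e.g. with PyMC) would
# instead return full posterior distributions for c0 and k.
popt, pcov = curve_fit(carbon_stock, t_obs, c_obs, p0=[50.0, 0.05])
perr = np.sqrt(np.diag(pcov))
print("fitted C0, k:", popt, "+/-", perr)
```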

Posted 3 days ago

Apply

8.0 years

2 - 9 Lacs

Gurgaon

On-site

Project description
We are seeking a highly experienced Data Scientist with deep expertise in Python and advanced machine learning techniques. You need a strong background in statistical analysis, big data platforms, and cloud integration, and you will be responsible for designing and deploying scalable data science solutions.

Responsibilities
- Develop and deploy machine learning, deep learning, and predictive models.
- Perform statistical analysis, data mining, and feature engineering on large datasets.
- Build and optimize data pipelines and ETL workflows.
- Collaborate with data engineers and business stakeholders to deliver actionable insights.
- Create compelling data visualizations using tools like Tableau, Power BI, Matplotlib, or Plotly.
- Implement MLOps practices, including CI/CD, model monitoring, and lifecycle management.
- Mentor junior data scientists and contribute to team knowledge-sharing.
- Stay current with trends in AI/ML and data science.

Skills (must have)
- Minimum 8+ years of hands-on experience in Data Science with strong expertise in Python and libraries such as Pandas, NumPy, SciPy, Scikit-learn, TensorFlow, or PyTorch.
- Proven ability to design, develop, and deploy machine learning, deep learning, and predictive models to solve complex business problems.
- Strong background in statistical analysis, data mining, and feature engineering for large-scale structured and unstructured datasets.
- Experience working with big data platforms (Spark, Hadoop) and integrating with cloud environments (AWS, Azure, GCP).
- Proficiency in building data pipelines and ETL workflows, and collaborating with data engineers on scalable data solutions.
- Expertise in data visualization and storytelling using Tableau, Power BI, Matplotlib, Seaborn, or Plotly to present insights effectively.
- Strong knowledge of MLOps practices, including CI/CD pipelines, model deployment, monitoring, and lifecycle management.
- Ability to engage with business stakeholders, gather requirements, and deliver actionable insights aligned with business goals.
- Experience in mentoring junior data scientists/analysts, leading projects, and contributing to knowledge-sharing across teams.
- Continuous learner with strong problem-solving, communication, and leadership skills, staying updated with the latest trends in AI/ML and data science.

Nice to have: N/A
Other languages: English (B2 Upper Intermediate)
Seniority: Senior
Location: Gurugram, India
Requisition: VR-117421, Data Science, BCM Industry, 11/09/2025
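As a minimal sketch tying together the model-development and lifecycle themes above (the dataset, hyperparameter grid, and output filename are illustrative only, and MLflow or a similar registry would sit around this in a real MLOps setup), the following tunes a model with cross-validation and persists the fitted artifact.

```python
# Sketch: cross-validated model selection plus a persisted artifact for a
# downstream serving step. Everything here is synthetic/illustrative.
import joblib
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
    scoring="roc_auc",
    n_jobs=-1,
)
search.fit(X_train, y_train)
print("best params:", search.best_params_, "cv auc:", round(search.best_score_, 3))
print("holdout auc:", round(search.score(X_test, y_test), 3))

# Persisting the fitted estimator is one small piece of an MLOps workflow;
# a registry (e.g. MLflow) would track versions and metrics around it.
joblib.dump(search.best_estimator_, "model_v1.joblib")  # filename is arbitrary
```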

Posted 3 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description – Software Engineer (Machine Learning)
We are looking for an experienced Software Engineer – Machine Learning with 3–5 years of proven expertise in designing, developing, and deploying AI/ML solutions at an enterprise level. The ideal candidate will bring hands-on experience in Computer Vision, Deep Learning, Generative AI, and GIS-based AI applications, along with strong programming, analytical, and solution-building skills. In this role, you will contribute to the development of AI-powered GIS platforms, multimodal and geospatially aware models, and real-time feature detection systems, playing a key role in the end-to-end lifecycle of enterprise-grade AI/ML solutions.

Must Have
- 3–5 years of hands-on experience in implementing AI/ML solutions in enterprise or large-scale production environments.
- Strong programming proficiency in Python with practical use of TensorFlow, PyTorch, Keras, scikit-learn, NumPy, Pandas, and OpenCV.
- Proven experience with Computer Vision architectures, including YOLO, SAM, U-Net, and other CNN-based models for real-time object detection, segmentation, and feature extraction.
- Expertise in image classification using advanced CV models (e.g., ResNet, VGG, YOLOv5/v8, EfficientNet).
- Deep understanding and implementation experience of machine learning algorithms (supervised, unsupervised, and reinforcement learning) for classification, regression, and clustering.
- Experience in Generative AI and Large Language Models (LLMs), including transformers, diffusion models, multimodal pipelines, and speech-to-text/NLP solutions.
- Strong foundation in mathematics, statistics, data structures, algorithms, and optimization techniques.
- Hands-on experience with RNNs/LSTMs, hybrid neural architectures, and temporal/spatiotemporal modeling.
- Practical knowledge of Agentic AI systems (AI agents, multi-agent workflows, or autonomous agent design).
- GIS domain expertise – demonstrated ability to integrate AI/ML with geospatial datasets, satellite imagery, and spatial analytics to deliver practical solutions.
- End-to-end experience in the AI/ML solution lifecycle – from data preprocessing, model training, and evaluation to deployment and scaling.
- Hands-on exposure to geospatial toolkits/libraries (e.g., GDAL, GeoPandas, QGIS, ArcGIS APIs) and their integration with ML workflows.

Should Have
- Working knowledge of MLOps/LLMOps workflows for scalable, automated AI/ML deployments (CI/CD, containerization, orchestration).
- Experience with enterprise-grade data platforms (cloud-native, distributed systems) ensuring high performance and interoperability with AI workloads.
- Practical exposure to geospatial feature extraction, change detection, and segmentation workflows.
- Strong ability to communicate AI/GIS solutions, collaborate with cross-functional teams, and translate research into applied enterprise use cases.

Could Have
- Certifications in AI/ML, GIS, or cloud platforms (AWS, Azure, GCP).
- Familiarity with scientific computing libraries (SciPy, Theano, Julia ecosystem).
- Experience contributing to AI/GIS research publications, open-source projects, or innovation programs.
- Knowledge of emerging paradigms such as federated learning, multi-agent systems, spatial AI, or Responsible AI practices.
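To make the "integrate AI/ML with geospatial datasets" item concrete, here is a hedged GeoPandas sketch of joining hypothetical model detections to a spatial zone and reprojecting before measuring. Coordinates, CRS choice, and attribute names are illustrative only.

```python
# Sketch: spatial join of (hypothetical) object-detection outputs with an
# area of interest, then reprojection to a metric CRS for buffering.
import geopandas as gpd
from shapely.geometry import Point, box

# Hypothetical detections georeferenced to lon/lat (WGS84).
detections = gpd.GeoDataFrame(
    {"class_name": ["building", "vehicle", "building"]},
    geometry=[Point(77.59, 12.97), Point(77.61, 12.98), Point(77.70, 12.90)],
    crs="EPSG:4326",
)

# A hypothetical area of interest, also in WGS84.
aoi = gpd.GeoDataFrame(
    {"zone": ["city_core"]},
    geometry=[box(77.55, 12.95, 77.65, 13.00)],
    crs="EPSG:4326",
)

# Keep only detections that fall inside the zone polygon.
inside = gpd.sjoin(detections, aoi, predicate="within")
print(inside[["class_name", "zone"]])

# Reproject to a metric CRS before buffering or measuring distances.
inside_m = inside.to_crs(epsg=32643)  # UTM zone 43N, assumed for this AOI
print(inside_m.geometry.buffer(50).area.round(1))
```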

Posted 3 days ago

Apply

5.0 - 7.0 years

12 - 22 Lacs

Kolkata

Work from Office

About Us: ZURU Tech is digitalizing global construction by developing the world's first digital building fabrication platform – ZURU Home – empowering anyone to design and manufacture buildings with complete freedom. Join our multinational team to help shape the future of construction technology.

What are you going to do?
As a Senior Python Developer, you'll play a pivotal role in the development of a cutting-edge, cloud-based structural analysis platform. You'll design and optimize core services that integrate with external building design tools, focusing on delivering high-quality software solutions in a distributed environment. Example projects include:
- Creating services to generate complex structural meshes from architectural models.
- Designing and integrating finite element analysis solutions with advanced FEM libraries.
- Implementing region-specific structural verification logic according to building codes.
- Collaborating with subject matter experts on requirements and testing scenarios.
- Coding and validating structural verifications, including reinforced concrete structures.
- Working collaboratively with engineering teams across multiple locations.

What are we looking for?
- Minimum 5 years of experience with Python (object-oriented programming, data classes)
- Strong background in NumPy and SciPy: N-D tensors, slicing, sparse representations
- Solid grasp of software engineering principles and best practices
- Proficient in test-driven development; experienced with PyTest
- Experienced with CI/CD pipelines and containerization (Docker)
- Excellent communication skills, with a collaborative and proactive approach to problem-solving
- Demonstrated ability to mentor team members or guide technical discussions is a plus
- Skilled at taking ownership of key technical decisions and delivering projects to completion
- Preferred: Experience in distributed systems and cloud deployments

What do we offer?
- Competitive compensation
- 5 working days with flexible working hours
- Team outings
- Medical insurance for self & family
- Training & skill development programs
- Work with the global team and make the most of its diverse knowledge
- Several discussions over multiple pizza parties
- A lot more! Come and discover us!
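To illustrate the NumPy/SciPy sparse-representation requirement, here is a hedged sketch that assembles and solves a toy 1-D finite-difference system as a stand-in for real FEM assembly (the actual structural solver and FEM libraries used by the platform are not shown or implied here).

```python
# Toy example: sparse assembly and solve of a 1-D "stiffness" system,
# standing in for the much larger systems produced by real FEM assembly.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n = 1_000                      # interior nodes of a 1-D bar (toy problem)
h = 1.0 / (n + 1)

# Tridiagonal matrix for -u'' = f, assembled directly in sparse form.
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr") / h**2

f = np.ones(n)                 # uniform load vector
u = spsolve(K, f)              # displacement-like solution

print("max deflection:", u.max())        # analytic max for -u'' = 1 is 0.125
print("nonzeros stored:", K.nnz, "of", n * n)
```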

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Python Developer
Experience Required: 5 to 7 Years
Location: Noida / Gurgaon (Onsite – 3 Days a Week)
Employment Type: Full-Time

Position Summary:
We are seeking an experienced and results-driven Python Developer to join our technology team. The ideal candidate will have a strong foundation in Python programming, object-oriented design, data structures, and algorithms. This role requires hands-on experience with data science libraries, modern development tools, and containerization workflows. The position is onsite, requiring physical presence at our Noida or Gurgaon office five days a week.

Key Responsibilities:
- Design, develop, and maintain high-performance, scalable Python-based applications.
- Apply object-oriented programming and algorithmic thinking to solve complex problems.
- Develop data processing and analytical workflows using libraries such as NumPy, Pandas, SciPy, and Scikit-learn.
- Collaborate with DevOps teams to integrate applications into CI/CD pipelines and containerize solutions using Docker.
- Manage version control using Git, ensuring clean and efficient codebase management.
- Design and optimize relational databases, particularly MySQL, for performance and reliability.
- Work closely with cross-functional teams to gather requirements and deliver high-quality solutions within deadlines.

Required Skills:
- Advanced Python programming and object-oriented programming (OOP)
- Strong understanding of data structures & algorithms
- Proficiency in NumPy, Pandas, SciPy, Scikit-learn
- Hands-on experience with Git
- Proficiency with Docker and CI/CD
- Working experience with MySQL, with solid experience in database design and queries

Candidate Attributes:
- Excellent problem-solving and analytical abilities.
- Strong verbal and written communication skills.
- Ability to work independently in a fast-paced, collaborative environment.
- Detail-oriented with a commitment to code quality and documentation.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Bangalore, Karnataka

On-site

As a Data Scientist at our company, your role will involve supporting our product, sales, leadership, and marketing teams by providing insights derived from analyzing company data. You will be responsible for utilizing large datasets to identify opportunities for product and process optimization, testing the effectiveness of various actions through models, and driving business results with data-based insights. Your work will also involve collaborating with a wide range of stakeholders to improve business outcomes through data-driven solutions.

**Key Responsibilities:**
- Work with stakeholders across the organization to leverage company data for driving business solutions.
- Analyze data from company databases to optimize product development, marketing techniques, and business strategies.
- Evaluate the accuracy of new data sources and data gathering methods.
- Develop custom data models and algorithms for data sets.
- Utilize predictive modeling to enhance customer experiences, revenue generation, ad targeting, and other business outcomes.
- Establish a company A/B testing framework and assess model quality.
- Collaborate with functional teams to implement models and monitor outcomes.
- Create processes and tools for monitoring model performance and data accuracy.

**Qualifications:**
- Strong problem-solving skills with a focus on product development.
- Proficiency in statistical computer languages such as R, Python, and SQL for data manipulation and insights extraction.
- Experience in working with and building data architectures.
- Knowledge of various machine learning techniques and their practical advantages and drawbacks.
- Familiarity with advanced statistical concepts and techniques.
- Excellent written and verbal communication skills for cross-team coordination.
- Eagerness to learn and master new technologies and methodologies.
- 5-7 years of experience in manipulating data sets and building statistical models, and a Master's or PhD in Statistics, Mathematics, Computer Science, or a related quantitative field.
- Familiarity with coding languages like C, C++, Java, JavaScript, etc.
- Experience with statistical and data mining techniques, databases, web services, machine learning algorithms, distributed data tools, and data visualization/presentation tools.

In addition to the above, you should possess skills in linear algebra, statistics, Python, R, NumPy, Pandas, SciPy, and Scikit-learn to excel in this role.
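For the A/B testing responsibility mentioned above, here is a minimal hedged sketch of a significance check on conversion counts with SciPy. The counts and the decision rule are made up; a production framework would also handle power analysis, sequential-testing corrections, and guardrail metrics.

```python
# Toy A/B test: compare conversion rates between control and variant groups
# with a chi-square test of independence. All counts are invented.
import numpy as np
from scipy import stats

# Hypothetical results: [converted, not converted] per group.
control = np.array([420, 9_580])    # 4.2% conversion
variant = np.array([505, 9_495])    # 5.05% conversion

chi2, p_value, dof, expected = stats.chi2_contingency(np.vstack([control, variant]))
print(f"chi-square = {chi2:.2f}, p-value = {p_value:.4g}")

alpha = 0.05
print("significant difference" if p_value < alpha else "no significant difference")
```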

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

West Bengal

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

**Key Responsibilities:**
- OOP concepts: functions, classes, decorators
- Python, with experience in any one of the frameworks Flask/Django/FastAPI
- Python libraries (Pandas, TensorFlow, NumPy, SciPy)
- AWS cloud experience
- Docker, Kubernetes and microservices
- Postgres/MySQL
- Git, SVN or any code repository tools
- Design patterns
- SQLAlchemy or any other ORM (Object Relational Mapper) library

**Qualifications Required:**
- Bachelor's or Master's degree with 3+ years of strong Python development experience

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
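To make the framework/ORM items concrete, here is a hedged sketch combining a Flask route with a SQLAlchemy ORM model. Table names, the endpoint, and the SQLite file are hypothetical conveniences for the example, not anything EY-specific.

```python
# Minimal Flask + SQLAlchemy ORM sketch: define a mapped class, seed a local
# SQLite database, and expose the rows through one JSON endpoint.
from flask import Flask, jsonify
from sqlalchemy import Integer, String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Client(Base):
    __tablename__ = "clients"
    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    name: Mapped[str] = mapped_column(String(100))


engine = create_engine("sqlite:///clients_demo.db")  # local file, demo only
Base.metadata.create_all(engine)
with Session(engine) as session:
    if session.execute(select(Client)).first() is None:  # seed once
        session.add_all([Client(name="Acme"), Client(name="Globex")])
        session.commit()

app = Flask(__name__)


@app.get("/clients")
def list_clients():
    with Session(engine) as session:
        rows = session.execute(select(Client)).scalars().all()
        return jsonify([{"id": c.id, "name": c.name} for c in rows])


if __name__ == "__main__":
    app.run(debug=True)
```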

Posted 4 days ago

Apply

7.0 - 10.0 years

1 - 5 Lacs

noida

Work from Office

Broad Function:
We are seeking a Gen AI Developer with proven experience integrating Generative AI tools into daily development, driving innovation and productivity. The ideal candidate has hands-on experience using tools such as GitHub Copilot, ChatGPT, Amazon CodeWhisperer, or similar platforms for 50% or more of their daily development tasks.

Roles and Responsibilities:
- Interpret briefs and create high-quality, GenAI-assisted code that meets business objectives.
- Lead the design and development of Java-based applications with a microservices architecture.
- Use GenAI tools in at least 50% of your development workflow to enhance productivity, code quality, and innovation.
- Collaborate with the Product Manager/Architect to define deliverables and technical strategy.
- Drive the integration of AI/ML capabilities into products using libraries like TensorFlow, PyTorch, or Hugging Face.
- Guide teams through refactoring and codebase optimization using AI-assisted insights.
- Conduct peer code reviews, ensuring GenAI-generated suggestions are used where appropriate.
- Mentor and motivate junior developers in both traditional and GenAI-enhanced coding practices.
- Collaborate in agile development environments and CI/CD pipelines.
- Implement NLP capabilities with Hugging Face Transformers and spaCy.
- Integrate AI/ML pipelines with CI/CD tools like Jenkins, GitLab CI, and CircleCI.

Requirements / Desired Candidate Profile:
- Java 8+, Spring Boot, Hibernate, JPA, Microservices, MongoDB, Redis, Kafka
- Docker, Kubernetes, AWS, GitLab, JIRA, Confluence
- Strong understanding of cloud-native and distributed systems
- Proficiency in integrating AI tools (e.g., GitHub Copilot, ChatGPT, Bard) into coding workflows
- Familiarity with AI/ML libraries and large language model APIs
- Experience with GenAI-based code generation, documentation creation, and API development
- Proficient in AI/ML libraries such as TensorFlow, PyTorch, Keras, and Scikit-learn
- Experienced in MLOps frameworks such as MLflow, Kubeflow, and DataRobot
- Knowledge of data analysis frameworks like Pandas, NumPy, and SciPy for model development

Benefits:
The company offers a range of employee benefits, including:
- Cashless medical insurance for employees, spouse, and children
- Accidental insurance coverage
- Life insurance coverage
- Complimentary lunch coupons
- Company-paid transportation
- Access to online learning platforms such as Udemy and LinkedIn Learning
- Retirement benefits including Provident Fund (PF) and Gratuity
- Paternity & Maternity Leave Benefit
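As a small, hedged illustration of the NLP responsibility above (Hugging Face Transformers and spaCy), this sketch runs a sentiment pipeline and basic spaCy tokenization; the model choices are library defaults, not something mandated by the role.

```python
# Assumes `pip install transformers spacy` and the small English spaCy model
# (`python -m spacy download en_core_web_sm`) are already available.
from transformers import pipeline
import spacy

# Sentiment analysis using the pipeline's default pretrained model.
sentiment = pipeline("sentiment-analysis")
print(sentiment("The GenAI-assisted refactor cut our review time in half."))

# Lightweight linguistic features with spaCy.
nlp = spacy.load("en_core_web_sm")
doc = nlp("Integrate AI/ML pipelines with Jenkins, GitLab CI, and CircleCI.")
print([(token.text, token.pos_) for token in doc])   # part-of-speech tags
print([(ent.text, ent.label_) for ent in doc.ents])  # named entities
```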

Posted 4 days ago

Apply

0 years

0 Lacs

delhi, india

On-site

Description: As a Data Scientist at Encardio, you will analyze complex time-series data from devices such as accelerometers, strain gauges, and tilt meters. Your responsibilities will span data preprocessing, feature engineering, machine learning model development, and integration with real-time systems. You'll collaborate closely with engineers and domain experts to translate physical behaviours into actionable insights. This role is ideal for someone with strong statistical skills, experience in time-series modeling, and a desire to understand the real-world impact of their models in civil and industrial monitoring.

Responsibilities

Sensor Data Understanding & Preprocessing
- Clean, denoise, and preprocess high-frequency time-series data from edge devices.
- Handle missing, corrupted, or delayed telemetry from IoT sources.
- Develop domain knowledge of physical sensors and their behaviour (e.g., vibration patterns, strain profiles).

Exploratory & Statistical Analysis
- Perform statistical and exploratory data analysis (EDA) on structured/unstructured sensor data.
- Identify anomalies, patterns, and correlations in multi-sensor environments.

Feature Engineering
- Generate meaningful time-domain and frequency-domain features (e.g., FFT, wavelets).
- Implement scalable feature extraction pipelines.

Model Development
- Build and validate ML models for anomaly detection (e.g., vibration spikes), event classification (e.g., tilt angle breaches), and predictive maintenance (e.g., time-to-failure).
- Leverage traditional ML, deep learning, and LLMs.

Deployment & Integration
- Work with Data Engineers to integrate models into real-time data pipelines and edge/cloud platforms.
- Package and containerize models (e.g., with Docker) for scalable deployment.

Monitoring & Feedback
- Track model performance post-deployment and retrain/update as needed.
- Design feedback loops using human-in-the-loop or rule-based corrections.

Collaboration & Communication
- Collaborate with hardware, firmware, and data engineering teams.
- Translate physical phenomena into data problems and insights.
- Document approaches, models, and assumptions for reproducibility.

🎯 Key Deliverables
- Reusable preprocessing and feature extraction modules for sensor data.
- Accurate and explainable ML models for anomaly/event detection.
- Model deployment artifacts (Docker images, APIs) for cloud or edge execution.
- Jupyter notebooks and dashboards (Streamlit) for diagnostics, visualization, and insight generation.
- Model monitoring reports and performance metrics with retraining pipelines.
- Domain-specific data dictionaries and technical knowledge bases.
- Contributions to internal documentation and research discussions.
- Deep understanding and documentation of sensor behavior and characteristics.
🔧 Technologies

Languages & Libraries
- Python (NumPy, Pandas, SciPy, Scikit-learn, PyTorch/TensorFlow)
- Bash (for data ops & batch jobs)

Signal Processing & Feature Extraction
- FFT, DWT, STFT (via SciPy, Librosa, tsfresh)
- Time-series modeling (sktime, statsmodels, Prophet)

Machine Learning & Deep Learning
- Scikit-learn (traditional ML)
- PyTorch / TensorFlow / Keras (deep learning)
- XGBoost / LightGBM (tabular modeling)

Data Analysis & Visualization
- Jupyter, Matplotlib, Seaborn, Plotly, Grafana (for dashboards)

Model Deployment
- Docker (for containerizing ML models)
- FastAPI / Flask (for ML inference APIs)
- GitHub Actions (CI/CD for models)
- ONNX / TorchScript (for lightweight deployment)

Data Engineering Integration
- Kafka (real-time data ingestion)
- S3 (model/data storage)
- Trino / Athena (querying raw and processed data)
- Argo Workflows / Airflow (model training pipelines)

Monitoring & Observability
- Prometheus / Grafana (model & system monitoring)
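The frequency-domain feature work listed above can be sketched with SciPy alone; the sampling rate, window length, and synthetic vibration signal below are assumptions for illustration, not Encardio's actual pipeline.

```python
import numpy as np
from scipy.fft import rfft, rfftfreq
from scipy.signal import detrend

def spectral_features(signal: np.ndarray, fs: float) -> dict:
    """Basic frequency-domain features for one window of sensor data."""
    x = detrend(signal)                  # remove slow drift common in strain/tilt data
    spectrum = np.abs(rfft(x))
    freqs = rfftfreq(len(x), d=1.0 / fs)
    power = spectrum ** 2
    return {
        "dominant_freq_hz": float(freqs[np.argmax(spectrum)]),
        "total_power": float(power.sum()),
        "spectral_centroid_hz": float((freqs * power).sum() / power.sum()),
        "rms": float(np.sqrt(np.mean(x ** 2))),
    }

# Synthetic 1-second accelerometer window: 12 Hz vibration plus noise (assumed).
fs = 500.0
t = np.arange(0, 1, 1 / fs)
window = np.sin(2 * np.pi * 12 * t) + 0.2 * np.random.randn(t.size)
print(spectral_features(window, fs))
```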

Posted 4 days ago

Apply

4.0 - 8.0 years

25 - 40 Lacs

pune, bengaluru

Work from Office

Design, implement, and improve scalable forecasting and M&V algorithms using statistical, machine learning, and scientific methods. Experience with Docker & Kubernetes. Machine learning methods for time-series forecasting, e.g., ARIMA, regression models, deep learning.

Required Candidate profile
4+ years of experience in a data science role with a focus on time-series forecasting/M&V. Strong Python experience. Experience with the Python data science stack (NumPy, SciPy, Pandas, scikit-learn). Experience with PySpark/Dask. Cloud experience preferred.
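A minimal sketch of the kind of time-series forecasting this role mentions, using ARIMA from statsmodels; the synthetic daily-load series and the (2, 1, 1) order are assumptions, and a production model would be selected through proper validation rather than hard-coded.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic daily energy-consumption series with trend and weekly seasonality (assumed).
idx = pd.date_range("2024-01-01", periods=365, freq="D")
rng = np.random.default_rng(0)
y = pd.Series(
    100 + 0.05 * np.arange(365)
    + 10 * np.sin(2 * np.pi * np.arange(365) / 7)
    + rng.normal(0, 2, 365),
    index=idx,
)

train, test = y[:-30], y[-30:]

# Order (2, 1, 1) is an illustrative choice; AIC/BIC or cross-validation would guide it.
model = ARIMA(train, order=(2, 1, 1)).fit()
forecast = model.forecast(steps=30)

mae = np.mean(np.abs(forecast.values - test.values))
print(f"30-day MAE on the held-out window: {mae:.2f}")
```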

Posted 4 days ago

Apply

5.0 years

0 Lacs

greater kolkata area

On-site

₹10 - ₹20 a year

About Us
ZURU Tech is digitalizing global construction by developing the world's first digital building fabrication platform, ZURU Home, empowering anyone to design and manufacture buildings with complete freedom. Join our multinational team to help shape the future of construction technology.

What are you going to do?

Role: As a Senior Python Developer, you'll play a pivotal role in the development of a cutting-edge, cloud-based structural analysis platform. You'll design and optimize core services that integrate with external building design tools, focusing on delivering high-quality software solutions in a distributed environment. Example projects include:
📌 Creating services to generate complex structural meshes from architectural models.
📌 Designing and integrating finite element analysis solutions with advanced FEM libraries.
📌 Implementing region-specific structural verification logic according to building codes.
📌 Collaborating with subject matter experts on requirements and testing scenarios.
📌 Coding and validating structural verifications, including reinforced concrete structures.
📌 Working collaboratively with engineering teams across multiple locations.

What are we looking for?
✔ Minimum 5 years of experience with Python (object-oriented programming, data classes)
✔ Strong background in NumPy and SciPy: ND tensors, slicing, sparse representations
✔ Solid grasp of software engineering principles and best practices
✔ Proficient in Test-Driven Development; experienced with PyTest
✔ Experienced with CI/CD pipelines and containerization (Docker)
✔ Excellent communication skills, with a collaborative and proactive approach to problem-solving
✔ Demonstrated ability to mentor team members or guide technical discussions is a plus
✔ Skilled at taking ownership of key technical decisions and delivering projects to completion
✔ Preferred: Experience in distributed systems and cloud deployments

What do we offer?
💰 Competitive compensation
⌛️ 5 working days with flexible working hours
🌎 Annual trips & team outings
🚑 Medical insurance for self & family
🚩 Training & skill development programs
🤘🏼 Work with our global team and make the most of its diverse knowledge
🍕 Several discussions over multiple pizza parties
And a lot more! Come and discover us!
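To illustrate the NumPy/SciPy sparse-representation requirement above, here is a small, hedged sketch of assembling and solving a sparse linear system of the kind that appears in finite element analysis; the 1-D Poisson-style stiffness matrix is a textbook stand-in, not ZURU's actual solver.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

n = 1_000          # number of interior nodes in a 1-D mesh (assumed)
h = 1.0 / (n + 1)  # uniform element size

# Tridiagonal stiffness-like matrix from a 1-D Laplacian discretization, stored as CSR.
K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr") / h**2
f = np.ones(n)  # uniform load vector (assumed)

# Direct sparse solve; large 3-D models would typically use iterative solvers instead.
u = spsolve(K, f)
print(f"max displacement-like value: {u.max():.4f}")  # ~0.125 for this model problem
```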

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

gurugram, haryana, india

On-site

Job Description
We aim to bring about a new paradigm in medical image diagnostics, providing intelligent, holistic, ethical, explainable and patient-centric care. We are looking for innovative problem solvers: people who can empathize with the consumer, understand business problems, and design and deliver intelligent products, and who are looking to extend artificial intelligence into unexplored areas. Your primary focus will be on applying deep learning and artificial intelligence techniques to the domain of medical image analysis.

Responsibilities
- Select features and build and optimize classifier engines using deep learning techniques.
- Understand the problem and apply suitable image processing techniques.
- Use techniques from artificial intelligence/deep learning to solve supervised and unsupervised learning problems.
- Understand and design solutions for complex medical image analysis problems using deep learning, object detection, and image segmentation.
- Recommend and implement best practices around the application of statistical modeling.
- Create, train, test, and deploy various neural networks to solve complex problems.
- Develop and implement solutions to fit business problems, which may include applying algorithms from a standard statistical tool, deep learning, or custom algorithm development.
- Understand the requirements and design solutions and architecture in accordance with them.
- Participate in code reviews, sprint planning, and Agile ceremonies to drive high-quality deliverables.
- Design and implement scalable data science architectures for training, inference, and deployment pipelines.
- Ensure code quality, readability, and maintainability by enforcing software engineering best practices within the data science team.
- Optimize models for production, including quantization, pruning, and latency reduction for real-time inference.
- Drive the adoption of versioning strategies for models, datasets, and experiments (e.g., using MLflow, DVC).
- Contribute to the architectural design of data platforms to support large-scale experimentation and production workloads.

Skills and Qualifications
- Strong software engineering skills in Python (or other languages used in data science) with an emphasis on clean code, modularity, and testability.
- Excellent understanding of, and hands-on experience with, deep learning techniques such as ANNs, CNNs, RNNs, LSTMs, Transformers, VAEs, etc.
- Must have experience with the TensorFlow or PyTorch framework in building, training, testing, and deploying neural networks.
- Experience solving problems in the domain of computer vision.
- Knowledge of data, data augmentation, data curation, and synthetic data generation.
- Ability to understand the complete problem and design solutions that best fit all constraints.
- Knowledge of common data science and deep learning libraries and toolkits such as Keras, Pandas, Scikit-learn, NumPy, SciPy, OpenCV, etc.
- Good applied statistical skills, such as distributions, statistical testing, regression, etc.
- Exposure to Agile/Scrum methodologies and collaborative development practices.
- Experience with the development of RESTful APIs; knowledge of libraries like FastAPI and the ability to apply them to deep learning architectures is essential.
- Excellent analytical and problem-solving skills, a positive attitude, and keenness to adapt to evolving technologies.
- Experience with medical image analysis will be an advantage.
- Experience designing and building ML architecture components (e.g., feature stores, model registries, inference servers).
- Solid understanding of software design patterns, microservices, and cloud-native architectures.
- Expertise in model optimization techniques (e.g., ONNX conversion, TensorRT, model distillation).
- Familiarity with Triton.

Education: BE/B.Tech; MS/M.Tech will be a bonus
Experience: 3-5 years
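As a hedged illustration of the model-optimization requirement above (ONNX conversion), this sketch exports a small PyTorch CNN to ONNX for inference runtimes such as ONNX Runtime or Triton; the architecture, input size, and file name are placeholders, not the team's actual model.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Placeholder convolutional model standing in for a real medical-imaging network."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 1, kernel_size=1),  # per-pixel logits
        )

    def forward(self, x):
        return self.features(x)

model = TinySegNet().eval()
dummy_input = torch.randn(1, 1, 256, 256)  # assumed single-channel 256x256 scan

# Export a traced graph; downstream tools can then quantize or optimize it further.
torch.onnx.export(
    model,
    dummy_input,
    "tiny_segnet.onnx",
    input_names=["image"],
    output_names=["logits"],
    dynamic_axes={"image": {0: "batch"}, "logits": {0: "batch"}},
    opset_version=17,
)
print("Exported tiny_segnet.onnx")
```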

Posted 4 days ago

Apply

Exploring Scipy Jobs in India

Scipy is a popular Python library for scientific computing, and proficiency in it opens up a wide range of opportunities for job seekers in India. With the increasing demand for data science and machine learning professionals, there is a growing need for individuals skilled in using Scipy.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for Scipy professionals in India varies from ₹4,00,000 per annum for entry-level positions to ₹15,00,000 per annum for experienced professionals.

Career Path

In the field of Scipy, a typical career path may include roles such as Junior Developer, Data Scientist, Senior Developer, Machine Learning Engineer, and eventually progressing to a Tech Lead or Data Science Manager.

Related Skills

Alongside Scipy, employers often expect candidates to have proficiency in the following skills:

  • Python programming
  • Pandas
  • NumPy
  • Machine learning algorithms
  • Data visualization tools (e.g., Matplotlib, Seaborn)

Interview Questions

  • What is Scipy and how is it different from NumPy? (basic)
  • Explain the difference between machine learning and deep learning. (medium)
  • How would you handle missing data in a dataset using Scipy? (medium)
  • Describe a project where you used Scipy to solve a complex problem. (advanced)
  • What are the advantages of using Scipy over other libraries for scientific computing? (medium)
  • Explain the concept of optimization in Scipy. (advanced)
  • How would you assess the performance of a machine learning model using Scipy? (medium)
  • What is the use of the SciPy stack in Python? (basic)
  • How does Scipy support linear algebra operations? (medium)
  • Describe a scenario where you used Scipy to optimize a machine learning model. (advanced)
  • How would you handle outliers in a dataset using Scipy? (medium)
  • Explain the concept of statistical functions in Scipy. (basic)
  • What are the different modules available in Scipy? (basic)
  • How does Scipy support sparse matrix operations? (advanced)
  • Describe a scenario where you used Scipy for feature engineering in a machine learning project. (medium)
  • What is the significance of integration and differentiation functions in Scipy? (medium)
  • How can you perform interpolation using Scipy? (medium)
  • Explain the concept of signal processing in Scipy. (medium)
  • What are the advantages of using Scipy for scientific computing tasks? (basic)
  • How does Scipy support numerical optimization techniques? (advanced)
  • Describe a scenario where you used Scipy to solve a complex linear algebra problem. (advanced)
  • What are the different types of clustering algorithms available in Scipy? (medium)
  • How would you handle multi-dimensional arrays in Scipy? (medium)
  • Explain the concept of image processing in Scipy. (medium)
  • What are the different methods available for solving differential equations in Scipy? (advanced)
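For the optimization and interpolation questions above, a minimal sketch with scipy.optimize and scipy.interpolate is often enough to anchor an answer; the objective function and sample points below are illustrative choices, not part of any specific interview.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.interpolate import interp1d

# Optimization: minimize a smooth 2-D objective (illustrative Rosenbrock-style function).
def objective(v):
    x, y = v
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

result = minimize(objective, x0=[0.0, 0.0], method="BFGS")
print("minimum found at:", result.x)  # converges near (1, 1)

# Interpolation: build a callable from sparse samples and evaluate between them.
x = np.linspace(0, 10, 11)
y = np.sin(x)
f_cubic = interp1d(x, y, kind="cubic")
print("sin(2.5) ≈", float(f_cubic(2.5)))
```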

Closing Remark

As you prepare for Scipy job interviews in India, make sure to brush up on your technical skills and showcase your expertise confidently. With the right skills and preparation, you can land a rewarding job in the field of Scipy and contribute to the rapidly growing industry of data science and machine learning. Good luck!
