5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Summary The Senior Software Engineering role within the Advanced Technology Organization (ATO) will focus on software development for AI/ML research and innovation projects. This position will utilize standard development methodologies, best practices, tools and proven processes to lead teams to deliver a quality software proof of concept on budget, within planned schedule. The position is responsible for driving the design and development efforts related to architecture, scalability, availability and performance in alignment with the innovation roadmap Job Description Roles and Responsibilities Subject matter expert in AI/ML methodologies & MLOps using Azure/AWS for Grid Control systems with a proven track record. Design, build, and deploy machine learning models for regression, classification, and other predictive analytics tasks using Python. Work on the development, fine-tuning, and deployment of large language models for various NLP tasks such as text classification and language understanding. Clean, preprocess, and transform data to make it suitable for machine learning models, including handling missing data, feature selection, and data augmentation. Evaluate models using appropriate metrics and fine-tune them for better accuracy, performance, and scalability. Collaborate with cross-functional teams including power system engineers, software engineers, and business stakeholders to understand requirements and deliver AI-driven solutions. Experience in agile methodologies such as SCRUM. Work with internal teams and customers to understand requirements & vision. Drive increased efficiency across the teams, eliminating duplication, leveraging product and technology reuse. Navigate through ambiguity and prioritizing conflicting asks & tasks. Expert level skills in design, architecture, and development, with an ability to take a deep dive in the implementation aspects if the situation demands. Be an expert in core data structures as well as algorithms and the ability to implement them. Education Qualification For Roles Outside USA Master’s degree in Computer Science/Data Analytics or “STEM” Majors (Science, Technology, Engineering and Math) with minimum 5 years of experience Desired Characteristics Technical Expertise Quantifies effectiveness of design choices by gathering data. Drives accountability and adoption. Publishes guidance and documentation to promote adoption of design. Knowledgeable about AI/ML and MLOps development for the electric utility industry. Proven experience in developing and deploying machine learning models, specifically in regression and classification tasks. Experience working with large language models (LLMs) like GPT, BERT, et Proficient in Python and its machine learning libraries such as TensorFlow, PyTorch, scikit-learn, pandas, and NumPy. Strong understanding of regression techniques and classification algorithms . Hands-on experience in implementing and deploying ML models using MLOps practices. Experience with natural language processing (NLP) techniques and tools for working with large language models Leadership Influences through others; builds direct and "behind the scenes" support for ideas. Pre-emptively sees downstream consequences and effectively tailors influencing strategy to support a positive outcome. Uses experts or other third parties to influence. Able to verbalize what is behind decisions and downstream implications. Continuously reflecting on success and failures to improve performance and decision-making. 
Understands when change is needed. Participates in technical strategy planning. Proactively identifies and removes project obstacles or barriers on behalf of the team. Able to navigate accountability in a matrixed organization. Communicates and demonstrates a shared sense of purpose. Learns from failure. Personal Attributes: Able to effectively direct and mentor others in critical thinking skills. Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking, analysis skills, and best practices. Finds important patterns in seemingly unrelated information. Influences and energizes others toward the common vision and goal. Maintains excitement for a process and drives toward new ways of meeting the goal even when odds and setbacks render one path impassable. Innovates and integrates new processes and/or technology to significantly add value to GE. Identifies how the cost of change weighs against the benefits and advises accordingly. Additional Information: Relocation Assistance Provided: Yes
Posted 1 week ago
2.0 years
0 Lacs
Andhra Pradesh, India
On-site
The Data & Analytics team is responsible for integrating new data sources, creating data models, developing data dictionaries, and building machine learning models for Wholesale Bank. The primary objective is to design and deliver data products that assist squads at Wholesale Bank in achieving business outcomes and generating valuable business insights. Within this job family, we distinguish between Data Analysts and Data Scientists. Requirements: We are seeking a highly skilled Data Science and Machine Learning specialist with 2+ years of experience in Advanced Analytics and statistical and ML model development. In this role, candidates will be responsible for leveraging data-driven insights and machine learning techniques to solve complex business problems, optimize processes, and drive innovation. The ideal candidate will be skilled in working with large datasets. Key Responsibilities: Extract and analyze data from company databases to drive the optimization and enhancement of product development and marketing strategies. Analyze large datasets to uncover trends, patterns, and insights that can influence business decisions. Leverage predictive and AI/ML modeling techniques to enhance and optimize customer experience, boost revenue generation, improve ad targeting, and more. Design, implement, and optimize machine learning models for a wide range of applications such as predictive analytics, natural language processing, recommendation systems, and more. Conduct experiments to fine-tune machine learning models and evaluate their performance using appropriate metrics. Qualifications: Bachelor's, Master's, or Ph.D. in Computer Science, Data Science, Mathematics, Statistics, or a related field. 2+ years of experience in analytics, machine learning, and deep learning. Proficiency in programming languages such as Python, and familiarity with machine learning libraries (e.g., NumPy, Pandas, TensorFlow, Keras, PyTorch, Scikit-learn). Strong experience with data wrangling, cleaning, and transforming raw data into structured, usable formats. Hands-on experience in developing, training, and deploying machine learning models for various applications (e.g., predictive analytics, recommendation systems, anomaly detection). In-depth understanding of machine learning algorithms (supervised, unsupervised, reinforcement learning) and their appropriate use cases. Good to Have:
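As a rough illustration of the model-building and evaluation work this posting describes, here is a minimal scikit-learn sketch; the synthetic dataset and the choice of metrics are assumptions for the example, not details of the role.

```python
# Minimal sketch: train and evaluate a classification model with scikit-learn.
# The synthetic dataset and metric choices are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for data extracted from company databases
X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with appropriate metrics before considering deployment
pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]
print(f"accuracy: {accuracy_score(y_test, pred):.3f}")
print(f"roc_auc:  {roc_auc_score(y_test, proba):.3f}")
```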
Posted 1 week ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Company Description BRAINWONDERS About Company: At Brainwonders, we are proud to be India’s largest career counselling and guidance company, recognized for our commitment to transforming students' futures. With 1223+ educational institutes using our services, 93+ corporate connections, and 108+ franchisees, we have built an expansive network of support for students, educators, and professionals. Brainwonders has earned numerous national and regional awards for excellence in career counselling and guidance, and is consistently rated as one of the highest-paying employers in the counselling industry by various job portals. Job Description AI Engineer Intern – LLMs, Agents, AWS Position: AI Engineer Intern Experience: 0-1 Year (Final-year students or freshers welcome) Location: Borivali East, Mumbai Type: Internship (3–6 months) | Possibility of Full-time Offer Focus Areas: Large Language Models (LLMs) AI Agents / Autonomous Systems ML Fundamentals & Transformers Cloud Deployment & Scaling (AWS) Responsibilities: Build and test LLM-based applications and workflows. Experiment with agent frameworks (e.g., LangGraph, CrewAI). Use and adapt transformer models (e.g., LLaMA, Mistral) for domain tasks. Deploy models and services on AWS (Lambda, EC2, SageMaker). Work with vector stores and embeddings for search or RAG-based systems. Collaborate on prompt engineering, fine-tuning, and pipeline optimization. ✅ Requirements: Solid understanding of LLMs, tokenization, transformers. Python programming (Pandas, NumPy, Hugging Face, LangChain, etc.). Exposure to AWS services like EC2, Lambda, S3, or SageMaker. Eagerness to explore real-world AI use cases (agents, copilots, RAG). Basic ML knowledge and hands-on project experience. Good to know: Experience with serverless deployments on AWS. Familiarity with monitoring and logging (e.g., CloudWatch, OpenTelemetry). Contributions to open-source AI tools or models Additional Information Show more Show less
Posted 1 week ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description We are seeking a dynamic and energetic individual with strong Python development experience to join our Research Modernization team. Job Summary As a Python developer within our team, you will bring your dynamic energy, strong attention to detail, and ability to multitask to our Research Modernization team within Global Research. In this role, you will have the opportunity to be highly organized, proactive, and flexible. Your keen attention to detail, strong communication skills, and sense of urgency will be highly valued. Job Responsibilities Build, deploy, and maintain business processes and web crawlers using Python. Work with Research Analysts to understand their data needs and translate their requirements into data/information solutions. Utilize Python for data handling and manipulation to prepare and transform data into a usable format for Research Analysts. Create, productionize, and store enhanced and original datasets; improve data analysis and discovery tools; model the data with a focus on ease of storage, retrieval, and usability. Develop scripts and tools to automate repetitive tasks, improve efficiency, and streamline workflows. Monitor and troubleshoot process performance and reliability issues in a timely manner. Identify opportunities for process improvement and develop policies for data access and governance. Ensure data quality and integrity through effective validation and testing processes. Automate publication and analytics workflows for Research Analysts. Research and analyze datasets using a variety of analytical techniques in Python. Required Qualifications, Capabilities, And Skills Bachelor’s degree in engineering, computer science, information technology, or other data science/analytics fields. Minimum 1 year of experience in data analysis, automation, and web crawling using Python. Strong knowledge of Python and relevant packages for data handling and manipulation. It is essential that the candidate has hands-on experience in Python programming and in-depth knowledge of libraries like Pandas and NumPy. Understanding of essential libraries for web scraping, such as BeautifulSoup and Selenium. Strong background in data structures, algorithms, and data engineering, with experience in multiple data/technology platforms. Strong analytical and problem-solving skills. Excellent communication, presentation, interpersonal, and organizational skills. Detail-oriented, with a highly disciplined approach to processes and controls. Preferred Qualifications, Capabilities, And Skills Demonstrable interest in finance and research. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. 
We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world. Show more Show less
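For context on the web-crawling and data-handling work the posting above describes, here is a minimal sketch using requests, BeautifulSoup, and pandas; the URL and CSS selectors are hypothetical placeholders, not a real data source.

```python
# Minimal sketch: fetch a page, parse it with BeautifulSoup, and load the
# results into a pandas DataFrame. The URL and selectors are hypothetical.
import pandas as pd
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/research/reports"  # placeholder, not a real endpoint

response = requests.get(URL, timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
rows = []
for item in soup.select("div.report"):          # hypothetical selector
    title = item.select_one("h2")
    date = item.select_one("span.date")
    rows.append({
        "title": title.get_text(strip=True) if title else None,
        "date": date.get_text(strip=True) if date else None,
    })

df = pd.DataFrame(rows)
df["date"] = pd.to_datetime(df["date"], errors="coerce")  # basic validation step
print(df.head())
```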
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are seeking a highly skilled and motivated Lead DS/ML engineer to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. We are seeking a highly skilled Data Scientist / ML Engineer with a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning to develop and deploy sophisticated models. The role focuses on building scalable data pipelines, developing ML models, and deploying solutions in production to support a cutting-edge reporting, insights, and recommendations platform for measuring and optimizing online marketing campaigns. The ideal candidate should be comfortable working across data engineering, ML model lifecycle, and cloud-native technologies. Job Description: Key Responsibilities: Data Engineering & Pipeline Development Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data. Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect. Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads. Implement data modeling and feature engineering for ML use cases. Machine Learning Model Development & Validation Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations. Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis. Optimize models for scalability, efficiency, and interpretability. MLOps & Model Deployment Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving. Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining. Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms. Cloud & Infrastructure Optimization Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow. Work on containerized deployment (Docker, Kubernetes) for scalable model inference. Implement cost-efficient, serverless data solutions where applicable. Business Impact & Cross-functional Collaboration Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translate complex model insights into actionable business recommendations. Present findings and performance metrics to both technical and non-technical stakeholders. Qualifications & Skills: Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field. Certifications in Google Cloud (Professional Data Engineer, ML Engineer) is a plus. Must-Have Skills: Experience: 5-10 years with the mentioned skillset & relevant hands-on experience Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer). ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP. 
Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing. Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms. MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools). Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing. Nice-to-Have Skills: Experience with Graph ML, reinforcement learning, or causal inference modeling. Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards. Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies. Experience with distributed computing frameworks (Spark, Dask, Ray). Location: Bengaluru Brand: Merkle Time Type: Full time Contract Type: Permanent Show more Show less
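As an illustration of the ELT orchestration mentioned above, here is a minimal Apache Airflow (2.4+ style) DAG sketch; the task names, task contents, and daily schedule are assumptions for the example, not details from the posting.

```python
# Minimal sketch of an ELT DAG in Apache Airflow 2.4+. Task contents, names,
# and the daily schedule are illustrative assumptions, not a real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw campaign data from the source systems")


def transform():
    print("clean, join, and feature-engineer the raw data")


def load():
    print("write modeled tables to the warehouse (e.g. BigQuery)")


with DAG(
    dag_id="marketing_campaign_elt_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```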
Posted 1 week ago
20.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Staff AI Engineer MLOps About The Team The AI Center of Excellence team includes Data Scientists and AI Engineers that work together to conduct research, build prototypes, design features and build production AI components and systems. Our mission is to leverage the best available technology to protect our customers' attack surfaces. We partner closely with Detection and Response teams, including our MDR service, to leverage AI/ML for enhanced customer security and threat detection. We operate with a creative, iterative approach, building on 20+ years of threat analysis and a growing patent portfolio. We foster a collaborative environment, sharing knowledge, developing internal learning, and encouraging research publication. If you’re passionate about AI and want to make a major impact in a fast-paced, innovative environment, this is your opportunity. The Technologies We Use Include AWS for hosting our research environments, data, and features (i.e. Sagemaker, Bedrock) EKS to deploy applications Terraform to manage infrastructure Python for analysis and modeling, taking advantage of numpy and pandas for data wrangling. Jupyter notebooks (locally and remotely hosted) as a computational environment Sci-kit learn for building machine learning models Anomaly detection methods to make sense of unlabeled data About The Role Rapid7 is seeking a Staff AI Engineer to join our team as we expand and evolve our growing AI and MLOps efforts. You should have a strong foundation in software engineering, and MLOps and DevOps systems and tools. Further, you’ll have a demonstrated track record of taking models created in the AI R&D process to production with repeatable deployment, monitoring and observability patterns. In this intersectional role, you will combine your expertise in AI/ML deployments, cloud systems and software engineering to enhance our product offerings and streamline our platform's functionalities. 
In This Role, You Will Design and build ML production systems, including project scoping, data requirements, modeling strategies, and deployment Develop and maintain data pipelines, manage the data lifecycle, and ensure data quality and consistency throughout Assure robust implementation of ML guardrails and manage all aspects of service monitoring Develop and deploy accessible endpoints, including web applications and REST APIs, while maintaining steadfast data privacy and adherence to security best practices and regulations Share expertise and knowledge consistently with internal and external stakeholders, nurturing a collaborative environment and fostering the development of junior engineers Embrace agile development practices, valuing constant iteration, improvement, and effective problem-solving in complex and ambiguous scenarios The Skills You’ll Bring Include 8-12 years experience as a Software Engineer, with at least 3 years focused on gaining expertise in ML deployment (especially in AWS) Solid technical experience in the following is required: Software engineering: developing APIs with Flask or FastAPI, paired with strong Python knowledge DevOps and MLOps: Designing and integrating scalable AI/ML systems into production environments, CI/CD tooling, Docker, Kubernetes, cloud AI resource utilization and management Pipelines, monitoring, and observability: Data pre-processing and feature engineering, model monitoring and evaluation A growth mindset - welcoming the challenge of tackling complex problems with a bias for action Strong written and verbal communication skills - able to effectively communicate technical concepts to diverse audiences and creating clear documentation of system architectures and implementation details Proven ability to collaborate effectively across engineering, data science, product, and other teams to drive successful MLOps initiatives and ensure alignment on goals and deliverables. Experience With The Following Would Be Advantageous Experience with Java programming Experience in the security industry AI and ML models, understanding their operational frameworks and limitations Familiarity with resources that enable data scientists to fine tune and experiment with LLMs Knowledge of or experience with model risk management strategies, including model registries, concept/covariate drift monitoring, and hyperparameter tuning We know that the best ideas and solutions come from multi-dimensional teams. That’s because these teams reflect a variety of backgrounds and professional experiences. If you are excited about this role and feel your experience can make an impact, please don’t be shy - apply today. About Rapid7 At Rapid7, we are on a mission to create a secure digital world for our customers, our industry, and our communities. We do this by embracing tenacity, passion, and collaboration to challenge what’s possible and drive extraordinary impact. Here, we’re building a dynamic workplace where everyone can have the career experience of a lifetime. We challenge ourselves to grow to our full potential. We learn from our missteps and celebrate our victories. We come to work every day to push boundaries in cybersecurity and keep our 10,000 global customers ahead of whatever’s next. Join us and bring your unique experiences and perspectives to tackle some of the world’s biggest security challenges. Show more Show less
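To illustrate the kind of model-serving REST endpoint this role mentions, here is a minimal FastAPI sketch; the "model" and the feature schema are placeholders for the example, not anything specific to the company.

```python
# Minimal sketch of a model-serving REST endpoint with FastAPI. The "model"
# is a placeholder function standing in for a real artifact loaded at startup.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="inference-sketch")


class Features(BaseModel):
    values: list[float]  # hypothetical flat feature vector


def model_predict(values: list[float]) -> float:
    # Placeholder for a real model loaded from a registry/artifact store
    return float(sum(values) > 0)


@app.post("/predict")
def predict(features: Features) -> dict:
    return {"prediction": model_predict(features.values)}

# Run locally with:  uvicorn main:app --reload   (assuming this file is main.py)
```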
Posted 1 week ago
2.0 years
0 Lacs
India
On-site
About us: GUVI (Grab Your Vernacular Imprint) Geek Network Private Limited is an ed-tech company incubated by IIT-Madras and IIM-Ahmedabad, and partnered with Google for Education. What sets us apart is that we offer online learning in a plethora of vernacular languages along with English. With more than 15 lakh users currently learning from our platform, GUVI continues to grow at a tremendous rate. For more information about us, visit www.guvi.in. Responsibilities: 1. Instant query handling. 2. Handling LIVE sessions. 3. Providing weekly assignments, assessments, and tasks to learners and evaluating them on a regular basis. 4. Creating individual real-time mini projects that involve learners in hands-on work and helping them update their profiles. 5. Tech support, advice, and tips for interview preparation. 6. Building good rapport with learners and helping them with a smooth learning process in all possible ways. Skills: Python programming, Pandas, probability and statistics with NumPy, MongoDB, data visualisation in Python (Matplotlib, Seaborn), data analysis on image and text data, machine learning, NLP. Experience: 2+ years of experience as a full-time Data Scientist is mandatory. Job Type: Part-time
Posted 1 week ago
0 years
0 Lacs
India
Remote
Artificial Intelligence & Machine Learning Intern 📍 Location: Remote (100% Virtual) 📅 Duration: 3 Months 💸 Stipend for Top Interns: ₹15,000 🎁 Perks: Certificate | Letter of Recommendation | Full-Time Offer (Based on Performance) About INLIGHN TECH INLIGHN TECH is a forward-thinking edtech startup offering project-driven virtual internships that prepare students for today’s competitive tech landscape. Our AI & ML Internship is designed to immerse you in real-world applications of machine learning and artificial intelligence, helping you develop job-ready skills through hands-on projects. 🚀 Internship Overview As an AI & ML Intern , you will explore machine learning algorithms, build predictive models, and work on projects that mimic real-world use cases—ranging from recommendation systems to AI-based automation tools. You’ll gain experience with Python, Scikit-learn, TensorFlow , and more. 🔧 Key Responsibilities Work on datasets to clean, preprocess, and prepare for model training Implement machine learning algorithms (regression, classification, clustering, etc.) Build and test models using Scikit-learn, TensorFlow, Keras , or PyTorch Analyze model performance and optimize using evaluation metrics Collaborate with mentors to develop AI solutions for business or academic use cases Present findings and document all steps of the model-building process ✅ Qualifications Pursuing or recently completed a degree in Computer Science, Data Science, AI/ML, or related fields Proficient in Python and familiar with data science libraries (NumPy, Pandas, Matplotlib) Basic understanding of machine learning concepts and algorithms Experience with tools like Jupyter Notebook , Google Colab , or similar platforms Strong analytical mindset and interest in solving real-world problems using AI Enthusiastic about learning and exploring new technologies 🎓 What You’ll Gain Hands-on experience with real-world AI and ML projects Exposure to end-to-end model development workflows A strong project portfolio to showcase your skills Internship Certificate upon successful completion Letter of Recommendation for high-performing interns Opportunity for a Full-Time Offer based on performance Show more Show less
Posted 1 week ago
0 years
0 Lacs
India
Remote
Data Science Intern 📍 Location: Remote (100% Virtual) 📅 Duration: 3 Months 💸 Stipend for Top Interns: ₹15,000 🎁 Perks: Certificate | Letter of Recommendation | Full-Time Offer (Based on Performance) About INLIGHN TECH INLIGHN TECH offers hands-on, project-based virtual internships that bridge the gap between academic knowledge and industry skills. Our Data Science Internship is curated to provide aspiring data professionals with real-world exposure in data collection, analysis, modeling, and decision-making using cutting-edge tools. 🚀 Internship Overview As a Data Science Intern , you’ll work on real-time datasets , apply machine learning models , and generate actionable insights . The role is ideal for individuals looking to strengthen their understanding of data pipelines, predictive modeling, and storytelling with data . 🔧 Key Responsibilities Clean, manipulate, and analyze structured and unstructured datasets Build and evaluate machine learning models for prediction/classification Apply statistical techniques to uncover trends and insights Work with tools such as Python, Pandas, NumPy, Scikit-learn, and Jupyter Notebooks Create visualizations using Matplotlib, Seaborn, or Power BI/Tableau Collaborate with mentors and peers to solve data-driven problems Document code, findings, and processes in clear and concise reports ✅ Qualifications Pursuing or recently completed a degree in Data Science, Computer Science, Statistics, Engineering , or related fields Proficient in Python and familiar with Pandas, NumPy, and basic ML libraries Strong foundation in statistics, probability , and data visualization Understanding of supervised and unsupervised learning algorithms Eager to learn, experiment, and solve complex problems using data 🎓 What You’ll Gain Practical experience with real-world datasets and projects Exposure to machine learning workflows and tools Internship Certificate upon successful completion Letter of Recommendation for top performers Opportunity for a Full-Time Offer based on performance A portfolio of data science projects to showcase in interviews Show more Show less
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Job Title: Data Scientist – Computer Vision & Generative AI Location: Mumbai Experience Level: 3 to 4 years Employment Type: Full-time Industry: Renewable Energy / Solar Services Job Overview: We are seeking a talented and motivated Data Scientist with a strong focus on computer vision, generative AI, and machine learning to join our growing team in the solar services sector. You will play a pivotal role in building AI-driven solutions that transform how solar infrastructure is analyzed, monitored, and optimized using image-based intelligence. From drone and satellite imagery to on-ground inspection photos, your work will enable intelligent automation, predictive analytics, and visual understanding in critical areas like fault detection, panel degradation, site monitoring, and more. If you're passionate about working at the cutting edge of AI for real-world sustainability impact, we’d love to hear from you. Key Responsibilities: Design, develop, and deploy computer vision models for tasks such as object detection, classification, segmentation, anomaly detection, etc. Work with generative AI techniques (e.g., GANs, diffusion models) to simulate environmental conditions, enhance datasets, or create synthetic training data. Build ML pipelines for end-to-end model training, validation, and deployment using Python and modern ML frameworks. Analyze drone, satellite, and on-site images to extract meaningful insights for solar panel performance, wear-and-tear detection, and layout optimization. Collaborate with cross-functional teams (engineering, field ops, product) to understand business needs and translate them into scalable AI solutions. Continuously experiment with the latest models, frameworks, and techniques to improve model performance and robustness. Optimize image pipelines for performance, scalability, and edge/cloud deployment. Key Requirements: 3–4 years of hands-on experience in data science, with a strong portfolio of computer vision and ML projects. Proven expertise in Python and common data science libraries: NumPy, Pandas, Scikit-learn, etc. Proficiency with image-based AI frameworks: OpenCV , PyTorch or TensorFlow , Detectron2 , YOLOv5/v8 , MMDetection , etc. Experience with generative AI models like GANs , Stable Diffusion , or ControlNet for image generation or augmentation. Experience building and deploying ML models using MLflow , TorchServe , or TensorFlow Serving . Familiarity with image annotation tools (e.g., CVAT, Labelbox), and data versioning tools (e.g., DVC). Experience with cloud platforms ( AWS , GCP , or Azure ) for storage, training, or model deployment. Experience with Docker , Git , and CI/CD pipelines for reproducible ML workflows. Ability to write clean, modular code and a solid understanding of software engineering best practices in AI/ML projects. Strong problem-solving skills, curiosity, and ability to work independently in a fast-paced environment. Bonus / Preferred Skills: Experience with remote sensing and working with satellite or drone imagery. Exposure to MLOps practices and tools like Kubeflow , Airflow , or SageMaker Pipelines . Knowledge of solar technologies, photovoltaic systems, or renewable energy is a plus. Familiarity with edge computing for vision applications on IoT devices or drones. Application Instructions: Please submit your resume, portfolio (GitHub, blog, or project links), and a short cover letter explaining why you’re interested in this role to khushboo.b@solarsquare.in or sidhant.c@solarsquare.in Show more Show less
Posted 1 week ago
0.0 - 1.0 years
0 Lacs
Rajarajeshwari Nagar, Bengaluru, Karnataka
On-site
Key Responsibilities: Work on bug fixing , code improvements , and minor feature enhancements Write unit and integration tests for backend services and Python scripts Collaborate with senior developers to learn best practices in coding, debugging, and testing Understand feature requirements and contribute to internal documentation Participate in team scrums and code reviews Learn about cloud platforms (AWS) and AI/ML workflows as part of Raaka’s projects Required Skills: Bachelor's degree (BTech/BE) in Computer Science, IT, or related field Good foundation in Python programming , with basic understanding of OOP Familiarity with writing simple scripts, handling files, and using standard Python libraries Interest or basic knowledge in AI/ML concepts (e.g., sklearn, pandas, NumPy) Strong analytical aptitude and problem-solving ability Good written and spoken English communication Willingness to learn and adapt in a fast-paced, collaborative environment Job Type: Full-time Pay: ₹300,000.00 - ₹400,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Supplemental Pay: Performance bonus Ability to commute/relocate: Rajarajeshwari Nagar, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required) Application Question(s): What is your current LPA and expected LPA? What other languages/framework you know? Experience: Python: 1 year (Required) Work Location: In person Application Deadline: 14/06/2025
Posted 1 week ago
2.0 - 6.0 years
60 - 72 Lacs
Ahmedabad
Work from Office
Responsibilities: Collaborate with cross-functional teams on project requirements & deliverables. Design, develop, test & maintain Python apps using Django/Flask & AWS/Azure Cloud.
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities: Develop and maintain web applications using Python, Django, and Flask. Write clean, efficient, and reusable code. Design and implement RESTful APIs. Assess whether a given problem can be addressed using advanced data analytics and/or various ML techniques. Work on ideas for data collection. Clean and structure the data by applying feature-engineering techniques. Train models for designing, developing, and implementing POCs. Requirements: Should be willing to work as an individual contributor when necessary. Extensive programming experience in Python. Familiarity with various machine learning techniques and libraries, with a strong preference for programming experience with the NumPy, Pandas, and Matplotlib libraries. Knowledge of basic statistical methods and algorithms such as Random Forest, Linear Regression, and Logistic Regression. About Company: We are a data analytics, IoT, and digital services provider with offices in Bangalore and Pune. We are passionate about applying technology for better business decisions.
Posted 1 week ago
3.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Job Title: Data Scientist – OpenCV Experience: 2–3 Years Location: Bangalore Notice Period: Immediate Joiners Only Job Overview We are looking for a passionate and driven Data Scientist with a strong foundation in computer vision, image processing, and OpenCV. This role is ideal for professionals with 2–3 years of experience who are excited about working on real-world visual data problems and eager to contribute to impactful projects in a collaborative environment. Key Responsibilities Develop and implement computer vision solutions using OpenCV and Python. Work on tasks including object detection, recognition, tracking, and image/video enhancement. Clean, preprocess, and analyze large image and video datasets to extract actionable insights. Collaborate with senior data scientists and engineers to deploy models into production pipelines. Contribute to research and proof-of-concept projects in the field of computer vision and machine learning. Prepare clear documentation for models, experiments, and technical processes. Required Skills Proficient in OpenCV and image/video processing techniques. Strong coding skills in Python, with familiarity in libraries such as NumPy, Pandas, Matplotlib. Solid understanding of basic machine learning and deep learning concepts. Hands-on experience with Jupyter Notebooks; exposure to TensorFlow or PyTorch is a plus. Excellent analytical, problem-solving, and debugging skills. Effective communication and collaboration abilities. Preferred Qualifications Bachelor’s degree in computer science, Data Science, Electrical Engineering, or a related field. Practical exposure through internships or academic projects in computer vision or image analysis. Familiarity with cloud platforms (AWS, GCP, Azure) is an added advantage. What We Offer A dynamic and innovation-driven work culture. Guidance and mentorship from experienced data science professionals. The chance to work on impactful, cutting-edge projects in computer vision. Competitive compensation and employee benefits. Show more Show less
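As a small illustration of the classical OpenCV work this posting describes, here is a minimal detection sketch using thresholding and contour extraction; the input file name and the contour-area cut-off are assumptions for the example.

```python
# Minimal sketch: classical object detection with OpenCV via Otsu thresholding
# and contour extraction. The input path and area threshold are assumptions.
import cv2

image = cv2.imread("sample.jpg")            # hypothetical input image
if image is None:
    raise SystemExit("sample.jpg not found")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)

# Otsu thresholding separates foreground objects from the background
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for contour in contours:
    if cv2.contourArea(contour) < 500:      # ignore tiny specks (arbitrary cut-off)
        continue
    x, y, w, h = cv2.boundingRect(contour)
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("detections.jpg", image)
```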
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities: Develop and maintain web applications using Python, Django, and Flask. Write clean, efficient, and reusable code. Design and implement RESTful APIs. Assess whether a given problem can be addressed using advanced data analytics and/or various ML techniques. Work on ideas for data collection. Clean and structure the data by applying feature-engineering techniques. Train models for designing, developing, and implementing POCs. Requirements: Should be willing to work as an individual contributor when necessary. Extensive programming experience in Python. Familiarity with various machine learning techniques and libraries, with a strong preference for programming experience with the NumPy, Pandas, and Matplotlib libraries. Knowledge of basic statistical methods and algorithms such as Random Forest, Linear Regression, and Logistic Regression. About Company: We are a data analytics, IoT, and digital services provider with offices in Bangalore and Pune. We are passionate about applying technology for better business decisions.
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title - S&C Global Network - AI – CMT - ML Architecture – Specialist Management Level: 9-Team Lead/Consultant Location: Bengaluru, BDC7C Must-have skills: Machine Learning Architecture Design Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills. Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. WHAT’S IN IT FOR YOU? Accenture has committed to invest USD 3Billion into GenAI in the next 3 years. We will continually invest in your learning and growth. You'll work with Accenture’s highly skilled and experienced practitioners, and Accenture will support you in growing your own career path and interests. You’ll be part of a diverse, vibrant, global Accenture Data and AI community, continually pushing the boundaries of business capabilities. What You Would Do In This Role ML Maturity Assessment: Conduct comprehensive assessments of the organization's ML maturity, identifying strengths, weaknesses, and areas for improvement. Provide strategic recommendations to enhance the overall ML capability and align it with business objectives. ML Ops Roadmap & Processes: Develop ML Ops roadmaps and establish robust processes for the end-to-end machine learning lifecycle, including data preparation, model training, deployment, and monitoring. Implement best practices in ML Ops to ensure efficiency, scalability, and reliability of ML systems. Implementation of Gen AI Solutions: Lead the design and implementation of state-of-the-art Generative AI solutions, leveraging deep learning frameworks such as TensorFlow and PyTorch. Drive innovation in Gen AI, staying abreast of the latest advancements and incorporating cutting-edge technologies into solutions. Incubating and Designing: Proactively identify opportunities for ML and / or Gen AI applications within the organization. Work closely with cross-functional teams to incubate and design bespoke ML solutions tailored to business requirements. Technical Leadership: Provide technical leadership and mentorship to data scientists, engineers, and other team members. Collaborate with stakeholders to ensure alignment between technical solutions and business objectives. Collaboration and Communication: Collaborate with business stakeholders to understand their needs and translate them into ML and Gen AI requirements. Effectively communicate complex technical concepts to non-technical audiences. Professional & Technical Skills: Relevant experience in the required domain. Strong analytical, problem-solving, and communication skills. Ability to work in a fast-paced, dynamic environment. Proven expertise in conducting ML maturity assessments and developing ML Ops roadmaps. Hands-on experience in operationalizing the Machine learning system on a cloud and / or an On-Prem platform. Experience in implementing Generative AI solutions, including incubation, design, and deployment will be a big plus. Proficiency in deep learning frameworks such as TensorFlow and PyTorch. Good knowledge of ML Ops best practices and processes. Excellent problem-solving skills and ability to design scalable and reliable ML architectures. Strong leadership and communication skills, with a track record of leading successful ML initiatives. 
Experience in the Telecom, Hi-Tech, or Software and Platform industries is desirable. Tools & Techniques: TensorFlow, PyTorch, Scikit-learn, Keras, NumPy, Pandas, Matplotlib, Seaborn, TensorFlow Serving, Docker, and Kubernetes. Good software engineering practices, including code modularization, documentation, and testing. Experience with open APIs, integration architecture, and microservices. Additional Information: Opportunity to work on innovative projects. Career growth and leadership exposure. About Our Company | Accenture Experience: 8-10 years Educational Qualification: Any Degree
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Marketing Measurement & Optimization Analyst Job Description: Qualifications: Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or a related field. Proven 0-2 years of experience in a similar role. Strong problem-solving skills. Excellent communication skills. Skills: Proficiency in R (tidyverse, plotly/ggplot2) or Python (pandas, numpy) for data manipulation and visualization, and SQL (joins, aggregation, analytics functions) for data handling. Ability to understand marketing data and perform statistical tests. Knowledge of data visualization tools such as Tableau or Power BI. Responsibilities: Familiarity with Media Mix Modelling and Multi-Touch Attribution. Knowledge of panel data and its analysis. Understanding of the data science workflow. Familiarity with marketing channels, performance and effectiveness metrics, and the conversion funnel. Work with large datasets, performing data QA and manipulation tasks such as joins/merges, aggregation and segregation, and appends. Location: Pune Brand: Merkle Time Type: Full time Contract Type: Permanent
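For illustration of the join and aggregation style of data QA listed above, here is a minimal pandas sketch; the column names and toy data are assumptions for the example.

```python
# Minimal sketch of the join/aggregation style of data QA described above,
# using pandas. Column names and the toy data are illustrative assumptions.
import pandas as pd

spend = pd.DataFrame({
    "campaign_id": [1, 1, 2, 2],
    "channel": ["search", "social", "search", "display"],
    "spend": [1000.0, 400.0, 750.0, 300.0],
})
conversions = pd.DataFrame({
    "campaign_id": [1, 1, 2, 2],
    "channel": ["search", "social", "search", "display"],
    "conversions": [30, 8, 12, 3],
})

# Join (merge) spend with conversions, then aggregate by channel
merged = spend.merge(conversions, on=["campaign_id", "channel"], how="left")
by_channel = merged.groupby("channel", as_index=False).agg(
    total_spend=("spend", "sum"),
    total_conversions=("conversions", "sum"),
)
by_channel["cost_per_conversion"] = (
    by_channel["total_spend"] / by_channel["total_conversions"]
)
print(by_channel)
```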
Posted 1 week ago
55.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same. Job Description Strong working experience in python based Django,Flask framework. Experience in developing microservices based design and architecture. Strong programming knowledge in Javascript,HTML5, Python, Restful API, gRPC API Programming experience & object-oriented concepts in PYTHON. Knowledge of python libraries like Numpy, Pandas, Ppen3D, OpenCV, Matplotlib. Knowledge of MySQL/Postgres/MSSQL database. Knowledge of 3D geometry. Knowledge of SSO/OpenID Connect/OAuth authentication protocols. Working experience with version control systems like GitHub/BitBucket/GitLab. Familiarity with continuous integration and continuous deployment (CI/CD) pipelines. Basic knowledge of image processing. Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications.1. Applies scientific methods to analyse and solve software engineering problems.2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance.3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.5. The software engineer collaborates and acts as team player with other software engineers and stakeholders. Job Description - Grade Specific Has more than a year of relevant work experience. Solid understanding of programming concepts, software design and software development principles. Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid term horizon. Works co-operatively with others to achieve team goals and has a direct and positive impact on project performance and make decisions based on their understanding of the situation, not just the rules. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem. Show more Show less
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Marketing Measurement & Optimization Analyst Job Description: Qualifications: Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or a related field. Proven 0-2 years of experience in a similar role. Strong problem-solving skills. Excellent communication skills. Skills: Proficiency in R (tidyverse, plotly/ggplot2) or Python (pandas, numpy) for data manipulation and visualization, and SQL (joins, aggregation, analytics functions) for data handling. Ability to understand marketing data and perform statistical tests. Knowledge of data visualization tools such as Tableau or Power BI. Responsibilities: Familiarity with Media Mix Modelling and Multi-Touch Attribution. Knowledge of panel data and its analysis. Understanding of the data science workflow. Familiarity with marketing channels, performance and effectiveness metrics, and the conversion funnel. Work with large datasets, performing data QA and manipulation tasks such as joins/merges, aggregation and segregation, and appends. Location: Pune Brand: Merkle Time Type: Full time Contract Type: Permanent
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Marketing Measurement & Optimization Analyst Job Description: Qualifications: Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or a related field. Proven 0-2 years of experience in a similar role. Strong problem-solving skills. Excellent communication skills. Skills: Proficiency in R (tidyverse, plotly/ggplot2) or Python (pandas, numpy) for data manipulation and visualization, and SQL (joins, aggregation, analytics functions) for data handling. Ability to understand marketing data and perform statistical tests. Knowledge of data visualization tools such as Tableau or Power BI. Responsibilities: Familiarity with Media Mix Modelling and Multi-Touch Attribution. Knowledge of panel data and its analysis. Understanding of the data science workflow. Familiarity with marketing channels, performance and effectiveness metrics, and the conversion funnel. Work with large datasets, performing data QA and manipulation tasks such as joins/merges, aggregation and segregation, and appends. Location: Pune Brand: Merkle Time Type: Full time Contract Type: Permanent
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Marketing Measurement & Optimization Analyst Job Description: Qualifications: Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or a related field. Proven 0-2 years of experience in a similar role. Strong problem-solving skills. Excellent communication skills. Skills: Proficiency in R (tidyverse, plotly/ggplot2) or Python (pandas, numpy) for data manipulation and visualization, and SQL (joins, aggregation, analytics functions) for data handling. Ability to understand marketing data and perform statistical tests. Knowledge of data visualization tools such as Tableau or Power BI. Responsibilities: Familiarity with Media Mix Modelling and Multi-Touch Attribution. Knowledge of panel data and its analysis. Understanding of the data science workflow. Familiarity with marketing channels, performance and effectiveness metrics, and the conversion funnel. Work with large datasets, performing data QA and manipulation tasks such as joins/merges, aggregation and segregation, and appends. Location: Pune Brand: Merkle Time Type: Full time Contract Type: Permanent
Posted 1 week ago
0 years
0 Lacs
India
On-site
Financial Analysts and Advisors (Vet Existing Resources) Collect market and company data, build/maintain financial models, craft decks, track portfolios, run risk and scenario analyses, develop client recommendations, and manage CRM workflows. Commercial Software ‑ Bloomberg Terminal, Refinitiv Eikon, FactSet, Excel, PowerPoint, Salesforce FSC, Redtail, Wealthbox, Orion Advisor Tech, Morningstar Office, BlackRock Aladdin, Riskalyze, Tolerisk, eMoney Advisor, MoneyGuidePro, Tableau, Power BI. Open / Free Software ‑ LibreOffice Calc, Google Sheets, Python (Pandas, yfinance, NumPy, SciPy, Matplotlib), R (QuantLib, tidyverse), SuiteCRM, EspoCRM, Plotly Dash, Streamlit, Portfolio Performance, Ghostfolio, Yahoo Finance API, Alpha Vantage, IEX Cloud (free tier). Show more Show less
Posted 1 week ago
0 years
0 Lacs
India
On-site
Financial Managers (Vet Existing Resources) Build and manage budgets, rolling forecasts, cash‑flow models, variance analyses, compliance packs, and board‑level reports; oversee treasury, investments, and strategic financial planning. Commercial Software ‑ SAP S/4HANA, Oracle Financials, NetSuite, QuickBooks, Microsoft Dynamics 365 Finance, Adaptive Planning, PlanGuru, Anaplan, Bloomberg Terminal, Power BI, Tableau, Xero, Sage Intacct, Expensify, BrightPay, Gusto, Kissflow. Open / Free Software ‑ LibreOffice Calc, Google Sheets (free), Odoo Community, ERPNext, GnuCash, Wave Accounting, Metabase, Apache Superset, Python (Pandas/NumPy), R. Show more Show less
Posted 1 week ago
2.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
About the Opportunity Job Type: Application | 16 June 2025 Title: Test Analyst Department: ISS DELIVERY - DEVELOPMENT - GURGAON Location: GGN Level: 2 We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our ISS Delivery team and feel like you're part of something bigger. About your team: The Investment Solutions Services (ISS) delivery team provides systems development, implementation, and support services for FIL's global Investment Management businesses across the asset management lifecycle. We support Fund Managers, Research Analysts, Traders, and Investment Services Operations in all of FIL's international locations, including London, Hong Kong, and Tokyo. About your role: You will join this position as a Test Analyst in the QE chapter and will be responsible for executing testing activities for all applications under the Investment Risk & Attribution team based out of India. Here is what is expected and roughly how a day in the job will look: Understand business needs and analyse requirements and user stories to carry out different testing activities. Collaborate with developers and BAs to understand new features, bug fixes, and changes in the codebase. Create and execute functional as well as automated test cases on different test environments to validate functionality. Log defects in the defect tracker and work with PMs and devs to prioritise and resolve them. Develop and maintain automation scripts, preferably using the Python stack. Intermediate-level understanding of relational databases. Document test cases, results, and any other issues encountered during testing. Attend team meetings and stand-ups to discuss progress, risks, and any issues that affect project deliveries. Stay updated with new tools, techniques, and industry trends. About You: Seasoned software test analyst with more than 2 years of hands-on experience. Hands-on experience in automating web and backend testing using open-source tools (Playwright, pytest, requests, numpy, pandas). Proficiency in writing and understanding DB queries in various databases (Oracle, AWS RDS). Good understanding of cloud (AWS, Azure). Finance/investment domain knowledge is preferable. Strong logical reasoning and problem-solving skills. Preferred programming languages: Python and Java. Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI) for automating deployment and testing workflows. For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
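As a small illustration of the pytest-plus-requests automation style this posting mentions, here is a minimal backend API test sketch; the endpoint URL and response fields are hypothetical placeholders.

```python
# Minimal sketch of a backend API test in the pytest + requests style.
# The base URL and the response fields are hypothetical placeholders.
import pytest
import requests

BASE_URL = "https://api.example.internal/risk"   # placeholder service


@pytest.fixture
def session():
    with requests.Session() as s:
        s.headers.update({"Accept": "application/json"})
        yield s


def test_portfolio_risk_endpoint_returns_expected_fields(session):
    response = session.get(f"{BASE_URL}/portfolios/ABC123/risk", timeout=10)

    assert response.status_code == 200
    payload = response.json()
    # Validate the contract: required fields present and typed sensibly
    assert "portfolio_id" in payload
    assert isinstance(payload.get("var_95"), (int, float))
```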
Posted 1 week ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About the Opportunity Job TypeApplication 13 June 2025 Title Technical Specialist Department Technology - Corporate Enablers (CFO Technology) Location Bengaluru (Bangalore) India Reports To Senior Manager Level 4 About your team The Corporate Enablers technology function provides IT services to multiple business functions like Finance, HR and General Counsel, globally. CFO Technology collaborates with Finance and Procurement stakeholders globally to develop and support business applications that underpin all core finance processes across FIL. This includes on-premises and SaaS solutions, both in-house built and vendor provided. There is a strong focus on data analytics, workflow and automation tools to bring greater efficiency to these functions. Together with this there is continued move towards greater use of Agile and DevOps practices. CFO Technology team is a global team with people based in the UK, China, and India. About your role Join our team of enthusiastic technologists as we build the future of cloud-based Integration platform We are seeking a skilled and experienced Full stack Developer to join our team. The ideal candidate will have a strong background in API development and PLSQL Store procedures along with good understanding of Kubernetes,AWS,SnapLogic cloud-native technologies. This role requires deep technical expertise and the ability to work in a dynamic and fast-paced environment. About you Essential Skills Minimum 7 years of overall full stack (Python, Oracle/PLSQL) hands on experience of system software development, testing and maintenance Knowledge of latest Python frameworks and technologies (e.g., Django, Flask, FastAPI) Experience with Python libraries and tools (e.g., Pandas, NumPy, SQLAlchemy) Strong experience in designing, developing, and maintaining RESTful APIs. Familiarity with API security, authentication, and authorization mechanisms (e.g., OAuth, JWT) Good experience and hands-on knowledge of PL/SQL (Packages/Functions/Ref cursors) Experience in development & low-level design of Warehouse solutions Familiarity with Data Warehouse, Datamart and ODS concepts Knowledge of data normalisation and Oracle performance optimisation techniques Hands on development experience of AWS (S3, lambda, api gateway, EC2, CloudFront, Route53, Dynamo DB, vpc, subnets) Hands-on experience with Kubernetes for container orchestration Knowledge of deploying, managing, and scaling applications on Kubernetes clusters Should be able to provide technical design and architecture independently for business solutions Experience with cloud architecture and design principles, micro-services Good understanding of infra-aspects of technical solutions like storage, platform, middleware Should have clear understating on continuous integration, build, release, code quality Good understating of load balancing, disaster recovery aspects of solutions Good knowledge on security aspects like authentication, authorization by using open standards like oAuth Hands on with coding and debugging. Should be able to write high quality code optimized for performance and scale Good analytical- problem solving skills and should be good with algorithms Skills nice to have Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Experience with SnapLogic cloud-native integration platform. Ability to design and implement integration pipelines using SnapLogic. 
Experience in AI prompt engineering, Gen AI, LLM models , Agents Experience in CI/CD, TDD, DevOps, CI/CD tools - Jenkins/UrbanCode/SonarQube/ Bamboo Key Responsibilities Lead and guide a team of developers/senior developers Architect technical design of the application, document and present it to senior stakeholders Interact with senior architects and other consultants to understand and review the technical solution and direction Communicate with business analysts to discuss various business requirements Proactively refactor code/solution, be aggressive about tech debt identification and reduction Develop, maintain and troubleshoot issues; and take a leading role in the ongoing support and enhancements of the applications Help in maintaining the standards, procedures and best practices in the team. Also help his team to follow these standards. Prioritisation of requirements in pipeline with stakeholders Experience and Qualification: B.E./ B.Tech. or M.C.A. in Computer Science from a reputed University Total 8-10 years of experience with application development on Python language, API development along with Oracle RDBMS, SQL, PL/SQL Must have led a team of developers
Posted 1 week ago
Numpy is a widely used library in Python for numerical computing and data analysis. In India, there is a growing demand for professionals with expertise in numpy. Job seekers in this field can find exciting opportunities across various industries. Let's explore the numpy job market in India in more detail.
The average salary range for numpy professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Typically, a career in numpy progresses as follows:
- Junior Developer
- Data Analyst
- Data Scientist
- Senior Data Scientist
- Tech Lead

In addition to numpy, professionals in this field are often expected to have knowledge of:
- Pandas
- Scikit-learn
- Matplotlib
- Data visualization
Interview questions for numpy roles often cover topics such as the following (illustrated in the short sketch at the end of this section):
- The np.where() function in numpy (medium)
- The difference between np.array and np.matrix in numpy (advanced)

As you explore job opportunities in the field of numpy in India, remember to keep honing your skills and stay updated with the latest developments in the industry. By preparing thoroughly and applying confidently, you can land the numpy job of your dreams!
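A minimal sketch of the two interview topics above: np.where() for vectorised conditional selection, and how np.array differs from np.matrix.

```python
# Minimal sketch of the two interview topics: np.where() and np.array vs np.matrix.
import numpy as np

scores = np.array([35, 72, 50, 91, 48])

# np.where(condition, value_if_true, value_if_false) works element-wise
labels = np.where(scores >= 50, "pass", "fail")
print(labels)                      # ['fail' 'pass' 'pass' 'pass' 'fail']

# With only a condition, np.where returns the indices of matching elements
print(np.where(scores >= 50))      # (array([1, 2, 3]),)

# np.array is N-dimensional and uses element-wise *, while np.matrix is
# strictly 2-D, overloads * as matrix multiplication, and is discouraged in
# favour of plain ndarrays with the @ operator.
a = np.array([[1, 2], [3, 4]])
m = np.matrix([[1, 2], [3, 4]])
print(a * a)   # element-wise product
print(m * m)   # matrix product
print(a @ a)   # matrix product with a plain ndarray
```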