8.0 years
0 Lacs
India
Remote
Job Title: Quant Engineer
Location: Remote
Job Description: Strong Python developer with up-to-date skills, including web development, cloud (ideally Azure), Docker, testing, and DevOps (ideally Terraform plus GitHub Actions). Data engineering (PySpark, lakehouses, Kafka) is a plus. Good understanding of maths and finance, as the role interacts with quant developers, analysts, and traders; familiarity with concepts such as PnL, greeks, volatility, partial derivatives, and the normal distribution is expected. Financial and/or trading exposure is nice to have, particularly in energy commodities. You will productionise quant models into software applications, ensuring robust day-to-day operation, monitoring, and back-testing are in place; translate trader or quant analyst needs into software product requirements; prototype and implement data pipelines; coordinate closely with analysts and quants during model development, acting as technical support and coach; produce accurate, performant, scalable, secure software and support best practices following defined IT standards; transform proofs of concept into larger deployable products inside and outside Shell; work in a highly collaborative, friendly Agile environment, participating in ceremonies and continuous-improvement activities; ensure that documentation and explanations of analysis or modelling results are fit for purpose for both technical and non-technical audiences; and mentor and coach teammates who are upskilling in quant engineering.
Professional Qualifications & Skills: Graduation/postgraduation/PhD with 8+ years' work experience as a software developer or data scientist. Degree in STEM: computer science, engineering, mathematics, or a relevant field of applied mathematics. Good understanding of trading terminology and concepts (including financial derivatives), gained from experience working in a trading or finance environment.
Required Skills: Expert in core Python and the Python scientific stack (including pandas, NumPy, SciPy, stats), plus a second strongly typed language (e.g., C#, C++, Rust, or Java). Expert in application design, security, release, testing, and packaging. Mastery of SQL/NoSQL databases and data pipeline orchestration tools. Mastery of concurrent/distributed programming and performance-optimisation methods.
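As a rough illustration of the greeks and normal-distribution concepts the posting expects familiarity with, here is a minimal Black-Scholes sketch in Python; the parameter values are arbitrary and this is not part of the role description.

```python
# Hypothetical illustration: Black-Scholes delta and vega for a European call,
# the kind of "greeks" the posting mentions. Inputs below are made-up numbers.
import numpy as np
from scipy.stats import norm

def call_delta_vega(spot, strike, rate, vol, tau):
    """Return (delta, vega) of a European call under Black-Scholes."""
    d1 = (np.log(spot / strike) + (rate + 0.5 * vol**2) * tau) / (vol * np.sqrt(tau))
    delta = norm.cdf(d1)                       # sensitivity to the underlying price
    vega = spot * norm.pdf(d1) * np.sqrt(tau)  # sensitivity to volatility
    return delta, vega

if __name__ == "__main__":
    print(call_delta_vega(spot=100.0, strike=95.0, rate=0.02, vol=0.25, tau=0.5))
```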
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Data Scientist, you will be responsible for analyzing complex data using statistical and machine learning models to derive actionable insights. You will use Python for data analysis, visualization, and working with various technologies such as APIs, Linux OS, databases, big data technologies, and cloud services. Additionally, you will develop innovative solutions for natural language processing and generative modeling tasks, collaborating with cross-functional teams to understand business requirements and translate them into data science solutions. You will work in an Agile framework, participating in sprint planning, daily stand-ups, and retrospectives. Furthermore, you will research, develop, and analyze computer vision algorithms in areas related to object detection, tracking, product identification and verification, and scene understanding, ensuring model robustness, generalization, accuracy, testability, and efficiency. You will also be responsible for writing product or system development code, designing and maintaining data pipelines and workflows within Azure Databricks for optimal performance and scalability, and communicating findings and insights effectively to stakeholders through reports and visualizations. To qualify for this role, you should have a Master's degree in Data Science, Statistics, Computer Science, or a related field. You should have over 5 years of proven experience in developing machine learning models, particularly for time series data within a financial context. Advanced programming skills in Python or R, with extensive experience in libraries such as Pandas, NumPy, and Scikit-learn are required. Additionally, you should have comprehensive knowledge of AI and LLM technologies, with a track record of developing applications and models. Proficiency in data visualization tools like Tableau, Power BI, or similar platforms is essential. Exceptional analytical and problem-solving abilities, coupled with meticulous attention to detail, are necessary for this role. Superior communication skills are also required to enable the clear and concise presentation of complex findings. Extensive experience in Azure Databricks for data processing, model training, and deployment is preferred, along with proficiency in Azure Data Lake and Azure SQL Database for data storage and management. Experience with Azure Machine Learning for model deployment and monitoring, as well as an in-depth understanding of Azure services and tools for data integration and orchestration, will be beneficial for this position.,
Posted 1 day ago
0.0 - 2.0 years
0 - 0 Lacs
Cannanore, Kerala
On-site
Job Title: Python & Data Science Trainer
Location: Kannur, Kerala
Employment Type: Full-Time / Part-Time
Experience: 0 to 2 years (freshers can also apply)
Qualification: Any Degree
Reporting To: Academic Head / RTH
Job Summary: We are seeking a skilled and passionate Python & Data Science Trainer to join our academic team. The trainer will be responsible for delivering high-quality training sessions to students, professionals, and corporate clients in Python programming, data analysis, and machine learning techniques.
Key Responsibilities: Design and deliver comprehensive training modules in Python, Data Science, and Machine Learning. Conduct classes (online/offline) for students, job seekers, and working professionals. Develop curriculum content, assignments, projects, and real-world case studies. Prepare assessments, quizzes, and capstone projects to evaluate student understanding. Keep course material up to date with the latest industry trends and tools. Guide students on projects and support them during practical sessions. Assist in the development of internal tools, demos, and data-driven models. Take ownership of student performance, engagement, and satisfaction.
Key Skills & Tools Required: Strong programming knowledge in Python: core Python, OOP, file handling, and libraries (NumPy, Pandas, Matplotlib, etc.). Data analysis, exploratory data analysis (EDA), and data cleaning. Statistics and probability for data science. Machine learning algorithms: linear/logistic regression, decision trees, random forest, KNN, SVM, etc. Model evaluation techniques: cross-validation, confusion matrix, etc. Hands-on with Jupyter Notebook, Google Colab, and Anaconda. Familiarity with SQL, Power BI/Tableau, or deep learning is a plus. Excellent communication, presentation, and mentoring skills.
Preferred Certifications: Python certification (e.g., PCEP, PCAP); Data Science certification (Coursera, Udemy, IBM, Google, etc.); Machine Learning or AI specialization certificates.
Salary: As per industry standards. Perks: Performance bonuses, training certifications, and growth opportunities.
Job Types: Full-time, Permanent, Fresher
Pay: ₹12,000.00 - ₹20,000.00 per month
Benefits: Leave encashment, paid time off
Schedule: Day shift, fixed shift, morning shift
Supplemental Pay: Yearly bonus
Application Deadline: 04/08/2025
Expected Start Date: 04/08/2025
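For context on the model-evaluation topics listed (train/test splits, cross-validation, confusion matrix), here is a minimal scikit-learn teaching example of the kind a trainer might walk through; the bundled iris dataset is just a stand-in.

```python
# Minimal teaching sketch: split, fit, cross-validate, and inspect a confusion matrix.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = LogisticRegression(max_iter=1000)
print("5-fold CV accuracy:", cross_val_score(model, X_train, y_train, cv=5).mean())

model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print("Test accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))
```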
Posted 1 day ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
Join a leading U.S.-based company as a Sr. Python Developer, where your expertise will drive cutting-edge solutions in technology. Leverage your Python skills to solve complex challenges, optimize processes, and contribute to impactful projects alongside global experts. If you have a passion for innovation and a proven track record in Python development, this role offers the perfect opportunity to elevate your career. Job Responsibilities Develop scalable and efficient Python applications. Analyze large datasets to extract business insights. Write clean, optimized code in Jupyter Notebooks or similar platforms. Collaborate with researchers and engineers on data-driven solutions. Maintain clear documentation for all code. Use datasets from Kaggle, the UN, and U.S. government sources for business insights. Job Requirements 3+ years of Python development experience. Bachelor’s/Master’s in Computer Science, Engineering, or related field. Strong Python skills with experience in Pandas, NumPy, SciPy, etc. Experience with databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus. Excellent problem-solving and analytical skills. Strong communication and teamwork skills in a remote setting. Perks Work with top industry experts worldwide. Fully remote flexibility. Competitive salary aligned with global standards. Be part of cutting-edge, high-impact projects. Selection Process Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment.
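As a small, hedged illustration of the "analyze large datasets to extract business insights" work described, here is a pandas sketch; the file name and column names are hypothetical placeholders, not part of the posting.

```python
# Hypothetical exploratory-analysis sketch: monthly revenue by region.
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["order_date"])  # placeholder dataset
monthly = (
    df.assign(month=df["order_date"].dt.to_period("M"))
      .groupby(["month", "region"], as_index=False)["revenue"].sum()
      .sort_values(["month", "revenue"], ascending=[True, False])
)
print(monthly.head())
```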
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a highly skilled and motivated Senior Data Scientist with 3–5 years of hands-on experience in Machine Learning , Deep Learning , and Large Language Models (LLMs) . The ideal candidate should have strong experience working with Microsoft technologies and Azure Cloud and should be passionate about building scalable AI solutions that drive real business value. You will work closely with cross-functional teams to design, develop, and deploy data-driven models and solutions, leveraging the latest in cloud computing and AI innovations. Responsibilities: Design and develop advanced ML/DL models for predictive analytics, natural language processing, computer vision, and recommendation systems. Build and fine-tune LLMs and deploy them effectively on Azure Machine Learning or other Microsoft AI platforms. Collaborate with data engineers and cloud architects to implement end-to-end ML pipelines on Azure. Apply Deep Learning techniques using frameworks like TensorFlow, PyTorch, or Hugging Face. Conduct exploratory data analysis and identify key trends and insights to inform business decisions. Work on large-scale datasets and implement scalable solutions for model training and deployment. Ensure models are monitored, maintained, and retrained as needed to meet performance standards. Contribute to the design of data architecture and infrastructure for AI applications on Azure. Keep up-to-date with advancements in AI, ML, LLMs, and related Microsoft technologies. Qualifications: Bachelor's or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field. 3–5 years of relevant experience in data science, machine learning, and AI. Strong experience with Azure Cloud Services including Azure ML, Azure Data Lake, and Azure Functions. Proven experience in Machine Learning, Deep Learning, and working with LLMs. Proficiency in Python, with experience in libraries such as Scikit-learn, Pandas, NumPy, TensorFlow, PyTorch, or Hugging Face Transformers. Solid understanding of Microsoft’s ecosystem and AI stack (e.g., Cognitive Services, Azure OpenAI, Microsoft Fabric, etc.). Familiarity with MLOps practices and tools on Azure. Strong analytical thinking, problem-solving skills, and communication abilities. Ability to work in a collaborative and fast-paced environment.
Posted 1 day ago
3.0 years
0 Lacs
India
Remote
Join a leading U.S.-based company as a Sr. Python Developer, where your expertise will drive cutting-edge solutions in technology. Leverage your Python skills to solve complex challenges, optimize processes, and contribute to impactful projects alongside global experts. If you have a passion for innovation and a proven track record in Python development, this role offers the perfect opportunity to elevate your career. Job Responsibilities Develop scalable and efficient Python applications. Analyze large datasets to extract business insights. Write clean, optimized code in Jupyter Notebooks or similar platforms. Collaborate with researchers and engineers on data-driven solutions. Maintain clear documentation for all code. Use datasets from Kaggle, the UN, and U.S. government sources for business insights. Job Requirements 3+ years of Python development experience. Bachelor’s/Master’s in Computer Science, Engineering, or related field. Strong Python skills with experience in Pandas, NumPy, SciPy, etc. Experience with databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus. Excellent problem-solving and analytical skills. Strong communication and teamwork skills in a remote setting. Perks Work with top industry experts worldwide. Fully remote flexibility. Competitive salary aligned with global standards. Be part of cutting-edge, high-impact projects. Selection Process Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment.
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Chubb is a world-renowned insurance leader with operations spanning across 54 countries and territories, offering a wide range of commercial and personal insurance solutions. Known for its extensive product portfolio, robust distribution network, exceptional financial stability, and global presence, Chubb is committed to providing top-notch services to its diverse clientele. The parent company, Chubb Limited, is publicly listed on the New York Stock Exchange (NYSE: CB) and is a constituent of the S&P 500 index, boasting a workforce of around 43,000 individuals worldwide. For more information, visit www.chubb.com. Chubb India is embarking on an exciting digital transformation journey fueled by a focus on engineering excellence and analytics. The company takes pride in being officially certified as a Great Place to Work for the third consecutive year, underscoring its culture that nurtures innovation, growth, and collaboration. With a talented team of over 2500 professionals, Chubb India promotes a startup mindset that encourages diverse perspectives, teamwork, and a solution-oriented approach. The organization is dedicated to honing expertise in engineering, analytics, and automation, empowering its teams to thrive in the ever-evolving digital landscape. As a Full Stack Data Scientist within the Advanced Analytics team at Chubb, you will play a pivotal role in developing cutting-edge data-driven solutions using state-of-the-art machine learning and AI technologies. This technical position involves leveraging AI and machine learning techniques to automate underwriting processes, enhance claims outcomes, and provide innovative risk solutions. Ideal candidates for this role possess a solid educational background in computer science, data science, statistics, applied mathematics, or related fields, coupled with a penchant for solving complex problems through innovative thinking while maintaining a keen focus on delivering actionable business insights. You should be proficient in utilizing a diverse set of tools, strategies, machine learning algorithms, and programming languages to address a variety of challenges. Key Responsibilities: - Collaborate with global business partners to identify analysis requirements, manage deliverables, present results, and implement models. - Leverage a wide range of machine learning, text and image AI models to extract meaningful features from structured and unstructured data. - Develop and deploy scalable and efficient machine learning models to automate processes, gain insights, and facilitate data-driven decision-making. - Package and publish codes and solutions in reusable Python formats for seamless integration into CI/CD pipelines and workflows. - Ensure high-quality code that aligns with business objectives, quality standards, and secure web development practices. - Build tools for streamlining the modeling pipeline, sharing knowledge, and implementing real-time monitoring and alerting systems for machine learning solutions. - Establish and maintain automated testing and validation infrastructure, troubleshoot pipelines, and adhere to best practices for versioning, monitoring, and reusability. Qualifications: - Proficiency in ML concepts, supervised/unsupervised learning, ensemble techniques, and various ML models including Random Forest, XGBoost, SVM, etc. - Strong experience with Azure cloud computing, containerization technologies (Docker, Kubernetes), and data science frameworks like Pandas, Numpy, TensorFlow, Keras, PyTorch, and sklearn. 
- Hands-on experience with DevOps tools such as Git, Jenkins, Sonar, Nexus, along with data pipeline building, debugging, and unit testing practices. - Familiarity with AI/ML applications, Databricks ecosystem, and statistical/mathematical domains. Why Chubb - Join a leading global insurance company with a strong focus on employee experience and a culture that fosters innovation and excellence. - Benefit from a supportive work environment, industry leadership, and opportunities for personal and professional growth. - Embrace a startup-like culture that values speed, agility, ownership, and continuous improvement. - Enjoy comprehensive employee benefits that prioritize health, well-being, learning, and career advancement. Employee Benefits: - Access to savings and investment plans, upskilling opportunities, health and welfare benefits, and a supportive work environment that encourages inclusivity and growth. Join Us: Your contributions are integral to shaping the future at Chubb. If you are passionate about integrity, innovation, and inclusion and ready to make a difference, we invite you to be part of Chubb India's journey. Apply Now: Chubb India Career Page,
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The 55ip Quant team is seeking a quantitative professional to research, implement, test, and maintain the core algorithms of its technology-enabled investment platform for large investment advisory (RIA) & wealth management firms. As a Research Analyst at JP Morgan Chase within the Asset and Wealth Management and the 55ip Quant team, you will play a crucial role in researching, implementing, testing, and maintaining the core algorithms of the technology-enabled investment platform. This position offers the opportunity to contribute significantly to research projects and grow as an independent researcher. If you have a background in statistical models, software design constructs, and tools, possess strong problem-solving skills, are a motivated team player, and are eager to make a meaningful impact, then this role could be an ideal fit for you. Responsibilities: - Engage in end-to-end research, development, and maintenance of investment algorithms - Contribute to the development and maintenance of optimization models, participate in building the research and development framework - Review investment algorithmic results thoroughly and contribute to the design of the research data platform - Explore datasets for use in new or existing algorithms, engage in agile practices, and collaborate with stakeholders to gather functional requirements - Participate in research and code reviews, adhere to high-quality coding standards and best practices, conduct comprehensive end-to-end unit testing, and offer support during testing and post go-live stages - Drive research innovation through creative and comprehensive experimentation of cutting-edge hardware, advanced analytics, machine learning techniques, and other methodologies - Work in collaboration with technology teams to ensure the alignment of requirements, standards, and integration Required Qualifications: - Experience in a quantitative role - Proficiency in Python, Git, and Jira - A Master's degree in computer science, computational mathematics, or financial engineering - Strong mathematical foundation and practical experience in the finance industry - Proficiency in quantitative, statistical, and machine learning/artificial intelligence techniques and their implementation using Python modules such as Pandas, NumPy, SciPy, SciKit-Learn, etc. - Excellent communication skills (both written and oral) and analytical problem-solving abilities - Strong attention to detail, commitment to delivering high-quality work, and a willingness to learn - Understanding of financial capital markets, various financial instruments (e.g., stocks, ETFs, Mutual Funds), and financial tools (e.g., Bloomberg, Reuters) - Knowledgeable in SQL Preferred Qualifications: - Professional experience with commercial optimizers (e.g., Gurobi, CPLEX) is advantageous - Ability to adapt quickly to time-sensitive requests - Self-motivated, proactive, responsive, with strategic thinking capabilities while also being willing to delve into the specifics and tactics - Understanding of LaTeX and/or RMarkdown would be a plus,
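To illustrate the kind of optimization model the role maintains, here is a toy minimum-variance portfolio solved with SciPy. The posting prefers commercial optimizers such as Gurobi or CPLEX; SciPy is used only so the sketch stays self-contained, and the covariance numbers are invented.

```python
# Toy long-only minimum-variance portfolio (illustrative, not the team's model).
import numpy as np
from scipy.optimize import minimize

cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])   # hypothetical asset covariance matrix

def variance(w):
    return w @ cov @ w

n = cov.shape[0]
result = minimize(
    variance,
    x0=np.full(n, 1.0 / n),
    bounds=[(0.0, 1.0)] * n,                                       # long-only
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],  # fully invested
)
print("optimal weights:", np.round(result.x, 4))
```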
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Greater Kolkata Area
On-site
We are looking for a Senior Python Developer with strong experience in building robust data connectors to integrate with various third-party platforms. This role is critical in designing and implementing OAuth-based integrations and scalable data ingestion pipelines in Python. You will lead the connector development efforts while optionally contributing to LLM integration, API services, and cloud deployment.
Responsibilities: Design and develop standalone Python connectors to fetch and sync data from HR systems and third-party platforms such as Justworks, HiBob, Workday, Rippling, Velocity, BambooHR, UKG, etc. Implement secure OAuth2 authentication flows and manage API token lifecycles. Read and interpret API documentation to understand external systems and develop integration solutions. Collaborate with cross-functional teams to understand use cases and deliver reliable data access mechanisms.
Requirements: 5-8 years of Python development experience. Strong experience in building connectors/integrations with third-party APIs. Hands-on experience with OAuth2 and other authentication mechanisms. Ability to quickly understand API documentation and build efficient data fetching solutions. Familiarity with JSON, REST APIs, and best practices for secure and scalable data handling.
Nice-to-Have Skills (Optional): Experience building REST APIs using FastAPI or Flask. Exposure to LLMs (Large Language Models) and frameworks like Hugging Face. Basic understanding of NumPy, PyTorch, or other AI/ML libraries. Familiarity with Amazon ECS or containerized deployment environments. Previous experience with HR/ERP platform integration. Knowledge of async programming (asyncio, aiohttp) for scalable connector performance.
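A minimal sketch of the OAuth2 client-credentials pattern this role centres on, using `requests`. The token URL, API endpoint, and field names are placeholders; each HR platform documents its own endpoints and scopes.

```python
# Hedged sketch: fetch an OAuth2 access token, then call a protected API.
import requests

TOKEN_URL = "https://auth.example-hr.com/oauth/token"   # placeholder
API_URL = "https://api.example-hr.com/v1/employees"     # placeholder

def get_access_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials",
              "client_id": client_id,
              "client_secret": client_secret},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_employees(token: str) -> list:
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = get_access_token("my-client-id", "my-client-secret")
    print(len(fetch_employees(token)), "records fetched")
```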
Posted 1 day ago
9.0 - 13.0 years
0 Lacs
chennai, tamil nadu
On-site
As an ideal candidate for this role, you should possess in-depth knowledge of Python and have good experience in creating APIs using FastAPI. You should also have exposure to data libraries such as Pandas (DataFrames) and NumPy, as well as knowledge of Apache open-source components and Apache Spark. Familiarity with lakehouse architecture and open table formats is also desirable. Additionally, you should be well-versed in automated unit testing, preferably using PyTest, and have exposure to distributed computing. Experience working in a Linux environment is a must, while working knowledge of Kubernetes would be considered an added advantage. Basic exposure to ML and MLOps would also be advantageous for this role.
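A minimal FastAPI-plus-pandas sketch in the spirit of the stack described above; the endpoint and payload shape are assumptions made purely for illustration.

```python
# Hedged sketch: a FastAPI endpoint that summarizes a numeric series with pandas.
from fastapi import FastAPI
from pydantic import BaseModel
import pandas as pd

app = FastAPI()

class Series(BaseModel):
    values: list[float]

@app.post("/summary")
def summarize(series: Series) -> dict:
    s = pd.Series(series.values)
    return {"count": int(s.count()), "mean": float(s.mean()), "std": float(s.std())}

# Run with: uvicorn main:app --reload   (assuming this file is saved as main.py)
```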
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary: The Associate Data Scientist will be responsible for developing and implementing machine learning models using computer vision and PyTorch. This individual will work closely with a team of data scientists and engineers to support the development of innovative solutions in a fast-paced, collaborative environment.
Key Responsibilities: Develop and implement machine learning models using computer vision and PyTorch. Collaborate with a team of data scientists and engineers to support the development of innovative solutions. Conduct data analysis and feature engineering to support the development of machine learning models. Use computer vision techniques to extract and analyze data from images and videos. Support the deployment and maintenance of machine learning models in a production environment. Contribute to the continuous improvement of machine learning processes and practices.
Key Skills: Strong proficiency in Python programming. Hands-on experience with PyTorch, Pandas, NumPy, and OpenCV (cv2). Prior experience working on computer vision projects is required. Knowledge of cloud platforms and services is a plus.
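A hedged sketch of the PyTorch/OpenCV workflow implied above, running a pretrained torchvision classifier on an image loaded with cv2; the image path and model choice are illustrative assumptions, not the team's actual pipeline.

```python
# Hedged sketch: classify an image with OpenCV preprocessing + a pretrained ResNet.
import cv2
import torch
from torchvision import models, transforms

img_bgr = cv2.imread("sample.jpg")                    # placeholder image path
img_rgb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)    # OpenCV loads BGR; convert to RGB

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # recent torchvision API
model.eval()

with torch.no_grad():
    logits = model(preprocess(img_rgb).unsqueeze(0))  # add batch dimension
print("predicted class index:", int(logits.argmax(dim=1)))
```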
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Backend Developer (Python/Django):
- Strong working experience in Python-based Django and Flask frameworks.
- Experience in developing microservices-based design and architecture.
- Strong programming knowledge in JavaScript, HTML5, Python, RESTful APIs, gRPC APIs.
- Programming experience & object-oriented concepts in Python.
- Knowledge of Python libraries like NumPy, Pandas, Open3D, OpenCV, Matplotlib.
- Knowledge of MySQL/Postgres/MSSQL databases.
- Knowledge of 3D geometry.
- Knowledge of SSO/OpenID Connect/OAuth authentication protocols.
- Working experience with version control systems like GitHub/BitBucket/GitLab.
- Familiarity with continuous integration and continuous deployment (CI/CD) pipelines.
As a Backend Developer, you will work in the area of Software Engineering, encompassing the development, maintenance, and optimization of software solutions/applications. You will apply scientific methods to analyze and solve software engineering problems, develop and apply software engineering practice and knowledge, exercise original thought and judgement, supervise the technical and administrative work of other software engineers, and build skills and expertise in your discipline to meet standard expectations for the role. Collaboration and teamwork with other software engineers and stakeholders are essential aspects of the role. Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world. With a responsible and diverse team of 340,000 members across more than 50 countries, Capgemini leverages its over 55-year heritage to unlock technology's value for clients and address their business needs comprehensively. The Group's services and solutions span strategy, design, engineering, AI, cloud, and data, supported by deep industry expertise and a strong partner ecosystem. In 2023, the Group reported global revenues of €22.5 billion.
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
punjab
On-site
We are searching for an experienced Python Developer to become a part of our dynamic development team. The ideal candidate should possess 2 to 5 years of experience in constructing scalable backend applications and APIs using contemporary Python frameworks. This position necessitates a solid foundation in object-oriented programming, web technologies, and collaborative software development. Your responsibilities will involve close collaboration with the design, frontend, and DevOps teams to deliver sturdy and high-performance solutions. Your key responsibilities will include developing, testing, and maintaining backend applications utilizing Django, Flask, or FastAPI. You will also be responsible for building RESTful APIs and incorporating third-party services to enrich platform capabilities. Utilization of data handling libraries such as Pandas and NumPy for efficient data processing is essential. Additionally, writing clean, maintainable, and well-documented code conforming to industry best practices, participating in code reviews, and mentoring junior developers are part of your role. You will collaborate within Agile teams using Scrum or Kanban workflows and troubleshoot and debug production issues proactively and analytically. Required qualifications for this position include 2 to 5 years of backend development experience with Python, proficiency in core and advanced Python concepts, strong command over at least one Python framework (Django, Flask, or FastAPI), experience with data libraries like Pandas and NumPy, understanding of authentication/authorization mechanisms, middleware, and dependency injection, familiarity with version control systems like Git, and comfort working in Linux environments. Must-have skills for this role consist of expertise in backend Python development and web frameworks, experience with Generative AI frameworks (e.g., LangChain, Transformers, OpenAI APIs), strong debugging, problem-solving, and optimization skills, experience with API development and microservices architecture, and a deep understanding of software design principles and security best practices. Good-to-have skills include exposure to Machine Learning libraries (e.g., Scikit-learn, TensorFlow, PyTorch), knowledge of containerization tools (Docker, Kubernetes), familiarity with web servers (e.g., Apache, Nginx) and deployment architectures, understanding of asynchronous programming and task queues (e.g., Celery, AsyncIO), familiarity with Agile practices and tools like Jira or Trello, and exposure to CI/CD pipelines and cloud platforms (AWS, GCP, Azure). In return, we offer competitive compensation based on your skills and experience, generous time off with 18 annual holidays to maintain a healthy work-life balance, continuous learning opportunities while working on cutting-edge projects, and valuable experience in client-facing roles to enhance your professional growth.,
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Visualization and Web Development Specialist at Inclusive Minds, you will be a crucial part of the team responsible for transforming intricate electoral data into engaging visual representations that facilitate strategic decision-making for political campaigns. Your role will involve utilizing your expertise in web visualization frameworks like D3.js, Chart.js, Highcharts, or equivalent tools to present data in a user-friendly and impactful manner. Candidates aspiring to join our dynamic data and analytics team should possess a solid technical background encompassing the following key areas: - Proficiency in utilizing web visualization frameworks such as D3.js, Chart.js, Highcharts, or their equivalents to create visually appealing representations of data. - Strong command over SQL with practical experience in managing relational databases (optional). - Competence in programming languages like Python, R, or PHP for data processing, analytics, and visualization purposes. - Familiarity with libraries such as Pandas, NumPy, and Matplotlib for efficient data handling within the Python environment. - Ability to automate data processing workflows and derive actionable insights from complex datasets. - Proficient in HTML, CSS, and JavaScript for constructing and enhancing interactive web-based dashboards. - Knowledge of front-end frameworks like React.js or Vue.js would be advantageous. - Experience in integrating APIs and external data sources into web applications. - Understanding of database optimization techniques for efficient management of extensive electoral datasets. In this role, you will have the opportunity to contribute to the strategic decision-making process of political campaigns by translating data into visually compelling formats. Your expertise in data visualization and web development will be instrumental in shaping impactful electoral campaigns that drive social change and influence public policies positively.,
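A small example of the Python-side charting mentioned above (Pandas plus Matplotlib); the constituencies and turnout figures are invented solely for illustration.

```python
# Hedged sketch: a simple bar chart from a small, synthetic electoral dataset.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.DataFrame({
    "constituency": ["A", "B", "C", "D"],
    "turnout_pct": [61.2, 58.4, 72.9, 66.1],   # hypothetical values
})

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(data["constituency"], data["turnout_pct"])
ax.set_xlabel("Constituency")
ax.set_ylabel("Turnout (%)")
ax.set_title("Hypothetical turnout by constituency")
fig.tight_layout()
fig.savefig("turnout.png")
```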
Posted 1 day ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are looking for a passionate and skilled Senior Python Developer to join our growing engineering team. You will play a key role in designing, building, and maintaining scalable backend systems and APIs that drive meaningful user experiences. This role involves close collaboration with frontend engineers, product managers, and designers to build high-impact software.
Responsibilities: Design, develop, and maintain high-performance, scalable, and secure backend services using Python. Write clean, efficient, and well-documented code, following best practices and company coding standards. Work with modern Python frameworks such as Django, Flask, or FastAPI to build RESTful APIs and microservices. Collaborate with cross-functional teams to define and implement new features and improvements. Participate in code reviews, peer programming, and team knowledge-sharing sessions. Troubleshoot, debug, and optimize existing systems for performance and reliability. Analyze requirements and translate them into effective technical solutions. Integrate with third-party services, APIs, and data sources. Contribute to Agile ceremonies including sprint planning, reviews, and retrospectives. Write unit and integration tests to ensure software quality and reliability. Ensure timely delivery of assigned tasks and proactively communicate progress.
You need to have the following: 3+ years of hands-on experience in software development using Python. Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience). Solid understanding of at least one Python web framework: Django, Flask, or FastAPI. Experience with data manipulation and analysis libraries such as Pandas and NumPy. Familiarity with relational and/or NoSQL databases such as MySQL, PostgreSQL, MongoDB, or Firebase. Knowledge of message queues and task workers (e.g., Celery, Redis, RabbitMQ). Basic understanding of RESTful API design and HTTP protocols. Experience with version control systems like Git. Exposure to cloud platforms (AWS, GCP, etc.) and containerization tools (Docker) is a plus. Basic understanding of DevOps principles and practices (e.g., CI/CD). Understanding of the software development lifecycle and Agile methodologies. Good knowledge of data structures, algorithms, and system design principles. Basic understanding of AI/ML concepts and tools is a plus. Strong debugging and problem-solving skills. Good communication skills and the ability to work collaboratively in a team environment. A portfolio of side projects or contributions to open-source repositories (on GitHub) is a big plus.
Skills Needed: Python, Django, Flask/FastAPI, Pandas, NumPy, REST APIs, Git, MySQL, PostgreSQL, MongoDB, Docker, Message Queues (Celery, Redis, RabbitMQ, SQS), DevOps (basic CI/CD).
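As a sketch of the message-queue pattern listed in the skills (Celery with Redis), under the assumption of a locally running Redis broker; the task body is a placeholder.

```python
# Hedged sketch: a Celery task queued over Redis.
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")  # placeholder broker URL

@app.task
def send_welcome_email(user_id: int) -> str:
    # In a real system this would render and dispatch an email.
    return f"welcome email queued for user {user_id}"

# Enqueue from application code:  send_welcome_email.delay(42)
# Run a worker with:              celery -A tasks worker --loglevel=info
```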
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
guwahati, assam
On-site
You are an experienced Software Engineer specializing in Machine Learning with at least 2+ years of relevant experience. In this role, you will be responsible for designing, developing, and optimizing machine learning solutions and data systems. Your proven track record in implementing ML models, building scalable systems, and collaborating with cross-functional teams will be essential in solving complex challenges using data-driven approaches. As a Software Engineer - Machine Learning, your primary responsibilities will include designing and implementing end-to-end machine learning solutions, building and optimizing scalable data pipelines, collaborating with data scientists and product teams, monitoring and optimizing deployed models, staying updated with the latest trends in machine learning, debugging complex issues related to ML systems, and documenting processes for knowledge sharing and clarity. To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or related fields. Your technical skills should include a strong proficiency in Python and machine learning libraries such as TensorFlow, PyTorch, or scikit-learn, experience with data processing tools like Pandas, NumPy, and Spark, proficiency in SQL and database systems, hands-on experience with cloud platforms (AWS, GCP, Azure), familiarity with CI/CD pipelines and Git, and experience with model deployment frameworks like Flask, FastAPI, or Docker. Additionally, you should possess strong analytical skills, leadership abilities to guide junior team members, and a proactive approach to learning and collaboration. Preferred qualifications include experience with MLOps tools like MLflow, Kubeflow, or SageMaker, knowledge of big data technologies such as Hadoop, Spark, or Kafka, familiarity with advanced ML techniques like NLP, computer vision, or reinforcement learning, and experience in designing and managing streaming data workflows. Key Performance Indicators for this role include successfully delivering optimized and scalable ML solutions within deadlines, maintaining high model performance in production environments, and ensuring seamless integration of ML models with business applications. Join us in this exciting opportunity to drive innovation and make a significant impact in the field of Machine Learning.,
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for contributing to the development and deployment of machine learning algorithms. Evaluates accuracy and functionality of machine learning algorithms as a part of a larger team. Contributes to translating application requirements into machine learning problem statements. Analyzes and evaluates solutions both internally generated as well as third party supplied. Contributes to developing ways to use machine learning to solve problems and discover new products, working on a portion of the problem and collaborating with more senior researchers as needed. Works with moderate guidance in own area of knowledge. Job Description Core Responsibilities About the Role: We are seeking an experienced Data Scientist to join our growing Operational Intelligence team. You will play a key role in building intelligent systems that help reduce alert noise, detect anomalies, correlate events, and proactively surface operational insights across our large-scale streaming infrastructure. You’ll work at the intersection of machine learning, observability, and IT operations, collaborating closely with Platform Engineers, SREs, Incident Managers, Operators and Developers to integrate smart detection and decision logic directly into our operational workflows. This role offers a unique opportunity to push the boundaries of AI/ML in large-scale operations. We welcome curious minds who want to stay ahead of the curve, bring innovative ideas to life, and improve the reliability of streaming infrastructure that powers millions of users globally. 
What You’ll Do Design and tune machine learning models for event correlation, anomaly detection, alert scoring, and root cause inference Engineer features to enrich alerts using service relationships, business context, change history, and topological data Apply NLP and ML techniques to classify and structure logs and unstructured alert messages Develop and maintain real-time and batch data pipelines to process alerts, metrics, traces, and logs Use Python, SQL, and time-series query languages (e.g., PromQL) to manipulate and analyze operational data Collaborate with engineering teams to deploy models via API integrations, automate workflows, and ensure production readiness Contribute to the development of self-healing automation, diagnostics, and ML-powered decision triggers Design and validate entropy-based prioritization models to reduce alert fatigue and elevate critical signals Conduct A/B testing, offline validation, and live performance monitoring of ML models Build and share clear dashboards, visualizations, and reporting views to support SREs, engineers, and leadership Participate in incident postmortems, providing ML-driven insights and recommendations for platform improvements Collaborate on the design of hybrid ML + rule-based systems to support dynamic correlation and intelligent alert grouping Lead and support innovation efforts including POCs, POVs, and exploration of emerging AI/ML tools and strategies Demonstrate a proactive, solution-oriented mindset with the ability to navigate ambiguity and learn quickly Participate in on-call rotations and provide operational support as needed Qualifications Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, Statistics or a related field 3+ years of experience building and deploying ML solutions in production environments 2+ years working with AIOps, observability, or real-time operations data Strong coding skills in Python (including pandas, NumPy, Scikit-learn, PyTorch, or TensorFlow) Experience working with SQL, time-series query languages (e.g., PromQL), and data transformation in pandas or Spark Familiarity with LLMs, prompt engineering fundamentals, or embedding-based retrieval (e.g., sentence-transformers, vector DBs) Strong grasp of modern ML techniques including gradient boosting (XGBoost/LightGBM), autoencoders, clustering (e.g., HDBSCAN), and anomaly detection Experience managing structured + unstructured data, and building features from logs, alerts, metrics, and traces Familiarity with real-time event processing using tools like Kafka, Kinesis, or Flink Strong understanding of model evaluation techniques including precision/recall trade-offs, ROC, AUC, calibration Comfortable working with relational (PostgreSQL), NoSQL (MongoDB), and time-series (InfluxDB, Prometheus) databases Ability to collaborate effectively with SREs, platform teams, and participate in Agile/DevOps workflows Clear written and verbal communication skills to present findings to technical and non-technical stakeholders Comfortable working across Git, Confluence, JIRA, & collaborative agile environments Nice To Have Experience building or contributing to the AIOps platform (e.g., Moogsoft, BigPanda, Datadog, Aisera, Dynatrace, BMC etc.) 
Experience working in streaming media, OTT platforms, or large-scale consumer services Exposure to Infrastructure as Code (Terraform, Pulumi) and modern cloud-native tooling Working experience with Conviva, Touchstream, Harmonic, New Relic, Prometheus, & event- based alerting tools Hands-on experience with LLMs in operational contexts (e.g., classification of alert text, log summarization, retrieval-augmented generation) Familiarity with vector databases (e.g., FAISS, Pinecone, Weaviate) and embeddings-based search for observability data Experience using MLflow, SageMaker, or Airflow for ML workflow orchestration Knowledge of LangChain, Haystack, RAG pipelines, or prompt templating libraries Exposure to MLOps practices (e.g., model monitoring, drift detection, explainability tools like SHAP or LIME) Experience with containerized model deployment using Docker or Kubernetes Use of JAX, Hugging Face Transformers, or LLaMA/Claude/Command-R models in experimentation Experience designing APIs in Python or Go to expose models as services Cloud proficiency in AWS/GCP, especially for distributed training, storage, or batch inferencing Contributions to open-source ML or DevOps communities, or participation in AIOps research/benchmarking efforts Certifications in cloud architecture, ML engineering, or data science specialization Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 2-5 Years
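A hedged sketch of the anomaly-detection work described in this posting, scoring synthetic per-minute alert features with scikit-learn's Isolation Forest; the feature names and data are invented for illustration, not the actual pipeline.

```python
# Hedged sketch: flag anomalous minutes from synthetic operational features.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
features = pd.DataFrame({
    "alert_rate": rng.poisson(5, 1000),        # alerts per minute (synthetic)
    "error_ratio": rng.beta(2, 50, 1000),      # error share of requests (synthetic)
    "latency_p99_ms": rng.gamma(9, 25, 1000),  # tail latency (synthetic)
})

model = IsolationForest(contamination=0.01, random_state=0).fit(features)
scores = model.decision_function(features)     # lower = more anomalous
labels = model.predict(features)               # -1 marks outliers

features["anomaly_score"] = scores
features["is_anomaly"] = labels == -1
print(features.sort_values("anomaly_score").head())
```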
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
As an Artificial Intelligence (AI) Developer in our team, you will have the exciting opportunity to blend cutting-edge AI techniques with scalable web application architectures to design and build intelligent systems. Working closely with cross-functional teams, including data scientists, software engineers, and product managers, you will develop end-to-end solutions that enhance business operations and deliver exceptional user experiences. Your responsibilities will include full-stack development, where you will architect, design, and develop both frontend and backend components of AI-driven web applications. You will build responsive and user-friendly interfaces using modern JavaScript frameworks such as React, Angular, or Vue.js, along with robust backend services using technologies like Node.js, Django, or .NET. Furthermore, you will be responsible for developing and integrating secure RESTful and GraphQL APIs to connect AI models with cloud services and third-party systems. Leveraging cloud platforms like AWS, Azure, or GCP, you will deploy and manage scalable applications and services effectively. Collaborating with data scientists, you will integrate machine learning models, natural language processing, computer vision, and other AI techniques into web applications. You will optimize AI workflows by ensuring seamless data exchange and efficient model inference across the technology stack. In terms of DevOps and deployment, you will implement CI/CD pipelines, containerization using tools like Docker and Kubernetes, and automated testing to ensure efficient and reliable releases. Monitoring application performance and troubleshooting issues in real-time will be essential to maintain high-quality production environments. Your role will also involve close collaboration with cross-functional teams to gather requirements, deliver project updates, and ensure solutions align with business needs. Documenting development processes, API specifications, and integration practices will be crucial to support future enhancements and maintenance. To excel in this role, you must have a degree in Computer Science, Data Science, Engineering, or a related field. Additionally, you should possess a minimum of 5 years of experience with full-stack development and at least 2 years of experience with AI development. Hands-on experience in programming languages like Python, JavaScript (or TypeScript), and/or C#, along with expertise in front-end and backend frameworks, cloud platforms, containerization, and data management, will be essential. If you are detail-oriented, possess strong problem-solving, analytical, and communication skills, and have a passion for continuous learning and innovation, this role is perfect for you. Join us at the intersection of AI and full-stack web development to create robust, intelligent systems that scale in the cloud while delivering intuitive and responsive user experiences.,
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
indore, madhya pradesh
On-site
Join Tecnomi's Innovation Team in Indore as an AI/ML Python Developer! In this full-time, onsite role, you will be involved in cutting-edge machine learning projects that require expertise in TensorFlow, PyTorch, and more. Your responsibilities will include designing and implementing machine learning models, developing data processing pipelines, performing feature engineering, and deploying ML models in production environments. Collaboration with cross-functional teams in an agile setting is a key aspect of this position. The ideal candidate will have 3-5 years of Python development experience and a strong background in ML libraries such as TensorFlow, PyTorch, scikit-learn, pandas, and NumPy. Experience with time-series analysis, neural networks, cloud platforms (preferably AWS), and containerization is also required. A Bachelor's or Master's degree in Computer Science, Data Science, or a related field is preferred. Additionally, knowledge in NLP, reinforcement learning, and MLOps (including MLflow and model versioning) is considered a plus. If you are passionate about AI/ML development and eager to contribute to innovative R&D initiatives involving real-time data processing and advanced AI model development, we encourage you to apply for this exciting opportunity at Tecnomi.,
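A toy PyTorch training loop as a sketch of the model-development work mentioned; the synthetic data and tiny architecture are assumptions made purely for illustration.

```python
# Hedged sketch: fit a small regression network on synthetic data.
import torch
from torch import nn

X = torch.randn(256, 10)                        # synthetic features (e.g., lagged values)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
print("final training loss:", float(loss))
```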
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
Sykatiya Technology Pvt Ltd is a leading Semiconductor Industry innovator committed to leveraging cutting-edge technology to solve complex problems. We are currently looking for a highly skilled and motivated Data Scientist to join our dynamic team and contribute to our mission of driving innovation through data-driven insights. As the Lead Data Scientist and Machine Learning Engineer at Sykatiya Technology Pvt Ltd, you will play a crucial role in analyzing large datasets to uncover patterns, develop predictive models, and implement AI/ML solutions. Your responsibilities will include working on projects involving neural networks, deep learning, data mining, and natural language processing (NLP) to drive business value and enhance our products and services. Key Responsibilities: - Lead the design and implementation of machine learning models and algorithms to address complex business problems. - Utilize deep learning techniques to enhance neural network models and enhance prediction accuracy. - Conduct data mining and analysis to extract actionable insights from both structured and unstructured data. - Apply natural language processing (NLP) techniques for advanced text analytics. - Develop and maintain end-to-end data pipelines, ensuring data integrity and reliability. - Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions. - Mentor and guide junior data scientists and engineers in best practices and advanced techniques. - Stay updated with the latest advancements in AI/ML, neural networks, deep learning, data mining, and NLP. Technical Skills: - Proficiency in Python and its libraries such as NumPy, pandas, sci-kit-learn, TensorFlow, Keras, and PyTorch. - Strong understanding of machine learning algorithms and techniques. - Extensive experience with neural networks and deep learning frameworks. - Hands-on experience with data mining and analysis techniques. - Proficiency in natural language processing (NLP) tools and libraries like NLTK, spaCy, and transformers. - Proficiency in Big Data Technologies including Sqoop, Hadoop, HDFS, Hive, and PySpark. - Experience with Cloud Platforms such as AWS services like S3, Step Functions, EventBridge, Athena, RDS, Lambda, and Glue. - Strong knowledge of Database Management systems like SQL, Teradata, MySQL, PostgreSQL, and Snowflake. - Familiarity with Other Tools like ExactTarget, Marketo, SAP BO, Agile, and JIRA. - Strong Analytical Skills to analyze large datasets and derive actionable insights. - Excellent Problem-Solving Skills with the ability to think critically and creatively. - Effective Communication Skills and teamwork abilities to collaborate with various stakeholders. Experience: - At least 8 to 12 years of experience in a similar role.,
Posted 1 day ago
0.0 - 12.0 years
0 Lacs
karnataka
On-site
You will be working as an Application Developer at Happiest Minds with a focus on Generative AI technology. With a minimum of 4 years of experience and up to 12 years, you will be responsible for developing applications utilizing your strong programming skills in Python, .NET, or Java. For Python, you should be familiar with frameworks like FastAPI or Flask, and data libraries such as NumPy and Pandas. If you have expertise in .NET, knowledge of ASP.NET and Web API development is required. Proficiency in Java with Spring or Spring Boot is necessary for Java developers. In addition, you should have hands-on experience with at least one of the major cloud platforms and services, such as Azure (Azure App Service, Azure Functions, Azure Storage) or AWS (Elastic Beanstalk, Lambda, S3). Furthermore, you should have practical experience with various databases like Oracle, Azure SQL, SQL Server, Cosmos DB, MySQL, PostgreSQL, or MongoDB. A minimum of 3 months experience in developing Generative AI solutions using any LLMs (Large Language Models) and deploying them on cloud platforms is essential for this role. You will collaborate closely with team members and clients to understand project requirements and effectively translate them into technical solutions. Your problem-solving and analytical skills will be crucial in troubleshooting, debugging, and enhancing solutions to meet project needs effectively.,
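A minimal sketch of calling a hosted LLM from Python, as one possible shape of the Generative AI work described; it assumes the OpenAI Python SDK (v1+) with an OPENAI_API_KEY environment variable, and the model name is a placeholder rather than a prescribed choice.

```python
# Hedged sketch: a single chat-completion call to a hosted LLM.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize retrieval-augmented generation in two sentences."},
    ],
)
print(response.choices[0].message.content)
```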
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
The Machine Learning Engineer / Data Scientist (Generative AI & Data Engineering) position based in Gurgaon/NCR requires 3 to 5 years of experience in AI & Data Science Enterprise Delivery. As a Team Manager, your primary responsibility will be to manage client expectations and collaborate with stakeholders to fulfill customer requirements. Your duties will encompass various aspects of Data Science, including developing machine learning models to support recommendation systems and NLP projects. You will be tasked with providing actionable insights to optimize products and services. In Data Engineering, your role involves building and maintaining scalable ETL pipelines, optimizing data storage solutions, and ensuring data accuracy and reliability for analytics. Moreover, you will analyze complex datasets to identify trends and patterns, generating insights that drive strategic decisions and enhance client services. Collaborating with product and service teams, you will translate data needs into technical solutions. You will report to the VP Business Growth and engage with external stakeholders, primarily clients. Your technical skills should include proficiency in Python (Pandas, NumPy), SQL, and Java, along with experience in LLMs, LangChain, Generative AI technologies, ML frameworks (TensorFlow, PyTorch), and data engineering tools (Spark, Kafka). Soft skills such as the ability to work independently or in a team, excellent communication, critical thinking, problem-solving abilities, and a proactive approach in dynamic environments are essential. Academic qualifications entail a Bachelor's Degree in Computer Science, Data Analytics, Engineering, or a related field, with 3 to 5 years of experience in Data Science and Data Engineering. Proficiency in key data engineering concepts and strong communication skills are crucial for success in this role.,
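A minimal PySpark sketch of the ETL-pipeline work described above; the paths and column names are placeholders, not an actual data model.

```python
# Hedged sketch: read raw JSON events, aggregate daily counts, write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

events = spark.read.json("s3a://raw-bucket/events/")        # placeholder path
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))        # assumed timestamp column
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").parquet("s3a://curated-bucket/daily_events/")
```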
Posted 2 days ago
0.0 - 4.0 years
0 Lacs
chennai, tamil nadu
On-site
As an AI Developer at KritiLabs, you will play a crucial role in various aspects of data analysis and model development. Your responsibilities will include assisting in data analysis, supporting the design and implementation of machine learning models, conducting experiments to evaluate model performance, documenting processes and results, collaborating with cross-functional teams, and engaging in ongoing learning to deepen your understanding of AI/ML concepts and technologies.
To excel in this role, you should have a background in Computer Science, Data Science, Mathematics, Statistics, or a related field. Proficiency in programming languages such as Python or R, along with experience in data manipulation libraries like Pandas and NumPy, is essential. A basic understanding of machine learning concepts and algorithms, including regression, classification, and clustering, is required. Strong analytical skills, problem-solving abilities, attention to detail, and the capacity to work with large datasets are key attributes for success in this position. Effective communication skills, both verbal and written, are crucial for articulating complex concepts clearly, and being a team player is vital for collaborating effectively in a fast-paced environment. Any prior experience with AI/ML projects or relevant coursework will be advantageous, as will familiarity with machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn.
KritiLabs offers a dynamic, innovative, and inclusive work environment that values and celebrates individual contributions. We provide opportunities for growth, competitive benefits including health insurance and retirement plans, and a strong emphasis on maintaining a healthy work-life balance through flexible work arrangements. Join us in Chennai, where you can drive positive change and make a difference while working on cutting-edge projects that challenge conventional thinking and push the boundaries of innovation. English proficiency is mandatory, and knowledge of other languages is an added advantage.
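To give a concrete flavour of the classification work mentioned above, here is a small, hedged sketch using pandas and scikit-learn; the CSV file and column names are invented for illustration only.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, classification_report

# Hypothetical dataset with numeric features and a binary "label" column.
df = pd.read_csv("customer_data.csv")
X = df.drop(columns=["label"])
y = df["label"]

# Hold out 25% of rows for evaluation, stratified on the target.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, preds))
print(classification_report(y_test, preds))
```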
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are looking for a highly skilled and experienced Senior Python & ML Engineer with expertise in PySpark, machine learning, and large language models (LLMs). You will play a key role in designing, developing, and implementing scalable data pipelines, machine learning models, and LLM-powered applications. In this role, you will need a solid understanding of Python's ecosystem, distributed computing using PySpark, and practical experience in AI optimization.
Your responsibilities will include designing and maintaining robust data pipelines with PySpark, optimizing PySpark jobs for efficiency on large datasets, and ensuring data integrity throughout the pipeline. You will also be involved in developing, training, and deploying machine learning models using key ML libraries such as scikit-learn, TensorFlow, and PyTorch. Tasks will include feature engineering, model selection, hyperparameter tuning, and integrating ML models into production systems for scalability and reliability.
Additionally, you will research, experiment with, and integrate state-of-the-art Large Language Models (LLMs) into applications. This will involve developing solutions that leverage LLMs for tasks like natural language understanding, text generation, summarization, and question answering. You will also fine-tune pre-trained LLMs for specific business needs and datasets, and explore techniques for prompt engineering, RAG (Retrieval Augmented Generation), and LLM evaluation.
Collaboration is key in this role, as you will work closely with data scientists, engineers, and product managers to understand requirements and translate them into technical solutions. You will mentor junior team members, contribute to best practices for code quality, testing, and deployment, and stay updated on the latest advancements in Python, PySpark, ML, and LLMs. Furthermore, you will be responsible for deploying, monitoring, and maintaining models and applications in production environments using MLOps principles. Troubleshooting and resolving issues related to data pipelines, ML models, and LLM applications will also be part of your responsibilities.
To be successful in this role, you should have a Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. Strong proficiency in Python programming, PySpark, machine learning, and LLMs is essential. Experience with cloud platforms like AWS, Azure, or GCP is preferred, along with strong problem-solving, analytical, communication, and teamwork skills. Nice-to-have skills include familiarity with R and Shiny, streaming data technologies, containerization technologies, MLOps tools, graph databases, and contributions to open-source projects.
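As a sketch of the retrieval step behind the RAG work mentioned above (not a full pipeline, and with an invented document collection), the example below ranks documents against a question with TF-IDF and cosine similarity and assembles a prompt; the actual LLM generation call is deliberately left out.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy knowledge base; in practice these would be chunks from real documents.
documents = [
    "PySpark jobs can be tuned by adjusting partitioning and caching.",
    "Retrieval Augmented Generation grounds LLM answers in retrieved context.",
    "MLOps covers deployment, monitoring, and retraining of models.",
]

question = "How does RAG reduce hallucinations in LLM answers?"

# Embed documents and the question in the same TF-IDF space.
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)
question_vec = vectorizer.transform([question])

# Rank documents by cosine similarity and keep the top 2 as context.
scores = cosine_similarity(question_vec, doc_matrix).ravel()
top_k = scores.argsort()[::-1][:2]
context = "\n".join(documents[i] for i in top_k)

# The assembled prompt would then be sent to an LLM of choice.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```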
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
jaipur, rajasthan
On-site
As an AI / ML Engineer, you will be responsible for utilizing your expertise in the field of Artificial Intelligence and Machine Learning to develop innovative solutions. You should hold a Bachelor's or Master's degree in Computer Science, Engineering, Data Science, AI/ML, Mathematics, or a related field. With a minimum of 6 years of experience in AI/ML, you are expected to demonstrate proficiency in Python and various ML libraries such as scikit-learn, XGBoost, pandas, NumPy, matplotlib, and seaborn.
In this role, you will need a strong understanding of machine learning algorithms and deep learning architectures including CNNs, RNNs, and Transformers. Hands-on experience with TensorFlow, PyTorch, or Keras is essential. You should also have expertise in data preprocessing, feature selection, exploratory data analysis (EDA), and model interpretability. Additionally, familiarity with API development and deploying models using frameworks like Flask, FastAPI, or similar tools is required. Experience with MLOps tools such as MLflow, Kubeflow, DVC, and Airflow will be beneficial. Knowledge of cloud platforms like AWS (SageMaker, S3, Lambda), GCP (Vertex AI), or Azure ML is preferred. Proficiency in version control using Git, CI/CD processes, and containerization with Docker is essential for this role.
Bonus skills that would be advantageous include familiarity with NLP frameworks (e.g., spaCy, NLTK, Hugging Face Transformers), Computer Vision experience using OpenCV or YOLO/Detectron, and knowledge of Reinforcement Learning or Generative AI (GANs, LLMs). Experience with vector databases such as Pinecone or Weaviate, as well as LangChain for AI agent building, is a plus. Familiarity with data labeling platforms and annotation workflows will also be beneficial.
In addition to technical skills, you should possess soft skills such as an analytical mindset, strong problem-solving abilities, effective communication, and collaboration skills. The ability to work independently in a fast-paced, agile environment is crucial. A passion for AI/ML and a proactive approach to staying updated with the latest developments in the field are highly desirable for this role.
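To illustrate the MLOps tooling listed above, here is a minimal, hedged sketch of tracking an XGBoost training run with MLflow; the experiment name, synthetic dataset, and hyperparameters are placeholders rather than anything specific to this role.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic data stands in for a real training set.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-xgboost")  # illustrative experiment name

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 4, "learning_rate": 0.1}
    model = XGBClassifier(**params)
    model.fit(X_train, y_train)

    acc = accuracy_score(y_test, model.predict(X_test))

    # Record hyperparameters, the evaluation metric, and the fitted model.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, artifact_path="model")
```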
Posted 2 days ago