8.0 - 13.0 years
25 - 30 Lacs
Noida, Faridabad, Delhi
Work from Office
We are seeking a highly skilled Senior Data Science Consultant with 8+ years of experience to lead an internal optimization initiative. The ideal candidate should have a strong background in data science, operations research, and mathematical optimization, with a proven track record of applying these skills to solve complex business problems. This role requires a blend of technical depth, business acumen, and collaborative communication. A background in internal efficiency/operations improvement or cost/resource optimization projects is highly desirable.

Key Responsibilities: Lead and contribute to internal optimization-focused data science projects from design to deployment. Develop and implement mathematical models to optimize resource allocation, process performance, and decision-making. Use techniques such as linear programming, mixed-integer programming, and heuristic and metaheuristic algorithms. Collaborate with business stakeholders to gather requirements and translate them into data science use cases. Build robust data pipelines and use statistical and machine learning methods to drive insights. Communicate complex technical findings in a clear, concise manner to both technical and non-technical audiences. Mentor junior team members and contribute to knowledge sharing and best practices within the team.

Required Skills and Qualifications: Master's or PhD in Data Science, Computer Science, Operations Research, Applied Mathematics, or a related field. Minimum 8 years of relevant experience in data science, with a strong focus on optimization. Expertise in Python (NumPy, Pandas, SciPy, Scikit-learn), SQL, and optimization libraries such as PuLP, Pyomo, Gurobi, or CPLEX. Experience with the end-to-end lifecycle of internal optimization projects. Strong analytical and problem-solving skills. Excellent communication and stakeholder management abilities.

Preferred Qualifications: Experience working on internal company projects focused on logistics, resource planning, workforce optimization, or cost reduction. Exposure to tools/platforms like Databricks, Azure ML, or AWS SageMaker. Familiarity with dashboards and visualization tools like Power BI or Tableau. Prior experience in consulting or internal centers of excellence (CoE) is a plus.

Additional Notes: This is a contractual/project-based role and not a permanent/full-time position. Compensation will be finalized post-interview, based on mutual agreement and project scope. The project may be divided into phases, with compensation linked to successful milestone completion.

Skills: data science, communication, mentoring, optimization, metaheuristic algorithms, SQL, linear programming, data pipelines, Python, machine learning, mathematical optimization, mixed-integer programming, operations research, statistical methods, heuristic algorithms
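For illustration, the optimization stack named above (linear and mixed-integer programming via PuLP, Pyomo, Gurobi, or CPLEX) typically reduces to models like the following minimal PuLP sketch; the variables, coefficients, and constraint names are invented for this example and are not taken from the posting:

```python
from pulp import LpMaximize, LpProblem, LpStatus, LpVariable, value

# Hypothetical resource-allocation problem: choose how many units of two
# products to make, subject to machine-hour and labour-hour limits.
prob = LpProblem("resource_allocation", LpMaximize)
x = LpVariable("units_product_a", lowBound=0)
y = LpVariable("units_product_b", lowBound=0)

prob += 40 * x + 30 * y                      # objective: total margin
prob += 2 * x + 1 * y <= 100, "machine_hours"
prob += 1 * x + 3 * y <= 90, "labour_hours"

prob.solve()                                 # PuLP's bundled CBC solver by default
print(LpStatus[prob.status], value(x), value(y), value(prob.objective))
```

A real engagement would swap in integer variables, data-driven coefficients, and a commercial solver such as Gurobi or CPLEX where licensing allows.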
Posted 1 month ago
8.0 - 13.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore, KA, IN. Company: Alstom. At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. Could you be the full-time **Data Solutions Manager** in **[Insert Location]** we're looking for? Take on a new challenge and apply your **data science and technical leadership** expertise in a cutting-edge field. You'll work alongside **collaborative and innovative** teammates. You'll play a pivotal role in shaping and sustaining advanced data solutions that drive our industrial programs. Day-to-day, you'll work closely with teams across the business (**engineering, IT, and program management**), **define and develop scalable data solutions**, and much more. You'll specifically take care of **designing production-grade, cyber-secure data solutions**, but also **applying AI techniques to enhance data utilization for key indicators**.

We'll look to you for:
- Managing the team to ensure technical excellence and process adherence
- Designing scalable, multi-tenant data collectors and storage systems
- Building streaming and batch data processing pipelines
- Developing SQL and NoSQL data models
- Assessing and enhancing the quality of incoming data flows
- Applying advanced AI techniques and data management/security components
- Creating customizable analytical dashboards
- Evaluating opportunities presented by emerging technologies
- Implementing strong testing and quality assurance practices

All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Engineering degree or equivalent
- 8+ years of experience in IT, digital companies, software, or startups
- Proficiency in data processing and software development using tools like QlikSense, PowerApps, Power BI, or Java/Scala
- Experience with Apache Spark and other data processing frameworks
- Strong statistical skills (e.g., probability theories, regression, hypothesis testing)
- Expertise in machine learning techniques and algorithms (e.g., Logistic Regression, Decision Trees, Clustering)
- Proficiency in data science methods (CRISP-DM, feature engineering, model evaluation)
- Experience with Python and R libraries (NumPy, Pandas, Scikit)
- Deep knowledge of SQL database configuration (e.g., Postgres, MariaDB, MySQL)
- Familiarity with DevOps tools (e.g., Docker, Ansible) and version control (e.g., Git)
- Knowledge of cloud platforms (Azure, AWS, GCP) is a plus
- Understanding of network and security principles (e.g., SSL, certificates, IPSEC)
- Fluent in English; French is a plus

Things you'll enjoy: Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career.
You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with cutting-edge security standards for rail data solutions
- Collaborate with transverse teams and supportive colleagues
- Contribute to innovative and impactful projects
- Utilise our **flexible and inclusive** working environment
- Steer your career in whatever direction you choose across functions and countries
- Benefit from our investment in your development through award-winning learning opportunities
- Progress towards leadership roles in data science and digital transformation
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you! As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 1 month ago
7.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Date: 10 Jun 2025. Location: Bangalore, KA, IN. Company: Alstom. At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

Your future role: Take on a new challenge and apply your deep learning and AI expertise in a new cutting-edge field. You'll work alongside innovative, analytical, and collaborative teammates. You'll spearhead the development and deployment of advanced generative AI models, impacting the way we approach transport solutions. Day-to-day, you'll work closely with teams across the business (data scientists, software engineers, project managers), architecting scalable machine learning solutions and much more. You'll specifically take care of leading research initiatives and optimizing AI model efficiency, but also fostering a culture of continuous improvement and innovation.

We'll look to you for:
- 7-10 years of experience in machine learning and deep learning
- End-to-end architecture design for machine learning solutions
- Deployment of AI models into scalable services
- Staying at the forefront of generative AI advancements
- Optimizing models for performance and scalability
- Collaborating with cross-functional teams for successful project delivery

All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Degree in Computer Science, Information Technology, Electrical Engineering, or a related field
- Experience or understanding of machine learning, deep learning, generative AI, and MLOps
- Knowledge of deep learning techniques and frameworks (TensorFlow, PyTorch)
- Familiarity with cloud platforms, preferably Microsoft Azure
- A certification in AI or machine learning is advantageous
- Strong analytical and problem-solving skills
- Excellent communication and team collaboration abilities

Things you'll enjoy: Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also:
- Enjoy stability, challenges and a long-term career free from boring daily routines
- Work with new security standards for rail signalling
- Collaborate with transverse teams and helpful colleagues
- Contribute to innovative projects
- Utilise our flexible working environment
- Steer your career in whatever direction you choose across functions and countries
- Benefit from our investment in your development, through award-winning learning
- Progress towards leadership and advanced technical roles
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you! Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 1 month ago
0.0 - 1.0 years
0 - 1 Lacs
Madurai
Work from Office
Join PiBi Technologies as an AI Product Intern. Work on real-time projects in AI, UI/UX, full stack, APIs, and cloud. Gain hands-on experience in building scalable tech products with expert guidance.
Posted 1 month ago
3.0 - 5.0 years
84 - 108 Lacs
Hyderabad
Work from Office
Responsibilities: * Develop predictive models using Python, SQL, NumPy, Pandas & ML algorithms * Collaborate with cross-functional teams on data analysis projects * Optimize database queries for performance improvement. Office cab/shuttle.
Posted 1 month ago
12.0 - 16.0 years
30 - 37 Lacs
Chennai
Work from Office
Understand business objectives and translate them into data and machine learning requirements. Identify what data is needed to support specific AI use cases and assess its availability, quality and structure. Explore and profile data from various sources to evaluate its suitability and uncover biases, gaps or integration challenges. Visualise data and analytical results using tools such as Power BI, with the ability to create insightful dashboards for both technical and non-technical audiences. Collaborate with data owners and engineers to locate, access and prepare data for experimentation and model development. Work closely with product teams to define success metrics and ensure data can support reliable measurement and evaluation. Support the design of data flows and contribute to ingestion and preparation strategies that align with modelling needs. Translate insights from data exploration and model experimentation into clear, actionable recommendations. Contribute to best practices related to data validation, model documentation and governance of analytical workflows. Collaborate with ML engineers and AI architects to assess modelling options and design evaluation strategies. Stay up to date with developments in AI and machine learning and actively share insights with the wider team.

Qualification - your starting point for constant growth: From the moment you join Ramboll, we will support your personal and professional development so that you grow with the company. For this role, we believe your starting point is: A master's degree in data science, computer science, statistics, applied mathematics or a related field. Strong experience in applied data science or machine learning, with an emphasis on data readiness and problem framing. Proficiency in Python and experience with common data science libraries (e.g., pandas, scikit-learn, TensorFlow or PyTorch). Demonstrated ability to explore, profile and assess complex data sets to determine their value for ML solutions. Experience working with business stakeholders to translate objectives into data needs and analytic plans. Understanding of how data flows from source systems into models, and experience partnering with data engineers to build effective pipelines. Excellent communication skills, with the ability to explain and visualize data-related concepts to non-technical audiences. Fluency in English.

Additional Information. We offer: Competitive salary and benefits package. Opportunities for professional growth and development. Flexible work environment and work-life balance. A multi-cultural work environment.

Personal qualities that will help you succeed in this role include: Ability to collect and analyse data, establish facts and make recommendations in written and verbal form. Ability to explain and translate complex, technical IT requirements into material understandable for non-IT staff. Ability to liaise with all parts of an organisation, including IT and business stakeholders. Open-minded and helpful and, finally, a good portion of self-awareness and humour.
Posted 1 month ago
6.0 - 10.0 years
25 - 35 Lacs
Gurugram
Remote
Hi, with reference to your profile on the job portal, we would like to share an opportunity with you for one of our Gurgaon-based clients, for the Gurgaon location. Please find the details below:

Location: Remote/WFH. Experience: 6-10 years. Title: Manager - Data Engineer (Web Scraping). Notice Period: Only immediate joiners - 30 days max.

Job Responsibilities. Technical Skills Required: Proficiency in Python and SQL/database skills is required. Must have strong expertise in using the Pandas library (Python). Experience with web technologies (HTML/JS, APIs, etc.) is essential. Should have a good understanding of tools such as Scrapy, BeautifulSoup, and Selenium. Responsible for reviewing and approving pull requests to ensure clean, maintainable, and efficient code. Experience building scalable scraping solutions for large-scale data collection. Familiarity with AWS technologies like S3, RDS, SNS, SQS, Lambda, and others is necessary.

Qualifications: Bachelor's/Master's degree in computer science or any related field.

Role Summary: Leading and mentoring a team of seasoned data engineers performing web scraping using various scraping techniques, then utilizing Python's Pandas library for data cleaning and manipulation, then ingesting the data into a database/warehouse and scheduling the scrapers using Airflow or other tools.

Role Overview: The Web Scraping Team is seeking creative and detail-oriented leaders to contribute to client projects and lead by example. This team develops essential applications, datasets, and alerts that directly support client investment decisions. Our focus is to maintain operational excellence by providing high-quality proprietary datasets, timely notifications, and exceptional service. The ideal candidate will be self-motivated, self-sufficient, and possess a passion for tinkering and a love for automation.

If you are interested in this opportunity, please revert with your updated profile ASAP to sachin@vanassociates.com. Note: Do not change the subject line while reverting.
1. Total Exp:
2. Relevant experience in Python, Pandas, Data Cleansing, Data Transformation, Team Management:
3. Current CTC:
4. Expected CTC:
5. Official Notice Period:
6. Ready to work in Gurgaon:
7. Availability for MS Teams interviews on weekdays:
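As a rough illustration of the scrape-clean-ingest flow the role summary describes, here is a minimal sketch using requests, BeautifulSoup, and Pandas; the URL, CSS selector, column names, and output file are placeholders, not details from the client project:

```python
import requests
from bs4 import BeautifulSoup
import pandas as pd

# Fetch a listings page (hypothetical URL) and parse it.
resp = requests.get("https://example.com/listings", timeout=30)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Extract one record per link matched by a hypothetical CSS selector.
rows = [
    {"title": a.get_text(strip=True), "href": a.get("href")}
    for a in soup.select("a.listing")
]

# Clean with Pandas before handing off to the ingestion/scheduling layer.
df = pd.DataFrame(rows).dropna().drop_duplicates(subset="href")
df.to_csv("listings.csv", index=False)
```

In the setup the posting sketches, a job like this would be wrapped in an Airflow task and the cleaned frame written to the warehouse instead of a CSV.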
Posted 1 month ago
7.0 - 10.0 years
45 - 50 Lacs
Noida, Kolkata, Chennai
Work from Office
Dear Candidate, We are hiring a Python Developer to build scalable backend systems, data pipelines, and automation tools. This role requires strong expertise in Python frameworks and a deep understanding of software engineering principles. Key Responsibilities: Develop backend services, APIs, and automation scripts using Python. Work with frameworks like Django, Flask, or FastAPI. Collaborate with DevOps and data teams for end-to-end solution delivery. Write clean, testable, and efficient code. Troubleshoot and debug applications in production environments. Required Skills & Qualifications: Proficient in Python 3.x , OOP, and design patterns Experience with Django, Flask, FastAPI, Celery Knowledge of REST APIs, SQL/NoSQL databases (PostgreSQL, MongoDB) Familiar with Docker, Git, CI/CD, and cloud platforms (AWS/GCP/Azure) Experience in data processing, scripting, or automation is a plus Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
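For a concrete flavour of the framework work listed above, here is a minimal FastAPI sketch of the kind of REST endpoint such a role involves; the Item model, routes, and in-memory store are invented for illustration and are not part of this posting:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

ITEMS: dict[int, Item] = {}  # in-memory store, stand-in for PostgreSQL/MongoDB

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item):
    ITEMS[item_id] = item
    return {"item_id": item_id, "item": item}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]
```

Assuming the file is saved as main.py, it can be served with `uvicorn main:app --reload`; the same routing pattern carries over to Flask or Django REST work.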
Posted 1 month ago
6.0 - 11.0 years
15 - 20 Lacs
Pune
Hybrid
Role & responsibilities: B.Tech or M.Tech in Computer Science, or equivalent experience. 5+ years of experience working professionally as a Python software developer. Organized, self-directed, and resourceful. Excellent written and verbal communication skills. Expert in Python and Pandas. Experience in building data pipelines, ETL and ELT processes. Advanced SQL experience with relational databases, including query authoring and working familiarity with a variety of databases. Understanding of Docker and data orchestration tools. Experience with Jupyter notebooks.
Posted 1 month ago
4.0 - 5.0 years
2 - 5 Lacs
Mumbai
Work from Office
About The Role: New or Never Normal. Job Title: Internal Audit Team Member. Corporate Grade: CA freshers.

Kotak Overview: Kotak Mahindra Bank Limited is an Indian banking and financial services company headquartered in Mumbai, India. It offers banking products and financial services for corporate and retail customers in the areas of personal finance, investment banking, life insurance, and wealth management. At Kotak, we expect more from ourselves than what anyone else expects of us. This way, we are creating a rewarding and delightful experience every day for our customers. We are an equal opportunity employer and we are opposed to discrimination on any grounds.

Overall purpose of role: As a Kotak Internal Audit Member, you will be a part of the Kotak Internal Audit Department (IAD), which aims to provide independent, reliable, valued, insightful and timely assurance to the Board and executive management, thus demonstrating the role of an ENABLER. This is achieved by looking at the effectiveness of the governance and risk/control framework over current and evolving risks, within the current and expected business environment and in accordance with the international standards definition of internal auditing. Kotak IAD is a 200+ member department spread across 5 locations (Mumbai, Ahmedabad, Bangalore, Chennai and Delhi) which is set up to perform internal audits for KMBL (Kotak Mahindra Bank Limited), the flagship company of the Kotak group. A one-stop shop. 4 Strategic Business Units: Consumer Banking, Corporate Banking (Wholesale Banking), Commercial Banking, Treasury.

Key Accountabilities: Executing the delivery of Kotak IAD's audit plan for the calendar year.
- Assist the Team Supervisor / Team Lead on assigned audit work of KMBL businesses and processes. This will entail working on the audit to deliver the Audit Planning Memo and Controls Document, agree issues and action plans with management, and submit the draft report to the Team Supervisor.
- Demonstrate sound knowledge of both business/technical areas and expert knowledge of the audit process, including the IA systems, to ensure that audit work is carried out to a high standard that meets all methodologies.
- Demonstrate sound knowledge of regulations governing the bank (RBI, IRDA, SEBI, FEMA, FIMMDA, FEDAI, FATCA, NDS-OM, etc.)
- Conduct reviews to understand end-to-end processes, evaluate and highlight key control deficiencies, analyse root causes and discuss/agree with management on effective and timely remediation plans.
- Manage individual workflow to ensure timely report deliveries with quality articulation of audit queries.
- Write high quality audit reports. Embrace and demonstrate effective audit report writing and presentation skills from an end-to-end perspective (planning, execution, discussion, clarification, finalisation, documentation into the system, etc.)
- Endeavour to keep awareness of risk issues and changes across relevant business units up to date and use this knowledge to update the audit approach.
- Drive and lead discussions with relevant stakeholders regarding audit observations, consulting with IAD team supervisors / Team Leads.

Decision-making and Problem Solving: Take into account the reputation of Kotak at all times, through positive interactions and adhering to policies, procedures and manuals. Set an example and support fair and ethical behaviour. Make sure you are equipped to be able to protect our reputation at all times. Challenge others where appropriate, if you believe yourself to be correct.
Endeavour to take part in decision-making on a broad range of factors, with Kotak's values at heart - Always Responsible, Always Accountable.

Risk and Control Objective [This section is mandatory for all role profiles]: Ensure that all activities and duties are carried out in full compliance with regulatory requirements, the Enterprise Wide Risk Management Framework and Kotak's internal Policies and Standards.

Person Specification: Personal attributes essential to performing the role, including competencies, expertise, knowledge, and experience. Note: experience requirements must not be in the form of years (minimum or otherwise).

Essential Skills / Basic & Preferred Qualifications: Understanding of the banking business, its products and processes.
- Person who can work alone, close audits and manage stakeholders.
- Relevant professional qualifications, e.g. CA/CMA, CISA, CIA (Bachelor's degree mandatory).
- A Bachelor's degree in Commerce (preferably), with at least 3-5 years of experience in internal audit of a financial institution and/or relevant experience in the following areas:
- Experience in BFSI or Finance processes, including audit. Understanding of the relevant regulatory environment would be an added advantage.
- Experience in data analytics and/or exposure to data science/machine learning techniques would be an added advantage.
Posted 1 month ago
4.0 - 5.0 years
2 - 5 Lacs
Chennai
Work from Office
About The Role: New or Never Normal. Job Title: Internal Audit Team Member. Corporate Grade: CA freshers.

Kotak Overview: Kotak Mahindra Bank Limited is an Indian banking and financial services company headquartered in Mumbai, India. It offers banking products and financial services for corporate and retail customers in the areas of personal finance, investment banking, life insurance, and wealth management. At Kotak, we expect more from ourselves than what anyone else expects of us. This way, we are creating a rewarding and delightful experience every day for our customers. We are an equal opportunity employer and we are opposed to discrimination on any grounds.

Overall purpose of role: As a Kotak Internal Audit Member, you will be a part of the Kotak Internal Audit Department (IAD), which aims to provide independent, reliable, valued, insightful and timely assurance to the Board and executive management, thus demonstrating the role of an ENABLER. This is achieved by looking at the effectiveness of the governance and risk/control framework over current and evolving risks, within the current and expected business environment and in accordance with the international standards definition of internal auditing. Kotak IAD is a 200+ member department spread across 5 locations (Mumbai, Ahmedabad, Bangalore, Chennai and Delhi) which is set up to perform internal audits for KMBL (Kotak Mahindra Bank Limited), the flagship company of the Kotak group. A one-stop shop. 4 Strategic Business Units: Consumer Banking, Corporate Banking (Wholesale Banking), Commercial Banking, Treasury.

Key Accountabilities: Executing the delivery of Kotak IAD's audit plan for the calendar year.
- Assist the Team Supervisor / Team Lead on assigned audit work of KMBL businesses and processes. This will entail working on the audit to deliver the Audit Planning Memo and Controls Document, agree issues and action plans with management, and submit the draft report to the Team Supervisor.
- Demonstrate sound knowledge of both business/technical areas and expert knowledge of the audit process, including the IA systems, to ensure that audit work is carried out to a high standard that meets all methodologies.
- Demonstrate sound knowledge of regulations governing the bank (RBI, IRDA, SEBI, FEMA, FIMMDA, FEDAI, FATCA, NDS-OM, etc.)
- Conduct reviews to understand end-to-end processes, evaluate and highlight key control deficiencies, analyse root causes and discuss/agree with management on effective and timely remediation plans.
- Manage individual workflow to ensure timely report deliveries with quality articulation of audit queries.
- Write high quality audit reports. Embrace and demonstrate effective audit report writing and presentation skills from an end-to-end perspective (planning, execution, discussion, clarification, finalisation, documentation into the system, etc.)
- Endeavour to keep awareness of risk issues and changes across relevant business units up to date and use this knowledge to update the audit approach.
- Drive and lead discussions with relevant stakeholders regarding audit observations, consulting with IAD team supervisors / Team Leads.

Decision-making and Problem Solving: Take into account the reputation of Kotak at all times, through positive interactions and adhering to policies, procedures and manuals. Set an example and support fair and ethical behaviour. Make sure you are equipped to be able to protect our reputation at all times. Challenge others where appropriate, if you believe yourself to be correct.
Endeavour to take part in decision-making on a broad range of factors, with Kotak's values at heart - Always Responsible, Always Accountable.

Risk and Control Objective [This section is mandatory for all role profiles]: Ensure that all activities and duties are carried out in full compliance with regulatory requirements, the Enterprise Wide Risk Management Framework and Kotak's internal Policies and Standards.

Person Specification: Personal attributes essential to performing the role, including competencies, expertise, knowledge, and experience. Note: experience requirements must not be in the form of years (minimum or otherwise).

Essential Skills / Basic & Preferred Qualifications: Understanding of the banking business, its products and processes.
- Person who can work alone, close audits and manage stakeholders.
- Relevant professional qualifications, e.g. CA/CMA, CISA, CIA (Bachelor's degree mandatory).
- A Bachelor's degree in Commerce (preferably), with at least 3-5 years of experience in internal audit of a financial institution and/or relevant experience in the following areas:
- Experience in BFSI or Finance processes, including audit. Understanding of the relevant regulatory environment would be an added advantage.
- Experience in data analytics and/or exposure to data science/machine learning techniques would be an added advantage.
Posted 1 month ago
2.0 - 3.0 years
4 - 5 Lacs
Mumbai
Work from Office
Job role / Job requirements / Personality traits: To work on heavy data extraction, generation of reports, visualising data and putting analysis in the discussed format. SQL and advanced Python knowledge with a proven track record in Python scripting using NumPy, Pandas, Seaborn, Matplotlib, scikit-learn, etc. Candidate must have excellent organizational skills, be level-headed, and have good interpersonal skills. Model building such as classification, regression, recommendation systems, etc. SAS EG/E-Miner knowledge is a plus. Numerical skills - will need to acquire data and use it to target selected groups, as well as analyse the success or otherwise of campaigns. The role holder will be responsible for data support, segmentation/scorecards, analytics and customer insights. Past experience with deployment on cloud systems and API integration is preferred. At least 2-3 years of experience in data handling roles. Must be excellent with data visualisation. He/she should be comfortable working with heavy datasets and should have an analytical bent of mind. Should have an active interest in data analysis. Highly logical individuals with an adaptable attitude towards new technologies. Adhere to documented process and compliance requirements, assisting seniors on diversified projects. Motivated and determined candidate with a zeal to handle responsibilities and take initiative. Strong communication skills for effective interaction with business stakeholders.
Posted 1 month ago
4.0 - 5.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Dear Candidates, Greetings!! We are hiring for one of the globalized motor vehicle manufacturing MNCs. Job Type: FTE. Job Role: AI Engineer. Experience: 4 to 5 years. Location: Bangalore. Work Mode: Work From Office. Notice Period: Serving to 30 days. Budget: As per market standards. Mandatory Skills: AI solutions, machine learning, data forecasting, PySpark, Databricks, Linux, Azure, experience in building agentic AI solutions, familiarity with Python and Pandas. Interested candidates can share their updated resume at Gurpreet@selectiveglobalsearch.com
Posted 1 month ago
10.0 - 15.0 years
12 - 18 Lacs
Pune
Work from Office
Responsibilities: * Design and deliver corporate training programs using Python * Ensure proficiency in Python, PySpark, data structures, NumPy, Pandas, AWS, Azure, GCP cloud, data visualization, and Big Data tools * Experience in core Python skills. Food allowance. Travel allowance. House rent allowance.
Posted 1 month ago
7.0 - 9.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Scale an existing RAG code base for a production-grade AI application.

Requirements: Proficiency in prompt engineering, LLMs, and Retrieval Augmented Generation. Programming languages like Python or Java. Experience with vector databases. Experience using LLMs in software applications, including prompting, calling, and processing outputs. Experience with AI frameworks such as LangChain. Troubleshooting skills and creativity in finding new ways to leverage LLMs. Experience with Azure.

Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help in showcasing the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs.

Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides. Contribute to internal knowledge sharing initiatives and mentor new team members.

Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Experience in Python and PySpark will be an added advantage.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras or Hugging Face. Understanding of the usage of libraries such as SciKit Learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms (e.g. Kubernetes, AWS, Azure, GCP) and related services is a plus. Experience and working knowledge in COBOL and Java would be preferred. Experience in code generation, code matching and code translation. Prepare effort estimates, WBS, staffing plan, RACI, RAID, etc. Excellent interpersonal and communication skills. Engage with stakeholders for analysis and implementation. Commitment to continuous learning and staying updated with advancements in the field of AI. Demonstrate a growth mindset to understand clients' business processes and challenges.

Preferred technical and professional experience: Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science or a related field. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
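To make the retrieval-augmented-generation loop above concrete, here is a minimal, framework-agnostic sketch of retrieval plus prompt assembly; embed() and llm() stand in for whatever embedding model and LLM client the existing code base actually uses, so treat this as an assumption-laden outline rather than the project's implementation:

```python
import numpy as np

def cosine_top_k(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 3) -> np.ndarray:
    # Rank documents by cosine similarity to the query embedding.
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    return np.argsort(scores)[::-1][:k]

def answer(question: str, docs: list[str], embed, llm) -> str:
    # embed() returns a 1-D vector for a text; llm() returns a completion for a prompt.
    # In production these would be a vector database lookup and a hosted model call.
    doc_vecs = np.vstack([embed(d) for d in docs])
    idx = cosine_top_k(embed(question), doc_vecs)
    context = "\n\n".join(docs[i] for i in idx)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return llm(prompt)
```

Scaling this to production is mostly about replacing the in-memory similarity search with a real vector database and adding batching, caching, and evaluation around the llm() call.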
Posted 1 month ago
5.0 - 7.0 years
8 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services and network engineering. Good to have detection and prevention tools for Company products and Platform and customer-facing
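As a flavour of the PySpark work this role centres on, a minimal batch ETL pipeline might look like the sketch below; the S3 paths, column names, and aggregation are placeholders chosen for illustration, not details from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw CSV, enforce types, and drop incomplete records (hypothetical input path).
orders = (
    spark.read.option("header", True).csv("s3://raw-bucket/orders/")
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "order_date", "amount"])
)

# Aggregate to daily revenue and write partitioned Parquet for downstream consumers.
daily = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/daily_revenue/"
)
```

The same read-transform-write shape carries over to streaming sources such as Kafka, with the batch reader swapped for Structured Streaming.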
Posted 1 month ago
5.0 - 7.0 years
8 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services and network engineering. Good to have detection and prevention tools for Company products and Platform and customer-facing
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
As an Associate Data Scientist at IBM, you will work to solve business problems using leading-edge and open-source tools such as Python, R, and TensorFlow, combined with IBM tools and our AI application suites. You will prepare, analyze, and understand data to deliver insight, predict emerging trends, and provide recommendations to stakeholders. In your role, you may be responsible for: Implementing and validating predictive and prescriptive models and creating and maintaining statistical models with a focus on big data, incorporating machine learning techniques in your projects. Writing programs to cleanse and integrate data in an efficient and reusable manner. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Communicating with internal and external clients to understand and define business needs and appropriate modelling techniques to provide analytical solutions. Evaluating modelling results and communicating the results to technical and non-technical audiences.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Proof of Concept (POC) development: develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help in showcasing the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs. Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides.

Preferred technical and professional experience: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras or Hugging Face. Understanding of the usage of libraries such as SciKit Learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms. Experience and working knowledge in COBOL and Java would be preferred.
Posted 1 month ago
4.0 - 7.0 years
10 - 15 Lacs
Noida
Work from Office
Job Title: Python Developer (4 Years Experience). Location: Noida. Job Type: Full-time. Experience Level: Mid-Level (4+ years). Salary: [Optional]

About the Role: We are looking for a skilled and enthusiastic Python Developer with around 4 years of professional experience to join our growing tech team. The ideal candidate will have a strong foundation in Python and hands-on experience in building scalable, maintainable, and efficient backend systems or applications.

Key Responsibilities: • Design, develop, and maintain robust Python-based applications. • Write reusable, testable, and efficient code. • Collaborate with cross-functional teams including front-end developers, DevOps, and product managers. • Integrate user-facing elements with server-side logic. • Implement data storage solutions (e.g., PostgreSQL, MongoDB, Redis). • Develop RESTful APIs or microservices architecture. • Participate in code reviews, architecture discussions, and continuous improvement initiatives. • Write unit and integration tests to ensure code quality. • Troubleshoot and debug applications.

Required Skills & Qualifications: • 4+ years of experience in Python development. • Strong understanding of Python frameworks like Django, Flask, or FastAPI. • Good knowledge of SQL and NoSQL databases. • Experience working with APIs and third-party integrations. • Familiarity with version control systems like Git. • Understanding of software development best practices, including Agile/Scrum. • Exposure to containerization tools like Docker (preferred). • Experience with CI/CD pipelines is a plus. • Good problem-solving and communication skills.

Nice to Have: • Knowledge of front-end technologies like HTML, CSS, and JavaScript. • Experience with cloud platforms like AWS, GCP, or Azure. • Familiarity with asynchronous programming, Celery, or message brokers like RabbitMQ/Kafka. • Basic understanding of data structures and algorithms.

Education: • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
An AI Data Scientist at IBM is not just a job title - it's a mindset. You'll leverage the watsonx, AWS SageMaker, and Azure OpenAI platforms to co-create AI value with clients, focusing on technology patterns to enhance repeatability and delight clients. We are seeking an experienced and innovative AI Data Scientist specialized in foundation models and large language models. In this role, you will be responsible for architecting and delivering AI solutions using cutting-edge technologies, with a strong focus on foundation models and large language models. You will work closely with customers, product managers, and development teams to understand business requirements and design custom AI solutions that address complex challenges. Experience with tools like GitHub Copilot, Amazon CodeWhisperer, etc. is desirable. Success is our passion, and your accomplishments will reflect this, driving your career forward, propelling your team to success, and helping our clients to thrive.

Day-to-Day Duties: Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help in showcasing the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs. Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides. Contribute to internal knowledge sharing initiatives and mentor new team members. Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras or Hugging Face. Understanding of the usage of libraries such as SciKit Learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms (e.g. Kubernetes, AWS, Azure, GCP) and related services is a plus. Experience and working knowledge in COBOL and Java would be preferred. Having experience in code generation, code matching and code translation leveraging LLM capabilities would be a big plus (e.g. Amazon CodeWhisperer, GitHub Copilot, etc.). Soft skills: Excellent interpersonal and communication skills. Engage with stakeholders for analysis and implementation. Commitment to continuous learning and staying updated with advancements in the field of AI. Growth mindset: Demonstrate a growth mindset to understand clients' business processes and challenges. Experience in Python and PySpark will be an added advantage.

Preferred technical and professional experience: Proven experience in designing and delivering AI solutions, with a focus on foundation models, large language models, exposure to open source, or similar technologies. Experience in natural language processing (NLP) and text analytics is highly desirable. Understanding of machine learning and deep learning algorithms.
Strong track record in scientific publications or open-source communities. Experience in the full AI project lifecycle, from research and prototyping to deployment in production environments.
Posted 1 month ago
2.0 - 4.0 years
4 - 6 Lacs
Mumbai
Work from Office
Responsibilities: Manipulate and preprocess structured and unstructured data to prepare datasets for analysis and model training. Utilize Python libraries like PyTorch, Pandas, and NumPy for data analysis, model development, and implementation. Fine-tune large language models (LLMs) to meet specific use cases and enterprise requirements. Collaborate with cross-functional teams to experiment with AI/ML models and iterate quickly on prototypes. Optimize workflows to ensure fast experimentation and deployment of models to production environments. Implement containerization and basic Docker workflows to streamline deployment processes. Write clean, efficient, and production-ready Python code for scalable AI solutions.

Good to Have: Exposure to cloud platforms like AWS, Azure, or GCP. Knowledge of MLOps principles and tools. Basic understanding of enterprise Knowledge Management Systems. Ability to work against tight deadlines. Ability to work on unstructured projects independently. Strong initiative and self-motivation. Strong communication and collaboration acumen.

Required Skills: Proficiency in Python with strong skills in libraries like PyTorch, Pandas, and NumPy. Experience in handling both structured and unstructured datasets. Familiarity with fine-tuning LLMs and understanding of modern NLP techniques. Basics of Docker and containerization principles. Demonstrated ability to experiment, iterate, and deploy code rapidly in a production setting. Strong problem-solving mindset with attention to detail. Ability to learn and adapt quickly in a fast-paced, dynamic environment.
Posted 1 month ago
3.0 - 5.0 years
11 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are looking for a highly skilled and analytical Data Scientist to join our team. You will be responsible for analyzing large datasets to extract actionable insights, build predictive models, and support data-driven decision-making across the organization. The ideal candidate should be passionate about data, statistical analysis, and applying machine learning techniques to solve real-world business problems.

Key Responsibilities: Collect, clean, and preprocess large datasets from multiple sources. Analyze complex data sets to identify patterns, trends, and opportunities. Design and implement machine learning models to solve business challenges. Communicate findings through data visualizations, dashboards, and reports. Collaborate with cross-functional teams (engineering, product, marketing) to implement data solutions. Monitor and maintain model performance and retrain when necessary. Stay updated on the latest developments in AI/ML and data science tools.

Required Skills and Qualifications: Bachelor's/Master's/Ph.D. in Computer Science, Statistics, Mathematics, Data Science, or a related field. Strong proficiency in Python or R, along with libraries such as Pandas, NumPy, Scikit-learn, TensorFlow, or PyTorch. Experience with data querying languages such as SQL. Good understanding of statistical analysis, machine learning, and data mining techniques.

Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
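By way of illustration, the predictive-modelling loop this role describes often reduces to a pipeline like the sketch below; the dataset, target column, and model choice are hypothetical and not taken from the posting:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Load a hypothetical customer dataset with a binary "churned" target column.
df = pd.read_csv("customers.csv")
X = df.drop(columns=["churned"])
y = df["churned"]

# Hold out a test split, fit a baseline model, and report precision/recall.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

In practice the same skeleton gets extended with feature engineering, cross-validation, and monitoring so the model can be retrained when performance drifts, as the responsibilities above note.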
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
Skills: SDLC, PL/SQL, Solution Architecture, Java, Unix, .NET, SOA, Microsoft SQL Server, SQL, ASP.NET. Education Qualification: No data available. Certification: No data available. Role: Data Scientist. Educational Qualification: ME/BE/MCA. Experience Required: 4+ years. Shifts: Day shift.

Skills & Responsibilities:
Experience: 4+ years in machine learning, deep learning, or generative AI.
Programming & Frameworks: Strong Python scripting with Pandas, NumPy, and OOP concepts. Experience with PyTorch, TensorFlow, Keras, Hugging Face Transformers. Proficiency in SQL queries when needed.
Generative AI & NLP: Experience with LLMs, GANs, VAEs, Diffusion Models. Familiarity with OpenAI GPT, DALL-E, Stable Diffusion. Deep knowledge of NLP techniques and deep learning architectures (RNN, CNN, LSTM, GRU).
Machine Learning & Statistics: Understanding of ML/DL algorithms, statistical analysis, and feature engineering. Theoretical knowledge of Random Forest, SVM, Boosting, Bagging, Regression (Linear & Logistic), and Unsupervised Learning.
MLOps & Deployment: Familiarity with cloud platforms (AWS, Azure, GCP). Experience in MLOps, CI/CD, Docker, Kubernetes. Comfortable with Linux systems and GPU-based deep learning.
Research & Ethics: Contributions to AI research, open-source projects, or Kaggle competitions. Awareness of AI ethics, bias mitigation, and model interpretability.
Soft Skills & Work Environment: Ability to work independently and deliver results. Experience in an agile development environment. Knowledge of computer vision is a plus.
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Gurugram
Work from Office
Company: Mercer. Description:

About the Role: We are based in Gurgaon and are looking for a Senior Computer Vision Engineer to join our team and help us improve and create new technologies. You'll work on projects that make online assessment more secure and cheating-proof. If you're a seasoned computer vision expert with a passion for innovation and a track record of delivering impactful solutions, we would be happy to meet you.

Role: Senior Computer Vision Engineer. Functional Area: AI. Educational Qualification: BTech/MS/MTech/PhD in Computer Science/Computer Vision/Signal Processing/Deep Learning or equivalent. Should have worked in an academic or professional setting in the field of computer vision/signal processing. Experience: 2-5 years. Location: Gurgaon.

Key Responsibilities: Develop and optimize advanced computer vision algorithms for image and video analysis tasks. Design, implement and train deep learning models for object detection, face processing, activity recognition and other related tasks. Test and refine models and systems based on real-world data and feedback. Evaluate project requirements, plan and manage the roadmap of a project. Present findings and insights in a clear and concise manner to stakeholders. Collaborate and help to integrate and deploy computer vision systems into the broader product architecture. Conduct research to stay updated on emerging computer vision technologies and trends. Automate data preprocessing and annotation processes to streamline workflow efficiency. Maintain comprehensive documentation for algorithms, implementations, and evaluations. Mentor junior engineers and provide strategic guidance on project development.

Required skills: Proficiency in Python; knowledge of C++, Java and JS is a plus. Solid understanding of neural networks, especially convolutional neural networks (CNNs). Knowledge of R-CNNs and vision transformers. Proficient in understanding, designing and implementing deep learning models using frameworks such as TensorFlow, PyTorch and Keras. Understanding of fundamental image processing techniques like image filtering, edge detection, image segmentation and image augmentation. Experience in evaluating computer vision models using relevant metrics and performance indicators. Familiarity with GPU and related technologies used for improved computational efficiency, such as CUDA, cuDNN, TensorRT, etc. Familiarity with Python libraries such as OpenCV, NumPy, Pandas and scikit-learn. Basic knowledge of linear algebra, calculus, and statistics. Strong critical thinking, analytical, and problem-solving skills. Self-motivated, quick learner and strong team player with the ability to work with minimal supervision.

Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work and enhance health and retirement outcomes for their people. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment.
We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
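For a sense of the deep learning side of this role, a toy PyTorch CNN for frame classification is sketched below; the input size, channel counts, and number of classes are arbitrary choices for illustration, not values from the project:

```python
import torch
from torch import nn

class SmallCNN(nn.Module):
    # Toy classifier for 3-channel 64x64 frames; layer sizes are illustrative only.
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # 64 -> 32 -> 16 spatially

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Shape check on a dummy batch of four frames.
logits = SmallCNN()(torch.randn(4, 3, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```

Production work of the kind described above would layer detection or face-processing heads, OpenCV preprocessing, and GPU inference (CUDA/TensorRT) on top of this basic pattern.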
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role: AI / ML Engineer. Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing. Must have skills: Large Language Models. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an AI / ML Engineer, you will engage in the development of applications and systems that leverage artificial intelligence tools and cloud AI services. Your typical day will involve designing and implementing production-ready solutions, ensuring that they meet quality standards. You will work with various AI models, including generative AI, and may also explore deep learning, neural networks, chatbots, and image processing technologies. Collaboration with cross-functional teams will be essential to integrate these solutions effectively into existing systems and workflows.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve existing AI models and systems to ensure optimal performance.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Large Language Models.
- Good To Have Skills: Experience with cloud-based AI services.
- Strong understanding of deep learning frameworks such as TensorFlow or PyTorch.
- Familiarity with natural language processing techniques and tools.
- Experience in developing and deploying chatbots and conversational agents.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago