4.0 - 8.0 years
10 - 20 Lacs
Noida
Remote
Experience: 4-8 Years
Job Location: Remote
No. of Positions: Multiple
Qualifications: B.Tech / M.Tech / MCA or higher
Work Timings: 1:30 PM IST to 10:30 PM IST
Functional Area: Data Engineering

Job Description: We are seeking a skilled Data Engineer with 4 to 8 years of experience to join our team. The ideal candidate will have a strong background in Python programming, expertise in AWS or Azure services, solid SQL skills, and proficiency in web scraping techniques.

Roles and responsibilities:
- Develop and maintain data pipelines using Python, PySpark, and SQL to extract, transform, and load data from various sources.
- Implement and optimize data processing workflows on AWS or Azure cloud platforms.
- Use Databricks or Azure Data Factory for efficient data storage and processing.
- Develop and maintain web scraping scripts to gather data from online sources.
- Collaborate with cross-functional teams to design and implement API endpoints for data access.
- Work on UiPath automation projects to streamline data extraction and processing tasks.
- Develop and maintain Django or Flask web applications for internal data management and visualization.
- Leverage Pandas and other data manipulation libraries for data analysis and preprocessing.
- Strengthen API development skills for integrating data services with external systems.
- Stay updated with the latest industry trends and technologies, such as Flask and PyTorch, to continuously improve data engineering processes.

Skills, Knowledge, Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4 to 8 years of experience in data engineering roles.
- Proficiency in the Python programming language.
- Strong understanding of AWS or Azure cloud services.
- Solid SQL skills for querying and manipulating data.
- Previous experience with web scraping techniques and tools.
- Hands-on experience with the Django web framework.
- Knowledge of API development and integration.
- Experience with PySpark for big data processing.
- Proficiency in Pandas for data manipulation and analysis.
- Familiarity with UiPath or Power Automate for automation is advantageous.
- Experience with Databricks.
- Familiarity with Flask and PyTorch is a plus.
- Experience working with USA or European clients is a plus.
- Experience working with multi-vendor, multi-culture, distributed offshore and onshore development teams in a dynamic and complex environment is helpful for day-to-day work.
- Excellent written and verbal communication skills; the candidate should be able to present their suggestions and explain the technical approach.
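As a rough, self-contained sketch of the extract-transform-load responsibility described above, the following uses SQLite from the standard library in place of a real warehouse; the `raw_sales`/`sales` tables and their columns are invented for illustration, and a production pipeline would use PySpark or cloud services as the listing states:

```python
import sqlite3

def run_pipeline(conn: sqlite3.Connection) -> int:
    """Extract raw sales rows, normalize them, and load a clean table."""
    cur = conn.cursor()
    # Extract: read raw rows (amounts stored as strings, mixed-case regions).
    rows = cur.execute("SELECT id, region, amount FROM raw_sales").fetchall()
    # Transform: normalize region names and cast amounts to float.
    clean = [(rid, region.strip().title(), float(amount)) for rid, region, amount in rows]
    # Load: write into the curated table inside one transaction.
    cur.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, region TEXT, amount REAL)")
    cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (id INTEGER, region TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?, ?)",
                 [(1, " noida ", "120.50"), (2, "DELHI", "99.99")])
loaded = run_pipeline(conn)
print(loaded)  # 2
```

The same extract/transform/load split carries over directly to a PySpark job, where each step becomes a DataFrame read, transformation, and write.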
Posted 1 month ago
6.0 - 9.0 years
0 - 2 Lacs
Noida, Gurugram, Greater Noida
Hybrid
Hello Everyone,

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Skills: Machine Learning, Deep Learning, TensorFlow, PyTorch, Python, Data Analysis, Algorithm Design, AWS/GCP, Model Deployment, Big Data Technologies, Version Control (Git)

Primary Responsibilities:
- Design and implement complex AI/ML models and algorithms
- Collaborate with data scientists to preprocess, analyze, and interpret large datasets
- Develop and maintain scalable and robust machine learning frameworks
- Integrate AI/ML solutions into products and develop innovative new features
- Perform model testing, validation, and optimization
- Deploy machine learning models into production environments
- Stay up to date with the latest advancements in AI/ML technologies and practices
- Mentor junior team members and provide technical guidance

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 6+ years of experience in software engineering with a focus on AI/ML
- Experience with machine learning frameworks such as TensorFlow, PyTorch, or similar
- Experience with data preprocessing and analysis
- Solid programming skills in Python, Java, or C++
- Familiarity with cloud platforms and services (e.g., AWS, Google Cloud, Azure)
- Hands-on technical skills
- Proven excellent problem-solving skills and the ability to think algorithmically
- Proven solid communication and teamwork abilities

Interested candidates can share their updated CV at arshad_mohammad@optum.com

Thanks and Regards,
Arshad Ayub - Talent Scout, Optum
Posted 1 month ago
3.0 - 8.0 years
2 Lacs
Noida
Work from Office
AbyM Technology is looking for a Python Developer to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Coordinate with development teams to determine application requirements.
- Write effective, scalable code in Python.
- Test and debug applications.
- Develop back-end components to improve responsiveness and overall performance.
- Integrate user-facing elements using server-side logic.
- Assess and prioritize client feature requests.
- Integrate data storage solutions.
- Reprogram existing databases to improve functionality.
- Develop digital tools to monitor online traffic.
- Implement security and data protection solutions.
- Coordinate with internal teams to understand user requirements and provide technical solutions.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Job Title: S&C Global Network - AI - Retail - Retail Specialized Data Scientist (Consultant, Level 9), SnC GN Data & AI
Management Level: 09 - Consultant
Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata

Must-have skills:
- A solid understanding of retail industry dynamics, including key performance indicators (KPIs) such as sales trends, customer segmentation, inventory turnover, and promotions.
- Strong ability to communicate complex data insights to non-technical stakeholders, including senior management, marketing, and operational teams.
- Meticulous in ensuring data quality, accuracy, and consistency when handling large, complex datasets.
- Ability to gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns.
- Strong proficiency in Python for data manipulation, statistical analysis, and machine learning (libraries like Pandas, NumPy, Scikit-learn).
- Expertise in supervised and unsupervised learning algorithms.
- Use of advanced analytics to optimize pricing strategies based on market demand, competitor pricing, and customer price sensitivity.

Good-to-have skills:
- Familiarity with big data processing platforms like Apache Spark, Hadoop, or cloud platforms such as AWS or Google Cloud for large-scale data processing.
- Experience with ETL (Extract, Transform, Load) processes and tools like Apache Airflow to automate data workflows.
- Familiarity with designing scalable and efficient data pipelines and architecture.
- Experience with tools like Tableau, Power BI, Matplotlib, and Seaborn to create meaningful visualizations that present data insights clearly.

Job Summary: The Retail Specialized Data Scientist will play a pivotal role in using advanced analytics, machine learning, and statistical modeling techniques to help our retail business make data-driven decisions. This individual will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights.

Roles & Responsibilities:
- Leverage retail knowledge: use your deep understanding of the retail industry (merchandising, customer behavior, product lifecycle) to design AI solutions that address critical retail business needs.
- Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns.
- Apply machine learning algorithms, such as classification, clustering, regression, and deep learning, to enhance predictive models.
- Use AI-driven techniques for personalization, demand forecasting, and fraud detection.
- Use advanced statistical methods to help optimize existing use cases and build new products that serve new challenges and use cases.
- Stay updated on the latest trends in data science and retail technology.
- Collaborate with executives, product managers, and marketing teams to translate insights into business actions.

Professional & Technical Skills:
- Strong analytical and statistical skills.
- Expertise in machine learning and AI.
- Experience with retail-specific datasets and KPIs.
- Proficiency in data visualization and reporting tools.
- Ability to work with large datasets and complex data structures.
- Strong communication skills to interact with both technical and non-technical stakeholders.
- A solid understanding of the retail business and consumer behavior.

Tools and Technologies:
- Programming Languages: Python, R, SQL, Scala
- Data Analysis Tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras
- Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn
- Big Data Technologies: Hadoop, Spark, AWS, Google Cloud
- Databases: SQL, NoSQL (MongoDB, Cassandra)

Additional Information:
- Experience: Minimum 3 years of experience is required.
- Educational Qualification: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
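The customer-segmentation work mentioned above can be illustrated with a deliberately simplified, pure-Python RFM-style sketch. The transactions, thresholds, and segment names here are all invented; a real project would use Pandas and Scikit-learn as listed in the tools section:

```python
from datetime import date

# Toy transaction log: (customer_id, order_date, order_value).
transactions = [
    ("C1", date(2024, 6, 1), 120.0),
    ("C1", date(2024, 6, 20), 80.0),
    ("C2", date(2024, 1, 5), 40.0),
    ("C3", date(2024, 6, 25), 300.0),
]

def rfm_segments(txns, today=date(2024, 7, 1), recent_days=30, high_value=150.0):
    """Assign each customer a coarse segment from recency and monetary value."""
    stats = {}
    for cust, when, value in txns:
        last, total = stats.get(cust, (date.min, 0.0))
        stats[cust] = (max(last, when), total + value)
    segments = {}
    for cust, (last, total) in stats.items():
        recent = (today - last).days <= recent_days
        if recent and total >= high_value:
            segments[cust] = "loyal high-value"
        elif recent:
            segments[cust] = "active"
        else:
            segments[cust] = "lapsed"
    return segments

print(rfm_segments(transactions))
```

In practice the same recency/frequency/monetary features would feed a clustering algorithm (e.g. k-means in Scikit-learn) rather than hand-set thresholds.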
Posted 1 month ago
6.0 - 10.0 years
5 - 9 Lacs
Noida
Work from Office
We are looking for a skilled MLOps (AI/ML Ops/Kubernetes) Engineer with 6 to 10 years of experience. The ideal candidate will have a strong background in machine learning and operations, with expertise in AI/ML ops and Kubernetes.

Roles and Responsibilities:
- Design and implement scalable data pipelines using AI/ML techniques.
- Collaborate with cross-functional teams to develop and deploy machine learning models.
- Develop and maintain large-scale data architectures using Kubernetes.
- Ensure seamless integration of machine learning models into existing systems.
- Troubleshoot and resolve complex technical issues related to AI/ML and Kubernetes.
- Implement automated testing and deployment scripts for efficient workflow management.

Requirements:
- Strong understanding of machine learning concepts and algorithms.
- Experience with AI/ML frameworks such as TensorFlow or PyTorch.
- Proficiency in programming languages such as Python or Java.
- Experience with cloud platforms such as AWS or Azure.
- Knowledge of containerization using Docker and orchestration using Kubernetes.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.

About the Company: Apptad Technologies Pvt Ltd. is an employment firm that provides recruitment services to various industries. We focus on matching top talent with the right opportunities, and we are committed to delivering exceptional results.
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Senior Azure Data Engineer with 5 to 10 years of experience to design and implement scalable data pipelines using Azure technologies, driving data transformation, analytics, and machine learning. The ideal candidate will have a strong background in data engineering and proficiency in Python, PySpark, and Spark Pools.

Roles and Responsibilities:
- Design and implement scalable Databricks data pipelines using PySpark.
- Transform raw data into actionable insights through data analysis and machine learning.
- Build, deploy, and maintain machine learning models using MLlib or TensorFlow.
- Optimize cloud data integration from Azure Blob Storage, Data Lake, and SQL/NoSQL sources.
- Execute large-scale data processing using Spark Pools and fine-tune configurations for efficiency.
- Collaborate with cross-functional teams to identify business requirements and develop solutions.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum 5 years of experience in data engineering, with at least 3 years specializing in Azure Databricks, PySpark, and Spark Pools.
- Proficiency in Python, PySpark, Pandas, NumPy, SciPy, Spark SQL, DataFrames, RDDs, Delta Lake, Databricks Notebooks, and MLflow.
- Hands-on experience with Azure Data Lake, Blob Storage, Synapse Analytics, and other relevant technologies.
- Strong understanding of data modeling, data warehousing, and ETL processes.
- Experience with agile development methodologies and version control systems.
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Database Engineer with 5 to 10 years of experience to design, develop, and maintain our database infrastructure. This position is based remotely.

Roles and Responsibilities:
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, including small-scale databases and big data processing.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to relational database systems or cloud-based solutions like Google BigQuery and AWS.
- Develop import workflows and scripts to automate data import processes.
- Ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and resolve issues, while collaborating with the full-stack web developer to implement efficient data access and retrieval mechanisms.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows, exploring third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices, and use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines, taking accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities, while collaborating with fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems, leveraging online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, considering their capabilities and limitations.

Requirements:
- Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles.
- Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous.
- Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes.
- Knowledge of cloud-based databases like AWS RDS and Google BigQuery.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting.
- Skills in working with APIs for data ingestion or connecting third-party systems, which can streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and the ability to work independently.

About the Company: Marketplace is an experienced team of industry experts dedicated to helping readers make informed decisions and choose the right products with ease. We arm people with trusted advice and guidance, so they can make confident decisions and get back to doing the things they care about most.
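The query-plan and indexing responsibility above can be demonstrated with SQLite from the standard library; the table and index names are hypothetical, and the same EXPLAIN-driven workflow applies to PostgreSQL or MySQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("cust%d" % (i % 100), float(i)) for i in range(1000)])

query = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'cust7'"

# Without an index, the filter on `customer` forces a full-table scan.
plan_before = conn.execute(query).fetchall()[0][-1]

# After adding an index, the planner switches to an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute(query).fetchall()[0][-1]

print(plan_before)  # typically a SCAN of orders (exact wording varies by SQLite version)
print(plan_after)   # typically a SEARCH using idx_orders_customer
```

Reading the plan before and after an index change is the core loop of the "analyzing query execution plans, implementing indexing strategies" work the listing describes.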
Posted 1 month ago
3.0 - 8.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled MLOps professional with 3 to 11 years of experience to join our team in Hyderabad. The ideal candidate will have a strong background in Machine Learning, Artificial Intelligence, and Computer Vision.

Roles and Responsibilities:
- Design, build, and maintain efficient, reusable, and tested code in Python and other applicable languages and library tools.
- Understand stakeholder needs and convey them to developers.
- Work on automating and improving development and release processes.
- Deploy Machine Learning (ML) solutions to large production environments.
- Drive continuous learning in AI and computer vision.
- Test and examine code written by others and analyze results.
- Identify technical problems and develop software updates and fixes.
- Collaborate with software developers and engineers to ensure development follows established processes and works as intended.
- Plan out projects and participate in project management decisions.

Requirements:
- Minimum 3 years of hands-on experience with AWS services and products (Batch, SageMaker, Step Functions, CloudFormation/CDK).
- Strong Python experience.
- Minimum 3 years of experience with Machine Learning/AI or Computer Vision development/engineering.
- Ability to provide technical leadership to developers for designing and securing solutions.
- Understanding of Linux utilities and Bash.
- Familiarity with containerization using Docker.
- Experience with data pipeline frameworks such as Metaflow is preferred.
- Experience with Lambda, SQS, ALB/NLBs, SNS, and S3 is preferred.
- Practical experience deploying Computer Vision/Machine Learning solutions at scale into production.
- Exposure to technologies/tools such as Keras, Pandas, TensorFlow, PyTorch, Caffe, NumPy, and DVC/CML.
Posted 1 month ago
6.0 - 11.0 years
25 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Job Title: Senior Python Developer (AI/ML)
Location: [On-site/Remote/Hybrid Location]
Experience Required: 6+ years
Employment Type: [Full-time]

Job Summary: We are seeking a highly skilled and experienced Senior Python Developer with strong AI/ML expertise to join our dynamic team. The ideal candidate will play a critical role in designing, developing, and deploying intelligent applications and services. You will collaborate closely with data scientists, ML engineers, and product teams to deliver scalable, high-performance AI solutions.

Key Responsibilities:
- Design, develop, and maintain robust and scalable Python applications with integrated AI/ML components.
- Build and optimize machine learning models for classification, regression, NLP, computer vision, or recommendation systems.
- Work with large datasets, perform data wrangling, and implement data pipelines using tools such as Pandas, PySpark, or Apache Airflow.
- Integrate ML models into production-grade applications and APIs (using Flask, FastAPI, etc.).
- Collaborate with cross-functional teams including Data Engineering, DevOps, and Product Management to define system architecture and implementation strategies.
- Participate in code reviews, mentor junior developers, and contribute to best coding practices and documentation.
- Monitor and improve the performance of deployed models and applications.

Required Skills and Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Science, Engineering, or a related field.
- 6+ years of professional experience in Python development.
- Strong knowledge of machine learning algorithms, statistical modeling, and data science techniques.
- Experience with ML frameworks and libraries such as scikit-learn, TensorFlow, Keras, PyTorch, and XGBoost.
- Proficiency in data processing and analysis tools like NumPy, Pandas, SQL, and Spark.
- Hands-on experience deploying ML models in production environments.
- Experience with REST APIs and microservice architecture.
- Familiarity with containerization tools like Docker and orchestration tools like Kubernetes is a plus.
- Experience with version control (Git), CI/CD pipelines, and cloud platforms (AWS, GCP, Azure) preferred.
- Excellent problem-solving skills, analytical thinking, and attention to detail.
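A minimal sketch of what "integrating an ML model into an API" can reduce to: a framework-agnostic handler that takes a JSON request body and returns a JSON response. The fixed weights below are a stand-in for a real trained model (a deployment would load, say, a pickled scikit-learn estimator), and in Flask or FastAPI this handler would back a POST /predict route:

```python
import json

# Stand-in for a trained model: a toy linear scorer with invented weights.
WEIGHTS = {"age": 0.03, "income": 0.00001}
BIAS = -1.2

def predict(features: dict) -> float:
    """Score one feature dict with the toy linear model."""
    return BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

def predict_handler(body: str) -> str:
    """Endpoint body: JSON request in, JSON response out.
    A web framework would only add routing and validation around this."""
    payload = json.loads(body)
    score = predict(payload["features"])
    return json.dumps({"score": round(score, 4), "label": int(score > 0)})

resp = json.loads(predict_handler('{"features": {"age": 40, "income": 100000}}'))
print(resp)
```

Keeping the scoring logic framework-agnostic like this makes the model easy to unit-test independently of the HTTP layer.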
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Power Programmer - Specialist Programmer - Machine Learning (EV-Q1-FY-26)

Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding.
- End-to-end contribution to technology-oriented development projects.
- Providing solutions with minimum system requirements, in Agile mode.
- Collaborate with Power Programmers, the open-source community, and tech user groups.
- Custom development of new platforms and solutions.
- Work on large-scale digital platforms and marketplaces.
- Work on complex engineering projects using cloud-native architecture.
- Work with innovative Fortune 500 companies on cutting-edge technologies.
- Co-create and develop new products and platforms for our clients.
- Contribute to open source and continuously upskill in the latest technology areas.
- Incubate tech user groups.

Minimum Requirements:
- Formal training in a quantitative discipline (e.g., statistics, operations research, economics, computer science, mathematics).
- 5 years of relevant work experience in data analysis, data science, or related fields, with deep expertise in statistical data analysis.
- Experience with Python and related ML libraries (e.g., Pandas, PyTorch).
- Experience with database languages (e.g., SQL).
- Experience working in a multi-disciplinary team of data scientists, software engineers, product managers, and subject domain experts.
- Experience in an Agile working environment.

Preferred Requirements:
- Experience with AWS cloud services such as AWS Textract and Comprehend.
- Experience with Dagster/Airflow.
- Experience in MLOps.
- Experience delivering Natural Language Processing (NLP) products.

Technical and Professional Requirements:
- Responsible for ensuring that ML models and pipelines are deployed successfully into production.
- Implementation of ML features for new and existing systems.
- Deploy applications to the AWS cloud, leveraging the full spectrum of AWS cloud services.
- Automate model training, testing, and deployment using CI/CD.
- Understand and implement metrics to verify the effectiveness and soundness of the ML model.

Preferred Skills: Technology->Machine Learning->Data Science; Technology->Cloud Platform->Azure AI Services->Azure Machine Learning; Technology->Machine Learning->MLOps

Educational Requirements: Bachelor of Engineering, Bachelor of Technology
Service Line: Strategic Technology Group
Posted 1 month ago
0.0 - 3.0 years
2 - 5 Lacs
Pune
Work from Office
Job Description: Python Training, Internship and Job Assistance
Position: Python Intern
Location: Pune, Maharashtra, India
Duration: 6 months
Mode: Offline
Stipend: Unpaid
Training Program: Free training with job assistance upon successful completion
Pre-Placement Offer (PPO) Opportunity: 2.5 LPA CTC (based on performance)
Note: Candidates should be ready to learn new technologies.

About the Role: We are seeking a motivated and detail-oriented Python Intern to join our team. The ideal candidate will have a foundational understanding of API development using frameworks like Flask and FastAPI, a strong grasp of Python libraries such as Pandas and NumPy, and a knack for problem-solving. Knowledge of data analytics tools and Large Language Models (LLMs) will be considered a plus.

Key Responsibilities:
- Develop and maintain RESTful APIs using Flask and FastAPI.
- Work with data manipulation and analysis libraries like Pandas and NumPy.
- Assist in building scalable and efficient back-end solutions.
- Solve real-world problems with logical and analytical thinking.
- Support data analytics efforts using tools like Power BI (preferred).
- Collaborate with cross-functional teams to understand requirements and deliver solutions.
- Explore and implement applications of LLMs and other advanced technologies (as needed).

Requirements:
- Proficiency in Python programming.
- Hands-on experience with or knowledge of the Flask and FastAPI frameworks.
- Familiarity with data analysis libraries such as Pandas and NumPy.
- Strong problem-solving and logical thinking skills.
- Good understanding of RESTful API design principles.
- Knowledge of data analytics tools like Power BI is a plus.
- Awareness of Large Language Models (LLMs) and their potential applications is a bonus.
- Eagerness to learn and work in a collaborative environment.

What We Offer:
- Hands-on experience in API development and data analytics projects.
- Mentorship from experienced professionals.
- Opportunity to explore cutting-edge technologies like LLMs.
- A learning-focused environment to develop both technical and analytical skills.
- Potential for full-time opportunities based on performance.
Posted 1 month ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Reference: 250008SX

Responsibilities - ML Ops Engineer
You will be responsible for the deployment and maintenance of the group data science platform infrastructure, on which data science pipelines are deployed and scaled. To achieve this, you will collaborate with Data Scientists and Data Engineers from various business lines and the Global Technology Services infrastructure team (GTS).

Roles:
- Implement techniques and processes for supporting the development and scaling of data science pipelines.
- Industrialize inference, retraining, and monitoring of data science pipelines, ensuring their maintainability and compliance.
- Provide platform support to end users.
- Be attentive to the needs and requirements expressed by end users.
- Anticipate needs and necessary developments for the platform.
- Work closely with Data Scientists, Data Engineers, and business stakeholders.
- Stay updated and demonstrate a keen interest in the MLOps domain.

Environment:
- Cloud / on-premise: Azure
- Python, Kubernetes
- Integrated vendor solutions: Dataiku, Snowflake
- DB: PostgreSQL
- Distributed computing: Spark
- Big Data: Hadoop, S3/Scality, MapR
- Data science: Scikit-learn, Transformers, MLflow, Kedro
- DevOps, CI/CD: JFrog, Harbor, GitHub Actions, Jenkins
- Monitoring: Elasticsearch/Kibana, Grafana, Zabbix
- Agile ceremonies: PI planning, Sprint, Sprint Review, Refinement, Retrospectives
- ITIL framework

Required Profile
Technical Skills:
- Python: FastAPI, SQLAlchemy, NumPy, Pandas, Scikit-learn, Transformers
- Kubernetes, Docker
- Pytest
- CI/CD: Jenkins, Ansible, GitHub Actions, Harbor, Docker

Soft Skills:
- Client focus: demonstrate strong listening skills, understanding, and anticipation of user needs.
- Team spirit: organize collaboration and workshops to find the best solutions; share expertise with colleagues to find the most suitable solutions.
- Innovation: propose innovative ideas, solutions, or strategies, and think outside the box; prefer simplicity over complexity.
- Responsibility: take ownership, keep commitments, and respect deadlines.

Why join us: We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Business insight: At Société Générale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years, or your entire career, together we can have a positive impact on the future. Creating, daring, innovating, and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis, and develop or strengthen your expertise, you will feel right at home with us!

Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved. We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. These are translated into our business activity (ESG assessment, reporting, project management, or IT activities), our work environment, and our responsible practices for environmental protection.

Diversity and Inclusion: We are an equal opportunities employer and we are proud to make diversity a strength for our company. Société Générale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.
Posted 1 month ago
1.0 - 2.0 years
3 - 4 Lacs
Chennai
Work from Office
Key Responsibilities:
- Develop and deploy AI/ML models and backend APIs using Python and FastAPI.
- Build scalable databases and pipelines using MySQL for AI-powered applications.
- Work on projects involving STT/TTS, LLM fine-tuning, and RAG-based applications.
- Design and implement multi-agent systems using frameworks like LangGraph and LangChain.
- Collaborate with cross-functional teams to build AI chatbots, medical scribes, and automated coding tools.
- Optimize prompt engineering for specific domain needs.

Required Skills:
- Programming: Python (advanced), FastAPI (hands-on), MySQL (schema design, queries)
- AI/ML Frameworks: Hugging Face, Scikit-learn, Keras, PyTorch
- AI Technologies: LangChain, LangGraph, OpenAI APIs, Vertex AI, RAG methods
- Data Tools: Pandas, NumPy, SQL, Matplotlib, Seaborn
- Soft Skills: Problem-solving, team collaboration, agile methodology, project management

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science
- Portfolio of end-to-end AI applications (STT/TTS, chatbots, RAG systems, etc.)

What We Offer:
- Opportunity to work on real-world AI challenges in healthcare and enterprise settings
- Collaborative, innovation-driven environment
- Competitive compensation and career growth opportunities
- Flexible work arrangements and supportive leadership

How to Apply: Please send your updated resume and project portfolio (GitHub link or similar) to careers@llmsoftware.com with the subject line "AI/ML Engineer (Python/FastAPI)". LLMSoftware.com is an equal opportunity employer. We value diversity and are committed to fostering an inclusive workplace for all team members.
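A toy illustration of the retrieval step in a RAG application, using bag-of-words cosine similarity in place of real embeddings. The documents and scoring are invented for illustration; a production system would use an embedding model and a vector store, with the top-ranked chunks pasted into the LLM prompt as grounding context:

```python
import math
from collections import Counter

docs = {
    "doc1": "patient presents with fever and cough",
    "doc2": "invoice total due at end of month",
    "doc3": "prescribe rest and fluids for fever",
}

def vectorize(text):
    """Bag-of-words term counts as a stand-in for an embedding."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    """Rank documents by similarity to the query and keep the top k."""
    qv = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, vectorize(docs[d])), reverse=True)
    return ranked[:k]

print(retrieve("treatment for fever"))
```

Swapping `vectorize` for an embedding call and `docs` for a vector-store query yields the usual RAG retrieval loop without changing the surrounding logic.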
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Gurugram
Work from Office
Required:
- Prior experience writing and debugging Python
- Prior experience building data pipelines
- Prior experience with data lakes in an AWS environment
- Prior experience with data warehouse technologies in an AWS environment
- Prior experience with AWS EMR
- Prior experience with PySpark
- Prior experience with AWS and Azure; experience with additional cloud-based tools is important (see skills section)
Desired:
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases
- Experience with Python and libraries such as pandas and NumPy
- Experience with PySpark
- Experience building and optimizing big data pipelines, architectures, and data sets
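The pipeline-building experience this posting asks for boils down to the extract-transform-load pattern. A minimal, dependency-free sketch using only the standard library (the CSV data, table name, and unit conversion are invented for illustration):

```python
import csv
import io
import sqlite3

# Toy source data standing in for a real extract from S3, an API, etc.
RAW_CSV = "city,temp_f\nNoida,98.6\nPune,86.0\n"

def extract(text):
    """Extract: parse the raw source into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalise Fahrenheit readings to Celsius."""
    return [(r["city"], round((float(r["temp_f"]) - 32) * 5 / 9, 1)) for r in rows]

def load(records):
    """Load: write the cleaned records into a warehouse table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (city TEXT, temp_c REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", records)
    return conn

conn = load(transform(extract(RAW_CSV)))
result = dict(conn.execute("SELECT city, temp_c FROM readings"))
```

In an AWS EMR/PySpark setting the same three stages appear, only distributed: `extract` becomes a read from S3, `transform` a DataFrame operation, and `load` a write to the warehouse.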
Posted 1 month ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Dear Candidate, greetings of the day! Please find the job description below.
Job Title: Python Developer
Experience: 5-12 years
Location: Bangalore/Chennai/Hyderabad
Detailed JD:
- Coding experience in core Python programming
- Ability to implement basic concepts such as map/reduce/filter and custom iterators
- Knowledge of decorators, multithreading/multiprocessing, usage of the super keyword, and context managers
- Strong OOPS concepts and implementation
- Experience using the Jupyter/Eclipse/Spyder IDEs
- Required: experience with message-oriented middleware (AMPS/RabbitMQ/ZMQ, etc.)
- Good database skills with SQL/NoSQL commands and queries; SQLAlchemy
- Experience in REST/WebSocket; hands-on experience with full-stack components is a plus
- Excellent grip on debugging and problem-solving techniques
- Flask, Gunicorn, installing PyPI packages, and maintaining Python environments
- Experience with data preparation and the basics of Pandas
If you are interested in learning and growing with us, kindly share your latest resume at ayasha.k@pisquaretech.com
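The core-Python topics this JD screens for (map/filter/reduce, decorators, custom iterators, context managers) can each be shown in a few lines. The names below are illustrative only:

```python
import functools
from contextlib import contextmanager

# map/filter/reduce in one expression: sum the even digits of "12345".
total = functools.reduce(lambda a, b: a + b,
                         filter(lambda x: x % 2 == 0, map(int, "12345")), 0)

def logged(fn):
    """Decorator that records every result the wrapped function returns."""
    results = []
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        out = fn(*args, **kwargs)
        results.append(out)
        return out
    wrapper.results = results
    return wrapper

@logged
def square(x):
    return x * x

class Countdown:
    """Custom iterator protocol: yields n, n-1, ..., 1, then stops."""
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return self
    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        self.n -= 1
        return self.n + 1

@contextmanager
def tag(name, sink):
    """Context manager that brackets whatever runs inside the with-block."""
    sink.append(f"<{name}>")
    yield
    sink.append(f"</{name}>")

rendered = []
with tag("b", rendered):
    rendered.append("hello")
```

Interview questions on these topics usually probe exactly the mechanics above: when the decorator closure runs, when `__next__` raises `StopIteration`, and what happens on either side of the `yield`.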
Posted 1 month ago
9.0 - 14.0 years
30 - 35 Lacs
Pune
Work from Office
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Chennai
Work from Office
Diverse Lynx is looking for a Python Developer to join our dynamic team and embark on a rewarding career journey.
- Coordinate with development teams to determine application requirements
- Write effective, scalable code using the Python programming language
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications using server-side logic
- Test and debug programs
- Assess and prioritize client feature requests
- Integrate data storage solutions
- Reprogram existing databases to improve functionality
- Develop digital tools to monitor online traffic
- Implement security and data protection solutions
- Coordinate with internal teams to understand user requirements and provide technical solutions
Posted 1 month ago
1.0 - 3.0 years
3 - 6 Lacs
Surat
Work from Office
iGeek is looking for a Python Developer to join our dynamic team and embark on a rewarding career journey.
- Coordinate with development teams to determine application requirements
- Write effective, scalable code using the Python programming language
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications using server-side logic
- Test and debug programs
- Assess and prioritize client feature requests
- Integrate data storage solutions
- Reprogram existing databases to improve functionality
- Develop digital tools to monitor online traffic
- Implement security and data protection solutions
- Coordinate with internal teams to understand user requirements and provide technical solutions
Posted 1 month ago
3.0 - 8.0 years
6 - 16 Lacs
Hyderabad
Work from Office
Role & responsibilities:
- Develop and maintain high-quality, reusable, and efficient Python code.
- Build robust and scalable web applications and APIs using frameworks such as Django, Flask, or FastAPI.
- Design and implement database solutions, ensuring data integrity and performance.
- Integrate with third-party APIs, services, and tools.
- Participate in code reviews, unit testing, and performance tuning.
- Work collaboratively with DevOps, front-end developers, QA engineers, and product managers.
- Troubleshoot and debug applications and identify performance bottlenecks.
- Ensure adherence to coding standards, security best practices, and documentation.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum 3 years of professional experience in Python programming.
- Strong understanding of Object-Oriented Programming (OOP), design patterns, and software architecture.
- Hands-on experience with one or more Python web frameworks (Django, Flask, FastAPI).
- Solid knowledge of RESTful API design and implementation.
- Experience working with relational databases (e.g., PostgreSQL, MySQL) and ORM tools.
- Familiarity with Git and version control workflows.
- Experience working in Agile/Scrum development environments.
- Strong problem-solving skills and attention to detail.
Preferred candidate profile:
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Knowledge of containerization technologies (Docker, Kubernetes).
- Exposure to CI/CD tools and DevOps practices.
- Familiarity with front-end technologies (JavaScript, React, HTML/CSS) is a plus.
- Understanding of microservices architecture and API gateways.
- Experience with automated testing frameworks (PyTest, unittest).
Posted 1 month ago
4.0 - 9.0 years
14 - 22 Lacs
Pune
Work from Office
Responsibilities:
* Design, develop, test, and maintain scalable Python applications using Scrapy, Selenium, and Requests.
* Implement anti-bot systems and data pipeline solutions with Airflow and Kafka.
Share your CV at recruitment@fortitudecareer.com. Flexible working; work from home.
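The posting names Scrapy and Selenium; the parsing step at the heart of any such scraper can be sketched with the standard library's `html.parser` alone. The HTML snippet and class name below are made up for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags: the core of a crawl step.
    Scrapy's selectors do the same job with far less ceremony."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

# Stand-in for a fetched page body (a real scraper would download this).
HTML = ('<ul><li><a href="/jobs/1">Python Dev</a></li>'
        '<li><a href="/jobs/2">Data Eng</a></li></ul>')

parser = LinkExtractor()
parser.feed(HTML)
```

Selenium enters the picture only when the target page renders its content with JavaScript; for static HTML, a fetch plus a parser like this is usually enough.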
Posted 1 month ago
7.0 - 10.0 years
20 - 30 Lacs
Pune, Chennai
Hybrid
YOU'LL BUILD TECH THAT EMPOWERS GLOBAL BUSINESSES
Our Connect Technology teams are working on our new Connect platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on Connect data and insights to innovate and grow. As a Senior Data Engineer, you'll be part of a team of smart, highly skilled technologists who are passionate about learning and supporting cutting-edge technologies such as Spark, Scala, PySpark, Databricks, Airflow, SQL, Docker, Kubernetes, and other data engineering tools. These technologies are deployed using DevOps pipelines leveraging Azure, Kubernetes, Jenkins, and Bitbucket/GitHub.
WHAT YOU'LL DO:
- Develop, test, troubleshoot, debug, and enhance applications using Spark, PySpark, Scala, Pandas, Databricks, Airflow, and SQL as the core development technologies.
- Deploy application components using CI/CD pipelines.
- Build utilities for monitoring and automating repetitive functions.
- Collaborate with Agile cross-functional teams, internal and external clients, including Operations, Infrastructure, and Tech Ops.
- Collaborate with the Data Science team to productionize ML models.
- Participate in a rotational support schedule to respond to customer queries and deploy bug fixes in a timely and accurate manner.
Qualifications
WE'RE LOOKING FOR PEOPLE WHO HAVE:
- 8-10 years of applicable software engineering experience.
- Strong fundamentals with experience in big data technologies: Spark, PySpark, Scala, Pandas, Databricks, Airflow, SQL.
- Must have experience in cloud technologies, preferably Microsoft Azure.
- Must have experience in performance optimization of Spark workloads.
- Good to have experience with DevOps technologies such as GitHub, Kubernetes, Jenkins, and Docker.
- Good to have knowledge of relational databases, preferably PostgreSQL.
- Excellent English communication skills, with the ability to interface effectively across cross-functional technology teams and the business.
- Minimum B.S. degree in Computer Science, Computer Engineering, or a related field.
Our Benefits: Flexible working environment; volunteer time off; LinkedIn Learning; Employee Assistance Program (EAP).
Posted 1 month ago
2.0 - 4.0 years
20 - 30 Lacs
Gurugram
Work from Office
Preferred Education: Graduates from IIT or equivalent premier institutions
About the Role: We are in search of a couple of top-notch AI developers to join our team remotely. The role is ideal for someone passionate about building innovative AI-based applications and contributing to cutting-edge technology solutions. The candidate should have a strong background in AI, machine learning, and deep learning, with hands-on experience in building and deploying AI models.
Key Responsibilities:
1. Develop, implement, and maintain advanced AI algorithms and models.
2. Collaborate with cross-functional teams to integrate AI solutions into our application.
3. Optimize and fine-tune models for performance, scalability, and efficiency.
4. Continuously research and stay updated with the latest advancements in AI/ML technologies.
5. Write clean, maintainable, and efficient code.
6. Troubleshoot and debug AI models and systems.
7. Participate in peer reviews and contribute to team knowledge sharing.
Required Skills:
- Strong proficiency in programming languages such as Python, R, or Java.
- Hands-on experience with AI/ML libraries and frameworks like TensorFlow, PyTorch, Keras, or Scikit-learn.
- Solid understanding of machine learning algorithms, deep learning techniques, and natural language processing.
- Knowledge of cloud platforms such as AWS, GCP, or Azure for AI model deployment.
- Experience with data preprocessing, feature engineering, and model evaluation metrics.
- Strong problem-solving skills and the ability to work independently in a remote setup.
- Excellent communication and teamwork skills.
Preferred Qualifications:
- A degree from IIT or equivalent premier institutions.
- Experience in developing AI-driven applications, with at least one year of hands-on experience.
- Previous exposure to projects related to AI-based apps, computer vision, NLP, or similar fields.
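The "model evaluation metrics" this posting asks about can be computed by hand for a binary classifier, which is also a common interview exercise. The label arrays below are invented for illustration:

```python
# Ground-truth labels and a classifier's predictions (made-up data).
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]

# Confusion-matrix counts.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

# The three metrics interviewers ask for most often.
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)   # of the positives we predicted, how many were right
recall = tp / (tp + fn)      # of the true positives, how many we caught
```

Libraries like Scikit-learn expose these as `accuracy_score`, `precision_score`, and `recall_score`; knowing the hand-rolled versions shows you understand what the library is computing.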
Posted 1 month ago
1.0 - 4.0 years
7 - 11 Lacs
Bengaluru
Work from Office
- Good knowledge of Python 3.x and the Django framework
- Good knowledge of databases (PostgreSQL 12.x) and backend services (RESTful)
- Good communication skills
- Good debugging skills
- Exposure to AWS technologies is a plus
- Exposure to JavaScript frameworks like React and Vue.js is a plus
- Good to have: knowledge of Git
- Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, and other web services
Posted 1 month ago
7.0 - 11.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We're Hiring: Python + GCP Engineer (Dataiku)
Experience: 6-8 Years
Location: Bangalore / Chennai / Gurugram
Company: Derisk360
Are you passionate about building data solutions on the cloud and experienced in Python and GCP BigQuery? Join our team to build scalable pipelines, automate processes, and enable smarter decision-making with Dataiku and modern cloud tooling.
What You'll Do:
- Build and manage data workflows in Dataiku, including partitioned datasets
- Develop and optimize Python scripts for complex data transformations using pandas, NumPy, and regex
- Integrate data sources and orchestrate transformations using Google Cloud Platform (BigQuery)
- Use Terraform to manage infrastructure and deployment of code changes in a scalable, automated fashion
- Collaborate with data scientists, analysts, and DevOps to deliver end-to-end data solutions
What You Bring:
- 5+ years of hands-on experience with Dataiku, particularly working with partitioned datasets
- Strong command of Python, especially for data handling using pandas and NumPy
- Practical knowledge of regular expressions (regex) for data cleaning and transformation
- Proficiency with GCP BigQuery for querying and managing large datasets
- Experience using Terraform to manage cloud infrastructure and code deployments
Nice to Have:
- Exposure to CI/CD practices in data workflows
- Prior experience in data governance, metadata management, or MLOps environments
What You'll Get:
- Competitive compensation and hybrid flexibility
- Opportunity to work on large-scale, impactful cloud data projects
- Collaborative culture with a strong focus on technical excellence and learning
- Access to cutting-edge tools in the data engineering ecosystem
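The "regex for data cleaning and transformation" requirement typically looks like this in practice: normalising messy free-text values into a uniform type before loading. The salary strings and helper name below are made up for illustration:

```python
import re

# Messy source values of the kind a cleaning pass has to normalise.
RAW = ["Rs. 12,00,000 p.a.", "INR 8,50,000", "950000"]

def parse_amount(text):
    """Strip currency markers, separators, and suffixes; keep the digits."""
    digits = re.sub(r"[^\d]", "", text)
    return int(digits) if digits else None

amounts = [parse_amount(t) for t in RAW]
```

In a pandas workflow the same expression would typically run vectorised via `Series.str.replace(r"[^\d]", "", regex=True)` before the cast to a numeric dtype.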
Posted 1 month ago
2.0 - 5.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Contractor AI/ML
Job Date: Jun 11, 2025
Job Requisition Id: 61561
Location: Hyderabad, TG, IN; Pune, MH, IN; Indore, MP, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.
We are looking forward to hiring AI/ML professionals in the following areas:
Designation: AI Engineer
Experience: 2-4 Years
Job Type: Full-time
We are seeking a highly skilled and motivated Data Scientist to join our dynamic team. In this role, you will leverage your advanced analytical and technical expertise to solve complex business problems and drive impactful data-driven decisions. You will design, develop, and deploy sophisticated machine learning models, conduct in-depth data analyses, and collaborate with cross-functional teams to deliver actionable insights.
Responsibilities:
- Build and deploy ML models for classification, regression, and clustering tasks
- Apply foundational GenAI concepts such as embeddings, summarization, and RAG
- Use APIs and tools like LangChain and vector databases (e.g., Pinecone, FAISS)
- Prepare documentation and interpret results
Required Skills:
- Strong hands-on experience in Python, Scikit-learn, and Pandas
- Knowledge of model evaluation, feature engineering, and model tuning
- Exposure to LangChain and vector DBs
- Basic exposure to FastAPI or Flask
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
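The embeddings-plus-RAG skills listed above reduce, at their core, to ranking stored documents by vector similarity to a query. A toy sketch of that retrieval step, using word counts as a stand-in for learned embeddings (real systems use embedding models and a vector database such as Pinecone or FAISS; the documents below are invented):

```python
import math

# Tiny "knowledge base" standing in for a real document store.
DOCS = {
    "ml": "train evaluate machine learning model",
    "cooking": "boil simmer roast the vegetables",
}

def embed(text):
    """Bag-of-words vector: a crude stand-in for a learned embedding."""
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query):
    """Return the key of the document most similar to the query."""
    q = embed(query)
    return max(DOCS, key=lambda d: cosine(q, embed(DOCS[d])))

best = retrieve("evaluate the learning model")
```

In a full RAG pipeline, the retrieved document's text is then stuffed into the LLM prompt as context; the ranking step shown here is the part a vector DB accelerates at scale.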
Posted 1 month ago