3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Full-Time employee working on-site at Office 406, 4th floor, Treza Business Hub HQ47+4RW Mohan Nagar, near Bitwise, Mohan Nagar Co-Op Society, Baner, Pune, Maharashtra 411045, you will be responsible for the following key responsibilities:

MEAN Stack Development: Design and develop web applications utilizing MongoDB, Express.js, Angular, and Node.js. Your role will involve creating and optimizing reusable components to ensure a responsive user experience. Additionally, you will integrate and manage RESTful APIs to guarantee application security and optimal performance. You will be expected to handle application state management and implement efficient solutions for front-end tasks.

Python Development: Your tasks will include building and maintaining robust backend systems using frameworks such as Django, Flask, or FastAPI. You will be responsible for developing APIs and automating workflows to support application features. Furthermore, you will perform data processing and transform information to align with business logic.

Server Deployment and Maintenance: Your role will involve deploying and managing applications on cloud platforms like AWS, Azure, or Google Cloud. You will ensure server security, scalability, and performance through proper configuration. Additionally, you will be responsible for debugging, troubleshooting, and optimizing applications to maintain speed and reliability.

Team Collaboration and Continuous Improvement: You will collaborate with cross-functional teams to design and implement new features. It is essential to stay updated with the latest technologies, trends, and best practices to improve development workflows.

Required Skills and Experience: To excel in this role, you should have proven experience in MEAN stack development and Python programming. Strong proficiency in JavaScript (ES6+) and Python is essential. Hands-on experience with MongoDB, Express.js, Angular, Node.js, and Python frameworks (Django, Flask, or FastAPI) is required. Knowledge of RESTful API design and implementation, along with familiarity with front-end frameworks like Bootstrap or Angular Material, is crucial. A solid understanding of asynchronous programming and event-driven architecture, proficiency with version control systems (especially Git), and experience in server setup, cloud deployment, and maintenance are necessary.

Preferred Skills: Experience with mobile development frameworks like Flutter or React Native is advantageous. Knowledge of DevOps practices, including CI/CD pipelines, Docker, and Kubernetes, and familiarity with data analytics libraries (Pandas, NumPy) and basic machine learning concepts are desirable.

Education and Experience: A Bachelor's degree in Computer Science, Engineering, or a related field is required. You should have 3 to 5 years of experience in full-stack development using the MEAN stack and Python.
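By way of a hedged illustration of the Python API work this posting describes (the service name, routes, and model below are hypothetical and not part of the employer's codebase), a minimal FastAPI sketch might look like:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-backend")  # hypothetical service name

class Item(BaseModel):
    # request/response schema validated by Pydantic
    name: str
    price: float

# in-memory store standing in for MongoDB/SQL in this sketch
ITEMS: dict[int, Item] = {}

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    ITEMS[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    return ITEMS[item_id]

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)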
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
nagercoil, tamil nadu
On-site
You should have at least 2 years of experience in Machine Learning and Natural Language Processing. Your programming skills should be strong in Python. It is essential to have knowledge of SQL and NoSQL databases. Your expertise should include writing robust code in Python and working with both structured and unstructured data, involving data extraction, integration, and normalization. You should be proficient in ML frameworks/libraries like TensorFlow, PyTorch, ONNX, scikit-learn, Keras, and spaCy. Additionally, you should possess expert skills in manipulating data frames using Pandas and arrays using NumPy. Experience with big data frameworks such as Spark and Hadoop would be considered a plus. Moreover, familiarity with containerizing applications using Docker is also desirable.
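As an illustrative sketch only (synthetic records, not the employer's data), the Pandas/NumPy data-frame and array manipulation this posting refers to might look like:

import numpy as np
import pandas as pd

# Synthetic structured data standing in for extracted/normalized records
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "user_id": np.arange(1000),
    "age": rng.integers(18, 70, size=1000),
    "spend": rng.exponential(scale=50.0, size=1000),
    "segment": rng.choice(["a", "b", "c"], size=1000),
})

# Normalization: z-score a numeric column
df["spend_z"] = (df["spend"] - df["spend"].mean()) / df["spend"].std()

# Integration-style aggregation: per-segment summary statistics
summary = df.groupby("segment").agg(
    avg_age=("age", "mean"),
    total_spend=("spend", "sum"),
)
print(summary)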
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Data Scientist specializing in Machine Learning and Artificial Intelligence (ML & AI), your primary responsibility will be to tackle business challenges by leveraging data-driven solutions. Your role will involve collaborating closely with product teams to extract insights from complex data sources and steer the direction of our products based on data-driven recommendations. Being a Subject Matter Expert in data science, you will have a crucial role in shaping our analytics strategy and ensuring that our decisions are well-supported by rigorous data analysis.

Your key responsibilities will include:
- Problem Solving: Translating intricate business issues into machine learning solutions driven by data.
- Model Development: Creating, managing, and designing machine learning and advanced analytics models to generate actionable insights.
- Data Analysis: Analyzing data from various sources to maintain integrity, accuracy, and relevance.
- Insight Sharing: Communicating findings and insights to stakeholders to facilitate data-driven decision-making.
- Subject Matter Expertise: Serving as the go-to expert in the field of data science, offering guidance and knowledge to other teams and stakeholders.
- Product Leadership: Influencing product teams with data-based recommendations and ensuring alignment with business objectives by conveying business status, experiment outcomes, and other pertinent insights.

Desired Technical Skills:
- Proficiency in programming languages such as Python, R, or other relevant data science languages.
- Experience working with machine learning libraries like TensorFlow, Keras, scikit-learn, or similar.
- Competence in utilizing data manipulation tools like Pandas, NumPy, or equivalent.

In addition to a challenging and rewarding work environment, some of the perks and benefits of this role include:
- 5 Days Working
- Flexible Timings

Please note that we are currently not hiring for the Data Scientist (ML & AI) position. Kindly check back later for any updates on the hiring status.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
NTT DATA is seeking a Python Engineer to join the team in Bangalore, Karnataka (IN-KA), India. As a Senior Python Engineer, you will be part of the C3 Data Warehouse team, focusing on building the next-gen data platform that sources and stores data from various technology systems. Your primary responsibility will be contributing to the development of a unified data pipeline framework using Python, Airflow, DBT, Spark, and Snowflake. Additionally, you will integrate this framework with internal platforms for data quality, cataloging, discovery, incident logging, and metric generation. Collaboration with data warehousing leads, data analysts, ETL developers, and infrastructure engineers is crucial for successful implementation.

Your key responsibilities will include:
- Developing various components in Python for the data pipeline framework.
- Establishing best practices for efficient Snowflake usage.
- Assisting with testing and deployment utilizing standard frameworks and CI/CD tools.
- Monitoring query performance, data loads, and tuning as needed.
- Providing guidance during QA & UAT phases to identify and resolve issues effectively.

Minimum Skills Required:
- 5+ years of data development experience in complex environments with large data volumes.
- 5+ years of experience in developing data pipelines and warehousing solutions using Python, Pandas, NumPy, PySpark, etc.
- 3+ years of experience in hybrid data environments (On-Prem and Cloud).
- Exposure to Power BI / Snowflake.

NTT DATA is a global innovator of business and technology services, serving Fortune Global 100 companies. With a commitment to innovation and transformation, we offer consulting, data, AI, industry solutions, and application development services. Join our diverse team across 50+ countries and contribute to building digital and AI infrastructure for a sustainable digital future. Learn more about us at us.nttdata.com.
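A hedged sketch of the kind of Airflow pipeline step described above (the DAG name, task, and staging path are hypothetical, assuming Airflow 2.4 or later; this is not NTT DATA's actual framework):

from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_stage(**context):
    # Placeholder extract step: a real pipeline would pull from a source
    # system and stage the result for loading into Snowflake.
    df = pd.DataFrame({"id": [1, 2, 3], "value": [10, 20, 30]})
    df.to_csv("/tmp/staged_data.csv", index=False)


with DAG(
    dag_id="example_unified_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    stage = PythonOperator(
        task_id="extract_and_stage",
        python_callable=extract_and_stage,
    )
    # Downstream tasks (Snowflake load, dbt run, data-quality checks)
    # would be chained here, e.g. stage >> load >> dbt_run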
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a Senior Python Developer with 3-4 years of experience and a solid understanding of stock market dynamics, technical indicators, and trading systems. Your main responsibilities will include taking charge of backtesting frameworks, optimizing strategies, and creating high-performance trading modules that are ready for production. The ideal candidate is expected to possess critical thinking skills when it comes to trading logic, demonstrate precision in handling edge cases, and have the ability to write clean, scalable, and testable code. Working in a fast-paced, data-intensive environment where accuracy and speed are crucial should be something you are comfortable with.

Your responsibilities will involve designing and maintaining robust backtesting and live trading frameworks, developing modules for strategy building, simulation, and optimization, as well as integrating with real-time and historical market data sources such as APIs and databases. Utilizing libraries like Pandas, NumPy, TA-Lib, Matplotlib, and SciPy for data processing and signal generation will be part of your daily tasks. Additionally, you will be applying statistical methods to validate strategies, optimizing code for low-latency execution and memory efficiency, and collaborating with traders and quants to implement and iterate on ideas. Managing codebases with best practices using Git, including unit testing and modular design, is also expected from you.

The qualifications we are looking for include 3-4 years of Python development experience, particularly in data-intensive settings, a strong grasp of algorithms, data structures, and performance optimization, and hands-on experience with technical indicators, trading strategy design, and data visualization. Proficiency in Pandas, NumPy, Matplotlib, SciPy, and TA-Lib, as well as strong SQL skills and experience with structured and time-series data, are essential. Exposure to REST APIs, data ingestion pipelines, and message queues like Kafka and RabbitMQ would be advantageous. Experience in using version control systems like Git and collaborative development workflows is also required.

Preferred experience for this role includes hands-on experience with trading platforms or algorithmic trading systems, familiarity with order management systems (OMS), execution logic, or market microstructure, and prior work with cloud infrastructure (AWS, GCP) or Docker/Kubernetes; knowledge of machine learning or reinforcement learning in financial contexts is considered a bonus.

This job opportunity is presented by the Human Resources team at Codeline Tech.
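Purely as an illustrative sketch (synthetic prices and a toy moving-average crossover rule, not the firm's strategies), a vectorised Pandas/NumPy backtest could look like:

import numpy as np
import pandas as pd

# Synthetic daily close prices standing in for real market data
rng = np.random.default_rng(7)
dates = pd.date_range("2023-01-01", periods=500, freq="B")
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, len(dates)))), index=dates)

fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()

# Long when the fast MA is above the slow MA; shift by one day to avoid look-ahead bias
signal = (fast > slow).astype(int).shift(1).fillna(0)

daily_returns = prices.pct_change().fillna(0)
strategy_returns = signal * daily_returns
equity_curve = (1 + strategy_returns).cumprod()

print("Total return:", round(equity_curve.iloc[-1] - 1, 4))
print("Naive annualised Sharpe:",
      round(strategy_returns.mean() / strategy_returns.std() * np.sqrt(252), 2))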
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a strategic thinker passionate about driving solutions in Automation. You have found the right team. As an Automation Associate in our Finance team, you will spend each day defining, refining, and delivering set goals for our firm. You will be a key driver for a critical team that conducts process deep dives, reviews ideas, and designs, develops, and deploys scalable automation solutions by leveraging intelligent solutions. Your key focus will be to customize firm-wide intelligent automation capabilities to deploy cost-effective modules that impact execution velocity, enhance controls, and improve ROI.

Job responsibilities
- Partner with relevant stakeholders to understand process-related manual touchpoints, design the future state, and develop, test, and deploy solutions
- Manage and deliver E2E projects in adherence to the Hub's governance and execution model
- Ensure automation implementation is compliant with company policy
- Collaborate with business, technology teams, and controls partners to ensure calibrated delivery

Required qualifications, capabilities, and skills
- Expert, hands-on experience in developing intelligent automation solutions (must have): Python (Selenium, Django, Pandas, NumPy, win32com, tkinter, PDF/OCR libraries, exchange connections, API connectivity), UiPath (attended & unattended), Alteryx (advanced), and Pega (advanced)
- Advanced hands-on experience with Tableau, QlikView, Qlik Sense & SharePoint
- 5+ years of experience in technology development, strong problem-solving abilities, project management, roadblock management & solutioning
- Degree in Computer Science, Engineering, or any related field
- Advanced knowledge of Microsoft Office with proficiency in MS Excel, MS Access & MS PowerPoint

Preferred qualifications, capabilities, and skills
- Project Management Certification
- Demonstrated innovation with the ability to translate concepts into visuals
- Technical Designer / Solution Architect
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Finance Technology Python Developer at GenAI Solutions, you will be a key member of the team responsible for designing, developing, and deploying cutting-edge Generative AI solutions using Python. Your role will involve hands-on development, optimization of existing solutions, and driving the adoption of modern engineering practices. You will collaborate with various teams to ensure the successful integration of AI solutions in the banking sector.

In this role, you will have the opportunity to work on innovative projects that adhere to Citi's strategic architecture principles. You will be responsible for optimizing AI models, leveraging LLMs, and enhancing solution performance through efficient coding practices. Additionally, you will play a crucial role in implementing best practices in readability, performance, and security through thorough code reviews.

The ideal candidate for this position should have at least 8 years of experience as a Python or Backend Developer, with strong proficiency in Python and frameworks such as Flask, Django, and Pandas. You should also possess hands-on experience with LLM APIs and integration frameworks like LangChain, as well as expertise in prompting techniques for AI models. Knowledge of RAG methodologies, DB-driven applications, ORMs such as SQLAlchemy, Kubernetes, and modern engineering practices like Agile, DevOps, and CI/CD is essential for this role.

To be successful in this role, you should have a Bachelor's or Master's degree in Computer Science or Engineering. Additionally, you should be able to think strategically, engage with cross-functional teams, and demonstrate creativity in identifying new opportunities. Your ability to work collaboratively in a team environment and your experience in product lifecycle management will be valuable assets in this position.

If you are excited about leveraging your Python development skills to drive innovation in Finance, Markets, and Credit Risk Technology at Citi, we encourage you to apply for this position and be a part of shaping the future of banking.
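As a hedged illustration of the retrieval step behind RAG (TF-IDF stands in for LLM embeddings and an in-memory list for a vector database; the documents are invented and this is not Citi's implementation):

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny in-memory "knowledge base" standing in for a vector store
documents = [
    "Credit risk limits are reviewed quarterly by the risk committee.",
    "Settlement for equity trades follows a T+1 cycle in this market.",
    "Liquidity stress tests are run daily against the trading book.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

# The retrieved passages would then be placed into the LLM prompt
# (for example via a framework such as LangChain) to ground its answer.
print(retrieve("How often are credit risk limits reviewed?"))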
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
Job Description: As the Digital Transformation Lead at Godrej Agrovet Limited (GAVL) in Mumbai, you will play a crucial role in driving innovation and productivity in the agri-business sector. GAVL is dedicated to enhancing the livelihood of Indian farmers by developing sustainable solutions that enhance crop and livestock yields. With leading market positions in Animal Feed, Crop Protection, Oil Palm, Dairy, Poultry, and Processed Foods, GAVL is committed to making a positive impact on the agricultural industry.

With an impressive annual sales figure of 6000 Crore INR in FY 18-19, GAVL has a widespread presence across India, offering high-quality feed and nutrition products for cattle, poultry, aqua feed, and specialty feed. The company operates 50 manufacturing facilities, has a network of 10,000 rural distributors/dealers, and employs over 2500 individuals.

At GAVL, our people philosophy revolves around the concept of "tough love". We set high expectations for our team members, recognizing and rewarding performance and potential through career growth opportunities. We prioritize the development, mentoring, and training of our employees, understanding that diverse interests and passions contribute to a strong team dynamic. We encourage individuals to explore their full potential and provide a supportive environment for personal and professional growth.

In this role, you will utilize your expertise as a Data Scientist to extract insights from complex datasets, develop predictive models, and drive data-driven decisions across the organization. You will collaborate with various teams, including business, engineering, and product, to apply advanced statistical methods, machine learning techniques, and domain knowledge to address real-world challenges.

Key Responsibilities:
- Data Cleaning, Preprocessing & Exploration: Prepare and analyze data, ensuring quality and completeness by addressing missing values, outliers, and data transformations to identify patterns and anomalies.
- Machine Learning Model Development: Build, train, and deploy machine learning models using tools like MLflow on the Databricks platform, exploring regression, classification, clustering, and time series analysis techniques.
- Model Evaluation & Deployment: Enhance model performance through feature selection, leveraging distributed computing capabilities for efficient processing, and utilizing CI/CD tools for deployment automation.
- Collaboration: Work closely with data engineers, analysts, and stakeholders to understand business requirements and translate them into data-driven solutions.
- Data Visualization and Reporting: Create visualizations and dashboards to communicate insights to technical and non-technical audiences using tools like Databricks and Power BI.
- Continuous Learning: Stay updated on the latest advancements in data science, machine learning, and industry best practices to enhance skills and processes.

Required Technical Skills:
- Proficiency in statistical analysis, hypothesis testing, and machine learning techniques.
- Familiarity with NLP, time series analysis, computer vision, and A/B testing.
- Strong knowledge of Databricks, Spark DataFrames, MLlib, and relevant languages and libraries (Python, TensorFlow, Pandas, scikit-learn, PySpark, NumPy).
- Proficiency in SQL for data extraction, manipulation, and analysis, along with experience in MLflow and cloud data storage tools.

Qualifications:
- Education: Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field.
- Experience: Minimum of 3 years in a data science or analytical role.

Join us at Vikhroli, Mumbai, and be a part of our mission to drive digital transformation and innovation in the agricultural sector at Godrej Agrovet Limited.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
maharashtra
On-site
As a Data Scientist in our team, you will be responsible for manipulating and preprocessing structured and unstructured data to prepare datasets for analysis and model training. You will utilize Python libraries such as PyTorch, Pandas, and NumPy for data analysis, model development, and implementation. Additionally, you will fine-tune large language models (LLMs) to meet specific use cases and enterprise requirements. Collaboration with cross-functional teams to experiment with AI/ML models and iterate quickly on prototypes is also a key aspect of your role.

Your responsibilities will include optimizing workflows to ensure fast experimentation and deployment of models to production environments. You will be expected to implement containerization and basic Docker workflows to streamline deployment processes. Writing clean, efficient, and production-ready Python code for scalable AI solutions is crucial for this role.

It would be beneficial if you have exposure to cloud platforms like AWS, Azure, or GCP, knowledge of MLOps principles and tools, and a basic understanding of enterprise Knowledge Management Systems. The ability to work against tight deadlines, independently handle unstructured projects, and showcase strong initiative and self-motivation are desirable traits. Strong communication, collaboration acumen, and a problem-solving mindset with attention to detail are also highly valued.

Required skills for this role include proficiency in Python with strong skills in libraries like PyTorch, Pandas, and NumPy, experience in handling both structured and unstructured datasets, and familiarity with fine-tuning LLMs and modern NLP techniques. A basic understanding of Docker and containerization principles, along with the demonstrated ability to experiment, iterate, and deploy code rapidly in a production setting, is essential. You should also possess the ability to learn and adapt quickly in a fast-paced, dynamic environment.

In return, we offer you the opportunity to work on cutting-edge AI technologies and impactful projects, a collaborative and growth-oriented work environment, a competitive compensation and benefits package, as well as the chance to be a part of a team shaping the future of enterprise intelligence.
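A minimal, hedged PyTorch sketch of the kind of training loop involved (synthetic tensors stand in for real preprocessed data; an actual LLM fine-tune would load a pretrained model and tokenizer instead of this toy network):

import torch
from torch import nn

# Synthetic features/labels standing in for a preprocessed dataset
X = torch.randn(256, 32)
y = torch.randint(0, 2, (256,))

model = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()
    logits = model(X)
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")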
Posted 1 week ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Senior Data Scientist JD

As a Senior Data Scientist working at TTEC Digital, you will work on a team with diverse clients and industries to create data-driven solutions, develop cutting-edge algorithms, and provide expert insights to increase our understanding of customer behavior and needs. You will help clients turn data into actionable insights that drive tangible outcomes, improve performance, and help clients lead the market. This position requires in-depth understanding and use of statistical and data analysis tools.

What you'll be doing:
- Build new analytics solutions, as well as analyze pre-existing analytic solutions, provide suggestions on how to evolve them, and improve their efficiency and effectiveness
- Build client-facing presentations to review analytic results, with the ability to convey technical concepts around analytic solutions to non-technical audiences in a compelling story
- Where required, work as an independent contributor on projects, managing end-to-end deliverables
- Handle communications with internal & external stakeholders, scope client requirements, develop analytics frameworks/solutions, and implement and deploy solutions

What skillsets we are looking for:
- 4-6 years in the analytics domain developing & deploying ML/AI solutions
- 1+ years working on NLP projects and text mining, having used LLMs, worked on Hugging Face/Transformer models, and worked on publicly available LLM APIs
- Proficient level of understanding of machine learning and deep learning methods
- Excellent programming skills in Python, Pandas, PySpark & SQL
- Excellent communication skills and the ability to describe findings to technical and non-technical audiences
- Good knowledge of the MS Office suite of products (Excel and PowerPoint)
- Ability to create end-to-end PowerPoint presentations (outline, visualization, results, summary) that are client-ready
- Experience with at least one of the following: Azure, GCP, Databricks, AWS; GCP is preferred

Who we are: TTEC Digital Consulting is a pioneer in customer experience, engagement, and growth solutions. We utilize a holistic approach, applying solutions from both our Engage and Digital segments to help companies provide an amazing experience to their customers, inspire customer loyalty, and grow their business. We provide end-to-end advisory and execution services, giving leaders the confidence and tools to re-think how to compete with a combination of our logic, intellect, and ability to make sense of the data with our creativity and experience.

TTEC Digital Analytics India
12th Floor, SALARPURIA SATTVA KNOWLEDGE CITY, HITEC City, Hyderabad, Telangana 500081
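For illustration of the Hugging Face/Transformers work mentioned above (this sketch downloads a default public model on first run and is not a TTEC deliverable; the feedback strings are invented):

from transformers import pipeline

# Text classification with a default pretrained model from the Hugging Face Hub
classifier = pipeline("sentiment-analysis")

feedback = [
    "The agent resolved my issue quickly, great service.",
    "I waited forty minutes and nobody called back.",
]

for text, result in zip(feedback, classifier(feedback)):
    # Each result is a dict with a predicted label and a confidence score
    print(f"{result['label']:>8}  {result['score']:.3f}  {text}")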
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are looking for a Data Scientist to join our dynamic team dedicated to developing cutting-edge AI-powered search and research tools that are revolutionizing how teams access information and make informed decisions. As a Data Scientist, you will play a crucial role in transforming complex datasets into valuable insights, making an impact at the forefront of productivity and intelligence tool development.

Your responsibilities will include owning and managing the data transformation layer using dbt and SQL, designing scalable data models, maintaining business logic, creating intuitive dashboards and visualizations using modern BI tools, collaborating with various teams to uncover key insights, working with diverse structured and unstructured data sources such as Snowflake and MongoDB, and translating business questions into data-driven recommendations. You will also support experimentation and A/B testing strategies across teams.

The ideal candidate for this role will have a minimum of 4-8 years of experience in analytics, data engineering, or BI roles, with strong proficiency in SQL, dbt, and Python (pandas, plotly, etc.). Experience with BI tools, dashboard creation, and working with multiple data sources is essential. Excellent communication skills are a must, as you will collaborate across global teams. Familiarity with Snowflake, MongoDB, Airflow, startup experience, or a background in experimentation is considered a bonus.

Joining our team means being part of a global effort to redefine enterprise search and research with a clear vision and strong leadership. If you are passionate about solving complex data challenges, enjoy working independently, and thrive in a collaborative environment with brilliant minds, this role offers an exciting opportunity for professional growth and innovation.

Location: Abu Dhabi
Experience: 4-8 years
Role Type: Individual Contributor | Reports to Team Lead (Abu Dhabi)
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Data Scientist in the Automotive industry, you will be responsible for leveraging your expertise in Computer Science, Data Science, Statistics, or related fields to drive impactful insights and solutions. With a minimum of 6 years of hands-on experience in data science and analytics, you will bring a deep understanding of various programming tools such as Python and libraries like Pandas, NumPy, Scikit-learn, and TensorFlow/PyTorch to the table.

Your role will involve working with big data tools like Spark, AWS/GCP data pipelines, or similar platforms to extract valuable information from complex datasets. An in-depth understanding of time-series data, signal processing, and machine learning for spatiotemporal datasets will be essential for this position. Experience with connected vehicle data or telemetry from trucks/autonomous systems will be highly advantageous in this role. Additionally, familiarity with vehicle dynamics, CAN data decoding, or driver behavior modeling will be considered a plus.

Your proficiency in SQL and data visualization tools such as Tableau, Power BI, Plotly, etc., will enable you to communicate your findings effectively to stakeholders. This position offers a unique opportunity to apply your skills in a remote setting for a contract duration of 12 months. If you are passionate about data science, have a keen eye for detail, and enjoy tackling complex challenges in the Automotive industry, we invite you to apply for this exciting opportunity.
Posted 2 weeks ago
0.0 - 3.0 years
0 Lacs
maharashtra
On-site
We are seeking a passionate and motivated Junior Python Data Scientist to join our expanding data team. This position offers an excellent opportunity for recent graduates or individuals with up to one year of experience who are enthusiastic about applying their Python and data analysis skills to real-world challenges. As a Junior Python Data Scientist, you will be involved in meaningful projects focused on data preparation, analysis, and the development of machine learning models, all under the guidance of experienced data scientists.

Your responsibilities will include cleaning, transforming, and analyzing data utilizing Python, pandas, and NumPy. You will also play a key role in supporting the development and evaluation of machine learning models using tools such as scikit-learn and TensorFlow. Additionally, you will conduct statistical analyses to extract valuable insights and identify trends, as well as contribute to building data pipelines and automating data processes. Communication of findings and presentation of insights in a clear and concise manner will be an essential part of your role. Collaboration with cross-functional teams, including data engineers, product managers, and software developers, is also a key aspect of this position.

The ideal candidate for this role should have at least one year of hands-on experience with Python for data analysis or machine learning. Familiarity with pandas, NumPy, scikit-learn, and TensorFlow is required. A solid understanding of core statistical concepts and basic exploratory data analysis (EDA) is essential. Knowledge of machine learning models such as linear regression, decision trees, or classification algorithms is preferred, and exposure to advanced ML algorithms and Deep Learning is a plus. Strong problem-solving and analytical thinking skills, along with good communication and documentation abilities, are also important qualities we are looking for.

A Bachelor's degree in Computer Science, Data Science, Statistics, Mathematics, or a related field is required, or you should currently be pursuing one. Preferred qualifications include completed coursework, certifications, or personal projects related to ML or Data Science. Exposure to version control (e.g., Git), Jupyter notebooks, or cloud environments is advantageous. We are looking for candidates who are enthusiastic about learning and growing in the data science field.
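For illustration only (synthetic data rather than a company dataset), the model-building and evaluation workflow a junior data scientist might follow could look like this scikit-learn sketch:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic tabular data: 500 rows, 5 features, binary target
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Scaling and logistic regression chained into one pipeline
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

print("Test accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))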
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
ahmedabad, gujarat
On-site
I Vision Infotech is a full-fledged IT company established in 2011, providing top-notch, cost-effective, and dependable web and e-commerce solutions. Situated in India, the company caters to a diverse clientele comprising both national and international customers from countries like the USA, Malaysia, Australia, Canada, and the United Kingdom. The services offered encompass a broad spectrum including web design, development, e-commerce, and mobile app development on multiple platforms such as Android, iOS, Windows, and Blackberry.

This full-time on-site position is for a Python Developer specializing in Data Science & Django, based in Ahmedabad. The Python Developer will play a pivotal role in building back-end web applications, crafting efficient and sustainable code, and collaborating with cross-functional teams to fuse user-facing components with server-side logic. Daily responsibilities encompass the design, development, and maintenance of software applications and databases, ensuring optimal performance and responsiveness. This role entails object-oriented programming, software development, and close collaboration with data science teams to implement and enhance data-driven solutions.

Key Responsibilities:
- Crafting clean and efficient Python code
- Developing web applications utilizing Django
- Engaging in data collection, processing, and visualization through Pandas/NumPy/Matplotlib
- Constructing REST APIs using Django REST Framework
- Collaborating with the team for real-world project deployments

Required Skills:
- Profound expertise in Python programming
- Fundamental comprehension of data science libraries (Pandas, NumPy, Matplotlib, scikit-learn)
- Practical experience with Django or Django REST Framework
- Understanding of HTML, CSS, JavaScript (basic level)
- Knowledge of Git is considered advantageous
- Qualification: B.E/B.Tech/BCA/MCA/BSc/MSc (CS/IT) or equivalent

Perks and Benefits:
- Exposure to real-time projects
- Internship/Training certificate
- Flexible working hours
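A hedged sketch of a Django REST Framework API of the kind listed above (the Book model and field names are hypothetical, and it assumes an existing Django project with rest_framework installed):

# models.py
from django.db import models

class Book(models.Model):
    title = models.CharField(max_length=200)
    author = models.CharField(max_length=100)
    published = models.DateField()

# serializers.py
from rest_framework import serializers

class BookSerializer(serializers.ModelSerializer):
    class Meta:
        model = Book
        fields = "__all__"

# views.py
from rest_framework import viewsets

class BookViewSet(viewsets.ModelViewSet):
    # Full CRUD (list/retrieve/create/update/delete) for Book records
    queryset = Book.objects.all()
    serializer_class = BookSerializer

# urls.py
from rest_framework.routers import DefaultRouter

router = DefaultRouter()
router.register(r"books", BookViewSet)
urlpatterns = router.urls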
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
We are seeking a highly skilled Python Developer with at least 7 years of experience to join our dynamic team. You should have strong proficiency in Python, particularly in scripting and automation. Hands-on experience with API development and integration using technologies like REST, JSON, OAuth2, etc. is essential. We expect you to have expertise in Python libraries such as requests, httpx, json, unittest, pytest, pandas, and logging. Additionally, you should be knowledgeable about API testing tools like Postman and Swagger, or automation through Python scripts. Familiarity with version control systems like Git and continuous integration tools such as Jenkins or GitLab CI would be a plus. If you are passionate about Python development, API integration, and automation, and possess the required skills and experience, we would like to hear from you.
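As an illustrative sketch only (the base URL, token, and payload are placeholders, not the employer's real API), API automation combining requests and pytest might look like:

import requests

BASE_URL = "https://api.example.com"  # placeholder endpoint
TIMEOUT = 10


def create_order(payload: dict) -> requests.Response:
    # An OAuth2 bearer token would normally be fetched first and passed here
    headers = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}
    return requests.post(f"{BASE_URL}/orders", json=payload, headers=headers, timeout=TIMEOUT)


def test_create_order_returns_201():
    response = create_order({"sku": "ABC-123", "quantity": 2})
    assert response.status_code == 201
    body = response.json()
    assert body["quantity"] == 2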
Posted 2 weeks ago
5.0 - 24.0 years
0 Lacs
pune, maharashtra
On-site
As an AI Engineer at YASH Technologies, you will have the opportunity to utilize your 2-4 years of experience in the field to make a significant impact. You will be a part of a team that is dedicated to leveraging cutting-edge technologies to drive business transformation and deliver exceptional stakeholder experiences.

Your primary responsibility will be to build and deploy machine learning models for classification, regression, and clustering tasks. You will also apply foundational GenAI concepts such as embeddings, summarization, and RAG, using tools like LangChain and vector databases. Additionally, you will be expected to prepare documentation and interpret results to drive actionable insights.

To excel in this role, you must have strong hands-on experience in Python, scikit-learn, and Pandas. Knowledge of model evaluation, feature engineering, and model tuning is essential, along with exposure to LangChain and vector databases. Basic familiarity with FastAPI or Flask is also preferred.

At YASH, we believe in empowering our employees to shape their careers while working in a collaborative and inclusive environment. We provide career-oriented skilling models and opportunities for continuous learning, unlearning, and relearning at a rapid pace. Our Hyperlearning workplace is guided by principles such as flexible work arrangements, open collaboration, support for business goals, and a stable employment environment with an ethical corporate culture. Join us at YASH Technologies and be a part of a team that is driving positive changes in an increasingly virtual world.
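A small, hedged sketch of the model-tuning side of such a role (synthetic data and an arbitrary hyperparameter grid, shown only to illustrate scikit-learn usage):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 5, 10],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,
    scoring="f1",
)
search.fit(X, y)

print("Best params:", search.best_params_)
print("Best CV F1:", round(search.best_score_, 3))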
Posted 2 weeks ago
5.0 - 9.0 years
0 - 0 Lacs
karnataka
On-site
You will be responsible for building and interpreting machine learning models on real business data from the SigView platform, such as Logistic Regression, Boosted Trees (Gradient Boosting), Random Forests, and Decision Trees. Your tasks will include identifying data sources, integrating multiple sources or types of data, and applying data analytics expertise within a data source to develop methods to compensate for limitations and extend the applicability of the data.

Moreover, you will be expected to extract data from relevant data sources, including internal systems and third-party data sources, through manual and automated web scraping. Your role will involve validating third-party metrics by cross-referencing various syndicated data sources and determining the numerical variables to be used in the same form as they appear in the raw datasets, categorized into buckets, and used to create new calculated numerical variables.

You will perform exploratory data analysis using PySpark to finalize the list of compulsory variables necessary to solve the business problem and transform formulated problems into implementation plans for experiments by applying appropriate data science methods, algorithms, and tools. Additionally, you will work with offshore teams post data preparation to identify the best statistical model/analytical solution that can be applied to the available data to solve the business problem and derive actionable insights.

Your responsibilities will also include collating the results of the models and preparing detailed technical reports showcasing how the models can be used and modified for different scenarios in the future to develop predictive insights. You will develop multiple reports to facilitate the generation of various business scenarios and provide features for users to generate scenarios.

Furthermore, you will be interpreting the results of tests and analyses to develop insights into formulated problems within the business/customer context and provide guidance on risks and limitations. Acquiring and using broad knowledge of innovative data analytics methods, algorithms, and tools, including Spark, Elasticsearch, Python, Databricks, Azure, Power BI, Azure Cloud services, LLMs/Gen AI, and the Microsoft Suite, will be crucial for success in this role. This position may involve telecommuting and requires 10% travel nationally to meet with clients.

The minimum requirements for this role include a Bachelor's Degree in Electronics Engineering, Computer Engineering, Data Analytics, Computer Science, or a related field plus five (5) years of progressive experience in the job offered or a related occupation. Special skill requirements for this role include applying statistical methods to validate results and support strategic decisions, building and interpreting advanced machine learning models, using various tools such as Python, Scikit-Learn, XGBoost, Databricks, Excel, and Azure Machine Learning for data preparation and model validation, integrating diverse data sources using data analytics techniques, and performing data analysis and predictive model development using AI/ML algorithms. Your mathematical knowledge of Statistics, Probability, Differentiation and Integration, Linear Algebra, and Geometry will be beneficial. Familiarity with Data Science libraries such as NumPy, SciPy, and Pandas, Azure Data Factory for data pipeline design, NLTK, spaCy, Hugging Face Transformers, Azure Text Analytics, OpenAI, Word2Vec, and BERT will also be advantageous.

The base salary for this position ranges from $171,000 to $190,000 per annum for 40 hours per week, Monday to Friday. If you have any applications, comments, or questions regarding the job opportunity described, please contact Piyush Khemka, VP, Business Operations, at 111 Town Square Pl., Suite 1203, Jersey City, NJ 07310.
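For illustration only (toy records in place of SigView platform data), an exploratory PySpark pass of the kind described might look like:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("exploratory-analysis").getOrCreate()

# Toy records standing in for raw platform data
df = spark.createDataFrame(
    [("mobile", 120.0), ("desktop", 80.5), ("mobile", None), ("tablet", 45.0)],
    ["device", "revenue"],
)

# Basic profiling: null counts and per-category aggregates
df.select(
    F.count(F.when(F.col("revenue").isNull(), 1)).alias("revenue_nulls")
).show()

df.groupBy("device").agg(
    F.count("*").alias("rows"),
    F.avg("revenue").alias("avg_revenue"),
).show()

spark.stop()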
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
kakinada, andhra pradesh
On-site
Nyros is a Web & Mobile apps development agency with a global team of designers, developers, and managers. The goal at Nyros is to expand opportunities for anyone to imagine, design, and make a better world. We are seeking a Junior Fullstack Developer who excels in Python-based backends and possesses a strong foundation in frontend development. If you have at least 6 months of real project experience, a passion for clean code, and are eager to work on end-to-end features across web and data-rich applications, then this opportunity is perfect for you. In this role, you will collaborate closely with senior engineers to create full-stack solutions for platforms that incorporate APIs, dashboards, databases, and map-based data.

**Responsibilities:**
- Develop backend APIs using Python (Django/Flask).
- Create responsive UI utilizing HTML, CSS, JavaScript, and modern JS frameworks (React preferred).
- Integrate REST APIs to establish connections between frontend and backend systems.
- Manage data processing tasks with Pandas and NumPy, and establish connections with PostgreSQL/MySQL databases.
- Utilize Git for version control, proficiently debug code, and deploy features end-to-end.
- Engage in sprint activities with a cross-functional team to foster collaboration and productivity.

**Skills Required:**
- Minimum of 6 months of experience in fullstack web development.
- Proficient in Python and a web framework such as Django or Flask.
- Comfortable with JavaScript, React or Vue, and basic UI design.
- Familiarity with SQL and practical experience with databases in projects.
- Understanding of REST APIs, modular programming, and clean architecture principles.
- Eagerness to learn and develop expertise in both backend and frontend domains.

**Good to have:**
- Previous experience in building or contributing to dashboards, admin panels, or geospatial projects.
- Knowledge of PostGIS, GeoPandas, or Leaflet.js/Mapbox for map-based UI development.
- Exposure to Docker, CI/CD practices, or cloud deployment procedures.

*To Apply:* Please submit your resume along with your GitHub projects to hr@nyros.com

*Job Types:* Full-time, Fresher

*Application Question(s):*
- What is your current location?

*Work Location:* In person
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
vadodara, gujarat
On-site
As a Machine Learning Engineer, you will be responsible for designing and implementing scalable machine learning models throughout the entire lifecycle - from data preprocessing to deployment. Your role will involve leading feature engineering and model optimization efforts to enhance performance and accuracy. Additionally, you will build and manage end-to-end ML pipelines using MLOps practices, ensuring seamless deployment, monitoring, and maintenance of models in production environments.

Collaboration with data scientists and product teams will be key in understanding business requirements and translating them into effective ML solutions. You will conduct advanced data analysis, create visualization dashboards for insights, and maintain detailed documentation of models, experiments, and workflows. Moreover, mentoring junior team members on best practices and technical skills will be part of your responsibilities to foster growth within the team.

In terms of required skills, you must have at least 3 years of experience in machine learning development, with a focus on the end-to-end model lifecycle. Proficiency in Python using Pandas, NumPy, and Scikit-learn for advanced data handling and feature engineering is crucial. Strong hands-on expertise in TensorFlow or PyTorch for deep learning model development is also a must-have.

Desirable skills include experience with MLOps tools like MLflow or Kubeflow for model management and deployment, familiarity with big data frameworks such as Spark or Dask, and exposure to cloud ML services like AWS SageMaker or GCP AI Platform. Additionally, working knowledge of Weights & Biases and DVC for experiment tracking and versioning, as well as experience with Ray or BentoML for distributed training and model serving, will be considered advantageous.

Join our team and contribute to cutting-edge machine learning projects while continuously improving your skills and expertise in a collaborative and innovative environment.
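A hedged sketch of experiment tracking with MLflow as mentioned above (the experiment name, parameters, and metric are illustrative, not a real project):

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("example-churn-model")  # illustrative experiment name

with mlflow.start_run():
    params = {"n_estimators": 200, "learning_rate": 0.05}
    model = GradientBoostingClassifier(**params).fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    mlflow.log_params(params)
    mlflow.log_metric("test_auc", auc)
    mlflow.sklearn.log_model(model, "model")  # logged artifact for later deployment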
Posted 2 weeks ago
2.0 - 6.0 years
0 - 0 Lacs
punjab
On-site
As an SDE-II Python Developer at our company, you will be an integral part of our Operations department, working in an in-office setting on a full-time basis. We are seeking an experienced Python Developer with 2 to 5 years of expertise in developing scalable backend applications and APIs using modern Python frameworks. Your role will involve collaborating with the design, frontend, and DevOps teams to deliver high-performance solutions.

Your key responsibilities will include developing, testing, and maintaining backend applications using frameworks like Django, Flask, or FastAPI. You will also be responsible for building RESTful APIs, integrating third-party services, and leveraging data handling libraries such as Pandas and NumPy for efficient data processing. Additionally, you will participate in code reviews, mentor junior developers, and troubleshoot production issues with a proactive approach.

To succeed in this role, you must possess 2 to 5 years of experience in backend development with Python, proficiency in core and advanced Python concepts, and a strong command over at least one Python framework. You should also have experience with data libraries, authentication mechanisms, and version control systems, and be comfortable working in Linux environments.

The must-have skills for this position include expertise in backend Python development, strong debugging and optimization skills, experience with API development and microservices architecture, and a deep understanding of software design principles and security best practices. Additionally, good-to-have skills include experience with Generative AI frameworks, Machine Learning libraries, containerization tools, web servers, asynchronous programming, Agile practices, CI/CD pipelines, and cloud platforms.

At our company, we specialize in delivering cutting-edge solutions in custom software, web, and AI development. Our work culture emphasizes a unique blend of in-office and remote collaboration, prioritizing employee well-being. You will have access to competitive salary packages, generous time off, continuous learning opportunities, and client exposure to enhance your professional growth. Join us in an environment that values continuous learning, leadership opportunities, and mutual respect, supporting you in achieving your fullest potential.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
The Content and Data Analytics team is a part of DataOps within Global Operations at Elsevier, focusing on providing data analysis services primarily using Databricks. The team primarily serves product owners and data scientists of Elsevier's Research Data Platform, contributing to the delivery of leading data analytics products for scientific research, including Scopus and SciVal.

As a Senior Data Analyst at Elsevier, you are expected to have a solid understanding of best practices and the ability to execute projects and initiatives independently. You should be capable of creating advanced-level insights and recommendations, as well as leading analytics efforts with high complexity autonomously. Your responsibilities will include supporting data scientists within the Domains of the Research Data Platform, engaging in various analytical activities such as analyzing large datasets, performing data preparation, and reviewing data science algorithms. You must possess a keen eye for detail, strong analytical skills, expertise in at least one data analysis system, curiosity, dedication to quality work, and an interest in scientific research.

The requirements for this role include a minimum of 5 years of work experience, coding skills in at least one programming language (preferably Python) and SQL, familiarity with string manipulation functions like regular expressions, prior exposure to data analysis tools such as Pandas or Apache Spark/Databricks, knowledge of basic statistics relevant to data science, and experience with visualization tools like Tableau/Power BI. You will be expected to build and maintain strong relationships with Data Scientists and Product Managers, align activities with stakeholders, and present achievements and project updates to various stakeholders. Key competencies for this role include collaborating effectively as part of a team, taking initiative in problem-solving, and driving tasks to successful conclusions.

Elsevier offers various benefits to promote a healthy work-life balance, including well-being initiatives, shared parental leave, study assistance, and sabbaticals. Additionally, the company provides comprehensive health insurance, flexible working arrangements, employee assistance programs, modern family benefits, various paid time off options, and subsidized meals. The company prides itself on being a global leader in information and analytics, supporting science, research, health education, and interactive learning while addressing the world's challenges and fostering a sustainable future.
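Illustrative only (invented strings, not Scopus records): the regular-expression and string-manipulation work referenced above might look like this in Pandas:

import pandas as pd

# Toy affiliation strings standing in for real bibliographic records
records = pd.Series([
    "Dept. of Physics, Univ. of Amsterdam, Netherlands (2021)",
    "School of Medicine, Kyoto University, Japan (2019)",
    "Institute of Data Science, ETH Zurich, Switzerland (2022)",
])

# Extract the trailing year with a regular-expression capture group
years = records.str.extract(r"\((\d{4})\)$", expand=False).astype(int)

# Normalise whitespace and strip the parenthesised year
cleaned = (
    records.str.replace(r"\s*\(\d{4}\)$", "", regex=True)
           .str.replace(r"\s+", " ", regex=True)
           .str.strip()
)

print(pd.DataFrame({"affiliation": cleaned, "year": years}))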
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
As a Python Lead, your primary responsibility will be to develop and maintain robust, scalable, and efficient Python AI/ML & web applications that integrate with cloud services. You will utilize web development frameworks such as Django and Flask to build user-facing interfaces. Your role will involve developing, testing, and maintaining high-quality analytics applications using Python in conjunction with DevOps & cloud development practices. It is essential that you write clean, maintainable, and efficient code to ensure the overall quality of the applications.

In this position, you will participate in code reviews and actively contribute to team knowledge sharing sessions. Performing unit testing and integration testing will also be part of your routine to guarantee the quality of the codebase. You will work closely with SQL databases to store, retrieve, and manipulate data as required by the applications. Additionally, you will leverage data analysis libraries like Pandas and NumPy for data exploration and manipulation tasks.

To be successful in this role, you should have at least 2-3 years of experience in developing and deploying analytics applications with a focus on AI/ML, DevOps, and cloud technologies. A strong understanding of object-oriented programming (OOP) concepts is essential, along with proficiency in SQL and relational databases such as MySQL and PostgreSQL. Experience with web development frameworks, particularly Django or Flask, is preferred. Working knowledge of data analysis libraries like Pandas and NumPy would be a definite plus. Familiarity with testing frameworks such as unittest and pytest is also desirable.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
maharashtra
On-site
Are you passionate about data, automation, and the world of music? In the role of Data & Automation, you will be at the crossroads of technology, analytics, and entertainment, supporting a leading music label in extracting valuable insights from extensive industry data. This position is ideal for individuals who relish Python scripting, dashboarding, and dealing with substantial datasets while gaining an exclusive glimpse into the music business.

You will be responsible for:
- Automating data gathering and processing using Python libraries such as Pandas, NumPy, Selenium, Beautiful Soup, etc. Demonstrated experience through projects, internships, or work is essential.
- Developing and enhancing dashboards to monitor trends and performance effectively.
- Aiding in organizing and managing large data volumes sourced from diverse digital platforms.
- Collaborating with marketing, A&R, and digital teams to deliver data-backed insights.
- Implementing data gathering techniques like APIs and web-based sources to enrich reporting.
- Bringing to the table strong analytical capabilities, meticulous attention to detail, and a fervor for music and statistical patterns.

Why Join Us
- Collaborate with a major player in the music industry.
- Acquire firsthand knowledge of how data influences song and artist growth.
- Engage with industry professionals, marketing specialists, and digital platforms.
- Opportunity to translate your love for music and data into a rewarding career trajectory.

Pre-requisites:
- Graduation in B. Tech, MCA, or MSc IT.
- Proficiency in Python and Advanced Excel.
- Location preference: Mumbai.

If you are eager to combine data, automation, and insights from the music industry, seize this incredible opportunity to make a difference!
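A hedged sketch of the data-gathering automation described (the URL and CSS selectors are placeholders for whatever public source is actually used):

import pandas as pd
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/charts/top-songs"  # placeholder page

response = requests.get(URL, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

rows = []
for item in soup.select(".chart-row"):          # placeholder CSS selector
    title = item.select_one(".title").get_text(strip=True)
    streams = item.select_one(".streams").get_text(strip=True)
    rows.append({"title": title, "streams": streams})

df = pd.DataFrame(rows)
df["streams"] = pd.to_numeric(df["streams"].str.replace(",", ""), errors="coerce")
print(df.sort_values("streams", ascending=False).head())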
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Machine Learning and AI Specialist, you will be responsible for architecting and implementing cutting-edge solutions in the fields of Computer Vision, Natural Language Processing (NLP), and Large Language Models (LLMs). Your role will involve integrating these models into application ecosystems, collaborating with data engineering teams to develop robust pipelines, and conducting AI/ML research to ensure that our solutions stay ahead of industry trends.

In addition to your technical responsibilities, you will also play a key role in mentoring junior team members, collaborating with stakeholders across departments, and ensuring the scalability and efficiency of our ML models. Participation in code reviews, planning discussions, and overall contribution to the team's success will be crucial aspects of your day-to-day work.

To excel in this role, you must possess a strong foundation in programming, particularly in Python, and have experience working with frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, OpenAI, and Llama. Proficiency in data libraries like NumPy and Pandas, along with expertise in domains like Computer Vision, NLP, LLMs, and Data Science, will be essential for success.

The ideal candidate for this position will hold a Bachelor's or Master's degree in Computer Science, Data Science, or a related field, and have a minimum of 4 years of experience in Data Science and Machine Learning, with at least 2 years of experience in enterprise applications. Strong analytical and communication skills, along with the ability to manage multiple projects in a fast-paced environment, will be key to thriving in this role.

This is a full-time position based in Noida, with a day shift schedule that requires in-person work at the designated location. If you are passionate about leveraging AI and ML technologies to drive innovation and solve complex challenges, we encourage you to apply and join our dynamic team.
Posted 2 weeks ago
6.0 - 11.0 years
19 - 34 Lacs
Bengaluru
Work from Office
Role Overview
We are seeking a skilled Data Analyst to support our platform powering operational intelligence across airports and similar sectors. The ideal candidate will have experience working with time-series datasets and operational information to uncover trends, anomalies, and actionable insights. This role will work closely with data engineers, ML teams, and domain experts to turn raw data into meaningful intelligence for business and operations stakeholders.

Key Responsibilities
- Analyze time-series and sensor data from various sources.
- Develop and maintain dashboards, reports, and visualizations to communicate key metrics and trends.
- Correlate data from multiple systems (vision, weather, flight schedules, etc.) to provide holistic insights.
- Collaborate with AI/ML teams to support model validation and interpret AI-driven alerts (e.g., anomalies, intrusion detection).
- Prepare and clean datasets for analysis and modeling; ensure data quality and consistency.
- Work with stakeholders to understand reporting needs and deliver business-oriented outputs.

Qualifications & Required Skills
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Engineering, or a related field.
- 6+ years of experience in a data analyst role, ideally in a technical/industrial domain.
- Strong SQL skills and proficiency with BI/reporting tools (e.g., Power BI, Tableau, Grafana).
- Hands-on experience analyzing structured and semi-structured data (JSON, CSV, time-series).
- Proficiency in Python or R for data manipulation and exploratory analysis.
- Understanding of time-series databases or streaming data (e.g., InfluxDB, Kafka, Kinesis).
- Solid grasp of statistical analysis and anomaly detection methods.
- Experience working with data from industrial systems or large-scale physical infrastructure.

Good-to-Have Skills
- Domain experience in airports, smart infrastructure, transportation, or logistics.
- Familiarity with data platforms (Snowflake, BigQuery, or custom-built using open source).
- Exposure to tools like Airflow, Jupyter Notebooks, and data quality frameworks.
- Basic understanding of AI/ML workflows and data preparation requirements.
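An illustrative sketch (synthetic sensor readings, not airport data) of the rolling-statistics style of anomaly detection such a role touches on:

import numpy as np
import pandas as pd

# Synthetic per-minute sensor readings with a few injected spikes
rng = np.random.default_rng(1)
idx = pd.date_range("2024-01-01", periods=1440, freq="min")
values = rng.normal(loc=20.0, scale=1.0, size=len(idx))
values[[200, 750, 1200]] += 12  # injected anomalies

ts = pd.Series(values, index=idx, name="reading")

# Rolling z-score: flag points far from the recent local mean
window = 60
rolling_mean = ts.rolling(window).mean()
rolling_std = ts.rolling(window).std()
z = (ts - rolling_mean) / rolling_std

anomalies = ts[z.abs() > 4]
print(anomalies)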
Posted 2 weeks ago