
4556 Numpy Jobs - Page 33

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 3.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Location: Ahmedabad, India. Experience: 2-3 years. Job Type: Full Time. Designation: AI/ML Developer. Department: Technical. Job Summary: We are looking for an enthusiastic AI/ML Developer with 2 to 3 years of relevant experience in machine learning and artificial intelligence. The candidate should be well-versed in designing and developing intelligent systems and have a solid grasp of data handling and model deployment. Key Responsibilities: Develop and implement machine learning models tailored to business needs. Develop and fine-tune Generative AI models (e.g., LLMs, diffusion models, VAEs) using platforms like Hugging Face, LangChain, or OpenAI. Conduct data collection, cleaning, and pre-processing for model readiness. Train, test, and optimize models to improve accuracy and performance. Work closely with cross-functional teams to deploy AI models in production environments. Perform data exploration, visualization, and feature selection. Stay up to date with the latest trends in AI/ML and experiment with new approaches. Design and implement Multi-Agent Systems (MAS) for distributed intelligence, autonomous collaboration, or decision-making. Integrate and orchestrate agentic workflows using tools like Agno, CrewAI, or LangGraph. Ensure scalability and efficiency of deployed solutions. Monitor model performance and perform necessary updates or retraining. Requirements: Strong programming skills in Python and experience with libraries like TensorFlow, PyTorch, Scikit-learn, and Keras. Experience working with vector databases (Pinecone, Weaviate, Chroma) for RAG systems. Good understanding of machine learning concepts, including classification, regression, clustering, and deep learning. Knowledge of knowledge graphs, semantic search, or symbolic reasoning. Proficiency in working with tools such as Pandas, NumPy, and data visualization libraries. Hands-on experience deploying models using REST APIs with frameworks like Flask or FastAPI. Familiarity with cloud platforms (AWS, Google Cloud, or Azure) for ML deployment. Knowledge of version control systems like Git. Experience with Natural Language Processing (NLP), computer vision, or predictive analytics. Exposure to MLOps tools and workflows (e.g., MLflow, Kubeflow, Airflow). Basic familiarity with big data frameworks like Apache Spark or Hadoop. Understanding of data pipelines and ETL processes. What We Offer: Opportunity to work on live projects and client interactions. A vibrant and learning-driven work culture. A 5-day work week and flexible work timings. About Company: Techify is the fastest-growing tech company with a talented, passionate, and learning-driven team. Techify's DNA is about solutions and technologies. We are here to help our customers grow their business. Our vision is to become one of the best product engineering companies in India. We put client relationships first; hence our mission is to build software solutions that help clients transform their business by unleashing hidden potential with technology. Our success mantra is: customer first, team second, and we are the third. Our main focus is our customers' and partners' success. Our visionary and experienced team turns innovative ideas into efficient products and software. Our well-defined processes ensure on-time delivery to our partners, giving us an edge over our competitors. The most important pillar in achieving our goals is our dedicated team, and to encourage them and keep them motivated, we have set up a culture that rewards self-development and innovation.
Our cutting-edge services include intensive research and analysis to identify the appropriate technology to achieve the best performance at the lowest possible cost. We take a studied approach to cost, performance, and feature trade-offs to help companies surmount the challenges of delivering high-quality, timely products and services to the marketplace. We can take up any product, whether it is at the stage of defining, designing, verifying, or realizing. Among our recognitions: we won the Grand Challenge at the Vibrant Gujarat Summit 2018, we received the prestigious “Trend Setter” award from the Gujarat Innovation Society, and the Times Coffee Table Book featured us in its “Gujarat: the Inspiring Edge” edition. We are also an Amazon Web Services consulting and networking partner.
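For readers unfamiliar with the model-deployment skill this posting asks for (serving a trained model over a REST API with Flask or FastAPI), here is a minimal illustrative sketch. The model file, feature names, and endpoint are hypothetical examples, not anything specific to Techify.

```python
# Minimal sketch: serving a scikit-learn model over a REST API with FastAPI.
# "model.joblib" and the four feature fields are hypothetical placeholders.
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Prediction service")
model = joblib.load("model.joblib")  # any pre-trained scikit-learn estimator


class Features(BaseModel):
    f1: float
    f2: float
    f3: float
    f4: float


@app.post("/predict")
def predict(features: Features) -> dict:
    # Convert the validated payload into the 2-D array scikit-learn expects.
    x = np.array([[features.f1, features.f2, features.f3, features.f4]])
    prediction = model.predict(x)
    return {"prediction": prediction.tolist()}

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)
```

A production setup of the kind the posting describes would add authentication, logging, and containerized deployment on top of this skeleton.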

Posted 2 weeks ago

Apply

2.0 - 11.0 years

0 Lacs

Maharashtra

On-site

Dear Candidate, Hope you are doing well! CMS Computers Limited (INDIA) is currently hiring for the following positions in Mumbai (Bhandup) on a Work From Office basis: 1) Data Analytics / BI Position: - Experience Required: 2-3 years - Location: Mumbai (Bhandup) WFO - Note: Immediate to 15 days notice period will be considered - Seeking candidates with 2-3 years of experience in Investment, real estate, taxation, finance, and accounts. 2) CA Requirement: - Experience Required: 8+ years - Location: Mumbai (Bhandup) WFO - Note: Immediate to 15 days notice period will be considered - The role involves leading the data Analytics / BI domain and requires the following skills: - Python experience, especially in data analytics projects - Strong backend experience with advanced SQL skills - Experience in working on multiple projects in different domains and use cases - Good communication skills The ideal candidate should be a strong, result-oriented programmer with a can-do attitude, capable of driving existing analytics products and managing teams with a focus on technical expertise and management skills. Key Responsibilities Include: - Writing effective, scalable code - Developing back-end components to enhance responsiveness and performance - Integrating user-facing elements into applications - Testing and debugging programs - Improving the functionality of existing systems - Implementing security and data protection solutions - Assessing and prioritizing feature requests - Coordinating with internal teams to understand user requirements and provide technical solutions Required Skills and Abilities: - 6 to 11 years of hands-on experience with Python - Proficiency in at least one popular Python framework (e.g., Django, Flask, or Pyramid) - Knowledge of object-relational mapping (ORM) - Familiarity with front-end technologies such as React JS, JavaScript, and HTML5 - Experience with Python libraries like Pandas, NumPy, and requests - Working knowledge of SCM tools like Git and Bitbucket - Experience with RDBMS and SQL - Experience in unit testing frameworks like pytest and performance testing - Good communication and debugging skills Good to Have Skills: - Any Cloud Experience - Big Data - Pyspark/Spark-Scala Knowledge - Working knowledge of Kubernetes, dockerization - Experience with Terraform If you are interested, please share your updated resume at chaitali.saha@voqeoit.com. Kindly provide the following details for further processing: - Total Experience - Relevant Experience - Current CTC - Expected CTC - Notice Period - Negotiable Notice Period - Last Working Day (if not currently employed) - Job offers in hand Regards, Chaitali Saha Talent Acquisition Group at Voqeoit Technologies Contact No: 8088779710 Website: www.voqeoit.com,

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Python AWS Developer at Willware Technologies, you will need to have over 5 years of experience working with Python frameworks such as Django, Flask, and FastAPI, along with strong expertise in AWS services. Your role will involve utilizing Python libraries like Pandas, NumPy, and SQLAlchemy, while designing, developing, and maintaining RESTful APIs. It is essential to have experience in API security, authentication, and authorization protocols such as OAuth and JWT. Additionally, hands-on experience in PL/SQL, including Packages, Functions, and Ref Cursors, is required for this role. You should possess strong knowledge of Data Warehouse solutions, Datamart, and ODS concepts, along with an understanding of data normalization and Oracle performance optimization techniques. While not mandatory, good-to-have technical skills for this position include experience with Kubernetes for container orchestration, deploying, managing, and scaling applications on Kubernetes clusters, and proficiency in various AWS services like EC2, S3, RDS, and Lambda. Familiarity with Infrastructure-as-Code tools such as Terraform and CloudFormation, as well as knowledge of SnapLogic for cloud-native integration and designing integration pipelines, would be advantageous in this role.
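As a rough illustration of the SQLAlchemy-based data access this role mentions, the sketch below defines a small ORM model and query. The table, columns, and SQLite connection string are invented placeholders; a real deployment would point at Oracle or AWS RDS instead.

```python
# Minimal sketch of ORM-based data access with SQLAlchemy.
# The "invoices" table, its columns, and the SQLite URL are illustrative only.
from sqlalchemy import create_engine, Column, Integer, String, Numeric
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()


class Invoice(Base):
    __tablename__ = "invoices"
    id = Column(Integer, primary_key=True)
    customer = Column(String(100), nullable=False)
    amount = Column(Numeric(12, 2), nullable=False)


engine = create_engine("sqlite:///demo.db")  # stand-in for Oracle / AWS RDS
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

with Session() as session:
    session.add(Invoice(customer="ACME", amount=199.99))
    session.commit()
    # Query back the rows we just wrote.
    for inv in session.query(Invoice).filter(Invoice.amount > 100):
        print(inv.id, inv.customer, inv.amount)
```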

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Jaipur, Rajasthan

On-site

Amplework Software is a full-stack development agency based in Jaipur (Rajasthan), IND, specializing in end-to-end software development solutions for clients worldwide. We are dedicated to delivering high-quality products that align with business requirements and leverage cutting-edge technologies. Our expertise encompasses custom software development, mobile applications, AI-driven solutions, and enterprise applications. Join our innovative team that drives digital transformation through technology. We are looking for a Mid-Level Python and AI Engineer to join our team. In this role, you will assist in building and training machine learning models using frameworks such as TensorFlow, PyTorch, and Scikit-Learn. You will experiment with pre-trained AI models for NLP, Computer Vision, and Predictive Analytics. Additionally, you will work with structured and unstructured data, collaborate with data scientists and software engineers, and continuously learn, experiment, and optimize models to enhance performance and efficiency. Ideal candidates should possess a Bachelor's degree in Computer Science, Engineering, AI, or a related field and proficiency in Python with experience in writing optimized and clean code. Strong problem-solving skills, understanding of machine learning concepts, and experience with data processing libraries are required. Familiarity with AI models and neural networks using frameworks like Scikit-Learn, TensorFlow, or PyTorch is essential. Preferred qualifications include experience with NLP using transformers, BERT, GPT, or OpenAI APIs, AI model deployment, database querying, and participation in AI-related competitions or projects. Soft skills such as analytical thinking, teamwork, eagerness to learn, and excellent English communication skills are highly valued. Candidates who excel in problem-solving, possess a willingness to adapt and experiment, and prefer a dynamic environment for AI exploration are encouraged to apply. A face-to-face interview will be conducted, and applicants should be able to attend the interview at our office location. Join the Amplework Software team to collaborate with passionate individuals, work on cutting-edge projects, make a real impact, enjoy competitive benefits, and thrive in a great working environment.,

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

As a Backend Developer at Maitri AI, you will be responsible for developing, optimizing, and maintaining backend services using FastAPI. Your role will involve implementing API authentication, authorization, and security best practices while working with SQLAlchemy for ORM-based database management. You will design, query, and optimize databases using MySQL, utilizing NumPy and Pandas for data processing and transformation. Collaboration with frontend developers to seamlessly integrate APIs and maintaining CI/CD workflows with GitHub will be crucial aspects of your responsibilities. Additionally, you will be expected to research and integrate Machine Learning and Generative AI solutions such as Gemini AI and OpenAI. Troubleshooting and resolving backend-related issues, staying updated with the latest advancements in backend technologies and AI, are also key parts of your role. To qualify for this position, you should have at least 2 years of experience in backend development, a Bachelor's degree in Computer Science, Information Technology, or a relevant field, proficiency in Python and experience with FastAPI. Hands-on experience with SQLAlchemy and MySQL databases, a strong understanding of API authentication and security principles, and experience working with GitHub for version control and collaboration are essential. Preferred skills include experience with Machine Learning and Generative AI models, knowledge of containerization technologies like Docker, and exposure to cloud services such as AWS, GCP, or Azure. This role offers the opportunity to work on cutting-edge AI and backend projects, competitive salary and benefits, and a collaborative work environment that encourages learning and growth. Join us at Maitri AI and be a part of an innovative team at the forefront of AI and technology. This is a full-time position with a day shift schedule, requiring in-person work at our Borivali West, Mumbai location.,

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are a Senior Python Developer with a minimum of 5 years of experience in Python, specifically with expertise in Django. You have hands-on experience with Azure Web Application Services, Web Services, and API Development. Additionally, you are proficient in working with databases such as Postgres and have knowledge of API documentation tools like swagger or RAML. Your skill set includes Python libraries like Pandas, NumPy, and familiarity with Object Relational Mapper (ORM) libraries. You can integrate multiple data sources and databases into a single system and understand threading limitations and multi-process architecture in Python. Moreover, you have a basic understanding of front-end technologies like JavaScript, HTML5, and CSS3. You are knowledgeable about user authentication and authorization between multiple systems, servers, and environments. You understand the fundamental design principles behind scalable applications and have experience with event-driven programming in Python. You can optimize output for different delivery platforms such as mobile vs. desktop and create database schemas that support business processes. Strong unit testing and debugging skills are part of your expertise, along with proficiency in code versioning tools like Git, Mercurial, or SVN. You are familiar with ADS, Jira, Kubernetes, and CI/CD tools, as well as have knowledge of data mining and algorithms. Your primary skill set includes Python, SciPy, NumPy, Django, Django ORM, MySQL, DRF, and basic knowledge of Azure. As a Full Stack Developer in the IT Services & Consulting industry, you are responsible for software development and engineering. You hold a B.Tech/B.E. degree in Electronics/Telecommunication or Computers. The job location is in Noida, and you are expected to work full-time in a permanent role. The company, Crosslynx US LLC, is established in Product Engineering and Cloud Engineering, delivering innovative solutions to enterprises worldwide. With expertise in Trustworthy and Explainable AI, Embedded Software, Cloud Application Development, RF Design & Testing, and Quality Assurance, the company offers a healthy work environment with work-life balance. Employees have opportunities to collaborate with clients, work on innovative projects, and learn about cutting-edge technologies. If you possess the technical aptitude and skills required for this role, you are encouraged to apply and join the team at Crosslynx to boost your career.,

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You will be joining a forward-thinking team at Tecblic, a company specializing in delivering AI-driven solutions to empower businesses in the digital age. If you are passionate about Large Language Models (LLMs), machine learning, and pushing the boundaries of Agentic AI, we welcome you to be a part of our innovative team. As a Machine Learning Engineer at Tecblic, your key responsibilities will include conducting research and development to design and fine-tune machine learning models, with a specific focus on LLMs and Agentic AI systems. You will be responsible for optimizing pre-trained LLMs for domain-specific use cases, collaborating with software engineers and product teams to integrate AI models into customer-facing applications, and performing data preprocessing, pipeline creation, feature engineering, and exploratory data analysis for dataset preparation. Additionally, you will design and implement model deployment pipelines, prototype innovative solutions using cutting-edge techniques, and provide technical mentorship to junior team members. In terms of core technical skills, we require proficiency in Python for machine learning and data science tasks, expertise in ML frameworks and libraries like PyTorch, TensorFlow, and Hugging Face, a solid understanding of LLMs such as GPT, T5, BERT, or Bloom, experience in NLP tasks, knowledge of deep learning architectures, strong data manipulation skills, and familiarity with cloud services and ML model deployment tools. Exposure to Agentic AI, MLOps tools, generative AI models, prompt engineering, few-shot/fine-tuned approaches for LLMs, vector databases, version control, and collaborative development practices would be considered advantageous. Furthermore, we are looking for individuals with a strong analytical and mathematical background, including proficiency in linear algebra, statistics, and probability, as well as a solid understanding of algorithms and data structures to solve complex ML problems. Soft skills such as excellent problem-solving, critical-thinking, communication, collaboration, self-motivation, and continuous learning mindset are also highly valued. If you are ready to contribute your expertise and passion for machine learning and AI to a dynamic and innovative team, Tecblic is the place for you. Join us in pushing the boundaries of technology and creating impactful AI-driven solutions for businesses in the digital era.,

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Data Scientist at UST, you will independently develop data-driven solutions to tackle complex business challenges by utilizing analytical, statistical, and programming skills to collect, analyze, and interpret large datasets under supervision. Your role involves working closely with stakeholders across the organization to identify opportunities for leveraging customer data in creating models that generate valuable business insights. You will be responsible for creating new experimental frameworks, building automated tools for data collection, correlating similar datasets, and deriving actionable results. Additionally, you will develop predictive models and machine learning algorithms to analyze vast amounts of information, uncover trends, and patterns. By mining and analyzing data from company databases, you will drive optimization and improvement in product development, marketing techniques, and business strategies. Moreover, you will be expected to create processes and tools to monitor and analyze model performance, data accuracy, and develop data visualization and illustrations to address business problems effectively. Your role will also involve using predictive modeling to enhance and optimize customer experiences and other business outcomes. Collaboration with different functional teams to implement models and monitor outcomes is essential. You will be required to set and provide feedback on FAST goals for reportees. Key responsibilities include applying statistical techniques like regression, properties of distributions, and statistical tests to analyze data, as well as utilizing machine learning techniques such as clustering, decision tree learning, and artificial neural networks for data analysis. Moreover, you will be creating advanced algorithms and statistics through regression, simulation, scenario analysis, and modeling. In terms of data visualization, you will visualize and present data for stakeholders using tools like Periscope, Business Objects, D3, ggplot, etc. You will oversee the activities of analyst personnel, ensuring efficient execution of their duties, while mining the business's database for critical insights and communicating findings to relevant departments. Other expectations include creating efficient and reusable code for data improvement, manipulation, and analysis, managing project codebase through version control tools like git and bitbucket, and creating reports depicting trends and behaviors from analyzed data. You will also be involved in training end users on new reports and dashboards, documenting your work, conducting peer reviews, managing knowledge, and reporting task status. The ideal candidate will possess excellent pattern recognition and predictive modeling skills, a strong background in data mining and statistical analysis, expertise in machine learning techniques, and advanced algorithm creation. Moreover, analytical, communication, critical thinking, attention to detail, mathematical, and interpersonal skills are crucial for success in this role. Proficiency in programming languages such as Java, Python, R, web services like Redshift and S3, statistical and data mining techniques, computing tools, analytical languages, data visualization software, mathematics, spreadsheet tools, DBMS, operating systems, and project management tools is required. 
Additionally, a strong understanding of statistical concepts, SQL, machine learning (Regression and Classification), deep learning (ANN, RNN, CNN), advanced NLP, computer vision, Gen AI/LLM, AWS SageMaker/Azure ML/Google Vertex AI, and basic implementation experience of Docker, Kubernetes, Kubeflow, MLOps, and Python (numpy, pandas, sklearn, streamlit, matplotlib, seaborn) is essential for this role.
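To make the modelling techniques listed above concrete, here is a short, generic scikit-learn sketch covering regression with a held-out test set and k-means clustering, run on synthetic data rather than any UST dataset.

```python
# Illustrative sketch of two techniques named above (regression and clustering)
# using scikit-learn on synthetic data; not tied to any particular project.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Regression evaluated on a held-out test set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
reg = LinearRegression().fit(X_train, y_train)
print("R^2 on test data:", r2_score(y_test, reg.predict(X_test)))

# Unsupervised clustering of the same features.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("Cluster sizes:", np.bincount(labels))
```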

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

NTT DATA is looking for a Senior Python Engineer to join the team in Bangalore, Karnataka (IN-KA), India. As a Senior Python Engineer, you will be part of the C3 Data Warehouse team, focusing on building the next-gen data platform that sources and stores data from various technology systems across the firm into a centralized data platform. This platform empowers reporting and analytics solutions for the Technology Risk functions within Morgan Stanley. Your primary responsibilities will include contributing to the development of a unified data pipeline framework in Python, utilizing technologies such as Airflow, DBT, Spark, and Snowflake. Additionally, you will integrate this framework with existing internal platforms for data quality, cataloging, discovery, incident logging, and metric generation. Collaboration with data warehousing leads, data analysts, ETL developers, infrastructure engineers, and data analytics teams will be essential to successfully implement the data platform and pipeline framework. Your key duties will include developing various components in Python for the unified data pipeline framework, establishing best practices for optimal Snowflake usage, assisting with testing and deployment using standard frameworks and CI/CD tooling, monitoring query performance and data loads, and providing guidance during QA & UAT phases to identify issues and determine the best resolutions. The ideal candidate should have at least 5 years of experience in data development and solutions in complex data environments, with expertise in developing data pipelines and warehousing solutions using Python and libraries like Pandas, NumPy, and PySpark. Experience in hybrid data environments (on-prem and cloud) and exposure to Power BI/Snowflake are also required. NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. Committed to helping clients innovate, optimize, and transform for long-term success, NTT DATA has diverse experts in over 50 countries and a robust partner ecosystem. Services provided include business and technology consulting, data and artificial intelligence, industry solutions, as well as development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure globally and is part of the NTT Group, investing significantly in R&D to support organizations and society in transitioning confidently and sustainably into the digital future. Visit us at us.nttdata.com.
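For context, a transformation step of the kind this role describes might look like the hedged PySpark sketch below; the input file, column names, and output path are placeholders, not the actual C3 Data Warehouse pipeline.

```python
# Rough sketch of a PySpark aggregation step of the kind a pipeline framework runs.
# "events.csv", its columns, and the output path are invented placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("risk-metrics-sketch").getOrCreate()

events = spark.read.csv("events.csv", header=True, inferSchema=True)

# Aggregate daily event counts per source system (a simple metric-generation step).
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("system", "event_date")
    .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").parquet("daily_event_counts")
```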

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Noida, Uttar Pradesh

On-site

Location: Noida / Gurgaon (Onsite – 5 Days a Week). Experience: 4 to 7 Years. Employment Type: Full-Time. Background Verification (BGV): Mandatory post-selection. About the Role: We are looking for highly skilled Python Developers who are passionate about building scalable and reliable backend solutions. You will work in a dynamic, collaborative environment and contribute to robust system architecture, core development, and feature optimization using modern Python frameworks, libraries, and tools. Key Responsibilities: Design, develop, test, and maintain backend components using Python and modern frameworks. Apply strong knowledge of OOP concepts, data structures, and algorithms to write clean, efficient, and scalable code. Work with Python libraries such as NumPy, Pandas, SciPy, and Scikit-learn to build data-driven solutions. Collaborate with DevOps teams for seamless deployment using Docker and CI/CD pipelines. Work with MySQL databases to design schemas, write queries, and ensure data integrity. Manage version control and code with Git. Collaborate with cross-functional teams, participate in code reviews, and contribute to agile development. Job Type: Full-time. Pay: ₹70,000.00 - ₹80,000.00 per month. Location Type: In-person. Schedule: Day shift. Experience: Python: 5 years (Required), SQL: 3 years (Required), Git: 1 year (Required). Location: Noida, Uttar Pradesh (Required). Work Location: In person. Speak with the employer: +91 8851582342

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Punjab

On-site

We are seeking a passionate Data Science fresher who has completed at least 6 months of practical training, internship, or project experience in the data science field. In this role, you will have the exciting opportunity to utilize your analytical and problem-solving skills on real-world datasets while collaborating closely with experienced data scientists and engineers. Your responsibilities will include assisting in data collection, cleaning, and preprocessing from various sources, supporting the team in building, evaluating, and optimizing machine learning models, performing exploratory data analysis (EDA) to extract insights and patterns, working on data visualization dashboards and reports using tools like Power BI, Tableau, or Matplotlib/Seaborn, collaborating with senior data scientists and domain experts on ongoing projects, documenting findings, code, and models in a structured manner, and continuously learning and adopting new techniques, tools, and frameworks. To be successful in this role, you should hold a Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, or a related field, along with a minimum of 6 months of internship/training experience in data science, analytics, or machine learning. Proficiency in Python (Pandas, NumPy, Scikit-learn, etc.), understanding of machine learning algorithms (supervised/unsupervised), knowledge of SQL and database concepts, familiarity with data visualization tools/libraries, and a basic understanding of statistics and probability are required technical skills. Additionally, you should possess strong analytical thinking and problem-solving abilities, good communication and teamwork skills, and an eagerness to learn and grow in a dynamic environment. Exposure to cloud platforms (AWS, GCP, Azure), experience with big data tools (Spark, Hadoop), and knowledge of deep learning frameworks (TensorFlow, PyTorch) are considered advantageous but not mandatory. In return, we offer you the opportunity to work on real-world data science projects, mentorship from experienced professionals in the field, a collaborative, innovative, and supportive work environment, and a growth path to advance into a full-time Data Scientist role with us. This is a full-time, permanent position suitable for fresher candidates. The benefits include health insurance, and the work schedule consists of day shifts from Monday to Friday. Fluency in English is preferred, and the work location is in person.,
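A typical exploratory data analysis task mentioned in this posting could look like the following minimal pandas/matplotlib sketch; the CSV file and its columns (including "age") are hypothetical sample data.

```python
# Small EDA sketch: load, clean, summarize, and plot.
# "customers.csv" and its columns are hypothetical example data.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("customers.csv")

# Basic cleaning: drop exact duplicates and fill missing numeric values with medians.
df = df.drop_duplicates()
numeric_cols = df.select_dtypes(include="number").columns
df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].median())

# Quick summaries that usually anchor an EDA write-up.
print(df.describe())
print(df.isna().sum())

# One simple visualization, assuming an "age" column exists in the sample data.
df["age"].hist(bins=30)
plt.title("Age distribution")
plt.xlabel("age")
plt.ylabel("count")
plt.savefig("age_distribution.png")
```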

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Purpose Of The Position: We are looking for a highly experienced Senior Data Scientist with a deep understanding of AI/ML technologies, including Recommendation Systems, Chatbots, Generative AI, and Large Language Models (LLMs). The ideal candidate will have hands-on experience in applying these technologies to solve real-world problems, working with large datasets, and collaborating with cross-functional teams to deliver innovative data-driven solutions. Responsibilities: Design, develop, and optimize recommendation systems to enhance user experience and engagement across platforms. Build and deploy chatbots with advanced NLP capabilities for automating customer interactions and improving business processes. Lead the development of Generative AI solutions, including content generation and automation. Research and apply Large Language Models (LLMs) like GPT, BERT, and others to solve business-specific problems and create innovative solutions. Collaborate with engineering teams to integrate machine learning models into production systems, ensuring scalability and reliability. Perform data exploration, analysis, and feature engineering to improve model performance. Stay updated on the latest advancements in AI and ML technologies, proposing new techniques and tools to enhance our product capabilities. Mentor junior data scientists and engineers, providing guidance on best practices in AI/ML model development and deployment. Collaborate with product managers and business stakeholders to translate business goals into AI-driven solutions. Work on model interpretability and explainability, and ensure models are built in an ethical and responsible manner. Skills And Qualifications: 5+ years of experience in data science or machine learning, with a focus on building and deploying AI models. Strong expertise in designing and developing recommendation systems and working with collaborative filtering, matrix factorization, and content-based filtering techniques. Hands-on experience with chatbots using Natural Language Processing (NLP) and conversational AI frameworks. In-depth understanding of Generative AI, including transformer-based models and GANs (Generative Adversarial Networks). Experience working with Large Language Models (LLMs) such as GPT, BERT, T5, etc. Proficiency in machine learning frameworks such as TensorFlow, PyTorch, and Scikit-learn. Strong programming skills in Python and libraries such as NumPy, Pandas, Hugging Face, and NLTK. Experience with cloud platforms like AWS, GCP, or Azure for deploying and scaling machine learning models. Solid understanding of data pipelines, ETL processes, and working with large datasets using SQL or NoSQL databases. Knowledge of MLOps and experience deploying models in production environments. Strong problem-solving skills and a deep understanding of statistical methods. Preferred Qualifications: Experience with Reinforcement Learning and Recommender Systems personalization techniques. Experience working with AWS Bedrock services. Familiarity with ethical AI and model bias mitigation techniques. Experience with A/B testing, experimentation, and performance tracking for AI models in production. Prior experience mentoring junior data scientists and leading AI/ML projects. Strong communication and collaboration skills, with the ability to convey complex technical concepts to non-technical stakeholders. Why Join Us: Opportunity to be part of a rapidly growing, innovative product-based company. Collaborate with a talented, driven team focused on building high-quality software solutions.
Competitive compensation and benefits package. (ref:hirist.tech)
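As a toy illustration of the matrix-factorization idea behind many recommendation systems named above, the sketch below factors a tiny user-item rating matrix with a truncated SVD and scores an unrated item. Treating zeros as missing ratings is a deliberate simplification; production recommenders handle missing entries and implicit feedback differently.

```python
# Toy sketch of low-rank matrix factorization for recommendations, NumPy only.
import numpy as np

# Rows are users, columns are items; 0 means "not rated yet".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

# Rank-2 approximation via truncated SVD.
U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
k = 2
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Recommend the highest-scoring unrated item for user 0.
user = 0
unrated = np.where(ratings[user] == 0)[0]
best = unrated[np.argmax(approx[user, unrated])]
print("Recommend item", best, "with predicted score", round(approx[user, best], 2))
```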

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As part of ACL Digital, an ALTEN Group Company, you will be contributing to digital product innovation and engineering as a key player. Our focus lies in assisting our clients in the design and development of cutting-edge products that are AI, Cloud, and Mobile ready, along with creating content and commerce-driven platforms. Through a design-led Digital Transformation framework, we facilitate the creation of connected, converged digital experiences tailored for the modern world. By leveraging our expertise in strategic design, engineering, and industry knowledge, you will play a crucial role in helping our clients navigate the digital landscape, thereby accelerating their growth trajectory. Headquartered in Silicon Valley, ACL Digital is a frontrunner in design-led digital experiences, innovation, enterprise modernization, and product engineering services, particularly within the Technology, Media & Telecom sectors. We are proud of our diverse and skilled workforce, which is part of the larger ALTEN Group comprising over 50,000 employees spread across 30+ countries, fostering a multicultural workplace and a collaborative knowledge-sharing environment. In India, our operations span across Bangalore, Chennai, Pune, Panjim, Hyderabad, Noida, and Ahmedabad, while in the USA, we have established offices in California, Atlanta, Philadelphia, and Washington states. As a suitable candidate for this role, you are expected to possess the following technical skills and competencies: - A minimum of 4-5 years of relevant experience in the field - Preferably trained or certified in Data Science/Machine Learning - Capable of effectively collaborating with technical leads - Strong communication skills coupled with the ability to derive meaningful conclusions - Proficiency in Data Science concepts, Machine Learning algorithms & Libraries like Scikit-learn, Numpy, Pandas, Stattools, Tensorflow, PyTorch, XGBoost - Experience in Machine Learning Training and Deployment pipelines - Familiarity with FastAPI/Flask framework - Proficiency in Docker and Virtual Environment - Proficient in Database Operations - Strong analytical and problem-solving skills - Ability to excel in a dynamic environment with varying degrees of ambiguity Your role would involve the following responsibilities and competencies: - Applying data mining, quantitative analysis, statistical techniques, and conducting experiments to derive reliable insights from data - Understanding business use-cases and utilizing various sources to collect and annotate datasets for business problems - Possessing a strong academic background with excellent analytical skills and exposure to machine learning and information retrieval domain and technologies - Strong programming skills with the ability to work in languages such as Python, C/C++ - Acquiring data from primary or secondary sources and performing data tagging - Filtering and cleaning data based on business requirements and maintaining a well-defined, structured, and clean database - Working on data labeling tools and annotating data for machine learning models - Interpreting data, analyzing results using statistical techniques and models, and conducting exploratory analysis If you are someone who thrives in a challenging and dynamic environment and possesses the required technical skills and competencies, we look forward to having you join our team at ACL Digital.,

Posted 2 weeks ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Hyderabad

Work from Office

About ValGenesis ValGenesis is a leading digital validation platform provider for life sciences companies. ValGenesis suite of products are used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ About the Role: We are seeking a highly skilled Senior AI/ML Engineer to join our dynamic team to build the next gen applications for our global customers. If you are a technology enthusiast and highly passionate, we are eager to discuss with you about the potential role. Responsibilities: Implement, and deploy Machine Learning solutions to solve complex problems and deliver real business value, i.e. revenue, engagement, and customer satisfaction. Collaborate with data product managers, software engineers and SMEs to identify AI/ML opportunities for improving process efficiency. Develop production-grade ML models to enhance customer experience, content recommendation, content generation, and predictive analysis. Monitor and improve model performance via data enhancement, feature engineering, experimentation and online/offline evaluation. Stay up to date with the latest in machine learning and artificial intelligence and influence AI/ML for the Life science industry. Responsibilities 4 - 8 years of experience in AI/ML engineering, with a track record of handling increasingly complex projects. Strong programming skills in Python, Rust. Experience with Pandas, NumPy, SciPy, OpenCV (for image processing) Experience with ML frameworks, such as scikit-learn, Tensorflow, PyTorch. Experience with GenAI tools, such as Langchain, LlamaIndex, and open-source Vector DBs. Experience with one or more Graph DBs - Neo4J, ArangoDB Experience with MLOps platforms, such as Kubeflow or MLFlow. Expertise in one or more of the following AI/ML domains: Causal AI, Reinforcement Learning, Generative AI, NLP, Dimension Reduction, Computer Vision, Sequential Models. Expertise in building, deploying, measuring, and maintaining machine learning models to address real-world problems. Thorough understanding of software product development lifecycle, DevOps (build, continuous integration, deployment tools) and best practices. Excellent written and verbal communication skills and interpersonal skills. Advanced degree in Computer Science, Machine Learning or related field. We’re on a Mission In 2005, we disrupted the life sciences industry by introducing the world’s first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations. The Team You’ll Join Our customers’ success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity’s quality of life, and we honor that mission. We work together. 
We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done. We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward. We’re in it to win it. We’re on a path to becoming the number one intelligent validation platform in the market, and we won’t settle for anything less than being a market leader. How We Work Our Chennai, Hyderabad and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration fosters creativity, and a sense of community, and is critical to our future success as a company. ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Python Backend Developer at Coforge, you should have 4-6 years of experience working with Python as a backend technology, particularly in stack development. You must also possess expertise in microservices and APIs. Proficiency in core Python fundamentals, Pandas, and NumPy is essential for this role. Additionally, you will have the opportunity to be trained in React JS. At Coforge, we are currently looking to hire both full-time employees and freelancers.

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

About Calfus: Calfus is a Silicon Valley headquartered software engineering and platforms company with a vision deeply rooted in the Olympic motto "Citius, Altius, Fortius Communiter". At Calfus, we aim to inspire our team to rise faster, higher, and stronger while fostering a collaborative environment to build software at speed and scale. Our primary focus is on creating engineered digital solutions that drive positive impact on business outcomes. Upholding principles of #Equity and #Diversity, we strive to create a diverse ecosystem that extends to the broader society. Join us at #Calfus and embark on an extraordinary journey with us! Position Overview: As a Data Engineer specializing in BI Analytics & DWH, you will be instrumental in crafting and implementing robust business intelligence solutions that empower our organization to make informed, data-driven decisions. Leveraging your expertise in Power BI, Tableau, and ETL processes, you will be responsible for developing scalable architectures and interactive visualizations. This role necessitates a strategic mindset, strong technical acumen, and effective collaboration with stakeholders across all levels. Key Responsibilities: - BI Architecture & DWH Solution Design: Develop and design scalable BI Analytical & DWH Solution aligning with business requirements, utilizing tools like Power BI and Tableau. - Data Integration: Supervise ETL processes through SSIS to ensure efficient data extraction, transformation, and loading into data warehouses. - Data Modelling: Establish and maintain data models that support analytical reporting and data visualization initiatives. - Database Management: Employ SQL for crafting intricate queries, stored procedures, and managing data transformations via joins and cursors. - Visualization Development: Spearhead the design of interactive dashboards and reports in Power BI and Tableau while adhering to best practices in data visualization. - Collaboration: Engage closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs. - Performance Optimization: Analyze and optimize BI solutions for enhanced performance, scalability, and reliability. - Data Governance: Implement data quality and governance best practices to ensure accurate reporting and compliance. - Team Leadership: Mentor and guide junior BI developers and analysts to cultivate a culture of continuous learning and improvement. - Azure Databricks: Utilize Azure Databricks for data processing and analytics to seamlessly integrate with existing BI solutions. Qualifications: - Bachelor's degree in computer science, Information Systems, Data Science, or a related field. - 6-12 years of experience in BI architecture and development, with a strong emphasis on Power BI and Tableau. - Proficiency in ETL processes and tools, particularly SSIS. Strong command over SQL Server, encompassing advanced query writing and database management. - Proficient in exploratory data analysis using Python. - Familiarity with the CRISP-DM model. - Ability to work with various data models and databases like Snowflake, Postgres, Redshift, and MongoDB. - Experience with visualization tools such as Power BI, QuickSight, Plotly, and Dash. - Strong programming foundation in Python for data manipulation, analysis, serialization, database interaction, data pipeline and ETL tools, cloud services, and more. - Familiarity with Azure SDK is a plus. 
- Experience with code quality management, version control, collaboration in data engineering projects, and interaction with REST APIs and web scraping tasks is advantageous. Calfus Inc. is an Equal Opportunity Employer.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

Are you seeking an exciting opportunity to become part of a dynamic and expanding team in a fast-paced and challenging environment? This unique position offers you the chance to collaborate with the Business team to deliver a comprehensive perspective. As a Model Risk Program Analyst within the Model Risk Governance and Review Group (MRGR), your responsibilities include developing model risk policy and control procedures, conducting model validation activities, offering guidance on appropriate model usage in the business context, evaluating ongoing model performance testing, and ensuring that model users understand the strengths and limitations of the models. This role also presents attractive career paths for individuals involved in model development and validation, allowing them to work closely with Model Developers, Model Users, and Risk and Finance professionals. Your key responsibilities will involve engaging in new model validation activities for all Data Science models in the coverage area. This includes evaluating the model's conceptual soundness, assumptions, reliability of inputs, testing completeness, numerical robustness, and performance metrics. You will also be responsible for conducting independent testing and additional model review activities, liaising with various stakeholders to provide oversight and guidance on model usage, controls, and performance assessment. To excel in this role, you should possess strong quantitative and analytical skills, preferably with a degree in a quantitative discipline such as Computer Science, Statistics, Data Science, Math, Economics, or Math Finance. A Master's or PhD degree is desirable. Additionally, you should have a solid understanding of Machine Learning and Data Science theory, techniques, and tools, including Python programming proficiency and experience with machine learning libraries such as NumPy, SciPy, Scikit-learn, TensorFlow, and PyTorch. Prior experience in Data Science, Quantitative Model Development, Model Validation, or Technology focused on Data Science, along with excellent writing and communication skills, will be advantageous. A risk and control mindset, with the ability to ask incisive questions, assess materiality, and escalate issues, is also essential for this role. By staying updated on the latest developments in your coverage area, you will contribute to maintaining the model risk control apparatus of the bank and serve as a key point of contact within the organization. Join our team and be a part of shaping the future of model-related risk management decisions.
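One small ingredient of independent model testing, comparing a candidate model's performance metrics against a naive benchmark on held-out data, can be sketched as follows. The data is synthetic and the choice of models is illustrative, not the MRGR methodology.

```python
# Hedged sketch: benchmark comparison as part of independent model testing.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

candidate = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
benchmark = DummyClassifier(strategy="prior").fit(X_train, y_train)

# A candidate model should clearly beat the uninformative benchmark on held-out data.
auc_candidate = roc_auc_score(y_test, candidate.predict_proba(X_test)[:, 1])
auc_benchmark = roc_auc_score(y_test, benchmark.predict_proba(X_test)[:, 1])
print(f"candidate AUC={auc_candidate:.3f}, benchmark AUC={auc_benchmark:.3f}")
```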

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

Vapi, Gujarat

On-site

You are invited to join our team as an Applied AI Engineer (Fresher), specifically looking for graduates from IITs or NITs. This is a full-time position based in Vapi, Gujarat, offering an exciting opportunity to work within our dynamic AI Engineering team. The ideal candidate will possess a strong foundation in Data Structures & Algorithms, exceptional problem-solving skills, and a genuine enthusiasm for AI, machine learning, and deep learning technologies. As an Applied AI Engineer, you will collaborate closely with senior engineers and data scientists to design and implement AI-driven solutions. Your responsibilities will include developing, optimizing, and deploying machine learning and deep learning models, as well as translating abstract AI challenges into efficient, scalable, and production-ready code. Additionally, you will contribute to data preprocessing, feature engineering, and model evaluation tasks, participate in technical discussions and code reviews, and explore cutting-edge AI frameworks and research trends. Key Skills required for this role include exceptional problem-solving abilities, proficiency in Python and core libraries such as NumPy, Pandas, and scikit-learn, a fundamental understanding of machine learning concepts, exposure to deep learning frameworks like TensorFlow or PyTorch, and a strong grasp of object-oriented programming and software engineering principles. A passion for AI, along with analytical, logical thinking, and mathematical skills, will be essential for success in this position. Candidates with hands-on experience in AI/ML projects, familiarity with Reinforcement Learning, NLP, or Computer Vision, and knowledge of tools like Git, Docker, and cloud platforms (AWS, GCP, Azure) are highly preferred. Educational qualifications include a degree in Computer Science, Artificial Intelligence, Machine Learning, or Data Science from IITs or NITs, with a strong academic record and demonstrated interest in AI/ML concepts. If you meet these requirements and are excited about contributing to the future of intelligent systems, we encourage you to apply by sharing your CV with us at jignesh.pandoriya@merillife.com. Join our team and be part of shaping innovative solutions that directly impact lives.,

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Kolkata, West Bengal

On-site

You must have knowledge in Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL. Working knowledge of Azure DevOps and Git flow would be an added advantage. Alternatively, you should have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift. Demonstrable expertise in working with time-series data is essential. Experience in delivering data engineering/data science projects in Industry 4.0 is an added advantage. Knowledge of Palantir is required. You must possess strong problem-solving skills with a focus on sustainable and reusable development. Proficiency in using statistical computer languages and libraries like Python/PySpark, Pandas, NumPy, and seaborn/matplotlib is necessary. Knowledge of Streamlit.io is a plus. Familiarity with Scala, GoLang, Java, and big data tools such as Hadoop, Spark, and Kafka is beneficial. Experience with relational databases like Microsoft SQL Server, MySQL, PostgreSQL, Oracle, and NoSQL databases including Hadoop, Cassandra, and MongoDB is expected. Proficiency in data pipeline and workflow management tools like Azkaban, Luigi, and Airflow is required. Experience in building and optimizing big data pipelines, architectures, and data sets is crucial. You should possess strong analytical skills related to working with unstructured datasets. Provide innovative solutions to data engineering problems, and document technology choices and integration patterns. Apply best practices for project delivery with clean code. Demonstrate innovation and proactiveness in meeting project requirements. Reporting to: Director - Intelligent Insights and Data Strategy. Travel: Must be willing to be deployed at client locations worldwide for long and short terms, and flexible for shorter durations within India and abroad.
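Because the posting stresses time-series expertise, here is a minimal pandas sketch of routine time-series handling (resampling minute-level data to hourly means and smoothing with a rolling window), using a synthetic sensor series rather than any client data.

```python
# Small sketch of routine time-series handling with pandas: resample and smooth.
# The synthetic "sensor" series stands in for real Industry 4.0 telemetry.
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-01", periods=24 * 60, freq="min")
sensor = pd.Series(20 + np.random.default_rng(0).normal(0, 0.5, len(idx)), index=idx)

hourly_mean = sensor.resample("1h").mean()        # downsample minute data to hours
smoothed = hourly_mean.rolling(window=6).mean()   # 6-hour rolling average

print(hourly_mean.head())
print(smoothed.tail())
```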

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are a highly skilled, detail-oriented, and motivated Python DQ Automation Developer who will be responsible for designing, developing, and maintaining data quality automation solutions using Python. With a deep understanding of data quality principles, proficiency in Python, and experience in data processing and analysis, you will play a crucial role in ensuring accurate and timely data integration and transformation. Your key responsibilities will include designing, developing, and implementing data quality automation processes and solutions to identify, measure, and improve data quality. You will write and optimize Python scripts using libraries such as Pandas, NumPy, and PySpark for data manipulation and processing. Additionally, you will develop and enhance ETL processes, analyze data sets to identify data quality issues, and develop and execute test plans to validate the effectiveness of data quality solutions. As a part of the team, you will maintain comprehensive documentation of data quality processes, procedures, and standards, and collaborate closely with data analysts, data engineers, DQ testers, and other stakeholders to understand data requirements and deliver high-quality data solutions. Required Skills: - Proficiency in Python and related libraries (Pandas, NumPy, PySpark, pyTest). - Experience with data quality tools and frameworks. - Strong understanding of ETL processes and data integration. - Familiarity with data governance and data management principles. - Excellent analytical and problem-solving skills with a keen attention to detail. - Strong verbal and written communication skills to explain technical concepts to non-technical stakeholders. - Ability to work effectively both independently and as part of a team. Qualifications: - Bachelor's degree in computer science or Information Technology. An advanced degree is a plus. - Minimum of 7 years of experience in data quality automation and Python Development. - Proven experience with Python libraries for data processing and analysis. Citi is an equal opportunity and affirmative action employer, encouraging all qualified and interested applicants to apply for career opportunities. If you require a reasonable accommodation due to a disability, please review Accessibility at Citi.,
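Data-quality automation of the kind described here is often expressed as executable checks; below is a hedged sketch of a few pandas-based rules written as pytest tests. The dataset and the rules themselves are invented for illustration.

```python
# Illustrative data-quality checks written as pytest tests over a pandas DataFrame.
# The "orders" data and the rules are hypothetical examples.
import pandas as pd
import pytest


@pytest.fixture
def orders() -> pd.DataFrame:
    return pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [100.0, 250.5, 80.0],
        "currency": ["USD", "USD", "EUR"],
    })


def test_primary_key_is_unique(orders):
    assert orders["order_id"].is_unique


def test_no_missing_amounts(orders):
    assert orders["amount"].notna().all()


def test_amounts_are_positive(orders):
    assert (orders["amount"] > 0).all()


def test_currency_in_reference_list(orders):
    allowed = {"USD", "EUR", "INR"}
    assert set(orders["currency"]).issubset(allowed)
```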

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Computer Vision Intern at our AI research and development team in Hyderabad, you will play a crucial role in building intelligent systems utilizing cutting-edge computer vision techniques. Your primary responsibilities will involve developing and optimizing computer vision models for tasks including object detection, tracking, segmentation, and image classification. You will have the opportunity to work with annotated datasets, preprocess and augment data for vision pipelines, and integrate vision models with Large Language Models (LLMs) for multimodal applications. To excel in this role, you should be currently pursuing or have recently completed a degree in Computer Science, AI, Data Science, or related fields. A strong understanding of deep learning fundamentals and computer vision techniques is essential. Proficiency in Python and experience with libraries such as OpenCV, NumPy, TensorFlow, or PyTorch are required. Moreover, familiarity with LLMs (e.g., GPT, BERT), Retrieval-Augmented Generation (RAG) architecture, agentic frameworks, or Generative AI tools would be advantageous. Your role will also involve contributing to the design and prototyping of RAG-based workflows that combine vision data with large language models. Additionally, you will explore Generative AI approaches for synthetic data generation and enhancement. Your problem-solving skills, curiosity, and eagerness to learn new technologies will be key assets in this position. If you are passionate about leveraging vision and language technologies to create next-gen AI applications and are excited about joining an innovative team, we encourage you to apply for this position. Join us in our mission to explore and implement the latest advancements in AI. Apply now and be part of our awesome squad!,
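A small preprocessing and augmentation step of the kind a computer vision pipeline needs might look like the OpenCV/NumPy sketch below; the image file and the 224x224 target size are arbitrary example choices.

```python
# Minimal preprocessing/augmentation sketch for a vision pipeline with OpenCV and NumPy.
# "sample.jpg" and the 224x224 target size are placeholders.
import cv2
import numpy as np

image = cv2.imread("sample.jpg")             # BGR image loaded from disk
if image is None:
    raise FileNotFoundError("sample.jpg not found")

resized = cv2.resize(image, (224, 224))      # fixed input size for a typical CNN
rgb = cv2.cvtColor(resized, cv2.COLOR_BGR2RGB)
normalized = rgb.astype(np.float32) / 255.0  # scale pixel values to [0, 1]

# A couple of simple augmentations often used to enlarge training data.
flipped = cv2.flip(normalized, 1)            # horizontal flip
brighter = np.clip(normalized * 1.2, 0.0, 1.0)

batch = np.stack([normalized, flipped, brighter])
print("Batch shape for the model:", batch.shape)  # (3, 224, 224, 3)
```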

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What You’ll Be Doing... We are seeking a visionary and technically strong Senior AI Architect to join our Billing IT organization in driving innovation at the intersection of telecom billing, customer experience, and artificial intelligence. This leadership role will be pivotal in designing, developing, and scaling AI-led solutions that redefine how we bill our customers, improve their billing experience, and derive actionable insights from billing data. You will work closely with cross-functional teams to lead initiatives that transform customer-facing systems, backend data platforms, and software development practices through modern AI technologies. Key Responsibilities Customer Experience Innovation: Designing and implementing AI-driven enhancements to improve telecom customer experience, particularly in the billing domain. Leading end-to-end initiatives that personalize, simplify, and demystify billing interactions for customers. AI Tools and Platforms: Evaluating and implementing cutting-edge AI/ML models, LLMs, SLMs, and AI-powered solutions for use across the billing ecosystem. Developing prototypes and production-grade AI tools to solve real-world customer pain points. Prompt Engineering & Applied AI: Exhibiting deep expertise in prompt engineering and advanced LLM usage to build conversational tools, intelligent agents, and self-service experiences for customers and support teams. Partnering with design and development teams to build intuitive AI interfaces and utilities. AI Pair Programming Leadership: Demonstrating hands-on experience with AI-assisted development tools (e.g., GitHub Copilot, Codeium). Driving adoption of such tools across development teams, track measurable productivity improvements, and integrate into SDLC pipelines. Data-Driven Insight Generation: Leading large-scale data analysis initiatives using AI/ML methods to generate meaningful business insights, predict customer behavior, and prevent billing-related issues. Establishing feedback loops between customer behavior and billing system design. Thought Leadership & Strategy: Acting as a thought leader in AI and customer experience within the organization. Staying abreast of trends in AI and telecom customer experience; regularly benchmark internal initiatives with industry best practices. Architectural Excellence: Owning and evolve the technical architecture of AI-driven billing capabilities, ensuring scalability, performance, security, and maintainability. Collaborating with enterprise architects and domain leads to align with broader IT and digital transformation goals. Telecom Billing Domain Expertise: Bring deep understanding of telecom billing functions, processes, and IT architectures, including usage processing, rating, billing cycles, invoice generation, adjustments, and revenue assurance. 
Providing architectural guidance to ensure AI and analytics solutions are well integrated into core billing platforms with minimal operational risk.

Where you'll be working...

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

What We're Looking For...

You're energized by the prospect of putting your advanced expertise to work as one of the most senior members of the team. You're motivated by working on groundbreaking technologies to have an impact on people's lives.

You'll Need To Have

Bachelor's degree or four or more years of work experience.
Six or more years of relevant experience required, demonstrated through one or a combination of work experience.
Strong understanding of AI/ML concepts, including generative AI and LLMs (Large Language Models), with the ability to evaluate and apply them to solve real-world problems in telecom and billing.
Familiarity with industry-leading AI models and platforms (e.g., OpenAI GPT, Google Gemini, Microsoft Phi, Meta LLaMA, AWS Bedrock), and understanding of their comparative strengths, pricing models, and applicability.
Ability to scan and interpret AI industry trends, identify emerging tools, and match them to business use cases (e.g., bill explainability, predictive analytics, anomaly detection, agent assist).
Skilled in adopting and integrating third-party AI tools — rather than building from scratch — into existing IT systems, ensuring fit-for-purpose usage with strong ROI.
Experience working with AI product vendors, evaluating PoCs, and influencing make-buy decisions for AI capabilities.
Comfortable guiding cross-functional teams (tech, product, operations) on where and how to apply AI tools, including identifying appropriate use cases and measuring impact.
Deep expertise in writing effective and optimized prompts across various LLMs. Knowledge of prompt chaining, tool-use prompting, function calling, embedding techniques, and vector search optimization. Ability to mentor others on best practices for LLM prompt engineering and prompt tuning.
In-depth understanding of telecom billing functions: mediation, rating, charging, invoicing, adjustments, discounts, taxes, collections, and dispute management. Strong grasp of billing SLAs, accuracy metrics, and compliance requirements in a telecom environment.
Proven ability to define and evolve cloud-native, microservices-based architectures with AI components. Deep understanding of software engineering practices including modular design, API-first development, testing automation, and observability. Experience in designing scalable, resilient systems for high-volume data pipelines and customer interactions.
Demonstrated hands-on use of tools like GitHub Copilot, Codeium, AWS CodeWhisperer, etc. Strong track record in scaling adoption of AI pair programming tools across engineering teams. Ability to quantify productivity improvements and integrate tooling into CI/CD pipelines.
Skilled in working with large-scale structured and unstructured billing and customer data. Proficiency in tools like SQL, Python (Pandas, NumPy), Spark, and data visualization platforms (e.g., Power BI, Tableau). Experience designing and operationalizing AI/ML models to derive billing insights, detect anomalies, or improve revenue assurance.
Excellent ability to translate complex technical concepts to business stakeholders. Influential leadership with a track record of driving innovation, change management, and cross-functional collaboration.
Ability to coach and mentor engineers, analysts, and product owners on AI technologies and best practices. Keen awareness of emerging AI trends, vendor platforms, open-source initiatives, and market best practices. Active engagement in AI communities, publications, or proof-of-concept experimentation.

Even better if you have one or more of the following:

A master's degree.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above.

Where you'll be working

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity

Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.
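As a rough illustration of the billing-data analysis and anomaly-detection work referenced above, the sketch below flags unusually large invoice amounts with a simple z-score rule in Pandas and NumPy. The data, column names, and threshold are made-up assumptions; a production approach would rely on far richer features and models.

```python
# Minimal sketch: flag anomalous invoice amounts with a z-score rule.
# The DataFrame, column names, and threshold are illustrative assumptions.
import numpy as np
import pandas as pd

bills = pd.DataFrame({
    "account_id": [101, 102, 103, 104, 105],
    "invoice_amount": [58.0, 61.5, 59.9, 412.0, 60.7],
})

mean = bills["invoice_amount"].mean()
std = bills["invoice_amount"].std()
bills["z_score"] = (bills["invoice_amount"] - mean) / std

# Anything more than 1.5 standard deviations from the mean is flagged here.
anomalies = bills[np.abs(bills["z_score"]) > 1.5]
print(anomalies[["account_id", "invoice_amount", "z_score"]])
```

On this toy data, only the 412.0 invoice is flagged; an insight of that kind would then feed the bill-explainability and customer-experience work the posting describes.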

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About This Role

Wells Fargo is seeking a Lead Software Engineer.

In This Role, You Will

Lead complex technology initiatives, including those that are companywide with broad impact.
Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions across technology engineering disciplines.
Design, code, test, debug, and document for projects and programs.
Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, the enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors.
Make decisions in developing standard and companywide best practices for engineering and technology solutions, requiring an understanding of industry best practices and new technologies, influencing and leading the technology team to meet deliverables and drive new initiatives.
Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals.
Lead projects and teams, or serve as a peer mentor.

Required Qualifications:

5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:

5+ years of strong engineering experience.
Expertise in Python, PySpark, and object-oriented programming.
Strong experience in designing and building REST APIs using Python (Django REST Framework is an added advantage).
Excellent experience with Airflow DAGs and workflow orchestration.
Experience with multiprocessing, multithreading, asynchronous modules, and building asynchronous APIs.
Good experience with Kubernetes to build on the OpenShift Container Platform.
Excellent troubleshooting and debugging skills.
Collaborate closely with business teams to understand business problems and provide technical solutions.
Design and build reusable solutions in the data work stream of the model development life cycle.
Strong in data science Python libraries such as Pandas, NumPy, PyArrow, and SciPy.
Solid understanding of distributed computing and GCP.
Good skills in PySpark, object storage, and distributed computing.
Good banking domain skills and in-depth knowledge of the risk domain.
Good understanding of data models and the relationships among the data.
Strong analytical and problem-solving skills.
Experience in source control using GitHub.
Excellent verbal, written, and interpersonal communication skills.
Should possess excellent communication skills, be able to multitask, and have a great team spirit.
Bachelor's degree in engineering (CS/Math/Electronics/Electricals) from top universities such as IITs, NITs, and other reputed institutes.
Should be able to take ownership and accountability.
Should have good leadership skills to collaborate and add value to enterprise solutions.

Job Expectations:

Agile/Scrum software development methodologies and processes.
Understanding of automated Python code deployment processes (GitHub Actions, Harness).
Good end-to-end understanding of platform build and maintenance.
Understanding of Python-based IDEs such as PyCharm, Visual Studio Code, and Jupyter Notebook.
Good understanding of Kafka and RabbitMQ.
Understanding of GenAI use cases, with the ability to identify and automate them in the project.
Understanding of risk models and the ability to rewrite them effectively using Python.
Lead discussions with business, architecture teams, and other partners to understand business problems and provide technical solutions.
Quickly assess different tools and present the right tools for the business use cases, with pros and cons.
Design and develop technical solutions using Python, PySpark, OpenShift Container Platform (OCP), GCP, CI/CD pipelines, GitHub, etc.
Design and maintain operational databases for application metadata, Airflow metadata, and Iceberg catalog metadata.
Own and lead complex and critical modules in data exchange (ingress and egress).
Strong experience using the application development ecosystem (JIRA, ALM, GitHub, uDeploy (Urban Code Deploy), Jenkins, Artifactory, SVN, etc.).

Posting End Date: 27 Jul 2025

Job posting may come down early due to volume of applicants.

We Value Equal Opportunity

Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities

To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy

Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment And Hiring Requirements

Third-Party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-473957
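For context on the Airflow DAG and workflow-orchestration experience listed above, a minimal sketch of a two-task DAG is shown below. The DAG id, schedule, and task bodies are hypothetical and target the Airflow 2.x API; they are not taken from the posting.

```python
# Minimal sketch of an Airflow DAG with two dependent Python tasks.
# DAG id, schedule, and task logic are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling source data")


def load():
    print("writing curated data")


with DAG(
    dag_id="example_ingest",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load
```

In a real ingress/egress pipeline of the kind described, the placeholder callables would be replaced by PySpark or database tasks, with the same dependency-declaration pattern.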

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You'll Be Doing...

We are seeking a visionary and technically strong Senior AI Architect to join our Billing IT organization in driving innovation at the intersection of telecom billing, customer experience, and artificial intelligence. This leadership role will be pivotal in designing, developing, and scaling AI-led solutions that redefine how we bill our customers, improve their billing experience, and derive actionable insights from billing data. You will work closely with cross-functional teams to lead initiatives that transform customer-facing systems, backend data platforms, and software development practices through modern AI technologies.

Key Responsibilities

Customer Experience Innovation: Designing and implementing AI-driven enhancements to improve the telecom customer experience, particularly in the billing domain. Leading end-to-end initiatives that personalize, simplify, and demystify billing interactions for customers.

AI Tools and Platforms: Evaluating and implementing cutting-edge AI/ML models, LLMs, SLMs, and AI-powered solutions for use across the billing ecosystem. Developing prototypes and production-grade AI tools to solve real-world customer pain points.

Prompt Engineering & Applied AI: Exhibiting deep expertise in prompt engineering and advanced LLM usage to build conversational tools, intelligent agents, and self-service experiences for customers and support teams. Partnering with design and development teams to build intuitive AI interfaces and utilities.

AI Pair Programming Leadership: Demonstrating hands-on experience with AI-assisted development tools (e.g., GitHub Copilot, Codeium). Driving adoption of such tools across development teams, tracking measurable productivity improvements, and integrating them into SDLC pipelines.

Data-Driven Insight Generation: Leading large-scale data analysis initiatives using AI/ML methods to generate meaningful business insights, predict customer behavior, and prevent billing-related issues. Establishing feedback loops between customer behavior and billing system design.

Thought Leadership & Strategy: Acting as a thought leader in AI and customer experience within the organization. Staying abreast of trends in AI and telecom customer experience; regularly benchmarking internal initiatives against industry best practices.

Architectural Excellence: Owning and evolving the technical architecture of AI-driven billing capabilities, ensuring scalability, performance, security, and maintainability. Collaborating with enterprise architects and domain leads to align with broader IT and digital transformation goals.

Telecom Billing Domain Expertise: Bringing a deep understanding of telecom billing functions, processes, and IT architectures, including usage processing, rating, billing cycles, invoice generation, adjustments, and revenue assurance.
Providing architectural guidance to ensure AI and analytics solutions are well integrated into core billing platforms with minimal operational risk.

Where you'll be working...

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

What We're Looking For...

You're energized by the prospect of putting your advanced expertise to work as one of the most senior members of the team. You're motivated by working on groundbreaking technologies to have an impact on people's lives.

You'll Need To Have

Bachelor's degree or four or more years of work experience.
Six or more years of relevant experience required, demonstrated through one or a combination of work experience.
Strong understanding of AI/ML concepts, including generative AI and LLMs (Large Language Models), with the ability to evaluate and apply them to solve real-world problems in telecom and billing.
Familiarity with industry-leading AI models and platforms (e.g., OpenAI GPT, Google Gemini, Microsoft Phi, Meta LLaMA, AWS Bedrock), and understanding of their comparative strengths, pricing models, and applicability.
Ability to scan and interpret AI industry trends, identify emerging tools, and match them to business use cases (e.g., bill explainability, predictive analytics, anomaly detection, agent assist).
Skilled in adopting and integrating third-party AI tools — rather than building from scratch — into existing IT systems, ensuring fit-for-purpose usage with strong ROI.
Experience working with AI product vendors, evaluating PoCs, and influencing make-buy decisions for AI capabilities.
Comfortable guiding cross-functional teams (tech, product, operations) on where and how to apply AI tools, including identifying appropriate use cases and measuring impact.
Deep expertise in writing effective and optimized prompts across various LLMs. Knowledge of prompt chaining, tool-use prompting, function calling, embedding techniques, and vector search optimization. Ability to mentor others on best practices for LLM prompt engineering and prompt tuning.
In-depth understanding of telecom billing functions: mediation, rating, charging, invoicing, adjustments, discounts, taxes, collections, and dispute management. Strong grasp of billing SLAs, accuracy metrics, and compliance requirements in a telecom environment.
Proven ability to define and evolve cloud-native, microservices-based architectures with AI components. Deep understanding of software engineering practices including modular design, API-first development, testing automation, and observability. Experience in designing scalable, resilient systems for high-volume data pipelines and customer interactions.
Demonstrated hands-on use of tools like GitHub Copilot, Codeium, AWS CodeWhisperer, etc. Strong track record in scaling adoption of AI pair programming tools across engineering teams. Ability to quantify productivity improvements and integrate tooling into CI/CD pipelines.
Skilled in working with large-scale structured and unstructured billing and customer data. Proficiency in tools like SQL, Python (Pandas, NumPy), Spark, and data visualization platforms (e.g., Power BI, Tableau). Experience designing and operationalizing AI/ML models to derive billing insights, detect anomalies, or improve revenue assurance.
Excellent ability to translate complex technical concepts to business stakeholders. Influential leadership with a track record of driving innovation, change management, and cross-functional collaboration.
Ability to coach and mentor engineers, analysts, and product owners on AI technologies and best practices. Keen awareness of emerging AI trends, vendor platforms, open-source initiatives, and market best practices. Active engagement in AI communities, publications, or proof-of-concept experimentation.

Even better if you have one or more of the following:

A master's degree.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above.

Where you'll be working

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity

Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.
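To make the embedding and vector-search skills mentioned above more concrete, the sketch below retrieves the most relevant billing FAQ snippet for a customer question using the sentence-transformers package. The model name, FAQ text, and question are illustrative assumptions; in a RAG-style setup the retrieved snippet would then be passed to an LLM prompt.

```python
# Minimal sketch: retrieve the most relevant billing FAQ entry for a customer
# question via embeddings and cosine similarity. Texts are made-up examples.
import numpy as np
from sentence_transformers import SentenceTransformer

faq = [
    "Prorated charges appear when you change plans mid-cycle.",
    "Late fees are applied when payment is received after the due date.",
    "International roaming is billed per day of usage.",
]
question = "Why is there an extra charge after I switched plans?"

model = SentenceTransformer("all-MiniLM-L6-v2")
faq_vecs = model.encode(faq, normalize_embeddings=True)
q_vec = model.encode([question], normalize_embeddings=True)[0]

# With normalized embeddings, the dot product equals cosine similarity.
scores = faq_vecs @ q_vec
print("Best match:", faq[int(np.argmax(scores))])
```

A production bill-explainability assistant would replace the in-memory list with a vector database and wrap the retrieved snippet in a carefully engineered prompt, as the posting suggests.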

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About ValGenesis

ValGenesis is a leading digital validation platform provider for life sciences companies. The ValGenesis suite of products is used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance, and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ

About The Role

We are seeking a highly skilled AI/ML Engineer to join our dynamic team to build the next generation of applications for our global customers. If you are a technology enthusiast and highly passionate, we are eager to discuss the potential role with you.

Responsibilities

Implement and deploy machine learning solutions to solve complex problems and deliver real business value, i.e., revenue, engagement, and customer satisfaction.
Collaborate with data product managers, software engineers, and SMEs to identify AI/ML opportunities for improving process efficiency.
Develop production-grade ML models to enhance customer experience, content recommendation, content generation, and predictive analysis.
Monitor and improve model performance via data enhancement, feature engineering, experimentation, and online/offline evaluation.
Stay up-to-date with the latest in machine learning and artificial intelligence, and influence AI/ML for the life science industry.

Requirements

2 - 4 years of experience in AI/ML engineering, with a track record of handling increasingly complex projects.
Strong programming skills in Python and Rust.
Experience with Pandas, NumPy, SciPy, and OpenCV (for image processing).
Experience with ML frameworks such as scikit-learn, TensorFlow, and PyTorch.
Experience with GenAI tools such as LangChain, LlamaIndex, and open-source vector DBs.
Experience with one or more graph DBs (Neo4j, ArangoDB).
Experience with MLOps platforms such as Kubeflow or MLflow.
Expertise in one or more of the following AI/ML domains: Causal AI, Reinforcement Learning, Generative AI, NLP, Dimension Reduction, Computer Vision, Sequential Models.
Expertise in building, deploying, measuring, and maintaining machine learning models to address real-world problems.
Thorough understanding of the software product development lifecycle, DevOps (build, continuous integration, deployment tools), and best practices.
Excellent written and verbal communication skills and interpersonal skills.
Advanced degree in Computer Science, Machine Learning, or a related field.

We're on a Mission

In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join

Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity's quality of life, and we honor that mission.
We work together. We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done.

We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward.

We're in it to win it. We're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.

How We Work

Our Chennai, Hyderabad, and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company.

ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.
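As a small, self-contained example of the scikit-learn experience the role calls for, the sketch below trains and evaluates a simple classification pipeline on a bundled toy dataset. The dataset choice and model settings are illustrative assumptions, not part of the job description.

```python
# Minimal sketch: train and evaluate a scikit-learn pipeline on a toy dataset.
# Dataset choice and model settings are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Scale features, then fit a logistic-regression classifier.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print("Test accuracy:", round(clf.score(X_test, y_test), 3))
```

Production-grade models of the kind described would extend this pattern with feature engineering, experiment tracking (e.g., MLflow), and online/offline evaluation.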

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies