5.0 - 9.0 years
0 Lacs
karnataka
On-site
Role Overview:
The Data & Analytics team at the leading financial institution in MENA is responsible for integrating new data sources, creating data models, developing data dictionaries, and building machine learning models for Wholesale Bank. The primary objective is to design and deliver data products that help squads at Wholesale Bank achieve business outcomes and generate valuable business insights. Within this job family there are two roles, Data Analysts and Data Scientists. Both roles involve working with data, writing queries, collaborating with engineering teams to source relevant data, performing data munging, and extracting meaningful insights. Data Analysts typically work with structured SQL databases or other BI tools and packages, while Data Scientists are expected to develop statistical models and be hands-on with machine learning and advanced programming, including Generative AI.

Key Responsibilities:
- Extract and analyze data from company databases to optimize and enhance product development and marketing strategies.
- Analyze large datasets to uncover trends, patterns, and insights influencing business decisions.
- Utilize predictive and AI/ML modeling techniques to enhance customer experience, boost revenue generation, improve ad targeting, and more.
- Design, implement, and optimize machine learning models for applications such as predictive analytics, natural language processing, and recommendation systems.
- Stay updated with the latest advancements in data science, machine learning, and artificial intelligence to bring innovative solutions to the team.
- Communicate complex findings and model results effectively to both technical and non-technical stakeholders.
- Implement advanced data augmentation, feature extraction, and data transformation techniques to optimize the training process.
- Deploy generative AI models into production environments, ensuring scalability, efficiency, and reliability for real-time applications.
- Use cloud platforms and containerization tools for model deployment and scaling.
- Create interactive data applications using Streamlit for various stakeholders (see the sketch at the end of this posting).
- Conduct prompt engineering to optimize AI model performance and accuracy.
- Continuously monitor, evaluate, and refine models to ensure performance and accuracy.
- Conduct in-depth research on the latest advancements in generative AI techniques and apply them to real-world business problems.

Qualifications Required:
- Bachelor's, Master's, or Ph.D. in Engineering, Data Science, Mathematics, Statistics, or a related field.
- 5+ years of experience in advanced analytics, machine learning, and deep learning.
- Proficiency in programming languages such as Python, and familiarity with machine learning libraries.
- Experience with generative models such as GANs, VAEs, and transformer-based models.
- Strong experience with data wrangling, cleaning, and transforming raw data.
- Hands-on experience in developing, training, and deploying machine learning models.
- Experience with cloud platforms for model deployment and scalability.
- Proficiency in data processing and manipulation techniques.
- Hands-on experience in building data applications using Streamlit or similar tools.
- Advanced knowledge of prompt engineering, chain-of-thought processes, and AI agents.
- Strong problem-solving skills and effective communication abilities to convey technical concepts.

(Note: Any additional details of the company were not provided in the job description.)
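For the Streamlit responsibility above, a minimal sketch of an interactive data app, assuming only pandas and Streamlit; the file layout and column handling are hypothetical placeholders, not part of the posting:

```python
# Minimal Streamlit data app: upload a CSV extract, preview it, chart a metric.
# The dataset and column choices are hypothetical placeholders.
import pandas as pd
import streamlit as st

st.title("Data Product Explorer")

uploaded = st.file_uploader("Upload a CSV extract", type="csv")
if uploaded is not None:
    df = pd.read_csv(uploaded)
    st.dataframe(df.head(20))  # quick tabular preview for stakeholders
    numeric_cols = df.select_dtypes("number").columns.tolist()
    if numeric_cols:
        metric = st.selectbox("Metric to plot", numeric_cols)
        st.line_chart(df[metric])  # simple trend view of the chosen column
```

Saved as `app.py`, this runs with `streamlit run app.py` and serves a local web UI.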
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
cuttack
On-site
Role Overview:
As a member of our team, you will be instrumental in the design and development of Product 2: the Energy Management System (EMS). You will play a key role in building the system from scratch, focusing on areas such as writing EMS logic, integrating with different systems, building telemetry pipelines, and ensuring the platform's reliability and scalability.

Key Responsibilities:
- Write EMS logic for charging, discharging, and scheduling
- Connect to inverters and battery management systems using Modbus and CAN bus (see the sketch at the end of this posting)
- Build telemetry pipelines and reporting tools
- Ensure the platform's reliability and scalability

Qualifications Required:
- Proficiency in Python (pandas, asyncio, REST APIs, Docker)
- Experience with databases such as Postgres or TimescaleDB
- Knowledge of Modbus / CAN bus integration
- Understanding of battery safety basics (SoC, voltage, current limits)
- Familiarity with dashboards (Streamlit or React)
- Exposure to DevOps practices (Docker, CI/CD, monitoring)
- Understanding of SCADA / IEC protocols
- Experience with forecasting and optimization models

Additional Company Details:
The work environment is fast-paced and demanding, requiring a high level of urgency. Daily progress and review meetings are standard, encouraging intense discussions and pushing each other to maintain high standards. The company believes in hiring smart individuals to contribute their ideas and insights, fostering a culture of collaboration and innovation. The team aims to challenge each other, strive for excellence, and ultimately make a significant impact on how India manages energy. Compensation for this role will include equity ownership, providing an opportunity for you to be a part of the company's success.
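For the Modbus integration named above, a minimal sketch assuming pymodbus 3.x over Modbus TCP; the register addresses, scaling factor, and unit ID are invented for illustration and would come from the actual inverter/BMS register map:

```python
# Sketch: poll state of charge and pack voltage from a BMS over Modbus TCP.
# Register addresses, scaling, and the slave/unit ID below are hypothetical.
from pymodbus.client import ModbusTcpClient

SOC_REGISTER = 100      # hypothetical holding register: state of charge (%)
VOLTAGE_REGISTER = 101  # hypothetical holding register: pack voltage (0.1 V)

def read_bms_telemetry(host: str) -> dict:
    client = ModbusTcpClient(host)
    client.connect()
    try:
        rr = client.read_holding_registers(SOC_REGISTER, count=2, slave=1)
        if rr.isError():
            raise IOError(f"Modbus read failed: {rr}")
        return {
            "soc_pct": rr.registers[0],
            "voltage_v": rr.registers[1] / 10.0,  # apply datasheet scaling
        }
    finally:
        client.close()

print(read_bms_telemetry("192.168.1.50"))
```

A real EMS would run a poll like this on a schedule (e.g., with asyncio) and push readings into the telemetry pipeline.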
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
delhi
On-site
Role Overview:
You will be a part of the People & ORG CoE team at the Bain Capability Network (BCN), working on building and deploying analytical solutions related to Operating Model and Organization Practice. Your main responsibility will involve developing, maintaining, and evolving advanced internal tools with a focus on Python. Additionally, you will contribute to case delivery by generating meaningful insights for Bain clients using these tools. You will also play a role in creating efficiency gains and driving innovation within the team.

Key Responsibilities:
- Become an expert in developing and maintaining advanced internal tools with a focus on Python
- Handle end-to-end tool processes, including troubleshooting errors and developing Python scripts
- Assist in case delivery and provide valuable insights for Bain clients
- Potentially build and maintain internal web applications using front-end technologies and frameworks like Streamlit
- Work under the guidance of a Team Manager / Sr. Team Manager to drive innovation and identify areas for automation
- Lead internal team calls; communicate data, insights, and actionable next steps on tool development
- Stay updated on new statistical, database, machine learning, and advanced analytics techniques

Qualifications Required:
- Graduation/Post-graduation from a top-tier college with a strong academic background
- Minimum of 4 years of relevant experience in Python; experience with GenAI, LLMs, or Machine Learning preferred
- Advanced understanding of database design and Azure/AWS servers preferred
- Experience in SQL, Git, statistical and machine learning models, and front-end technologies like HTML, CSS, JavaScript
- Familiarity with low-code development tools such as Streamlit, Mendix, Power Apps
- Experience with data science/analytics tools like Alteryx, Informatica will be a plus
- Strong analytical skills with the ability to generate realistic answers and recommend actionable solutions
- Excellent oral and written communication skills to interact with technical and non-technical stakeholders
- Ability to manage multiple projects, prioritize tasks, and drive projects to completion under tight deadlines

(Note: Additional details about the company were not included in the provided job description.)
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a Data Scientist at C5i, you will be part of a team that caters to some of the world's largest enterprises, including Fortune 500 companies across sectors such as Technology, Media, Telecom, Pharma & Life Sciences, CPG, Retail, and Banking. C5i has garnered recognition from industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms.

**Key Responsibilities:**
- Possess 4+ years of experience in machine learning, deep learning, or AI research, focusing on generative models.
- Demonstrated expertise in generative models including GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and transformer-based models like GPT-3/4, BERT, and DALL-E.
- Understanding of model fine-tuning, transfer learning, and prompt engineering within the realm of large language models (LLMs).
- Knowledge of reinforcement learning (RL) and other advanced machine learning techniques applicable to generative tasks.
- Proficient in Python programming and well-versed in relevant libraries and frameworks.
- Proven hands-on experience in document detail extraction and feature engineering (see the sketch at the end of this posting).
- Proficiency in data processing and manipulation techniques.
- Hands-on experience in developing data applications using Streamlit or similar tools.
- Advanced knowledge in prompt engineering, chain-of-thought processes, and AI agents.
- Strong problem-solving skills and the ability to collaborate effectively in a team environment.
- Excellent communication skills to articulate complex technical concepts to non-technical stakeholders.

**Qualifications Required:**
- 4+ years of experience in machine learning, deep learning, or AI research, with a specific focus on generative models.
- Proficiency in generative models such as GANs, VAEs, and transformer-based models.
- Knowledge of reinforcement learning and advanced machine learning techniques for generative tasks.
- Strong programming skills in Python and familiarity with relevant libraries and frameworks.
- Proven experience in document detail extraction and feature engineering.
- Hands-on experience in building data applications using Streamlit or similar tools.
- Advanced knowledge in prompt engineering, chain-of-thought processes, and AI agents.
- Strong communication skills to effectively convey technical concepts to non-technical stakeholders.
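For the document detail extraction item flagged above, a minimal sketch using the OpenAI Python client; the model name, field list, and prompt are illustrative assumptions, not C5i's actual stack:

```python
# Sketch: LLM-based document detail extraction -- pull structured fields
# out of raw document text. Model name and field list are hypothetical.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_invoice_fields(document_text: str) -> dict:
    prompt = (
        "Extract invoice_number, invoice_date, and total_amount from the "
        "document below. Reply with a single JSON object and nothing else.\n\n"
        + document_text
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model choice
        messages=[{"role": "user", "content": prompt}],
        temperature=0,         # deterministic output suits extraction tasks
    )
    return json.loads(resp.choices[0].message.content)

print(extract_invoice_fields("Invoice No. INV-1042 dated 2024-03-01, total $1,250"))
```

In practice the JSON parse is guarded (models sometimes wrap output in markdown), and the extracted fields feed the feature-engineering step.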
Posted 5 days ago
5.0 - 10.0 years
8 - 18 Lacs
vadodara
Work from Office
Skills: React.js, HTML5, CSS3, JavaScript, TypeScript, Angular, Python, Qt, Flask, PostgreSQL, MySQL, MongoDB, Git, PyQt, Unit Testing Frameworks
Minimum Qualification: Bachelor's
Working Language: English

Job Description
Key responsibilities:
- Engage in all phases of agile software development, including design, implementation, and deployment.
- Self-motivated and results-driven, with the ability to work independently with minimal supervision.
- Detail-oriented and proactive in identifying and resolving issues.
- Design, develop, and maintain scalable web applications using React and Python.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Write clean, efficient, and reusable code adhering to industry best practices.
- Build and integrate RESTful APIs to ensure seamless communication between frontend and backend systems (see the sketch at the end of this posting).

Skills required for the same:
- Proficiency in Python, with hands-on experience in frameworks like Qt, Streamlit, and Flask.
- Strong understanding of JavaScript (ES6+), TypeScript, React.js, and component-based architecture.
- Solid grasp of RESTful API design, authentication mechanisms (OAuth, JWT), and API integration.
- Knowledge of frontend performance optimization and responsive design principles.
- Understanding of MVC/MVT/MVVM architecture, ORMs, and database schema design.
- Familiarity with unit testing, debugging, and error handling in both frontend and backend.
- Awareness of security best practices in web development.

Frontend: React.js, HTML5, CSS3, JavaScript, TypeScript, Angular (good to have)
Backend: Python, Qt, Flask, Streamlit
Databases: PostgreSQL, MySQL, MongoDB
Version Control: Git
Testing: PyTest, React Testing Library
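For the RESTful API responsibility above, a minimal Flask sketch of the kind of backend endpoint a React front end would call; the route names and in-memory store are placeholders:

```python
# Sketch: minimal Flask REST endpoints for a React front end.
# The in-memory list stands in for a real PostgreSQL/MySQL table.
from flask import Flask, jsonify, request

app = Flask(__name__)
ITEMS = [{"id": 1, "name": "sample"}]

@app.get("/api/items")
def list_items():
    return jsonify(ITEMS)

@app.post("/api/items")
def create_item():
    payload = request.get_json()
    item = {"id": len(ITEMS) + 1, "name": payload["name"]}
    ITEMS.append(item)
    return jsonify(item), 201  # echo the created resource

if __name__ == "__main__":
    app.run(debug=True)
```

The `@app.get`/`@app.post` shorthand requires Flask 2.0 or later.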
Posted 5 days ago
2.0 - 4.0 years
6 - 10 Lacs
navi mumbai
Work from Office
About the Role:
We are looking for a motivated Full-Stack Python Developer with 2-4 years of experience to join our team. The candidate will be responsible for building seamless integrations with external APIs and delivering clean, user-friendly web applications using FastAPI (or similar frameworks). You will also have opportunities to work on deployment pipelines and cloud infrastructure, contributing to the full lifecycle of product development.

Key Responsibilities:
- Design, develop, and maintain full-stack applications with a focus on Python (FastAPI).
- Build robust integrations with third-party APIs and ensure data is handled securely and efficiently (see the sketch at the end of this posting).
- Develop intuitive, responsive web applications using Streamlit.
- Collaborate with product and design teams to understand requirements and translate them into technical solutions.
- Write clean, maintainable, and testable code following best practices.
- Participate in code reviews and contribute to team knowledge sharing.
- (Preferred) Support CI/CD pipeline development and ensure smooth deployments.
- (Preferred) Work with AWS cloud services to deploy and maintain applications in production.

Required Skills & Experience:
- 2-4 years of hands-on experience in Python development.
- Strong understanding of RESTful APIs and API integrations.
- Proficiency with FastAPI (or similar Python web frameworks).
- Experience building Streamlit-based applications for data visualization and user interaction.
- Experience working with relational databases (e.g., PostgreSQL, MySQL) and ORMs (e.g., SQLAlchemy).
- Familiarity with version control systems (Git/GitHub).

Preferred Skills:
- Experience setting up and maintaining CI/CD pipelines (preferably GitHub Actions).
- Exposure to AWS services (EC2, S3, RDS, Lambda, etc.) for deployment and scaling.
- Basic knowledge of containerization (Docker) and orchestration (Kubernetes, ECS).
- Experience with automated testing frameworks (pytest or similar).
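For the third-party API integration responsibility, a minimal FastAPI sketch that proxies an external REST API; the upstream URL is a placeholder, and a production version would add authentication, retries, and input validation:

```python
# Sketch: FastAPI service wrapping a hypothetical third-party REST API.
import httpx
from fastapi import FastAPI, HTTPException

app = FastAPI()
UPSTREAM = "https://api.example.com/v1/quotes"  # placeholder external API

@app.get("/quotes/{symbol}")
async def get_quote(symbol: str):
    async with httpx.AsyncClient(timeout=10) as client:
        resp = await client.get(f"{UPSTREAM}/{symbol}")
    if resp.status_code != 200:
        raise HTTPException(status_code=502, detail="upstream API error")
    return resp.json()  # FastAPI serializes the dict to JSON
```

Run locally with `uvicorn main:app --reload`.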
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
delhi
On-site
As a Senior Scientist specializing in Process Modeling and Hybrid Modeling within the Manufacturing Intelligence (MI) team of Pfizer's Global Technology & Engineering (GT&E), your role will involve the development and implementation of advanced analytics solutions such as AI/ML, soft sensors, advanced process control, and process condition monitoring in support of manufacturing operations at Pfizer Global Supply (PGS).

**Responsibilities:**
- Make technical contributions to high-impact projects requiring expertise in data analytics, advanced modeling, and optimization.
- Identify valuable opportunities for applying Advanced Analytics, Advanced Process Control (APC), Artificial Intelligence (AI), Machine Learning (ML), and Industrial Internet of Things (IIoT) in manufacturing settings.
- Develop mathematical and machine learning models and assist in the GMP implementation of analytics solutions (see the sketch at the end of this posting).
- Utilize engineering principles and modeling tools to improve process understanding and enable real-time process monitoring and control.
- Collaborate with cross-functional teams, communicate progress effectively to management, and drive project advancement.

**Basic Qualifications:**
- Preferably a B.Tech in Computer Science.
- Must possess expert-level knowledge in Python; familiarity with R, Matlab, or JavaScript is a plus.
- Proficiency in performing data engineering on large, real-world datasets.
- Demonstrated experience in applying data science and machine learning methodologies to drive insights and decision-making.
- Ability to work collaboratively in diverse teams and communicate effectively with technical and non-technical stakeholders.
- Knowledge of biopharmaceutical manufacturing processes and experience in technical storytelling.

**Preferred Qualifications:**
- Expertise in first principles such as thermodynamics, reaction modeling, and hybrid modeling.
- Experience with Interpretable Machine Learning or Explainable AI and familiarity with Shapley values.
- Proficiency in cloud-based development environments like AWS SageMaker and data warehouses like Snowflake or Redshift.
- Understanding of feedback control algorithms, real-time communication protocols, and industrial automation platforms.
- Experience with data visualization tools like Streamlit, Plotly, or Spotfire.
- Knowledge of Cell Culture, Fermentation, and Vaccines Conjugation.

This role offers a flexible work location assignment, including the option to work remotely. Pfizer is an equal opportunity employer, adhering to all applicable equal employment opportunity legislation across jurisdictions.
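For the modeling responsibility flagged above, a minimal soft-sensor sketch: a regression model that estimates a hard-to-measure quality variable from routine process measurements. The process variables and data are synthetic placeholders, not Pfizer process data:

```python
# Sketch: a "soft sensor" -- regression from easy-to-measure process
# variables to a hard-to-measure quality variable. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # e.g. temperature, pH, feed rate (placeholders)
y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=500)  # e.g. titer

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))
```

Once validated, a model like this can run against live historian data for real-time monitoring.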
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As an Associate Technical Consultant at our company, you will play a crucial role in designing and implementing automation solutions to enhance operational workflows. Your primary responsibility will be to collaborate with various teams to identify automation opportunities and develop scalable solutions using modern technologies. Here's a breakdown of what you can expect in this role:

Role Overview:
You will be tasked with designing and implementing automation solutions that streamline operational workflows. Collaboration with cross-functional teams to understand business requirements, propose technical strategies, and deliver impactful solutions will be key to your success.

Key Responsibilities:
- Collaborate with operations teams to identify automation opportunities and gather requirements.
- Design, develop, and deploy automation solutions using Python, FastAPI, and Streamlit.
- Create user-friendly front-end interfaces using JavaScript, and optionally ReactJS or Mantine.
- Integrate solutions with existing systems and ensure smooth data flow.
- Maintain technical documentation, such as architecture diagrams and user guides.
- Provide technical consultation and support during solution rollout and post-deployment.
- Adhere to best practices in version control (Git) and CI/CD pipelines.
- Explore and incorporate GenAI or Power Platform tools like PowerApps where applicable.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6-7 years of hands-on experience in software development and automation.
- Proficiency in Python, FastAPI, Streamlit, and JavaScript.
- Strong understanding of RESTful APIs and microservices architecture.
- Experience in collaborating with operations or business teams to deliver tailored solutions.
- Familiarity with Git for version control and collaborative development.

Preferred/Optional Skills:
- Experience with ReactJS, Mantine, or other modern UI frameworks.
- Exposure to Generative AI tools and APIs.
- Knowledge of PowerApps or other Microsoft Power Platform tools.
- Understanding of DevOps practices and cloud platforms like Azure, AWS.

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and stakeholder management.
- Ability to work independently and in a team-oriented environment.
- Adaptability to fast-paced and evolving project requirements.

Location: Coimbatore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 6 days ago
3.0 - 5.0 years
0 Lacs
ahmedabad, gujarat, india
On-site
Position: AI Tools & Automation Developer - On-Site (Ahmedabad)
Company: Moksh Infotech Private Limited
Location: Ahmedabad (Work from Office)

Company Overview:
Moksh Infotech is a rapidly growing IT services company based in Ahmedabad, specializing in digital transformation, AI-driven automation, enterprise integrations, and software development solutions. We empower businesses with modern AI technologies, automated workflows, and custom IT strategies to achieve high efficiency and innovation.

Role Summary:
We are looking for a highly skilled AI Tools & Automation Developer to design, build, and maintain AI-powered tools and automation workflows. The ideal candidate will work closely with cross-functional teams to identify automation opportunities and deploy solutions that enhance productivity, data handling, and system integrations.

Key Responsibilities:
- Design, develop, and deploy AI-powered tools using platforms like OpenAI, Hugging Face, and LangChain to solve real-world problems such as content summarization, intelligent search, document analysis, and workflow optimization.
- Automate routine business operations by building workflows and bots using Python scripting, Selenium automation, and industry-standard tools like UiPath, Power Automate, Zapier, and Make (Integromat) (see the sketch at the end of this posting).
- Collaborate with various departments to understand process bottlenecks and recommend AI- or automation-driven improvements tailored to team-specific requirements.
- Develop, test, and maintain REST APIs and webhook-based integrations to connect platforms such as CRMs, Gmail, Slack, internal dashboards, and third-party tools.
- Create intuitive front-end tools using frameworks like Streamlit, Flask, or React for internal stakeholders to interact with AI workflows, monitor outputs, and initiate automation tasks.
- Handle and analyze large datasets using Pandas and SQL for data cleaning, transformation, and processing prior to automation or AI analysis.
- Implement monitoring and logging mechanisms to ensure the stability, scalability, and accuracy of deployed automation workflows and AI tools.
- Document all created tools, scripts, workflows, and processes for easy maintenance, handover, and compliance.
- Continuously explore and test new APIs, tools, and LLMs to improve solution effectiveness and stay ahead of technological trends in AI and automation.
- Work closely with DevOps/IT teams when required to deploy tools on secure internal servers or cloud environments and ensure compliance with security protocols.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- 3-5 years of relevant experience in automation development or AI tooling.
- Strong programming experience in Python (mandatory).
- Familiarity with automation tools (UiPath, Selenium, Power Automate, Zapier).
- API development and integration experience.
- Knowledge of AI/ML APIs (OpenAI, Hugging Face, etc.) and LLMs.
- Good understanding of data handling with Pandas, SQL, and Google Sheets scripting.
- Strong communication and documentation skills.

Soft Skills:
- Problem-solving and analytical thinking.
- Quick learner with the ability to adapt to new tools and platforms.
- Proactive and self-motivated attitude.
- Excellent collaboration and communication abilities.

Interview Process:
- Resume + portfolio screening (GitHub, project demos, documentation).
- Technical assignment focused on automation or AI integration.
- In-depth technical interview covering past experience and tool knowledge.
- Managerial round assessing scalability, workflow logic, and ownership.
- HR round for salary discussion and culture fit.

If you're passionate about automation, AI integration, and building tools that truly make an impact, we'd love to hear from you!
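For the Selenium automation item in the responsibilities above, a minimal sketch; the URL and CSS selector are placeholders, and Selenium 4 with a local Chrome driver is assumed:

```python
# Sketch: routine browser automation -- open a page and scrape headline text.
# URL and selector are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # Selenium 4 resolves the driver automatically
try:
    driver.get("https://example.com/reports")  # placeholder internal page
    for el in driver.find_elements(By.CSS_SELECTOR, "h2.report-title"):
        print(el.text)
finally:
    driver.quit()  # always release the browser session
```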
Posted 6 days ago
6.0 - 8.0 years
0 Lacs
india
On-site
About the Role:

The Team:
As a member of the EDO, Collection Platforms & AI - Cognitive Engineering team, you will work on building GenAI-driven and ML-powered products and capabilities to power natural language understanding, data extraction, information retrieval, and data sourcing solutions for S&P Global. You will define AI strategy, mentor others, and drive production-ready AI products and pipelines while leading by example in a highly engaging work environment. You will work in a (truly) global team and be encouraged to take thoughtful risks and show self-initiative.

What's in it for you:
- Be a part of a global company and build solutions at enterprise scale
- Lead and grow a highly skilled, hands-on technical team (including mentoring junior data scientists)
- Contribute to solving high-complexity, high-impact problems end-to-end
- Architect and oversee production-ready pipelines from ideation to deployment

Responsibilities:
- Define AI roadmap, tooling choices, and best practices for model building, prompt engineering, fine-tuning, and vector retrieval systems
- Architect, develop, and deploy large-scale ML and GenAI-powered products and pipelines
- Own all stages of the data science project lifecycle, including: identification and scoping of high-value data science and AI opportunities; partnering with business leaders, domain experts, and end-users to gather requirements and align on success metrics; and evaluation, interpretation, and communication of results to executive stakeholders
- Lead exploratory data analysis, proof-of-concepts, model benchmarking, and validation experiments for both ML and GenAI approaches
- Establish and enforce coding standards, perform code reviews, and optimize data science workflows
- Drive deployment, monitoring, and scaling strategies for models in production (including both ML and GenAI services)
- Mentor and guide junior data scientists; foster a culture of continuous learning and innovation
- Manage stakeholders across functions to ensure alignment and timely delivery

Technical Requirements:
- Hands-on experience with large language models (e.g., OpenAI, Anthropic, Llama), prompt engineering, fine-tuning/customization, and embedding-based retrieval (see the sketch at the end of this posting)
- Expert proficiency in Python (NumPy, Pandas, SpaCy, scikit-learn, PyTorch/TF 2, Hugging Face Transformers)
- Deep understanding of ML and deep learning models, including architectures for NLP (e.g., transformers), GNNs, and multimodal systems
- Strong grasp of statistics, probability, and the mathematics underpinning modern AI
- Ability to surf and synthesize current AI/ML research, with a track record of applying new methods in production
- Proven experience on at least one end-to-end GenAI or advanced NLP project: custom NER, table extraction via LLMs, Q&A systems, summarization pipelines, OCR integrations, or GNN solutions
- Familiarity with orchestration and deployment tools: Redis, Flask/Django/FastAPI, SQL, R-Shiny/Dash/Streamlit
- Openness to evaluate and adopt emerging technologies and programming languages as needed

Good to have:
- Master's or Ph.D. in Computer Science, Statistics, Mathematics, or a related field (minimum Bachelor's)
- 6+ years of relevant experience in Data Science/AI, with at least 2 years in a leadership or technical lead role
- Prior experience in the Economics/Financial industry, especially with market-intelligence or risk analytics products
- Public contributions or demos on GitHub, Kaggle, StackOverflow, technical blogs, or publications

What's In It For You

Our Purpose:
Progress is not a self-starter.
It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit:

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment.
Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity.

Equal Opportunity Employer:
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision.

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)
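For the embedding-based retrieval requirement flagged in the listing above, a minimal sketch that ranks documents by cosine similarity to a query; the embedding model name is an illustrative choice, and any embedding provider works the same way:

```python
# Sketch: embedding-based retrieval -- rank documents by cosine similarity.
# The embedding model is a placeholder choice.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

docs = ["Q3 revenue rose 12%", "A new CFO was appointed", "Capex on data centers grew"]
doc_vecs = embed(docs)
query_vec = embed(["Who joined the executive team?"])[0]

# Cosine similarity between the query and each document, then rank.
sims = doc_vecs @ query_vec / (
    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
)
print(docs[int(np.argmax(sims))])  # best match: the CFO headline
```

At production scale the dot products move into a vector store rather than in-memory NumPy.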
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
bengaluru, karnataka, india
On-site
GenAI & Agentic AI Consultant

Role Overview:
We are looking for a dynamic and skilled GenAI & Agentic AI Consultant who can lead the development of intelligent, autonomous solutions using generative and agentic AI technologies. The ideal candidate will bring a strong foundation in applied statistics, machine learning, and data engineering, along with excellent communication and storytelling abilities to contribute to strategic initiatives, including RFPs and stakeholder engagements.

Key Responsibilities:
- Design and implement GenAI and Agentic AI solutions tailored to business needs.
- Apply statistical techniques including distributions, hypothesis testing, and regression analysis.
- Build and optimize machine learning models using algorithms such as Neural Networks, Naive Bayes, SVM, Decision Forests, etc.
- Collaborate with cross-functional teams to define and deliver impactful AI use cases.
- Contribute to RFPs, solution proposals, and client presentations with clear, compelling narratives.
- Perform advanced data manipulation, transformation, and integration across varied data sources.
- Utilize tools and frameworks such as Python, PySpark, R, SQL, Pandas, NumPy, Scikit-learn, Keras, TensorFlow, and Streamlit.
- Work with Big Data platforms and adopt new technologies and languages as needed.
- Incorporate NLP and text analytics into GenAI solutions where applicable.
- Ensure alignment with data governance, privacy, and ethical AI standards.

Preferred Skills & Experience:
- Hands-on experience with GenAI technologies and frameworks.
- Exposure to Agentic AI concepts and autonomous system design.
- Strong communication and storytelling skills; able to simplify complex ideas for diverse audiences.
- Experience contributing to RFPs, solutioning, and client-facing engagements.
- Proficiency in Python, PySpark, R, SQL, and Big Data tools.
- Familiarity with cloud platforms (Azure, AWS, GCP); cloud certification is a plus.
- Understanding of MDM (Master Data Management) and Data Quality practices.
- Experience in building scalable data engineering pipelines and architectures.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 4+ years (for Consultant) or 6+ years (for Sr. Consultant) of experience in AI/ML, data analytics, or related domains.
- Cloud certification (Azure, AWS, or GCP) is preferred.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
As a member of Bain Capability Network's People & ORG CoE team, you will be responsible for developing and maintaining advanced internal tools with a focus on Python. Your role will involve end-to-end handling of tool processes, troubleshooting errors, and assisting in case delivery to provide valuable insights to Bain clients. Additionally, you will have the opportunity to build and maintain internal web applications using front-end technologies and frameworks like Streamlit, ensuring compatibility across various devices and browsers.

Working under the guidance of a Team Manager, you will play a crucial role in driving innovation within the team, particularly in GenAI topics. Your responsibilities will include leading internal team calls, effectively communicating data and insights, and staying updated on the latest statistical, database, machine learning, and advanced analytics techniques.

To excel in this role, you should have a Graduate/Post-graduate degree from a top-tier college with a strong academic background. A minimum of 4 years of experience in Python is required, with a preference for candidates with experience in GenAI, LLMs, or Machine Learning. Proficiency in database design, Azure/AWS servers, SQL, Git, and hands-on experience with statistical and machine learning models in business contexts is highly desirable. Knowledge of HTML, CSS, JavaScript, and low-code development tools such as Streamlit will be beneficial.

You should be capable of generating realistic solutions and recommending actionable steps, as well as owning and maintaining high-impact products. Strong communication skills, both oral and written, are essential for effectively engaging with technical and non-technical stakeholders. The ability to prioritize projects, manage competing priorities, and drive projects to completion under tight deadlines is crucial.

Bain & Company is renowned for being a top employer globally, championing diversity, and practicing social responsibility. Joining our team means becoming part of an inclusive and collaborative environment that encourages personal and professional growth. We value exceptional talents and abilities and provide a platform for you to thrive and reach your full potential. Recognized by various external parties for our commitment to diversity and inclusion, Bain offers a rewarding and supportive workplace where you can truly excel.
Posted 1 week ago
0.0 years
0 Lacs
bengaluru, karnataka, india
On-site
About Hevo:
At Hevo, we are changing the way companies leverage data to drive user experience, growth, and business processes. There has been a fundamental change in the amount of data companies are generating on a day-to-day basis. More and more users in an organization are now looking to use data to drive business decisions. Data is no longer a second-class citizen, and companies are seeing data as a competitive advantage. We see this change and we are on a mission to change the way companies leverage their data. With a technology platform processing more than 100 billion records a month and doubling itself every 6 months, Hevo is poised for exponential growth in the coming future. This position provides a unique opportunity to create a massive impact on all existing and future customers of Hevo through technology innovation. If you are looking to take a leap of faith and work on the technology of the future, and if you obsess over customer satisfaction and experience, then we are looking for you.

Role Summary:
The AI Solutions Engineer will prototype, test, and deliver internal AI applications that improve efficiency across GTM teams (Marketing, Sales, BizOps). The role combines applied AI/ML knowledge, software engineering skills, and a strong understanding of GTM workflows to deliver practical, business-impacting solutions.

Key Responsibilities:
AI Application Development
- Build internal applications using LLMs and AI APIs (e.g., OpenAI, Anthropic).
- Integrate AI into GTM tools like HubSpot, Salesforce, LinkedIn, Outreach, Google Sheets, etc.
- Develop scripts and automations for data enrichment, copy assistance, and reporting.

Prototyping & Experimentation
- Rapidly test new AI models and APIs, validate feasibility, and create MVPs.
- Evaluate and document performance, accuracy, and usability for GTM use cases.

Cross-functional Collaboration
- Work closely with Marketing, Sales, and BizOps teams to capture pain points and translate them into AI-driven solutions.
- Provide demos, documentation, and training to internal users.

Operational Support
- Maintain and optimize deployed tools.
- Ensure compliance, data privacy, and reliability in AI-driven workflows.

What Success Looks Like:
- You've delivered 2-3 working AI prototypes that GTM teams actively use.
- You've automated or optimized at least 2 GTM workflows, saving time and improving accuracy.
- You've documented solutions in a way that makes them scalable and transferable.
- You've demonstrated measurable impact on GTM productivity (hours saved, efficiency gains, improved decision-making).

What We're Looking For:
Knowledge & Skills:
- Strong understanding of AI/ML fundamentals (NLP, LLMs, embeddings, prompt engineering).
- Proficiency in Python and working with APIs (REST/GraphQL).
- Experience with prototyping tools/frameworks (Flask, FastAPI, Streamlit, or similar).
- Familiarity with GTM systems such as Salesforce, HubSpot, Marketo, or BI tools (Tableau, Looker).
- Strong problem-solving skills and the ability to rapidly prototype creative solutions.
- Up to 5 years of experience in building end-to-end applications.

Abilities:
- Translate messy GTM data into usable inputs for AI models.
- Map AI outputs into real-world GTM workflows and processes.
- Communicate technical solutions clearly to non-technical stakeholders.
- Balance speed of experimentation with reliability and compliance.

Why Join Us:
- Work on cutting-edge AI applications in a high-growth SaaS company.
- Be part of the GTM innovation function, directly impacting Marketing, Sales, and BizOps productivity.
- Gain exposure to real-world AI use cases that scale with thousands of customers.
- Collaborate with cross-functional teams and own end-to-end AI solutions from idea to deployment.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
bengaluru, karnataka, india
On-site
This job is with ABB, an inclusive employer and a member of myGwork, the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

At ABB, we help industries outrun - leaner and cleaner. Here, progress is an expectation - for you, your team, and the world. As a global market leader, we'll give you what you need to make it happen. It won't always be easy; growing takes grit. But at ABB, you'll never run alone. Run what runs the world.

This position reports to: E2E Analytics Lead

Your Role and Responsibilities:
In this role, you will have the opportunity to design and develop end-to-end analytics solutions for our Marketing & Sales and Finance Departments. Based on stakeholder requirements, you will work with business data to uncover actionable insights, define key metrics, and identify trends that drive performance improvement. Additionally, you will demonstrate your expertise by creating intuitive, visually compelling dashboards, adhering to best practices in data visualization and user experience.

The work model for the role is hybrid. This role is contributing to the Robotics & Discrete Automation business in Bangalore. This position reports to the Global End to End Analytics Leader.

You will be mainly accountable for:
- Executing quantitative analysis that translates data into actionable insights.
- Interpreting data, analyzing results using statistical techniques, and providing ongoing reports.
- Owning the knowledge capture and design aspects and ensuring they are well documented to maintain and enhance overall system integrity.
- Contributing to design and coding standards, technology and tool selections, and defining new data collection and analysis processes.
- Acting proactively to propose new scenarios for analysis and insights.

You will join a high-performing, forward-thinking, and collaborative team where you will have the opportunity to thrive, grow, and make a measurable impact.

Qualifications for the Role:
- You are well versed in data analytics and fluent in SQL, DAX, and Power BI; you enjoy applying these skills to solve real-world problems. Candidates with exposure to Snowflake, Python, and Streamlit are encouraged to apply, though these skills are not mandatory for the role.
- You have advanced skills and the ability to demonstrate your experience in data analytics and business intelligence.
- With over 3 years of experience in Data & Analytics, you bring proven expertise and practical knowledge to the role.
- You're driven by curiosity and innovation, bringing energy, creativity, and a passion for data to every challenge. You don't just think outside the box; you challenge its very existence. As a proactive BI specialist, you thrive on turning complex problems into actionable insights, injecting fresh ideas, and collaborating with stakeholders to deliver transformative solutions. Eager to grow and always ready to disrupt the status quo, you bring a forward-thinking mindset that helps shape smarter, data-informed decisions.
- You hold a University degree in Computer Science, Information Systems, Economics, Finance, or a related field; equivalent experience in data analysis, reporting, or BI is welcomed.
- You are at ease communicating in English and collaborating with international teams and stakeholders.

More About Us:
ABB Robotics & Discrete Automation Business area provides robotics, and machine and factory automation including products, software, solutions and services.
Revenues are generated both from direct sales to end users as well as from indirect sales, mainly through system integrators and machine builders. www.abb.com/robotics

We value people from different backgrounds. Could this be your story? Apply today or visit www.abb.com to read more about us and learn about the impact of our solutions across the globe.

Fraud Warning: Any genuine offer from ABB will always be preceded by a formal application and interview process. We never ask for money from job applicants. For current open positions you can visit our career website https://global.abb/group/en/careers and apply. Please refer to the detailed recruitment fraud caution notice using the link https://global.abb/group/en/careers/how-to-apply/fraud-warning.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
bengaluru, karnataka, india
On-site
At ABB, we help industries outrun - leaner and cleaner. Here, progress is an expectation - for you, your team, and the world. As a global market leader, we'll give you what you need to make it happen. It won't always be easy; growing takes grit. But at ABB, you'll never run alone. Run what runs the world.

This position reports to: E2E Analytics Lead

Your role and responsibilities:
In this role, you will have the opportunity to design and develop end-to-end analytics solutions for our Marketing & Sales and Finance Departments. Based on stakeholder requirements, you will work with business data to uncover actionable insights, define key metrics, and identify trends that drive performance improvement. Additionally, you will demonstrate your expertise by creating intuitive, visually compelling dashboards, adhering to best practices in data visualization and user experience.

The work model for the role is hybrid. This role is contributing to the Robotics & Discrete Automation business in Bangalore. This position reports to the Global End to End Analytics Leader.

You will be mainly accountable for:
- Executing quantitative analysis that translates data into actionable insights.
- Interpreting data, analyzing results using statistical techniques, and providing ongoing reports.
- Owning the knowledge capture and design aspects and ensuring they are well documented to maintain and enhance overall system integrity.
- Contributing to design and coding standards, technology and tool selections, and defining new data collection and analysis processes.
- Acting proactively to propose new scenarios for analysis and insights.

You will join a high-performing, forward-thinking, and collaborative team where you will have the opportunity to thrive, grow, and make a measurable impact.

Qualifications for the role:
- You are well versed in data analytics and fluent in SQL, DAX, and Power BI; you enjoy applying these skills to solve real-world problems. Candidates with exposure to Snowflake, Python, and Streamlit are encouraged to apply, though these skills are not mandatory for the role.
- You have advanced skills and the ability to demonstrate your experience in data analytics and business intelligence.
- With over 3 years of experience in Data & Analytics, you bring proven expertise and practical knowledge to the role.
- You're driven by curiosity and innovation, bringing energy, creativity, and a passion for data to every challenge. You don't just think outside the box; you challenge its very existence. As a proactive BI specialist, you thrive on turning complex problems into actionable insights, injecting fresh ideas, and collaborating with stakeholders to deliver transformative solutions. Eager to grow and always ready to disrupt the status quo, you bring a forward-thinking mindset that helps shape smarter, data-informed decisions.
- You hold a University degree in Computer Science, Information Systems, Economics, Finance, or a related field; equivalent experience in data analysis, reporting, or BI is welcomed.
- You are at ease communicating in English and collaborating with international teams and stakeholders.

More about us:
ABB Robotics & Discrete Automation Business area provides robotics, and machine and factory automation including products, software, solutions and services. Revenues are generated both from direct sales to end users as well as from indirect sales, mainly through system integrators and machine builders. www.abb.com/robotics

We value people from different backgrounds.
Could this be your story? Apply today or visit www.abb.com to read more about us and learn about the impact of our solutions across the globe.

Fraud Warning: Any genuine offer from ABB will always be preceded by a formal application and interview process. We never ask for money from job applicants. For current open positions you can visit our career website and apply. Please refer to the detailed recruitment fraud caution notice using the link.
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
karnataka
On-site
As an intern at Akaike Technologies, you will be engaged in a variety of exciting tasks related to working with Large Language Models (LLMs) such as GPT, LLaMA, or Mistral to develop real-world applications. Your responsibilities will include developing effective prompts to enhance AI-generated outputs, implementing Retrieval-Augmented Generation (RAG) for improved knowledge retrieval, and applying classic ML techniques (classification, regression, clustering) using tools like scikit-learn, TensorFlow, or PyTorch.

You will also be involved in working with Hugging Face Transformers for pre-trained model adaptation, developing RESTful APIs to seamlessly integrate AI models into applications, and collaborating with APIs from OpenAI, Cohere, Hugging Face, and Google AI for model interactions. Additionally, you will implement frameworks like LangChain for LLM-driven applications, deploy AI apps using backend frameworks like Flask and FastAPI, and utilize lightweight frontend technologies such as Streamlit, Gradio, etc.

Your role will also include tasks such as collecting and preprocessing both structured and unstructured data for ML training, and applying NLP techniques like tokenization, named entity recognition, and text summarization, alongside a good understanding of SQL querying and PySpark. You will deploy models using Docker and cloud platforms such as AWS, GCP, and Azure, ensuring clear documentation of workflows, experiments, and findings, and sharing insights with the team on emerging AI techniques and tools.

Join us at Akaike Technologies, where we harness the power of Computer Vision, NLP, Predictive Analytics, and Generative AI to deliver intelligent solutions that redefine what's possible. Over the years, we have amazed clients with our innovative AI solutions that have revolutionized industries, enhanced efficiency, and pushed the boundaries of technological advancement.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
indore, madhya pradesh
On-site
You should have 2-4 years of experience in the field. The job is located in Indore.

Your primary skills and qualifications should include strong proficiency in Python and related libraries like Pandas and NumPy. You should also be familiar with PandasAI and Streamlit for AI-driven data visualization and application development. Experience in building and deploying AI chatbots is required, along with a solid understanding of AI algorithms, data analysis, and machine learning techniques. Experience in integrating AI solutions with ERP systems, particularly in the construction/real estate or transport sectors, is preferred, as is knowledge of cloud platforms such as AWS, Azure, and Google Cloud for deploying AI applications.

This is a full-time position with benefits including health insurance and Provident Fund. The job requires in-person work during day and morning shifts. You should be able to reliably commute to, or plan to relocate to, Indore, Madhya Pradesh before starting work. The required experience includes 1 year in AI and 2 years in Python.

Overall, you will be responsible for utilizing your expertise in Python, AI, and related technologies to develop and deploy AI solutions, particularly in the context of ERP systems and data visualization.
Posted 1 week ago
9.0 - 11.0 years
0 Lacs
gurugram, haryana, india
Remote
Job Description:
This is a remote position.

Job Summary:
We are looking for a Senior Data Quality & Governance Engineer who will take ownership of enforcing data contracts, quality, and metadata standards across intelligence and analytics platforms. This role involves designing lineage models, implementing statistical validations, and establishing reusable quality dashboards, all using AWS-native and lightweight open-source tools.

Responsibilities:
- Design and enforce data contracts across Raw → Clean → Modeled zones.
- Define schema versioning policies, KPI logic, and metadata rules across business-critical datasets.
- Implement row-level and statistical validations using Great Expectations or Deequ (see the sketch at the end of this posting).
- Create audit-ready QA tables to track failed checks, schema mismatches, and data regressions.
- Track end-to-end lineage and KPI evolution using OpenMetadata (with Glue/S3/Athena).
- Auto-classify columns as PII, derived, or forecast-driving fields using AWS Glue tags/scripts.
- Provide CTAS-based Athena queries for building QA dashboards and UAT verifications.
- Build BI-ready, QA-approved datasets for downstream tools like Superset and Power BI.
- Establish reusable profiling and validation dashboards for data quality and business teams.
- Collaborate with engineers, QA, and business SMEs to finalize data validation logic.

Requirements:
Essential Skills (Job):
- Hands-on experience with AWS Glue (Jobs, Crawlers, Catalog), S3, and Athena.
- Strong foundation in data contracts, quality enforcement, and schema versioning.
- Expertise in using Deequ or Great Expectations for anomaly detection and data validation.
- Familiarity with OpenMetadata, Amundsen, or custom metadata tracking solutions.
- Ability to tag and manage sensitive data fields (e.g., PII, model inputs, derived KPIs).
- Strong SQL with Athena (CTEs, CTAS, filters, aggregations).
- Experience building QA dashboards in Superset, Streamlit, or similar BI tools.

Essential Skills (Personal):
- Excellent collaboration and communication with QA, architects, and business teams.
- Self-driven with attention to detail in schema accuracy and metadata enrichment.
- Ability to translate KPIs and quality rules into validation logic.
- Proactive in surfacing data regressions and audit issues before they reach production.
- High ownership mindset with a strong data compliance and governance attitude.

Preferred Skills (Job):
- Implementation of statistical QA techniques like z-score anomalies or entropy thresholds.
- Experience handling rejected-record logs, schema drift validations, and data reconciliation.
- Awareness of AWS cost optimization techniques in Glue and Athena.

Preferred Skills (Personal):
- Proactive, ownership-driven mindset with a collaborative approach.
- Strong communication and collaboration skills.
- Strong problem-solving skills with attention to detail.
- Ability to work under stringent deadlines and demanding client conditions.
- Strong analytical and problem-solving skills.
- Ability to work in fast-paced, delivery-focused environments.
- Strong mentoring and documentation skills.
- Ability to take end-to-end ownership of QA validation modules.

Other Relevant Information:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 9+ years of experience in data engineering and architecture.

Benefits:
This role offers the flexibility of working remotely in India. LeewayHertz is an equal opportunity employer and does not discriminate based on race, color, religion, sex, age, disability, national origin, sexual orientation, gender identity, or any other protected status.
We encourage a diverse range of applicants.
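For the Great Expectations validations flagged in the responsibilities above, a minimal row-level sketch in the classic pandas-dataset style; note the Great Expectations API differs substantially across versions, and the column names and thresholds are hypothetical:

```python
# Sketch: row-level data validation in the classic Great Expectations
# pandas-dataset style (API varies by GE version). Columns are hypothetical.
import great_expectations as ge
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, None, 4],
    "amount": [120.0, -5.0, 40.0, 9800.0],
})
gdf = ge.from_pandas(df)

checks = [
    gdf.expect_column_values_to_not_be_null("order_id"),
    gdf.expect_column_values_to_be_between("amount", min_value=0, max_value=5000),
]
# In a real pipeline, failed checks are written to an audit/QA table.
for c in checks:
    print(c.success, c.expectation_config.expectation_type)
```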
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
punjab
On-site
You are a skilled and proactive Senior Data Engineer with 5-8 years of hands-on experience in Snowflake, Python, Streamlit, and SQL. You also have expertise in consuming REST APIs and working with modern ETL tools such as Matillion, Fivetran, etc. Your strong foundation in data modeling, data warehousing, and data profiling will be crucial as you play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Your responsibilities will include designing, developing, and maintaining data pipelines and workflows using Snowflake and an ETL tool (see the sketch at the end of this posting). You will develop data applications and dashboards using Python and Streamlit, as well as create and optimize complex SQL queries for data extraction, transformation, and loading. Integrating REST APIs for data access and process automation, performing data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity, and designing and implementing scalable and efficient data models aligned with business requirements are also part of your key responsibilities.

To excel in this role, you must have experience in HR data and databases, along with 5-8 years of professional experience in a data engineering or development role. Strong expertise in Snowflake, proficiency in Python (including data manipulation with libraries like Pandas), experience building web-based data tools using Streamlit, and a solid understanding and experience with RESTful APIs and JSON data structures are essential. Strong SQL skills, experience with advanced data transformation logic, and hands-on experience in data modeling, data warehousing concepts, and data profiling techniques are also required. Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred qualifications include experience working in cloud environments (AWS, Azure, or GCP), knowledge of data governance and cataloging tools, experience with agile methodologies and working in cross-functional teams, and experience in HR data and databases. Experience in Azure Data Factory would also be beneficial.
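For the Snowflake pipeline work referenced above, a minimal sketch pulling a result set into pandas with the Snowflake Python connector; connection parameters and table names are placeholders, and credentials should come from a secrets manager rather than source code:

```python
# Sketch: query Snowflake into a pandas DataFrame. All identifiers are
# placeholders; real credentials belong in a secrets manager.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical account identifier
    user="etl_user",
    password="...",             # placeholder
    warehouse="ANALYTICS_WH",
    database="HR_DB",
    schema="CLEAN",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT employee_id, dept, hire_date FROM employees LIMIT 100")
    df = cur.fetch_pandas_all()  # needs the connector's pandas extra installed
    print(df.head())
finally:
    conn.close()
```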
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
About us: Vertiv brings together hardware, software, analytics, and ongoing services to ensure its customers' vital applications run continuously, perform optimally, and scale with business needs. Vertiv solves the most important challenges facing today's data centers, communication networks, and commercial and industrial facilities with a portfolio of power, cooling, and IT infrastructure solutions and services that extends from the cloud to the edge of the network.

Job Summary: As an AI Developer at Vertiv, you will be responsible for designing, developing, and implementing AI models and algorithms to address complex business challenges. You will work with large datasets, collaborate with cross-functional teams, and integrate AI solutions into existing systems. Your role will involve optimizing AI models for performance and scalability, developing ETL processes, and deploying models on cloud platforms. You will also stay updated with the latest advancements in AI and machine learning technologies.

Key Duties and Responsibilities:
- Design, develop, and implement AI models and algorithms to solve complex business problems.
- Work with large datasets to extract meaningful insights and build predictive models.
- Collaborate with cross-functional teams to integrate AI solutions into existing systems.
- Optimize and maintain AI models for performance and scalability.
- Develop and maintain data pipelines.
- Utilize SQL and Big Data platforms (preferably Snowflake) for data manipulation and analysis.
- Deploy AI models on cloud platforms such as AWS or similar.
- Create AI web apps (preferably Streamlit).
- Implement version control and CI/CD pipelines (preferably using GitLab).
- Stay updated with the latest advancements in AI and machine learning technologies.
- Document and present findings and solutions to stakeholders.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong experience with SQL and Big Data platforms (Snowflake preferred).
- Knowledge of cloud platforms (AWS or similar).
- Proficiency in Python programming.
- Knowledge of version control and CI/CD pipelines (GitLab preferred).
- Experience with machine learning projects and large language models (e.g., LLaMA, Mistral, Claude).
- Fluent in English with excellent communication skills.
- Strong problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Master's degree in a related field.
- Experience with Snowflake platforms and the AWS environment.
- Proficiency in Python programming for data transformation and ML processes.
- Previous experience in a similar role.
- Experience with ETL tools (Matillion preferred).
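As a hypothetical illustration of the "build predictive models" duty above, here is a small scikit-learn sketch: train a classifier on synthetic data and persist it for later deployment. The feature semantics, labels, and file name are invented for the example.

```python
# Illustrative only: train, evaluate, and save a simple predictive model.
import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=42)
X = rng.normal(size=(1000, 5))                 # synthetic sensor features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic failure label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Persist the model so it can be packaged into a container or cloud endpoint.
joblib.dump(model, "failure_model.joblib")
```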
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
bhopal, madhya pradesh
On-site
You will be joining Star Software, an AI-powered automation company focused on revolutionizing document processing and workflows. The company specializes in processing complex documents and reports to facilitate precise data extraction. Star Software's technology stack is driven by AI, OCR, and RPA, ensuring accuracy, efficiency, and seamless integration into existing workflows. Your role will involve working with the following tech stacks, key skills, and tools:

- **Language Skills**: Strong proficiency in Python; familiarity with libraries such as Transformers, the OpenAI SDK, and Hugging Face Datasets.
- **Machine Learning/Deep Learning**: Proficiency in PyTorch and optionally TensorFlow; knowledge of fine-tuning methods such as LoRA, PEFT, and QLoRA; understanding of Transformers, attention, and tokenization.
- **NLP Techniques**: Experience with Named Entity Recognition (NER); proficiency in question answering, summarization, and text classification.
- **Model Handling**: Familiarity with OpenAI models (GPT-4, GPT-3.5) and Hugging Face models (LLaMA, Mistral, Falcon, etc.); ability to work with custom-trained models.
- **Tools & Infrastructure**: Experience with Weights & Biases and MLflow for experiment tracking; proficiency in Docker and Kubernetes for containerization; knowledge of Streamlit/Gradio for demos, FastAPI/Flask for APIs, and Git and CI/CD pipelines.

Preferred qualifications for this role include:
- Graduation from a Tier 1 college.
- A minimum of two years of experience building quality applications.
- Strong working knowledge of web technologies and APIs.
- Experience in SaaS product development is a plus.
- Previous experience in IDP (Intelligent Document Processing) is desirable.

If you are passionate about leveraging AI to streamline document processing and are proficient in the tech stacks and skills above, we encourage you to apply for this exciting opportunity at Star Software.
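A minimal sketch of the NER extraction mentioned above, using the Hugging Face `pipeline` API. The checkpoint name is an example public model, not one specified by the posting.

```python
# Illustrative NER over document text with the transformers pipeline API.
from transformers import pipeline

ner = pipeline(
    "ner",
    model="dslim/bert-base-NER",    # example public NER checkpoint
    aggregation_strategy="simple",  # merge sub-word tokens into whole entities
)

text = "Invoice 4821 was issued by Acme Corp in Mumbai on 12 March 2024."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```

For domain-specific document types, this base checkpoint would typically be fine-tuned (e.g., with LoRA/PEFT) on labeled extraction data.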
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
noida, uttar pradesh
On-site
RocketFrog.ai is an AI Studio for Business, specializing in delivering advanced AI solutions across industries such as Healthcare, Pharma, BFSI, Hi-Tech, and Consumer Services. We focus on Agentic AI, Deep Learning models, and AI-driven product development to make a significant impact on businesses. Are you ready to take a Rocket Leap with Science?

As an Assistant Agentic AI Engineer at RocketFrog.ai, you will need at least 1 year of experience developing LangChain/LangGraph-based solutions. The ideal candidate has a foundational understanding of LLMs, prompt engineering, and AI frameworks, along with a strong willingness to learn and keep up with the latest advancements in AI.

Key Responsibilities:
- Assist in implementing Agentic AI frameworks such as LangGraph, LangChain, and CrewAI to develop intelligent workflows.
- Work with various LLMs (ChatGPT, Gemini, Claude) to understand their capabilities and limitations.
- Support prompt engineering efforts to enhance LLM interactions, and apply Retrieval-Augmented Generation (RAG) techniques for improved AI response generation.
- Experiment with pretrained models on platforms such as Hugging Face and GitHub.
- Contribute to system architectures using FastAPI, Uvicorn, and Celery, and to AI pipelines.
- Build interactive AI applications using Streamlit.
- Participate in ML model development, applying the model development lifecycle and performance evaluation metrics while staying updated on new AI innovations through continuous learning.

Required Skills & Expertise:
- Basic experience with LangChain/LangGraph-based solutions, prompt engineering, Agentic AI frameworks, Retrieval-Augmented Generation (RAG) techniques, system architecture, AI/ML platforms, and ML frameworks such as Keras, TensorFlow, or PyTorch.
- Proficiency in Streamlit for AI application development and strong foundational knowledge of data structures, probability and statistics, and machine learning concepts.

Additional Skills:
- Ability to articulate AI/ML concepts to both technical and non-technical stakeholders.
- Strong problem-solving skills and a keenness to learn.

Required Background:
- Bachelor's degree in Computer Science, IT, Data Science, or AI.
- At least 1 year of hands-on experience in AI/ML development.

Why Join RocketFrog.ai?
- Be part of RocketFrog.ai's growth journey and make a real business impact with AI.
- Gain practical experience with cutting-edge AI technologies.
- Learn from industry experts and advance your career in Agentic AI.

If you are passionate about AI and eager to expand your skills, let's connect!
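As a minimal sketch of the prompt-engineering and RAG-style work described above, here is a call using the OpenAI Python SDK (v1 interface) where retrieved context is stuffed into the prompt. The model name, prompt wording, and use case are illustrative assumptions.

```python
# Illustrative RAG-style prompting with the OpenAI SDK.
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are a claims-triage assistant. Answer only from the "
                "provided context; if the context is insufficient, say so."
            ),
        },
        {
            "role": "user",
            "content": (
                "Context: <retrieved documents go here>\n\n"
                "Question: Is this claim eligible for fast-track review?"
            ),
        },
    ],
    temperature=0.2,  # low temperature for more deterministic answers
)
print(response.choices[0].message.content)
```

Frameworks like LangChain and LangGraph wrap this same pattern with retrievers, memory, and multi-step agent graphs.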
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
delhi
On-site
We are looking for a Data Science Intern with a passion for Computer Vision to explore and implement cutting-edge models. The ideal candidate can break down intricate tasks, improve models, and adapt quickly to a dynamic environment. Effective communication skills are crucial for collaborating with mentors and colleagues, and an emphasis on clean code and detailed documentation is highly appreciated.

Key Responsibilities:
- Build, fine-tune, and optimize computer vision models for tasks such as object detection, segmentation, and OCR using TensorFlow/PyTorch.
- Develop data preprocessing pipelines with OpenCV and NumPy to enhance model performance.
- Investigate transfer learning methods to tailor pre-trained models to specific use cases.
- Collaborate with teams to incorporate AI models into applications using Streamlit or Flask.
- Work on end-to-end projects, from prototyping to deploying solutions with Docker.
- Perform experiments to evaluate model performance and record findings.
- Contribute to version-controlled codebases using Git to ensure organized and effective development.
- Keep abreast of deep learning advancements such as YOLO iterations and transformer-based architectures.

Required Skills:
- Proficiency in Python with hands-on experience in OpenCV, NumPy, and deep learning frameworks like TensorFlow/PyTorch.
- Basic comprehension of linear algebra principles, including vectors and matrices, for image manipulations.
- Exposure to deep learning architectures, particularly CNNs, for image classification.
- Familiarity with advanced computer vision techniques such as transfer learning, YOLO, OCR, or segmentation is advantageous.
- Experience with model deployment, lightweight web applications (Streamlit/Flask), or Docker is beneficial.
- Basic understanding of version control tools like Git is a plus.
- A portfolio showcasing computer vision projects or involvement in open-source initiatives/hackathons is a bonus.

Job Type: Internship
Duration: 3-6 months
Pre-Placement Offer (subject to performance)
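To illustrate the OpenCV/NumPy preprocessing pipelines this internship mentions, here is a minimal sketch that loads an image, converts its color space, resizes it, and scales pixel values for a CNN. The file path and target size are placeholders.

```python
# Illustrative image-preprocessing pipeline with OpenCV and NumPy.
import cv2
import numpy as np

def preprocess(path: str, size: tuple[int, int] = (224, 224)) -> np.ndarray:
    img = cv2.imread(path)                      # BGR uint8 array
    if img is None:
        raise FileNotFoundError(path)
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # most frameworks expect RGB
    img = cv2.resize(img, size)                 # match the model's input size
    return img.astype(np.float32) / 255.0       # scale pixels to [0, 1]

# Stack preprocessed images into a batch for inference or training.
batch = np.stack([preprocess("sample.jpg")])    # shape: (1, 224, 224, 3)
print(batch.shape, batch.dtype)
```

Per-channel mean/std normalization would be added here when the downstream model (e.g., an ImageNet-pretrained backbone) expects it.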
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
About Netradyne: Netradyne, founded in 2015, is a technology company that leverages expertise in Artificial Intelligence, Deep Learning, and Edge Computing to provide innovative solutions to the transportation industry. The company's technology is currently deployed in a multitude of vehicles, ranging from passenger cars to semi-trailers, operating on interstates, suburban roads, rural highways, and even off-road terrain. Netradyne is seeking talented engineers to join its Analytics team, which consists of graduates from institutions such as the IITs, IISc, Stanford, UIUC, and UCSD. The team develops cutting-edge AI solutions that enable drivers and fleets to identify unsafe driving scenarios in real time, preventing accidents and reducing fatalities and injuries.

Job Title: Staff Data Engineer - ML
Experience: 5-8 years

Role and Responsibilities: As a Staff Data Engineer specializing in Machine Learning, you will work closely with a team of machine learning engineers and data scientists. Your primary responsibilities will include:
- Designing, developing, and deploying scalable solutions that incorporate GenAI, traditional ML models, data science, and ETL pipelines.
- Collaborating with cross-functional teams to integrate AI-driven solutions into business operations.
- Building and improving automation frameworks, data processing mechanisms, and model deployment procedures.
- Leveraging Gen-AI tools and workflows to enhance the efficiency and effectiveness of AI solutions.
- Conducting research to stay abreast of the latest developments in generative AI and related technologies.
- Delivering essential product features within cloud analytics.

Requirements:
- B.Tech, M.Tech, or PhD in Computer Science, Data Science, Electrical Engineering, Statistics, Mathematics, Operations Research, or a related field.
- Proficient programming skills in Python and SQL, with a solid foundation in computer science principles, algorithms, data structures, and object-oriented programming.
- Experience developing end-to-end solutions on AWS cloud infrastructure.
- Sound understanding of the internal workings and schema design of various data storage systems (RDBMS, vector databases, and NoSQL).
- Familiarity with Gen-AI tools, workflows, and large language models (LLMs).
- Experience with cloud platforms and deploying models at scale.
- Strong analytical and problem-solving abilities with acute attention to detail.
- Thorough knowledge of statistics, probability, and estimation theory.

Desired Skills:
- Familiarity with frameworks such as PyTorch, TensorFlow, and Hugging Face.
- Experience with data visualization tools like Tableau, Grafana, and Plotly Dash.
- Exposure to AWS services such as Kinesis, SQS, EKS, ASG, Lambda, etc.
- Expertise in at least one popular Python web framework (e.g., FastAPI, Django, or Flask).
- Experience in rapid prototyping using Streamlit, Gradio, Dash, etc.
- Exposure to Big Data processing technologies (e.g., Snowflake, Redshift, HDFS, EMR).
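As a sketch of the Python web-framework and rapid-prototyping skills listed above, here is a minimal FastAPI service exposing a model score over HTTP. The endpoint, field names, and scoring logic are invented stand-ins for a real trained model.

```python
# Illustrative FastAPI service for serving a model score.
# Run locally with: uvicorn app:app --reload
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Driver Risk Scorer")

class Event(BaseModel):
    speed_kmh: float
    following_distance_m: float

@app.post("/score")
def score(event: Event) -> dict:
    # Stub in place of a real model: closer following at higher speed = riskier.
    risk = min(1.0, event.speed_kmh / max(event.following_distance_m, 1.0) / 10.0)
    return {"risk_score": round(risk, 3)}
```

The same handler shape works whether the stub is replaced by a scikit-learn model, a PyTorch network, or an LLM call behind the scenes.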
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are a skilled and proactive Snowflake Developer with 5-8 years of experience and a strong background in Python development, particularly with Streamlit. In this long-term contract role, you will be part of an international remote team, designing, developing, and deploying applications using Python and Streamlit and integrating with Snowflake for efficient data querying and processing.

Your key responsibilities will involve collaborating with global stakeholders to develop interactive data tools and visual interfaces, optimizing performance, ensuring scalability, and participating in agile ceremonies. You are required to have 4-8 years of professional experience in software or data engineering, proficiency in Python for data manipulation and application development, hands-on experience with Streamlit, a solid understanding of the Snowflake data platform, and experience with SQL queries and data pipelines. Familiarity with API integrations, basic DevOps for deployment, and experience working in Agile environments are preferred.

Your soft skills should include fluent English communication, self-motivation, the ability to work independently, strong collaboration skills, and openness to feedback. A full-time commitment to the project without any dual employment and willingness to align with Central European Time working hours are essential. The interview process will include a technical assessment, an initial HR screening, and client technical interviews. If you are passionate about building powerful data tools using modern frameworks and technologies and thrive in an international remote environment, please share your profile at hiring@khey-digit.com.
Posted 2 weeks ago