Jobs
Interviews

50 Datarobot Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Where Data Does More. Join the Snowflake team. We are looking for people with a strong background in data science and cloud architecture to join our AI/ML Workload Services team to create exciting new offerings and capabilities for our customers. This team within the Professional Services group works with customers using Snowflake to expand their use of the Data Cloud, bringing data science pipelines from ideation to deployment and beyond using Snowflake's features and its extensive partner ecosystem. The role is highly technical and hands-on: you will design solutions based on requirements and coordinate with customer teams and, where needed, Systems Integrators.
AS A SOLUTIONS ARCHITECT - AI/ML AT SNOWFLAKE, YOU WILL:
• Be a technical expert on all aspects of Snowflake in relation to the AI/ML workload
• Build and deploy ML pipelines using Snowflake features and/or Snowflake ecosystem partner tools based on customer requirements
• Work hands-on where needed using SQL, Python, Java and/or Scala to build POCs that demonstrate implementation techniques and best practices on Snowflake technology within the Data Science workload
• Follow best practices, including ensuring knowledge transfer, so that customers are properly enabled and able to extend the capabilities of Snowflake on their own
• Maintain a deep understanding of competitive and complementary technologies and vendors within the AI/ML space, and how to position Snowflake in relation to them
• Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
• Provide guidance on how to resolve customer-specific technical challenges
• Support other members of the Professional Services team as they develop their expertise
• Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing
OUR IDEAL SOLUTIONS ARCHITECT - AI/ML WILL HAVE:
• A minimum of 10 years' experience working with customers in a pre-sales or post-sales technical role
• Skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
• A thorough understanding of the complete Data Science lifecycle, including feature engineering, model development, model deployment, and model management
• A strong understanding of MLOps, coupled with technologies and methodologies for deploying and monitoring models
• Experience with, and understanding of, at least one public cloud platform (AWS, Azure, or GCP)
• Experience with at least one Data Science tool such as AWS SageMaker, AzureML, Dataiku, DataRobot, H2O, or Jupyter Notebooks
• Hands-on scripting experience with SQL and at least one of Python, Java, or Scala; experience with libraries such as Pandas, PyTorch, TensorFlow, scikit-learn, or similar
• A university degree in computer science, engineering, mathematics, or related fields, or equivalent experience
BONUS POINTS FOR HAVING:
• Experience with Databricks/Apache Spark
• Experience implementing data pipelines using ETL tools
• Experience working in a Data Science role
• Proven success at enterprise software companies
• Vertical expertise in a core vertical such as FSI, Retail, Manufacturing, etc.
Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
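The Data Science lifecycle the posting outlines (feature engineering, model development, deployment) can be sketched end to end in plain Python; the z-score scaler and nearest-centroid model below are generic stand-ins for illustration, not Snowflake or Snowpark APIs.

```python
from statistics import mean, stdev

def scale(column):
    """Feature engineering: standardize a numeric column (z-scores)."""
    m, s = mean(column), stdev(column)
    return [(x - m) / s for x in column]

def fit_centroids(features, labels):
    """Model development: nearest-centroid classifier (one mean per class)."""
    centroids = {}
    for lbl in set(labels):
        pts = [f for f, l in zip(features, labels) if l == lbl]
        centroids[lbl] = mean(pts)
    return centroids

def predict(centroids, x):
    """Model deployment: score a single value against fitted centroids."""
    return min(centroids, key=lambda lbl: abs(x - centroids[lbl]))

raw = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
labels = ["low", "low", "low", "high", "high", "high"]
scaled = scale(raw)
model = fit_centroids(scaled, labels)
pred = predict(model, scaled[0])
```

In a real engagement, each stage would map onto warehouse-side features and a managed training/serving service rather than these toy functions.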

Posted 8 hours ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Company Overview Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM). What you'll do You will play an important role in applying and implementing effective machine learning solutions, with a significant focus on Generative AI. You will work with product and engineering teams to contribute to data-driven product strategies, explore and implement GenAI applications, and deliver impactful insights. This position is an individual contributor role reporting to the Senior Manager, Data Science.
Responsibilities
• Experiment with, apply, and implement DL/ML models, with a strong emphasis on Large Language Models (LLMs), agentic frameworks, and other Generative AI techniques to predict user behavior, enhance product features, and improve automation
• Utilize and adapt various GenAI techniques (e.g., prompt engineering, RAG, fine-tuning existing models) to derive actionable insights, generate content, or create novel user experiences
• Collaborate with product, engineering, and other teams (e.g., Sales, Marketing, Customer Success) to build agentic systems to run campaigns at scale
• Conduct in-depth analysis of customer data, market trends, and user insights to inform the development and improvement of GenAI-powered solutions
• Partner with product teams to design, administer, and analyze the results of A/B and multivariate tests, particularly for GenAI-driven features
• Leverage data to develop actionable analytical insights and present findings, including the performance and potential of GenAI models, to stakeholders and team members
• Communicate models, frameworks (especially those related to GenAI), analysis, and insights effectively with stakeholders and business partners
• Stay updated on the latest advancements in Generative AI and propose their application to relevant business problems
• Complete assignments with a sense of urgency and purpose, identify and help resolve roadblocks, and collaborate with cross-functional team members on GenAI initiatives
Job Designation Hybrid: Employees divide their time between in-office and remote work; access to an office location is required (frequency: minimum 2 days per week; may vary by team, with a weekly in-office expectation). Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign.
Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.
What you bring
Basic
• Bachelor's or Master's degree in Computer Science, Physics, Mathematics, Statistics, or a related field
• 3+ years of hands-on experience in building data science applications and machine learning pipelines, with demonstrable experience in Generative AI projects
• Experience with Python for research and software development purposes, including common GenAI libraries and frameworks
• Experience with or exposure to prompt engineering and utilizing pre-trained LLMs (e.g., via APIs or open-source models)
• Experience with large datasets, distributed computing, and cloud computing platforms (e.g., AWS, Azure, GCP)
• Proficiency with relational databases (e.g., SQL)
• Experience in training, evaluating, and deploying machine learning models in production environments, with an interest in MLOps for GenAI
• Proven track record of contributing to ML/GenAI projects from ideation through to deployment and iteration
• Experience using machine learning and deep learning algorithms such as CatBoost, XGBoost, LGBM, and feed-forward networks for classification, regression, and clustering problems, and an understanding of how these can complement GenAI solutions
• Experience as a Data Scientist, ideally in the SaaS domain with some focus on AI-driven product features
Preferred
• PhD in Statistics, Computer Science, or Engineering with specialization in machine learning, AI, or Statistics, with research or projects in Generative AI
• 5+ years of prior industry experience, with at least 1-2 years focused on GenAI applications
• Previous experience applying data science and GenAI techniques to customer success, product development, or user experience optimization
• Hands-on experience with fine-tuning LLMs or working with RAG methodologies
• Experience with or knowledge of experimentation platforms (like DataRobot) and other AI-related ones (like CrewAI)
• Experience with or knowledge of the software development lifecycle/agile methodology, particularly in AI product development
• Experience with or knowledge of GitHub and JIRA/Confluence
• Contributions to open-source GenAI projects or a portfolio of GenAI-related work
• Programming languages such as Python and SQL; familiarity with R
• Strong knowledge of common machine learning, deep learning, and statistics frameworks and concepts, with a specific understanding of Large Language Models (LLMs), transformer architectures, and their applications
• Ability to break down complex technical concepts (including GenAI) into simple terms to present to diverse, technical, and non-technical audiences
Life at Docusign
Working here: Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live.
Accommodation: Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process please get in touch with our Talent organization at taops@docusign.com for assistance.
Applicant and Candidate Privacy Notice
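Among the GenAI techniques the posting lists, RAG in particular reduces to a small loop: retrieve relevant passages, then build an augmented prompt for the LLM. A toy, self-contained sketch with term overlap standing in for embedding similarity (the documents and wording are invented for illustration):

```python
def tokenize(text):
    return text.lower().split()

def retrieve(query, documents, k=2):
    """Rank documents by term overlap with the query (a toy stand-in
    for embedding similarity) and return the top-k passages."""
    q = set(tokenize(query))
    scored = sorted(documents, key=lambda d: len(q & set(tokenize(d))), reverse=True)
    return scored[:k]

def build_prompt(query, documents):
    """Augment the prompt with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Agreements can be signed electronically with e-signature.",
    "The office cafeteria opens at nine.",
    "Contract lifecycle management tracks agreements after signing.",
]
prompt = build_prompt("How are agreements signed and tracked?", docs)
```

A production system would swap the overlap score for vector similarity over real embeddings and send `prompt` to a model API; the shape of the loop is the same.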

Posted 3 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Data Scientist || 8 Years || Gurgaon
Primary skills:
• Solid experience in building ML models
• Proficient in SQL, Python, PySpark, and Spark ML
• Good understanding of cloud platforms such as AWS (preferred), Azure, or GCP
• Proficient in source code control using GitHub
Secondary skills:
• Experience using AutoML products such as DataRobot or H2O AI
• Provide inputs to build the Artificial Intelligence (AI) roadmap for marketing, based on TE&O architecture and capability delivery timelines
• Accountable for identifying, embedding, promoting, and ensuring continuous improvement in the use of new data and advanced analytics across teams
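SQL proficiency of the kind listed usually means feature aggregation before modeling; a runnable sketch using the stdlib `sqlite3` module as a stand-in warehouse (the schema and values are hypothetical, chosen only to show the pattern):

```python
import sqlite3

# Hypothetical marketing events table; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (customer_id INT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 30.0), (2, 5.0)],
)

# Aggregate per-customer features, as one might before model training.
rows = conn.execute(
    """
    SELECT customer_id, COUNT(*) AS n_events, SUM(amount) AS total_spend
    FROM events
    GROUP BY customer_id
    ORDER BY customer_id
    """
).fetchall()
features = {cid: (n, total) for cid, n, total in rows}
```

The same GROUP BY shape carries over to PySpark (`df.groupBy("customer_id").agg(...)`) against a real warehouse.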

Posted 3 days ago

Apply

5.0 - 8.0 years

0 Lacs

, India

Remote

Job Title: DataRobot Machine Learning Specialist
Relevant Experience: 5-8 years (minimum 2 years hands-on with the DataRobot AutoML platform)
Location: Remote / Hybrid (Preferred: Chennai, India)
Type of Hire: Open to Persons with Disabilities (PwD) and Non-PwD candidates
Availability: Immediate joiners preferred
Role Summary: We are looking for a results-oriented Machine Learning professional with deep expertise in DataRobot to build, deploy, and monitor high-impact ML models across diverse business domains. You will own the end-to-end modeling lifecycle, from data preparation and project setup in DataRobot through production deployment and performance tracking, while guiding stakeholders on best practices for responsible AI.
Must-Have Qualifications
• Bachelor's degree in Computer Science, Data Science, Statistics, or a related field
• 2+ years of direct experience with DataRobot (v7+): AutoPilot, Feature Discovery, MLOps, and APIs
• Strong Python or R skills for custom modeling and data wrangling
• Solid grasp of supervised and unsupervised algorithms, evaluation metrics, and model interpretability techniques
• Hands-on SQL; familiarity with data warehouses (Snowflake, BigQuery, Redshift, etc.)
• Experience deploying models via REST endpoints or message queues (Kafka, RabbitMQ)
• Comfort presenting technical results to business stakeholders
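Deploying models behind REST endpoints, as required above, comes down to assembling scoring requests like the one below. The endpoint URL, auth-header shape, and payload layout are illustrative assumptions (DataRobot's actual prediction API may differ), and the request is built but never sent:

```python
import json
import urllib.request

# Hypothetical endpoint and key: placeholders, not real DataRobot values.
ENDPOINT = "https://example.invalid/predApi/v1.0/deployments/DEPLOY_ID/predictions"
API_KEY = "FAKE-KEY"

def build_scoring_request(records):
    """Assemble a JSON scoring request for a deployed model (not sent here)."""
    body = json.dumps(records).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_scoring_request([{"feature_a": 1.2, "feature_b": "x"}])
```

In practice you would pass `req` to `urllib.request.urlopen` (or use the vendor's client library) and parse the JSON predictions from the response.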

Posted 4 days ago

Apply

8.0 years

0 Lacs

India

Remote

Job Title: DataRobot Machine Learning Specialist
Relevant Experience: 5–8 years (minimum 2 years hands-on with the DataRobot AutoML platform)
Location: Remote / Hybrid (Preferred: Chennai, India)
Type of Hire: Open to Persons with Disabilities (PwD) and Non-PwD candidates
Availability: Immediate joiners preferred
Role Summary: We are looking for a results-oriented Machine Learning professional with deep expertise in DataRobot to build, deploy, and monitor high-impact ML models across diverse business domains. You will own the end-to-end modeling lifecycle, from data preparation and project setup in DataRobot through production deployment and performance tracking, while guiding stakeholders on best practices for responsible AI.
Must-Have Qualifications
• Bachelor’s degree in Computer Science, Data Science, Statistics, or a related field
• 2+ years of direct experience with DataRobot (v7+): AutoPilot, Feature Discovery, MLOps, and APIs
• Strong Python or R skills for custom modeling and data wrangling
• Solid grasp of supervised and unsupervised algorithms, evaluation metrics, and model interpretability techniques
• Hands-on SQL; familiarity with data warehouses (Snowflake, BigQuery, Redshift, etc.)
• Experience deploying models via REST endpoints or message queues (Kafka, RabbitMQ)
• Comfort presenting technical results to business stakeholders
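The message-queue deployment path mentioned above can be sketched with the stdlib `queue` module standing in for a Kafka or RabbitMQ broker; the "model" and records are toy stand-ins, not anything from the posting:

```python
import queue

def score(record):
    """Toy model: flag records whose amount exceeds a threshold."""
    return {"id": record["id"], "flagged": record["amount"] > 100.0}

def consume(broker, results):
    """Drain the queue and score each message, consumer-loop style."""
    while True:
        try:
            record = broker.get_nowait()
        except queue.Empty:
            return
        results.append(score(record))

broker = queue.Queue()
for msg in [{"id": 1, "amount": 250.0}, {"id": 2, "amount": 40.0}]:
    broker.put(msg)

results = []
consume(broker, results)
```

With a real broker, `get_nowait` becomes a blocking poll against a topic or queue, and `results` becomes a write to a downstream sink, but the consume-score-emit loop is the same.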

Posted 5 days ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Software Engineer Dev Python AI/ML REQ/0749
Job Id: REQ/0749
Location: Chennai
Experience: 3 to 8 Years
CTC: 10L to 18L
Date Posted: 15-Jul-2025
Employment Type: Permanent
No. of Openings: 6
Looking for an ML Engineer to design and implement AI/ML models using Python, TensorFlow, PyTorch, and scikit-learn; optimise models, run experiments, transform data via classification/clustering, and stay updated on the latest AI/ML advancements.
Desired Candidate Profile
• 3+ years of experience in Software Design & Development in Python
• Makes pragmatic technical decisions beyond immediate scope
• Strong in debugging complex issues and mentoring junior engineers
• Solid understanding of Data Structures and OOP
• Proficient in TDD, Unit & Integration testing
• Experience with Databases, Statistics, and Data Science
• Skilled in Python; can write robust, testable code
• Hands-on with ML frameworks: Keras, PyTorch, scikit-learn
• AutoML experience is a plus
• Familiar with AI Cloud platforms: H2O, DataRobot, AWS, Azure
Education/Specific Knowledge: Bachelor's or above degree in any discipline
Key Skills: Python, AI/ML, Keras, PyTorch, scikit-learn, H2O, DataRobot, AWS, Azure, FastAPI/Flask, MySQL or Oracle or PostgreSQL, XML, Unit Testing
Highlights: To know the benefits of Sysvine please visit the bottom of this page. We are open to considering candidates who are on a long break but are still passionate about restarting their careers.
Our Benefits India Annual Team Trips Happy Fridays GameVine - Annual Games AimVine - Annual Party Social Responsibilities - Tree Planting, Volunteering for Orphans, Gadget Donations, Blood Donations Camps, Flood Relief Support, Cyclone Relief Support Health Campaigns Birthday Celebrations First Aid & Fire Safety Training Guest Speakers Benefits Accidental Insurance Family Health Insurance Parental Health Insurance Sick Leave Casual Leave Privilege Leave Floating Leave Holidays Short Term Disability Insurance Long Term Disability Insurance Employee Referral Bonus Product Referral Bonus Sodexo Passes Remote Working Flexible Working Hours Maternity Benefit Leave Encashment Tuition Reimbursement Niceties Welcome Kit MacBook Pro iPhones and Android Phones for Mobile Departments Coffee and Biscuits Recreation Room Resting Room Fitness Programmes and Equipment International Traditional Day Personal Tax Management Sessions Shuttle Services from/to Train Big Monitor Recognition Performance Bonus Extra Mile Recognition (EMR) Annual Achievement Awards Special Bonuses Overseas Deputations Leadership Training Programs Technical Conferences Engaging Ethical Diverse Team Lunches D-Day (Difficult Day Policy) I-Day (Inconvenient Day Policy) Technical Conferences Personal Financial Management Sessions Leadership Training Programs Tax Saving Sessions Guest Speakers Benefits Health Insurance Unemployment Insurance Paid Time Off Floating Leaves 8 Holidays Short Term Disability Insurance Workmen Compensation Employee Referral Bonus Product Referral Bonus CalSavers Tuition Reimbursement Recognition Performance Bonus Extra Mile Recognition (EMR) Annual Achievement Awards Special Bonuses Technical Conferences
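The clustering side of the role's "transform data via classification/clustering" can be illustrated with a from-scratch k-means on 1-D toy data, no ML framework required (the points and starting centers are invented):

```python
from statistics import mean

def kmeans_1d(points, centers, iters=10):
    """Plain k-means on 1-D data: assign each point to its nearest
    center, then recompute each center as the mean of its cluster."""
    for _ in range(iters):
        clusters = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            clusters[nearest].append(p)
        centers = [mean(pts) if pts else c for c, pts in clusters.items()]
    return sorted(centers)

points = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
centers = kmeans_1d(points, [0.0, 5.0])
```

Libraries like scikit-learn generalize this to many dimensions with smarter initialization (k-means++), but the assign/recompute loop is the whole algorithm.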

Posted 6 days ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: AI Governance Director
Department: Legal, AI Governance
Job Location: Bangalore/Mumbai
Experience Range: 12+ Years
Job Summary
The AI Governance Director at LTIMindtree will play a pivotal role in shaping and safeguarding the organization’s enterprise-wide AI Governance and Compliance Program. This position acts as a critical bridge between business, technology, cybersecurity, data privacy, and governance functions, ensuring that AI is deployed responsibly, ethically, and in alignment with global regulatory standards. The role will drive the development and continuous evolution of AI policies, conduct Responsible AI assessments, ensure regulatory compliance, and champion stakeholder education. By embedding Responsible AI (RAI) principles into the organization’s DNA, the director will ensure LTIMindtree remains a trusted and forward-thinking leader in the IT services and consulting industry. This role owns the enterprise accountability framework for Responsible AI, with binding authority to enforce compliance. The role will mandate collaboration with various stakeholders and the drafting of standards, toolkits, and frameworks required for Responsible AI adoption. The role will be responsible for championing adoption of governance practices by embedding controls into business workflows, driving cultural change, and measuring policy uptake across teams.
Key Responsibilities
1) AI Compliance Strategy & Governance: Design and lead enterprise-wide Responsible AI governance framework adoption. Develop compliance roadmaps for AI/ML initiatives in collaboration with business and technical stakeholders. Collaborate and coordinate with business and IT leadership on AI risk and ethics governance. Be a part of, and provide inputs to, the AI governance board. Define and institutionalize the "AI risk appetite" and "compliance thresholds" for AI/ML deployments. As part of the AI governance office charter, manage the enterprise-wide AI governance framework aligned with the EU AI Act, NIST AI RMF, OECD AI Principles, and other emerging regulations. Implement, manage, and govern the AI assurance framework.
2) Policy Development & Implementation: Map and maintain the regulatory landscape in line with the Responsible AI framework. Draft and maintain AI-related policies, procedures, and controls across the organization. Work with the AI governance office to maintain regulatory compliance. Ensure AI governance aligns with internal policies and external standards such as ISO, GDPR, HIPAA, AI regulations, and client-specific requirements. Build and manage standard operating procedures (SOPs) and toolkits for AI lifecycle management and risk controls. Collaborate with and assist InfoSec to integrate AI compliance into DevSecOps and MLOps pipelines.
3) Responsible AI Framework Implementation, Governance & Oversight: Manage and improve the Responsible AI assessment frameworks tailored for AI use cases (e.g., bias, security, explainability, and related risks). Collaborate with technology teams to assess AI models and recommend mitigations. Collaborate with technology and quality assurance teams to implement the Responsible AI testing framework. Own and represent AI governance for internal and external audits. Maintain the AI risk register, including use case risk profiling and residual risk monitoring. Implement AI audit mechanisms (model monitoring, impact assessments). Institutionalize AI impact assessments spanning AI inventory, risk categorization, and AI assurance assessments. Ensure all AI systems adopt the AI impact assessment framework throughout the AI lifecycle. Implement, institutionalize, and monitor the AI system approval process.
4) Regulatory Monitoring and Engagement: Track and analyze global regulatory developments (e.g., the EU AI Act, NIST AI RMF, OECD Guidelines, India’s DPDP Act) along with the Privacy office and AI governance office. Map and maintain the regulatory landscape in line with the Responsible AI framework. Act as liaison to legal and government affairs teams to assess the impact of evolving laws. Engage with industry bodies (Partnership on AI, IEEE, ISO) to shape AI standards. Prepare compliance documentation and assist in regulatory or client audits involving AI.
5) Training and Culture Building: Own the design and rollout of Responsible AI training modules across technical, business, and executive audiences. Promote awareness of AI ethics and a responsible innovation culture across the organization. Drive change management and an accountability culture through internal campaigns and workshops. Create AI playbooks and AI toolkits for AI development and deployment teams.
6) Client Engagement & Advisory: Advise clients on the Responsible AI framework and AI governance framework. Support pre-sales and proposals with AI governance insights. Collaborate with the Delivery Excellence team and project teams to ensure AI solutions meet client contractual and regulatory obligations.
7) Accountability & Enforcement: Own end-to-end accountability for implementing the Responsible AI framework, AI governance, AI assurance, AI literacy, Responsible AI toolkit adoption, AI risk management, and AI compliance breaches. Escalate AI deployments failing risk/compliance thresholds to the AI governance office/AIGB.
8) Adoption & Change Management: Drive enterprise-wide adoption of Responsible AI practices, AI policies, and Responsible AI impact assessments through: AI impact assessments; mandatory compliance gates in AI project lifecycles (e.g., ethics review before model deployment); and integration with existing workflows (e.g., SDLC, procurement, sales). Define and track adoption KPIs (e.g., "% of AI projects passing RAI audits").
Key Competencies
• Domain: Strong understanding of the Responsible AI framework and AI governance
• Domain: Understanding of AI regulations (EU AI Act, NIST RMF) and AI ethics
• Technical: AI/ML lifecycle, MLOps, XAI, AI security, agentic AI, GRC tools
• Technical: AI systems assessments and defining assessment parameters and standards
• Leadership: Stakeholder influence, compliance strategy, cross-functional collaboration
• Ability to adopt new technologies, with experience in putting together a compliance framework
• Ability to understand frameworks, translate them into process, and enable effective organizational adoption via frameworks, toolkits, guidelines, etc.
• Excellent communication, presentation, collaboration, and research skills
• Ability to devise frameworks for new technology adoption
• Proactively takes ownership of tasks and sees them through to closure
Required Qualifications
• 12-18 years in Information Technology, Compliance, technical governance, or risk management, with 3+ years in AI/ML-related domains
• Strong knowledge of AI regulatory frameworks (EU AI Act, NIST AI RMF, OECD AI Principles)
• Experience working with cross-functional teams (Delivery, InfoSec, Legal, Data Privacy)
• Familiarity with the AI/ML model lifecycle (training, validation, testing, deployment, monitoring)
Preferred Qualifications (Optional)
• Background in Law, Public Policy, Data Governance, or AI Ethics
• Certifications in AI Governance (AIGB, IAPP CIPM/CIPT, MIT RAII) or Privacy (CIPP/E)
• Experience in global IT services/consulting firms/product companies
• Exposure to data-centric AI product governance or AI MLOps platforms (e.g., Azure ML, SageMaker, DataRobot), agentic AI implementation, etc.
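The risk-register and compliance-gate duties described above can be sketched as a small data structure. The tier names and gating rule below are illustrative assumptions (loosely echoing EU AI Act risk categories), not LTIMindtree's actual framework:

```python
from dataclasses import dataclass, field

# Risk tiers loosely modeled on EU AI Act categories; this mapping is an
# illustrative assumption, not legal guidance or a real framework.
TIERS = ("minimal", "limited", "high", "unacceptable")

@dataclass
class AIUseCase:
    name: str
    tier: str
    mitigations: list = field(default_factory=list)

class RiskRegister:
    """Minimal sketch of the AI risk register the role would maintain."""

    def __init__(self):
        self.entries = []

    def add(self, use_case):
        if use_case.tier not in TIERS:
            raise ValueError(f"unknown tier: {use_case.tier}")
        self.entries.append(use_case)

    def needs_review(self):
        """Use cases at or above 'high' that must pass a compliance gate."""
        return [u.name for u in self.entries
                if u.tier in ("high", "unacceptable")]

register = RiskRegister()
register.add(AIUseCase("resume screening", "high", ["bias audit"]))
register.add(AIUseCase("spell checker", "minimal"))
```

A production register would live in a GRC tool with audit trails and residual-risk scores; the point here is only that risk profiling and compliance gating are straightforward to encode once the tiers are defined.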

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

NTT DATA is looking for an Azure Kubernetes Service (AKS) with DataRobot specialist to join their team in Bangalore, Karnataka (IN-KA), India. As a potential candidate, you should have a Bachelor's/Master's degree in Computer Science or Data Science with 5 to 8 years of experience in software development and data structures/algorithms. Additionally, you should possess 5 to 7 years of experience in programming languages such as Python or Java, and in database languages, both SQL and NoSQL. Your role will involve working on developing large-scale platforms, distributed systems, and networks, with experience in compute technologies and storage architecture. A strong understanding of microservices architecture is essential, as well as experience in building AKS applications on Azure. You should also have a solid grasp of Kubernetes for ensuring the availability and scalability of applications in Azure Kubernetes Service. Experience in deploying applications on Azure using third-party tools like Docker, Kubernetes, and Terraform will be beneficial. Moreover, familiarity with AKS clusters, VNETs, NSGs, Azure storage technologies, and Azure container registries, and experience building Redis, ElasticSearch, and MongoDB applications, is required. Knowledge of RabbitMQ, the ELK stack, Azure Monitor, DataDog, Splunk, and logging stacks is also expected. Proficiency in development tools and CI/CD pipelines such as GitLab CI/CD, Artifactory, CloudBees, Jenkins, Helm, and Terraform, an understanding of IAM roles on Azure, and integration/configuration experience are necessary. Experience in working on DataRobot setup or similar applications on Cloud/Azure, along with functional, integration, and security testing, and performance validation, will be part of your responsibilities.
About NTT DATA: NTT DATA is a $30 billion global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Global Top Employer, they have diverse experts worldwide and a strong partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success. They are one of the leading providers of digital and AI infrastructure globally. NTT DATA is part of the NTT Group, which invests over $3.6 billion annually in R&D to support organizations and society's move towards a confident and sustainable digital future. Visit their website at us.nttdata.com.
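The availability and scalability concerns the posting emphasizes usually surface as Kubernetes Deployment manifest settings. The sketch below models a Deployment as a plain Python dict and checks two of them (replica count and a readiness probe); all names are hypothetical and no Azure or Kubernetes API is called:

```python
# Hypothetical names throughout; this mirrors the shape of a Kubernetes
# Deployment manifest rather than calling any Azure or AKS API.
manifest = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "scoring-service"},
    "spec": {
        "replicas": 3,
        "template": {
            "spec": {
                "containers": [{
                    "name": "scoring",
                    "image": "myregistry.azurecr.io/scoring:1.0",
                    "readinessProbe": {
                        "httpGet": {"path": "/health", "port": 8080},
                    },
                }]
            }
        },
    },
}

def check_availability(m):
    """Basic availability checks: multiple replicas and a readiness
    probe on every container, so rollouts never route to cold pods."""
    containers = m["spec"]["template"]["spec"]["containers"]
    return m["spec"]["replicas"] >= 2 and all(
        "readinessProbe" in c for c in containers
    )

ok = check_availability(manifest)
```

In a real pipeline the same checks would run as policy gates (e.g., in CI against rendered Helm or Terraform output) before the manifest reaches the cluster.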

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 10 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.
Technical Requirements:
• Minimum 3 years of experience in Hadoop Administration, working in production support projects
• Cloudera Certified Hadoop Administrator certification (must)
• Experience in installing, configuring, maintaining, troubleshooting, and monitoring Hadoop clusters and the following components: HDFS, HBase, Hive, Sentry, Hue, YARN, Sqoop, Spark, Oozie, ZooKeeper, Flume, Solr
• Experience in installing, configuring, maintaining, troubleshooting, and monitoring the following analytical tools and integrating them with Hadoop: Datameer, Paxata, DataRobot, H2O, MRS, Python, RStudio, SAS, Dataiku, BlueData
• Very good at job-level troubleshooting: YARN, Impala, and other components (must)
• Experience and strong knowledge of Unix/Linux scripting (must)
• Experience and knowledge of the following tools: Talend, MySQL Galera, Pepperdata, Autowatch, NetBackup, Solix, uDeploy, RLM
• Troubleshoot development and production application problems across multiple environments and operating platforms
Additional Responsibilities: Knowledge of design principles and fundamentals of architecture; understanding of performance engineering; knowledge of quality processes and estimation techniques; basic understanding of the project domain; ability to translate functional and non-functional requirements into system requirements; ability to design and code complex programs; ability to write test cases and scenarios based on the specifications; good understanding of SDLC and agile methodologies; awareness of the latest technologies and trends; logical thinking and problem-solving skills, along with an ability to collaborate.
Preferred Skills: Technology->Big Data - Hadoop->Hadoop, Technology->Big Data - Hadoop->Hadoop Administration->Hadoop
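Cluster monitoring of the kind described often means scripting around tool output. The hedged sketch below parses a mocked excerpt shaped like `hdfs dfsadmin -report` output; the exact report format varies by Hadoop version, so the field lines and alert thresholds here are assumptions for illustration:

```python
import re

# Mocked excerpt of an `hdfs dfsadmin -report`; field names are illustrative.
REPORT = """\
Configured Capacity: 1000
DFS Used%: 72.5%
Live datanodes (3):
Dead datanodes (1):
"""

def parse_report(text):
    """Pull the health numbers an administrator would alert on."""
    used = float(re.search(r"DFS Used%:\s*([\d.]+)%", text).group(1))
    live = int(re.search(r"Live datanodes \((\d+)\)", text).group(1))
    dead = int(re.search(r"Dead datanodes \((\d+)\)", text).group(1))
    return {"used_pct": used, "live": live, "dead": dead}

health = parse_report(REPORT)
# Hypothetical thresholds: flag high DFS usage and any dead datanodes.
alerts = [name for name, firing in [
    ("capacity", health["used_pct"] > 70),
    ("dead_nodes", health["dead"] > 0),
] if firing]
```

In production the same logic would usually run as a cron/shell wrapper feeding a monitoring system, rather than asserting inline.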

Posted 1 week ago

Apply

8.0 years

0 Lacs

Sholinganallur, Tamil Nadu, India

On-site

Role: MLE + Vertex AI
Mode: Permanent, full time
Exp: 4-8 years
Job Description: The candidate should be a self-starter, able to contribute independently in the absence of guidance. Strong Vertex AI experience, including moving multiple MLE workloads onto Vertex AI, is a prerequisite. The client is not looking to act as guides/mentors. They are seeking an MLE with hands-on experience in delivering machine learning solutions using Vertex AI and strong Python skills. The person must have 5+ years of experience, with 3+ in MLE, and advanced knowledge of machine learning, engineering industry frameworks, and professional standards. Demonstrated proficiency using cloud technologies and integrating with ML services, including GCP Vertex AI, DataRobot, or AWS SageMaker, in large and complex organisations, and experience with SQL and Python environments. Experience in technology delivery, both waterfall and agile. Python and SQL skills. Experience with distributed programming (e.g. Apache Spark, PySpark). Software engineering experience/skills. Experience working with big data cloud platforms (Azure, Google Cloud Platform, AWS). DevOps and CI/CD experience. Experience with unit testing and TDD. Experience with infrastructure as code. Direct client interaction.
Must-Have Skills: Vertex AI, MLE, AWS, Python, SQL
Interested candidates can reach us @7338773388 or careers@w2ssolutions.com & hr@way2smile.com
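The unit-testing/TDD expectation above can be illustrated with a self-contained example: a small feature transform plus the tests that pin down its behavior. The transform and its tests are hypothetical, not from the client's codebase or the Vertex AI SDK:

```python
import unittest

def clip_and_fill(values, lo=0.0, hi=1.0, default=0.0):
    """Feature transform: replace missing values, then clip to [lo, hi]."""
    return [min(max(v if v is not None else default, lo), hi)
            for v in values]

class TestClipAndFill(unittest.TestCase):
    def test_fills_missing(self):
        self.assertEqual(clip_and_fill([None, 0.5]), [0.0, 0.5])

    def test_clips_range(self):
        self.assertEqual(clip_and_fill([-1.0, 2.0]), [0.0, 1.0])

suite = unittest.TestLoader().loadTestsFromTestCase(TestClipAndFill)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a TDD workflow these tests would be written first and run in CI on every change, which matters doubly when the transform is reused between offline training and an online serving pipeline.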

Posted 1 week ago

Apply

3.0 - 8.0 years

0 - 0 Lacs

chennai, tamil nadu

On-site

The position based in Chennai requires an ML Engineer with 3 to 8 years of experience to design and implement AI/ML models using Python, TensorFlow, PyTorch, and Scikit-learn. The role involves optimizing models, running experiments, transforming data through classification and clustering techniques, and staying updated on the latest advancements in AI/ML technologies. The ideal candidate should possess over 3 years of experience in Software Design & Development in Python, demonstrating the ability to make pragmatic technical decisions beyond immediate scope. The candidate should excel in debugging complex issues and mentoring junior engineers, with a solid understanding of Data Structures and Object-Oriented Programming. Proficiency in Test-Driven Development (TDD), Unit & Integration testing, experience with Databases, Statistics, and Data Science, along with strong Python skills to write robust and testable code, are essential. Moreover, the candidate should have hands-on experience with ML frameworks such as Keras, PyTorch, and Scikit-learn, and familiarity with AutoML would be a plus. Knowledge of AI Cloud platforms like H2O, DataRobot, AWS, and Azure is desirable. The candidate should hold a Bachelor's degree or higher in any discipline. Key skills required for the role include Python, AI/ML, Keras, PyTorch, Scikit-learn, H2O, DataRobot, AWS, Azure, FastAPI/Flask, and proficiency with databases like MySQL, Oracle, or PostgreSQL. Familiarity with XML and Unit Testing is also necessary. Candidates who are passionate about restarting their careers after a long break are encouraged to apply.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description: We are looking for a skilled Data Science Specialist with a strong background in Python programming and a solid understanding of machine learning and data science concepts. The ideal candidate will be responsible for maintaining and optimizing existing data science models, conducting experiments on managed ML platforms, and translating complex data insights into actionable Python and SQL code. Key Responsibilities: • Maintain and optimize existing data science models, ensuring their continued effectiveness and relevance • Conduct experiments and analyses using managed ML platforms such as DataRobot • Translate complex data calculations and insights from Tableau datasources into production-ready SQL and Python code • Collaborate with cross-functional teams to understand business requirements and provide data-driven solutions • Perform in-depth data analysis to uncover trends, patterns, and opportunities for improvement • Stay up-to-date with the latest developments in machine learning and data science technologies Requirements: • 5+ years of total experience • Strong proficiency in Python programming, with experience in data manipulation and analysis libraries • Solid understanding of fundamental machine learning and data science concepts • Experience working with managed ML platforms, preferably DataRobot • Proficiency in SQL for data extraction and manipulation • Ability to interpret and translate complex data calculations from visualization tools like Tableau • Strong analytical and problem-solving skills • Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders • Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field Preferred Qualifications: • Experience with cloud-based data platforms and big data technologies • Familiarity with version control systems (e.g., Git) • Knowledge of data privacy and security best practices
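Translating a Tableau calculation into production Python/SQL often amounts to mapping a level-of-detail (LOD) expression onto a groupby or window function. A hypothetical example, with invented column names and figures:

```python
import pandas as pd

# Hypothetical sales data standing in for a Tableau datasource.
df = pd.DataFrame({
    "region": ["N", "N", "S", "S", "S"],
    "order_id": [1, 2, 3, 4, 5],
    "sales": [100.0, 200.0, 50.0, 150.0, 100.0],
})

# Tableau: { FIXED [region] : AVG([sales]) }  ->  pandas groupby-transform.
df["region_avg_sales"] = df.groupby("region")["sales"].transform("mean")

# Equivalent SQL window function:
#   AVG(sales) OVER (PARTITION BY region) AS region_avg_sales
df["vs_region_avg"] = df["sales"] - df["region_avg_sales"]
print(df)
```

The `transform("mean")` call broadcasts the per-region average back onto every row, exactly as a FIXED LOD does in Tableau, which makes the translation to a SQL `PARTITION BY` window nearly mechanical.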

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Analyst in the Financial Crime Surveillance Operations (FCSO) Performance and Metrics Management function at Standard Chartered Bank, your primary responsibility will be to interpret data and transform it into valuable information that influences business processes and decisions within FCSO. You will gather data from various sources, analyze patterns and trends, and present the information in a digestible format through the FCSO Scorecard. It is essential to possess strong analytical skills and a keen curiosity to comprehend and derive meaning from data. Your key responsibilities will include acquiring a detailed understanding of data sourcing and visualization tools, defining clear business requirements for FCSO data, creating and maintaining documentation for data extraction processes, collaborating with downstream business process owners to enhance data quality and effectiveness, analyzing upstream changes impacting FCSO data, identifying areas for process improvement, producing insightful dashboards and reports for stakeholders, and participating in Agile Ceremonies as a functional data expert. You will work closely with the FCSO Management Team, Data Squads, Data Quality Analysts, upstream data teams, and downstream Process Owners to meet data requirements and facilitate data transformation. Additionally, you will be responsible for embedding ethical conduct and regulatory compliance in all data-related activities, following change governance processes, and resolving risk and compliance matters collaboratively. To excel in this role, you should have 8-10 years of industry experience as a Business/Data Analyst, with expertise in data analysis using tools such as Tableau, Dataiku, MSTR, SQL, and Excel. Proficiency in data management techniques, advanced technical skills, and knowledge of Agile development methodologies are essential. 
Strong stress management and communication skills are crucial for effective collaboration with cross-functional teams and stakeholders. As part of the FCSO Data and Reporting team, you will contribute to strategic solutions and initiatives, drive business requirements for data management, and support risk management efforts. Continuous learning and adherence to Standard Chartered Bank's values and code of conduct are integral to your role as a Data Analyst. If you are passionate about leveraging data to drive business decisions, thrive in a dynamic environment, and possess the necessary skills and experience, we invite you to join our team at Standard Chartered and contribute to our mission of driving commerce and prosperity through diversity and inclusion. For more information and to explore career opportunities with us, visit www.sc.com/careers.

Posted 2 weeks ago

Apply

0.0 - 2.0 years

15 - 18 Lacs

Bengaluru District, Karnataka

On-site

Job Description: We are looking for a skilled Data Science Specialist with a strong background in Python programming and a solid understanding of machine learning and data science concepts. The ideal candidate will be responsible for maintaining and optimizing existing data science models, conducting experiments on managed ML platforms, and translating complex data insights into actionable Python and SQL code. Key Responsibilities: Maintain and optimize existing data science models, ensuring their continued effectiveness and relevance Conduct experiments and analyses using managed ML platforms such as DataRobot Translate complex data calculations and insights from Tableau datasources into production-ready SQL and Python code Collaborate with cross-functional teams to understand business requirements and provide data-driven solutions Perform in-depth data analysis to uncover trends, patterns, and opportunities for improvement Stay up-to-date with the latest developments in machine learning and data science technologies Requirements: 5+ years of total experience Strong proficiency in Python programming, with experience in data manipulation and analysis libraries Solid understanding of fundamental machine learning and data science concepts Experience working with managed ML platforms, preferably DataRobot Proficiency in SQL for data extraction and manipulation Ability to interpret and translate complex data calculations from visualization tools like Tableau Strong analytical and problem-solving skills Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field Preferred Qualifications: Experience with cloud-based data platforms and big data technologies Familiarity with version control systems (e.g., Git) Knowledge of data privacy and security best practices Job Type: Contractual / Temporary Contract length: 12 months Pay: 
₹1,500,000.00 - ₹1,800,000.00 per year Experience: DataRobot: 2 years (Required) Location: Bengaluru District, Karnataka (Required) Work Location: In person

Posted 2 weeks ago

Apply

8.0 - 20.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Dear Aspirant, Greetings from TCS! TCS presents an excellent opportunity for Data Science & AI/ML Architect (Traditional AI & Generative AI) Exp: 8-20 Years Job Location: Chennai / Bangalore / Hyderabad / Mumbai / Pune / Kolkata / Delhi / Noida / Gurgaon ●Develop scalable AI/ML solutions that integrate seamlessly with existing systems and align with business objectives ●Experience in defining and designing robust AI/ML architectures on cloud platforms such as Azure, AWS, or Google Cloud ●Hands-on experience implementing solutions using RAG, Agentic AI, LangChain, MLOps ●Experience in implementing ethical AI practices and ensuring responsible AI usage in solutions ●Proficient in using tools like TensorFlow, PyTorch, Hugging Face Transformers, OpenAI GPT, Stable Diffusion, DALL-E, AWS SageMaker, Azure ML, and Azure Databricks to develop and deploy generative AI models across cloud environments ●Experience in some of the industry-renowned tools for AI/ML workload implementation like Dataiku, DataRobot, RapidMiner, etc. ●Exposure to complex AI/ML solutions with computer vision/NLP, etc. ●Collaborates with Infrastructure and Security Architects to ensure alignment with enterprise standards and designs ●Strong oral and written communication skills; good presentation skills ●Analytical skills, business orientation & acumen (exposure)

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary A Data Analyst in the Financial Crime Surveillance Operations (FCSO) Performance and Metrics Management function interprets data and helps turn it into information that enables or improves a business process, thus affecting business decisions within FCSO. The FCSO Data Analyst gathers information from various sources and interprets patterns and trends to make it digestible for others, where it is then reported in the FCSO Scorecard. They must have strong analytical skills, but above all a burning curiosity to understand, and make sense of, data. Responsibilities Acquire a detailed understanding of the tools for sourcing, visualising, transforming and analysing the data required to manage FCSO Performance metrics and Scorecard Define clear, concise and detailed business requirements for FCSO Data that clearly document the data elements and formats that are needed, outline detailed transformation expectations and list the critical data elements that will enable downstream processes to operate effectively Create and maintain documentation that articulates the process by which data is extracted, transformed and loaded in FCSO that can be shared and understood by others Work with downstream FCSO business process owners to constantly improve, refine and expand the datasets to improve the quality and effectiveness of those processes, as well as help them to make sense of the data, providing training where required, and derive meaningful BI / MI Conduct detailed analysis of upstream changes that impact FCSO data – for example the introduction of a new product – to ensure that requirements remain up to date and define any new ones as necessary Identify areas of overlap or data gaps that can lead to increased value, either by eliminating redundant processes or expanding existing data models Produce accurate and insightful dashboards and reports detailing the health, content and insights available from the data, making that actionable for
stakeholders and meaningful for management decision making Participate in Agile Ceremonies as a functional data expert and work with a cross-functional agile team Innovate with how we present data to senior management to make actionable insights and metrics enabling business to take data-driven decisions. Strategy Work for FCSO Data and Reporting team strategic solutions & initiatives Business Define clear, concise and detailed business requirements for FCSO Data that clearly document the data elements and formats that are needed, outline detailed transformation expectations and list the critical data elements that will enable downstream processes to operate effectively Key Responsibilities Governance Follow TTO and FCSO change governance process, document all the changes and communicate with stakeholders for UVT (User Verification Testing). Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. 
Key stakeholders They Will Work Closely With FCSO Management Team, who provide the team priorities in terms of metrics to be reported and managed, requirements, objectives, and strategy FCSO Data Squads, who are managing the MI transformation and working with the FCSO Performance and Metrics Management team to define, prioritise, and operationalise the use of the FCSO metrics FCSO Data Quality Analysts, who define data quality control requirements and oversee these on a day to day basis to ensure constant system health Upstream data teams, who provide the data that the analyst is sourcing Downstream Process Owners, who depend on the data to perform their business function Data Analysts spend much of their time working with stakeholders to define data requirements, data transformation logic and supporting the delivery of these requirements from start to finish. They are experts in profiling data to understand its contents and will also have a working understanding of the business process or product that generated it in the first place. Data Analysts are the entry point to the FCSO Data Team for most external stakeholders and as such will have a broad, but still detailed, understanding of all the data available and constantly seek opportunities for innovation and expansion. They are the primary liaison between up- and downstream teams. 
Other Responsibilities Embed Here for good and Group’s brand and values in India / OPS FCSO / Data and Reporting; Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures; Multiple functions (double hats); Processes Work with downstream FCSO business process owners to constantly improve, refine and expand the datasets to improve the quality and effectiveness of those processes, as well as help them to make sense of the data, providing training where required, and derive meaningful BI / MI People & Talent Learn all the FCSO processes, systems and data regularly and apply the knowledge in the Data and MI ETL (Extraction, Transformation and Loading) and Reports development. Risk Management Learn the FCSO risk management framework, raise issues in M7 diligently and close them in a timely manner. Knowledge: Advanced data management techniques with extensive experience. 8-10 years of industry experience as a Business/Data Analyst with 6-8 years’ experience in data analysis using tools such as Tableau, Dataiku, MSTR, SQL and Excel. Technical Skills: Tableau, MSTR, Dataiku, Python, SQL. Practical knowledge of data in various forms (data warehouses/SQL, unstructured data environments/Pig, Hive, Impala, PySpark, Think-cell and pivot tables in Excel); Experience working within process management and improvement methodologies – Lean, Six Sigma, etc. – and demonstrating knowledge of data governance, data quality management concepts and data quality tools (e.g. Informatica DQ); Understanding of Agile development methodologies, software design patterns, network design and architecture; Experience in quantitative analysis. 
Past work experience using both Tableau and Dataiku/PySpark will be an added advantage Stress Management: The data analyst must be able to work well under pressure and achieve results within the scheduled timeframe Communication skills: The role of a data analyst involves working with various cross-functional teams, technology, and the management team. It is crucial that they have exceptional writing and verbal communication skills to perform their job duties effectively. Skills And Experience Data Analytics and Visualisation Tools – Tableau (Preferable), PowerBI, Dataiku (Preferable), MSTR, DataRobot or Paxata FCC/FCSO Knowledge/past work experience Microsoft Office: PPT, Excel, Macros Agile tools: Confluence, JIRA; SQL, Python, PySpark Qualifications EDUCATION Graduate / Master’s degree and 8-10 years of Banking Industry experience in data analysis using Tableau & Dataiku/SQL CERTIFICATIONS Tableau (Preferable), Dataiku, MSTR, Python, SQL, PySpark. Practical knowledge of data in various forms (data warehouses/SQL, unstructured data environments/Pig, Hive, Impala, Think-cell and pivot tables in Excel) LANGUAGES ENGLISH About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. 
Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are hiring for an Analyst/AI-ML role with one of the leading product-based clients in Bangalore. Position: Data Analyst/Sr. Data Analyst/Associate Revenue Manager Job Location: Bangalore Pedigree: Tier 1 institution Notice: immediate or 30 days max Experience in creating, collecting, analysing and communicating business insights from data. ● Experience working on metrics such as retention, churn, engagement and marketing analytics. ● Knowledge of analytics & self-service BI approaches and predictive analytics solutions is a strong advantage. ● Strong background in SQL and experience working in Python. ● Experience in statistical techniques (R, hypothesis testing). ● Should have worked on Tableau, Google Analytics and CleverTap. ● Knowledge of DataRobot, Azure or AWS AI/ML will be a plus. Position 2 - DATA SCIENTIST (AI/ML) Develop, deploy and maintain ML/Deep Learning models in production or build the regular cadence to facilitate the same ● Work closely with the Engineering team to strategise and execute the development of Data Science projects. ● Communicate progress/findings to relevant stakeholders. ● Assist in defining short- and long-term roadmaps that help growth and demonstrate impact at the organisational level. ● Up-to-date knowledge in the field of technical and industry developments. - 2-5 years of experience with a Bachelor's/Master's degree in Statistics, Applied Mathematics, or a related discipline ● Effective communication of DS requirements and dissemination of the right knowledge to relevant stakeholders. ● Experience with Spark, Flask, SQL, Python, cloud platforms. ● Experience in Machine Learning, Deep Learning and engineering aspects of ML model deployment. ● Proficiency with data analysis, mathematics/probability, and statistical analysis. ● Experience in predictive modelling exercises. ● Comfort working in a dynamic, fast paced, hands-on and ma
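The "statistical techniques (hypothesis testing)" requirement can be sketched with a two-sample test on synthetic engagement data. Everything here is illustrative: the cohort sizes, effect size, and 0.05 threshold are assumptions, and Welch's t-test is just one reasonable choice of test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic weekly engagement scores for two cohorts (illustrative only).
control = rng.normal(loc=5.0, scale=1.2, size=400)
variant = rng.normal(loc=5.3, scale=1.2, size=400)

# Welch's t-test: does the variant cohort engage more, on average?
t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)
significant = p_value < 0.05
print(f"t={t_stat:.2f}, p={p_value:.4f}, significant={significant}")
```

For retention/churn rates (proportions rather than means), a two-proportion z-test or chi-squared test would be the analogous choice.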

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business. JOB LOCATION: Bangalore, Mumbai, Pune, Gurgaon, Chennai, Hyderabad, Coimbatore, Noida Job Description Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients. Responsibilities As MLOps Engineer, you will work collaboratively with Data Scientists and Data Engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline model development and model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments. Enable model tracking, model experimentation and model automation Develop scalable ML pipelines Develop MLOps components in the machine learning development life cycle using Model Repository (either of): MLflow, Kubeflow Model Registry Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku or any relevant ML E2E PaaS/SaaS Work across all phases of the model development life cycle to build MLOps components Build the knowledge base required to deliver increasingly complex MLOps projects on the Cloud (AWS, Azure, GCP) / on-prem Be an integral part of client business development and delivery engagements across multiple domains. 
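The model-tracking and experimentation responsibilities above are usually handled by MLflow or a Kubeflow model registry; as a hedged, dependency-free illustration of the shape of that workflow, here is a minimal stand-in tracker (all class and file names are hypothetical, not any product's API):

```python
import json
import time
import uuid
from pathlib import Path

class RunTracker:
    """Minimal MLflow-style run tracker: logs params/metrics to JSON files.

    A stand-in sketch only; a real pipeline would use MLflow or a
    model registry such as Kubeflow's, as the listing mentions.
    """

    def __init__(self, root="mlruns_local"):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def start_run(self, name):
        # Each run gets a unique id and its own params/metrics dicts.
        return {"id": uuid.uuid4().hex, "name": name,
                "start": time.time(), "params": {}, "metrics": {}}

    def log_param(self, run, key, value):
        run["params"][key] = value

    def log_metric(self, run, key, value):
        run["metrics"][key] = value

    def end_run(self, run):
        # Persist the run so experiments can be compared later.
        path = self.root / f"{run['id']}.json"
        path.write_text(json.dumps(run, indent=2))
        return path

tracker = RunTracker()
run = tracker.start_run("baseline-model")
tracker.log_param(run, "n_estimators", 100)
tracker.log_metric(run, "auc", 0.87)
artifact = tracker.end_run(run)
```

The point of the sketch is the contract — runs with ids, logged params and metrics, persisted artifacts — which is what MLflow's `start_run`/`log_param`/`log_metric` API formalizes at scale.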
QUALIFICATIONS: REQUIRED QUALIFICATIONS: 3-5 years' experience building production-quality software Strong experience in System Integration, Application Development or Data Warehouse projects across technologies used in the enterprise space Basic knowledge of MLOps, machine learning and Docker Object-oriented languages (e.g. Python, PySpark, Java, C#, C++) Experience developing CI/CD components for a production-ready ML pipeline Database programming using any flavor of SQL Knowledge of Git for source code management Ability to collaborate effectively with highly technical resources in a fast-paced environment Ability to solve complex challenges/problems and rapidly deliver innovative solutions Team handling, problem solving, project management and communication skills & creative thinking Foundational knowledge of Cloud Computing in either one of AWS, Azure or GCP Hunger and passion for learning new skills Education B.E/B.Tech/M.Tech in Computer Science or related technical degree OR equivalent. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Job Description: DataRobot delivers AI that maximizes impact and minimizes business risk. Our platform and applications integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business — today and in the future. DataRobot accelerates the process of building predictive models to get the most out of valuable data. We work hard to create tools that nascent data scientists can use effectively while also exposing the rich detail and control that data science veterans rely on. In order to keep up with the demand for new features in DataRobot, we’re looking to grow the team by bringing on a Senior Backend Engineer. The primary responsibilities of this team include developing and supporting new data science tools, designing and supporting our APIs, and instrumenting DataRobot to integrate with enterprise IT infrastructure. Our team leverages modern technologies to achieve our goals and innovate on our solutions. Key Responsibilities: Develop, test, and support features of DataRobot. Create and maintain automated unit tests and functional tests. Design infrastructure for new features with the input of peers. Manage individual projects and milestones with abundant communication of progress. Seek, give, and receive critical feedback in a constructive manner, including but not limited to code reviews. Engage in engineering on-call escalated support of services owned by the team. Competencies should be at a level where a manager can have high confidence in an engineer’s ability to deliver complex solutions on time on an agreed-upon roadmap and manage technical risks. Should be capable of working with product management to get requirements and drive technical feedback on complexity/approaches. 
Knowledge, Skills & Abilities: 5+ years of proven experience writing high-quality code in a collaborative environment preferably using Python and/or Go Strong Computer Science fundamentals in object-oriented design, data structures, algorithm design, problem-solving, and complexity analysis. An understanding of design for scalability, performance, and reliability. Deep experience with automated testing and test-driven development Demonstrable knowledge of software architecture for large systems Real-world experience decoupling monolithic software into smaller reusable components Self-motivated and proactive, able to take ownership and deliver results. Ability and willingness to learn about new technologies. Personal drive to get things finished. Effective communication behavior. Fundamental understanding of Kubernetes and Helm. Experience in building and running software systems on Kubernetes clusters in production Hands-on experience with infrastructure provisioning and configuration using Infrastructure as Code (IaC) principles Nice to have: Experience with AWS, Azure, and/or Google Cloud platforms CKAD (Certified Kubernetes Application Developer) certification Publicly reviewable contributions to interesting development projects. Experience supporting user-facing code and APIs. Data Science experience Identity and Access Management experience CI/CD pipeline experience The talent and dedication of our employees are at the core of DataRobot’s journey to be an iconic company. We strive to attract and retain the best talent by providing competitive pay and benefits with our employees’ well-being at the core. Here’s what your benefits package may include depending on your location and local legal requirements: Medical, Dental & Vision Insurance, Flexible Time Off Program, Paid Holidays, Paid Parental Leave, Global Employee Assistance Program (EAP) and more! 
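Two of the skills above — automated testing and decoupling monolithic software into reusable components — can be sketched together. This is an illustrative pattern, not DataRobot's actual architecture: the interface and component names are invented.

```python
from typing import Protocol

class ModelStore(Protocol):
    """Interface extracted from a hypothetical monolith: any storage
    backend (database, object store, in-memory) can satisfy it
    independently, so the component is reusable and swappable."""
    def save(self, name: str, payload: dict) -> None: ...
    def load(self, name: str) -> dict: ...

class InMemoryStore:
    """Test double: lets the component be unit-tested with no database."""
    def __init__(self):
        self._data = {}

    def save(self, name, payload):
        self._data[name] = payload

    def load(self, name):
        return self._data[name]

def register_model(store: ModelStore, name: str, version: int) -> dict:
    """Component logic depends only on the interface, not on a backend."""
    record = {"name": name, "version": version}
    store.save(name, record)
    return store.load(name)

# Automated check in the spirit of test-driven development:
store = InMemoryStore()
assert register_model(store, "churn-model", 3) == {"name": "churn-model", "version": 3}
```

Because `register_model` knows only the `ModelStore` protocol, the same component works against a production backend and a test double alike, which is the core move in decoupling a monolith.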
DataRobot Operating Principles: Wow Our Customers Set High Standards Be Better Than Yesterday Be Rigorous Assume Positive Intent Have the Tough Conversations Be Better Together Debate, Decide, Commit Deliver Results Overcommunicate Research shows that many women only apply to jobs when they meet 100% of the qualifications while many men apply to jobs when they meet 60%. At DataRobot we encourage ALL candidates, especially women, people of color, LGBTQ+ identifying people, differently abled, and other people from marginalized groups to apply to our jobs, even if you do not check every box. We’d love to have a conversation with you and see if you might be a great fit. DataRobot is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. DataRobot is committed to working with and providing reasonable accommodations to applicants with physical and mental disabilities. Please see the United States Department of Labor’s EEO poster and EEO poster supplement for additional information. All applicant data submitted is handled in accordance with our Applicant Privacy Policy.

Posted 4 weeks ago

Apply

8.0 - 10.0 years

18 - 27 Lacs

Gurugram

Hybrid

JD Primary skills: • Solid experience in building ML models. • Proficient in SQL, Python, PySpark, and Spark ML. • Good understanding of cloud platforms such as AWS (preferred), Azure or GCP. • Proficient in source code control using GitHub. Secondary skills: • Experience using any AutoML products like DataRobot / H2O AI. • Provide inputs to build the Artificial Intelligence (AI) roadmap for marketing based on TE&O architecture and capability delivery timelines. • Accountable for identifying, embedding, promoting, and ensuring continuous improvement within the use of new data and advanced analytics across the teams
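The AutoML products the listing mentions (DataRobot, H2O) automate, among much else, fitting several candidate models and ranking them on a validation metric. A deliberately tiny sketch of that leaderboard idea, with illustrative models and synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic binary-classification data standing in for a real marketing dataset.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# A tiny leaderboard in the spirit of AutoML tools like DataRobot / H2O:
candidates = {
    "logreg": LogisticRegression(max_iter=500),
    "rf": RandomForestClassifier(n_estimators=50, random_state=0),
    "gbm": GradientBoostingClassifier(random_state=0),
}

# Score each candidate with 5-fold cross-validated AUC and rank descending.
leaderboard = sorted(
    ((name, cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean())
     for name, m in candidates.items()),
    key=lambda item: item[1], reverse=True,
)
for name, auc in leaderboard:
    print(f"{name}: AUC={auc:.3f}")
```

Commercial AutoML layers feature engineering, hyperparameter search, and deployment on top of this loop, but the rank-models-by-validated-metric core is the same.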

Posted 1 month ago

Apply

3.0 years

3 - 8 Lacs

Bengaluru

Remote

Company Overview Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM). What You'll Do You will play an important role in applying and implementing effective machine learning solutions, with a significant focus on Generative AI. You will work with product and engineering teams to contribute to data-driven product strategies, explore and implement GenAI applications, and deliver impactful insights. This position is an individual contributor role reporting to the Senior Manager, Data Science. 
Responsibilities
• Experiment with, apply, and implement DL/ML models, with a strong emphasis on Large Language Models (LLMs), agentic frameworks, and other Generative AI techniques, to predict user behavior, enhance product features, and improve automation
• Utilize and adapt various GenAI techniques (e.g., prompt engineering, RAG, fine-tuning existing models) to derive actionable insights, generate content, or create novel user experiences
• Collaborate with product, engineering, and other teams (e.g., Sales, Marketing, Customer Success) to build agentic systems that run campaigns at scale
• Conduct in-depth analysis of customer data, market trends, and user insights to inform the development and improvement of GenAI-powered solutions
• Partner with product teams to design, administer, and analyze the results of A/B and multivariate tests, particularly for GenAI-driven features
• Leverage data to develop actionable analytical insights and present findings, including the performance and potential of GenAI models, to stakeholders and team members
• Communicate models, frameworks (especially those related to GenAI), analysis, and insights effectively with stakeholders and business partners
• Stay updated on the latest advancements in Generative AI and propose their application to relevant business problems
• Complete assignments with a sense of urgency and purpose, identify and help resolve roadblocks, and collaborate with cross-functional team members on GenAI initiatives

Job Designation
Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required. (Frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid, or Remote, and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign.
Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.

What You Bring

Basic:
• Bachelor's or Master's degree in Computer Science, Physics, Mathematics, Statistics, or a related field
• 3+ years of hands-on experience in building data science applications and machine learning pipelines, with demonstrable experience in Generative AI projects
• Experience with Python for research and software development purposes, including common GenAI libraries and frameworks
• Experience with or exposure to prompt engineering and utilizing pre-trained LLMs (e.g., via APIs or open-source models)
• Experience with large datasets, distributed computing, and cloud computing platforms (e.g., AWS, Azure, GCP)
• Proficiency with relational databases (e.g., SQL)
• Experience in training, evaluating, and deploying machine learning models in production environments, with an interest in MLOps for GenAI
• Proven track record of contributing to ML/GenAI projects from ideation through to deployment and iteration
• Experience using machine learning and deep learning algorithms such as CatBoost, XGBoost, LightGBM, and feed-forward networks for classification, regression, and clustering problems, and an understanding of how these can complement GenAI solutions
• Experience as a Data Scientist, ideally in the SaaS domain with some focus on AI-driven product features

Preferred:
• PhD in Statistics, Computer Science, or Engineering with specialization in machine learning, AI, or statistics, with research or projects in Generative AI
• 5+ years of prior industry experience, with at least 1-2 years focused on GenAI applications
• Previous experience applying data science and GenAI techniques to customer success, product development, or user experience optimization
• Hands-on experience with fine-tuning LLMs or working with RAG methodologies
• Experience with or knowledge of experimentation platforms (like DataRobot) and other AI-related ones (like CrewAI)
• Experience with or knowledge of the software development lifecycle and agile methodology, particularly in AI product development
• Experience with or knowledge of GitHub and JIRA/Confluence
• Contributions to open-source GenAI projects or a portfolio of GenAI-related work
• Programming languages like Python and SQL; familiarity with R
• Strong knowledge of common machine learning, deep learning, and statistics frameworks and concepts, with a specific understanding of Large Language Models (LLMs), transformer architectures, and their applications
• Ability to break down complex technical concepts (including GenAI) into simple terms to present to diverse technical and non-technical audiences

Life At Docusign
Working here: Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what’s right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you’ll be loved by us, our customers, and the world in which we live.

Accommodation
Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process, please get in touch with our Talent organization at taops@docusign.com for assistance.
Our global benefits
• Paid time off: Take time to unwind with earned days off, plus paid company holidays based on your region.
• Paid parental leave: Take up to six months off with your child after birth, adoption or foster care placement.
• Full health benefits: Options for 100% employer-paid health plans from day one of employment.
• Retirement plans: Select retirement and pension programs with potential for employer contributions.
• Learning & development: Grow your career with coaching, online courses and education reimbursements.
• Compassionate care leave: Paid time off following the loss of a loved one and other life-changing events.
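The RAG work named in the responsibilities above reduces to a retrieve-then-prompt loop: find the most relevant documents, then assemble them into an augmented prompt for the model. A minimal sketch using naive keyword-overlap retrieval (a real system would use embeddings and a vector store, and the documents and prompt template below are invented for illustration; the LLM call itself is omitted):

```python
def retrieve(query, docs, k=1):
    """Rank docs by word overlap with the query and return the top k."""
    q_words = set(query.lower().split())
    ranked = sorted(docs,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query, context_docs):
    """Assemble the augmented prompt that would be sent to an LLM."""
    context = "\n".join(context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Envelopes track signature status for each recipient.",
    "Billing renews annually unless cancelled.",
]
top = retrieve("how do I check signature status", docs)
print(build_prompt("how do I check signature status", top))
```

Swapping the overlap score for cosine similarity over embedding vectors turns this sketch into the standard vector-search RAG pattern.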

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Software Engineer Dev Python ML (REQ/0749)
Job Id: REQ/0749 | Location: Chennai | Experience: 3 to 8 Years | CTC: 10L to 18L | Date Posted: 26-Jun-2025 | Employment Type: Permanent | No. of Openings: 6

Looking for an ML Engineer to design and implement AI/ML models using Python, TensorFlow, PyTorch, and Scikit-learn; optimize models, run experiments, transform data via classification/clustering, and stay updated on the latest AI/ML advancements.

Desired Candidate Profile
• 3 years of experience in software design and development in Python
• Makes pragmatic technical decisions beyond immediate scope
• Strong in debugging complex issues and mentoring junior engineers
• Solid understanding of data structures and OOP
• Proficient in TDD, unit and integration testing
• Experience with databases, statistics, and data science
• Skilled in Python; can write robust, testable code
• Hands-on with ML frameworks: Keras, PyTorch, scikit-learn
• AutoML experience is a plus
• Familiar with AI cloud platforms: H2O, DataRobot, AWS, Azure

Education/Specific Knowledge: Bachelor's degree or above in any discipline

Key Skills: Python, AI/ML, Keras, PyTorch, scikit-learn, H2O, DataRobot, AWS, Azure, FastAPI/Flask, MySQL or Oracle or PostgreSQL, XML, Unit Testing

Highlights: To know the benefits of Sysvine, please visit the bottom of this page. We are open to considering candidates who are on a long break but are still passionate about restarting their careers.
Our Benefits

India
• Engaging: Annual Team Trips, Happy Fridays, GameVine (Annual Games), AimVine (Annual Party), Social Responsibilities (Tree Planting, Volunteering for Orphans, Gadget Donations, Blood Donation Camps, Flood Relief Support, Cyclone Relief Support), Health Campaigns, Birthday Celebrations, First Aid & Fire Safety Training, Guest Speakers
• Benefits: Accidental Insurance, Family Health Insurance, Parental Health Insurance, Sick Leave, Casual Leave, Privilege Leave, Floating Leave, Holidays, Short Term Disability Insurance, Long Term Disability Insurance, Employee Referral Bonus, Product Referral Bonus, Sodexo Passes, Remote Working, Flexible Working Hours, Maternity Benefit, Leave Encashment, Tuition Reimbursement
• Niceties: Welcome Kit, MacBook Pro, iPhones and Android Phones for Mobile Departments, Coffee and Biscuits, Recreation Room, Resting Room, Fitness Programmes and Equipment, International Traditional Day, Personal Tax Management Sessions, Shuttle Services from/to Train, Big Monitor
• Recognition: Performance Bonus, Extra Mile Recognition (EMR), Annual Achievement Awards, Special Bonuses, Overseas Deputations, Leadership Training Programs, Technical Conferences

USA
• Engaging: Ethical Diverse Team Lunches, D-Day (Difficult Day Policy), I-Day (Inconvenient Day Policy), Technical Conferences, Personal Financial Management Sessions, Leadership Training Programs, Tax Saving Sessions, Guest Speakers
• Benefits: Health Insurance, Unemployment Insurance, Paid Time Off, Floating Leaves, 8 Holidays, Short Term Disability Insurance, Workmen Compensation, Employee Referral Bonus, Product Referral Bonus, CalSavers, Tuition Reimbursement
• Recognition: Performance Bonus, Extra Mile Recognition (EMR), Annual Achievement Awards, Special Bonuses, Technical Conferences

Posted 1 month ago

Apply

3.0 years

0 Lacs

India

On-site

Job Description:
DataRobot delivers AI that maximizes impact and minimizes business risk. Our platform and applications integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business — today and in the future.

We are searching for a Customer-Facing DevOps Engineer who enjoys working with cutting-edge technologies and solving complex deployment challenges. This role requires expertise in Kubernetes and cloud computing, strong automation skills, and a customer-centric mindset. The ideal candidate thrives in dynamic environments, excels at troubleshooting, and is passionate about helping customers achieve success with DataRobot.

Key Responsibilities:
• Deploy the DataRobot platform into customer-managed Kubernetes environments, including VPCs and true on-premises deployments, as long as they adhere to CNCF-compliant Kubernetes standards.
• Design and implement infrastructure automation using tools like Terraform to streamline deployments and ensure reliability.
• Collaborate directly with customers to address technical challenges, troubleshoot complex environments, and ensure seamless integration of the DataRobot platform.
• Handle escalated support cases from internal support teams, serving as a subject matter expert for customer deployments.
• Work closely with engineering and product teams to identify and resolve deployment-related issues.
• Create and maintain documentation and knowledge base articles to empower customers and internal teams.

Knowledge, Skills, & Abilities:
• Bachelor’s degree in Computer Science, Engineering, or equivalent experience.
• 3+ years of experience in customer-facing DevOps, systems engineering, or a related role.
• Deep hands-on experience with Kubernetes, including deploying and managing CNCF-compliant environments.
• Strong knowledge of cloud infrastructure providers (AWS, Azure, Google Cloud) and hybrid cloud deployments.
• Expertise in infrastructure automation tools like Terraform, Helm, and Ansible.
• Proficiency in Linux administration, enterprise networking, and containerization (Docker).
• Ability to monitor and troubleshoot Kubernetes-based applications, read system logs, and identify issues.
• Strong Python scripting skills for automation and debugging purposes.
• Excellent verbal and written communication skills to interact with technical and non-technical audiences.

Nice to Have:
• Certified Kubernetes Administrator (CKA); required within the first 6 months of employment.
• Experience supporting AI, machine learning, or data science platforms.
• Knowledge of MongoDB, Postgres, and Redis database management and optimization.

The talent and dedication of our employees are at the core of DataRobot’s journey to be an iconic company. We strive to attract and retain the best talent by providing competitive pay and benefits with our employees’ well-being at the core. Here’s what your benefits package may include, depending on your location and local legal requirements: Medical, Dental & Vision Insurance, Flexible Time Off Program, Paid Holidays, Paid Parental Leave, Global Employee Assistance Program (EAP), and more!
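The troubleshooting duties in this role (reading pod status, spotting unready workloads) often start from the JSON that `kubectl get pods -o json` emits. A minimal checker is sketched below; the pod names are invented, and the structure mirrors the real Kubernetes pod schema only loosely (real diagnosis would also inspect container statuses and restart counts):

```python
import json

def unready_pods(pods_json):
    """Return (name, phase) for pods not in phase Running or Succeeded."""
    pods = json.loads(pods_json)["items"]
    bad = []
    for pod in pods:
        phase = pod.get("status", {}).get("phase", "Unknown")
        if phase not in ("Running", "Succeeded"):
            bad.append((pod["metadata"]["name"], phase))
    return bad

# Hypothetical two-pod snapshot: one healthy, one stuck in Pending.
sample = json.dumps({"items": [
    {"metadata": {"name": "dr-app-0"}, "status": {"phase": "Running"}},
    {"metadata": {"name": "dr-db-0"}, "status": {"phase": "Pending"}},
]})
print(unready_pods(sample))  # [('dr-db-0', 'Pending')]
```

Piping such a check into alerting is a common first automation step before reaching for full observability tooling.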

Posted 1 month ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description

Roles & Responsibilities:

Development & Implementation
• Design, build, and maintain large-scale batch and real-time data pipelines using PySpark, Spark, Hive, and related big data tools.
• Write clean, efficient, and scalable code aligned with application design and coding standards.
• Create and maintain technical documentation including design documents, test cases, and configurations.

Technical Leadership
• Contribute to HLD, LLD, and data architecture documents.
• Review and validate designs and code from peers and junior developers.
• Lead technical discussions and decisions with cross-functional teams.

Data Management & Optimization
• Optimize data processing workflows for efficiency, cost, and performance.
• Manage data quality and ensure data accuracy, lineage, and governance across the pipeline.

Stakeholder Collaboration
• Collaborate with product managers, data stewards, and business stakeholders to translate data requirements into robust engineering solutions.
• Clarify requirements and propose design options to customers.

Testing & Quality Assurance
• Write and review unit tests and integration tests to ensure data integrity and performance.
• Monitor and troubleshoot data pipeline issues and ensure minimal downtime.

Agile Project Contribution
• Participate in sprint planning, estimation, and daily stand-ups.
• Ensure on-time delivery of user stories and bug fixes.
• Drive release planning and execution processes.

Team Mentorship & Leadership
• Set FAST goals and provide timely feedback to team members.
• Mentor junior engineers, contribute to a positive team environment, and drive continuous improvement.

Compliance & Documentation
• Ensure adherence to compliance standards such as SOX, HIPAA, and organizational coding standards.
• Contribute to knowledge repositories, project wikis, and best practice documents.

Must-Have Skills
• Minimum 6 years of experience as a Data Engineer.
• Hands-on expertise in PySpark and SQL.
• Experience in Google Cloud Platform (GCP) or similar cloud environments (AWS, Azure).
• Proficient in big data technologies such as Spark, Hadoop, and Hive.
• Solid understanding of ETL/ELT frameworks, data warehousing, and data modeling.
• Strong knowledge of CI/CD tools (Jenkins, Git, Ansible, etc.).
• Excellent problem-solving and analytical skills.
• Strong written and verbal communication skills.
• Experience with Agile/Scrum methodologies.

Good-to-Have Skills
• Experience with data orchestration tools (Airflow, Control-M).
• Familiarity with modern data platforms such as Snowflake, DataRobot, and Denodo.
• Experience in containerized environments (Kubernetes, Docker).
• Exposure to data security, governance, and compliance frameworks.
• Hands-on with Terraform, ARM Templates, or similar tools for infrastructure automation.
• Domain knowledge in banking, healthcare, or retail industries.

Skills: Spark, Hadoop, Hive, GCP
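The batch-pipeline work described above is largely group-and-aggregate logic; in PySpark it would be a `groupBy(...).agg(...)` over a DataFrame. A dependency-free sketch of the same pattern, with invented record fields:

```python
from collections import defaultdict

def daily_totals(events):
    """Group event records by day and sum their amounts
    (the pure-Python analogue of groupBy('day').agg(sum('amount')))."""
    totals = defaultdict(float)
    for e in events:
        totals[e["day"]] += e["amount"]
    return dict(totals)

events = [
    {"day": "2025-06-01", "amount": 10.0},
    {"day": "2025-06-01", "amount": 5.0},
    {"day": "2025-06-02", "amount": 7.5},
]
print(daily_totals(events))  # {'2025-06-01': 15.0, '2025-06-02': 7.5}
```

Spark's value is running exactly this shape of computation partitioned across a cluster, with shuffle, spill, and fault tolerance handled for you.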

Posted 1 month ago

Apply

6.0 - 8.0 years

25 - 40 Lacs

Gurugram

Work from Office

About this role:
Lead Software Engineer (AI) position with experience in classic and generative AI techniques, responsible for the design, implementation, and support of Python-based applications to help fulfill our Research & Consulting Delivery strategy.

What you'll do:
• Deliver client engagements that use AI rapidly, on the order of a few weeks
• Stay on top of current tools, techniques, and frameworks to be able to use and advise clients on them
• Build proofs of concept rapidly, to learn and adapt to changing market needs
• Support building internal applications for use by associates to improve productivity

What you'll need:
• 6-8 years of experience in classic AI techniques and at least 1.5 years in generative AI techniques.
• Demonstrated ability to run short development cycles and a solid grasp of building software in a collaborative team setting.

Must have:
• Experience building applications for knowledge search and summarization, frameworks to evaluate and compare performance of different GenAI techniques, measuring and improving accuracy and helpfulness of generative responses, and implementing observability.
• Experience with agentic AI frameworks, RAG, embedding models, and vector DBs.
• Experience working with Python libraries like Pandas, Scikit-Learn, NumPy, and SciPy.
• Experience deploying applications to cloud platforms such as Azure and AWS.
• Solid grasp of building software in a collaborative team setting, including agile scrum and tools like Jira/GitHub.

Nice to have:
• Experience in fine-tuning language models.
• Familiarity with AWS Bedrock, Azure AI, or Databricks services.
• Experience with machine learning models and techniques like NLP, BERT, Transformers, and deep learning.
• Experience with MLOps frameworks like Kubeflow, MLflow, DataRobot, Airflow, etc.
• Experience building scalable data models and performing complex relational database queries using SQL (Oracle, MySQL, PostgreSQL).

Who you are:
• Excellent written, verbal, and interpersonal communication skills, with the ability to present technical information clearly and concisely to IT leaders and business stakeholders.
• Effective time management skills and ability to meet deadlines.
• Excellent communication skills interacting with technical and business audiences.
• Excellent organization, multitasking, and prioritization skills.
• Willingness and aptitude to embrace new technologies and ideas and master concepts rapidly.
• Intellectual curiosity, passion for technology, and keeping up with new trends.
• Delivering project work on time and within budget, with high quality.
• Demonstrated ability to run short development cycles.
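The evaluation work this role calls for (comparing GenAI techniques, measuring accuracy and helpfulness) typically starts with a simple reference-based score before graduating to LLM-as-judge setups. A toy token-recall scorer, with made-up candidate answers and reference text:

```python
def recall_score(answer, reference):
    """Fraction of reference tokens that appear in the generated answer."""
    ref = set(reference.lower().split())
    ans = set(answer.lower().split())
    return len(ref & ans) / len(ref) if ref else 0.0

# Hypothetical outputs from two techniques being compared.
candidates = {
    "baseline": "The report is due friday",
    "rag": "The quarterly report is due on friday at noon",
}
reference = "quarterly report due friday"
scores = {name: recall_score(ans, reference) for name, ans in candidates.items()}
print(max(scores, key=scores.get))  # 'rag'
```

Real evaluation frameworks layer many such metrics (plus human or model judgments) over large prompt sets, but the compare-against-reference loop is the same.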

Posted 1 month ago

Apply