5.0 - 9.0 years
0 Lacs
Delhi
On-site
We are looking for a highly motivated and enthusiastic Senior Data Scientist with 5-8 years of experience to join our dynamic team. The ideal candidate will have a strong background in AI/ML analytics and a passion for leveraging data to drive business insights and innovation.

**Key responsibilities:**
- Develop and implement machine learning models and algorithms.
- Work closely with project stakeholders to understand requirements and translate them into deliverables.
- Use statistical and machine learning techniques to analyze and interpret complex data sets.
- Stay current with the latest advancements in AI/ML technologies and methodologies.
- Collaborate with cross-functional teams to support various AI/ML initiatives.

**Qualifications:**
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Strong understanding of machine learning, deep learning, and Generative AI concepts.

**Preferred skills:**
- Experience with machine learning techniques such as regression, classification, predictive modeling, clustering, and the deep learning stack in Python.
- Experience with cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue).
- Expertise in building enterprise-grade, secure data ingestion pipelines (ETL/ELT) for unstructured data.
- Proficiency in Python, TypeScript, NodeJS, and ReactJS, and in libraries such as pandas, NumPy, scikit-learn, OpenCV, and SciPy, along with Glue crawlers and ETL tooling.
- Experience with data visualization tools such as Matplotlib, Seaborn, and QuickSight.
- Knowledge of deep learning frameworks such as TensorFlow, Keras, and PyTorch.
- Experience with version control systems such as Git and CodeCommit.
- Strong knowledge of and experience in Generative AI/LLM-based development.
- Experience working with key LLM model APIs (e.g., AWS Bedrock, Azure OpenAI/OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex).
- Proficiency in effective text chunking techniques and text embeddings (an illustrative sketch follows this listing).

**Good to have:**
- Knowledge of and experience in building knowledge graphs in production.
- Understanding of multi-agent systems and their applications in complex problem-solving scenarios.

Pentair is an Equal Opportunity Employer that values diversity and believes a diverse workforce contributes different perspectives and creative ideas that enable continuous improvement.
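For context on the text chunking and embeddings skills named above, here is a minimal, illustrative sketch (not part of the posting) of splitting a document into chunks and embedding each chunk. It assumes the langchain-text-splitters package and boto3 access to a Titan embeddings model on AWS Bedrock; the file name, model ID, and region are placeholders.

```python
# Illustrative sketch only: chunk a document and embed each chunk.
# Assumes Bedrock access to a Titan embeddings model; model ID, region,
# and input file are hypothetical placeholders.
import json

import boto3
from langchain_text_splitters import RecursiveCharacterTextSplitter

splitter = RecursiveCharacterTextSplitter(chunk_size=800, chunk_overlap=100)
chunks = splitter.split_text(open("report.txt").read())  # hypothetical input file

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
embeddings = []
for chunk in chunks:
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": chunk}),
    )
    embeddings.append(json.loads(resp["body"].read())["embedding"])
```

In a retrieval-augmented setup, vectors produced this way would typically be stored in a vector index and queried at answer time.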
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Apply Digital is a global digital transformation partner for change agents, offering expertise in Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, and Change Management. Our goal is to help clients modernize their organizations and deliver meaningful impact to their business and customers, whether they are initiating, accelerating, or optimizing their digital transformation journey. We specialize in implementing composable tech, building smart products, and using AI tools to drive value.

With over 650 team members, we have transformed global companies such as Kraft Heinz, NFL, Moderna, Lululemon, Atlassian, Sony, American Express, and Harvard Business School. Founded in 2016 in Vancouver, Canada, Apply Digital has expanded to nine cities across North America, South America, the UK, and Europe, and we are excited to launch a new office in Delhi NCR, India.

At Apply Digital, we embrace the "One Team" approach, operating within a "pod" structure that brings together senior leadership, subject matter experts, and cross-functional skill sets. Our teams work within a common tech and delivery framework supported by well-organized scrum and sprint cadences, with regular retrospectives ensuring alignment towards desired outcomes. We envision Apply Digital as a safe, empowered, respectful, and fun community wherever we operate globally. Our team strives to embody our SHAPE values - smart, humble, active, positive, and excellent - creating a space for connection, growth, and mutual support to make a difference every day.

Apply Digital is a hybrid-friendly organization with remote options available. The preferred candidate should be based in or within commutable distance of Delhi/NCR, India, with working hours that overlap with the Eastern Standard Time zone (EST).

The client is seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and architectures that support analytical and operational workloads.

**Responsibilities:**
- Develop and optimize ETL/ELT pipelines and integrate them into cloud-native applications.
- Manage cloud data warehouses and implement data governance and security best practices.
- Collaborate with analytics teams and maintain data documentation.
- Monitor and optimize data pipelines, and stay current with emerging data engineering technologies.

**Requirements:**
- Strong proficiency in English (written and verbal communication).
- Experience working with remote teams across different time zones.
- 5+ years of data engineering experience with expertise in building scalable data pipelines.
- Proficiency in SQL and Python for data modeling and processing.
- Experience with Google Cloud Platform (GCP) and tools like BigQuery, Cloud Storage, and Pub/Sub (an illustrative sketch follows this listing).
- Knowledge of ETL/ELT frameworks, workflow orchestration tools, data privacy, and security best practices.
- Strong problem-solving skills and excellent communication abilities.

**Nice to have:**
- Experience with real-time data streaming solutions, machine learning workflows, BI tools, Terraform, and data integrations.
- Knowledge of Infrastructure as Code (IaC) in data environments.
Apply Digital offers comprehensive benefits, including private healthcare coverage, Provident Fund contributions, a gratuity bonus, a flexible vacation policy, engaging projects with global impact, an inclusive and safe work environment, and learning opportunities. Apply Digital is dedicated to celebrating differences, promoting equal opportunity, and fostering an inclusive culture where individual uniqueness is valued and recognized.
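As context for the GCP requirement referenced above, here is a minimal, illustrative sketch of one common BigQuery ingestion step: loading a CSV file from Cloud Storage into a warehouse table with the google-cloud-bigquery client. The bucket, dataset, and table names are hypothetical placeholders, and application default credentials are assumed.

```python
# Minimal sketch, assuming a GCP project with BigQuery enabled and
# application default credentials configured. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/orders/2024-06-01.csv",  # hypothetical source file
    "example-project.analytics.orders_raw",       # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("example-project.analytics.orders_raw")
print(f"Loaded table now has {table.num_rows} rows")
```

In practice a step like this would usually be wrapped in an orchestration tool (e.g., an Airflow task) rather than run as a standalone script.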
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
Tirupati, Andhra Pradesh
On-site
You are an experienced Snowflake Data Engineer with expertise in Python and SQL, holding a Snowflake certification and at least 4 years of hands-on experience with Snowflake. Your primary responsibility will be to design, develop, and maintain robust data pipelines in a cloud environment, ensuring efficient data integration, transformation, and storage within the Snowflake data platform.

**Key responsibilities:**
- Design and develop data pipelines to handle large volumes of structured and unstructured data using Snowflake and SQL.
- Develop and maintain efficient ETL/ELT processes to integrate data from various sources into Snowflake, ensuring data quality and availability (an illustrative sketch follows this listing).
- Write Python scripts to automate data workflows, implement data transformation logic, and integrate with external APIs for data ingestion.
- Create and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and maintain data models to support business intelligence and analytics, following Snowflake best practices.
- Ensure proper data governance, security, and compliance within the Snowflake environment by implementing access controls, encryption, and monitoring.
- Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.

**Requirements:**
- Active Snowflake certification and 4+ years of hands-on experience with Snowflake.
- Strong experience with Python for data processing, automation, and API integration.
- Expertise in writing and optimizing complex SQL queries, plus experience with data warehousing and database management.
- Hands-on experience designing and implementing ETL/ELT pipelines with Snowflake.
- Familiarity with cloud environments such as AWS, GCP, or Azure, especially for data storage and processing.
- Experience implementing data governance frameworks and security protocols in a cloud data platform.

**Preferred skills:**
- Experience with CI/CD pipelines for data projects.
- Knowledge of Apache Airflow or other orchestration tools.
- Familiarity with big data technologies and distributed systems.

**Education and soft skills:**
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Strong problem-solving and analytical skills, excellent communication with technical and non-technical stakeholders, and the ability to work in a fast-paced, agile environment.
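As context for the ETL/ELT responsibilities referenced above, here is a minimal, illustrative sketch of one Snowflake ingestion step using the Python connector: staging a local CSV to a table stage with PUT and loading it with COPY INTO. The connection parameters, file path, and table name are hypothetical placeholders.

```python
# Illustrative ELT step only: stage a file and copy it into a Snowflake table.
# Account, credentials, warehouse, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # hypothetical account identifier
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Upload the local file to the table's internal stage.
    cur.execute("PUT file:///tmp/events.csv @%EVENTS_RAW AUTO_COMPRESS=TRUE")
    # Load the staged file into the table, skipping the header row.
    cur.execute(
        "COPY INTO EVENTS_RAW FROM @%EVENTS_RAW "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) ON_ERROR = 'ABORT_STATEMENT'"
    )
finally:
    cur.close()
    conn.close()
```

A production pipeline would typically parameterize the connection via a secrets manager and schedule this step with an orchestrator such as Apache Airflow.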
Posted 4 days ago