0 years
0 Lacs
Vadodara, Gujarat, India
Remote
Job Description

Job Purpose: As a Lead Data Scientist within the Data Science Methods team in the NIQ Product organization, you will drive the definition and support of new products, methods development, and improvement initiatives. This position focuses on innovation in data processing methods for retail measurement and on automation of existing statistical procedures.

Job Responsibilities:
Define, plan, and execute analyses for innovation initiatives, methodology development, standards, and KPI development and implementation
Prototype solutions and support pilot programs for R&D purposes, including trend analyses, representation/sampling, bias reduction, indirect estimation, data integration, automation, and generalization
Apply test-driven development to scalable data processing applications
Deliver high-quality documentation of new methodologies and best practices
Collaborate with experienced Developers, Data Scientists, and Technology engineers
Support various Operations teams as the main users of our solutions
Engage with stakeholders on scope, execution, data exchange, and outcomes for assigned projects
Participate in multiple projects simultaneously

Qualifications

Essential:
PhD degree in Statistics, with outstanding analytical expertise and strong technical skills
Extensive experience in trend analyses, multivariate statistics (parametric/non-parametric), sampling, bias reduction, indirect estimation, data aggregation techniques, automation, and generalization
High proficiency in Python, including data analysis and statistical packages (Pandas, NumPy, Scikit-Learn); good familiarity with the Python standard library, especially the unittest and argparse modules (see the sketch below)
Experience with Spark or other big data processing solutions
Experience in machine learning
Experience with cloud computing and storage (MS Azure preferred)
Experience with Docker and the Linux command line
Ability to quickly manipulate, analyze, and interpret large data sources
Strong communication and writing skills, with good English (working in a remote team with a global footprint)

Preferred:
Experience in NIQ methodologies, data collection, platforms, research processes, and operations

Additional Information

Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us.
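For candidates gauging the expected Python fluency, here is a minimal, hypothetical sketch (not part of NIQ's codebase) that combines Pandas/NumPy with the standard-library unittest and argparse modules the listing calls out; the column names period, sales, and weight are assumptions chosen only for illustration.

```python
import argparse
import unittest

import numpy as np
import pandas as pd


def weighted_trend(df: pd.DataFrame, value_col: str = "sales", weight_col: str = "weight") -> float:
    """Return the weighted period-over-period trend (latest period vs. the previous one)."""
    by_period = df.groupby("period").apply(
        lambda g: np.average(g[value_col], weights=g[weight_col])
    ).sort_index()
    return by_period.iloc[-1] / by_period.iloc[-2] - 1.0


class WeightedTrendTest(unittest.TestCase):
    def test_flat_series_has_zero_trend(self):
        df = pd.DataFrame({
            "period": [1, 1, 2, 2],
            "sales": [100.0, 200.0, 100.0, 200.0],
            "weight": [1.0, 1.0, 1.0, 1.0],
        })
        self.assertAlmostEqual(weighted_trend(df), 0.0)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description="Compute a weighted trend from a CSV with period/sales/weight columns."
    )
    parser.add_argument("csv_path", nargs="?", help="input CSV; omit to run the unit tests instead")
    args = parser.parse_args()
    if args.csv_path:
        print(weighted_trend(pd.read_csv(args.csv_path)))
    else:
        unittest.main(argv=["ignored"])
```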
We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
Posted 2 days ago
9.0 years
0 Lacs
Kochi, Kerala, India
Remote
Experience: 9.00+ years
Salary: USD 54,000.00 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-Time Contract for 6 Months (40 hrs a week / 160 hrs a month)
(Note: This is a requirement for one of Uplers' clients, Andela)

What do you need for this opportunity?
Must-have skills: LLM (Large Language Models), Prompt Engineering, Retrieval-Augmented Generation (RAG), Natural Language Processing, Data Science, Machine Learning, Python, SQL

Andela is looking for: Senior GenAI Engineer

Description:
Professionals in the areas of healthcare, legal, business, tax, accounting, finance, audit, risk, and compliance rely on the client's market-leading, information-enabled tools and software solutions to manage their business efficiently, deliver results to their clients, and succeed in an ever more dynamic world. Every day, our customers make critical decisions to help save lives, improve the way we do business, and build better judicial and regulatory systems. We help them get it right.

As a Senior AI Engineer, you will contribute significantly to the design and development of GenAI services. Your contributions will involve enhancing AI capabilities to ensure scalability and reusability across a diverse set of applications. Your analytical and problem-solving skills will be essential, and we encourage you to leverage your coding knowledge to improve our engineering practices.

Responsibilities:
Contribute to the architecture, design, and development of GenAI services that are integral to our product offerings and user experiences.
Implement coding best practices to foster code modularity, reusability, and maintainability, enabling our AI services to remain flexible for future advancements.
Collaborate with cross-functional and matrixed teams to integrate AI services into the wider product ecosystem, ensuring a smooth developer experience.
Assess and optimize existing AI services to enhance performance and conform to the latest industry trends.
Support and mentor other engineers, contributing to a culture that values technical skill and code quality.
Stay informed on the latest AI technologies and programming techniques, exploring their applicability to our services.

Qualifications:
Bachelor's degree in Computer Science, Artificial Intelligence, or a related field, or equivalent practical experience.
8+ years of experience overall, with experience in AI or machine learning projects.
Proficiency in Python and in relevant programming languages and frameworks for AI development.
Strong knowledge of Machine Learning, Deep Learning, NLP, and AI.
Strong hands-on expertise in libraries/frameworks/tools such as NumPy, SciPy, scikit-learn, pandas, matplotlib, spaCy, NLTK, Jupyter, Transformers, etc.
Experience with cloud-based platforms (AWS or Azure) for solution delivery.
Proven ability to develop scalable, reusable software components and services.
Good knowledge of software engineering principles and architectural standards.
Experience in working on and contributing to software project teams.

Preferred Qualifications:
Familiarity with GenAI concepts and technologies, and their implementation (a minimal RAG sketch follows below).
Experience working with OpenAI, LangChain, Azure AI Foundry, and AWS Lambda.
Experience with cloud-based development and familiarity with AI-related cloud services (e.g., AWS, Azure, GCP).
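To illustrate the retrieval-augmented generation skills in the must-have list, here is a minimal, self-contained sketch using TF-IDF retrieval from scikit-learn; the generate_answer function is a hypothetical stand-in for whatever LLM endpoint a production service would call, and the toy documents are invented, not the client's data or implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy document store; a real service would use a vector database or search index.
DOCUMENTS = [
    "Invoices must be approved by the finance team before payment.",
    "Audit logs are retained for seven years for compliance reviews.",
    "Tax filings are due on the 15th of the month following each quarter.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(DOCUMENTS)


def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (the retrieval step in RAG)."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix)[0]
    top_indices = scores.argsort()[::-1][:k]
    return [DOCUMENTS[i] for i in top_indices]


def generate_answer(query: str, context: list[str]) -> str:
    """Hypothetical generation step: a real service would send this prompt to an LLM."""
    prompt = "Answer using only this context:\n" + "\n".join(context) + f"\n\nQuestion: {query}"
    return prompt  # placeholder: return the assembled prompt instead of calling a model


if __name__ == "__main__":
    question = "How long are audit logs kept?"
    print(generate_answer(question, retrieve(question)))
```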
Interview Process:
1st round: technical interview with the team
2nd round: technical interview on systems design

Overlap Hours: 6 hours with EST
Contract Length: 6 months, renewable
Full-time contractor role (8 hours/day)
Device: Bring your own device

Requirements & Notes:
Assessment Path: Data Science preferred, or ML Engineer
Max all-in rate: $4,500/month
Location: India and European Union
Working hours: 6-8 hours overlap with EST
Must-Haves: 8+ years of experience overall; strong Data Science and Machine Learning foundations, SQL, Python, GenAI, Prompt Engineering, RAG

Location Requirements: Not Available
Start: ASAP

Must-have skills: Natural Language Processing, Machine Learning, Data Science, SQL, Python
Nice-to-have skills: Prompt Engineering, LLM (Large Language Models), Retrieval-Augmented Generation (RAG)

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Purpose
Hands-on data automation engineer with strong Python or Java coding skills and solid SQL expertise, who can work with large data sets, understand stored procedures, and independently write data-driven automation logic. Develop and execute test cases with a focus on Fixed Income trading workflows. The requirement goes beyond automation tools and aligns better with a junior developer or data automation role.

Desired Skills And Experience
Strong programming experience in Python (preferred) or Java.
Strong experience working with Python and its libraries, such as Pandas and NumPy.
Hands-on experience with SQL, including:
Writing and debugging complex queries (joins, aggregations, filtering, etc.)
Understanding stored procedures and using them in automation
Experience working with data structures, large tables, and datasets
Comfort with data manipulation, validation, and building comparison scripts (see the sketch below)
Nice to have: familiarity with PyCharm, VS Code, or IntelliJ for development, and an understanding of how automation integrates into CI/CD pipelines
Prior exposure to financial data or post-trade systems (a bonus)
Excellent communication skills, both written and verbal
Experience working with test management tools (e.g., X-Ray/JIRA)
Extremely strong organizational and analytical skills with strong attention to detail
Strong track record of excellent results delivered to internal and external clients
Able to work independently without the need for close supervision and collaboratively as part of cross-team efforts
Experience delivering projects within an agile environment

Key Responsibilities
Write custom data validation scripts based on provided regression test cases
Read, understand, and translate stored procedure logic into test automation
Compare datasets across environments and generate diffs
Collaborate with team members and follow structured automation practices
Contribute to building and maintaining a central automation script repository
Establish and implement comprehensive QA strategies and test plans from scratch
Develop and execute test cases with a focus on Fixed Income trading workflows
Drive the creation of regression test suites for critical back-office applications
Collaborate with development, business analysts, and project managers to ensure quality throughout the SDLC
Provide clear and concise reporting on QA progress and metrics to management
Bring strong subject matter expertise in the Financial Services Industry, particularly fixed income trading products and workflows
Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders
Independently troubleshoot difficult and complex issues in different environments
Take responsibility for end-to-end delivery of projects, coordination between the client and internal offshore teams, and managing client queries
Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and show a natural aptitude for developing good internal working relationships and a flexible work ethic
Be responsible for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT)
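As a rough illustration of the "compare datasets across environments and generate diffs" responsibility, here is a minimal, hypothetical Python sketch; the table, columns, and SQLite connections are illustrative assumptions, not the firm's actual schema or database drivers.

```python
import sqlite3  # stand-in for whatever database driver each environment actually uses

import pandas as pd

QUERY = "SELECT trade_id, quantity, price FROM trades"  # hypothetical regression query


def fetch(conn) -> pd.DataFrame:
    """Run the regression query against one environment."""
    return pd.read_sql_query(QUERY, conn)


def diff_datasets(df_a: pd.DataFrame, df_b: pd.DataFrame, key: str = "trade_id") -> pd.DataFrame:
    """Return rows that exist in only one environment or whose values differ."""
    merged = df_a.merge(df_b, on=key, how="outer", suffixes=("_env_a", "_env_b"), indicator=True)
    mismatched = merged["_merge"] != "both"          # present in only one environment
    for col in [c for c in df_a.columns if c != key]:
        mismatched |= merged[f"{col}_env_a"] != merged[f"{col}_env_b"]  # value drift
    return merged[mismatched]


if __name__ == "__main__":
    # In practice these would be connections to the two environments under test.
    conn_a, conn_b = sqlite3.connect("env_a.db"), sqlite3.connect("env_b.db")
    diffs = diff_datasets(fetch(conn_a), fetch(conn_b))
    print(f"{len(diffs)} mismatched rows")
    print(diffs.to_string(index=False))
```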
Posted 2 days ago
0.0 - 2.0 years
3 - 10 Lacs
Niranjanpur, Indore, Madhya Pradesh
Remote
Job Title - Sr. Data Engineer Experience - 2+ Years Location - Indore (onsite) Industry - IT Job Type - Full time Roles and Responsibilities- 1. Design and develop scalable data pipelines and workflows for data ingestion, transformation, and integration. 2. Build and maintain data storage systems, including data warehouses, data lakes, and relational databases. 3. Ensure data accuracy, integrity, and consistency through validation and quality assurance processes. 4. Collaborate with data scientists, analysts, and business teams to understand data needs and deliver tailored solutions. 5. Optimize database performance and manage large-scale datasets for efficient processing. 6. Leverage cloud platforms (AWS, Azure, or GCP) and big data technologies (Hadoop, Spark, Kafka) for building robust data solutions. 7. Automate and monitor data workflows using orchestration frameworks such as Apache Airflow. 8. Implement and enforce data governance policies to ensure compliance and data security. 9. Troubleshoot and resolve data-related issues to maintain seamless operations. 10. Stay updated on emerging tools, technologies, and trends in data engineering. Skills and Knowledge- 1. Core Skills: ● Proficient in Python (libraries: Pandas, NumPy) and SQL. ● Knowledge of data modeling techniques, including: ○ Entity-Relationship (ER) Diagrams ○ Dimensional Modeling ○ Data Normalization ● Familiarity with ETL processes and tools like: ○ Azure Data Factory (ADF) ○ SSIS (SQL Server Integration Services) 2. Cloud Expertise: ● AWS Services: Glue, Redshift, Lambda, EKS, RDS, Athena ● Azure Services: Databricks, Key Vault, ADLS Gen2, ADF, Azure SQL ● Snowflake 3. Big Data and Workflow Automation: ● Hands-on experience with big data technologies like Hadoop, Spark, and Kafka. ● Experience with workflow automation tools like Apache Airflow (or similar). Qualifications and Requirements- ● Education: ○ Bachelor’s degree (or equivalent) in Computer Science, Information Technology, Engineering, or a related field. ● Experience: ○ Freshers with a strong understanding, internships, and relevant academic projects are welcome. ○ 2+ years of experience working with Python, SQL, and data integration or visualization tools is preferred. ● Other Skills: ○ Strong communication skills, especially the ability to explain technical concepts to non-technical stakeholders. ○ Ability to work in a dynamic, research-oriented team with concurrent projects. Job Types: Full-time, Permanent Pay: ₹300,000.00 - ₹1,000,000.00 per year Benefits: Paid sick time Provident Fund Work from home Schedule: Day shift Monday to Friday Weekend availability Supplemental Pay: Performance bonus Ability to commute/relocate: Niranjanpur, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred) Experience: Data Engineer: 2 years (Preferred) Work Location: In person Application Deadline: 31/08/2025
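Purely as an illustration of responsibility 7 above (workflow orchestration, not part of the posting), a minimal Apache Airflow 2.x DAG skeleton; the DAG id, schedule, and task bodies are hypothetical placeholders.

```python
# Minimal Airflow 2.x sketch: a daily extract -> transform -> load workflow.
# DAG id, schedule and task logic are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write the transformed data to the warehouse")

with DAG(
    dag_id="daily_sales_ingest",      # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```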
Posted 2 days ago
3.0 years
5 - 10 Lacs
Kazhakuttam
On-site
About the Role You will architect, build and maintain end-to-end data pipelines that ingest 100 GB+ of NGINX/web-server logs from Elasticsearch, transform them into high-quality features, and surface actionable insights and visualisations for security analysts and ML models. Acting as both a Data Engineer and a Behavioural Data Analyst, you will collaborate with security, AI and frontend teams to ensure low-latency data delivery, rich feature sets and compelling dashboards that spot anomalies in real time. Key Responsibilities ETL & Pipeline Engineering: Design and orchestrate scalable batch / near-real-time ETL workflows to extract raw logs from Elasticsearch. Clean, normalize and partition logs for long-term storage and fast retrieval. Optimize Elasticsearch indices, queries and retention policies for performance and cost. Feature Engineering & Feature Store: Assist in the development of robust feature-engineering code in Python and/or PySpark. Define schemas and loaders for a feature store (Feast or similar). Manage historical back-fills and real-time feature look-ups ensuring versioning and reproducibility. Behaviour & Anomaly Analysis: Perform exploratory data analysis (EDA) to uncover traffic patterns, bursts, outliers and security events across IPs, headers, user agents and geo data. Translate findings into new or refined ML features and anomaly indicators. Visualisation & Dashboards: Create time-series, geo-distribution and behaviour-pattern visualisations for internal dashboards. Partner with frontend engineers to test UI requirements. Monitoring & Scaling: Implement health and latency monitoring for pipelines; automate alerts and failure recovery. Scale infrastructure to support rapidly growing log volumes. Collaboration & Documentation: Work closely with ML, security and product teams to align data strategy with platform goals. Document data lineage, dictionaries, transformation logic and behavioural assumptions. Minimum Qualifications: Education – Bachelor’s or Master’s in Computer Science, Data Engineering, Analytics, Cybersecurity or related field. Experience – 3 + years building data pipelines and/or performing data analysis on large log datasets. Core Skills Python (pandas, numpy, elasticsearch-py, Matplotlib, plotly, seaborn; PySpark desirable) Elasticsearch & ELK stack query optimisation SQL for ad-hoc analysis Workflow orchestration (Apache Airflow, Prefect or similar) Data modelling, versioning and time-series handling Familiarity with visualisation tools (Kibana, Grafana). DevOps – Docker, Git, CI/CD best practices. Nice-to-Have Kafka, Fluentd or Logstash experience for high-throughput log streaming. Web-server log expertise (NGINX / Apache, HTTP semantics) Cloud data platform deployment on AWS / GCP / Azure. Hands-on exposure to feature stores (Feast, Tecton) and MLOps. Prior work on anomaly-detection or cybersecurity analytics systems. Why Join Us? You’ll sit at the nexus of data engineering and behavioural analytics, turning raw traffic logs into the lifeblood of a cutting-edge AI security product. If you thrive on building resilient pipelines and diving into the data to uncover hidden patterns, we’d love to meet you. Job Type: Full-time Pay: ₹500,000.00 - ₹1,000,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Monday to Friday Work Location: In person
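As a rough illustration of the ETL and feature-engineering work described above (not part of the posting), a sketch that pulls recent log documents from Elasticsearch and derives simple per-IP behavioural features with pandas. The cluster endpoint, index pattern, and field names are hypothetical, and the query syntax assumes the elasticsearch-py 8.x client.

```python
# Sketch: fetch recent NGINX access-log documents from Elasticsearch and compute
# simple per-IP behavioural features. Endpoint, index and field names are hypothetical.
from elasticsearch import Elasticsearch
import pandas as pd

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster endpoint

resp = es.search(
    index="nginx-access-*",                                # hypothetical index pattern
    query={"range": {"@timestamp": {"gte": "now-15m"}}},   # last 15 minutes
    size=10_000,
)
rows = [hit["_source"] for hit in resp["hits"]["hits"]]
df = pd.DataFrame(rows)

# Per-IP features: request volume, HTTP error rate, distinct user agents
features = (
    df.groupby("client_ip")
      .agg(
          requests=("client_ip", "size"),
          error_rate=("status", lambda s: (s.astype(int) >= 400).mean()),
          distinct_agents=("user_agent", "nunique"),
      )
      .reset_index()
)
print(features.sort_values("requests", ascending=False).head())
```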
Posted 2 days ago
2.0 years
6 - 9 Lacs
Cochin
On-site
2 years of experience mandatory Machine Learning (ML): Supervised & unsupervised learning Regression, classification, clustering Model evaluation and tuning Deep Learning: Neural networks (CNNs, RNNs, LSTMs, Transformers) Frameworks: TensorFlow, PyTorch, Keras Data Science & Analytics: Data cleaning and pre-processing Feature engineering EDA (Exploratory Data Analysis) Programming: Proficient in Python (NumPy, pandas, scikit-learn, matplotlib, etc.) Optional: R, Julia, or C++ NLP & LLMs: Text preprocessing, embeddings (word2vec, BERT, etc.) Fine-tuning and using LLMs (e.g., GPT, BERT, T5) Computer Vision: Image classification, object detection OpenCV, YOLO, or similar libraries Math & Stats: Linear algebra, probability, calculus Statistical modeling Infrastructure & DevOps: Cloud platforms: AWS, GCP, Azure MLOps: ML pipelines, model versioning (e.g., MLflow, DVC) Docker, Kubernetes for deployment Git and version control Understanding business problems and translating them into AI solutions Job Type: Full-time Pay: ₹50,499.45 - ₹75,027.94 per month Schedule: Day shift Experience: AI/ML: 2 years (Required) Work Location: In person
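For illustration only (not part of the posting), a minimal Keras feed-forward classifier touching the deep-learning and model-evaluation topics listed above; the data here is synthetic.

```python
# Minimal sketch: train and evaluate a small Keras classifier on synthetic data.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic dataset: 1,000 samples, 20 features, binary label
rng = np.random.default_rng(0)
X = rng.random((1000, 20)).astype("float32")
y = (X.sum(axis=1) > 10).astype("int32")

model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

loss, acc = model.evaluate(X, y, verbose=0)
print(f"accuracy on the synthetic data: {acc:.2f}")
```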
Posted 2 days ago
0 years
0 - 1 Lacs
Thiruvananthapuram
On-site
Data Science and AI Developer Job Description: We are seeking a highly skilled and motivated Data Science and AI Developer to join our dynamic team. As a Data Science and AI Developer, you will be responsible for leveraging cutting-edge technologies to develop innovative solutions that drive business insights and enhance decision-making processes. Key Responsibilities: 1. Develop and deploy machine learning models for predictive analytics, classification, clustering, and anomaly detection. 2. Design and implement algorithms for data mining, pattern recognition, and natural language processing. 3. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. 4. Utilize advanced statistical techniques to analyze complex datasets and extract actionable insights. 5. Implement scalable data pipelines for data ingestion, preprocessing, feature engineering, and model training. 6. Stay updated with the latest advancements in data science, machine learning, and artificial intelligence research. 7. Optimize model performance and scalability through experimentation and iteration. 8. Communicate findings and results to stakeholders through reports, presentations, and visualizations. 9. Ensure compliance with data privacy regulations and best practices in data handling and security. 10. Mentor junior team members and provide technical guidance and support. Requirements: 1. Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field. 2. Proven experience in developing and deploying machine learning models in production environments. 3. Proficiency in programming languages such as Python, R, or Scala, with strong software engineering skills. 4. Hands-on experience with machine learning libraries/frameworks such as TensorFlow, PyTorch, Scikit-learn, or Spark MLlib. 5. Solid understanding of data structures, algorithms, and computer science fundamentals. 6. Excellent problem-solving skills and the ability to think creatively to overcome challenges. 7. Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment. 8. Certification in Data Science, Machine Learning, or Artificial Intelligence (e.g., Coursera, edX, Udacity, etc.). 9. Experience with cloud platforms such as AWS, Azure, or Google Cloud is a plus. 10. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) is an advantage. Data Manipulation and Analysis: NumPy, Pandas Data Visualization: Matplotlib, Seaborn, Power BI Machine Learning Libraries: Scikit-learn, TensorFlow, Keras Statistical Analysis: SciPy Web Scraping: Scrapy IDE: PyCharm, Google Colab HTML/CSS/JavaScript/React JS: Proficiency in these core web development technologies is a must. Python Django Expertise: In-depth knowledge of e-commerce functionality or deep Python Django expertise. Theming: Proven experience in designing and implementing custom themes for Python websites. Responsive Design: Strong understanding of responsive design principles and the ability to create visually appealing and user-friendly interfaces for various devices. Problem Solving: Excellent problem-solving skills with the ability to troubleshoot and resolve issues independently. Collaboration: Ability to work closely with cross-functional teams, including marketing and design, to bring creative visions to life.
Interns must know how to connect the front end with data science, and how to expose data science outputs back to the frontend. Benefits: - Competitive salary package - Flexible working hours - Opportunities for career growth and professional development - Dynamic and innovative work environment Job Type: Full-time Pay: ₹8,000.00 - ₹12,000.00 per month Ability to commute/relocate: Thiruvananthapuram, Kerala: Reliably commute or planning to relocate before starting work (Preferred) Work Location: In person
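To illustrate the "connect data science to the frontend" requirement (a sketch only, not part of the posting), a minimal Django JSON view that serves predictions from a pre-trained scikit-learn model; the model path, payload shape, and URL route are hypothetical.

```python
# Sketch: expose a trained scikit-learn model to a frontend via a Django JSON view.
# Model path, payload format and route are hypothetical placeholders.
import json
import joblib
import numpy as np
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

model = joblib.load("models/demo_model.joblib")  # hypothetical pre-trained model

@csrf_exempt
def predict(request):
    """POST {"features": [1.2, 0.4, 3.1]} and receive {"prediction": ...} back."""
    payload = json.loads(request.body)
    features = np.asarray(payload["features"], dtype=float).reshape(1, -1)
    prediction = model.predict(features)[0]
    return JsonResponse({"prediction": float(prediction)})

# urls.py (hypothetical): path("api/predict/", predict)
```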
Posted 2 days ago
2.0 years
0 Lacs
Hyderābād
On-site
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients’ needs and exceeding their expectations. Your Team’s Impact The News Content team is responsible for ingesting, maintaining and delivering a variety of unstructured text-based content sets for use across all FactSet applications and workflows. These content sets need to be processed and delivered to our clients in real time. New feed integration involves working with Product Development to understand the requirements of the feeds as well as working with the vendor to understand the technical specification for ingesting the feeds. Work on existing feeds includes bug fixes, feature enhancements, infrastructure improvements, maintaining data quality, and ensuring feeds are operating properly throughout the day. You will work on both internal and external client-facing applications that shape the user's experience and drive FactSet's growth through technological innovations. What You’ll Do FactSet is seeking a Python Developer with AWS experience to join our engineering team, which is responsible for making our product more scalable and reliable. Deliver high-quality, reusable, and maintainable code, and perform unit/integration testing of assigned tasks within the estimated timelines. Build robust infrastructure in AWS appropriate to the respective component of the product. Be proactive in providing technical solutions, with effective communication and collaboration skills. Perform code reviews and ensure best practices are followed. Work in an agile team environment and collaborate with internal teams to ensure smooth product delivery. Take ownership of the end-to-end product and contribute individually. Ensure high stability of the product. Continuously share knowledge within and outside the team. What We’re Looking For Bachelor’s or master’s degree in Computer Science. 2-3 years of total experience. Minimum 2 years of experience in Python development. Minimum 2 years of working experience in Linux/Unix environments. Strong analytical and problem-solving skills. Strong experience and proficiency with Python, Pandas, NumPy. Experience with AWS components. Experience with GitHub-based development processes. Excellent written and verbal communication skills. Organized, self-directed, and resourceful, with the ability to appropriately prioritize work in a fast-paced environment. Good to have skills: Familiarity with Agile software development (Scrum is a plus). Experience in front-end development. Experience in database development. Experience in C++ development. Exposure to design patterns. What's In It For You At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means: The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up. Support for your total well-being.
This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days. Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives. A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions. Career progression planning with dedicated time each month for learning and development. Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging. Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications. Company Overview: FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees’ Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn. At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Posted 2 days ago
8.0 - 12.0 years
30 - 35 Lacs
Gurgaon
On-site
Qualifications Strong problem-solving skills with an emphasis on product development. A drive to learn and master new technologies and techniques. Experience using statistical computing languages (R, Python, etc.) to manipulate data and draw insights from large data sets. Experience working with different data architectures. Knowledge and experience in statistical and data mining techniques. Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks. Prior experience working on zero-touch solutions is an advantage. We’re looking for someone with 8-12 years of experience manipulating data sets and building statistical models, who has a Bachelor's or Master's in Computer Science or another quantitative field, and who is familiar with the following software: Coding knowledge and experience with several languages: Python, R. Experience with Python and common data science toolkits such as Jupyter, Pandas, NumPy, Scikit-learn, TensorFlow, Keras, etc. Knowledge and experience in statistical and data mining techniques: GLM/regression, Random Forest, boosting, trees, text mining, social network analysis, etc. Experience using different types of databases: RDBMS, graph, NoSQL, etc. Strong experience in Generative AI. Role & Responsibilities: 1. Design production-grade AI/ML solutions 2. Collaborate with verticals for AI use-case discovery and solutions 3. Identify common components and enablers for AI/ML and GenAI 4. Define best practices for AI/ML development, usage, and rollout 5. Hands-on development of prototypes Job Type: Full-time Pay: ₹3,000,000.00 - ₹3,500,000.00 per year Application Question(s): Are you an immediate joiner? Experience: Data science: 9 years (Required) Python, R: 8 years (Required) Work Location: In person
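Purely for illustration (not part of the posting), a short sketch of one of the named data mining techniques: a Random Forest evaluated with cross-validation on a synthetic dataset.

```python
# Sketch: cross-validate a Random Forest classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=25, n_informative=8, random_state=42)

model = RandomForestClassifier(n_estimators=300, random_state=42)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"5-fold ROC AUC: {scores.mean():.3f} (+/- {scores.std():.3f})")
```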
Posted 2 days ago
2.0 years
1 - 3 Lacs
Surat
On-site
Experience: 2+ Years Key Responsibilities: Design, develop, and deploy Machine Learning / Artificial Intelligence models to solve real-world problems in our software products. Collaborate with product managers, developers, and data engineers to define AI project goals and requirements. Clean, process, and analyze large datasets to extract meaningful patterns and insights. Implement and fine-tune models using frameworks such as TensorFlow, PyTorch, or Scikit-learn. Develop APIs and services to integrate AI models with production environments. Monitor model performance and retrain as needed to maintain accuracy and efficiency. Stay updated with the latest advancements in AI/ML and evaluate their applicability to our projects. Required Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, AI/ML, or a related field. Strong understanding of machine learning algorithms (supervised, unsupervised, reinforcement learning). Experience with Python and ML libraries like NumPy, pandas, TensorFlow, Keras, PyTorch, Scikit-learn. Familiarity with NLP, computer vision, or time-series analysis is a plus. Experience with model deployment tools and cloud platforms (AWS/GCP/Azure) preferred. Knowledge of software engineering practices including version control (Git), testing, and CI/CD. Preferred Qualifications: Prior experience working in a product-based or tech-driven startup environment. Exposure to deep learning, recommendation systems, or predictive analytics. Understanding of ethical AI practices and model interpretability. Job Type: Full-time Pay: ₹12,059.95 - ₹30,307.34 per month Schedule: Day shift Work Location: In person
Posted 2 days ago
0 years
0 Lacs
Vadodara
On-site
Join us to build cutting-edge apps with Angular & PHP. Grow, innovate, and code with a passionate team. AI/ML Intern / Fresher – Internship + Job Opportunity Company: Logical Wings Infoweb Pvt. Ltd. Location: Vadodara (On-site only) Type: Internship with Pre-placement Offer Opportunity Domain: Enterprise Software Solutions | AI-Powered Applications Role Overview: We are seeking AI/ML Interns or Freshers who are passionate about Artificial Intelligence and Machine Learning, and excited to apply their knowledge to real-world enterprise problems. You’ll gain hands-on experience, work on live projects, and have a pathway to a full-time role based on your performance. What You’ll Do: Assist in designing and developing machine learning models and AI-based solutions Work on data preprocessing, feature engineering, and model training/evaluation Collaborate with development teams to integrate AI modules into enterprise software Research and experiment with new algorithms and frameworks Help build tools for data visualization, analysis, and insights Skills & Qualifications: Solid understanding of Python and key libraries (NumPy, Pandas, Scikit-learn, etc.) Exposure to Machine Learning and Deep Learning concepts Familiarity with frameworks like Flask, TensorFlow, Keras, or PyTorch is a plus Additional plus skills: working knowledge of web-related Python with the Django framework and MySQL Basic understanding of data structures and algorithms Curiosity, problem-solving mindset, and a willingness to learn Eligibility: Final-semester students pursuing a degree in Computer Science / Data Science / Engineering, OR recent graduates with a background or strong interest in AI/ML (Vadodara candidates only) Why Join Us? Work on cutting-edge AI solutions for enterprise clients Mentorship from experienced AI professionals Opportunity to convert to a full-time role post-internship A collaborative and innovation-driven environment How to Apply: Send your resume to: Hr@logicalwings.com Visit us at: www.logicalwings.com Note: Applications via phone calls will not be entertained. If you’re driven by data, algorithms, and the idea of solving real-world problems through AI, Logical Wings is your launchpad!
Posted 2 days ago
0 years
0 Lacs
India
Remote
Data Science Intern Location: Remote Duration: 2-6 months Type: Unpaid Internship About Collegepur Collegepur is an innovative platform dedicated to providing students with comprehensive information about colleges, career opportunities, and educational resources. We are building a dynamic team of talented individuals passionate about data-driven decision-making. Job Summary We are seeking a highly motivated Data Science Intern to join our team. This role involves working on data collection, web scraping, analysis, visualization, and machine learning to derive meaningful insights that enhance our platform’s functionality and user experience. Responsibilities: Web Scraping: Collect and extract data from websites using tools like BeautifulSoup, Scrapy, or Selenium. Data Preprocessing: Clean, transform, and structure raw data for analysis. Exploratory Data Analysis (EDA): Identify trends and insights from collected data. Machine Learning: Develop predictive models for data-driven decision-making. Data Visualization: Create dashboards and reports using tools like Matplotlib, Seaborn, Power BI, or Tableau. Database Management: Work with structured and unstructured data, ensuring quality and consistency. Collaboration: Work with cross-functional teams to integrate data solutions into our platform. Documentation: Maintain records of methodologies, findings, and workflows. Requirements: Currently pursuing or recently completed a degree in Data Science, Computer Science, Statistics, Mathematics, or a related field . Experience in web scraping using BeautifulSoup, Scrapy, or Selenium. Proficiency in Python/R and libraries like Pandas, NumPy, Scikit-learn, TensorFlow, or PyTorch. Familiarity with SQL and database management. Strong understanding of data visualization tools . Knowledge of APIs and cloud platforms (AWS, GCP, or Azure) is a plus. Excellent problem-solving and analytical skills. Ability to work independently and as part of a team. Perks and Benefits: Remote work with flexible hours . Certificate of completion and Letter of Recommendation . Performance-based LinkedIn recommendations . Opportunity to work on real-world projects and enhance your portfolio . If you are passionate about data science and web scraping and eager to gain hands-on experience, we encourage you to apply! (recruitment@collegepur.com)
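As an illustration of the web-scraping responsibility above (a sketch only, not part of the internship description), a small requests + BeautifulSoup script; the URL and CSS selectors are hypothetical placeholders.

```python
# Sketch: scrape a hypothetical college-listings page with requests + BeautifulSoup
# and store the results with pandas. URL and CSS selectors are placeholders.
import requests
import pandas as pd
from bs4 import BeautifulSoup

URL = "https://example.com/colleges"  # hypothetical listings page

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

records = []
for card in soup.select("div.college-card"):           # hypothetical CSS class
    name = card.select_one("h2.name")
    city = card.select_one("span.city")
    records.append({
        "name": name.get_text(strip=True) if name else None,
        "city": city.get_text(strip=True) if city else None,
    })

df = pd.DataFrame(records)
df.to_csv("colleges.csv", index=False)
print(f"Scraped {len(df)} records")
```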
Posted 2 days ago
0 years
0 Lacs
Rajkot, Gujarat, India
On-site
Are you passionate about Artificial Intelligence and Machine Learning? Start your AI/ML career with hands-on learning, real projects, and expert guidance at TechXperts! Skills Required : Technical Knowledge: • Basic understanding of Python and popular ML libraries like scikit-learn, pandas, NumPy • Familiarity with Machine Learning algorithms (Regression, Classification, Clustering, etc.) • Knowledge of Data Preprocessing, Model Training, and Evaluation Techniques • Understanding of AI concepts such as Deep Learning, Computer Vision, or NLP is a plus • Familiarity with tools like Jupyter Notebook, Google Colab, or TensorFlow/Keras is an advantage Soft Skills : • Curiosity to explore and learn new AI/ML techniques • Good problem-solving and analytical thinking • Ability to work independently and in a team • Clear communication and documentation skills What You’ll Do: • Assist in building and training machine learning models • Support data collection, cleaning, and preprocessing activities • Work on AI-driven features in real-time applications • Collaborate with senior developers to implement ML algorithms • Research and experiment with AI tools and frameworks Why Join TechXperts? • Learn by working on live AI/ML projects • Supportive mentorship from experienced developers • Exposure to the latest tools and techniques • Friendly work culture and growth opportunities
Posted 2 days ago
12.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Job Description Automation Title: Data Architect Type of Employment: Permanent Overall Years Of Experience: 12-15 years Relevant Years Of Experience: 10+ Data Architect The Data Architect is responsible for designing and implementing data architecture for multiple projects and also building strategies for data governance. Position Summary 12-15 yrs of experience in a similar profile with a strong service delivery background. Experience as a Data Architect with a focus on Spark and Data Lake technologies. Experience in Azure Synapse Analytics. Proficiency in Apache Spark for large-scale data processing. Expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services. Strong understanding of data modeling, ETL processes, and data warehousing principles. Implement a data governance framework with Unity Catalog. Knowledge of designing scalable streaming data pipelines using Azure Event Hub, Azure Stream Analytics, and Spark Streaming. Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro. Hands-on experience in Python and relevant libraries such as PySpark, NumPy, etc. Knowledge of Machine Learning pipelines, GenAI, and LLMs is a plus. Excellent analytical, problem-solving, and technical leadership skills. Experience in integration with business intelligence tools such as Power BI. Effective communication and collaboration abilities. Excellent interpersonal skills and a collaborative management style. Own and delegate responsibilities effectively. Ability to analyse and suggest solutions. Strong command of verbal and written English. Essential Roles and Responsibilities Work as a Data Architect, able to design and implement data architecture for projects involving complex data such as Big Data and data lakes. Work with customers to define the strategy for data architecture and data governance. Guide the team to implement solutions around data engineering. Proactively identify risks and communicate them to stakeholders. Develop strategies to mitigate risks. Build best practices to enable faster service delivery. Build reusable components to reduce cost. Build scalable and cost-effective architecture. EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
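To make the Spark / Delta Lake stack this posting describes more concrete (an illustrative sketch only, not EY material), a small PySpark batch job that reads raw Parquet, applies basic cleansing, and writes a partitioned Delta table; the storage paths and column names are hypothetical, and a Spark session with Delta Lake enabled is assumed.

```python
# Sketch: batch-cleanse raw Parquet data with PySpark and write a Delta table.
# Paths and column names are hypothetical; assumes Delta Lake is configured on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_to_delta").getOrCreate()

raw = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/orders/")  # hypothetical path

clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("abfss://curated@datalake.dfs.core.windows.net/orders/"))  # hypothetical path
```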
Posted 2 days ago
12.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Job Description Automation Title: Data Architect Type of Employment: Permanent Overall Years Of Experience: 12-15 years Relevant Years Of Experience: 10+ Data Architect The Data Architect is responsible for designing and implementing data architecture for multiple projects and also building strategies for data governance. Position Summary 12-15 yrs of experience in a similar profile with a strong service delivery background. Experience as a Data Architect with a focus on Spark and Data Lake technologies. Experience in Azure Synapse Analytics. Proficiency in Apache Spark for large-scale data processing. Expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services. Strong understanding of data modeling, ETL processes, and data warehousing principles. Implement a data governance framework with Unity Catalog. Knowledge of designing scalable streaming data pipelines using Azure Event Hub, Azure Stream Analytics, and Spark Streaming. Experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro. Hands-on experience in Python and relevant libraries such as PySpark, NumPy, etc. Knowledge of Machine Learning pipelines, GenAI, and LLMs is a plus. Excellent analytical, problem-solving, and technical leadership skills. Experience in integration with business intelligence tools such as Power BI. Effective communication and collaboration abilities. Excellent interpersonal skills and a collaborative management style. Own and delegate responsibilities effectively. Ability to analyse and suggest solutions. Strong command of verbal and written English. Essential Roles and Responsibilities Work as a Data Architect, able to design and implement data architecture for projects involving complex data such as Big Data and data lakes. Work with customers to define the strategy for data architecture and data governance. Guide the team to implement solutions around data engineering. Proactively identify risks and communicate them to stakeholders. Develop strategies to mitigate risks. Build best practices to enable faster service delivery. Build reusable components to reduce cost. Build scalable and cost-effective architecture. EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 2 days ago
10.0 - 15.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
Remote
Req ID: 332901 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Advisor to join our team in Pune, Mahārāshtra (IN-MH), India (IN). The Business Intelligence Advisor will utilize analytical, statistical, and programming skills to collect, analyze, and interpret large-volume data sets and use this information to develop data-driven solutions for addressing difficult business challenges. Monitor, analyze, and report business performance, financial results, and other KPIs defined by the business. Responsibilities Analyze spending patterns, identify cost-savings opportunities, and provide actionable insights to improve decision-making. Collaborate with cross-functional teams, interpret data trends, and develop dashboards or reports to communicate findings. Understand user requirements and translate complex data into user-friendly reports, ensuring data accuracy. Create visually compelling and insightful reports and interactive dashboards, connecting Tableau or Power BI to SQL and Snowflake. Lead or participate in multiple analytical projects or ad-hoc analysis by completing and updating project documentation, managing project scope, adjusting schedules when necessary, determining daily priorities, and ensuring efficient and on-time delivery of project tasks and milestones. Perform exploratory data analysis (EDA) to uncover trends, patterns, and insights. Apply statistical techniques and advanced analytical methods to solve business problems. Utilize Python libraries such as Pandas, NumPy, and Matplotlib for data analysis, visualization, and modeling. Automate data processing and analysis workflows using Python. Perform spend analysis by gathering, cleansing, classifying, and transforming procurement spend data, and providing spend visibility to facilitate category and spend management. Data enrichment/gap filling, standardization, normalization, and categorization of spend data through research across different sources such as the internet, specific websites, databases, etc. Perform data quality checks and corrections; process, clean, and verify the integrity of data used for analysis. Stay updated on industry trends and optimize BI tools for efficient performance. Write optimized SQL & Snowflake queries for data extraction as well as integration with other applications. Design workflows in Alteryx Designer to develop models using data modeling techniques as per requirements. Create automated anomaly detection systems and continuously track their performance. Create and maintain the documentation of the architecture, data models, and maintenance activities. Drive continuous process improvement and efficiency gains using automation or other process standardization techniques. Technical Skills & Competencies Must have experience working on business analytics or spend analytics projects as well as handling day-to-day operational requests from the business. Ability to successfully manage multiple tasks at any given point; strong relationship-building and communication skills. High proficiency with Microsoft Excel. Visualization capabilities in Power BI / Tableau. Knowledge of the Alteryx Designer tool and Snowflake is preferred. Experience in requirements gathering and analysis and defining the implementation roadmap. Ability to work remotely with key stakeholders and business partners. Project coordination and management skills are preferred.
Self-motivated with a high degree of learning agility and a team player. Experience & Education Bachelor’s degree in information science, Computer Science, Mathematics, Statistics, or a quantitative discipline in science. Advanced degree preferred. Minimum 10-15 years of work experience in the fields of data management and analysis. At least 5 years of work experience in procurement data management, spend analysis, RFP/RPQ/quote analysis. Demonstrated experience with data architecture, data integration/ETL, data warehousing, and/or business intelligence deployed in a complex environment. Demonstrated experience in Python programming for data manipulation, analysis, and visualization. Prior experience working on Reporting/Visualization Tools such as Power BI, Tableau. Must have excellent presentation & communication (written and verbal) skills. Good research and logical skills. Strong data collection, consolidation, and cleansing skills. Ability to scope, plan and execute assigned projects in a fast-paced environment. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
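As an illustration of the spend-analysis and EDA work this NTT DATA posting describes (a sketch only, not NTT DATA material), a pandas snippet that cleanses a procurement extract and summarises spend by category and supplier; the file and column names are hypothetical.

```python
# Sketch: cleanse a procurement spend extract and summarise spend by category/supplier.
# File name and column names are hypothetical placeholders.
import pandas as pd

spend = pd.read_csv("procurement_spend.csv")  # hypothetical extract (e.g. from Snowflake)

# Basic cleansing: normalise supplier names, drop rows without an amount
spend["supplier"] = spend["supplier"].str.strip().str.upper()
spend = spend.dropna(subset=["amount"])

# Spend visibility: total spend and share by category
by_category = (
    spend.groupby("category")["amount"]
         .sum()
         .sort_values(ascending=False)
         .to_frame("total_spend")
)
by_category["share_pct"] = 100 * by_category["total_spend"] / by_category["total_spend"].sum()

# Largest suppliers as candidate cost-savings opportunities
top_suppliers = spend.groupby("supplier")["amount"].sum().nlargest(10)

print(by_category.head(10))
print(top_suppliers)
```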
Posted 2 days ago