1.0 - 3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

Associate Specialist/Analyst - Data Science

At our company we are leveraging analytics and technology as we invent for life on behalf of patients around the world. We are seeking people who have a passion for using data, analytics, and insights to drive the decision-making that will allow us to tackle some of the world's greatest health threats. Within our commercial Insights, Analytics, and Data organization, we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. As part of this effort, we are seeking a dynamic talent to serve in the role of Analyst - Data Science.

This role involves working with our partners across Therapeutic areas (e.g., Oncology, Vaccines, Pharma & Rare Disease) and Domain areas (HCP Analytics, Patient Analytics, Segmentation & Targeting, Market Access, etc.) to help create scalable, production-grade analytics solutions, ranging from data visualization and reporting to advanced statistical and AI/ML models. You will work in one of the three therapeutic areas of Brand Strategy and Performance Analytics (Oncology, Vaccines, or Pharma & Rare Disease), where you will play a pivotal role in leveraging your statistical and machine learning expertise to address critical business challenges and derive insights that drive key decisions. Working alongside experienced data scientists and business analysts, you will collaborate in translating business queries into analytical problems, employing your critical thinking, problem-solving, statistical, machine learning, and data visualization skills to deliver impactful solutions. We are seeking candidates with prior experience in the healthcare analytics or consulting sectors and prior hands-on experience in data science (building end-to-end ML models).
It is preferred that you have a good understanding of Physician- and Patient-Level Data (PLD) from leading vendors such as IQVIA, Komodo, and Optum. Familiarity with HCP analytics, PLD analytics (concepts like persistence, compliance, and line of therapy), or Segmentation & Targeting is highly desirable. You will be part of a dynamic team that collaborates with our partners across therapeutic areas. Effective communication skills are crucial, as this role requires interfacing with executive and business stakeholders.

Who You Are
- You understand the foundations of statistics and machine learning and can work in high-performance computing/cloud environments, with experience or knowledge across statistical analysis, machine learning, model development, data engineering, data visualization, and data interpretation
- You are self-motivated and have demonstrated the ability to think independently as a data scientist
- You structure your data science approach according to the task at hand, applying the appropriate level of model complexity to the problem
- You have an agile mindset of continuous learning and will focus on integrating enterprise value into team culture
- You are kind, collaborative, and capable of seeking and giving candid feedback that contributes to more seamless day-to-day execution of tasks

Key Responsibilities
- Understand the business requirements and support the manager in translating them into analytical problem statements.
- Implement the solution steps through SQL/Python and appropriate ML techniques without rigorous handholding.
- Follow technical requirements (datasets, business rules, technical architecture) and industry best practices in every task.
- Collaborate with cross-functional teams to design and implement solutions that meet business requirements.
- Present findings to US DS stakeholders in a clear and concise manner and address feedback.
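For context, the patient-level analytics concepts named above (persistence, line of therapy) are typically derived from longitudinal claims data. A minimal sketch of a gap-based persistence rule, assuming hypothetical fill dates, days-supply values, and a 15-day grace period (all illustrative, not the company's actual methodology):

```python
from datetime import date

# Hypothetical prescription fills for one patient: (fill_date, days_supply)
claims = [
    (date(2024, 1, 1), 30),
    (date(2024, 2, 2), 30),
    (date(2024, 5, 1), 30),  # long gap before this fill
]

def is_persistent(claims, grace_days=15):
    """A patient is 'persistent' if no gap between the end of one
    fill's supply and the next fill date exceeds the grace period."""
    claims = sorted(claims)
    for (d1, supply), (d2, _) in zip(claims, claims[1:]):
        gap = (d2 - d1).days - supply
        if gap > grace_days:
            return False
    return True

print(is_persistent(claims))  # False: the Feb-to-May gap breaks persistence
```

Real-world persistence and line-of-therapy logic varies by drug, indication, and vendor data model; this only shows the shape of the computation.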
- Adopt a continuous learning mindset, both technical and functional.
- Develop deep expertise in the therapeutic area, with a clear focus on commercial aspects.

Minimum Qualifications
- Bachelor's degree with at least 1-3 years of industry experience
- Strong Python/R, SQL, and Excel skills
- Strong foundations in statistics and machine learning

Preferred Qualifications
- Advanced degree in STEM (MS, MBA, PhD)
- 2-3 years' experience in healthcare analytics and consulting
- Familiarity with Physician- and Patient-Level Data (e.g., claims, electronic health records) and data from common healthcare data vendors (IQVIA, Optum, Komodo, etc.)
- Experience in HCP and patient-level data analytics (e.g., HCP segmentation & targeting, patient cohorts, knowledge of lines of therapy, persistency, compliance, etc.)
- Proficiency in data science concepts, Microsoft Excel and PowerPoint, and familiarity with Dataiku

Our Human Health Division maintains a "patient first, profits later" ideology. The organization comprises sales, marketing, market access, digital analytics, and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide. We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another's thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities.
All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Database Design, Data Engineering, Data Modeling, Data Science, Data Visualization, Machine Learning, Software Development, Stakeholder Relationship Management, Waterfall Model
Preferred Skills:
Job Posting End Date: 08/31/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.
Requisition ID: R336981
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
In this role, you will focus on developing data products and analytics solutions for major transformation programs across Macquarie. You will build and maintain strong, trusted stakeholder relationships, manage risk, and promote a collaborative team environment. With a focus on continuous improvement, you will deliver innovative and impactful data solutions.

At Macquarie, our advantage is bringing together diverse people and empowering them to shape all kinds of possibilities. We are a global financial services group operating in 31 markets, with 56 years of unbroken profitability. You'll be part of a friendly and supportive team where everyone, no matter what role, contributes ideas and drives outcomes.

Join our Post Trade Data team and be a part of our exciting data transformation initiative. We provide data solutions that facilitate effective data management, data-driven insights, and data automation within our Commodities and Global Markets group. You'll collaborate with business stakeholders to understand and fulfill various requirements.

A Bachelor's degree with relevant work experience ranging from 2 to 5 years is required. You should have exceptional analytical skills and clear, concise communication. You should also be able to interpret and wrangle data using SQL, Python, or tools like Dataiku and Alteryx, and to manage multiple tasks while delivering high-quality, timely results. Adaptability to evolving business needs is important, as is finance domain knowledge, especially in commodities, financial markets, and regulatory reporting.

If you're excited about the role or working at Macquarie, we encourage you to apply and be a part of Commodities and Global Markets, a global business offering capital and financing, risk management, market access, physical execution, and logistics solutions to its diverse client base across Commodities, Financial Markets, and Asset Finance.
Our aim is to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.
Posted 1 week ago
170.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary

Processes
- Good verbal and written communication skills
- Analytical and structured problem-solving skills
- Ability to learn and adapt to new technologies and frameworks
- Good programming and debugging skills
- Ability to handle raw and unstructured data
- Good understanding of the software development life cycle (Agile and Waterfall models)
- Understanding of coding standards
- Understanding of source control, versioning, branching, etc.
- Hands-on with Big Data toolsets such as Hadoop, HDFS, Hive, Spark, and Bash scripting
- Hands-on in SQL
- Hands-on with any reporting tool (e.g., Tableau, Dataiku, MSTR) is a plus
- Familiarity with Enterprise Data Warehouse and Reference Data Management is a plus
- Familiarity with Control-M (or another job orchestration tool) is a plus
- Familiarity with building ELT/ETL pipelines in Hadoop is a plus
- Familiarity with Azure DevOps is a plus

Key Responsibilities

Regulatory & Business Conduct
- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
- Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
- Lead to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Serve as a Director of the Board
- Exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent)

Key Stakeholders
- FCSO development teams and FCSO Business

Skills and Experience
- Hadoop
- Apache Hive
- PySpark
- SQL
- Azure DevOps
- Control-M

Qualifications
Education: Diploma/Degree

Competencies
Action Oriented; Collaborates; Customer Focus; Gives Clarity & Guidance; Manages Ambiguity; Develops Talent; Drives Vision & Purpose; Nimble Learning; Decision Quality; Courage; Instills Trust; Strategic Mindset
Technical Competencies: This is a generic competency to evaluate the candidate on role-specific technical skills and requirements.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together we:
- Do the right thing: are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle: continuously strive to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time-off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combined to 30 days minimum.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.

Recruitment Assessments
Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news. It means your application has progressed to an important stage of our recruitment process.
Visit our careers website www.sc.com/careers
Posted 1 week ago
6.0 - 8.0 years
5 - 9 Lacs
Noida
Work from Office
- 6-8 years of overall experience, with 3+ years in MLOps engineering.
- Strong proficiency in Python, with experience in machine learning libraries such as TensorFlow, PyTorch, scikit-learn, etc.
- Extensive experience with MLOps frameworks such as Dataiku (must have), Kubeflow, MLflow, and TensorFlow Extended (TFX).
- Strong experience in deploying and managing machine learning models in AWS environments.
- Proficiency in managing CI/CD pipelines for ML workflows using tools such as Jenkins, GitLab, CircleCI, etc.
- Hands-on experience with containerization (Docker) and orchestration (Kubernetes) technologies for model deployment.

Mandatory skills: SageMaker, Dataiku, Python, PySpark, AWS services. Good to have: AWS CDK.

Mandatory Competencies
- Data Science - Machine Learning (ML)
- Python
- Behavioural - Communication
- DevOps/Configuration Mgmt - Jenkins
- DevOps - GitHub
- DevOps - Kubernetes
- Big Data - PySpark
- Cloud - AWS (AWS Lambda, AWS EventBridge, AWS Fargate)
- Data Science and Machine Learning - AI/ML
- Data Science and Machine Learning - Python
- DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket
- DevOps/Configuration Mgmt - Containerization (Docker, Kubernetes)
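As an illustration of the kind of gating step such CI/CD pipelines for ML typically include, here is a minimal sketch of a metric-threshold check that a Jenkins or GitLab stage might run before promoting a model. The metric names and thresholds are hypothetical, not any specific team's standard:

```python
import json

# Hypothetical evaluation report produced by a training job
report = {"model": "churn-clf-v2", "auc": 0.81, "f1": 0.64}

# Promotion gate: minimum acceptable value for each tracked metric
THRESHOLDS = {"auc": 0.75, "f1": 0.60}

def gate(report, thresholds):
    """Return (passed, failures): the model is promoted only if every
    tracked metric meets its minimum threshold."""
    failures = {m: report.get(m, 0.0)
                for m, t in thresholds.items()
                if report.get(m, 0.0) < t}
    return (not failures, failures)

passed, failures = gate(report, THRESHOLDS)
print(json.dumps({"promote": passed, "failures": failures}))
```

In a real pipeline this check would exit non-zero on failure so the deploy stage is skipped; tools like MLflow or SageMaker Model Registry provide richer versions of the same idea.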
Posted 1 week ago
170.0 years
4 - 5 Lacs
Bengaluru
On-site
Job ID: 32698
Location: Bangalore, IN
Area of interest: Technology
Job type: Regular Employee
Work style: Office Working
Opening date: 21 Jul 2025

Job Summary

Processes
- Good verbal and written communication skills
- Analytical and structured problem-solving skills
- Ability to learn and adapt to new technologies and frameworks
- Good programming and debugging skills
- Ability to handle raw and unstructured data
- Good understanding of the software development life cycle (Agile and Waterfall models)
- Understanding of coding standards
- Understanding of source control, versioning, branching, etc.
- Hands-on with Big Data toolsets such as Hadoop, HDFS, Hive, Spark, and Bash scripting
- Hands-on in SQL
- Hands-on with any reporting tool (e.g., Tableau, Dataiku, MSTR) is a plus
- Familiarity with Enterprise Data Warehouse and Reference Data Management is a plus
- Familiarity with Control-M (or another job orchestration tool) is a plus
- Familiarity with building ELT/ETL pipelines in Hadoop is a plus
- Familiarity with Azure DevOps is a plus

Key Responsibilities

Regulatory & Business Conduct
- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
- Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
- Lead to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
- Serve as a Director of the Board
- Exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent)

Key Stakeholders
- FCSO development teams and FCSO Business

Skills and Experience
- Hadoop
- Apache Hive
- PySpark
- SQL
- Azure DevOps
- Control-M

Qualifications
Education: Diploma/Degree

Competencies
Action Oriented; Collaborates; Customer Focus; Gives Clarity & Guidance; Manages Ambiguity; Develops Talent; Drives Vision & Purpose; Nimble Learning; Decision Quality; Courage; Instills Trust; Strategic Mindset
Technical Competencies: This is a generic competency to evaluate the candidate on role-specific technical skills and requirements.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together we:
- Do the right thing: are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle: continuously strive to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time-off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combined to 30 days minimum.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.

Recruitment Assessments
Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news. It means your application has progressed to an important stage of our recruitment process.
Visit our careers website www.sc.com/careers
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Location: Pune, India
Experience: 5-7 Years
Employment Type: Full-Time
Work Mode: Onsite

About the Role:
We are seeking a Senior Data Engineer with extensive experience in Dataiku to join our expanding data engineering team. In this role, you will be responsible for designing, developing, and optimizing data pipelines, workflows, and automation using Dataiku's capabilities. You will work closely with data scientists, analysts, and business stakeholders to enable advanced analytics and deliver scalable, production-ready data solutions.

Key Responsibilities:
- Design and Develop Data Pipelines: Build, manage, and optimize scalable data pipelines using Dataiku for data ingestion, transformation, and delivery.
- Leverage Dataiku Features: Utilize the full capabilities of Dataiku, including flow design, visual and code recipes, scenarios, metrics, and plugins, to support end-to-end data solutions.
- Workflow Automation: Develop and automate workflows and repeatable processes to support analytics and machine learning use cases.
- ETL/ELT Development: Design and build ETL/ELT pipelines using Dataiku, SQL, Python, and related technologies.
- Model Integration Support: Collaborate with data science teams to integrate and deploy models as part of Dataiku flows.
- Data Governance & Quality: Ensure high data quality, consistency, lineage, and compliance with data governance policies.
- Performance Optimization: Monitor, troubleshoot, and tune data workflows for performance and scalability.
- Documentation & Best Practices: Maintain clear documentation of workflows, design decisions, and engineering standards within Dataiku projects.

Required Skills and Qualifications:
- 5-7 years of experience as a Data Engineer with strong hands-on expertise in Dataiku.
- Proven ability to build complex data workflows and solutions using visual and code-based components in Dataiku.
- Strong proficiency in SQL and Python for data processing, cleaning, and transformation.
- Familiarity with cloud environments (AWS, Azure, or GCP) and experience integrating cloud-based data sources with Dataiku.
- Deep understanding of data modeling, ETL/ELT concepts, and pipeline orchestration.
- Experience working with large-scale datasets and building high-performance data solutions.
- Knowledge of Git or other version control tools.

Nice to Have:
- Exposure to MLOps practices and integrating machine learning workflows in Dataiku.
- Experience with Airflow, Docker, or other orchestration/container tools.
- Familiarity with data warehouse platforms such as Snowflake, BigQuery, or Redshift.
- Knowledge of data governance tools or frameworks.
- Dataiku certifications or experience in enterprise-grade Dataiku implementations.

Apply now at tisha.goyal@rsquaresoft.com
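For readers unfamiliar with the "code recipes" this role mentions: inside Dataiku, a Python code recipe is essentially a table-in/table-out transformation with data-quality rules applied in between. A minimal stdlib sketch of that shape, using a hypothetical orders extract (the column names and the rejection rule are illustrative, not Dataiku's API):

```python
import csv
import io

# Hypothetical raw extract, as a code recipe's input dataset might look
raw = """order_id,amount,currency
1001,250.0,USD
1002,,USD
1003,99.5,EUR
"""

def transform(rows):
    """Drop rows with missing amounts and cast to a cleaned schema,
    mirroring a typical prepare-style pipeline step."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # data-quality rule: reject incomplete records
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "currency": r["currency"]})
    return out

cleaned = transform(csv.DictReader(io.StringIO(raw)))
print(len(cleaned))  # 2 rows survive the quality filter
```

In Dataiku itself the input and output would be `dataiku.Dataset` objects and the flow would track lineage between them; the transformation logic in the middle is what the engineer writes.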
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities:
- Design, develop, extend, and maintain end-to-end data workflows and pipelines in Dataiku DSS.
- Collaborate with data scientists and analysts to operationalize machine learning models.
- Leverage Generative AI models and tools within Dataiku to build advanced AI-powered applications and analytics solutions.
- Integrate Dataiku with various data sources (databases, cloud storage, APIs).
- Develop and optimize SQL queries and Python/R scripts for data extraction and transformation across relational and NoSQL databases.
- Work extensively with cloud data warehouses such as Amazon Redshift and/or Snowflake for data ingestion, transformation, and analytics.
- Implement automation and scheduling of data workflows.
- Monitor and troubleshoot data pipelines to ensure data quality and reliability.
- Document technical solutions and best practices for data processing and analytics.

Required Skills and Qualifications:
- Proven experience of 4+ years working with Dataiku Data Science Studio (DSS) in a professional environment.
- Strong knowledge of data engineering concepts and ETL/ELT processes.
- Proficiency in Python and/or R for data manipulation and automation.
- Solid SQL skills and experience with relational databases (e.g., MySQL, PostgreSQL, Oracle).
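The SQL-plus-Python extraction pattern this posting describes can be sketched with the standard library's sqlite3 driver; the same parameterized-query shape carries over to MySQL, PostgreSQL, or Oracle drivers. Table and column names here are illustrative:

```python
import sqlite3

# In-memory database standing in for a source system
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, kind TEXT, ts TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    (1, "login", "2024-06-01"),
    (1, "purchase", "2024-06-02"),
    (2, "login", "2024-06-02"),
])

# Parameterized extraction query: the basic building block of an ETL pull.
# Placeholders (?) keep the query safe and reusable across parameters.
rows = conn.execute(
    "SELECT user_id, COUNT(*) AS n FROM events "
    "WHERE kind = ? GROUP BY user_id ORDER BY user_id",
    ("login",),
).fetchall()
print(rows)  # [(1, 1), (2, 1)]
```

A scheduled pipeline would wrap this in incremental filters (e.g., on `ts`) and write the result to the target warehouse rather than printing it.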
Posted 1 week ago
8.0 years
4 - 8 Lacs
Bengaluru
On-site
Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description

Roles and responsibilities
- The ideal candidate should have strong communication skills to effectively engage with both technical and non-technical stakeholders.
- The candidate will be responsible for developing end-to-end task/workflow automation pipelines using Python across a wide range of data sources (both on-premise and cloud).
- The candidate should have strong working experience in transforming Excel-based manual processes into fully automated Python-based processes, incorporating strong governance around them.
- The person should be competent in Python programming and possess a high level of analytical skill: data pre-processing, engineering, data pipeline development and automation.
- In-depth knowledge of libraries such as pandas, NumPy, scikit-learn, openpyxl, pyxlsb, TensorFlow, PyTorch, etc.
- Well-versed in Python coding standards and formatting conventions to ensure maintainable, scalable, and reusable modules.
- Build and automate workflows using Microsoft Power Platform (Power BI, Power Apps, Power Automate).
- Integrate systems and automate workflows using WTW Unify.
- Knowledge of Dataiku is a plus.
- Apply GenAI techniques to enhance data exploration, automate content generation, and support decision-making.
- Ensure data quality, governance, and compliance with organizational standards.
- Well-versed with CI/CD pipelines using Azure DevOps (ADO) for seamless deployment and integration of data science solutions.
- Experience working on Posit Workbench and Posit Connect will be an added advantage.
- Stay updated with the latest trends in AI, machine learning, and data engineering.

Tools/Tech experience
- Mandatory: Python (data processing, engineering & automation), SQL, proficiency with version control systems like ADO/Bitbucket
- Preferred: R programming, Posit Workbench, R Shiny
- Experience processing large amounts of data using Big Data technologies is preferred
- Familiarity with Microsoft Power Platform tools
- Familiarity with the WTW Unify platform and its applications in analytics
- Knowledge of Generative AI models and frameworks (e.g., GPT, DALL·E, Llama)
- Knowledge of data visualization tools and techniques is a plus

Functional/Other expertise
- Relevant experience: 8+ years of experience using the Python programming language for end-to-end data pre-processing, transformation and automation
- Experience in the Insurance domain preferred (e.g., Finance, Actuarial)

Qualifications
Educational Qualification: Masters in Statistics/Mathematics/Economics/Econometrics from Tier 1 institutions, or BE/B-Tech, MCA or MBA from Tier 1 institutions
Posted 1 week ago
10.0 years
20 - 25 Lacs
India
Remote
Job Title: Senior Data Engineer - Dataiku
Experience: 6-10 Years
Location: [Remote]
Employment Type: [Contract]
Shift Timing: [General]
Immediate Joiners Preferred

Role Overview
We are seeking a highly skilled and motivated Senior Data Engineer with hands-on experience in Dataiku, advanced data modeling, ETL/ELT processes, and Python/SQL development. The ideal candidate will have a strong data engineering foundation, exposure to cloud platforms, and a working understanding of Generative AI concepts. This role is pivotal in designing and building robust, scalable, production-grade data solutions.

Key Responsibilities
- Leverage Dataiku to build end-to-end data pipelines, prepare and transform data, and deliver insightful visualizations and analytics.
- Design and implement scalable data models using best practices such as dimensional modeling (Kimball/Inmon methodologies).
- Develop and maintain ETL/ELT workflows using Dataiku, and optionally tools like Apache Airflow, Talend, or SSIS.
- Integrate and process large datasets from diverse sources into cloud-based environments (AWS/Azure).
- Write robust, optimized Python scripts and complex SQL queries to automate data workflows and support analytics.
- Collaborate with cross-functional teams to understand data requirements and translate them into efficient data architecture.
- Explore and prototype Gen AI and LLM Mesh frameworks to enhance data engineering capabilities.
- Follow best practices for data quality, governance, and documentation.

Required Skills & Experience
- Proficiency in Dataiku for pipeline creation, data transformation, and analytics workflows.
- Strong expertise in data modeling techniques (e.g., star schema, snowflake schema, normalized/denormalized models).
- Hands-on experience with ETL/ELT processes and tools (Dataiku, Airflow, Talend, SSIS, etc.).
- Solid programming knowledge in Python and strong SQL skills.
- Experience working with cloud platforms such as AWS or Azure (e.g., S3, EC2, Data Lake, Synapse).
- Familiarity with LLM Mesh or similar Gen AI frameworks.
- Understanding of Generative AI use cases in data engineering.
- Strong problem-solving and debugging capabilities.
- Excellent communication and stakeholder collaboration skills.

Nice to Have
- Experience with big data technologies like Apache Spark, Hadoop, or Snowflake.
- Understanding of data governance and data security principles.
- Familiarity with MLOps tools and workflows.
- Contributions to open-source data engineering or AI/ML projects.

Skills: data, data engineering, dataiku, python, etl/elt, generative ai, cloud platforms (aws, azure), data modeling, stakeholder collaboration, etl, aws, cloud, problem-solving, sql, modeling, analytics
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Description WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.
Job Description
Roles and responsibilities
• The ideal candidate should have strong communication skills to effectively engage with both technical and non-technical stakeholders.
• The candidate will be responsible for developing end-to-end task/workflow automation pipelines in Python using a wide range of data sources (both on-premise and cloud).
• The candidate should have strong working experience in transforming Excel-based manual processes into fully automated Python-based processes, incorporating strong governance around them.
• The person should be competent in Python programming and possess high levels of analytical skill: data pre-processing, engineering, data pipeline development and automation.
• In-depth knowledge of libraries such as pandas, NumPy, scikit-learn, openpyxl, pyxlsb, TensorFlow, PyTorch, etc.
• Well-versed in Python coding standards and formatting conventions to ensure maintainable, scalable, and reusable modules.
• Build and automate workflows using Microsoft Power Platform (Power BI, Power Apps, Power Automate).
• Integrate systems and automate workflows using WTW Unify; knowledge of Dataiku is a plus.
• Apply GenAI techniques to enhance data exploration, automate content generation, and support decision-making.
• Ensure data quality, governance, and compliance with organizational standards.
• Well-versed in CI/CD pipelines using Azure DevOps (ADO) for seamless deployment and integration of data science solutions.
• Experience working on Posit Workbench and Posit Connect will be an added advantage.
• Stay updated with the latest trends in AI, machine learning, and data engineering.
Tools/Tech experience
• Mandatory: Python (data processing, engineering and automation), SQL; proficiency with version control systems like ADO/Bitbucket.
• Preferred: R programming, Posit Workbench, R Shiny.
• Experience processing large amounts of data using Big Data technologies is preferred.
• Familiarity with Microsoft Power Platform tools.
• Knowledge of Dataiku is a plus.
• Familiarity with the WTW Unify platform and its applications in analytics.
• Knowledge of Generative AI models and frameworks (e.g., GPT, DALL·E, Llama).
• Knowledge of data visualization tools and techniques is a plus.
Functional/Other expertise
• Relevant experience: 8+ years of experience using the Python programming language for end-to-end data pre-processing, transformation and automation.
• Experience in the Insurance domain preferred (e.g. Finance, Actuarial).
Qualifications Educational Qualification: Masters in Statistics/Mathematics/Economics/Econometrics from Tier 1 institutions, or BE/B-Tech, MCA or MBA from Tier 1 institutions
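As a hedged illustration of the Excel-to-Python automation this role describes, the sketch below replaces a manual spreadsheet aggregation with a validated pandas step. The column names and the governance check are illustrative assumptions, not part of the posting; in production the frame would come from pd.read_excel(...) or a cloud source rather than being built inline.

```python
import pandas as pd

def validate_and_summarize(df: pd.DataFrame) -> pd.DataFrame:
    """Replace a manual Excel step: validate inputs, then aggregate.

    Fails early (a simple governance check) instead of silently
    producing bad numbers the way an unchecked spreadsheet might.
    """
    required = {"region", "amount"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    if df["amount"].isna().any():
        raise ValueError("null amounts found; refusing to aggregate")
    return (
        df.groupby("region", as_index=False)["amount"]
          .sum()
          .sort_values("region", ignore_index=True)
    )

# Hypothetical input frame; real pipelines would load it from Excel,
# a database, or a cloud data source.
raw = pd.DataFrame(
    {"region": ["East", "West", "East"], "amount": [100.0, 250.0, 50.0]}
)
summary = validate_and_summarize(raw)
```

The same function can then be scheduled (e.g. from a Power Automate or cron trigger) so the spreadsheet step disappears entirely.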
Posted 1 week ago
6.0 years
0 Lacs
India
On-site
🚀 We're Hiring | Machine Learning Engineer | 3–6 Years Experience | Noida / Hyderabad (Hybrid) Are you an experienced ML Engineer ready to hit the ground running? Xebia is hiring for a Machine Learning Engineer role – immediate joiners or candidates with ≤ 2 weeks notice only. 🔍 What We're Looking For: Proven experience with AWS Machine Learning Services (e.g., SageMaker, AWS ML Suite) Solid understanding of ML model lifecycle , evaluation techniques & real-world deployment Proficiency in Python and libraries such as Pandas, NumPy, Scikit-learn Experience with SaaS-based ML platforms (like Dataiku, Indico, H2O.ai or similar) Working knowledge of AWS Data Engineering tools – S3, Glue, Athena, Lambda Strong track record of delivering end-to-end ML solutions 📍 Location: Noida or Hyderabad Hybrid – 3 days per week from office ⚠️ Note: Apply only if you are an Immediate Joiner or can join within 2 weeks. 📩 To Apply: Send your updated CV along with the below details to: 📧 vijay.s@xebia.com Full Name Total Experience Current CTC Expected CTC Current Location Preferred Xebia Location (Noida / Hyderabad) Notice Period / Last Working Day (if serving) Primary Skills LinkedIn Profile URL #Xebia #HiringNow #MachineLearning #AWSJobs #SageMaker #DataScience #Python #HybridJobs #ImmediateJoinersOnly #MLJobs #NoidaJobs #HyderabadJobs
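As a quick, hedged illustration of the "evaluation techniques" called out in the posting above, here is a dependency-free sketch of precision and recall computed by hand; in real work scikit-learn's metrics module provides these, and the sample labels below are invented for the example.

```python
def precision_recall(y_true, y_pred, positive=1):
    """Compute precision and recall for one positive class by hand.

    Illustrative only -- sklearn.metrics.precision_score and
    recall_score are what you would use in practice.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Invented labels: 3 positives, one missed (index 2), one false alarm (index 4).
p, r = precision_recall([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

Knowing what these metrics mean at this level is exactly the kind of "ML model lifecycle and evaluation" understanding interviewers probe for.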
Posted 1 week ago
12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Introduction: A Career at HARMAN - Harman Tech Solutions (HTS) We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions. Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs. Empower the company to create new digital business models, enter new markets, and improve customer experiences. About The Role You will be responsible for driving the strategic direction of our AI and machine learning practice along with other key leaders. This role will involve leading internal AI/ML projects, shaping the technology roadmap, and overseeing client-facing projects, including solution development, RFPs, presentations, analyst interactions, partnership development, etc. The ideal candidate will have a strong technical background in AI/ML, exceptional leadership skills, and the ability to balance internal and external project demands effectively. In this strategic role, you will be responsible for shaping the future of AI/ML within our organization, driving innovation, and ensuring the successful implementation of AI/ML solutions that deliver tangible business outcomes. What You Will Do Drive Innovation, Differentiation & Growth Develop and implement a comprehensive AI/ML strategy aligned with our business goals and objectives. Take ownership of the growth of the COE and influence client revenues through the AI practice. Identify and prioritize high-impact opportunities for applying AI/ML across various business units, departments and functions. Lead the selection, deployment, and management of AI/ML tools, platforms, and infrastructure. 
Oversee the design, development, and deployment of AI/ML solutions. Define, differentiate & strategize new AI/ML services/offerings and create reference architecture assets. Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc. Guide and inspire the organization about the business potential and opportunities around AI/ML. Network with domain experts. Develop and implement ethical AI practices and governance standards. Monitor and measure the performance of AI/ML initiatives, demonstrating ROI through cost savings, efficiency gains, and improved business outcomes. Oversee the development, training, and deployment of AI/ML models and solutions. Collaborate with client teams to understand their business challenges and needs. Develop and propose AI/ML solutions tailored to client-specific requirements. Influence client revenues through innovative solutions and thought leadership. Lead client engagements from project initiation to deployment. Build and maintain strong relationships with key clients and stakeholders. Build re-usable Methodologies, Pipelines & Models Create data pipelines for more efficient and repeatable data science projects. Experience working across multiple deployment environments including cloud, on-premises and hybrid, multiple operating systems, and through containerization techniques such as Docker, Kubernetes, AWS Elastic Container Service, and others. Coding knowledge and experience in languages including R, Python, Scala, MATLAB, etc. Experience with popular databases including SQL, MongoDB and Cassandra. Experience with data discovery/analysis platforms such as KNIME, RapidMiner, Alteryx, Dataiku, H2O, Microsoft AzureML, Amazon SageMaker, etc. Expertise in solving problems related to computer vision, text analytics, predictive analytics, optimization, social network analysis, etc. 
Experience with regression, random forest, boosting, trees, hierarchical clustering, transformers, convolutional neural networks (CNN), recurrent neural networks (RNN), graph analysis, etc. People & Interpersonal Skills Build and manage a high-performing team of AI/ML engineers, data scientists, and other specialists. Foster a culture of innovation and collaboration within the AI/ML team and across the organization. Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment. Candidates should be confident, energetic self-starters, with strong communication skills. Candidates should exhibit superior presentation skills and the ability to present compelling solutions which guide and inspire. Provide technical guidance and mentorship to the AI/ML team. Collaborate with other directors, managers, and stakeholders across the company to align the AI/ML vision and goals. Communicate and present the AI/ML capabilities and achievements to clients and partners. Stay updated on the latest trends and developments in the AI/ML domain. What You Need 12+ years of experience in the information technology industry with a strong focus on AI/ML, having led, driven and set up an AI/ML practice in IT services or niche AI/ML organizations. 10+ years of relevant experience in successfully launching, planning, and executing advanced data science projects. A master’s or PhD degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics. In-depth specialization in text analytics, image recognition, graph analysis, or deep learning is required. The candidate should be adept in agile methodologies and well-versed in applying MLOps methods to the construction of ML pipelines. The candidate should have demonstrated the ability to manage data science projects and diverse teams. 
Should have experience in creating AI/ML strategies & services, and scaling capabilities from a technology, platform, and people standpoint. Experience in working on proposals, presales activities, business development and overseeing delivery of AI/ML projects. Experience in building solutions with AI/ML elements in any one or more domains – Industrial, Healthcare, Retail, Communication. Be an accelerator to grow the practice through technologies, capabilities, and teams, both organically as well as inorganically. What We Offer Access to employee discounts on world class HARMAN/Samsung products (JBL, Harman Kardon, AKG etc.) Professional development opportunities through HARMAN University’s business and leadership academies. Flexible work schedule with a culture encouraging work life integration and collaboration in a global environment. An inclusive and diverse work environment that fosters and encourages professional and personal development. Tuition reimbursement. “Be Brilliant” employee recognition and rewards program. What Makes You Eligible Be willing to travel up to 25%, domestic and international, if required. Successfully complete a background investigation as a condition of employment. You Belong Here HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want. About HARMAN: Where Innovation Unleashes Next-Level Technology Ever since the 1920s, we’ve been amplifying the sense of sound. 
Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today!
Posted 1 week ago
7.0 years
0 Lacs
Greater Chennai Area
On-site
Job description: Job Description Mandatory skills: • Hands-on experience in Dataiku. • Good Python coding, SQL and Git skills. • Proven experience as a Data Engineer, in Data Integration, or as a Data Analyst. JD: • 7+ years of experience, including 2+ years of experience delivering projects on the Dataiku platform. • Proficiency in configuring and optimizing Dataiku’s architecture, including data connections, security settings and workflow management. • Hands-on experience with Dataiku recipes, Designer nodes, API nodes & Automation nodes, including deployment. • Expertise in Python scripting, automation and development of custom workflows in Dataiku. • Collaborate with data analysts, business stakeholders and the client to gather and understand requirements. • Contribute to developments in the Dataiku environment, applying data integration with the given logic to fulfil bank regulatory requirements and other customer requirements. • Gather, analyse and interpret requirement specifications received directly from the client. • Ability to work independently and effectively in a fast-paced, dynamic environment. • Strong analytical and problem-solving skills. • Familiarity with agile development methodologies. • Participate in the CR/Production deployment implementation process through Azure DevOps. Mandatory Skills: Dataiku. Experience: 5-8 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
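For readers new to the Dataiku recipes mentioned in this posting: a Python recipe typically wraps ordinary pandas logic between a dataset read and a dataset write. The sketch below shows the shape of such a step; the Dataiku I/O calls (dataiku.Dataset(...).get_dataframe() and write_with_schema()) are noted in comments but omitted so the example runs standalone, and the column names are illustrative assumptions.

```python
import pandas as pd

def recipe_step(df: pd.DataFrame) -> pd.DataFrame:
    """Body of a hypothetical Dataiku Python recipe.

    Inside Dataiku the input frame would come from
    dataiku.Dataset("input").get_dataframe(), and the result would be
    written back with dataiku.Dataset("output").write_with_schema(out).
    """
    out = df.copy()
    # Derived column: convert balances to EUR using a per-row FX rate.
    out["balance_eur"] = out["balance"] * out["fx_rate"]
    # Keep only rows with a positive converted balance.
    return out[out["balance_eur"] > 0].reset_index(drop=True)

# Illustrative input standing in for the recipe's input dataset.
result = recipe_step(
    pd.DataFrame({"balance": [100.0, -5.0], "fx_rate": [0.9, 0.9]})
)
```

Keeping the transformation in a plain function like this also makes the recipe logic unit-testable outside the Dataiku flow.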
Posted 1 week ago
7.0 - 12.0 years
10 - 14 Lacs
Gurugram
Work from Office
Company Overview Incedo is a US-based consulting, data science and technology services firm with over 3000 people helping clients from our six offices across US, Mexico and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in telecom, Banking, Wealth Management, product engineering and life science & healthcare industries. Working at Incedo will provide you an opportunity to work with industry leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities starting with a structured onboarding program and carrying throughout various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect or a domain expert based on your skills and interests. Our Mission is to enable our clients to maximize business impact from technology by Harnessing the transformational impact of emerging technologies Bridging the gap between business and technology Role Description As an AWS Data Engineer, your role will be to design, develop, and maintain scalable data pipelines on AWS. You will work closely with technical analysts, client stakeholders, data scientists, and other team members to ensure data quality and integrity while optimizing data storage solutions for performance and cost-efficiency. This role requires leveraging AWS native technologies and Databricks for data transformations and scalable data processing. Technical Skills Responsibilities Lead and support the delivery of data platform modernization projects. 
Design and develop robust and scalable data pipelines leveraging AWS native services. Optimize ETL processes, ensuring efficient data transformation. Migrate workflows from on-premise to AWS cloud, ensuring data quality and consistency. Design automations and integrations to resolve data inconsistencies and quality issues. Perform system testing and validation to ensure successful integration and functionality. Implement security and compliance controls in the cloud environment. Ensure data quality pre- and post-migration through validation checks and addressing issues regarding completeness, consistency, and accuracy of data sets. Collaborate with data architects and lead developers to identify and document manual data movement workflows and design automation strategies. Nice-to-have Skills and Qualifications 7+ years of experience with a core data engineering skillset leveraging AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift). Experience in the design and development of robust and scalable data pipelines leveraging AWS native services. Proficiency in leveraging Snowflake for data transformations, optimization of ETL pipelines, and scalable data processing. Experience with streaming and batch data pipeline/engineering architectures. Familiarity with DataOps concepts and tooling for source control and setting up CI/CD pipelines on AWS. Hands-on experience with Databricks and a willingness to grow capabilities. Experience with data engineering and storage solutions (AWS Glue, EMR, Lambda, Redshift, S3). Strong problem-solving and analytical skills. Knowledge of Dataiku is needed. Graduate/Post-Graduate degree in Computer Science or a related field.
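One concrete form of the pre- and post-migration data-quality validation this role describes is a simple reconciliation between source and target extracts. In the sketch below, plain Python lists of dicts stand in for what would be Athena/Redshift query results in a real AWS pipeline, and the key name is an assumption for illustration.

```python
def reconcile(source_rows, target_rows, key):
    """Post-migration validation sketch: compare row counts and key
    sets between a source extract and the migrated target, returning
    a list of human-readable issues (empty means the check passed)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: {len(source_rows)} vs {len(target_rows)}"
        )
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    if src_keys - tgt_keys:
        issues.append(f"missing in target: {sorted(src_keys - tgt_keys)}")
    if tgt_keys - src_keys:
        issues.append(f"unexpected in target: {sorted(tgt_keys - src_keys)}")
    return issues

# Illustrative extracts: the target is missing one row.
problems = reconcile(
    [{"id": 1}, {"id": 2}, {"id": 3}],
    [{"id": 1}, {"id": 2}],
    key="id",
)
```

Checks like this are cheap to run after every migration batch and catch completeness issues before they reach downstream consumers.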
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Ahmedabad
Work from Office
Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters Do 1. Instrumental in understanding the requirements and design of the product/software Develop software solutions by studying information needs, systems flow, data usage and work processes Investigate problem areas following the software development life cycle Facilitate root cause analysis of system issues and problem statements Identify ideas to improve system performance and impact availability Analyze client requirements and convert requirements to feasible design Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Confer with project managers to obtain information on software capabilities 2. Perform coding and ensure optimal software/module development Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. 
Analyzing information to recommend and plan the installation of new systems or modifications of an existing system Ensuring that code is error-free, with no bugs or test failures Preparing reports on programming project specifications, activities and status Ensuring all defects are raised as per the norm defined for the project / program / account, with a clear description and replication patterns Compiling timely, comprehensive and accurate documentation and reports as requested Coordinating with the team on daily project status and progress and documenting it Providing feedback on usability and serviceability, tracing the result to quality risk and reporting it to concerned stakeholders 3. Status Reporting and Customer Focus on an ongoing basis with respect to the project and its execution Capturing all the requirements and clarifications from the client for better quality work Taking feedback on a regular basis to ensure smooth and on-time delivery Participating in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consulting with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Documenting and demonstrating solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code Documenting all necessary details and reports in a formal way for proper understanding of the software, from client proposal to implementation Ensuring good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc. Timely response to customer requests and no instances of complaints either internally or externally Mandatory Skills: Dataiku. Experience: 5-8 Years.
Posted 2 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary Work as a Senior Developer for a Strategic Tax Reporting application under Finance Technology. The individual will be responsible for end-to-end development, testing and implementation of data solutions using the Dataiku tool, with Python, Spark, Hadoop and Hive as the core programming languages and frameworks, to develop data products and APIs and integrate with other applications within the Bank, leveraging DevOps for Continuous Integration and Continuous Delivery. Project Background As part of the Strategic Tax Reporting solution, the aim is to provide automated tax provision calculations for Group financial reporting and local statutory reporting, and to provide inputs for the global tax returns compliance process. The strategic reporting tool will use a vendor product called “Long View” as a tax calculation engine and will integrate with Enterprise ASPIRE and EDM infrastructure to provide the required automation capabilities. To enable the integration between SAP S4 HANA and Long View, the interim architecture will use Dataiku to invoke S4 APIs to retrieve the GL data required for tax reporting by Long View and other tax processes as appropriate. Key Responsibilities Business Responsible for end-to-end development, with tasks not limited to but covering areas such as analysis, design, development, data management, DevOps integration using pipeline build and maintenance, Level 3 support, issue analysis and debugging, performance tuning, configuration management, automation, monitoring, etc. Processes Ensure adherence to the change & incident management process by coordinating and working with other teams in the organization to get a release deployed into production. People and Talent Effectively uses teamwork to positively contribute to a high-morale, high-performance team culture, leading by example. 
A consulting attitude: approachable and ready to offer insights or assistance Should be results-oriented and have a positive attitude Effective team player and collaborator Strong personal integrity Strong written and verbal communication skills Can effectively communicate business and technical information across the organization, being sensitive to the needs of unique audiences Risk Management Work in coordination with Production Support & SRE to maintain the stability of production applications via controlled, automated deployments with minimal outages and impact Able to collaborate with relevant IT teams and business users, whether product owners or end users, to manage problem resolution effectively Work cross-functionally and think both critically and strategically Process and Governance Document detailed unit test results, application functionality, design, etc. in Confluence and JIRAs with supporting evidence & test cases Embrace the DevOps methodology for Dev, Build and Deploy and work in an Agile delivery model. Coordinate UAT and system integration testing. Answer user queries and support UAT/SIT/PT & other related activities Provide updates to the project / program manager with regards to progress and issues, where appropriate Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. 
Skills And Experience Overall 8+ years of total experience in IT, preferably in the Banking or Financial Services domain 4+ years of experience in software development using Python and Spark, with strong experience in PySpark and experience developing big data solutions using Hadoop, HDFS, Hive Experience in delivering major projects and programs against schedule Experience working on cloud-native infrastructure with exposure to AWS or Azure Good experience and mindset towards DevSecOps, with exposure to tools like Jenkins, Maven, Artifactory, Gradle, Ansible, Shell scripting Experience with agile frameworks (e.g. Scrum, Kanban, Lean) and toolsets (Git, JIRA, Confluence) Experience developing APIs, and managing, deploying and integrating applications Knowledge of Java and JavaScript is good to have. Non-technical Skills Proven ability to work within a team environment Ability to work with multiple tasks based on priorities and to switch between deliverables. Highly effective verbal and written English communication & presentation skills. Ability to quickly understand and articulate problems and solutions Ability to make good, sound decisions and use independent judgement. Strong reasoning, analytical and interpersonal skills. Excellent attention to detail and time management. Good knowledge of Agile practices Role Specific Technical Competencies Python Spark Linux scripting and Ansible Gradle, Maven, Jenkins, Docker, Artifactory and other DevOps tools Jira, Confluence and Agile Java, JavaScript Hadoop/Hive About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. 
You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. 
Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
Posted 2 weeks ago
3.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary We are seeking highly skilled and experienced Marketing Cloud testing professionals to join our Marketing Automation team, which works closely with brand teams; understands various data sources; is adept in building data ingestion pipelines; and is skilled in testing end-to-end data ingestion layers, data models and visualization dashboards based on previously built test scripts. About The Role Key Responsibilities: Build e2e test scripts for each release based on user epics across the data value chain – ingestion, data model and visualization Post development, run the test scripts using any of the testing platforms, viz. Proton etc. Document results, highlight any bugs / errors to the development team and work closely with the development team to resolve the issues Audit technical developments and solutions and validate matching of source data with MCI Additional responsibilities may include creating and updating knowledge documents in the repository as needed. Work closely with the Technical Lead and Business Analysts to help design the testing strategy and testing design as part of pre-build activities Participate in data exploration and data mapping activities, along with the technical lead and business and DDIT architects, for any new data ingestion needs from business, along with the development team Build and maintain standard SOPs to run smooth operations that enable proper upkeep of visualization data and insights Qualifications Minimum of 3-4 years of experience in Dataroma / MCI as a hands-on developer Prior experience in any of the visualization platforms, viz. Tableau, Qlik, Power BI, as a core developer is a plus Experience of working on Data Cloud and other data platforms is a plus Hands-on experience in using any ETL tools such as Informatica, Alteryx, Dataiku preferred Prior experience in testing automation platforms preferred Excellent written and verbal skills. Strong interpersonal and analytical skills Ability to provide efficient, timely, reliable, and courteous service to customers. 
Ability to effectively present information. Demonstrated knowledge of the Data Engineering & Business Intelligence ecosystem. Salesforce MCI certification. Familiarity with AppExchange deployment, Flow, Aura components and Lightning Web Components will be a plus. Commitment To Diversity And Inclusion Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve. Accessibility And Accommodation Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Posted 2 weeks ago
1.0 - 2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Morgan Stanley Model Risk Process Validation Group - Analyst Profile Description We’re seeking someone to join our team as an Analyst in the Model Risk Process Validation Group. Firm Risk Management In the Firm Risk Management division, we advise businesses across the Firm on risk mitigation strategies, develop tools to analyze and monitor risks and lead key regulatory initiatives. Company Profile Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. Since 1935, Morgan Stanley has been known as a global leader in financial services, always evolving and innovating to better serve our clients and our communities in more than 40 countries around the world. Primary Responsibilities What you’ll do in the role: Perform independent validations of select FRM processes and controls, including those relating to regulatory and Basel requirements Support execution of reviews (e.g., planning, documenting, reporting) and continuous monitoring activities (e.g., risk assessments) Contribute to improving the team's validation methodology and execution capabilities; Interface with key stakeholders, governing bodies, and business partners to review the status of validation work, results of test work, and quarterly reporting Partner with other independent validation teams, e.g., Model Risk Management, Regulatory Reporting Quality Assurance (RRQA), to support a unified validation program end-to-end. 
Experience
What you'll bring to the role:
- Bachelor's or higher degree in Finance, Economics, Computer Science, Mathematics, Engineering, or other business or risk management related areas
- Experience from consulting, risk management, or internal audit covering processes and controls across risk stripes (e.g., Credit, Market, Liquidity, Capital and Data Risk)
- Experience in data analytics, data visualization, or process automation
- Strong risk, process, and control validation/testing and assessment skills
- Strong communication and analytical skills
- A commitment to teamwork
- Ability to prioritize and manage multiple competing objectives

Skills
- Strong understanding of the banking regulatory environment, including familiarity with Bank for International Settlements (BIS) principles (e.g., Basel III, BCBS 239, FRTB) and FRB Capital Planning requirements and practices (e.g., CCAR, DFAST)
- 1-2 years of relevant industry experience with core banking, investment and trading products, and banking regulations (e.g., FRB SR 11-07, SR 12-17, SR 14-08, SR 15-18, PRA SS1/23)
- Understanding of data lineage and database schemas; experience working with large data sets, data warehouses, or data lakes; knowledge of IT general controls; business analyst experience
- Knowledge of and experience with data analytics and data visualization tools and systems (e.g., PowerBI, Alteryx, Dataiku, QlikView, Tableau)
- Experience writing or editing SQL, Python and/or other programming languages; advanced Excel knowledge
- Relevant certifications or designations (e.g., CFA or FRM) preferred

What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years.
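As a concrete illustration of the SQL-based process and control testing this role describes, the following is a minimal sketch of a reconciliation check between trade-level records and a reported summary. It uses only the Python standard library; the table and column names are invented for the example and do not reflect any actual Morgan Stanley system.

```python
import sqlite3

# Hypothetical control test: verify that reported exposure totals per desk
# reconcile with the underlying trade-level records. Schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (desk TEXT, notional REAL);
    CREATE TABLE reported (desk TEXT, total_notional REAL);
    INSERT INTO trades VALUES ('rates', 100.0), ('rates', 50.0), ('credit', 75.0);
    INSERT INTO reported VALUES ('rates', 150.0), ('credit', 80.0);
""")

# Flag desks where the reported total differs from the recomputed total.
breaks = conn.execute("""
    SELECT r.desk, r.total_notional, SUM(t.notional) AS recomputed
    FROM reported r JOIN trades t ON t.desk = r.desk
    GROUP BY r.desk
    HAVING ABS(r.total_notional - SUM(t.notional)) > 1e-9
""").fetchall()

print(breaks)  # each remaining row is a reconciliation break to investigate
```

In this toy data set the 'rates' desk reconciles while 'credit' does not, so only one break is surfaced.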
Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren’t just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you’ll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There’s also ample opportunity to move about the business for those who show passion and grit in their work. To learn more about our offices across the globe, please copy and paste https://www.morganstanley.com/about-us/global-offices into your browser. Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
Posted 2 weeks ago
3.0 - 4.0 years
6 - 9 Lacs
Hyderābād
On-site
Summary
We are seeking a highly skilled and experienced Marketing Cloud tester to join our Marketing Automation team, which works closely with brand teams; understands various data sources; is adept at building data ingestion pipelines; and is skilled in testing end-to-end data ingestion layers, data models, and visualization dashboards based on previously built test scripts.

About the Role
Key Responsibilities:
- Build end-to-end test scripts for each release based on user epics across the data value chain: ingestion, data model, and visualization
- Post development, run the test scripts using a testing platform such as Proton
- Document results, highlight any bugs/errors to the development team, and work closely with the development team to resolve the issues
- Audit technical developments and solutions and validate that source data matches MCI
- Create and update knowledge documents in the repository as needed
- Work closely with the Technical Lead and Business Analysts to help design the testing strategy and test design as part of pre-build activities
- Participate in data exploration and data mapping activities along with the technical lead, business and DDIT architects, and the development team for any new data ingestion needs from the business
- Build and maintain standard SOPs to run smooth operations that enable proper upkeep of visualization data and insights

Qualifications:
- Minimum of 3-4 years of hands-on development experience in Dataroma/MCI
- Prior experience as a core developer in a visualization platform such as Tableau, Qlik, or Power BI is a plus
- Experience working on Data Cloud and other data platforms is a plus
- Hands-on experience using an ETL tool such as Informatica, Alteryx, or Dataiku preferred
- Prior experience with testing automation platforms preferred
- Excellent written and verbal skills; strong interpersonal and analytical skills
- Ability to provide efficient, timely, reliable, and courteous service to customers
- Ability to effectively present information
- Demonstrated knowledge of the Data Engineering & Business Intelligence ecosystem
- Salesforce MCI certification; familiarity with AppExchange deployment, Flow, Aura components, and Lightning Web Components will be a plus

Commitment to Diversity and Inclusion:
Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility and accommodation:
Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis:
Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network:
Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards:
Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: US
Business Unit: Universal Hierarchy Node
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Marketing
Job Type: Full time
Employment Type: Regular
Shift Work: No
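The core task in this posting, validating that ingested data matches the source system, can be sketched as a small test-script helper. This is an illustrative example only: the function, record shapes, and field names are hypothetical and not an actual MCI or Proton API.

```python
# Hypothetical end-to-end validation step: after an ingestion run, confirm
# that every source record landed in the target layer unchanged.

def validate_ingestion(source_rows, target_rows, key):
    """Return a report of discrepancies between source and ingested data."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

# Toy data: record 2 was dropped during ingestion, record 3 appeared from nowhere.
source = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "b@x.com"}]
ingested = [{"id": 1, "email": "a@x.com"}, {"id": 3, "email": "c@x.com"}]
report = validate_ingestion(source, ingested, key="id")
print(report)
```

Each non-empty bucket in the report would be logged as a bug for the development team, matching the document-and-escalate workflow described above.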
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Overview:
We are seeking a skilled Data Engineer with hands-on experience in Dataiku DSS to join our team. The ideal candidate will design and develop data pipelines, optimize workflows, and implement AI/ML models on cloud platforms. The role demands technical expertise, problem-solving ability, and a collaborative mindset.

Key Responsibilities:
- Design and develop scalable ETL (Extract, Transform, Load) pipelines to collect and process data from multiple sources.
- Configure and streamline Dataiku DSS workflows for efficient data processing and machine learning operations.
- Integrate Dataiku with cloud platforms such as AWS, Azure, and GCP, as well as big data tools like Snowflake, Hadoop, and Spark.
- Develop and deploy AI/ML models for predictive analytics using Dataiku.
- Implement MLOps and DataOps practices within the platform for model deployment and data flow automation.
- Monitor job performance and automate data workflows for improved scalability and reliability.
- Customize Dataiku functionality with Python or R scripts for enhanced analytics.
- Manage and support the Dataiku platform, ensuring its reliability and performance.

Must-Have Skills:
- 2 to 6 years of hands-on experience with the Dataiku DSS platform.
- Strong proficiency in Python and SQL for scripting and data manipulation.
- Solid understanding of ETL processes and data pipeline development.
- Experience with cloud environments (AWS, Azure, GCP).
- Familiarity with big data frameworks such as Spark and Hadoop.
- Good understanding of AI/ML model development and deployment practices.
- Ability to automate workflows and monitor performance effectively.
- Strong analytical thinking and problem-solving abilities.
- Excellent verbal and written communication skills.
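The extract-transform-load pipelines this role centers on can be reduced to a minimal sketch. The example below uses only the Python standard library, with a made-up CSV source and schema; a real pipeline would read from the posting's cloud sources and run inside a platform such as Dataiku.

```python
import csv, io, sqlite3

# Minimal ETL sketch: extract rows from a CSV source, clean and type them,
# and load them into a target table. Data and schema are invented.
raw = io.StringIO("order_id,amount\n1,10.50\n2,\n3,4.25\n")

# Extract: read rows from the CSV source.
rows = list(csv.DictReader(raw))

# Transform: drop records with missing amounts and cast the remaining fields.
clean = [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load: write the cleaned records into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 14.75)
```

The same extract/transform/load split scales up: the extract step becomes a connector to a source system, and the load step targets a warehouse such as Snowflake rather than an in-memory table.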
Posted 2 weeks ago
6.0 - 12.0 years
0 Lacs
delhi
On-site
You will support the Analytics solutions team in ramping up F&A analytics and reporting practice using the Dataiku platform. Your role will involve partnering with internal stakeholders and clients to identify, analyze, and deliver analytics and automation solutions using Dataiku. You will be responsible for translating business requirements into technical solutions and managing the end-to-end delivery of Dataiku-based projects. Additionally, you will communicate technical infrastructure requirements to deploy automation solutions and convert solutions into tools and products. Leading and mentoring a team of junior resources to enable skill development in Dataiku, data engineering, and machine learning workflows will also be part of your responsibilities. Essential duties include identifying F&A automation opportunities in the client environment and performing end-to-end automation operations. As a Senior Dataiku developer with over 6 years of experience, you should be proficient in building dynamic workflows, models, and pipelines. You should have experience in developing custom formulas, applications, and plugins within the Dataiku DSS environment, as well as integrating and working with Snowflake. A good understanding of SQL and experience integrating Dataiku with enterprise systems such as SAP, Oracle, or cloud data platforms is required. You must possess a balance of analytical problem-solving skills and strong interpersonal and relationship development abilities. In terms of technical skills, you should have hands-on experience in Dataiku, Alteryx, SQL, Power BI, and Snowflake. Proficiency in creating data pipelines for data ingestion, transformation, and output within the Dataiku platform is essential. An understanding of Python and R scripting within Dataiku is considered a strong advantage, along with a strong working knowledge of JIRA for agile project and task tracking. 
Desired soft skills for this role include excellent presentation, verbal, and written communication skills, as well as excellent analytical skills and an aptitude for problem-solving, including data analysis and validation. The ability to work independently and as part of a team is also highly valued. To be considered for this position, you should have 5-12 years of total analytics experience, with at least 6 years of experience specifically in Dataiku. Additionally, 1-2 years of working experience in Insurance Analytics would be beneficial.
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer in our organization, you will leverage your expertise in specialized tools such as Python, Dataiku, Spark, and SAS DI to support our clients in their Data and Analytics journey. Your responsibilities will involve designing and implementing data integration solutions, optimizing data workflows, and collaborating with cross-functional teams to ensure successful project delivery. Your key activities will include understanding and defining data initiatives based on business needs, developing and implementing data integration solutions, and conducting performance tuning and optimization of queries and data models. You will also be involved in database design and implementation, ensuring data quality and consistency across the data warehouse. Collaboration and communication are crucial aspects of this role as you will work closely with clients and internal teams to gather requirements, troubleshoot data-related issues, and provide technical support to the development team. Additionally, you will be responsible for driving the project as a technical lead, managing end-to-end project deliverables, and ensuring seamless collaboration with clients. Essential skills for this position include experience with data engineering tools, strong SQL proficiency, BI knowledge, and the ability to convert business requirements into technical specifications. Desirable skills include analytical capabilities for data analysis and experience with BI tools like Microstrategy and Power BI. To qualify for this role, you should have at least 3 years of experience in designing and developing data integration solutions, along with technical certifications that demonstrate your commitment to continuous learning. Your qualities should include the ability to influence change confidently, tackle problems systematically, and collaborate effectively in self-organized, cross-functional teams. 
If you have a passion for data engineering and a drive to excel in a dynamic work environment, we invite you to join our team and contribute to our clients' success in the realm of Data and Analytics.
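The query performance tuning this posting calls for often comes down to indexing. As a hedged illustration (the schema and data are invented), the snippet below uses SQLite's `EXPLAIN QUERY PLAN` to show an index turning a full-table scan into an index search:

```python
import sqlite3

# Illustration of query tuning: compare the query plan for a filtered
# aggregate before and after adding an index on the filter column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # typically a full scan of the events table
print(after[0][-1])   # typically a search using idx_events_user
```

The exact plan strings vary by SQLite version, but the before/after contrast (scan versus index search) is the pattern a tuning exercise looks for on any SQL engine.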
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Forvia, a sustainable mobility technology leader
We pioneer technology for mobility experiences that matter to people.

Your Mission, Roles And Responsibilities
Sr. Data Engineer
Country/Region: Pune/India
Contract Type: Full time

New trends and expectations are reshaping the automotive industry. Inspired by the exciting new challenges associated with this revolution, Faurecia anticipates the future of mobility by developing cutting-edge solutions for smart life on board and sustainable mobility. If you're willing to contribute and create value for tomorrow's cleaner and smarter mobility, Faurecia is the place to be. The Digital Services Factory (DSF) is a fast-growing team in charge of providing AI solutions for manufacturing excellence and efficiency as well as for smart product development for the automotive industry. In order to meet the increasing expectations, we are enriching our team by adding highly skilled resources.

Overall Responsibilities And Duties
- Data mining from structured, semi-structured & unstructured data sources
- Develop business intelligence dashboards & insights from datasets
- Ensure data quality; interpret & analyse data problems
- Prepare data for prescriptive & predictive modelling
- Create ontologies for knowledge & data management
- Collaborate with business owners, IT Architects & Data Scientists

Qualifications
Bachelors/Masters Degree in Computer Science Engineering
Minimum 6 to 9 years of experience working on industrial-scale Big Data projects.
- Hands-on experience in building data models, data mining & segmentation
- Good knowledge of Big Data ecosystems (Spark & Hadoop)
- Good knowledge of Python & SQL
- Knowledge of SQL & NoSQL databases
- Knowledge of PowerBI
- Knowledge of Dataiku, Palantir is a plus
- Good communication skills in English
- Good logical & analytical thinking

What We Can Do For You
At Forvia, you will find an engaging and dynamic environment where you can contribute to the development of sustainable mobility leading technologies. We are the seventh-largest global automotive supplier, employing more than 157,000 people in more than 40 countries, which makes for a lot of opportunity for career development. We welcome energetic and agile people who can thrive in a fast-changing environment. People who share our strong values. Team players with a collaborative mindset and a passion to deliver high standards for our clients. Lifelong learners. High performers. Globally minded people who aspire to work in a transforming industry, where excellence, speed, and quality count. We cultivate a learning environment, dedicating tools and resources to ensure we remain at the forefront of mobility. Our people enjoy an average of more than 22 hours of online and in-person training within FORVIA University (five campuses around the world). We offer a multicultural environment that values diversity and international collaboration. We believe that diversity is a strength. To create an inclusive culture where all forms of diversity create real value for the company, we have adopted gender diversity targets and inclusion action plans. Achieving CO2 Net Zero as a pioneer of the automotive industry is a priority: In June 2022, Forvia became the first global automotive group to be certified with the new SBTi Net-Zero Standard (the most ambitious standard of SBTi), aligned with the ambition of the 2015 Paris Agreement of limiting global warming to 1.5°C.
Three principles guide our action: use less, use better and use longer, with a focus on recyclability and circular economy.

Why join us
FORVIA is an automotive technology group at the heart of smarter and more sustainable mobility. We bring together expertise in electronics, clean mobility, lighting, interiors, seating, and lifecycle solutions to drive change in the automotive industry. With a history stretching back more than a century, we are the 7th largest global automotive supplier, employing more than 157,000 people in 43 countries. You'll find our technology in around 1 out of 2 vehicles produced anywhere in the world. In June 2022, we became the 1st global automotive group to be certified with the SBTi Net-Zero Standard. We have committed to reach CO2 Net Zero by no later than 2045. As technological innovation and the need for sustainability transform the automotive industry, we are ideally positioned to deliver solutions that will enhance the lives of road-users everywhere.
Posted 2 weeks ago
2.0 - 5.0 years
6 - 9 Lacs
Hyderābād
On-site
Data Engineer, DT US PxE
The Data Engineer is an integral part of the technical application development team and is primarily responsible for analyzing, planning, designing, developing, and implementing Azure data engineering solutions to meet the strategic, usability, performance, reliability, control, and security requirements of data science processes. Requires demonstrable knowledge in the areas of data engineering, AI/ML, data warehousing, and reporting applications. Must be innovative.

Work you will do
A unique opportunity to be a part of a growing team that works on a premier unified data science analytics platform within Deloitte. You will be responsible for implementing, delivering, and supporting data engineering and AI/ML solutions to support the Deloitte US Member Firm.

Outcome-Driven Accountability
Collaborate with business and IT leaders to develop and refine ideas for integrating predictive and prescriptive analytics within business processes, ensuring measurable customer and business outcomes. Decompose complex business problems into manageable components, facilitating the use of multiple analytic modeling methods for holistic and valuable solutions. Develop and refine prototypes and proofs of concept, presenting results to business and IT leaders, and demonstrating the impact on customer needs and business outcomes.

Technical Leadership and Advocacy
Engage in data analysis, generating and testing hypotheses, preparing and analyzing historical data, identifying patterns, and applying statistical methods to formulate solutions that deliver high-quality outcomes. Develop project plans, including resource needs and task dependencies, to meet project deliverables with a focus on incremental and iterative delivery.

Engineering Craftsmanship
Participate in defining project scope, objectives, and quality controls for new projects, ensuring alignment with customer-centric engineering principles.
Present and communicate project deliverable results, emphasizing the value delivered to customers and the business.

Customer-Centric Engineering
Assist in recruiting and mentoring team members, fostering a culture of engineering craftsmanship and continuous learning.

Incremental and Iterative Delivery
Stay abreast of changes in technology, leading new technology evaluations for predictive and statistical analytics, and advocating for innovative, lean, and feasible solutions.

Education: Bachelor's degree in Computer Science or Business Information Systems, or MCA or equivalent degree.

Qualifications:
- 2 to 5 years of advanced-level experience in Azure data engineering
- Expertise in developing, deploying, and monitoring ADF pipelines (using Visual Studio and browsers)
- Expertise in Azure Databricks internal programming (PySpark, SparkR, and SparkSQL) or Amazon EMR (Elastic MapReduce)
- Expertise in managing Azure storage (Azure Data Lake Gen2, Azure Blob Storage, Azure SQL Database) or Azure Blob Storage, Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory
- Advanced programming skills in Python, R, and SQL (SQL for HANA, MS SQL)
- Hands-on experience with visualization tools (Tableau/PowerBI)
- Hands-on experience with data science studios such as Dataiku, Azure ML Studio, and Amazon SageMaker

The Team
Information Technology Services (ITS) helps power Deloitte's success. ITS drives Deloitte, which serves many of the world's largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.
The ~3,000 professionals in ITS deliver services including:
- Security, risk & compliance
- Technology support
- Infrastructure
- Applications
- Relationship management
- Strategy
- Deployment
- PMO
- Financials
- Communications
- Product Engineering (PxE)

Product Engineering (PxE)
Product Engineering (PxE) is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.

How you will grow
At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Deloitte's culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte's impact on the world.

Disclaimer: Please note that this description is subject to change basis business/engagement requirements and at the discretion of the management.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302720
Posted 2 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Noida
Work from Office
About the Job: We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.

Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.

Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.

Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.

Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
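The dimensional modeling called out in this posting (Kimball-style star schemas) can be sketched in a few lines: facts in a narrow table keyed to descriptive dimension tables, queried by joining and aggregating. The schema and data below are illustrative only.

```python
import sqlite3

# Toy star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
    INSERT INTO fact_sales VALUES (1, 3, 30.0), (1, 1, 10.0), (2, 2, 50.0);
""")

# Typical dimensional query: aggregate fact measures by a dimension attribute.
result = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(result)  # [('gadgets', 50.0), ('widgets', 40.0)]
```

The design choice behind the split is that facts grow fast and stay narrow, while dimensions stay small and carry the descriptive attributes that business questions group by.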
Posted 2 weeks ago