2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
Provide analytics support to Novartis internal customers (country and regional marketing and sales teams) on low-to-medium complexity analytical reports. Support and facilitate data-enabled decision making for Novartis internal customers by providing and communicating qualitative and quantitative analytics. Help the business build its practice through initiatives such as knowledge sharing, onboarding and training support, assisting the team lead with business-related tasks and activities, and building process documentation and knowledge repositories. Be an integral part of a comprehensive design team responsible for designing promotional marketing materials, and for delivering call plans and alignments that support field teams in reaching HCPs and patients.
About the Role
Acts as a function-level SME, works on multiple client engagements, and collaborates with team members to produce high-quality results. Provides thought leadership and innovation, leads process-excellence initiatives, and possesses very strong analytical skills.
Your Responsibilities Include, But Are Not Limited To
Help develop new service offerings in close collaboration with functional and account management teams. Deliver call planning, refinements, territory alignments, and incentives independently, with high quality, on time, error-free, and in line with requirements. Communicate effectively and handle client calls independently. Contribute ideas in team huddles for process improvements, and continuously improve processes by eliminating redundancies and inefficiencies. Demonstrate a proven ability to guide and onboard new members quickly, collaborate across teams, and identify bottlenecks in newer processes or launches. Build and deliver the following customer requirements per agreed SLAs (timeliness, accuracy, quality, etc.) and drive excellent customer service: customer segmentation and targeting, field-force optimization, territory alignment, incentive compensation (plan, design, administration), territory sales performance reports, activity (QTQ) performance reports, other SFE support, call plan management, and territory action plan reports. Deliver services through a structured project management approach, with appropriate documentation and communication throughout delivery.
What You'll Bring to the Role
Customer-service orientation, a strong and proactive focus on business results, and a proven track record of providing insights that increase efficiency. Support team leaders in recruitment and onboarding of new associates within the organization. Lead capability building by actively prioritizing knowledge-sharing sessions that enable growth and improve the quality of CSP Hyderabad deliverables across the function. Stay in sync with internal functional operating procedures such as time tracking and critical-metric tracking, and apply strong analytical thinking with a problem-solving approach. Build and maintain standard operating procedures (SOPs) and quality checklists that enable excellent quality for all outputs within the function. Develop and maintain knowledge repositories that capture qualitative and quantitative reports on brands, disease areas, macroeconomic trends of Novartis operating markets, reporting, and other internal systems and processes.
Comply with all Novartis operating procedures per legal, IT, and HR requirements.
Desirable Requirements
A university/advanced degree is required; a Master's degree in a field such as business administration, finance, computer science, or another technical discipline is preferred. Hands-on experience (2+ years) with Snowflake (SQL), ETL, data model design, incentive compensation (IC), or pharma analytics in a market research firm, pharmaceutical company, or pharma KPO, with an understanding of the pharmaceutical business including its regulatory environment. Hands-on experience with Dataiku, JCPM, JAMS, Python, SQL, Excel, PowerPoint, and Zaidyn (good to have). Experience with JTD is required for alignment hiring. Keen to learn new tools and techniques and to adapt to technological transformations.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network
Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
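For context on the kind of deliverable this posting describes, here is a minimal, hypothetical sketch of a territory sales performance extract, written as a Snowflake-style SQL query run from Python. The table and column names (rx_sales, territory_id, trx_count) are illustrative assumptions only, not a Novartis schema, and the connector usage is a generic pattern rather than a prescribed workflow.

```python
# Hypothetical sketch: pull a quarterly territory sales performance report
# from Snowflake. Table and column names are illustrative assumptions only.
import snowflake.connector

QUERY = """
SELECT territory_id,
       product_name,
       SUM(trx_count) AS total_trx,
       SUM(nrx_count) AS total_nrx
FROM   rx_sales
WHERE  sale_date >= DATEADD(quarter, -1, CURRENT_DATE)
GROUP  BY territory_id, product_name
ORDER  BY total_trx DESC
"""

def fetch_territory_report(conn_params: dict):
    # conn_params: account, user, password, warehouse, database, schema
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        cur.execute(QUERY)
        return cur.fetchall()
    finally:
        conn.close()
```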
Posted 3 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client site, and to ensure they meet 100% of quality assurance parameters.
Do
1. Be instrumental in understanding the requirements and design of the product/software: develop software solutions by studying information needs, systems flow, data usage, and work processes; investigate problem areas through the software development life cycle; facilitate root-cause analysis of system issues and the problem statement; identify ideas to improve system performance and availability; analyze client requirements and convert them into feasible designs; collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements; and confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development: determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software; develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them; modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces; analyze information to recommend and plan the installation of new systems or modifications of existing systems; ensure code is error-free, with no bugs or test failures; prepare reports on programming project specifications, activities, and status; ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns; compile timely, comprehensive, and accurate documentation and reports as requested; coordinate with the team on daily project status and progress and document it; and provide feedback on usability and serviceability, tracing results to quality risk and reporting to the concerned stakeholders.
3. Provide status reporting and customer focus on an ongoing basis with respect to the project and its execution: capture all requirements and clarifications from the client for better-quality work; take feedback on a regular basis to ensure smooth and on-time delivery; participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members; consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements; document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code; formally document the necessary details and reports so the software is properly understood from client proposal to implementation; ensure good-quality interactions with the customer (e-mail content, fault report tracking, voice calls, business etiquette, etc.); and respond to customer requests in a timely manner, with no internal or external complaints.
Mandatory Skills: Dataiku. Experience: 5-8 Years.
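As a rough illustration of the Dataiku work implied by the mandatory skill above, the snippet below is a minimal Python recipe in the style Dataiku DSS uses: read an input dataset, apply a simple transformation with pandas, and write the result. The dataset and column names are placeholders; the actual recipe wiring depends on the project's Flow.

```python
# Minimal sketch of a Dataiku Python recipe (dataset names are placeholders).
import dataiku
import pandas as pd

# Read the recipe's input dataset into a pandas DataFrame
input_ds = dataiku.Dataset("raw_transactions")
df = input_ds.get_dataframe()

# Example transformation: drop duplicates and add a simple derived column
df = df.drop_duplicates()
df["amount_usd"] = df["amount"] * df["fx_rate"]

# Write the transformed data to the recipe's output dataset
output_ds = dataiku.Dataset("transactions_prepared")
output_ds.write_with_schema(df)
```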
Posted 3 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Chennai
Work from Office
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client site, and to ensure they meet 100% of quality assurance parameters.
Do
1. Be instrumental in understanding the requirements and design of the product/software: develop software solutions by studying information needs, systems flow, data usage, and work processes; investigate problem areas through the software development life cycle; facilitate root-cause analysis of system issues and the problem statement; identify ideas to improve system performance and availability; analyze client requirements and convert them into feasible designs; collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements; and confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development: determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software; develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them; modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces; analyze information to recommend and plan the installation of new systems or modifications of existing systems; ensure code is error-free, with no bugs or test failures; prepare reports on programming project specifications, activities, and status; ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns; compile timely, comprehensive, and accurate documentation and reports as requested; coordinate with the team on daily project status and progress and document it; and provide feedback on usability and serviceability, tracing results to quality risk and reporting to the concerned stakeholders.
3. Provide status reporting and customer focus on an ongoing basis with respect to the project and its execution: capture all requirements and clarifications from the client for better-quality work; take feedback on a regular basis to ensure smooth and on-time delivery; participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members; consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements; document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code; formally document the necessary details and reports so the software is properly understood from client proposal to implementation; ensure good-quality interactions with the customer (e-mail content, fault report tracking, voice calls, business etiquette, etc.); and respond to customer requests in a timely manner, with no internal or external complaints.
Mandatory Skills: Dataiku. Experience: 5-8 Years.
Posted 3 weeks ago
3.0 - 8.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Responsibilities: Drive business growth in the assigned AI and ML portfolio. Collaborate with clients to understand their business challenges and advise on cutting-edge solutions using AI and Generative AI technology. Conduct research and stay updated on the latest advancements in Generative AI technology and methodologies. Provide expertise and guidance to clients and internal teams on Generative AI best practices. Collaborate with cross-functional teams to define AI project requirements and objectives, ensuring alignment with overall business goals. Drive the team to build GenAI products and solutions. Tech Skills: Consulting experience with senior client stakeholders in AI/ML technologies. Strong experience building AI and Generative AI solutions. Solid experience developing and implementing generative AI models. Experience with cloud-based platforms and services. Experience with natural language processing (NLP) techniques and tools.
Posted 3 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Chennai
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client site, and to ensure they meet 100% of quality assurance parameters.
Do
1. Be instrumental in understanding the requirements and design of the product/software: develop software solutions by studying information needs, systems flow, data usage, and work processes; investigate problem areas through the software development life cycle; facilitate root-cause analysis of system issues and the problem statement; identify ideas to improve system performance and availability; analyze client requirements and convert them into feasible designs; collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements; and confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development: determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software; develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them; modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces; analyze information to recommend and plan the installation of new systems or modifications of existing systems; ensure code is error-free, with no bugs or test failures; prepare reports on programming project specifications, activities, and status; ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns; compile timely, comprehensive, and accurate documentation and reports as requested; coordinate with the team on daily project status and progress and document it; and provide feedback on usability and serviceability, tracing results to quality risk and reporting to the concerned stakeholders.
3. Provide status reporting and customer focus on an ongoing basis with respect to the project and its execution: capture all requirements and clarifications from the client for better-quality work; take feedback on a regular basis to ensure smooth and on-time delivery; participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members; consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements; document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code; formally document the necessary details and reports so the software is properly understood from client proposal to implementation; ensure good-quality interactions with the customer (e-mail content, fault report tracking, voice calls, business etiquette, etc.); and respond to customer requests in a timely manner, with no internal or external complaints.
Mandatory Skills: Dataiku. Experience: 5-8 Years.
Posted 3 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Hyderabad
Work from Office
About the Job:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.
Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.
Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.
Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.
Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
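To make the ETL/ELT responsibilities above concrete, here is a small, hedged sketch of one pipeline step in Python: extract with SQL, transform with pandas, and load into a warehouse table. The connection URLs and table names are placeholders; a production pipeline orchestrated in Dataiku or Airflow would add scheduling, logging, and error handling.

```python
# Hedged sketch of a single ETL step: extract -> transform -> load.
# Connection URLs and table names are placeholders, not a real environment.
import pandas as pd
from sqlalchemy import create_engine, text

SOURCE_URL = "postgresql://user:password@source-host/sales"
TARGET_URL = "postgresql://user:password@warehouse-host/analytics"

def run_daily_load(run_date: str) -> int:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)

    # Extract: pull one day of orders from the source system
    df = pd.read_sql(
        text(
            "SELECT order_id, customer_id, amount, order_ts "
            "FROM orders WHERE order_ts::date = :d"
        ),
        source,
        params={"d": run_date},
    )

    # Transform: basic cleaning and rounding
    df = df.dropna(subset=["customer_id"])
    df["amount"] = df["amount"].round(2)

    # Load: append into the warehouse fact table
    df.to_sql("fact_orders", target, if_exists="append", index=False)
    return len(df)
```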
Posted 3 weeks ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Morgan Stanley Model Risk Process Validation Group - Associate
Profile Description
We're seeking someone to join our team as an Associate in the Model Risk Process Validation Group.
Firm Risk Management
In the Firm Risk Management division, we advise businesses across the Firm on risk mitigation strategies, develop tools to analyze and monitor risks, and lead key regulatory initiatives.
Company Profile
Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. Since 1935, Morgan Stanley has been known as a global leader in financial services, always evolving and innovating to better serve our clients and our communities in more than 40 countries around the world.
What You'll Do In The Role
The primary responsibilities of the role include, but are not limited to, the following:
Primary Responsibilities
Perform independent validations of select FRM processes and controls, including those relating to regulatory and Basel requirements; support execution of reviews (e.g., planning, documenting, reporting) and continuous monitoring activities (e.g., risk assessments); contribute to improving the team's validation methodology and execution capabilities; interface with key stakeholders, governing bodies, and business partners to review the status of validation work, results of test work, and quarterly reporting; partner with other independent validation teams, e.g., Model Risk Management and Regulatory Reporting Quality Assurance (RRQA), to support a unified end-to-end validation program.
Experience
What you'll bring to the role: Bachelor's or higher degree in Finance, Economics, Computer Science, Mathematics, Engineering, or other business or risk management related areas; experience from consulting, risk management, or internal audit covering processes and controls across risk stripes (e.g., Credit, Market, Liquidity, Capital and Data Risk); experience in data analytics, data visualization, or process automation; strong risk, process, and control validation/testing and assessment skills; strong communication and analytical skills; a commitment to teamwork; ability to prioritize and manage multiple competing objectives.
Skills
Strong understanding of the banking regulatory environment, including familiarity with Bank for International Settlements (BIS) principles (e.g., Basel III, BCBS 239, FRTB) and FRB Capital Planning requirements and practices (e.g., CCAR, DFAST); 3+ years of relevant industry experience with core banking, investment and trading products, and banking regulations (e.g., FRB SR 11-07, SR 12-17, SR 14-08, SR 15-18, PRA SS1/23); understanding of data lineage and database schema; experience working with large data sets, data warehouses, or data lakes; knowledge of IT general controls; business analyst experience; knowledge of and experience with data analytics and data visualization tools and systems (e.g., Power BI, Alteryx, Dataiku, QlikView, Tableau); experience writing or editing SQL, VBA, Python, and/or other programming languages; advanced Excel knowledge; relevant certifications or designations (e.g., CFA or FRM) preferred.
What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years.
Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren’t just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you’ll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There’s also ample opportunity to move about the business for those who show passion and grit in their work. To learn more about our offices across the globe, please copy and paste https://www.morganstanley.com/about-us/global-offices into your browser. Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
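As an illustrative, generic example of the control-testing and data analytics skills this posting lists (not a description of the firm's actual methodology), the sketch below reconciles a reported aggregate against record-level source data with pandas. The file names, keys, and tolerance are assumptions made for the example.

```python
# Illustrative control test: reconcile reported totals against source records.
# File names, join keys, and tolerance are assumptions, not firm specifics.
import pandas as pd

def reconcile(source_csv: str, report_csv: str, tolerance: float = 0.01) -> pd.DataFrame:
    source = pd.read_csv(source_csv)   # record-level exposures
    report = pd.read_csv(report_csv)   # reported totals by desk

    # Recompute totals independently from the source records
    recomputed = (
        source.groupby("desk_id", as_index=False)["exposure"].sum()
              .rename(columns={"exposure": "recomputed_total"})
    )

    merged = report.merge(recomputed, on="desk_id", how="outer")
    merged["difference"] = (merged["reported_total"] - merged["recomputed_total"]).abs()

    # Flag breaks that exceed the agreed tolerance for follow-up
    return merged[merged["difference"] > tolerance]
```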
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Chennai, Bengaluru
Hybrid
Data Engineer - Dataiku
Location: Chennai, Bangalore. Notice Period: Immediate to 30 days.
Mandatory Skills
Dataiku for ETL operations and, preferably, other ETL tools such as Informatica. Strong Python coding, SQL, and Git. Proven experience as a Data Engineer, in data integration, or as a Data Analyst.
Preferred Skills
Banking domain exposure and AI/ML or data science exposure.
Key Roles & Responsibilities
7+ years of experience, including 2+ years delivering projects on the Dataiku platform. Proficiency in configuring and optimizing Dataiku's architecture, including data connections, security settings, and workflow management. Hands-on experience with Dataiku recipes, Designer nodes, API nodes, and Automation nodes, including deployment. Expertise in Python scripting, automation, and development of custom workflows in Dataiku. Collaborate with data analysts, business stakeholders, and the client to gather and understand requirements. Contribute to developments in the Dataiku environment to apply data integration with the given logic to fulfil bank regulatory and other customer requirements. Gather, analyse, and interpret requirement specifications received directly from the client. Ability to work independently and effectively in a fast-paced, dynamic environment. Strong analytical and problem-solving skills. Familiarity with agile development methodologies. Participate in the CR/production deployment implementation process through Azure DevOps.
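For the automation and deployment aspects mentioned above, the following is a hedged sketch of driving a Dataiku scenario from outside DSS with the public dataikuapi client. The host, API key, project key, and scenario id are placeholders, and the exact return objects vary by DSS version, so this is a pattern sketch rather than a definitive implementation.

```python
# Hedged sketch: trigger a Dataiku scenario remotely via the dataikuapi client.
# Host, API key, project key, and scenario id are placeholders.
import dataikuapi

HOST = "https://dss.example.com"
API_KEY = "xxxxx"  # a personal or project API key with scenario-run rights

client = dataikuapi.DSSClient(HOST, API_KEY)
project = client.get_project("BANK_REG_REPORTING")
scenario = project.get_scenario("daily_build")

# Run the scenario and block until the run completes
scenario.run_and_wait()
print("Scenario run finished")
```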
Posted 3 weeks ago
12.0 years
6 - 9 Lacs
Hyderābād
On-site
Summary
Responsible for the detailed design, development, and delivery of system solutions such as reporting, analytical, and Gen AI solutions within a specific business or technology area. This role requires alignment with the defined solution architecture, leveraging existing patterns, and ensuring compliance with both business and technical requirements.
About the Role
Role Title: Assoc. Dir. DDIT DEV Data Analytics DS&AI
Location: Hyderabad, India (#LI-Hybrid)
Role Purpose: Create the detailed DDIT solution/service design, based on functional specifications, to meet quality and performance requirements and technical constraints. Responsible for detailed design, development, code review, and delivery of analytical and Gen AI solutions.
Your responsibilities include but are not limited to:
Responsible for the detailed design, development, and delivery of system solutions within a specific business or technology area, aligned with the defined solution architecture, leveraging existing patterns, and ensuring compliance with both business and technical requirements. Develop solution architectures that align with enterprise standards and meet functional and non-functional requirements; solutions may span reporting, web applications, and Gen AI. Leverage reference architectures, patterns, and standards to ensure consistency and scalability. Take accountability for the technical delivery of projects/use cases for a specific business/technology area and ensure adherence to security and compliance policies and procedures within the service delivery scope. Collaborate with and lead diverse groups of colleagues (data engineering, data science, platform team, and business stakeholders) and positively manage ambiguity. Apply best practices in design and continuously improve the user experience for business stakeholders, ensuring the overall user experience is considered when designing new solutions and services. Work as an individual contributor or lead teams, engaging with multiple stakeholders (architecture, infrastructure, vendor partners) on projects of medium to large complexity.
What you'll bring to the role:
A background in programming and solution design; exposure to a wide range of technologies across the reporting, web application, and Gen AI domains is preferred. Experience in project management and solution/service delivery. Strong analytical and conceptual skills for designing and implementing IT solutions that meet business needs. Hands-on experience with cloud platforms such as AWS (Amazon Web Services), Amazon S3, Amazon RDS, Databricks, and AWS Glue. Extensive hands-on experience with Power BI or Qlik Sense/Spotfire. Experience in web technologies and React JS. Working experience and knowledge of ETL tools (Databricks, Spark, Kafka, Dataiku, etc.) and data modeling. Experience working with database technologies (Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.). Proficiency in Generative AI, large language models (LLMs), multimodal AI, and deep learning for pharma applications. Excellent communication and stakeholder management skills.
Experience in highly regulated environments, ideally in the pharma industry, including Computer System Validation with good documentation practice (GxP). Experience working in Agile Scrum teams. Exposure to fine-tuning LLMs is a big plus.
Desirable Requirements: Education & Qualifications
Bachelor's degree in computer science, computer engineering, or a related technical discipline, or equivalent demonstrated experience. 12+ years of experience with an expert understanding and proven track record in analyzing business processes and architecting, designing, developing, and integrating complex, cross-divisional, end-to-end analytical solutions with large data volumes. Hands-on with ETL tools such as Databricks, SQL, database technologies, AWS technologies (primary skill - strong), Power BI (primary skill - good), web technologies and React JS (primary skill - strong), Generative AI, large language models (LLMs), multimodal AI, and deep learning (secondary skill/good to have), architecture for reporting, analytical, and web applications (primary skill), Python or R (secondary skill/good to have), GxP compliance (good to have), and pharma domain knowledge (good to have).
Commitment to Diversity & Inclusion: Novartis embraces diversity, equal opportunity, and inclusion. We are committed to building diverse teams, representative of the patients and communities we serve, and we strive to create an inclusive workplace that cultivates bold innovation through collaboration and empowers our people to unleash their full potential.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network
Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Division: Operations | Business Unit: Universal Hierarchy Node | Location: India | Site: Hyderabad (Office) | Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited | Functional Area: Technology Transformation | Job Type: Full time | Employment Type: Regular | Shift Work: No
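As a generic illustration of the Spark-based ETL experience this role lists (not Novartis code), here is a minimal PySpark step that reads raw files, applies a transformation, and writes a curated table. The paths, columns, and output location are assumptions for the sketch.

```python
# Generic PySpark ETL sketch; paths, columns, and outputs are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_sales").getOrCreate()

# Read raw CSV landing data
raw = spark.read.option("header", True).csv("s3://bucket/landing/sales/")

# Clean and aggregate to a reporting grain
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .groupBy("region", "product")
       .agg(F.sum("amount").alias("total_amount"))
)

# Write the curated output as Parquet for downstream BI tools
curated.write.mode("overwrite").parquet("s3://bucket/curated/sales_summary/")
```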
Posted 3 weeks ago
10.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Reference # 318225BR Job Type Full Time
Your role
Are you interested in pursuing a career in Data Science and AI with the STAAT team in Global Wealth Management Americas? You will be responsible for building a team around machine learning and AI in Mumbai; building deployable machine learning models and embedding them in business workflows; defining AI research problems and criteria for evaluating success; contributing to product design and developing features to enhance the product based on natural language techniques; processing massive amounts of structured and unstructured data; researching new machine learning solutions for complex business problems and embedding the models within the business workflow; and communicating findings.
Your team
You'll be working with the STAAT Data Science team in Mumbai.
Your expertise
A degree in Computer Science or a related field; a strong understanding of probability, statistics, linear algebra, and calculus; 10+ years' experience developing NLP models; expert-level proficiency in Python; experience leading a team of 10 or more; 2+ years' experience building NLP models for sentiment scoring, summarization, and abstraction using deep learning and transfer learning techniques; experience dealing with large-scale unstructured text data; experience with machine learning packages; ML experience with different supervised and unsupervised learning algorithms; knowledge of a variety of machine learning techniques such as classification, clustering, optimization, Random Forest, PCA, XGBoost, natural language processing, deep neural networks, etc.; a good understanding of the mathematical underpinnings and their real-world advantages/drawbacks; hands-on experience using programming languages (Python, R, SQL, etc.) to manipulate data, develop models, and derive insights; hands-on experience with database and analytical technologies in the industry, such as Greenplum, DB2, Dataiku, Hadoop, etc.; hands-on experience deploying analytical models to solve business problems; and the ability to develop experimental and analytical plans for data modeling processes and A/B testing.
About Us
UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.
How We Hire
We may request you to complete one or more assessments during the application process. Learn more
Join us
At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We're dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That's why collaboration is at the heart of everything we do. Because together, we're more than ourselves. We're committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us.
Disclaimer / Policy Statements
UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
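For the sentiment-scoring and transfer-learning work described above, a minimal, generic sketch using a pretrained transformer pipeline is shown below. The pipeline and default model choice are assumptions about tooling for illustration, not a description of UBS's models or data.

```python
# Minimal sentiment-scoring sketch using a pretrained transformer pipeline.
# The pipeline and model choice are illustrative, not a firm-specific model.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")

documents = [
    "Client portfolio performance exceeded expectations this quarter.",
    "The advisor meeting was cancelled and the client is frustrated.",
]

# Score each document and print label, confidence, and the text
for doc, result in zip(documents, sentiment(documents)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {doc}")
```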
Posted 3 weeks ago
6.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Role
Supports the Analytics Solutions team in ramping up the F&A analytics and reporting practice on the Dataiku platform. Partner with internal stakeholders and clients to identify, analyze, and deliver analytics and automation solutions using Dataiku. Translate business requirements into technical solutions and manage end-to-end delivery of Dataiku-based projects. Communicate technical infrastructure requirements to deploy automation solutions, and convert solutions into tools and products. Lead and mentor a team of junior resources, enabling skill development in Dataiku, data engineering, and machine learning workflows.
Essentials
Identify F&A automation opportunities in the client environment and perform end-to-end automation operations. Senior Dataiku developer with 6+ years of experience building dynamic workflows, models, and pipelines. Experience developing custom formulas, applications, and plugins within the Dataiku DSS environment. Experience integrating and working with Snowflake. Good understanding of SQL and experience integrating Dataiku with enterprise systems such as SAP, Oracle, or cloud data platforms. Must possess a balance of analytical problem-solving and strong interpersonal and relationship-development skills.
Technical Skills
Hands-on experience in Dataiku, Alteryx, SQL, Power BI, and Snowflake. Proficient in creating data pipelines for data ingestion, transformation, and output within the Dataiku platform. Understanding of Python and R scripting within Dataiku is a strong plus. Strong working knowledge of JIRA for agile project and task tracking.
Soft Skills (Desired)
Excellent presentation, verbal, and written communication skills. Excellent analytical skills and aptitude for problem solving, including data analysis and validation. Able to work independently and as part of a team.
Work Experience Requirements
5-12 years of total analytics experience; 6+ years of experience in Dataiku; 1-2 years of working experience in insurance analytics is desirable.
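To illustrate the Dataiku-Snowflake integration this role mentions, the sketch below runs a SQL query against a Snowflake-backed Dataiku dataset from inside a Python recipe using SQLExecutor2. The dataset, table, and column names are placeholders; this is a generic pattern, not a client-specific workflow.

```python
# Hedged sketch: query a Snowflake-backed Dataiku dataset with SQLExecutor2.
# Dataset, table, and column names are placeholders.
import dataiku
from dataiku import SQLExecutor2

# Use the connection of an existing dataset as the SQL execution context
input_ds = dataiku.Dataset("gl_transactions")
executor = SQLExecutor2(dataset=input_ds)

query = """
SELECT cost_center,
       SUM(amount) AS total_amount
FROM   gl_transactions
GROUP  BY cost_center
"""

# Run the query in-database and bring the aggregated result back as a DataFrame
df = executor.query_to_df(query)
print(df.head())
```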
Posted 3 weeks ago
6.0 - 8.0 years
7 - 11 Lacs
Chennai
Work from Office
The Python Developer will play a critical role in building and maintaining financial applications and tools that support data processing, analysis, and reporting within a fast-paced financial services environment. This position involves developing scalable and secure systems. The developer will collaborate with business analysts and finance users or finance BAs to translate complex business requirements into efficient, high-quality software solutions. A strong understanding of financial concepts, data integrity, and regulatory compliance is essential. The detailed responsibilities are mentioned below.
Responsibilities - Direct Responsibilities
- Proficient in object-oriented programming, especially Python, with a minimum of 6-8 years of core Python development experience.
- Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation.
- Expertise in PySpark for large-scale data processing and loading into databases.
- Proficiency in data querying and manipulation with Oracle and PostgreSQL.
- Strong communication skills to effectively collaborate with team members and stakeholders.
- Familiarity with the Software Development Life Cycle (SDLC) process and its various stages, including experience with JIRA and Confluence.
Technical & Behavioral Competencies
In addition to the competencies above:
- Good analytical, problem-solving, and communication skills.
- Willingness to engage in technical discussions and help improve the system, processes, etc.
Nice to Have
- Familiarity with Plotly and Matplotlib for data visualization of large datasets.
- Skilled in API programming, handling JSON, CSV, and other unstructured data from various systems.
- Familiarity with JavaScript, CSS, and HTML.
- Experience with cloud architecture applications such as Dataiku or Databricks; competency with ETL tools.
- Knowledge of regulatory frameworks, RISK, CCAR, and GDPR.
Skills Referential - Behavioural Skills: Ability to collaborate / teamwork; critical thinking; ability to deliver / results driven; communication skills - oral and written.
Transversal Skills: Analytical ability; ability to develop and adapt a process; ability to understand, explain and support change; ability to develop others and improve their skills.
Education Level: Bachelor's degree or equivalent. Experience Level: At least 5 years.
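As a small illustration of the pandas/NumPy data-wrangling competency described above, here is a hedged sketch of a cleaning step for trade-level data. The column names and business rules are illustrative assumptions, not the bank's actual schema.

```python
# Sketch of pandas/NumPy data wrangling of the kind described above.
# Column names and business rules are illustrative assumptions.
import numpy as np
import pandas as pd

def prepare_positions(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()

    # Normalise text keys and parse dates
    df["book"] = df["book"].str.strip().str.upper()
    df["trade_date"] = pd.to_datetime(df["trade_date"], errors="coerce")

    # Replace invalid notionals and compute a signed exposure
    df["notional"] = df["notional"].replace({0: np.nan})
    df["exposure"] = np.where(df["side"] == "SELL", -df["notional"], df["notional"])

    # Drop rows that cannot be used downstream
    return df.dropna(subset=["trade_date", "notional"])
```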
Posted 3 weeks ago
3.0 years
4 - 9 Lacs
Hyderābād
On-site
Summary
We are seeking a highly analytical and collaborative professional to join the Analytics, Insights & Decisions Enablement (AIDE) team within Insights and Decision Science (IDS). This role will focus on delivering actionable insights into patient and provider behaviors, supporting brand strategy, marketing effectiveness, customer engagement, etc. You will partner closely with cross-functional stakeholders across IDS pillars. The team of Novartis specialists within Insights and Decision Science (IDS) centralizes insights and analytics across the US enterprise, empowering our organization to make informed and interconnected decisions in service of patients and customers.
About the Role
Key responsibilities
Provide analytics support to Novartis internal customers on various high-complexity analytical reports. Working knowledge of multiple datasets (e.g., LAAD, Xponent, SMART), managing and organizing data sets from databases to find patterns and trends, and transforming this complex and granular data into actionable insights. Put together specifications to extract/transform data into the required formats for different analytical elements using SQL, DSS, or other data processing tools. Requires experience in quantitative analysis with a demonstrated focus on analytics and experience with coding languages (SQL or Python) to query and extract data; experience with BI tools and very large data sets is a plus. Create the foundation for more sophisticated approaches to APLD analysis and advanced analytics wherever required and beneficial. Establish and maintain positive relationships with key functional stakeholders. Take the initiative to drive standardization of reports across brands.
Essential Requirements:
Master's or Bachelor's in technology, life sciences, or management. Minimum of 3+ years' experience processing and analyzing real-world patient-level data (claims data, EHR, etc.). Expertise in SQL, Dataiku, and/or other data processing tools. Knowledge of statistical modeling or ML is a plus. Understanding of healthcare terminology and real-world patient-level data. Good communication and interpersonal skills. Conceptual, analytical, and tactical thinking; strategic thought process. Ability to multi-task, work in a demanding global team environment, and work under tight deadlines. Develop and maintain strong individual and team performance.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards Division US Business Unit Universal Hierarchy Node Location India Site Hyderabad (Office) Company / Legal Entity IN10 (FCRS = IN010) Novartis Healthcare Private Limited Functional Area Marketing Job Type Full time Employment Type Regular Shift Work No Accessibility and accommodation Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to [email protected] and let us know the nature of your request and your contact information. Please include the job requisition number in your message. Novartis is committed to building an outstanding, inclusive work environment and diverse teams' representative of the patients and communities we serve.
Posted 3 weeks ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: ML Engineer (Dataiku DSS Specialist) Experience: 2-6 Years Onsite Location: Gurgaon
Job Description: We are seeking a skilled professional with proven experience in Dataiku DSS to support end-to-end data and machine learning workflows.
Key Responsibilities: Utilize advanced knowledge of Dataiku DSS for data preparation, model building, and workflow automation. Develop robust Python and SQL code for data analysis and pipeline integration. Build and manage automated data pipelines for streamlined data operations. Deploy machine learning models as scalable APIs in production environments. Ensure reliability and performance of data workflows using MLOps/DataOps practices. Work with cloud platforms such as AWS, Azure, or GCP, and big data tools like Hadoop and Spark.
Preferred Skills: Strong analytical and problem-solving abilities. Exposure to cloud-based data infrastructure. Experience in building scalable and resilient data systems.
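The "deploy machine learning models as scalable APIs" responsibility above is typically handled through Dataiku's API Designer and API node; as a platform-agnostic illustration of the same idea, here is a small FastAPI sketch that serves a previously trained scikit-learn model. The model path, feature names, and endpoint shape are assumptions for the example.

```python
# Platform-agnostic sketch of serving a trained model as an HTTP API.
# In Dataiku this would normally go through the API Designer / API node;
# the model path and feature names here are assumptions for illustration.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # a previously trained binary classifier

class Features(BaseModel):
    tenure_months: float
    monthly_spend: float
    support_tickets: int

@app.post("/predict")
def predict(features: Features) -> dict:
    row = [[features.tenure_months, features.monthly_spend, features.support_tickets]]
    proba = model.predict_proba(row)[0][1]
    return {"churn_probability": round(float(proba), 4)}
```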
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
We are seeking a highly analytical and collaborative professional to join the Analytics, Insights & Decisions Enablement (AIDE) team within Insights and Decision Science (IDS). This role will focus on delivering actionable insights into patient and provider behaviors, supporting brand strategy, marketing effectiveness, customer engagement, etc. You will partner closely with cross-functional stakeholders across IDS pillars. The team of Novartis specialists within Insights and Decision Science (IDS) centralizes insights and analytics across the US enterprise, empowering our organization to make informed and interconnected decisions in service of patients and customers.
About the Role
Key responsibilities
Provide analytics support to Novartis internal customers on various high-complexity analytical reports. Working knowledge of multiple datasets (e.g., LAAD, Xponent, SMART), managing and organizing data sets from databases to find patterns and trends, and transforming this complex and granular data into actionable insights. Put together specifications to extract/transform data into the required formats for different analytical elements using SQL, DSS, or other data processing tools. Requires experience in quantitative analysis with a demonstrated focus on analytics and experience with coding languages (SQL or Python) to query and extract data; experience with BI tools and very large data sets is a plus. Create the foundation for more sophisticated approaches to APLD analysis and advanced analytics wherever required and beneficial. Establish and maintain positive relationships with key functional stakeholders. Take the initiative to drive standardization of reports across brands.
Essential Requirements
Master's or Bachelor's in technology, life sciences, or management. Minimum of 3+ years' experience processing and analyzing real-world patient-level data (claims data, EHR, etc.). Expertise in SQL, Dataiku, and/or other data processing tools. Knowledge of statistical modeling or ML is a plus. Understanding of healthcare terminology and real-world patient-level data. Good communication and interpersonal skills. Conceptual, analytical, and tactical thinking; strategic thought process. Ability to multi-task, work in a demanding global team environment, and work under tight deadlines. Develop and maintain strong individual and team performance.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network
Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
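As an illustration of the "specifications to extract/transform data" responsibility above, here is a hedged SQL sketch (wrapped in Python) for pulling a simple patient cohort from a claims-style table. The table and field names are generic placeholders, not LAAD/Xponent schema, and the inclusion rules are assumptions for the example.

```python
# Hedged example of a cohort-extraction spec expressed as SQL.
# Table and column names are generic placeholders, not a real claims schema.
COHORT_QUERY = """
SELECT patient_id,
       MIN(claim_date) AS index_date,
       COUNT(*)        AS claim_count
FROM   pharmacy_claims
WHERE  product_code IN ('BRAND_A', 'BRAND_B')
  AND  claim_date BETWEEN '2024-01-01' AND '2024-12-31'
GROUP  BY patient_id
HAVING COUNT(*) >= 2          -- at least two qualifying claims
"""

def build_cohort(connection):
    """Run the cohort query on any DB-API 2.0 connection and return the rows."""
    cur = connection.cursor()
    cur.execute(COHORT_QUERY)
    return cur.fetchall()
```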
Posted 3 weeks ago
2.0 - 7.0 years
30 - 40 Lacs
Hyderabad, Gurugram
Work from Office
We are seeking a skilled MLOps/ML Engineer to serve as our subject matter expert for Dataiku DSS. In this pivotal role, you will manage and scale our end-to-end machine learning operations, all of which are built on the Dataiku platform. Key responsibilities include designing automated data pipelines, deploying models as production APIs, ensuring the reliability of scheduled jobs, and championing platform best practices. Extensive, proven experience with Dataiku is mandatory.
Gurgaon - Work from office. 30 to 40 LPA max. Immediate joiners or candidates serving notice (up to 2 weeks).
Data Pipeline Development: Design and implement Extract, Transform, Load (ETL) processes to collect, process, and analyze data from diverse sources.
Workflow Optimization: Develop, configure, and optimize Dataiku DSS workflows to streamline data processing and machine learning operations.
Integration: Integrate Dataiku DSS with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and big data technologies such as Snowflake, Hadoop, and Spark.
AI/ML Model Development & Implementation: Implement and optimize machine learning models within Dataiku for predictive analytics and AI-driven solutions.
MLOps & DataOps: Deploy data pipelines and AI/ML models within the Dataiku platform.
Dataiku Platform Management: Build, manage, and support the Dataiku platform.
Automation: Automate data workflows, monitor job performance, and ensure scalable execution.
Customization: Develop and maintain custom Python/R scripts within Dataiku to enhance analytics capabilities.
Required Skills and Qualifications:
Experience Level: 2 to 6 years of hands-on experience with the Dataiku DSS platform and data engineering.
Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field.
Technical Proficiency: Experience with the Dataiku DSS platform; strong programming skills in Python and SQL; familiarity with cloud services (AWS, Azure, GCP) and big data technologies (Hadoop, Spark).
Analytical Skills: Ability to analyze complex data sets and provide actionable insights.
Problem-Solving: Strong troubleshooting skills to address and resolve issues in data workflows and models.
Communication: Effective verbal and written communication skills to collaborate with team members and stakeholders.
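For the "automate data workflows and monitor job performance" duties above, here is a small, hedged monitoring sketch: an automated data-quality gate of the kind that could run as a scheduled step (in Dataiku, typically a scenario step or Python recipe). The thresholds and required columns are assumptions for illustration.

```python
# Hedged sketch of an automated data-quality gate for a scheduled pipeline.
# Thresholds and required columns are assumptions for illustration.
import pandas as pd

REQUIRED_COLUMNS = ["customer_id", "event_date", "amount"]
MAX_NULL_RATE = 0.02  # fail the run if more than 2% of amounts are missing

def quality_gate(df: pd.DataFrame) -> None:
    # Structural check: all expected columns must be present
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Missing required columns: {missing}")

    # Completeness check: cap the share of missing amounts
    null_rate = df["amount"].isna().mean()
    if null_rate > MAX_NULL_RATE:
        raise ValueError(f"Null rate {null_rate:.2%} exceeds {MAX_NULL_RATE:.0%}")

    # Uniqueness check: no duplicated customer/event rows
    if df.duplicated(subset=["customer_id", "event_date"]).any():
        raise ValueError("Duplicate customer/event rows detected")
```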
Posted 3 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Mumbai
Work from Office
About the Job:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.
Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.
Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.
Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.
Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
Posted 3 weeks ago
7.0 - 10.0 years
9 - 12 Lacs
Bengaluru
Work from Office
Posted 3 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Chennai
Work from Office
Posted 3 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Kolkata
Work from Office
Posted 3 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Company Description Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com. Job Description Carry out and, when required, supervise others carrying out, the extraction, formulation and manipulation of data. Interpret and, where appropriate, supervise others in the interpretation of data specifications. Apply the most relevant method to solve the client's problem while balancing data quality and project timeline. Develop creative solutions that better utilize Experian and client data and produce new insights. Ensure quality and accuracy of your own work and the work of any junior data engineer on the project, so that the analysis produced is accurate and completed to agreed timescales, resulting in positive feedback from the client/customer. Follow Business Line procedures and processes throughout the data extraction, formulation and manipulation process, ensuring all work is produced to the agreed specification and meets all requirements. Take responsibility for the production of high-quality results documentation. Identify and resolve data problems, producing appropriate analysis to verify results. Identify potential gaps in your own technical/business knowledge that might compromise the deadlines; work with Project Managers or Consultants to plan timely steps to fill gaps. Challenge existing processes and recommend improvements. Share findings with peers and management where required, and be prepared to communicate ideas in knowledge-sharing forums. Ensure the quality of all data sent to clients/customers. Analyse and confirm the integrity of source data to be evaluated. Collaborate with internal and external clients to determine the appropriate performance measures and statistical metrics to be applied during the analysis project. Document work completed in user-guide format, etc.; strong documentation skills are required. Communicate with the client and/or other departments/divisions within Experian to help ensure accurate and timely implementation of project work, and to enable them to manage client expectations. Potential to build models for our Experian Marketplace; familiarity with programs like Dataiku and other model-building tools, and with modeling techniques like linear regression, is extremely useful for success in this role. Qualifications Associate's degree or higher in a quantitative discipline like Business Analytics or Computer Science. Strong quantitative, qualitative and communication skills when working with others as a remote employee. Be able to demonstrate how you communicate to 'win' and what that means to you. Possesses strong time-management skills: as a remote employee, you will have to stay on task, stay focused and be a strong communicator who ensures deadlines are met and exceeded. 
High degree of collaboration as a remote employee on the Boost team is critical to success. Be flexible about the type of projects you work on, which will vary in difficulty. Possesses the ability to provide solutions to problems that might not have a clear answer, or to talk through how you would approach them, drawing on school or prior professional projects. Possesses a wide variety of technical and programming skills (e.g., SAS, Python, SQL); knowing SQL and Python is strongly preferred. Effectively uses project management / planning tools. Ensure full understanding of the client's/customer's requirements; asking questions is a plus on our team. Ensure that all written and verbal communication is clear and professional. Additional Information Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is an important part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Experian Careers - Creating a better tomorrow together Find out what it's like to work for Experian by clicking here
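The posting above mentions model-building tools and techniques such as linear regression. Purely as an illustrative sketch (the data and feature count are invented, not Experian's), fitting and scoring such a model in Python might look like this:

```python
# Illustrative linear regression on synthetic data; not a real Experian workflow.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))                      # three made-up attributes
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))
```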
Posted 3 weeks ago
1.0 - 3.0 years
4 - 9 Lacs
Hyderābād
On-site
Summary -Provide analytics support to Novartis internal customers (CPOs & Regional marketing and sales teams) on various low-medium complexity analytical reports. -Support and facilitate data enabled decision making for Novartis internal customers by providing and communicating qualitative and quantitative analytics. -Support GBS-GCO business in building practice by involving in various initiatives like knowledge sharing, on-boarding and training support, support team lead in all business related tasks / activities, building process documentation and knowledge repositories -To be an integral part of a comprehensive design team responsible for designing promotional marketing materials. About the Role Location – Hyderabad #Hybrid About the role: Novartis specialists within Data and Product Solutions are on a data and digital transformation journey, leveraging analytics to generate actionable insights for Novartis medicines impacting more than 799 million patients worldwide. The team is poised to enable easier, faster, and reliable decisions for Novartis divisions across the globe. Key Responsibilities: Explore, develop, implement, and evaluate innovative solutions that address customer needs Co-create with key partners to build partnerships & collaborations Develop and coordinate project plans across the design, development, and production stages of a project to support successful delivery within set KPIs Work in collaboration with brand teams, technical teams & all functions to improve value Collaborate with the global and local Brand teams on project planning and delivery management through proven analytics-based solutions Take initiative to drive standardization of reports across brands, data, and platforms Essential Requirements: Experience (1-3 years) in data analytics in a market research firm or pharmaceutical company or Pharma KPO Proficiency in SQL, Dataiku, PowerBI, Alteryx, Matillion, Excel, PowerPoint Exposure to US pharma datasets like APLD, SP, IQVIA Xponent, DDD, SMART, Affiliations, Promotional Activity, etc. Exposure to DevOps tools like Azure DevOps, JIRA Xray, etc. Exposure to workflow management and automation tools like SharePoint and MS PowerApps, and to testing techniques on RPA, etc., is preferred Proven communication, presentation and stakeholder management skills Strong and proactive business results focus, and proven ability to provide insights Strong analytical thinking with a problem-solving approach Ability to drive initiatives and deliver Desirable requirements: Exposure to Python is preferred Should have worked in an international company with exposure to healthcare analytics and working in a cross-cultural environment. Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve. Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message. Why Novartis: Helping people with disease and their families takes more than innovative science. 
It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards Division Operations Business Unit Universal Hierarchy Node Location India Site Hyderabad (Office) Company / Legal Entity IN10 (FCRS = IN010) Novartis Healthcare Private Limited Functional Area Marketing Job Type Full time Employment Type Regular Shift Work No
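Several of the deliverables named above (territory sales performance and activity reports) reduce to aggregating record-level data into brand- and territory-level summaries. A minimal, hypothetical sketch in Python/pandas follows; the column names and figures are invented, not a real pharma dataset schema:

```python
# Hypothetical territory sales summary; columns and values are illustrative only.
import pandas as pd

records = pd.DataFrame({
    "territory": ["North", "North", "South", "South"],
    "brand":     ["Brand A", "Brand B", "Brand A", "Brand B"],
    "trx":       [120, 80, 95, 60],   # prescription counts (made up)
})

summary = (
    records.groupby(["territory", "brand"], as_index=False)["trx"]
           .sum()
           .sort_values("trx", ascending=False)
)
print(summary.to_string(index=False))
```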
Posted 3 weeks ago
6.0 years
6 - 8 Lacs
Hyderābād
On-site
About Deloitte Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as “Deloitte Global”) does not provide services to clients. In the United States, Deloitte refers to one or more of the US member firms of DTTL, their related entities that operate using the “Deloitte” name in the United States and their respective affiliates. Certain services may not be available to attest clients under the rules and regulations of public accounting. Please see www.deloitte.com/about to learn more about our global network of member firms. USI Assistant Manager, Data Scientist- Strategic Analytics –Data Science Are you ready to apply your financial and analytical skills in a dynamic environment? Are you looking for an exciting opportunity to be a strategic business advisor to executive leaders? If so, Deloitte could be the place for you! Join our team of experienced financial professionals who support financial planning and analysis for Strategic Analytics in a fast-paced business climate. If you are seeking a role that offers you the opportunity to develop personally and professionally, consider a career with the Data Science team within Strategic Analytics at Deloitte. The team and the role The Financial Planning & Analysis (FP&A) organization supports our business, market and enabling area leaders by providing world class financial support. FP&A is made up of advisors who are influential in decision-making and partner with finance leaders to drive meaningful strategic and financial outcomes. Within FP&A, the Strategic Analytics (SA) team integrates big data analytics, strategic thinking, and deep firm knowledge to provide leadership with insights to drive informed decision-making and enable continued growth. The team partners with business and operational leaders to provide an impact by leveraging data and analytical tools, strategic thinking, and hypothesis-based analysis. Specific responsibilities and qualifications for the USI Data Scientist role are outlined below. 
Work you’ll do Core responsibilities Support large-scale data science projects that support key strategic priorities Support the development of first-class tools and insights for leadership by identifying key indicators of economic shifts, identifying business and/or policy changes, and preparing situational playbooks Support the automation and streamlining of projects by using advanced analytic techniques Understand fundamental procedures and processing to optimize data interactions and drive insights Facilitate storyboarding by performing data-driven analysis to inform prospective recommendations Work with cross-functional teams, including Deloitte’s businesses and Enabling Areas, to support key business functions Provide ad-hoc specialized analyses and support for leadership meetings Other responsibilities Support the Lead Data Scientist by maintaining project plans and milestones Lay out the elements of a story using logical structuring and assist in developing presentation content Build supporting materials that assist leaders and stakeholders in the decision-making process Support leaders in strategic activities and engage in stretch opportunities aligned with professional development goals and strengths Qualifications Required: Bachelor’s degree in Finance, Accounting, Computer Science or a related subject Minimum of 6 years of relevant experience Demonstrated accomplishments in the following areas: ― Financial reporting and analysis ― Data analysis, management, and visualization ― Machine learning algorithms (e.g., clustering, logistic regression) ― Artificial intelligence software (e.g., Dataiku, DataRobot) ― Advanced skills in MS Office (Excel, PowerPoint, Outlook, Teams) ― Advanced understanding of data science and visualization tools (Python, R, SQL) Preferred: Advanced education degree a plus Experience in a professional services firm is a plus Proficiency in Tableau Location: Hyderabad Shift timing: 2pm to 11pm How You’ll Grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help sharpen skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. 
Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Recruiter tips We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits to help you thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300329
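Among the qualifications above are machine learning algorithms such as clustering and logistic regression. As a hedged sketch only (synthetic data and arbitrary parameters, not a Deloitte artefact), a basic k-means clustering run in Python looks like this:

```python
# Basic k-means on synthetic 2-D data; cluster count and data are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
points = np.vstack([
    rng.normal(loc=0.0, scale=0.3, size=(50, 2)),   # first synthetic group
    rng.normal(loc=3.0, scale=0.3, size=(50, 2)),   # second synthetic group
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print("cluster sizes:", np.bincount(kmeans.labels_))
```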
Posted 3 weeks ago
0 years
4 - 7 Lacs
Hyderābād
Remote
Posted 3 weeks ago
8.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary A Data Analyst in the Financial Crime Surveillance Operations (FCSO) Performance and Metrics Management function interprets data and helps turn it into information that enables or improves a business process, thus affecting business decisions within FCSO. The FCSO Data Analyst gathers information from various sources and interprets patterns and trends to make them digestible for others, after which they are reported in the FCSO Scorecard. They must have strong analytical skills, but above all a burning curiosity to understand, and make sense of, data. Responsibilities Acquire a detailed understanding of the tools for sourcing, transforming, analysing and visualising the data required to manage FCSO performance metrics and the Scorecard Define clear, concise and detailed business requirements for FCSO data that clearly document the data elements and formats that are needed, outline detailed transformation expectations and list the critical data elements that will enable downstream processes to operate effectively Create and maintain documentation that articulates the process by which data is extracted, transformed and loaded in FCSO and that can be shared and understood by others Work with downstream FCSO business process owners to constantly improve, refine and expand the datasets to improve the quality and effectiveness of those processes, as well as help them to make sense of the data, providing training where required, and derive meaningful BI / MI Conduct detailed analysis of upstream changes that impact FCSO data – for example the introduction of a new product – to ensure that requirements remain up to date, and define any new ones as necessary Identify areas of overlap or data gaps that can lead to increased value, either by eliminating redundant processes or expanding existing data models Produce accurate and insightful dashboards and reports detailing the health, content and insights available from the data, making them actionable for stakeholders and meaningful for management decision making Participate in Agile ceremonies as a functional data expert and work with a cross-functional agile team Innovate in how we present data to senior management, turning metrics into actionable insights that enable the business to take data-driven decisions. Strategy Contribute to FCSO Data and Reporting team strategic solutions and initiatives Business Define clear, concise and detailed business requirements for FCSO data that clearly document the data elements and formats that are needed, outline detailed transformation expectations and list the critical data elements that will enable downstream processes to operate effectively Key Responsibilities Governance Follow the TTO and FCSO change governance process, document all changes and communicate with stakeholders for UVT (User Verification Testing). Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. 
Key Stakeholders They will work closely with: the FCSO Management Team, who provide the team's priorities in terms of metrics to be reported and managed, requirements, objectives, and strategy; FCSO Data Squads, who are managing the MI transformation and working with the FCSO Performance and Metrics Management team to define, prioritise, and operationalise the use of the FCSO metrics; FCSO Data Quality Analysts, who define data quality control requirements and oversee these on a day-to-day basis to ensure constant system health; upstream data teams, who provide the data that the analyst is sourcing; and downstream process owners, who depend on the data to perform their business function. Data Analysts spend much of their time working with stakeholders to define data requirements and data transformation logic and supporting the delivery of these requirements from start to finish. They are experts in profiling data to understand its contents and will also have a working understanding of the business process or product that generated it in the first place. Data Analysts are the entry point to the FCSO Data Team for most external stakeholders and as such will have a broad, but still detailed, understanding of all the data available and constantly seek opportunities for innovation and expansion. They are the primary liaison between up- and downstream teams. Other Responsibilities Embed Here for good and the Group’s brand and values in India / OPS FCSO / Data and Reporting; Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures; Multiple functions (double hats); Processes Work with downstream FCSO business process owners to constantly improve, refine and expand the datasets to improve the quality and effectiveness of those processes, as well as help them to make sense of the data, providing training where required, and derive meaningful BI / MI People & Talent Learn the FCSO processes, systems and data regularly and apply the knowledge in Data and MI ETL (Extraction, Transformation and Loading) and report development. Risk Management Learn the FCSO risk management framework, raise issues in M7 diligently and close them in a timely manner. Knowledge: Advanced data management techniques with extensive experience. 8-10 years of industry experience as a Business/Data Analyst with 6-8 years’ experience in data analysis using tools such as Tableau, Dataiku, MSTR, SQL and Excel. Technical Skills: Tableau, MSTR, Dataiku, Python, SQL. Practical knowledge of data in various forms (data warehouses/SQL, unstructured data environments/PIG, HIVE, Impala, PySpark, think-cell and pivot tables in Excel); Experience working within process management and improvement methodologies – Lean, Six Sigma, etc. – and demonstrated knowledge of data governance, data quality management concepts and data quality tools (e.g., Informatica DQ); Understanding of Agile development methodologies, software design patterns, network design and architecture; Experience in quantitative analysis. Past work experience using both Tableau and Dataiku/PySpark will be an added advantage. Stress Management: The Manager, Data Analyst must be able to work well under pressure and achieve results within the scheduled timeframe. Communication skills: The role involves working with various cross-functional teams, technology, and the management team, so exceptional written and verbal communication skills are crucial to performing the job duties effectively. 
Skills And Experience Data Analytics and Visualisation Tools – Tableau (preferable), PowerBI, Dataiku (preferable), MSTR, DataRobot or Paxata FCC/FCSO knowledge / past work experience Microsoft Office: PPT, Excel, Macros Agile tools: Confluence, JIRA. SQL, Python, PySpark Qualifications EDUCATION Graduate / Master’s degree and 8-10 years of banking industry experience in data analysis using Tableau & Dataiku / SQL CERTIFICATIONS Tableau (preferable), Dataiku, MSTR, Python, SQL, PySpark. Practical knowledge of data in various forms (data warehouses/SQL, unstructured data environments/PIG, HIVE, Impala, think-cell and pivot tables in Excel) LANGUAGES ENGLISH About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
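The skills list above includes SQL and PySpark for profiling and transforming data. The following is a small, assumed sketch of that kind of data-quality check; the path, table and column names are placeholders, not Standard Chartered systems:

```python
# Illustrative PySpark profiling step: load a source table, standardise a field,
# and count nulls in a critical data element. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("profiling_sketch").getOrCreate()

txns = spark.read.parquet("/data/raw/transactions")          # hypothetical path
profiled = txns.withColumn("country", F.upper(F.col("country")))

missing_ids = profiled.filter(F.col("customer_id").isNull()).count()
print(f"rows missing customer_id: {missing_ids}")
```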
Posted 3 weeks ago