
596 Teradata Jobs - Page 19


2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
The Teradata Pricing Analyst or Manager is responsible for bid and P&L development for accounts in APJ. The Pricing Analyst will perform detailed pricing and P&L analysis for these deals using Teradata’s Pricing Tools and Pricing Automation.
- Evaluate bid pricing and terms & conditions, identify risks and options, and offer recommendations to sales management
- Provide price structuring and scenario analysis relevant to sales strategy
- Provide advice and business consulting for sales teams and the Unit VP
- Interact on pricing projects and present to Teradata LT members
- Understand data analytics technology and competitor products and tactics
- Participate in many aspects of sales-oriented activities, including customer calls and visits with the sales team
- Provide pricing for Support and Consulting/Professional Services
- Participate in and provide input to account planning sessions
- Consolidate opportunity pricing and prepare revenue recognition analysis for major bids
- Submit approvals via CPQ in Salesforce
- Ensure that appropriate levels of approval have been obtained at region and corporate level for all bids, as determined by Corporate Policy
- Gather and disseminate general product pricing information
- Monitor price and cost changes, and other product information as it impacts deals
- Provide opportunity and competitive feedback to Americas region/corporate-level Pricing
- Provide general pricing, business, and accounting advice for sales teams
- Maintain an Excel tracking sheet with updated sales opportunity data
- Upload completed proposals to our internal shared site (Seismic)
- Pre-populate customer questionnaires with responses from our Content Library

Who You’ll Work With
The Pricing Analyst will provide vital counsel to Sales, Sales Management, and Deal Managers on product alternatives, pricing options, competition, and price negotiation techniques, then provide and secure bid approval as required via Teradata’s Pricing Automation deployment in Salesforce. On a day-to-day basis, working with the sales teams, this includes information, analysis, and recommendations that assist the Unit VP and Sales management to win business and increase profitability. A close working relationship is required with Sales, Planning, Revenue Recognition, Product Management, and other functions. In addition, you will provide support in the tracking and data management of our customer-facing proposals. The position reports to the APJ Deal Desk Director.

What Makes You a Qualified Candidate
- Bachelor’s degree; minimum 2-5 years of related work experience, or college new hire with internship experience in Finance
- Strong capability in Excel, PowerPoint, and other Windows applications
- Demonstrated analytical/financial skills gained through formal education, certification, or work experience
- Self-directed individual
- Good command of English
- Mandatory experience in the Japanese language (JLPT N1 or N2)

What You’ll Bring
- Knowledge of web-based applications, sales support tools, or modern programming languages
- Experience in Finance, Technology, or a sales-support function
- Knowledge of Data Warehousing / Analytics and Cloud
- MBA; prior Pricing, Solution Architect, or Financial experience

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
The Teradata Pricing Analyst or Manager is responsible for bid and P&L development for accounts in the Americas and APJ. The Pricing Analyst will perform detailed pricing and P&L analysis for these deals using Teradata’s Pricing Tools and Pricing Automation.
- Evaluate bid pricing and terms & conditions, identify risks and options, and offer recommendations to sales management
- Provide price structuring and scenario analysis relevant to sales strategy
- Provide advice and business consulting for sales teams and the Unit VP
- Interact on pricing projects and present to Teradata LT members
- Understand data analytics technology and competitor products and tactics
- Participate in many aspects of sales-oriented activities, including customer calls and visits with the sales team
- Provide pricing for Support and Consulting/Professional Services
- Participate in and provide input to account planning sessions
- Consolidate opportunity pricing and prepare revenue recognition analysis for major bids
- Submit approvals via CPQ in Salesforce
- Ensure that appropriate levels of approval have been obtained at region and corporate level for all bids, as determined by Corporate Policy
- Gather and disseminate general product pricing information
- Monitor price and cost changes, and other product information as it impacts deals
- Provide opportunity and competitive feedback to Americas region/corporate-level Pricing
- Provide general pricing, business, and accounting advice for sales teams

Who You’ll Work With
The Pricing Analyst will provide vital counsel to Sales, Sales Management, and Deal Managers on product alternatives, pricing options, competition, and price negotiation techniques, then provide and secure bid approval as required via Teradata’s Pricing Automation deployment in Salesforce. On a day-to-day basis, working with the sales teams, this includes information, analysis, and recommendations that assist the Unit VP and Sales management to win business and increase profitability. A close working relationship is required with Sales, Planning, Revenue Recognition, Product Management, and other functions. The position reports to the APJ and Americas Deal Desk Director.

What Makes You a Qualified Candidate
- Bachelor’s degree; minimum 2-5 years of related work experience, or college new hire with internship experience in Finance
- Strong capability in Excel, PowerPoint, and other Windows applications
- Demonstrated analytical/financial skills gained through formal education, certification, or work experience
- Self-directed individual
- Good command of English
- Able to work a USA workday

What You’ll Bring
- Knowledge of web-based applications, sales support tools, or modern programming languages
- Experience in Finance, Technology, or a sales-support function
- Knowledge of Data Warehousing / Analytics and Cloud
- MBA; prior Pricing, Solution Architect, or Financial experience

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Role: Tableau Developer
Experience: 4-7 years
Location: Pune (hybrid)

Role Purpose:
Working in Business Intelligence requires a good understanding of the business context and the business requirements. The focus of the role is the development and maintenance of BI dashboards and BI self-service front-end tools for data analysis by business users, in close collaboration with multidisciplinary teams in other IT departments. Individuals in this role possess a strong understanding of data visualization concepts and best practices, and the ability to create visualizations that help the business make the right interpretation quickly, accurately, and with ease. They need to be sensitive to data security, data quality, business SLAs, and timeliness.

Role Description:
- Design, create, and maintain dashboards and reports based on the following tools: Tableau, Business Objects, and BO Universes
- Schedule and run reports
- Distribute reports
- Check report failures and errors in dashboards and reports
- Communicate issues/delays to users
- Support users on dashboard- and reporting-related issues/questions/service requests
- Support content-related analysis
- Support user access management
- Support maintenance of the test and pre-production Tableau and BO servers

Basic technologies: Tableau, Microsoft Office (Excel, Access, VBA), BO, SQL (Teradata)

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Big Data Engineer

Primary Skills:
- Hadoop (HDFS, MapReduce, YARN)
- Spark (SQL, DataFrame)
- ETL/ELT (professional experience with Teradata, Ab Initio)
- Python or PySpark
- MongoDB, GCP, BigQuery
- CI/CD, Hive
- Unix, Autosys

Responsibilities:
- Design, develop, and test robust ETL/ELT data pipelines using MapReduce and Spark
- Process large datasets in multiple file formats such as CSV, JSON, Parquet, and Avro
- Perform metadata configuration and optimize job performance
- Analyze and recommend changes to data models (E-R and dimensional models) for enhanced efficiency
- Collaborate with cross-functional teams to ensure smooth data workflows and processing
- Implement best practices in coding, performance tuning, and process automation
- Lead the team in troubleshooting complex data issues and provide guidance on best approaches
- Ensure high-quality delivery of data pipelines with regular performance and scalability checks
- Conduct code reviews and mentor junior engineers on technical skills and best practices
- Design and optimize processes for scalable data storage, management, and access

Eligibility Criteria:
- 10+ years of experience in Big Data Engineering with hands-on expertise in Hadoop and Spark
- Strong understanding and practical experience in designing, coding, and testing ETL/ELT pipelines
- Proficiency in Spark (SQL and DataFrame) for processing large datasets
- Experience with data models (E-R and dimensional) and their optimization
- Strong skills in Unix shell scripting (simple to moderate)
- Familiarity with the Sparkflow framework (preferred)
- Proficiency in SQL, GCP BigQuery, and Python is desirable
- Strong problem-solving and analytical skills
- Good communication and collaboration skills to work in a cross-functional team environment
- Ability to work independently and manage multiple tasks simultaneously
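The multi-format ingestion responsibility above (CSV, JSON, Parquet, Avro) amounts to dispatching each source format to the right reader and normalizing the result into a common tabular form, which Spark does via its DataFrameReader. A minimal, dependency-free sketch of the same dispatch idea, using only Python's standard library rather than Spark, with hypothetical field names:

```python
import csv
import io
import json

def read_records(fmt: str, payload: str) -> list[dict]:
    """Parse raw text into a list of row dicts, dispatching on format.
    A toy stand-in for Spark's spark.read.format(fmt).load(path);
    Parquet/Avro would need third-party readers and are omitted here."""
    if fmt == "csv":
        # Header row becomes the dict keys; all values arrive as strings.
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        # JSON Lines: one object per line, the layout Spark's JSON reader expects.
        return [json.loads(line) for line in payload.splitlines() if line.strip()]
    raise ValueError(f"unsupported format: {fmt}")

csv_rows = read_records("csv", "id,city\n1,Hyderabad\n2,Pune")
json_rows = read_records("json", '{"id": 3, "city": "Mumbai"}')
```

In a real Spark pipeline the equivalent step would be a one-liner per format; the point here is only the format-dispatch structure a pipeline's ingestion layer needs.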

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Greater Chennai Area

On-site


The AI, Data, and Analytics (AIDA) organization, a Pfizer Digital organization, is responsible for the development and management of all data and analytics tools and platforms across the enterprise – from global product development, to manufacturing, to commercial, to the point of patient care in over 100 countries. One of the team’s top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products, which will serve as an enabler for the company’s digital transformation to bring innovative therapeutics to patients.

Role Summary
We are looking for a technically skilled and experienced Reporting Engineering, Senior Associate who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences. In this position, you will be responsible for the development of crucial business operations reports and dashboard products that drive company performance by continuously monitoring, measuring, identifying root causes, and proactively identifying patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with the lead architect and lead engineers to develop reporting capabilities that elevate Customer Experience. It requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations.

The Reporting Senior Associate will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

Role Responsibilities
- Engineering developer of business intelligence and data visualization products in service of the field force and HQ enabling functions
- Act as a technical BI & Visualization developer on projects and collaborate with global team members (e.g. other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale
- Thorough understanding of data, business, and analytic requirements (incl. BI product blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines
- Deliver quality Functional Requirements and Solution Design, adhering to established standards and best practices
- Follow Pfizer processes in Portfolio Management, Project Management, and the Product Management Playbook, following the Agile, Hybrid, or Enterprise Solution Life Cycle
- Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS
- Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g. AWS Redshift, MS SQL, Snowflake, Oracle, Teradata)

Qualifications
- Bachelor’s degree in a technical area such as computer science, engineering, or management information science
- Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred; domain knowledge in the pharmaceutical industry preferred
- Good knowledge of data governance and data cataloging best practices
- 2+ years of relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management
- Strong business analysis acumen to meet or exceed business requirements, following User-Centered Design (UCD)
- Working experience with testing of BI and analytics applications: unit testing (e.g. phased or Agile sprints or MVP), System Integration Testing (SIT), and User Acceptance Testing (UAT)
- Experience with technical solution management tools such as JIRA or GitHub
- Stay abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools

Technical Skillset
- 2+ years of hands-on experience in developing BI capabilities using MicroStrategy
- Proficiency in common BI tools, such as Tableau and Power BI, is a plus
- Understanding of dimensional data modelling principles (e.g. star schema)
- Develop using a design system for reporting as well as ad hoc analytics templates
- Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable
- Experience with AWS services (EC2, EMR, RDS, Spark) is preferred
- Solid understanding of Scrum/Agile is preferred, with working knowledge of CI/CD, GitHub, and MLflow
- Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred
- Great communication skills; great business influencing and stakeholder management skills

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech
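The dimensional-modelling skill named above (a star schema: one central fact table joined to descriptive dimension tables, aggregated into a consumption layer for reports) can be sketched with Python's built-in sqlite3. All table and column names here are illustrative assumptions, not taken from any real reporting system:

```python
import sqlite3

# In-memory star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, units INTEGER,
                          FOREIGN KEY (product_id) REFERENCES dim_product(product_id));
INSERT INTO dim_product VALUES (1, 'Product A'), (2, 'Product B');
INSERT INTO fact_sales  VALUES (1, 10), (1, 5), (2, 7);
""")

# A report's "consumption layer" query: aggregate facts by a dimension attribute.
rows = con.execute("""
    SELECT p.product_name, SUM(f.units) AS total_units
    FROM fact_sales AS f
    JOIN dim_product AS p USING (product_id)
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
# rows is [('Product A', 15), ('Product B', 7)]
```

The same fact/dimension split is what a semantic layer or OLAP cube formalizes at scale; here it is reduced to the smallest query that shows the join-then-aggregate shape.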

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site


Role Summary
The AI, Data, and Analytics (AIDA) organization, a Pfizer Digital organization, is responsible for the development and management of all data and analytics tools and platforms across the enterprise – from global product development, to manufacturing, to commercial, to the point of patient care in over 100 countries. One of the team’s top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products, which will serve as an enabler for the company’s digital transformation to bring innovative therapeutics to patients.

We are looking for a technically skilled and experienced Reporting Engineering Manager who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences. In this position, you will be accountable for a thorough understanding of data, business, and analytic requirements in order to deliver high-impact, relevant, interactive data visualization products that drive company performance by continuously monitoring, measuring, identifying root causes, and proactively identifying patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with stakeholders to understand their needs and ensure that reporting assets are created with a focus on Customer Experience.

The role requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Manager will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

Role Responsibilities
- Engineering expert in business intelligence and data visualization products in service of the field force and HQ enabling functions
- Act as a technical BI & Visualization developer on projects and collaborate with global team members (e.g. other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale
- Thorough understanding of data, business, and analytic requirements (incl. BI product blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines
- Deliver quality Functional Requirements and Solution Design, adhering to established standards and best practices
- Follow Pfizer processes in Portfolio Management, Project Management, and the Product Management Playbook, following the Agile, Hybrid, or Enterprise Solution Life Cycle
- Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS
- Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g. AWS Redshift, MS SQL, Snowflake, Oracle, Teradata)

Qualifications
- Bachelor’s degree in a technical area such as computer science, engineering, or management information science
- Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred; domain knowledge in the pharmaceutical industry preferred
- Good knowledge of data governance and data cataloging best practices
- Relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management
- Strong business analysis acumen to meet or exceed business requirements, following User-Centered Design (UCD)
- Strong experience with testing of BI and analytics applications: unit testing (e.g. phased or Agile sprints or MVP), System Integration Testing (SIT), and User Acceptance Testing (UAT)
- Experience with technical solution management tools such as JIRA or GitHub
- Stay abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools

Technical Skillset
- 5+ years of hands-on experience in developing BI capabilities using MicroStrategy
- Proficiency in common BI tools, such as Tableau and Power BI, is a plus
- Common Data Model (logical & physical) and conceptual data model validation to create a consumption layer for reporting (dimensional model, semantic layer, direct database aggregates, or OLAP cubes)
- Develop using a design system for reporting as well as ad hoc analytics templates
- BI product scalability and performance tuning
- Platform administration and security; BI platform tenant management (licensing, capacity, vendor access, vulnerability testing)
- Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable
- Experience with AWS services (EC2, EMR, RDS, Spark) is preferred
- Solid understanding of Scrum/Agile is preferred, with working knowledge of CI/CD, GitHub, and MLflow
- Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred
- Great communication skills; great business influencing and stakeholder management skills

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Greater Chennai Area

On-site


The AI, Data, and Analytics (AIDA) organization, a Pfizer Digital organization, is responsible for the development and management of all data and analytics tools and platforms across the enterprise – from global product development, to manufacturing, to commercial, to the point of patient care in over 100 countries. One of the team’s top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products, which will serve as an enabler for the company’s digital transformation to bring innovative therapeutics to patients.

Role Summary
We are looking for a technically skilled and experienced Reporting Engineering Senior Manager who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences. In this position, you will be accountable for a thorough understanding of data, business, and analytic requirements in order to deliver high-impact, relevant, interactive data visualization products that drive company performance by continuously monitoring, measuring, identifying root causes, and proactively identifying patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with stakeholders to understand their needs and ensure that reporting assets are created with a focus on Customer Experience.

The role requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Senior Manager will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

Role Responsibilities
- Engineering expert in business intelligence and data visualization products in service of the field force and HQ enabling functions
- Act as a lead technical BI & Visualization developer on projects and collaborate with global team members (e.g. other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale
- Responsible for BI solution architecture design and implementation
- Thorough understanding of data, business, and analytic requirements (incl. BI product blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines
- Deliver quality Functional Requirements and Solution Design, adhering to established standards and best practices
- Follow Pfizer processes in Portfolio Management, Project Management, and the Product Management Playbook, following the Agile, Hybrid, or Enterprise Solution Life Cycle
- Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS
- Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g. AWS Redshift, MS SQL, Snowflake, Oracle, Teradata)

Qualifications
- Bachelor’s degree in a technical area such as computer science, engineering, or management information science
- 9+ years of relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management
- Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred; domain knowledge in the pharmaceutical industry preferred
- Good knowledge of data governance and data cataloging best practices
- Strong business analysis acumen to meet or exceed business requirements, following User-Centered Design (UCD)
- Strong experience with testing of BI and analytics applications: unit testing (e.g. phased or Agile sprints or MVP), System Integration Testing (SIT), and User Acceptance Testing (UAT)
- Experience with technical solution management tools such as JIRA or GitHub
- Stay abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools

Technical Skillset
- 9+ years of hands-on experience in developing BI capabilities using MicroStrategy
- Proficiency in industry-common BI tools, such as Tableau and Power BI, is a plus
- Common Data Model (logical & physical) and conceptual data model validation to create a consumption layer for reporting (dimensional model, semantic layer, direct database aggregates, or OLAP cubes)
- Develop using a design system for reporting as well as ad hoc analytics templates
- BI product scalability and performance tuning
- Platform administration and security; BI platform tenant management (licensing, capacity, vendor access, vulnerability testing)
- Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable
- Experience with AWS services (EC2, EMR, RDS, Spark) is preferred
- Solid understanding of Scrum/Agile is preferred, with working knowledge of CI/CD, GitHub, and MLflow
- Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred
- Great communication skills; great business influencing and stakeholder management skills

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

The AI, Data, and Analytics (AIDA) organization, a Pfizer Digital organization, is responsible for the development and management of all data and analytics tools and platforms across the enterprise – from global product development, to manufacturing, to commercial, to point of patient care in more than 100 countries. One of the team’s top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products, which will serve as an enabler for the company’s digital transformation to bring innovative therapeutics to patients. Role Summary We are looking for a technically skilled and experienced Reporting Engineering Senior Manager who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences. In this position, you will be accountable for developing a thorough understanding of data, business, and analytic requirements to deliver high-impact, relevant, interactive data visualization products that drive company performance through continuous monitoring, measurement, root-cause identification, and proactive identification of patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with stakeholders to understand their needs and ensure that reporting assets are created with a focus on Customer Experience. 
This role requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Manager will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry. Role Responsibilities Engineering expert in business intelligence and data visualization products in service of field force and HQ enabling functions. Act as a lead technical BI & Visualization developer on projects and collaborate with global team members (e.g. other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale. Responsible for BI solution architecture design and implementation. Thorough understanding of data, business, and analytic requirements (incl. BI Product Blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines. Deliver quality Functional Requirements and Solution Design, adhering to established standards and best practices. Follow the Pfizer Process in Portfolio Management, Project Management, and the Product Management Playbook, following Agile, Hybrid, or Enterprise Solution Life Cycle approaches. Extensive technical and implementation knowledge of a multitude of BI and Visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS. Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g. AWS Redshift, MS SQL, Snowflake, Oracle, Teradata). 
Qualifications Bachelor’s degree in a technical area such as computer science, engineering, or management information science. 9+ years of relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management. Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred. Domain knowledge in the pharmaceutical industry preferred. Good knowledge of data governance and data cataloging best practices. Strong business analysis acumen to meet or exceed business requirements following User-Centered Design (UCD). Strong experience with testing of BI and Analytics applications – Unit Testing (e.g. phased or Agile sprints or MVP), System Integration Testing (SIT), and User Acceptance Testing (UAT). Experience with technical solution management tools such as JIRA or GitHub. Stay abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools. Technical Skillset 9+ years of hands-on experience developing BI capabilities using MicroStrategy. Proficiency in industry-common BI tools, such as Tableau, Power BI, etc., is a plus. Common Data Model (Logical & Physical) and Conceptual Data Model validation to create a Consumption Layer for Reporting (Dimensional Model, Semantic Layer, Direct Database Aggregates, or OLAP Cubes). Develop using a Design System for Reporting as well as Ad hoc Analytics Templates. BI product scalability and performance tuning. Platform administration and security, BI platform tenant management (licensing, capacity, vendor access, vulnerability testing). Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable. Experience with AWS services (EC2, EMR, RDS) and Spark is preferred. Solid understanding of Scrum/Agile is preferred, along with working knowledge of CI/CD, GitHub, and MLflow. 
Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred. Strong communication skills. Strong business influencing and stakeholder management skills. Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates. Information & Business Tech
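The data quality checks and KPI validation this role describes can be illustrated with a minimal sketch. Everything here is an assumption for the example (the `sales_kpi` table, its columns, and the two rules); real checks would run against the warehouse tables feeding the dashboards.

```python
import sqlite3

# Build a tiny in-memory KPI table to validate (illustrative data only)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales_kpi (region TEXT, revenue REAL)")
cur.executemany(
    "INSERT INTO sales_kpi VALUES (?, ?)",
    [("NA", 120.0), ("EU", 95.5), ("APJ", 80.25)],
)

def run_quality_checks(cursor):
    """Return a list of (check_name, passed) pairs for simple DQ rules."""
    checks = []
    # Rule 1: no NULL or negative revenue values
    bad = cursor.execute(
        "SELECT COUNT(*) FROM sales_kpi WHERE revenue IS NULL OR revenue < 0"
    ).fetchone()[0]
    checks.append(("no_null_or_negative_revenue", bad == 0))
    # Rule 2: one KPI row per region (no duplicate grains)
    dupes = cursor.execute(
        "SELECT COUNT(*) FROM (SELECT region FROM sales_kpi"
        " GROUP BY region HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    checks.append(("one_row_per_region", dupes == 0))
    return checks

results = run_quality_checks(cur)
print(results)
```

In practice such rules would be parameterized per KPI and wired into the SIT/UAT cycle so anomalies are caught before a dashboard refresh.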

Posted 3 weeks ago

Apply

5.0 - 10.0 years

45 - 55 Lacs

Bengaluru

Work from Office

Naukri logo

As a Data Scientist lead within the Data and Analytics team, you will drive the analytics book of work, transforming Chase's cross-channel, cross-line-of-business acquisition strategy into a hyper-personalized, data-driven, customer-centric model. You will partner strategically across the firm with marketers, channel owners, digital experts, and the broader analytics community to help drive business goals through a deep understanding of marketing analytics and optimization. Job Responsibilities Develop the analytics framework and data infrastructure necessary to support the platform team in evaluating value addition. Define and assess the OKRs and goals related to the platform's performance. Provide top-tier business intelligence through dashboards and executive reporting to the Acquisitions Center of Excellence and Line of Business leadership. Construct business cases that drive prioritization and investment in Acquisition & Enablement Platforms. Communicate effectively with product, technology, data, and design teams to identify and advance a data-driven analytical roadmap. Serve as the Acquisition Center of Excellence Analytics local site lead, overseeing local operations and contributing to the expansion of the team's presence in India. Required Qualifications, Capabilities, and Skills 5+ years leveraging data visualization tools for data exploration and marketing performance evaluation. Proven ability to lead and manage teams effectively, showcasing strong leadership skills. Experience in querying big data platforms and SAS/SQL. Comfort building and managing relationships with both analytics and business stakeholders. Proven track record of problem-solving using data and building new analytics capabilities. Talent for translating numbers into an actionable story for business leaders. Experience with best-in-class web analytics tools (Google Analytics, Adobe/Omniture Insight/Visual Sciences, Webtrends, CoreMetrics, etc.). 
Superior written and oral communication and presentation skills, with experience communicating concisely and effectively with all levels of management and partners. Bachelor's degree in data science, mathematics, statistics, econometrics, engineering, MIS, finance, or a related field is required. Preferred Qualifications, Capabilities, and Skills Financial services experience preferred. Tableau experience preferred. Familiarity with Teradata, AWS, & Snowflake preferred.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

18 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Gracenote is the content business unit of Nielsen that powers the world of media entertainment. Our metadata solutions help media and entertainment companies around the world deliver personalized content search and discovery, connecting audiences with the content they love. We're at the intersection of people and media entertainment. With our cutting-edge technology and solutions, we help audiences easily find TV shows, movies, music, and sports across multiple platforms. As the world leader in entertainment data and services, we power the world's top streaming platforms, cable and satellite TV providers, media companies, consumer electronics manufacturers, music services, and automakers to navigate and succeed in the competitive streaming world. Our metadata entertainment solutions have a global footprint of 80+ countries, 100K+ channels and catalogs, 70+ sports, and 100M+ music tracks, all across 35 languages. Job Overview We are seeking an experienced Senior Data Engineer with 8-10 years of experience to join our Video engineering team at Gracenote - a NielsenIQ company. In this role, you will design, build, and maintain our data processing systems and pipelines. You will work closely with product managers, architects, analysts, and other stakeholders to ensure data is accessible, reliable, and optimized for business, analytical, and operational needs. Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes. Optimize data flow and collection for cross-functional teams. Build infrastructure required for optimal extraction, transformation, and loading of data. Ensure data quality, reliability, and integrity across all data systems. Collaborate with data scientists and analysts to help implement models and algorithms. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc. 
Create and maintain comprehensive technical documentation. Evaluate and integrate new data management technologies and tools. Requirements 5-8 years of professional experience in data engineering roles. Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred. Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata). Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink). Proficiency in at least one programming language such as Python, Java, or Scala. Experience with data modeling, data warehousing, and building ETL pipelines. Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi). Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred. Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred. Understanding of data governance and data security principles. Experience with version control systems (e.g., Git) and CI/CD practices. Preferred Skills and Qualities Experience with containerization and orchestration tools (Docker, Kubernetes). Basic knowledge of machine learning workflows and MLOps. Experience with NoSQL databases (MongoDB, Cassandra, etc.). Familiarity with data visualization tools (Tableau, Power BI, etc.). Experience with real-time data processing. Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.). Experience with infrastructure-as-code tools (Terraform, CloudFormation). Personal Attributes Strong problem-solving skills and attention to detail. Excellent communication skills, both written and verbal. Ability to work independently and as part of a team. Proactive approach to identifying and solving problems. Adaptability and willingness to learn new technologies.
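The extract-transform-load work with a built-in quality gate that this posting centers on can be sketched minimally. The CSV content, the `track_plays` table, and the "plays > 0" rule are invented for illustration; a production pipeline would pull from real sources and be orchestrated by a tool like Airflow.

```python
import csv
import io
import sqlite3

# Illustrative raw feed (normally read from a file or object store)
RAW_CSV = """track_id,plays
t1,10
t2,0
t3,25
"""

def extract(text):
    """Parse the raw CSV feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Keep only tracks that were actually played; cast plays to int."""
    return [(r["track_id"], int(r["plays"])) for r in rows if int(r["plays"]) > 0]

def load(records, conn):
    """Load transformed records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS track_plays (track_id TEXT, plays INTEGER)"
    )
    conn.executemany("INSERT INTO track_plays VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
records = transform(extract(RAW_CSV))
load(records, conn)

# Data-quality gate: loaded row count must match the transformed record count
loaded = conn.execute("SELECT COUNT(*) FROM track_plays").fetchone()[0]
assert loaded == len(records)
print(loaded)
```

Keeping extract, transform, and load as separate functions is what makes each stage independently testable, which is the point of the quality gate at the end.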

Posted 3 weeks ago

Apply

5.0 - 8.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Software Engineering Lead Analyst - Business Intelligence Position Overview We are looking for a Software Engineer to design, build, and QA business intelligence (BI) solutions using a variety of BI tools. Candidates must have solid experience designing, developing, testing, and implementing analytics reports using various technologies across traditional data center and cloud environments. This role will work directly with business partners and other IT team members to understand requirements and deliver effective solutions within an Agile methodology, and participate in all phases of the development and system support life cycle. Responsibilities Be comfortable working within a scrum team. Actively participate in scrum ceremonies (e.g. daily scrum, story refinement, sprint planning, Program Increment (PI) planning, retrospectives). Work with business and technical partners to understand the current-state system design and produce solutions for future capabilities. Design, develop, test, and triage data pipelines; this includes understanding source data and the transformation processes towards target states. Share your perspective and prior experiences for the betterment of the team. Qualifications Bachelor's degree in Software Engineering, Computer Science, or a related field, or equivalent work experience. Demonstrated understanding of agile development practices. Demonstrated understanding of cloud technologies; AWS preferred. Experience building interactive dashboards and reports with a Business Intelligence tool. Quality Assurance experience. Required Technical Skills: 5-8 years of hands-on experience using SAP BusinessObjects 4.x. Experience in Universe design and development using IDT. Experience with SAP BusinessObjects report development using Universes, Web Intelligence, and Rich Client. Run unit testing and validation of BusinessObjects reports and universes. Experience in scheduling and distribution of reports. 
Design reports against large volumes of data, with experience fine-tuning reports and queries. Demonstrated experience automating QA scripts within a CI/CD process. Proficiency in SQL and performance tuning on databases such as Teradata, Oracle, Postgres, and Snowflake. Preferred Technical Skills: Knowledge of cloud databases such as Databricks and Snowflake. Experience in other BI tools: Tableau, Power BI, Cognos. Knowledge of development languages such as Python and Scala. About Evernorth Health Services
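The query performance tuning this role calls for usually starts with reading the optimizer's plan before and after an index change. A minimal sketch, using SQLite's EXPLAIN QUERY PLAN as a stand-in for the explain facilities of Teradata, Oracle, Postgres, or Snowflake; the `claims` table and index name are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

def plan(sql):
    # The last column of EXPLAIN QUERY PLAN output describes the access path
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM claims WHERE member_id = 42"
before = plan(query)  # without an index the optimizer falls back to a scan
conn.execute("CREATE INDEX idx_claims_member ON claims (member_id)")
after = plan(query)   # with the index it can seek directly to the rows
print(before)
print(after)
```

The same before/after plan comparison is what an automated QA script in a CI/CD pipeline would assert on, so a regression to a full scan fails the build instead of reaching production.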

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Data n’ Analytics – Data Strategy - Assistant Director, Strategy and Transactions EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors. The opportunity We’re looking for an Assistant Director - Data Strategy. The main objective of the role is to develop and articulate a clear and concise data strategy aligned with the overall business strategy. Communicate the data strategy effectively to stakeholders across the organization, ensuring buy-in and alignment. Establish and maintain data governance policies and procedures to ensure data quality, security, and compliance. Oversee data management activities, including data acquisition, integration, transformation, and storage. Develop and implement data quality frameworks and processes. The role will primarily involve conceptualizing, designing, developing, deploying, and maintaining complex technology solutions that help EY solve business problems for its clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers, and other solution architects, and is also on-shore facing. Discipline Data Strategy Key Skills Strong understanding of data models (relational, dimensional), data warehousing concepts, and cloud-based data architectures (AWS, Azure, GCP). Proficiency in data analysis techniques (e.g., SQL, Python, R), statistical modeling, and data visualization tools. 
Familiarity with big data technologies such as Hadoop, Spark, and NoSQL databases. Client handling and communication, problem solving, systems thinking, passion for technology, adaptability, agility, analytical thinking, and collaboration. Skills And Attributes For Success 12-14 years of total experience with 8+ years in the Data Strategy and Architecture field. Solid hands-on 8+ years of professional experience designing and architecting data warehouses/data lakes on client engagements and helping create enhancements to a data warehouse. Architecture design and implementation experience with medium to complex on-prem to cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP). 5+ years of experience with Azure database offerings (Relational, NoSQL, Data Warehouse). 5+ years of experience with various Azure services preferred – Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services & Databricks. Minimum of 8 years of hands-on database design, modelling, and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL, and Azure Synapse. Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.). Strong creative instincts related to data analysis and visualization. Keen curiosity to learn the business methodology, data model, and user personas. Strong understanding of BI and DWH best practices, analysis, visualization, and the latest trends. 
Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade, and namespace management. Willingness to mentor team members. Solid analytical, technical, and problem-solving skills. Excellent written and verbal communication skills. Strong project and people management skills with experience serving global clients. To qualify for the role, you must have a Master's degree in Computer Science, Business Administration, or equivalent work experience. Fact-driven and analytically minded with excellent attention to detail. Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analysing large volumes of data. Relevant work experience of a minimum of 12 to 14 years in a Big 4 or technology/consulting setup. Help incubate new finance analytics products by executing pilot and proof-of-concept projects to establish capabilities and credibility with users and clients; this may entail working either as an independent SME or as part of a larger team. Ideally, you'll also have the ability to think strategically/end-to-end with a result-oriented mindset, the ability to build rapport within the firm and win the trust of clients, and a willingness to travel extensively and to work on client sites/practice office locations. Strong experience in SQL Server and MS Excel plus at least one other SQL dialect, e.g. MS Access, PostgreSQL, Oracle PL/SQL, or MySQL. Strong in Data Structures & Algorithms. Experience interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata, etc. Preferred exposure to JSON, Cloud Foundry, Pivotal, MatLab, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, Angular JS, Python, etc. 
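The dimensional modeling and star schemas this role calls for reduce, at their smallest, to a fact table joined to dimensions and aggregated. A minimal sketch; the `fact_sales`/`dim_product` names and data are invented for illustration.

```python
import sqlite3

# One fact table and one dimension: the smallest possible star schema
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'laptop'), (2, 'phone');
INSERT INTO fact_sales  VALUES (1, 3), (2, 5), (1, 2);
""")

# Typical consumption-layer query: aggregate the fact over a dimension attribute
rows = conn.execute("""
    SELECT d.category, SUM(f.qty) AS total_qty
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)
```

Keeping descriptive attributes in the dimension and additive measures in the fact is what lets BI tools generate exactly this join-and-aggregate pattern for any slice the business asks for.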
What We Look For A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally, with leading businesses across a range of industries. What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. 
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

16.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: Technical Program Manager - Data Engineering & Analytics. Experience: 16-20 years (relevant). Salary: Based on current CTC. Location: Chennai and Hyderabad. Notice Period: Immediate joiners only. Critical Expectations: 1) Candidate should have managed a team of at least 100 people. 2) Should have a minimum of 8 years of experience in Data and AI development. 3) Should have experience with complex data migrations to the cloud. Position Overview: We are seeking an experienced Program Manager to lead large-scale, complex Data, BI, and AI/ML initiatives. The ideal candidate will have a deep technical understanding of modern data architectures, hands-on expertise in end-to-end solution delivery, and a proven ability to manage client relationships and multi-functional teams. This role will involve driving innovation, operational excellence, and strategic growth within Data Engineering & Analytics programs. Job Description: Responsible for managing large and complex programs encompassing multiple Data, BI, and AI/ML solutions. Lead the design, development, and implementation of Data Engineering & Analytics solutions involving Teradata, Google Cloud Platform (GCP), AI/ML, Qlik, Tableau, etc. 
Work closely with clients to understand their needs and translate them into technology solutions. Provide technical leadership to solve complex business issues that translate into data analytics solutions. Prepare operational/strategic reports on defined cadences and present to steering and operational committees via WSRs, MSRs, etc. Responsible for ensuring compliance with defined service level agreements (SLAs) and key performance indicator (KPI) metrics. Track and monitor the performance of services, identify areas for improvement, and implement changes as needed. Continuously evaluate and improve processes to ensure that services are delivered efficiently and effectively. Proactively identify issues and risks, and prepare appropriate mitigation/resolution plans. Foster a positive work environment and build a culture of automation and innovation to improve service delivery performance. Develop the team as coach and mentor; support and manage team members. Create SOWs, proposals, solutions, and estimations for Data Analytics solutions. Contribute to building the Data Analytics and AI/ML practice by creating case studies, POCs, etc. Shape opportunities and create execution approaches throughout the lifecycle of client engagements. Collaborate with various functions/teams in the organization to support recruitment, hiring, onboarding, and other operational activities. Maintain positive relationships with all stakeholders and ensure proactive response to opportunities and challenges. Must Have Skills: Deep hands-on expertise in end-to-end solution life cycle management in Data Engineering and Data Management. 
Strong technical understanding of modern data architecture and solutions. Ability to execute implementation strategy through a roadmap and collaboration with different stakeholders. Understanding of cloud data architecture and data modeling concepts and principles, including cloud data lakes, warehouses and marts, dimensional modeling, star schemas, and real-time and batch ETL/ELT. Experience driving AI/ML and GenAI projects would be good to have. Experience with cloud-based data analytics platforms such as GCP, Snowflake, Azure, etc. Good understanding of SDLC and Agile methodologies. A telecom background would be good to have. Must have handled a team size of 50+. Qualification: 15-20 years of experience primarily working on Data Warehousing, BI & Analytics, and Data Management projects as a tech architect, in delivery, client relationship, and practice roles - involving ETL, reporting, big data, and analytics. Experience architecting, designing, and developing Data Engineering, Business Intelligence, and reporting projects. Experience working with data management solutions such as Data Quality, Metadata, Master Data, and Governance. Strong experience in cloud data migration programs. Focused on value, innovation, and automation-led account mining. Strong interpersonal, stakeholder management, and team building skills.
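The SLA compliance tracking mentioned above boils down to a simple metric: the share of items resolved within their agreed window. A hedged sketch; the ticket fields and the 4-hour threshold are assumptions for the example, not from the posting.

```python
from datetime import datetime, timedelta

# Assumed SLA: tickets must be resolved within 4 hours of being opened
SLA_WINDOW = timedelta(hours=4)

# Illustrative ticket data (opened/resolved timestamps)
tickets = [
    {"opened": datetime(2024, 1, 1, 9, 0), "resolved": datetime(2024, 1, 1, 11, 0)},
    {"opened": datetime(2024, 1, 1, 9, 0), "resolved": datetime(2024, 1, 1, 15, 0)},
    {"opened": datetime(2024, 1, 2, 8, 0), "resolved": datetime(2024, 1, 2, 10, 30)},
]

def sla_compliance(items, window):
    """Percentage of items resolved within the SLA window, one decimal place."""
    met = sum(1 for t in items if t["resolved"] - t["opened"] <= window)
    return round(100.0 * met / len(items), 1)

print(sla_compliance(tickets, SLA_WINDOW))  # 2 of 3 within 4h -> 66.7
```

A weekly status report (WSR) would typically present this figure per service line against the agreed SLA target.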

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Job Description The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. In addition, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged, high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes. Associate Program Manager Role And Responsibilities Represent eClerx in client pitches, external forums, and COE (Center of Excellence) activities to promote cloud engineering expertise. Lead research, assessments, and development of best practices to keep our cloud engineering solutions at the forefront of technology. Contribute to the growth of the cloud engineering practice through thought leadership, including the creation of white papers and articles. Lead and collaborate on multi-discipline assessments at client sites to identify new cloud-based opportunities. Provide technical leadership in the design and development of robust, scalable cloud architectures. Drive key cloud engineering projects, ensuring high performance, scalability, and adherence to best practices. 
Design and implement data architectures that address performance, scalability, and data latency requirements. Lead the development of cloud-based solutions, ensuring they are scalable, robust, and aligned with business needs. Anticipate and mitigate data bottlenecks, proposing strategies to enhance data processing efficiency. Provide mentorship and technical guidance to junior team members. Technical And Functional Skills Bachelor's degree with 10+ years of experience in data management and cloud engineering. Proven experience in at least 2-3 large-scale cloud implementations within industries such as Retail, Manufacturing, or Technology. Expertise in Azure Cloud, Azure Data Lake, Databricks, Teradata, and ETL technologies. Strong problem-solving skills with a focus on performance optimization and data quality. Ability to collaborate effectively with analysts, subject matter experts, and external partners. About Us At eClerx, we serve some of the largest global companies – 50 of the Fortune 500 are our clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value. About The Team eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. 
With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms. Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions. Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements. AI & Data Testing Consultant The position is suited for individuals who have demonstrated the ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL Tester is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into the target systems. They work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes. Education And Experience Education: B.Tech/M.Tech/MCA/MS/MBA. We require experienced ETL testers (Informatica PowerCenter) with 2-5 years of experience 
and having below skills: Required Skills Strong in Data warehouse testing - ETL and BI Strong Database Knowledge – Oracle, SQL Server, Teradata and Snowflake Strong SQL skills with experience in writing complex data validation SQL’s Experience working in Agile environment Experience creating test strategy, release level test plan and test cases Develop and Maintain test data for ETL testing Design and Execute test cases for ETL processes and data integration Good Knowledge of Rally, Jira and HP ALM Experience in ETL Automation and data validation using Python Document test results and communicate with stakeholders on the status of ETL testing Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. 
Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300067 Show more Show less
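The ETL validation work this posting names (comparing extracted source data against loaded target data with SQL and Python) can be sketched minimally. Everything below is illustrative, not from the posting: the table names, columns, and the in-memory sqlite3 database standing in for source and target systems (e.g. Oracle and Teradata) are all hypothetical.

```python
import sqlite3

# Hypothetical stand-in tables; a real test would point two connections
# at the actual source system and the warehouse target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def validate(conn, src, tgt, key, measure):
    """Compare row counts and a simple SUM checksum between source and target,
    and count keys present in the source but missing from the target."""
    src_count, src_sum = conn.execute(
        f"SELECT COUNT(*), SUM({measure}) FROM {src}").fetchone()
    tgt_count, tgt_sum = conn.execute(
        f"SELECT COUNT(*), SUM({measure}) FROM {tgt}").fetchone()
    # LEFT JOIN anti-join: source keys with no matching target row
    missing = conn.execute(
        f"SELECT COUNT(*) FROM {src} s LEFT JOIN {tgt} t "
        f"ON s.{key} = t.{key} WHERE t.{key} IS NULL").fetchone()[0]
    return {"count_match": src_count == tgt_count,
            "sum_match": src_sum == tgt_sum,
            "missing_in_target": missing}

result = validate(conn, "src_orders", "tgt_orders", "order_id", "amount")
```

In practice a tester would extend the checksum beyond SUM (e.g. per-column null counts, distinct counts) and log each mismatch rather than returning booleans.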

Posted 3 weeks ago

0 years

0 Lacs

Greater Kolkata Area

On-site


Summary

Position Summary: Strategy & Analytics, AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing as-a-service offerings for continuous insights and improvements

AI & Data Testing Consultant

The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL Tester is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into the target systems. They work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes.

Education and Experience
Education: B.Tech/M.Tech/MCA/MS/MBA
We require experienced ETL testers (Informatica PowerCenter) with 2-5 years of experience and the skills below.

Required Skills
- Strong in data warehouse testing, both ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata and Snowflake
- Strong SQL skills, with experience writing complex data validation queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira and HP ALM
- Experience in ETL automation and data validation using Python
- Document test results and communicate with stakeholders on the status of ETL testing

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300070

Posted 3 weeks ago

4.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Foreword:
At iGreenTree, we're passionate about empowering energy and utility providers with innovative IT solutions. With deep domain knowledge and a dedication to innovation, we help our clients stay ahead of the curve in a rapidly changing industry. Whether you need IT consulting, application development, system integration, or digital transformation services, our team of experts has the expertise to deliver the right solution for your business. Partner with iGreenTree to unleash the power of technology and achieve sustainable growth in today's dynamic landscape.

Who We Are Looking For:
An ideal candidate must demonstrate in-depth knowledge of RDBMS concepts and be experienced in writing complex queries and data integration processes in SQL/T-SQL and NoSQL. This individual will help design, develop and implement new and existing applications.

Roles and Responsibilities:
- Review existing database designs and data management procedures and provide recommendations for improvement.
- Provide subject matter expertise in the design of database schemas and perform data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities.
- Develop technical documentation as needed.
- Architect, develop, validate and communicate Business Intelligence (BI) solutions such as dashboards, reports, KPIs, instrumentation, and alert tools.
- Define data architecture requirements for cross-product integration within and across cloud-based platforms.
- Analyze, architect, develop, validate and support integrating data into SaaS platforms (ERP, CRM, etc.) from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), and RDBMS.
- Perform thorough analysis of complex data and recommend actionable strategies.
- Effectively translate data modeling and BI requirements into the design process.
- Big Data platform design, i.e. tool selection, data integration, and data preparation for predictive modeling.

Required Skills:
- Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models).
- 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, DataStage, etc.
- 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes.
- Exposure to a NoSQL technology, preferably MongoDB.
- Experience processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica, Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).
- Understanding of data warehousing concepts and decision support systems.
- Ability to deal with sensitive and confidential material and adhere to worldwide data security requirements.
- Experience writing documentation for design and feature requirements.
- Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS and Azure.
- Excellent communication and collaboration skills.

Posted 3 weeks ago

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Summary

Position Summary: Strategy & Analytics, AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing as-a-service offerings for continuous insights and improvements

AI & Data Testing Consultant

The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL Tester is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into the target systems. They work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes.

Education and Experience
Education: B.Tech/M.Tech/MCA/MS/MBA
We require experienced ETL testers (Informatica PowerCenter) with 2-5 years of experience and the skills below.

Required Skills
- Strong in data warehouse testing, both ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata and Snowflake
- Strong SQL skills, with experience writing complex data validation queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira and HP ALM
- Experience in ETL automation and data validation using Python
- Document test results and communicate with stakeholders on the status of ETL testing

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300070

Posted 3 weeks ago

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Overview
The Data Analyst will partner closely with business and S&T teams to prepare final analysis reports for stakeholders, enabling them to make important decisions based on various facts and trends, and will lead data requirement, source analysis, data analysis, data transformation and reconciliation activities. This role will interact with the DG, DPM, EA, DE, EDF, PO and D&AI teams on historical data requirements and on sourcing the data for the Mosaic AI program to scale the solution to new markets.

Responsibilities
- Lead data requirement, source analysis, data analysis, data transformation and reconciliation activities.
- Partner with the FP&A Product Owner and associated business SMEs to understand and document business requirements and associated needs.
- Analyze business data requirements and translate them into a data design that satisfies local, sector and global requirements.
- Use automated tools to extract data from primary and secondary sources.
- Use statistical tools to identify, analyse, and interpret patterns and trends in complex data sets for diagnosis and prediction.
- Work with engineers and business teams to identify process improvement opportunities and propose system modifications.
- Proactively identify impediments and look for pragmatic and constructive solutions to mitigate risk.
- Be a champion for continuous improvement and drive efficiency.

Preference will be given to candidates with a functional understanding of financial concepts (P&L, Balance Sheet, Cash Flow, Operating Expense) and experience modelling data and designing data flows.

Qualifications
- Bachelor of Technology from a reputed college
- Minimum 8-10 years of relevant work experience in data modelling/analytics
- Minimum 5-6 years of experience navigating data in Azure Databricks, Synapse, Teradata or similar database technologies
- Expertise in Azure (Databricks, Data Factory, Data Lake Store Gen2)
- Proficiency in SQL and PySpark to analyse data for both development validation and operational support is critical
- Exposure to GenAI
- Good communication and presentation skills are a must for this role.

Posted 3 weeks ago

2.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
At United, we care about our customers. To be the best airline in aviation history, we need to deliver the best service to our customers. And it takes a whole team of dedicated customer-focused advocates to make it happen! From our Contact Center to customer analytics, insights, innovation, and everything in between, the Customer Experience team delivers outstanding service and helps us run a more customer-centric and dependable airline.

Job Overview and Responsibilities
The team is currently looking for a well-rounded individual who has a passion for data and analytics. The role requires supporting the team by gathering data, conducting analyses, verifying reports and assisting in ad-hoc decision support. Excellent time management, strong analytical capabilities and communication skills are key to success in this role.
- Extract and analyze data from relational databases and reporting systems
- Proactively identify problems and opportunities and perform root cause analysis/diagnosis leading to business impact
- Identify issues, create hypotheses, and translate data into meaningful insights; present recommendations to key decision makers
- Develop and maintain reports, analyses and dashboards to drive key business decisions
- Build predictive models and analyze results for dissemination of insights to leadership of the numerous operations groups at United
- Prepare presentations that summarize data and help facilitate decision-making for business partners and the senior leadership team

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications
What's needed to succeed (Minimum Qualifications):
- Bachelor's degree required
- 2-4 years of analytics-related experience
- Proficiency in Microsoft Excel and PowerPoint
- Competence in querying and manipulating relational databases via Teradata SQL Assistant, Microsoft SQL Server or Oracle SQL Developer
- Proficiency in at least one quantitative analysis tool: Python or R
- Familiarity with one or more reporting tools: Tableau, Oracle OBIEE or TIBCO Spotfire
- Detail-oriented, thorough and analytical, with a desire for continuous improvement
- Adept at juggling several projects and initiatives simultaneously through appropriate prioritization
- Fluency in written and spoken English
- Must be legally authorized to work in India for any employer without sponsorship
- Successful completion of an interview is required to meet job qualifications
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- MBA preferred

GGN00002020

Posted 3 weeks ago

8.0 - 10.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site


Overview
The Data Analyst will partner closely with business and S&T teams to prepare final analysis reports for stakeholders, enabling them to make important decisions based on various facts and trends, and will lead data requirement, source analysis, data analysis, data transformation and reconciliation activities. This role will interact with the DG, DPM, EA, DE, EDF, PO and D&AI teams on historical data requirements and on sourcing the data for the Mosaic AI program to scale the solution to new markets.

Responsibilities
- Lead data requirement, source analysis, data analysis, data transformation and reconciliation activities.
- Partner with the FP&A Product Owner and associated business SMEs to understand and document business requirements and associated needs.
- Analyze business data requirements and translate them into a data design that satisfies local, sector and global requirements.
- Use automated tools to extract data from primary and secondary sources.
- Use statistical tools to identify, analyse, and interpret patterns and trends in complex data sets for diagnosis and prediction.
- Work with engineers and business teams to identify process improvement opportunities and propose system modifications.
- Proactively identify impediments and look for pragmatic and constructive solutions to mitigate risk.
- Be a champion for continuous improvement and drive efficiency.

Preference will be given to candidates with a functional understanding of financial concepts (P&L, Balance Sheet, Cash Flow, Operating Expense) and experience modelling data and designing data flows.

Qualifications
- Bachelor of Technology from a reputed college
- Minimum 8-10 years of relevant work experience in data modelling/analytics
- Minimum 5-6 years of experience navigating data in Azure Databricks, Synapse, Teradata or similar database technologies
- Expertise in Azure (Databricks, Data Factory, Data Lake Store Gen2)
- Proficiency in SQL and PySpark to analyse data for both development validation and operational support is critical
- Exposure to GenAI
- Good communication and presentation skills are a must for this role.

Posted 3 weeks ago

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About This Role
Wells Fargo is seeking a Financial Crimes Associate Manager.

In This Role, You Will
- Supervise entry- to mid-level roles in transactional or tactical, less complex tasks and processes to ensure timely completion, quality and compliance
- Manage the implementation of procedures, controls, analytics and trend analysis to ensure identification, prevention, detection, investigation, recovery, and government and internal reporting of financial crime activity
- Maintain awareness of financial crimes activity companywide and ensure all issues are proactively addressed, and escalated where necessary
- Ensure compliance with regulatory requirements such as the Bank Secrecy Act, USA PATRIOT Act, and FATCA
- Identify opportunities for process improvement and risk control development in less complex functions
- Manage a risk-based financial crimes program or functional area with low to moderate risk and complexity
- Lead implementation of multiple complex initiatives with low to moderate risk
- Make supervisory and tactical decisions and resolve issues related to team supervision, work allocation and daily operations under the direction of functional area management
- Leverage interpretation of policies, procedures, and compliance requirements
- Collaborate and consult with peers, colleagues and managers
- Ensure coordination with the team, line of business, other business units, Audit, and regulators on risk-related topics
- Manage allocation of people and financial resources for Financial Crimes
- Mentor and guide talent development of direct reports and assist in hiring talent

Required Qualifications:
- 2+ years of Financial Crimes, Operational Risk, Fraud, Sanctions, Anti-Bribery, or Corruption experience, or the equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- 1+ years of leadership experience

Desired Qualifications:
- Hands-on experience as a people manager in a financial institution, managing a team of financial crime data quality, model development and financial reporting analysts
- Experience managing procedures and controls for teams that support data analytics, data platforms, and reporting
- Demonstrated experience working in Anti-Money Laundering (AML) programs and/or data platform management for large financial institutions
- Experience mentoring and guiding talent development of direct reports and assisting in hiring talent
- Hands-on experience with BSA/AML/OFAC laws and regulations and with financial crimes/regulatory/fraud-specific data
- Knowledge of financial crime data quality activities and controls required for large financial institutions
- Demonstrated experience creating reports and dashboards from large data sets, including non-standard data, is desired but not mandatory
- Team-handling experience for UAT/regression testing on data outputs involving complex data mapping designs
- Hands-on experience as a people manager leading a team of 15+ data analysts responsible for conducting data quality analysis to support financial crime data modelling
- Prior experience enhancing AML monitoring models and systems, including Oracle/Actimize, using tools like Advanced SQL, SAS and Python
- Experience managing a team of technical analysts working on AML technology leveraging SAS/SQL/Python/Teradata, technical data validation tools, and relevant AML technologies including Norkom, Actimize, Oracle FCCM, etc., to support technical project deliveries
- Experience supporting AML technology initiatives during new AML product implementations and during technology migrations for transaction monitoring and fraud detection
- Experience handling large technology transformation programs with phased delivery of technical deliverables/features of a transaction monitoring and fraud detection system and the associated data validation/transformation logic, as well as MIS reporting using Power BI/Tableau
- Experience managing a team delivering AML/BSA technology projects, including AML model development, transaction monitoring model validations and enhancements, critical data element identification, data quality/data validation, threshold testing, MIS reporting using Tableau/Power BI, and AI/ML-based AML technology development and testing

Posting End Date: 28 May 2025. The job posting may come down early due to the volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.
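The posting mentions transaction monitoring and threshold testing. As a purely illustrative sketch of what a threshold-based monitoring rule looks like (the rule name, the limits, and the sample transactions below are hypothetical and are not Wells Fargo's actual models):

```python
from collections import defaultdict

# Hypothetical transactions; a production system would stream these from
# a monitoring platform such as Actimize or Oracle FCCM.
transactions = [
    {"account": "A1", "amount": 4000}, {"account": "A1", "amount": 4500},
    {"account": "A1", "amount": 3000}, {"account": "A2", "amount": 1200},
]

def flag_structuring(txns, single_limit=10_000, aggregate_limit=10_000):
    """Flag accounts whose individual transactions each stay under the
    threshold but whose aggregate exceeds it -- a classic structuring pattern."""
    totals = defaultdict(float)
    for t in txns:
        if t["amount"] < single_limit:          # only sub-threshold activity
            totals[t["account"]] += t["amount"]
    return sorted(acct for acct, total in totals.items() if total > aggregate_limit)

flagged = flag_structuring(transactions)
```

Threshold testing in this context means tuning values like `single_limit` and rerunning such rules over historical data to measure alert volumes and false-positive rates.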
Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-448974

Posted 3 weeks ago

5.0 - 14.0 years

0 Lacs

Kochi, Kerala, India

On-site


Skill: Ab Initio
Experience: 5 to 14 years
Location: Kochi (walk-in on 14th June)

Responsibilities
- Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis
- Database: proficient in SQL; expert in DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred)
- UNIX: shell scripting (must); Unix utilities like sed, awk, perl, python
- Scheduling knowledge (Control-M, Autosys, Maestro, TWS, ESP)
- Project profile: at least 2-3 source systems, multiple targets, simple business transformations with daily and monthly loads
- Expected to produce LLDs, work with testers, work with the PMO, develop graphs and schedules, and provide 3rd-level support
- Hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc.
- Experience in finance, ideally with capital markets products
- Experience in development and support of complex frameworks to handle multiple data ingestion patterns (e.g., messaging, files, hierarchical/polymorphic XML structures), conformance of data to a canonical model, and curation and distribution of data
- Data modeling experience creating CDMs, LDMs and PDMs using tools like ERWIN, PowerDesigner or MagicDraw
- Detailed knowledge of capital markets, including derivatives products (IRS, CDS, options), structured products and fixed income products
- Knowledge of Jenkins and CI/CD concepts
- Knowledge of scheduling tools like Autosys and Control Center
- Demonstrated understanding of how Ab Initio applications and systems interact with the underlying hardware ecosystem
- Experience working in an agile project development lifecycle
- Strong in-depth knowledge of databases and database concepts; DB2 knowledge is a plus

Posted 3 weeks ago

3.0 - 4.0 years

5 - 6 Lacs

Hyderabad

Work from Office


Overview This role serves as an Associate Analyst for the GTM Data analytics COE project development team. This role is one of the go-to resource for building/ maintaining key reports, data pipelines and advanced analytics necessary to bring insights to light for senior leaders and Sector and field end users. Responsibilities The COEs core competencies are a mastery of data visualization, data engineering, data transformation, predictive and prescriptive analytics Enhance data discovery, processes, testing, and data acquisition from multiple platforms. Apply detailed knowledge of PepsiCos applications for root-cause problem-solving. Ensure compliance with PepsiCo IT governance rules and design best practices. Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes. Translate operational requirements into actionable data presentations. Support data recovery and integrity issue resolution between business and PepsiCo IT. Provide performance reporting for the GTM function, including ad-hoc requests using internal, shipment data systems Develop on-demand reports and scorecards for improved agility and visualization. Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities. Present insights and recommendations to the GTM Leadership team regularly. Manage expectations through effective communication with headquarters partners. Ensure timely and accurate data delivery per service level agreements (SLA). Collaborate across functions to gather insights for action-oriented analysis. Identify and act on opportunities to improve work delivery. Implement process improvements, reporting standardization, and optimal technology use. Foster an inclusive and collaborative environment. 
Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications
Undergraduate degree in Business or a related technology field
3-4 years of working experience in Power BI
1-2 years of working experience in SQL and Python

Preferred qualifications
Information technology or analytics experience is a plus
Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, MS Fabric
Strong analytical, critical-thinking, and problem-solving skills, with great attention to detail
Strong time management skills; ability to multitask, set priorities, and plan

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Overview
As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs and issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibility of this role is to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
Complete conceptual, logical, and physical data models for any supported platform, including SQL data warehouses, EMR, Spark, Databricks, Snowflake, Azure Synapse, and other cloud data warehousing technologies.
Govern data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications and reporting.
Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
Assist with data planning, sourcing, collection, profiling, and transformation.
Create source-to-target mappings for ETL and BI developers.
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data str/cleansing.
Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
4+ years of experience developing enterprise data models.
Experience building solutions in the retail or supply chain space.
Expertise in data modeling tools (ER/Studio, ERwin, IDM/ARDM models).
Experience with integration of multi-cloud services (Azure) with on-premises technologies.
Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
Experience with version control systems like GitHub and deployment & CI tools.
Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
Experience with metadata management, data lineage, and data glossaries is a plus.
Working knowledge of agile development, including DevOps and DataOps concepts.
Familiarity with business intelligence tools (such as Power BI).

Posted 3 weeks ago

Apply

Exploring Teradata Jobs in India

Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.

Top Hiring Locations in India

  1. Bengaluru
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Teradata professionals.

Average Salary Range

The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.

Related Skills

In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.
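The ETL concepts mentioned above can be illustrated with a toy pipeline. This is a minimal, tool-agnostic sketch in Python (all function and field names are illustrative, not taken from any particular ETL product): extract raw records, transform them by casting types and dropping invalid rows, and load the result into a target store.

```python
# Toy extract-transform-load (ETL) pipeline; all names are illustrative.

def extract():
    # Extract: raw records as they might arrive from a source system.
    return [
        {"id": "1", "amount": "100.50", "region": "South"},
        {"id": "2", "amount": "bad-data", "region": "North"},
        {"id": "3", "amount": "75.25", "region": "south"},
    ]

def transform(rows):
    # Transform: cast types, normalize case, drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({
                "id": int(row["id"]),
                "amount": float(row["amount"]),
                "region": row["region"].title(),
            })
        except ValueError:
            continue  # skip malformed rows instead of failing the whole batch
    return clean

def load(rows, target):
    # Load: append validated rows to the target store (here, a plain list).
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 rows survive validation; the "bad-data" row is dropped
```

Real warehouse loads add staging tables, error logging, and bulk utilities, but the extract/transform/load separation is the same.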

Interview Questions

  • What is Teradata and how is it different from other database management systems? (basic)
  • Can you explain the difference between a join and a merge in Teradata? (medium)
  • How would you optimize a Teradata query for performance? (medium)
  • What are fallback tables in Teradata and why are they important? (advanced)
  • How do you handle duplicate records in Teradata? (basic)
  • What is the purpose of a collect statistics statement in Teradata? (medium)
  • Explain the concept of indexing in Teradata. (medium)
  • How does Teradata handle concurrency control? (advanced)
  • Can you describe the process of data distribution in Teradata? (medium)
  • What are the different types of locks in Teradata and how are they used? (advanced)
  • How would you troubleshoot performance issues in a Teradata system? (medium)
  • What is a Teradata View and how is it different from a Table? (basic)
  • How do you handle NULL values in Teradata? (basic)
  • Can you explain the difference between FastLoad and MultiLoad in Teradata? (medium)
  • What is the Teradata Parallel Transporter? (advanced)
  • How do you perform data migration in Teradata? (medium)
  • Explain the concept of fallback protection in Teradata. (advanced)
  • What are the different types of Teradata macros and how are they used? (advanced)
  • How do you monitor and manage Teradata performance? (medium)
  • What is the purpose of the Teradata QueryGrid? (advanced)
  • How do you optimize the storage of data in Teradata? (medium)
  • Can you explain the concept of Teradata indexing strategies? (advanced)
  • How do you handle data security in Teradata? (medium)
  • What are the best practices for Teradata database design? (medium)
  • How do you ensure data integrity in a Teradata system? (medium)
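Several of the questions above (handling duplicates, keeping the latest record per key) come down to the windowed row-numbering pattern, which in Teradata SQL is typically written as `QUALIFY ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) = 1`. The sketch below mimics that logic in plain Python so the mechanics are easy to see; the function and field names are illustrative only.

```python
# Mimics Teradata's QUALIFY ROW_NUMBER() OVER (PARTITION BY k ORDER BY ts DESC) = 1:
# for each key, keep only the row with the latest timestamp.

def dedupe_latest(rows, key, ts):
    latest = {}
    for row in rows:
        k = row[key]
        # Replace the stored row when a later timestamp arrives for the same key.
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return list(latest.values())

rows = [
    {"cust_id": 1, "updated": "2024-01-01", "city": "Pune"},
    {"cust_id": 1, "updated": "2024-03-15", "city": "Mumbai"},
    {"cust_id": 2, "updated": "2024-02-10", "city": "Chennai"},
]
result = dedupe_latest(rows, key="cust_id", ts="updated")
```

In Teradata itself this is a single statement, e.g. `SELECT * FROM customers QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY updated DESC) = 1;` (table and column names hypothetical).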

Closing Remark

As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies