
823 Teradata Jobs - Page 8

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

Data Engineer - Google Cloud
- 5+ years of direct experience working with Enterprise Data Warehouse technologies.
- 5+ years in a customer-facing role working with enterprise clients.
- Experience implementing and/or maintaining technical solutions in virtualized environments.
- Experience in the design, architecture, and implementation of data warehouses, data pipelines, and flows.
- Experience developing software in one or more languages such as Java, Python, and SQL.
- Experience designing and deploying large-scale distributed data processing systems using technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik, or MicroStrategy.
- Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations.
- Demonstrated excellent communication, presentation, and problem-solving skills.

Mandatory Certifications Required: Google Cloud Professional Cloud Architect, or Google Cloud Professional Data Engineer plus AWS Big Data Specialty Certification
Mandatory skill sets: GCP Data Engineering, SQL, Python
Preferred skill sets: GCP Data Engineering, SQL, Python
Years of experience required: 4-8 years
Qualifications: B.E / B.TECH / MBA / MCA
Required Skills: Python (Programming Language)
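The mandatory skills in this posting are GCP data engineering, SQL, and Python. As a hedged illustration only, the minimal sketch below batch-loads a CSV from Cloud Storage into BigQuery with the google-cloud-bigquery client; the bucket, dataset, and table names are hypothetical and not part of the posting.

```python
# Minimal sketch: batch-load a CSV from Cloud Storage into BigQuery.
# Illustrative only -- project, dataset, table, and URI are hypothetical.
from google.cloud import bigquery

def load_daily_extract(uri: str = "gs://example-bucket/extracts/sales.csv",
                       table_id: str = "example-project.analytics.sales_raw") -> None:
    client = bigquery.Client()  # uses Application Default Credentials

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # header row
        autodetect=True,       # schema inference for the sketch; real pipelines pin a schema
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes
    print(f"Loaded table now has {client.get_table(table_id).num_rows} rows")

if __name__ == "__main__":
    load_daily_extract()
```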

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Senior – Snowflake

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for candidates with a strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Lead and architect the migration of the data analytics environment from Teradata to Snowflake with a focus on performance and reliability
- Develop and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse
- ETL design, development and migration of existing on-prem ETL routines to cloud services
- Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams
- Design and optimize model code for faster execution

Skills And Attributes For Success
- Hands-on developer in the field of data warehousing and ETL
- Hands-on development experience in Snowflake
- Experience in Snowflake modelling - roles, schemas, databases
- Experience integrating with third-party tools, ETL and DBT tools
- Experience in Snowflake advanced concepts such as setting up resource monitors and performance tuning is preferable (see the sketch following this posting)
- Applying object-oriented and functional programming styles to real-world big data engineering problems using Java/Scala/Python
- Developing data pipelines to perform batch and real-time/stream analytics on structured and unstructured data
- Data processing patterns, distributed computing, and building applications for real-time and batch analytics
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing

To qualify for the role, you must
- Be a computer science graduate or equivalent with 3-7 years of industry experience
- Have working experience in an Agile-based delivery methodology (preferable)
- Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Be an excellent communicator (written and verbal, formal and informal)
- Be a technical expert on all aspects of Snowflake
- Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
- Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
- Maintain a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
- Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
- Provide guidance on how to resolve customer-specific technical challenges

Ideally, you’ll also have
- Client management skills

What We Look For
- Minimum 5 years of experience as an architect on analytics solutions and around 2 years of experience with Snowflake
- People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
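The Snowflake skills called out in this posting (resource monitors, performance and cost control) can be illustrated with a minimal, hedged sketch using the snowflake-connector-python package. The account, warehouse, and monitor names are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch: create a Snowflake resource monitor and attach it to a warehouse.
# Account, user, warehouse, and monitor names below are hypothetical placeholders.
import os
import snowflake.connector

def configure_cost_guardrail() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # resource monitors require an elevated role
    )
    try:
        cur = conn.cursor()
        # Cap monthly credits and suspend the warehouse when the quota is reached.
        cur.execute("""
            CREATE OR REPLACE RESOURCE MONITOR migration_monitor
              WITH CREDIT_QUOTA = 100
              TRIGGERS ON 80 PERCENT DO NOTIFY
                       ON 100 PERCENT DO SUSPEND
        """)
        cur.execute("ALTER WAREHOUSE migration_wh SET RESOURCE_MONITOR = migration_monitor")
    finally:
        conn.close()

if __name__ == "__main__":
    configure_cost_guardrail()
```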

Posted 1 week ago

Apply

4.0 years

2 - 3 Lacs

Gurgaon

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
Our Marketing and Loyalty team is the strategic force behind United’s industry-leading brand and experience, supporting revenue growth by turning customers into lifelong United flyers. Our marketing communications, market research and brand teams drive travelers’ familiarity and engagement with the United brand. Our product, design and service teams bring the signature United brand to life in our clubs and onboard our aircraft, ensuring a seamless, premier experience. And when customers choose United again and again, that’s because the loyalty team has been hard at work crafting an award-winning program. Our loyalty team manages United MileagePlus®, building travel and lifestyle partnerships that customers can engage with every day, and works with our Star Alliance partners to ensure United can take you anywhere you want to go.

Job overview and responsibilities
United Airlines reaches out to customers and potential travelers via digital campaigns with new information, travel inspiration, personalized offers, promos, etc. The Digital Marketing & Personalized Offers team at IKC supports all such digital acquisition initiatives with insights to help strategize campaigns and analytics to help measure performance. We work closely with stakeholders in the US to bring these campaigns to life and continuously improve performance with learnings and actionable insights.
- Ensure alignment and prioritization with business objectives and initiatives – help teams make faster, smarter decisions
- Conduct exploratory analysis, identify opportunities, and proactively suggest initiatives to meet marketing objectives
- Assist in campaign planning, targeting and audience identification; measure campaign results and performance using data analysis
- Create content for and deliver presentations to United leadership and external stakeholders
- Own workstreams to deliver results, while leading other team members
- Ensure seamless stakeholder management and keep lines of communication open with all stakeholders
- Create, modify and automate reports and dashboards - take ownership of reporting structure and metrics, and clearly and effectively communicate relevant information to decision makers using data visualization tools

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications
What’s needed to succeed (Minimum Qualifications):
- Bachelor's degree or 4 years of relevant work experience
- 4+ years of experience in analytics and working with analytical tools
- Proven comfort and intellectual curiosity for working with very large data sets
- Experience manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
- Proficiency in using database querying tools and writing complex queries and procedures using Teradata SQL and/or Microsoft SQL (see the sketch following this posting)
- Familiarity with one or more reporting tools – Spotfire/Tableau
- Advanced-level comfort with Microsoft Office, especially Excel and PowerPoint
- Ability to communicate analysis in a clear and precise manner
- High sense of ownership of work, and ability to lead a team
- Ability to work under time constraints
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master's degree
- Bachelor’s degree in a quantitative field like Math, Statistics, Analytics and/or Business
- SQL/Python/R
- Visualization tools – Tableau/Spotfire
- Understanding of digital acquisition channels
- Strong knowledge of either Python or R
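As a hedged illustration of the Teradata SQL querying skill listed above, the sketch below runs a window-function query through the open-source teradatasql driver; the host, credentials, and the campaign_responses table are hypothetical placeholders.

```python
# Minimal sketch: run a Teradata-style windowed query (QUALIFY) via the teradatasql driver.
# Host, credentials, and the campaign_responses table are hypothetical placeholders.
import os
import teradatasql

# Latest response per customer for one campaign -- QUALIFY is Teradata-specific syntax.
QUERY = """
SELECT customer_id, campaign_id, response_ts, response_channel
FROM   mkt.campaign_responses
WHERE  campaign_id = ?
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY response_ts DESC) = 1
"""

def latest_responses(campaign_id: str) -> list[tuple]:
    with teradatasql.connect(
        host=os.environ["TD_HOST"],
        user=os.environ["TD_USER"],
        password=os.environ["TD_PASSWORD"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(QUERY, [campaign_id])
            return cur.fetchall()

if __name__ == "__main__":
    for row in latest_responses("SUMMER_PROMO")[:10]:
        print(row)
```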

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Mandatory Skills: Teradata. Experience: 5-8 years.

Please find the Teradata JD below:
- Good understanding of DWH
- Good experience in Teradata (BTEQ), Oracle and SQL Server (see the sketch following this posting)
- ETL knowledge (e.g., Informatica)
- BI knowledge (e.g., Power BI or Looker)
- Basic shell scripting
- Scheduling tools such as Maestro
- Good understanding of CI/CD in Azure DevOps or GitHub, with knowledge of Jenkins jobs

Responsibilities:
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve the issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Performance Parameters:
1. Process – No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management – Productivity, efficiency, absenteeism
3. Capability development – Triages completed, technical test performance

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention.
Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
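A hedged sketch of the BTEQ-plus-shell-scripting combination mentioned in the Wipro requirements: Python builds a small BTEQ script and feeds it to the bteq command-line client over stdin. The host, credentials, and table name are hypothetical, and the sketch assumes the bteq client is already installed on the machine.

```python
# Minimal sketch: drive Teradata BTEQ from Python by piping a script to the bteq CLI.
# Host, credentials, and table are hypothetical; assumes the bteq client is installed.
import os
import subprocess

BTEQ_SCRIPT = """
.LOGON {host}/{user},{password};
.SET WIDTH 200;

SELECT COUNT(*) AS row_cnt
FROM   edw.daily_sales;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

def run_row_count_check() -> int:
    script = BTEQ_SCRIPT.format(
        host=os.environ["TD_HOST"],
        user=os.environ["TD_USER"],
        password=os.environ["TD_PASSWORD"],
    )
    # bteq reads commands from stdin; a non-zero return code signals failure to the scheduler.
    result = subprocess.run(["bteq"], input=script, text=True, capture_output=True)
    print(result.stdout)
    return result.returncode

if __name__ == "__main__":
    raise SystemExit(run_row_count_check())
```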

Posted 1 week ago

Apply

0 years

5 - 6 Lacs

Chennai

On-site

Date live: 06/02/2025
Business Area: Physical Operations
Area of Expertise: Technology
Contract: Permanent
Reference Code: JR-0000046220

Be a part of a place where challenges are measured in billions, qubits and nanoseconds. Build your career in an environment where we’re advancing machine learning, leveraging blockchains, and harnessing FinTech. Working in Barclays technology, you’ll reimagine possibilities: learning and innovating to solve the challenges ahead, delivering for millions of customers. We are shaping the future of financial technology. Why not join us and make it happen here?

Join us as a Senior Data Operations Analyst at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with SQL and Unix, as well as job-specific skillsets.

To be successful as a Data Operations Analyst, you should have experience with:
- Developing and managing your campaigns
- Proactively monitoring and tracking the outcome of scheduled processes daily (Monday – Friday, 8am – 4pm), with issues expected to be raised within an hour of detection and on the same day (see the sketch following this posting)
- Operating the scheduler change process, including tracking new requests, confirming all governance steps have been performed, and then deploying changes into the scheduler in a controlled way
- In the event of process failures being detected, promptly communicating this to stakeholders, then performing any agreed action to remediate the issue
- Continuously maintaining documentation for processes that support the scheduler, for instance the program and specification documentation
- Where required of our area, coordinating the capture of information needed to provide a collective response to impact assessment requests from infrastructure teams
- In the event of issues being caused by infrastructure, leading the raising of incidents to platform support teams, and regularly communicating progress and outcomes to stakeholders
- Performing all steps as set out in role process documentation for scheduler management

Stakeholder Management
The main stakeholders are the Customer Targeting team within Customer Communication Delivery, with the expectation that you support the team by accurately and promptly dealing with scheduled process queries and issues. Other stakeholders are the multiple support teams we rely upon for the platforms and tools that the team uses to execute customer communications. You’ll need to contact them whenever their support is needed with an issue. On occasion you may need to communicate with process owners in Barclays UK.

Decision-making and Problem Solving
If a problem is found with a scheduled process, then evaluation of the issue and root cause identification should be performed quickly before escalating to the process owner or Team Lead(s), ideally with a recommendation of how to resolve the issue. It’s important that this evaluation considers a range of possible causes and whether multiple issues in a short time could be linked to a common root cause.

Person Specification
- Excellent attention to detail.
- Well organised and diligent when updating documentation.
- Strong written and verbal communication skills, with the ability to communicate technical information in a clear and appropriate way to colleagues who may not have the same understanding, as well as share relevant information or updates with the team.
- Ability to problem solve and work under pressure.
- Able to build effective and respectful working relationships with colleagues and other teams across geographies and as part of virtual teams.
- Speaks up to ensure that team processes and controls are followed.

Basic/Essential Qualifications:
- Strong planning, organisational, and stakeholder management skills
- Good analytical and problem solving skills
- High attention to detail and quality of work
- Knowledge and practical experience with SQL syntax and logic

Desirable skillsets/good to have:
- Practical experience working with schedulers, such as Unix cron or Tivoli’s Dynamic Workload Console (TWS)
- Practical experience with Teradata or Hadoop databases, ideally in a commercial environment
- Practical experience with SAS or R programming languages, ideally in a commercial environment
- Microsoft Excel to an advanced level

This role will be based out of Chennai.

Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities
- Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Development of processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaboration with data scientists to build and deploy machine learning models.

Analyst Expectations
- To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Requires in-depth technical knowledge and experience in their assigned area of expertise.
- Thorough understanding of the underlying principles and concepts within the area of expertise.
- They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR, for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate.
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Takes responsibility for end results of a team’s operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation’s products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation’s sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Who succeeds in Tech at Barclays?
For a career with us, you need to be prepared to take big steps forward, curious to face the challenges ahead, and driven to focus on the outcomes. We need people with the Barclays mindset to make it happen here.

What you'll get in return
- Competitive holiday allowance
- Life assurance
- Private medical care
- Pension contribution

Our technology
Supporting our 48 million customers and clients worldwide takes a lot of forward thinking. It means harnessing technology to support the economy. It means making a difference to people’s lives. And it requires the maintenance and development of a global, technological infrastructure. At Barclays, technology helps us keep transactions moving, manages data, and protects our customers. Join a world where your work creates unique moments of impact. Make it happen here.

This is Barclays Chennai
Barclays Chennai is one of our key strategic locations. It’s home to over 5,000 talented and passionate Barclays colleagues, across Finance, HR, operational and technology functions, ensuring the bank’s technology and infrastructure runs smoothly.

Working flexibly
We’re committed to providing a supportive and inclusive culture and environment for you to work in. This environment recognises and supports your personal needs, alongside the professional needs of our business. If you'd like to explore flexible working arrangements, please discuss this with the hiring manager. Your request will be reviewed in line with the requirements of the role/business needs of the team.

Hybrid working
We have a structured approach to hybrid working, where colleagues work at an onsite location on fixed, ‘anchor’ days, as set by the business area. Please discuss the working pattern requirements for the role you are applying for with the hiring manager. Please note that working arrangements may be subject to change on reasonable notice to ensure we meet the needs of our business.

Barclays is built on an international scale. Our geographic reach, our wide variety of functions, businesses, roles and locations reflect the rich diversity of our worldwide customer base.
All of which means we offer incredible variety, depth and breadth of experience. And the chance to learn from a globally diverse mix of colleagues, including some of the very best minds in banking, finance, technology and business. Throughout, we’ll encourage you to embrace mobility, exploring every part of our operations as you build your career.
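The daily monitoring duty described in this posting (check scheduled processes and raise issues within an hour of detection) can be illustrated with a minimal, hedged sketch. The status log path and its columns are hypothetical placeholders and are not part of the Barclays posting.

```python
# Minimal sketch: daily check that scheduled campaign processes completed successfully,
# so issues can be raised to stakeholders quickly. The status log path and its columns
# are hypothetical placeholders.
import csv
from datetime import datetime

STATUS_LOG = "/var/log/scheduler/daily_status.csv"  # columns: process,run_date,finished_at,status

def check_scheduled_processes() -> list[str]:
    today = datetime.now().date()
    alerts: list[str] = []
    with open(STATUS_LOG, newline="") as fh:
        for row in csv.DictReader(fh):
            if row["status"].upper() != "SUCCESS":
                alerts.append(f"{row['process']} reported {row['status']} on {row['run_date']}")
            elif datetime.fromisoformat(row["finished_at"]).date() < today:
                alerts.append(f"{row['process']} has not completed a run today (last: {row['finished_at']})")
    return alerts

if __name__ == "__main__":
    for alert in check_scheduled_processes():
        # In practice this would be routed to the team's alerting channel within the hour.
        print("RAISE TO STAKEHOLDERS:", alert)
```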

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Chennai

On-site

About us: Analytics & Information Management
AIM is a global community that is driving data-driven transformation across Citi in multiple functions with the objective to create actionable intelligence for our business leaders. We are a fast-growing organization working with Citi businesses and functions across the world.

Remediation & Remuneration COE
The Remediation team is responsible for cross-functional coordination of customer-facing remediation efforts. It provides oversight, prioritization, and scheduling of remediation activities with remediation partner teams including Technology, FSO, Analytics groups, Shared Services (mail vendor) and Controllers. The R&R AIM team works as the data analytics partner for the Issue Remediation Business Team.

Job responsibilities:
The R&R team manages the analysis of customer remediation issues across the globe, currently in the retail consumer bank. The critical areas of work are divided into:
- Remediation analysis: Execution of the comprehensive data remediation approach on customer issues due to gaps observed in policies and governance, self-identified, or through IA.
- Impact assessment: Identification of the size of the customers and the dollar amount impacted due to these issues (see the sketch following this posting).
- Issue Management & Root cause analysis: Identifying the issues and the reasons for the issues by leveraging analytical methods.
- Audit Support: Tracking implementation plans and providing data evidence and artifacts for audit completion.
- Report Generation: Generating business metrics and actionable business insights using the latest reporting tools.

Expertise Required:
Tools and Platforms
- Proficient in SAS, SQL, RDBMS, Teradata, Unix
- Proficient in MS Excel, PowerPoint, and VBA
- Jira, Bitbucket, SharePoint
- Tableau, Cognos
- Exposure to Big Data, Python

Domain Skills
- Good understanding of the banking domain and consumer products (Retail Banking, Deposits, Loans, Wealth Management, Mortgage, Insurance, etc.)
- (Preferred) Knowledge of finance regulations, understanding of retail business/banking domain

Analytical & Interpersonal Skills
- Ability to identify, clearly articulate and solve complex business problems and present them to management in a structured and simpler form
- Data analysis, data profiling, and data management skills
- MIS reporting and generating actionable business insights
- Excellent communication and interpersonal skills
- Good process management skills
- Coming up with automated techniques to reduce redundancy, remove false positives and enhance optimization
- Identification of control gaps and providing recommendations as per data strategy
- (Preferred) Risk & control metrics and audit framework exposure

Other Info:
- Education Level: Master’s / Advanced Degree in Information Technology / Computer Applications / Engineering / MBA from a premier institute
- Overall experience of 2-5 years
- Job Category: Decision Management
- Schedule: Full-time
- Shift: Regular local working hours (aligned with NAM working hours)
- Employee Status: Regular
- Salary Grade: C09

Job Family Group: Decision Management
Job Family: Data/Information Management
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi.
View Citi’s EEO Policy Statement and the Know Your Rights poster.
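As a hedged illustration of the impact-assessment workstream described above (counting impacted customers and totaling the dollar amount), the sketch below uses pandas, which aligns with the Python exposure listed in the posting. The extract file and column names are invented for the example.

```python
# Minimal sketch: impact assessment for a remediation issue -- count impacted customers
# and total the dollar amount. The extract file and column names are hypothetical.
import pandas as pd

def summarize_impact(extract_path: str = "remediation_extract.csv") -> pd.DataFrame:
    df = pd.read_csv(extract_path, parse_dates=["txn_date"])

    # Keep only rows flagged as in scope for the issue being remediated.
    in_scope = df[df["issue_flag"] == "Y"]

    summary = (
        in_scope.groupby("issue_id")
        .agg(
            impacted_customers=("customer_id", "nunique"),
            impacted_dollars=("overcharge_amt", "sum"),
            first_txn=("txn_date", "min"),
            last_txn=("txn_date", "max"),
        )
        .reset_index()
    )
    return summary

if __name__ == "__main__":
    print(summarize_impact().to_string(index=False))
```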

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Overview
As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics.

The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
- Govern data design/modeling - documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security features implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers (see the sketch following this posting).
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data str/cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications
- 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with Data Lake infrastructure, Data Warehousing, and Data Analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
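As a hedged illustration of the physical-modeling and source-to-target mapping work described in this posting, the sketch below emits DDL for a simple fact/dimension pair and a small mapping dictionary for ETL developers. All table and column names are invented for the example and do not come from the posting.

```python
# Minimal sketch: a physical model for one fact/dimension pair plus a simple
# source-to-target mapping. All table and column names are hypothetical.
DDL = """
CREATE TABLE dim_customer (
    customer_key   BIGINT       NOT NULL PRIMARY KEY,
    customer_id    VARCHAR(32)  NOT NULL,      -- natural key from the source system
    customer_name  VARCHAR(200),
    market_code    VARCHAR(8)
);

CREATE TABLE fact_net_revenue (
    date_key       INT          NOT NULL,
    customer_key   BIGINT       NOT NULL REFERENCES dim_customer (customer_key),
    product_key    BIGINT       NOT NULL,
    net_revenue    DECIMAL(18,2),
    units_sold     INT
);
"""

# Source-to-target mapping handed to ETL/BI developers: target column -> source expression.
SOURCE_TO_TARGET = {
    "fact_net_revenue.date_key":     "CAST(TO_CHAR(src.invoice_date, 'YYYYMMDD') AS INT)",
    "fact_net_revenue.customer_key": "dim_customer.customer_key (lookup on src.customer_id)",
    "fact_net_revenue.net_revenue":  "src.gross_amount - src.discount_amount",
    "fact_net_revenue.units_sold":   "src.quantity",
}

if __name__ == "__main__":
    print(DDL)
    for target, source in SOURCE_TO_TARGET.items():
        print(f"{target:40s} <- {source}")
```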

Posted 1 week ago

Apply

17.0 - 19.0 years

0 Lacs

Andhra Pradesh

On-site

Software Engineering Associate Director - HIH - Evernorth

About Evernorth
Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Position Summary:
The Software Development Associate Director provides hands-on leadership, management, and thought leadership for a delivery organization enabling Cigna's Technology teams. This individual will lead a team based in our Hyderabad Innovation Hub to deliver innovative solutions supporting multiple business and technology domains within Cigna. This includes the Sales & Underwriting, Producer, Service Operations, and Pharmacy business lines, as well as testing and DevOps enablement. The focus of the team is to build innovative go-to-market solutions enabling the business while modernizing our existing asset base to support business growth. The technology strategy is aligned to our business strategy, and the candidate will not only be able to influence technology direction but also establish our team through recruiting and mentoring employees and vendor resources. This is a hands-on position with visibility to the highest levels of the Cigna Technology team. This leader will focus on enabling innovation using the latest technologies and development techniques, and on rapidly building out a scalable delivery organization that aligns with all areas within the Technology team. The ideal candidate will be able to attract and develop talent in a highly dynamic environment.

Job Description & Responsibilities:
- Provide leadership, vision, and design direction for the quality and development of the US Medical and Health Services Technology teams based at the Hyderabad Innovation Hub (HIH).
- Work in close coordination with leaders and teams based in the United States, as well as contractors employed by the US Medical and Health Services Technology team who are based both within and outside of the United States, to deliver products and capabilities in support of Cigna's business lines.
- Provide leadership to HIH leaders and teams, ensuring the team meets its objectives for design, configuration, implementation of application design/development, and quality engineering within the supported technologies and products.
- Hands-on people manager with experience leading agile teams of highly talented technology professionals developing large solutions and internal-facing applications; expected to work closely with developers, quality engineers, technical project managers, principal engineers, and business stakeholders to ensure that application solutions meet business/customer requirements.
- A servant-leader mentality and a history of creating an inclusive environment, fostering diverse views and approaches from the team, and coaching and mentoring them to thrive in a dynamic workplace.
- A history of embracing and incubating emerging technology and open-source products.
- A passion for building highly resilient, scalable, and available platforms, rich reusable foundational capabilities and a seamless developer experience, while focusing on strategic vision and technology roadmap delivery in an MVP / iterative, fast-paced approach.
- Accountable for driving towards timely decisions while influencing across engineering and development delivery teams to meet project timelines while balancing the destination state.
- Ensure engineering solutions align with the technology strategy and that they support the application’s requirements.
- Plan and implement procedures that will maximize engineering and operating efficiency for application integration technologies.
- Identify and drive process improvement opportunities.
- Proactive monitoring and management design of supported assets, assuring performance, availability, security, and capacity.
- Maximize the efficiency (operational, performance, and cost) of the application assets.

Experience Required:
- 17 to 19 years of IT and business/industry or equivalent experience preferred, with at least 5 years of experience in a leadership role with responsibility for the delivery of large-scale projects and programs.
- Leadership, cross-cultural communication, and familiarity with a wide range of technologies and stakeholders.
- Strong emotional intelligence with the ability to foster collaboration across geographically dispersed teams.

Experience Desired:
- Recognized leader with a proven track record of delivering software engineering initiatives and cross-IT/business initiatives.
- Proven experience leading/managing technical teams with a passion for developing talent within the team.
- Experience with vendor management in an onshore/offshore model.
- Experience in Healthcare, Pharmacy and/or Underwriting systems.
- Experience with AWS.

Education and Training Required:
- B.S. degree in Computer Science, Information Systems, or other related degrees; industry certifications such as AWS Solution Architect, PMP, Scrum Master, or Six Sigma Green Belt are also ideal.

Primary Skills:
- Familiarity with most of the following application development technologies: Python, RESTful services, React, Angular, Postgres, and MySQL (relational database management systems).
- Familiarity with most of the following data engineering technologies: Databricks, Spark, PySpark, SQL, Teradata, and multi-cloud environments.
- Familiarity with most of the following cloud and emerging technologies: AWS, LLMs (OpenAI, Anthropic), vector databases (Pinecone, Milvus), graph databases (Neo4j, JanusGraph, Neptune), prompt engineering, and fine-tuning AI models.
- Familiarity with the enterprise software development lifecycle, including production reviews and ticket resolution, navigating freeze/stability periods effectively, total cost of ownership reporting, and updating applications to align with evolving security and cloud standards.
- Familiarity with agile methodology, including SCRUM team leadership or Scaled Agile (SAFe).
- Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example.
- Deep people and matrix management skills, with a heavy emphasis on coaching and mentoring of less senior staff, and a strong ability to influence VP-level leaders.
- Proven ability to resolve issues and mitigate risks that could undermine the delivery of critical initiatives.
- Strong written and verbal communication skills with the ability to interact with all levels of the organization.
- Strong influencing/negotiation skills.
- Strong interpersonal/relationship management skills.
- Strong time and project management skills.
About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Andhra Pradesh

Remote

HIH - Software Engineering Associate Advisor

Position Overview
The successful candidate will be a member of our US Medical Integration Solutions ETL team. They will play a major role in the design and development of the ETL application in support of various portfolio projects.

Responsibilities
- Analyze business requirements and translate them into ETL architecture and data rules
- Serve as advisor and subject matter expert on project teams
- Manage both employees and consultants on multiple ETL projects
- Oversee and review all design and coding from developers to ensure they follow company standards and best practices, as well as architectural direction
- Assist in data analysis and metadata management
- Test planning and execution
- Effectively operate within a team of technical and business professionals
- Assess new talent and mentor direct reports on best practices
- Review all designs and code from developers

Qualifications
Desired Skills & Experience:
- 8-11 years of experience in Java and Python, and PySpark to support new development as well as existing applications
- 7+ years of experience with cloud technologies, specifically AWS
- Experience with AWS services such as Lambda, Glue, S3, MWAA, API Gateway, Route 53, DynamoDB, RDS MySQL, SQS, CloudWatch, Secrets Manager, KMS, IAM, EC2 and Auto Scaling Groups, VPC and Security Groups
- Experience with Boto3, Pandas and Terraform for building Infrastructure as Code (see the sketch following this posting)
- Experience with the IBM DataStage ETL tool
- Experience with CI/CD methodologies and processes, and the development of these processes
- DevOps experience
- Knowledge of writing SQL
- Data mapping: source to target, and target to multiple formats
- Experience in the development of data extraction and load processes in a parallel framework
- Understanding of normalized and de-normalized data repositories
- Ability to define ETL standards and processes
- SQL standards / processes / tools: mapping of data sources; ETL development, monitoring, reporting and metrics
- Focus on data quality
- Experience with DB2 for z/OS, Oracle, SQL Server, Teradata and other database environments
- Unix experience
- Excellent problem solving and organizational skills
- Strong teamwork and interpersonal skills and ability to communicate with all management levels
- Leads others toward technical accomplishments and collaborative project team efforts
- Very strong communication skills, both verbal and written, including technical writing
- Strong analytical and conceptual skills

Location & Hours of Work
(Specify whether the position is remote, hybrid, in-office and where the role is located as well as the required hours of work)

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
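A hedged sketch of the Boto3-plus-Pandas combination listed in this posting: pull a CSV extract from S3 and run a basic quality check before any downstream load. The bucket, key, and column names are hypothetical placeholders.

```python
# Minimal sketch: read a CSV extract from S3 with Boto3 and run a basic data-quality check
# with Pandas before any downstream load. Bucket, key, and columns are hypothetical.
import io

import boto3
import pandas as pd

def fetch_extract(bucket: str = "example-etl-landing",
                  key: str = "claims/2025/06/claims.csv") -> pd.DataFrame:
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=bucket, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))

def quality_check(df: pd.DataFrame) -> None:
    # Reject empty extracts or rows missing the business key before loading further.
    if df.empty:
        raise ValueError("Extract is empty")
    missing_keys = df["claim_id"].isna().sum()
    if missing_keys:
        raise ValueError(f"{missing_keys} rows are missing claim_id")

if __name__ == "__main__":
    frame = fetch_extract()
    quality_check(frame)
    print(f"Extract passed checks: {len(frame)} rows, {frame.shape[1]} columns")
```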

Posted 1 week ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Overview
PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics, and new product development. PepsiCo’s Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders.
- Increase awareness about available data and democratize access to it across the company.

As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build and operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Ideally, the candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements.

Responsibilities
- Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work, as well as empowering them to realize their full potential.
- Design, structure and store data into unified data models and link them together to make the data reusable for downstream products.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL (see the sketch following this posting).
- Enable and accelerate standards-based development, prioritizing reuse of code, and adopt test-driven development, unit testing and test automation with end-to-end observability of data.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance and cost.
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application following well-architected design standards.
- Define and manage SLAs for data products and processes running in production.
- Create documentation for learnings and knowledge transfer to internal associates.

Qualifications
- 12+ years of overall technology experience that includes at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools.
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL or any other popular RDBMS.
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks.
- 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services; Azure Data Engineering certification is a plus.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one business intelligence tool such as Power BI or Tableau.
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like ADO and GitHub, and CI/CD tools for DevOps automation and deployments.
- Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- BA/BS in Computer Science, Math, Physics, or other technical fields.
- Candidate must be flexible to work an alternative work schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon product and project coverage requirements of the job.
- Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with the immediate supervisor.

Skills, Abilities, Knowledge:
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
- Proven track record of leading and mentoring data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs.
- Foster a team culture of accountability, communication, and self-management.
- Proactively drive impact and engagement while bringing others along.
- Consistently attain/exceed individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
- Domain knowledge in the CPG industry with a Supply Chain/GTM background is preferred.
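A hedged sketch of the Teradata-to-Azure-Databricks migration accelerator mentioned in the responsibilities above: read one table over JDBC with PySpark and land it as a Delta table. The host, credentials, and table names are hypothetical, and the Teradata JDBC driver is assumed to be installed on the cluster.

```python
# Minimal sketch: copy one Teradata table into a Delta table on Databricks via JDBC.
# Host, credentials, and table names are hypothetical; assumes the Teradata JDBC
# driver (com.teradata.jdbc.TeraDriver) is available on the cluster.
from pyspark.sql import SparkSession

def migrate_table(spark: SparkSession, source_table: str = "edw.daily_sales",
                  target_table: str = "bronze.daily_sales") -> None:
    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:teradata://td-host.example.com/DATABASE=edw")
        .option("driver", "com.teradata.jdbc.TeraDriver")
        .option("dbtable", source_table)
        .option("user", "etl_user")
        .option("password", "***")          # use a secret scope in practice
        .load()
    )

    # Land the data as a managed Delta table for downstream products.
    df.write.format("delta").mode("overwrite").saveAsTable(target_table)

if __name__ == "__main__":
    spark = SparkSession.builder.appName("teradata-migration-sketch").getOrCreate()
    migrate_table(spark)
```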

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Linkedin logo

Overview
As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
Govern data design/modeling - documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools.
Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
Drive collaborative reviews of design, code, data, and security features implementation performed by data engineers to drive data product development.
Assist with data planning, sourcing, collection, profiling, and transformation.
Create source-to-target mappings for ETL and BI developers.
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data str/cleansing.
Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications
8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture.
3+ years of experience with Data Lake infrastructure, Data Warehousing, and Data Analytics tools.
4+ years of experience developing enterprise data models.
Experience in building solutions in the retail or supply chain space.
Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
Experience with integration of multi-cloud services (Azure) with on-premises technologies.
Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake.
Experience with version control systems like GitHub and deployment & CI tools.
Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
Experience with metadata management, data lineage, and data glossaries is a plus.
Working knowledge of agile development, including DevOps and DataOps concepts.
Familiarity with business intelligence tools (such as Power BI).
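For illustration only, here is a minimal sketch of how a logical model plus a source-to-target mapping might be captured and turned into physical DDL for a simple fact/dimension pair. All table, column, and source-field names are hypothetical and not taken from the posting; a dedicated modeling tool (ER/Studio, Erwin) would normally own this.

```python
# Minimal sketch: turning a logical model + source-to-target mapping into physical DDL.
# All names (customer_dim, sales_fact, source columns) are hypothetical examples.

# Logical entities with attribute -> (physical type, source field) mappings
MODEL = {
    "customer_dim": {
        "customer_key": ("BIGINT NOT NULL", "crm.cust_id"),
        "customer_name": ("VARCHAR(200)", "crm.cust_name"),
        "market": ("VARCHAR(50)", "crm.region_cd"),
    },
    "sales_fact": {
        "customer_key": ("BIGINT NOT NULL", "pos.cust_id"),
        "sale_date": ("DATE NOT NULL", "pos.txn_dt"),
        "net_revenue": ("DECIMAL(18,2)", "pos.net_amt"),
    },
}

def ddl(table, columns):
    # Generate a CREATE TABLE statement from the physical column definitions
    cols = ",\n  ".join(f"{name} {dtype}" for name, (dtype, _src) in columns.items())
    return f"CREATE TABLE {table} (\n  {cols}\n);"

def source_to_target(table, columns):
    # One row per column for ETL/BI developers: source field -> target column
    return [f"{src} -> {table}.{name}" for name, (_dtype, src) in columns.items()]

if __name__ == "__main__":
    for table, columns in MODEL.items():
        print(ddl(table, columns))
        print("\n".join(source_to_target(table, columns)), end="\n\n")
```

A script like this is only a stand-in to show how logical attributes, physical types, and source fields tie together for the ETL and BI developers mentioned above.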

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) –Senior- Snowflake As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in big data engineering space, having proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. Your Key Responsibilities Lead and Architect migration of data analytics environment from Teradata to Snowflake with performance and reliability Develop & deploy big data pipelines in a cloud environment using Snowflake cloud DW ETL design, development and migration of existing on-prem ETL routines to Cloud Service Interact with senior leaders, understand their business goals, contribute to the delivery of the workstreams Design and optimize model codes for faster execution Skills And Attributes For Success Hands on developer in the field of data warehousing, ETL Hands on development experience in Snowflake. Experience in Snowflake modelling - roles, schema, databases. Experience in Integrating with third-party tools, ETL, DBT tools Experience in Snowflake advanced concepts like setting up resource monitors and performance tuning would be preferable Applying object-oriented and functional programming styles to real-world Big Data Engineering problems using Java/Scala/Python Develop data pipelines to perform batch and Real - Time/Stream analytics on structured and unstructured data. Data processing patterns, distributed computing and in building applications for real-time and batch analytics. Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing. To qualify for the role, you must have Be a computer science graduate or equivalent with 3 - 7 years of industry experience Have working experience in an Agile base delivery methodology (Preferable) Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal formal and informal). 
Be a technical expert on all aspects of Snowflake
Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
Provide guidance on how to resolve customer-specific technical challenges

Ideally, you’ll also have
Client management skills

What We Look For
Minimum 5 years of experience as Architect on Analytics solutions and around 2 years of experience with Snowflake.
People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
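The posting above calls out Snowflake resource monitors and performance tuning. As a hedged illustration (the account, warehouse, and monitor names are invented and the credentials are placeholders), a resource monitor can be created and attached to a warehouse through the snowflake-connector-python driver:

```python
# Illustrative sketch only: creates a resource monitor and attaches it to a warehouse.
# Account/user/warehouse/monitor names below are placeholders, not real values.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder; prefer key-pair auth or SSO in practice
    role="ACCOUNTADMIN",       # resource monitors generally require elevated privileges
)

statements = [
    # Cap monthly credit usage and suspend the warehouse as the quota is approached
    """
    CREATE OR REPLACE RESOURCE MONITOR etl_monitor
      WITH CREDIT_QUOTA = 100
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 95 PERCENT DO SUSPEND
               ON 100 PERCENT DO SUSPEND_IMMEDIATE
    """,
    # Attach the monitor to the warehouse used by the migrated ETL jobs
    "ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```

Equivalent statements can of course be run directly in a Snowflake worksheet; the driver is shown only because the posting pairs Snowflake work with Python-based pipeline development.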

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) –Senior- Snowflake As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in big data engineering space, having proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. Your Key Responsibilities Lead and Architect migration of data analytics environment from Teradata to Snowflake with performance and reliability Develop & deploy big data pipelines in a cloud environment using Snowflake cloud DW ETL design, development and migration of existing on-prem ETL routines to Cloud Service Interact with senior leaders, understand their business goals, contribute to the delivery of the workstreams Design and optimize model codes for faster execution Skills And Attributes For Success Hands on developer in the field of data warehousing, ETL Hands on development experience in Snowflake. Experience in Snowflake modelling - roles, schema, databases. Experience in Integrating with third-party tools, ETL, DBT tools Experience in Snowflake advanced concepts like setting up resource monitors and performance tuning would be preferable Applying object-oriented and functional programming styles to real-world Big Data Engineering problems using Java/Scala/Python Develop data pipelines to perform batch and Real - Time/Stream analytics on structured and unstructured data. Data processing patterns, distributed computing and in building applications for real-time and batch analytics. Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing. To qualify for the role, you must have Be a computer science graduate or equivalent with 3 - 7 years of industry experience Have working experience in an Agile base delivery methodology (Preferable) Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal formal and informal). 
Be a technical expert on all aspects of Snowflake
Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
Provide guidance on how to resolve customer-specific technical challenges

Ideally, you’ll also have
Client management skills

What We Look For
Minimum 5 years of experience as Architect on Analytics solutions and around 2 years of experience with Snowflake.
People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Bengaluru

Hybrid


Job Title: Teradata Developer
Location: Bangalore
Experience: 6+ years
Type: Full-time
If you are interested, please share your updated resume at Mohd.hashim@thehrsolutions.in

Job Description:
We are looking for a highly skilled Teradata Developer with deep expertise in SQL development, performance tuning, and UNIX scripting. The ideal candidate will be responsible for developing, optimizing, and maintaining complex SQL queries and Teradata processes to support enterprise-level data solutions.

Key Responsibilities:
Design, develop, and optimize complex SQL queries in Teradata.
Work with large datasets to implement ETL processes and data pipelines.
Develop and maintain UNIX shell scripts for job automation and data processing.
Perform performance tuning and query optimization.
Collaborate with business analysts and data engineers to meet data requirements.

Required Skills:
Expert-level SQL and Teradata development.
Strong experience with UNIX.
Good understanding of data warehousing concepts.
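As a rough, hedged illustration of the Teradata SQL and tuning work described here (the host, credentials, and table/column names are invented), the teradatasql Python driver can run an EXPLAIN before deploying a heavy query and refresh statistics on the join columns:

```python
# Hedged sketch: checking a query plan and refreshing statistics before an ETL run.
# Host, credentials, and table/column names are placeholders.
import teradatasql

QUERY = """
SELECT c.region_cd,
       SUM(s.net_amt) AS net_revenue
FROM   sales s
JOIN   customer c
  ON   c.cust_id = s.cust_id
WHERE  s.txn_dt BETWEEN DATE '2024-01-01' AND DATE '2024-12-31'
GROUP  BY c.region_cd
"""

with teradatasql.connect(host="tdhost", user="etl_user", password="***") as con:
    with con.cursor() as cur:
        # Inspect the optimizer plan first; look for full-table scans or product joins.
        cur.execute("EXPLAIN " + QUERY)
        for (line,) in cur.fetchall():
            print(line)

        # Keep statistics fresh on join/filter columns so the optimizer picks good plans.
        cur.execute("COLLECT STATISTICS COLUMN (cust_id) ON sales")
        cur.execute("COLLECT STATISTICS COLUMN (txn_dt) ON sales")

        cur.execute(QUERY)
        for region, revenue in cur.fetchall():
            print(region, revenue)
```

The same pattern is easy to wrap in a UNIX shell script or cron entry, which covers the job-automation side of the role.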

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


You Lead the Way. We’ve Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

Experience writing software in Python or similar.
Experience with data structures, algorithms, and software design.
Exposure to Data Science including Predictive Modelling.
Develop algorithms in multilingual conversational systems.
Solve real-world scenarios for user commands and requests by identifying the right LLM models, tooling and frameworks.
Proven experience in developing and working with large language models (GPT-3, BERT, T5, etc.) and productionizing them on the cloud.
Strong foundation in machine learning concepts and techniques, including deep learning architectures, natural language processing, and text generation.
Proficiency in programming languages such as Python, TensorFlow, PyTorch, and related libraries for model development and deployment.
Demonstrated ability to design, train, fine-tune, and optimize large language models for specific tasks.
Expertise in pre-processing and cleaning large datasets for training models.
Familiarity with data augmentation techniques to enhance model performance.
Knowledge of LLM operations, including evaluating model performance using appropriate metrics and benchmarks.
Ability to iterate and improve models based on evaluation results.
Experience in deploying language models in production environments and integrating them into applications, platforms, or services.
Exposure to building predictive models using machine learning through all phases of development, from design through training, evaluation, validation, and implementation.
Experience with modern AI/ML & NLP frameworks (e.g. TensorFlow), dialogue managers (e.g. Rasa), search (e.g. Google BERT, GPT-3), parsers (e.g. Dialogflow).
Review architecture and provide technical guidance for engineers
Perform statistical analysis of results and refine models
Experience with various data architectures, the latest tools, and current and future trends in the data engineering space, especially Big Data, streaming and cloud technologies like GCP, AWS, Azure.
Hands-on experience with Big Data technologies (Spark, Kafka, Hive, etc.) and at least one Big Data implementation on platforms like Cornerstone, Teradata, etc.
Experience with visualization tools like Tableau, Power BI, etc.
Experience with complex, high-volume, multi-dimensional data, as well as ML/AI models for unstructured, structured, and streaming datasets.
Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks
Exposure to building cloud-native platforms on a modern tech stack: AWS, Java, Spring Framework, RESTful APIs, and container-based applications.
Ability to learn new tools and paradigms in data engineering and science
Proven experience attracting, hiring, retaining, and leading top engineering talent.
Creative, passionate, and experienced leader of both people and technology
Team management savvy (e.g., planning, budgetary control, people management, vendor management, etc.).
Experience with DevOps, reliability engineering, and platform monitoring
Well versed in Agile, DevOps and Program Management methods
Bachelor's degree with a preference for Computer Science

Minimum Qualifications
3 years of experience with applying Agile methodologies
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related STEM field
3+ years of experience with Java, microservices, React framework.
1 year of experience with public cloud platform (GCP, AWS, ...) optimization, enabling managed and serverless services.

Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related STEM field
3+ years of experience with Python, microservices, React framework.

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
Competitive base salaries
Bonus incentives
Support for financial-well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
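Since the role centers on building and productionizing large language models, here is a minimal, hedged sketch of loading a pre-trained transformer with the Hugging Face transformers library and scoring text. The checkpoint, labels, and sample utterances are illustrative only and not specific to any Amex system.

```python
# Illustrative only: sentiment-style scoring with a pre-trained transformer.
# Model choice, labels, and inputs are placeholders for whatever the task requires.
from transformers import pipeline

# A small pre-trained checkpoint; in practice you would fine-tune on task-specific data.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

utterances = [
    "I want to dispute a charge on my card",
    "Thanks, that resolved my issue quickly",
]

for text in utterances:
    result = classifier(text)[0]          # e.g. {"label": "NEGATIVE", "score": 0.98}
    print(f"{text!r} -> {result['label']} ({result['score']:.2f})")
```

Fine-tuning on labeled data, evaluation against task-specific benchmarks, and deployment behind an API would follow the same library's Trainer and pipeline abstractions.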

Posted 1 week ago

Apply

3.0 years

0 Lacs

Telangana, India

On-site


Our Company
At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You’ll Do
This role will require you to:
Design, develop, and maintain scalable and high-performing database features
Write efficient, scalable, and clean code primarily in C/C++
Collaborate with cross-functional teams to define, design, and ship new features
Ensure the availability, reliability, and performance of deployed applications
Integrate with CI/CD pipelines to facilitate seamless deployment and development cycles
Monitor and optimize application performance and troubleshoot issues as needed
Evaluate, investigate, and tune/optimize the performance of the application
Resolve customer incidents and provide support to Customer Support and Operations teams
You will be successful in achieving measurable improvements in software performance and user satisfaction.

Who You’ll Work With
You will join a high-performing engineering team with:
Emphasis on innovation, continuous learning, and open communication
Strong focus on mutual respect and empowering team members
Commitment to celebrating diverse perspectives and fostering professional growth
This is an individual contributor role, working closely with team members and reporting to an Engineering Manager.

What Makes You a Qualified Candidate
B.Tech/M.Tech/MCA in CSE disciplines
3-5 years of relevant industry experience
Expert-level knowledge of C/C++
Working experience with data structures, REST APIs, and Parquet files in Linux environments
Experience in one or more public cloud environments – AWS, Azure or Google Cloud
Experience with Delta Lake and Iceberg architecture, and integration of data lakes with databases
Work experience providing the ability to read and write the catalog, metadata, manifest files, and manifest lists of Iceberg and Delta Lake

What You’ll Bring
You will be a preferred candidate if you have:
Working knowledge of the Teradata database
A proactive and solution-oriented mindset with a passion for technology and continuous learning
An ability to work independently and take initiative while contributing to the team’s success
Creativity and adaptability in a dynamic environment
A strong sense of ownership, accountability, and a drive to make an impact

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
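The role above touches Parquet internals and table-format metadata (Iceberg/Delta Lake manifests). The production code would be C/C++, but a quick way to see the structures involved is the pyarrow library; this is a hedged sketch and the file path is a placeholder.

```python
# Sketch: inspecting Parquet file structure (row groups, schema, column stats).
# The path below is a placeholder; any local Parquet file will do.
import pyarrow.parquet as pq

pf = pq.ParquetFile("sales.parquet")

meta = pf.metadata
print("rows:", meta.num_rows)
print("row groups:", meta.num_row_groups)
print("columns:", meta.num_columns)
print(pf.schema_arrow)            # logical schema as Arrow types

# Per-row-group, per-column statistics are what engines use to prune data at
# planning time; Iceberg and Delta Lake manifests carry similar file-level stats.
for rg in range(meta.num_row_groups):
    col = meta.row_group(rg).column(0)
    stats = col.statistics
    if stats is not None:
        print(rg, col.path_in_schema, stats.min, stats.max, stats.null_count)
```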

Posted 1 week ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics company. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
The Principal Data Scientist (pre-sales) is an experienced and expert Data Scientist, able to provide industry thought-leadership on Analytics and its application across industries and across use-cases. The Principal Data Scientist supports the account team in framing business problems and in identifying analytic solutions that leverage Teradata technology and that are disruptive, innovative - and above all, practical. An articulate and compelling communicator, the Principal Data Scientist establishes our position as an important partner for advanced analytics with customers and prospects and is a trusted advisor to executives, senior managers and fellow data scientists alike across a range of target accounts. They are also a hands-on practitioner who is ready, willing and able to roll up their sleeves and deliver POC and short-term pre-sales engagements. The Principal Data Scientist has an excellent theoretical and practical understanding of statistics and machine learning and has a strong track record of applying this understanding at scale to drive business benefit. They are insanely curious, a natural problem-solver, and able to effectively promote Teradata technology and solutions to our customers.

Who You’ll Work With
The successful candidate will work with other expert team members to:
Provide pre-sales support at an executive level to the Teradata account teams at a local country, Geo and an International Theatre level, helping them to position and sell complex Analytic solutions that drive sales of Teradata software.
Provide strategic pre-sales consulting to executives and senior managers in our target market.
Support the delivery of PoC and PoV projects that demonstrate the viability and applicability of Analytic use-cases and the superiority of Teradata solutions and services.
Work with the extended Account team and Sales Analytics Specialists to develop new Analytic propositions that are aligned with industry trends and customer requirements.

What Makes You a Qualified Candidate
Proven hands-on experience of complex analytics at scale, for example in the areas of IoT and sensor data.
Experience with Teradata partners' analytical products, Cloud Service providers such as AzureML and SageMaker, and partner products such as Dataiku and H2O.
Strong hands-on programming skills in at least one major analytic programming language and/or tool in addition to SQL.
Strong understanding of data engineering and database systems.
Recognised in the local country, geo and International Theatre as the go-to expert.

What You’ll Bring
Expertise in Data Science with a strong theoretical grounding in statistics, advanced analytics, and machine learning, and at least 10 years of real-world experience in the application of advanced analytics.
A passion for knowledge sharing and a demonstrated commitment to continuous professional development.
A belief in Teradata's Analytic solutions and services, and a commitment to working with the product, engineering, and consulting teams to ensure that they continue to lead the market.
An ability to turn complex technical subject matter into relatable, easy-to-digest content for senior audiences.
A degree-level qualification (preferably Masters or PhD) in Statistics, Data Science, the physical or biological sciences or a related discipline.

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Maharashtra, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics company. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
The Principal Data Scientist (pre-sales) is an experienced and expert Data Scientist, able to provide industry thought-leadership on Analytics and its application across industries and across use-cases. The Principal Data Scientist supports the account team in framing business problems and in identifying analytic solutions that leverage Teradata technology and that are disruptive, innovative - and above all, practical. An articulate and compelling communicator, the Principal Data Scientist establishes our position as an important partner for advanced analytics with customers and prospects and is a trusted advisor to executives, senior managers and fellow data scientists alike across a range of target accounts. They are also a hands-on practitioner who is ready, willing and able to roll up their sleeves and deliver POC and short-term pre-sales engagements. The Principal Data Scientist has an excellent theoretical and practical understanding of statistics and machine learning and has a strong track record of applying this understanding at scale to drive business benefit. They are insanely curious, a natural problem-solver, and able to effectively promote Teradata technology and solutions to our customers.

Who You’ll Work With
The successful candidate will work with other expert team members to:
Provide pre-sales support at an executive level to the Teradata account teams at a local country, Geo and an International Theatre level, helping them to position and sell complex Analytic solutions that drive sales of Teradata software.
Provide strategic pre-sales consulting to executives and senior managers in our target market.
Support the delivery of PoC and PoV projects that demonstrate the viability and applicability of Analytic use-cases and the superiority of Teradata solutions and services.
Work with the extended Account team and Sales Analytics Specialists to develop new Analytic propositions that are aligned with industry trends and customer requirements.

What Makes You a Qualified Candidate
Proven hands-on experience of complex analytics at scale, for example in the areas of IoT and sensor data.
Experience with Teradata partners' analytical products, Cloud Service providers such as AzureML and SageMaker, and partner products such as Dataiku and H2O.
Strong hands-on programming skills in at least one major analytic programming language and/or tool in addition to SQL.
Strong understanding of data engineering and database systems.
Recognised in the local country, geo and International Theatre as the go-to expert.

What You’ll Bring
Expertise in Data Science with a strong theoretical grounding in statistics, advanced analytics, and machine learning, and at least 10 years of real-world experience in the application of advanced analytics.
A passion for knowledge sharing and a demonstrated commitment to continuous professional development.
A belief in Teradata's Analytic solutions and services, and a commitment to working with the product, engineering, and consulting teams to ensure that they continue to lead the market.
An ability to turn complex technical subject matter into relatable, easy-to-digest content for senior audiences.
A degree-level qualification (preferably Masters or PhD) in Statistics, Data Science, the physical or biological sciences or a related discipline.

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics company. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
The Principal Data Scientist (pre-sales) is an experienced and expert Data Scientist, able to provide industry thought-leadership on Analytics and its application across industries and across use-cases. The Principal Data Scientist supports the account team in framing business problems and in identifying analytic solutions that leverage Teradata technology and that are disruptive, innovative - and above all, practical. An articulate and compelling communicator, the Principal Data Scientist establishes our position as an important partner for advanced analytics with customers and prospects and is a trusted advisor to executives, senior managers and fellow data scientists alike across a range of target accounts. They are also a hands-on practitioner who is ready, willing and able to roll up their sleeves and deliver POC and short-term pre-sales engagements. The Principal Data Scientist has an excellent theoretical and practical understanding of statistics and machine learning and has a strong track record of applying this understanding at scale to drive business benefit. They are insanely curious, a natural problem-solver, and able to effectively promote Teradata technology and solutions to our customers.

Who You’ll Work With
The successful candidate will work with other expert team members to:
Provide pre-sales support at an executive level to the Teradata account teams at a local country, Geo and an International Theatre level, helping them to position and sell complex Analytic solutions that drive sales of Teradata software.
Provide strategic pre-sales consulting to executives and senior managers in our target market.
Support the delivery of PoC and PoV projects that demonstrate the viability and applicability of Analytic use-cases and the superiority of Teradata solutions and services.
Work with the extended Account team and Sales Analytics Specialists to develop new Analytic propositions that are aligned with industry trends and customer requirements.

What Makes You a Qualified Candidate
Proven hands-on experience of complex analytics at scale, for example in the areas of IoT and sensor data.
Experience with Teradata partners' analytical products, Cloud Service providers such as AzureML and SageMaker, and partner products such as Dataiku and H2O.
Strong hands-on programming skills in at least one major analytic programming language and/or tool in addition to SQL.
Strong understanding of data engineering and database systems.
Recognised in the local country, geo and International Theatre as the go-to expert.

What You’ll Bring
Expertise in Data Science with a strong theoretical grounding in statistics, advanced analytics, and machine learning, and at least 10 years of real-world experience in the application of advanced analytics.
A passion for knowledge sharing and a demonstrated commitment to continuous professional development.
A belief in Teradata's Analytic solutions and services, and a commitment to working with the product, engineering, and consulting teams to ensure that they continue to lead the market.
An ability to turn complex technical subject matter into relatable, easy-to-digest content for senior audiences.
A degree-level qualification (preferably Masters or PhD) in Statistics, Data Science, the physical or biological sciences or a related discipline.

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position - Data Engineer
Location - Pune
Experience - 6+ years

Must Have:
Tech-savvy engineer - willing and able to learn new skills and track industry trends
6+ years of solid data engineering experience, especially in open-source, data-intensive, distributed environments, with experience in Big Data-related technologies like Spark, Hive, HBase, Scala, etc.
Programming background – preferred Scala / Python.
Experience in Scala, Spark, PySpark and Java (good to have).
Experience in migration of data to AWS or any other cloud.
Experience in SQL and NoSQL databases.
Optional: Model the data set from Teradata to the cloud.
Experience in building ETL pipelines
Experience in building data pipelines in AWS (S3, EC2, EMR, Athena, Redshift) or any other cloud.
Self-starter & resourceful personality with the ability to manage pressure situations
Exposure to Scrum and Agile development best practices
Experience working with geographically distributed teams

Role & Responsibilities:
Build data and ETL pipelines in AWS
Support migration of data to the cloud using Big Data technologies like Spark, Hive, Talend, Python
Interact with customers on a daily basis to ensure smooth engagement
Responsible for timely and quality deliveries.
Fulfill organization responsibilities – sharing knowledge and experience with other groups in the organization, conducting various technical sessions and training.

Education:
Bachelor’s degree in Computer Science, Software Engineering, MIS or an equivalent combination of education and experience

Every individual comes with a different set of skills and qualities, so even if you don’t tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow!
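A minimal, hedged PySpark sketch of the kind of S3-to-S3 ETL step described above. Bucket names, paths, and column names are invented placeholders, and the job assumes the cluster (for example EMR) already has S3 credentials configured.

```python
# Hedged sketch: a small batch ETL step on Spark (e.g., on EMR), reading raw CSV
# from S3, applying basic cleansing/aggregation, and writing partitioned Parquet.
# Bucket names, paths, and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .csv("s3://example-raw-bucket/orders/2024-06-01/")
)

cleaned = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "amount"])
    .dropDuplicates(["order_id"])
)

daily_revenue = (
    cleaned
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.countDistinct("order_id").alias("orders"))
)

(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/daily_revenue/")
)

spark.stop()
```

On EMR the same script could be submitted with spark-submit and scheduled from Airflow or a simple cron entry.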

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


Our Company
At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You'll Do
Teradata is looking to add a new Program Manager / Product Owner to our existing Global Sales Operations Tools & Technologies team of analytical, problem-solving and solution-oriented product owners and program managers with experience supporting sales teams. Day-to-day focus is on implementation, adoption, hygiene and documenting best practices while being on the leading edge of developing and representing business requirements for our Sales and Channel Partner Cloud Platform and other sales technologies. This position will work closely with Sales and GTM Leadership, Account Teams, Partner Team (Global Alliances / Client Relationship Management), Sales Operations Managers, Technology and Enablement teams, Marketing, and IT to define and deliver channel partner technology solutions and business processes aligned with our strategy and roadmap. The ideal candidate will be data driven, intellectually curious, a fast learner, and able to move quickly while maintaining focus on high-impact projects aligned to a global strategy, and to develop and make recommendations on business technology and business process improvements. This is a full-time individual contributor position based in a Teradata office in India.

Responsibilities:
Product Owner for assigned capability / program area, representing the business stakeholder(s) and/or customer(s), and process owner for such designated areas and capabilities
Define, document, and share CRM best practices to ensure sales processes and terminology are consistently understood and applied across the organization and regions
Develop and make recommendations on business process improvements and impacts to different business / sales and partner areas
Build and manage relationships with cross-functional teams such as Geographic Sales Leadership and Sales Operations Managers, Marketing, and IT to ensure that tools and technologies are set up and aligned to effectively support Teradata’s coverage models around the world.
Work closely with Sales Enablement to identify training needs for leadership and account team members on technology, tools, business practices and processes.
Actively participate in roadmap identification and prioritization with Business and IT partners, managing all phases of the program / project delivery cycle and consulting / bringing recommendations for programs / projects.
Determine the business impact of current and future technologies for the GTM Organization

Who You'll Work With
You will interact directly with field sales, sales leaders, and other team members to capture feedback for sales technology and process improvements to drive adoption and deliver business value.

What Makes You a Qualified Candidate
3-5 years of experience as an Agile / Scrum product or process owner, or 3-5 years of Sales Operations, Sales Support, or Sales Field Impacting role experience.
Direct experience in managing and driving value from CRM (Salesforce.com) and sales tools and leading / partnering cross-functionally to deliver complex programs and projects.
Experience with direct sales and resellers/distribution partner processes in a SaaS/Cloud enterprise company or software vendor, and knowledge of how these processes integrate into existing systems/tools.
Experience with Salesforce Partner Relationship Management (PRM), Salesforce Communities, Partner platforms and a good understanding of the different channel partner types is a plus.
Must possess business acumen, field-facing acumen, strong analytical, troubleshooting, problem-solving, and project management skills.
Proactive and passionate: independently capable of seeking information, solving conceptual problems, corralling resources, and delivering results in challenging situations.
Ability to manage multiple concurrent projects and drive initiatives in a cross-functional environment.
Solution Business Consulting skills, including analysis/evaluation of business and/or system processes and functional recommendations, highly desired.
Experience working and communicating with senior executives to solve complex business problems.
Bachelor’s Degree in an analytical field (e.g. Computer Science, Information Systems, Business Administration, Engineering, Mathematics, or Statistics)

What You Will Bring
Project/Program Management or Agile / Product Owner certification a plus, but not required
Salesforce or PRM certifications a plus, but not required

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana

On-site


Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002085
Marketing / Loyalty / Mileage Plus / Alliances
Job Type: Full-Time
Posted Date: 06/05/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
Our Marketing and Loyalty team is the strategic force behind United’s industry-leading brand and experience, supporting revenue growth by turning customers into lifelong United flyers. Our marketing communications, market research and brand teams drive travelers’ familiarity and engagement with the United brand. Our product, design and service teams bring the signature United brand to life in our clubs and onboard our aircraft, ensuring a seamless, premier experience. And when customers choose United again and again, that’s because the loyalty team has been hard at work crafting an award-winning program. Our loyalty team manages United MileagePlus®, building travel and lifestyle partnerships that customers can engage with every day, and works with our Star Alliance partners to ensure United can take you anywhere you want to go.

Job overview and responsibilities
United Airlines reaches out to customers and potential travelers via digital campaigns with new information, travel inspiration, personalized offers, promos, etc. The Digital Marketing & Personalized Offers team at IKC supports all such digital acquisition initiatives with insights to help strategize campaigns and analytics to help measure performance. We work closely with stakeholders in the US to bring these campaigns to life and continuously improve performance with learnings and actionable insights.
Assist in campaign planning, targeting and audience identification; measure campaign results and performance using data analysis
Gather and organize data from various sources using SQL/Python/R; continuously develop and demonstrate improved analysis methodologies
Create content for and deliver presentations to United leadership and external stakeholders
Ensure alignment and prioritization with business objectives and initiatives – help teams make faster, smarter decisions
Conduct exploratory analysis, identify opportunities, and proactively suggest initiatives to meet marketing objectives
Create, modify and automate reports and dashboards - take ownership of reporting structure and metrics, clearly and effectively communicate relevant information to decision makers using data visualization tools
Ensure seamless stakeholder management and keep lines of communication open with all stakeholders
This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications
What’s needed to succeed (Minimum Qualifications):
Bachelor's degree
2+ years of experience in Analytics and working with analytical tools
Proven comfort and an intellectual curiosity for working with very large data sets
Experience in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
Proficiency in using database querying tools and writing complex queries and procedures using Teradata SQL and/or Microsoft SQL
Familiarity with one or more reporting tools – Spotfire/Tableau/PowerBI
Advanced level comfort with Microsoft Office, especially Excel and PowerPoint
Ability to communicate analysis in a clear and precise manner
High sense of ownership of work
Ability to work under time constraints
Must be legally authorized to work in India for any employer without sponsorship
Must be fluent in English (written and spoken)
Successful completion of interview required to meet job qualification
Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
Master's degree
Bachelor’s degree in a quantitative field like Math, Statistics, Analytics and/or Business
SQL/Python/R
Visualization tools – Tableau/Spotfire/PowerBI
Understanding of digital acquisition channels
Strong knowledge of either Python or R
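To make the measurement-and-reporting side of this role concrete, here is a hedged sketch of summarizing digital campaign performance in pandas for a dashboard feed. The input extract, column names, and metrics are illustrative placeholders; in practice the data would come from Teradata SQL rather than a CSV.

```python
# Hedged sketch: measuring digital campaign performance for a dashboard feed.
# The input extract, column names, and metrics are illustrative placeholders.
import pandas as pd

# Assume a daily extract of campaign sends with outcome flags (from Teradata or a CSV drop)
events = pd.read_csv(
    "campaign_events.csv",
    parse_dates=["send_date"],
    dtype={"campaign_id": "string"},
)

summary = (
    events
    .groupby(["campaign_id", "send_date"], as_index=False)
    .agg(
        sends=("member_id", "count"),
        opens=("opened", "sum"),
        clicks=("clicked", "sum"),
        bookings=("booked", "sum"),
    )
)

# Standard funnel rates used to compare campaigns and audiences
summary["open_rate"] = summary["opens"] / summary["sends"]
summary["click_to_open"] = summary["clicks"] / summary["opens"].replace(0, pd.NA)
summary["conversion_rate"] = summary["bookings"] / summary["sends"]

# Wide view for a reporting tool (one row per date, one column per campaign)
dashboard = summary.pivot(index="send_date", columns="campaign_id", values="conversion_rate")
print(dashboard.tail())
```

A frame like this is typically pushed to Spotfire, Tableau, or Power BI as an extract to feed the dashboards referenced above.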

Posted 1 week ago

Apply

0.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka

On-site


Tesco India • Bengaluru, Karnataka, India • Hybrid • Full-Time • Permanent • Apply by 30-Jun-2025

About the role
Responsible to provide support via automation while devising efficient reporting solutions in alignment with customer and business needs

What is in it for you
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on the current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
Performance Bonus - Opportunity to earn additional compensation bonus based on performance, paid annually
Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company’s policy.
Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
Health is Wealth - Tesco promotes programmes that support a culture of health and wellness including insurance for colleagues and their family. Our medical insurance provides coverage for dependents including parents or in-laws.
Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

You will be responsible for
Understand business needs and have an in-depth understanding of Tesco processes
Be accountable for high quality and timely completion of specified reporting & dash-boarding work
Understand the end-to-end process of generating reports
Understand the underlying data sources
Action any change request received from partners
Develop user manuals for reporting procedures and related process changes
Handle new report development requests
Lead the transformation of reports into new-age tools and technologies
Provide solutions to issues related to report development and delivery
Maintain the log of issues, risks and mitigation plans
Identify operational improvements and apply solutions and automation using Python, Alteryx
Enhance and develop daily, weekly and periodic reports and dashboards using Advanced Excel, Advanced SQL, Hadoop, Teradata
Partner with stakeholders to identify problems, collaborate with them to brainstorm on the best possible reporting solution, and deliver solutions in the form of intelligent business reports / dashboards (Tableau, BI)
Follow our Business Code of Conduct and always act with integrity and due diligence

You will need
2-4 years' experience in analytics delivery in any one of domains like retail, CPG, telecom or hospitality, and for one of the following functional areas - marketing, supply chain, customer, merchandising, operations, finance or digital (preferred)
Advanced Excel; strong verbal and written communication
Advanced SQL, Big Data infrastructure, Hadoop, Hive, Python, Spark
Automation platforms Alteryx/Python
Advanced developer knowledge of Tableau, PowerBI
Logical reasoning
Eye for detail

About us
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.
Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single entity traditional shared services in Bengaluru, India (from 2004) to a global, purpose-driven solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation
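A hedged sketch of the report-automation pattern this role describes: build the periodic summary in pandas and write an Excel file that a dashboard or mailer can pick up. File names, columns, and the source extract are placeholders; in practice the frame would come from Advanced SQL against Teradata or Hive rather than a CSV.

```python
# Hedged sketch: automating a weekly report refresh.
# Source file, sheet names, and columns are placeholders; in practice the frame
# would come from a SQL pull against Teradata/Hive rather than a CSV.
from datetime import date
import pandas as pd

sales = pd.read_csv("weekly_sales_extract.csv", parse_dates=["week_start"])

by_category = (
    sales.groupby("category", as_index=False)
         .agg(revenue=("revenue", "sum"), units=("units", "sum"))
         .sort_values("revenue", ascending=False)
)

by_store = (
    sales.pivot_table(index="store_id", columns="week_start",
                      values="revenue", aggfunc="sum", fill_value=0)
)

out_path = f"weekly_report_{date.today():%Y_%m_%d}.xlsx"
with pd.ExcelWriter(out_path) as writer:   # requires openpyxl or xlsxwriter
    by_category.to_excel(writer, sheet_name="By Category", index=False)
    by_store.to_excel(writer, sheet_name="By Store")

print("Wrote", out_path)
```

Swapping the CSV read for a SQL pull and scheduling the script (Alteryx, cron, or an orchestrator) is usually all the automation step amounts to.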

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Position: Data Analyst
Location: Gurgaon
Timings: 12:00 PM to 10:00 PM

Role Overview
Do independent research, analyse, and present data as assigned
Expected to work in close collaboration with the EXL team and clients on commercial insurance actuarial projects for US/UK markets
Should be able to understand risk and underwriting, and replicate rating methodology
Develop and use collaborative relationships to facilitate the accomplishment of working goals
Working experience in the P&C insurance domain for US insurance markets is a must
Excellent written and verbal communication skills
Facilitate data requirements while working with actuaries
Have excellent SQL skills to extract data for scheduled processes and ad hoc requests
Automate manual processes and ETL pipelines using Python
Utilise/help migrate existing SAS processes from SAS to SAS Viya

Key Responsibilities
Collaborate with actuaries to understand their data and reporting needs related to premium, loss, and exposure analysis
Build and optimize complex SQL queries to extract, join, and aggregate large datasets from multiple relational sources
Develop and automate data pipelines in Python for ETL, data wrangling, and exploratory analytics
Use SAS for legacy processes, statistical outputs, and ad hoc data manipulation as required by actuarial models/processes
Validate data outputs for accuracy and consistency, troubleshoot discrepancies, and ensure data quality before delivery
Create documentation of data logic, process flows, and metadata on Confluence and SharePoint to ensure transparency and knowledge sharing
Contribute to continuous improvement by recommending process automation or optimization opportunities in existing workflows
Support dashboarding or visualization needs (optional) using tools like Power BI
Work in an agile or iterative environment with clear communication of progress, blockers, and timelines

Required Skillset
SQL (expert level): complex joins, subqueries, window functions, CTEs, query optimization and performance tuning, working with large tables in cloud/on-premise environments (Teradata, SQL Server, or equivalent) - an illustrative sketch follows this listing
Python (intermediate to expert): data wrangling using pandas and NumPy, script automation and API consumption, familiarity with Visual Studio, Jupyter, and modular Python scripting
SAS (intermediate): reading/writing from/to datasets, connecting with external sources, macros, PROC SQL
Knowledge of AWS is preferred
Experience with commercial insurance
Understanding of actuarial concepts such as loss triangles, reserving, and pricing
Exposure to Git, JIRA, Confluence
Proficiency in Excel and VBA macros (preferred)

Candidate Profile
Bachelor's/Master's degree in engineering, economics, mathematics, actuarial sciences, or a similar technical degree
A Master's in business or financial management is also suitable
Affiliation to IAI or IFoA, with at least 3 actuarial exams
3-8 years' experience in data analytics in the insurance or financial services industry, with a good understanding of actuarial concepts - pricing, reserving, and/or valuation
Demonstrated ability to work with actuarial or statistical teams in delivering high-quality data and insights
Strong problem-solving attitude and comfort with ambiguity in requirements
Strong ability to learn technical and business knowledge
Outstanding written and verbal communication skills
Excellent time and work management skills
Able to work in a fast-paced, continuously evolving environment and ready to take up uphill challenges
Able to understand cross-cultural differences and work with clients across the globe
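As a hedged illustration of the SQL-plus-Python combination this listing asks for, the sketch below uses a CTE with a window function to keep the latest record per policy and then aggregates earned premium by line of business, pulling the result into pandas. The schema, table, and column names (actuarial_db.policy_premium, policy_id, lob, earned_premium, valuation_date) are hypothetical, and it assumes the teradatasql DB-API driver.

```python
# Minimal sketch: CTE + window function in Teradata SQL, loaded into pandas
# for further wrangling or hand-off to actuaries. All object names are hypothetical.
import pandas as pd
import teradatasql

PREMIUM_SUMMARY_SQL = """
WITH latest_policy AS (
    SELECT policy_id,
           lob,
           earned_premium,
           ROW_NUMBER() OVER (PARTITION BY policy_id
                              ORDER BY valuation_date DESC) AS rn
    FROM   actuarial_db.policy_premium
)
SELECT lob,
       SUM(earned_premium) AS total_earned_premium,
       COUNT(*)            AS policy_count
FROM   latest_policy
WHERE  rn = 1
GROUP  BY lob
ORDER  BY lob
"""

def premium_summary(host: str, user: str, password: str) -> pd.DataFrame:
    # Keep only the most recent valuation per policy, then summarise by line of business.
    with teradatasql.connect(host=host, user=user, password=password) as con:
        return pd.read_sql(PREMIUM_SUMMARY_SQL, con)
```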

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Overall experience of 6 years in DW/BI technologies and a minimum of 5 years' development experience with the ETL DataStage 8.x/9.x tool
Design, develop, and maintain ETL processes using IBM DataStage to extract, transform, and load data from multiple sources into our data warehouse
Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications
Worked extensively on parallel jobs, sequences and, preferably, routines
Good conceptual knowledge of data warehousing and its various methodologies
Strong SQL skills in Teradata and other databases such as Oracle, SQL Server and DB2 (a post-load reconciliation sketch follows this listing)
Working knowledge of UNIX shell scripting
Good communication and presentation skills
Should be flexible with overlapping working hours
Should be able to work independently and act proactively
Develop, implement, and maintain best practices for DataStage

Mandatory skills: DataStage, SQL
Desired skills: Unix, PL/SQL
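To illustrate the kind of Teradata SQL check an ETL developer might automate alongside DataStage jobs, here is a minimal sketch of a post-load row-count reconciliation between a staging table and the warehouse target. The table names, load_date column, and connection details are hypothetical; it assumes the teradatasql DB-API driver, and a wrapper shell script could call it and fail the job sequence on a non-zero exit code.

```python
# Minimal sketch: compare staged vs. loaded row counts for a given load date.
# Object names (stg_db.sales_stage, edw_db.sales_fact, load_date) are hypothetical.
import sys
import teradatasql

COUNT_SQL = "SELECT COUNT(*) FROM {table} WHERE load_date = DATE '{load_date}'"

def reconcile(host: str, user: str, password: str, load_date: str) -> int:
    with teradatasql.connect(host=host, user=user, password=password) as con:
        cur = con.cursor()
        cur.execute(COUNT_SQL.format(table="stg_db.sales_stage", load_date=load_date))
        staged = cur.fetchone()[0]
        cur.execute(COUNT_SQL.format(table="edw_db.sales_fact", load_date=load_date))
        loaded = cur.fetchone()[0]
    if staged != loaded:
        print(f"Row count mismatch for {load_date}: staged={staged}, loaded={loaded}")
        return 1
    print(f"Reconciliation OK for {load_date}: {loaded} rows")
    return 0

if __name__ == "__main__":
    # e.g. python reconcile.py 2025-06-01
    sys.exit(reconcile("td-host.example.com", "etl_user", "********", sys.argv[1]))
```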

Posted 1 week ago

Apply

Exploring Teradata Jobs in India

Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.

Top Hiring Locations in India

  1. Bengaluru
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Teradata professionals.

Average Salary Range

The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.

Related Skills

In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.

Interview Questions

  • What is Teradata and how is it different from other database management systems? (basic)
  • Can you explain the difference between a join and a merge in Teradata? (medium)
  • How would you optimize a Teradata query for performance? (medium)
  • What are fallback tables in Teradata and why are they important? (advanced)
  • How do you handle duplicate records in Teradata? (basic)
  • What is the purpose of a collect statistics statement in Teradata? (medium)
  • Explain the concept of indexing in Teradata. (medium)
  • How does Teradata handle concurrency control? (advanced)
  • Can you describe the process of data distribution in Teradata? (medium)
  • What are the different types of locks in Teradata and how are they used? (advanced)
  • How would you troubleshoot performance issues in a Teradata system? (medium)
  • What is a Teradata View and how is it different from a Table? (basic)
  • How do you handle NULL values in Teradata? (basic)
  • Can you explain the difference between FastLoad and MultiLoad in Teradata? (medium)
  • What is the Teradata Parallel Transporter? (advanced)
  • How do you perform data migration in Teradata? (medium)
  • Explain the concept of fallback protection in Teradata. (advanced)
  • What are the different types of Teradata macros and how are they used? (advanced)
  • How do you monitor and manage Teradata performance? (medium)
  • What is the purpose of the Teradata QueryGrid? (advanced)
  • How do you optimize the storage of data in Teradata? (medium)
  • Can you explain the concept of Teradata indexing strategies? (advanced)
  • How do you handle data security in Teradata? (medium)
  • What are the best practices for Teradata database design? (medium)
  • How do you ensure data integrity in a Teradata system? (medium)
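A few of the basic and medium questions above (handling duplicates, collecting statistics, and handling NULL values) can be illustrated with short Teradata SQL statements. The sketch below issues them through the teradatasql DB-API driver; the database and table names (demo_db.customer_txn) and columns are hypothetical, and the statements are examples of one reasonable answer rather than the only correct one.

```python
# Hedged illustration of three interview topics: duplicates, statistics, NULLs.
# All object names are hypothetical.
import teradatasql

STATEMENTS = [
    # Duplicates: Teradata's QUALIFY clause filters on a window function,
    # keeping one row per customer (the most recent transaction).
    """
    SELECT customer_id, txn_date, txn_amount
    FROM   demo_db.customer_txn
    QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
                               ORDER BY txn_date DESC) = 1
    """,
    # Statistics: COLLECT STATISTICS gives the optimizer column demographics
    # it can use to build better join and access plans.
    "COLLECT STATISTICS ON demo_db.customer_txn COLUMN (customer_id)",
    # NULLs: COALESCE substitutes a default so aggregates are not skewed
    # by missing amounts.
    """
    SELECT customer_id, SUM(COALESCE(txn_amount, 0)) AS total_spend
    FROM   demo_db.customer_txn
    GROUP  BY customer_id
    """,
]

def run_examples(host: str, user: str, password: str) -> None:
    with teradatasql.connect(host=host, user=user, password=password) as con:
        cur = con.cursor()
        for sql in STATEMENTS:
            cur.execute(sql)
            # COLLECT STATISTICS returns no result set; the SELECTs do.
            if cur.description:
                print(cur.fetchall()[:5])
```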

Closing Remark

As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

