
823 Teradata Jobs - Page 31

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8 - 10 years

7 - 12 Lacs

Chennai

Work from Office


Altivate is a digital transformation enabler on a mission to help businesses find smarter and more innovative ways of doing business. We combine different knowledge and technologies to offer our clients tailored solutions and services that address their unique needs. Altivate provides end-to-end services and solutions based on industry best practices. Our technology competencies are vast and unique; they include SAP, AWS, Microsoft Azure, Microsoft Power BI, Google Cloud Platform, Teradata, Tableau, MicroStrategy, and more. We work with our clients on unravelling new business opportunities presented by new technologies, helping them become more resilient, sustainable, and profitable while efficiently improving their performance and bottom line. Altivate is a proud: SAP Gold Partner, SAP Certified Partner Center of Expertise, AWS Select Partner, Azure Partner, and GCP Partner.

Job Summary: We are looking for a Senior SAP FICO Consultant who will work mainly remotely from Egypt and is expected to carry out quality-assured projects for our clients. The ideal candidate must make use of SAP Best Practices, deliver impeccable solutions to the clients, and contribute towards building a Centre of Excellence within the organisation.

Duties and Responsibilities: Understand the functional requirements of the accounting function and processes such as GL, AR, AP, Banks, and Fixed Assets. Responsible for end-to-end delivery of the FI and CO modules through design, configuration, and deployment of the S/4HANA solution. Map SAP process flows for business processes and master data for FI and CO, including the various master data elements and related configuration for FI/CO and cross-module master data for the MM and SD modules. Good knowledge of Product Costing and CO-PA. Understand the basic CO reporting framework along with cycles for cost allocation/settlement, period-end actualization, etc. Understand the issues faced by end users on a day-to-day basis and provide solutions to their satisfaction. Conceptualize and be part of developments to ensure smooth running of the FICO function, meeting organisational requirements. Carry out Business Blueprints with respect to projects and/or user requirements and conceptualize solutions. Carry out unit testing and facilitate end-user testing.

Skills, knowledge, capabilities and experience required: 8-10 years of experience in SAP FICO on ECC and S/4HANA. Experience in S/4HANA implementation, with at least 4 end-to-end implementation projects and 2 or more S/4HANA projects. Hands-on SAP experience in FICO including New G/L, AP/AR, Banking, New Credit Management, Controlling areas, etc. Knowledge of Kingdom of Saudi Arabia taxation, with experience in VAT. SAP certification (preferred but not mandatory). Sound knowledge of integration with SD/MM and other SAP modules. Excellent verbal and written communication skills as well as interpersonal skills. Ability to maintain composure under pressure in ambiguous and complex situations. Positive attitude and the ability to work in changing situations, adapting in problem solving and multi-tasking. Exposure to SAP FM (Funds Management) will be preferred.

*Please make sure you don't leave the application question(s) unanswered, as candidates who fail to genuinely respond to our questions at this stage won't have a high chance of being shortlisted by our talent acquisition team.

Posted 3 months ago

Apply

5 - 9 years

7 - 12 Lacs

Hyderabad

Work from Office


About Altivate: Altivate is a digital transformation enabler on a mission to help businesses find smarter and more innovative ways of doing business. With headquarters in Saudi Arabia and three regional offices in Egypt, India, and Jordan, we combine different knowledge and technologies to offer our clients tailored solutions and services that address their unique needs. Altivate provides end-to-end services and solutions based on industry best practices. Our technology competencies are vast and unique; they include SAP, AWS, Microsoft Azure, Microsoft Power BI, Google Cloud Platform, Teradata, Tableau, MicroStrategy, and more. We work with our clients on unravelling new business opportunities presented by new technologies, helping them become more resilient, sustainable, and profitable while efficiently improving their performance and bottom line. Altivate is a proud SAP Gold Partner, SAP Certified Partner Center of Expertise, AWS Select Partner, Azure Partner, and GCP Partner. We're also a proud recent holder of SAP MEA North's Delivery Excellence award for 2022.

Purpose and Objective: Altivate seeks a Senior Sales Manager in Hyderabad, working in a hybrid setting, to be responsible for regional sales operations and deliverables including, but not limited to, prospecting, qualifying, selling, and closing software/service revenue for both new and existing Altivate customers. Fully remote candidates can also be considered, and there is a possibility of travel to KSA for sales and marketing activities.

Job Summary: Accountable for sustaining and enhancing the continuous growth of Altivate's SAP business in India, KSA, and the wider Middle East region. In alignment with the CEO, the Sales Manager will ensure the sales team stays on track to meet and exceed Altivate's goals.

Duties and Responsibilities:
- Work hand-in-hand with the sales team to build and maintain strong relationships with key decision-making executives within the assigned industries in the region.
- Formulate deliberate sales plans and strategies for each region, customer profiles, targeted programs, application descriptions, forecast reports, and action items.
- Develop and maintain technical and marketing knowledge of the different SAP products.
- Maintain and continuously update in-depth knowledge of all key competitors.
- Assume full responsibility for quota attainment in alignment with Altivate's plans.

Skills, knowledge, capabilities, and experience required:
- 5+ years of experience in a sales/business development related discipline, specifically in ERP solutions and preferably in SAP products.
- A solid history of achieving large ERP closures.
- Excellent general soft skills including interpersonal, communication, problem-solving, presentation, negotiation, and customer-facing skills.
- Must be fluent in English.
- Willing to travel to Riyadh, KSA.
- Please make sure you answer our application questions to enable us to better understand the relevance of your experience at this stage.

Posted 3 months ago

Apply

5 - 9 years

7 - 12 Lacs

Kochi

Work from Office


About Altivate: Altivate is a digital transformation enabler on a mission to help businesses find smarter and more innovative ways of doing business. With headquarters in Saudi Arabia and three regional offices in Egypt, India, and Jordan, we combine different knowledge and technologies to offer our clients tailored solutions and services that address their unique needs. Altivate provides end-to-end services and solutions based on industry best practices. Our technology competencies are vast and unique; they include SAP, AWS, Microsoft Azure, Microsoft Power BI, Google Cloud Platform, Teradata, Tableau, MicroStrategy, and more. We work with our clients on unravelling new business opportunities presented by new technologies, helping them become more resilient, sustainable, and profitable while efficiently improving their performance and bottom line. Altivate is a proud SAP Gold Partner, SAP Certified Partner Center of Expertise, AWS Select Partner, Azure Partner, and GCP Partner. We're also a proud recent holder of SAP MEA North's Delivery Excellence award for 2022.

Purpose and Objective: Altivate seeks a Senior Sales Manager in Kochi, working in a hybrid setting, to be responsible for regional sales operations and deliverables including, but not limited to, prospecting, qualifying, selling, and closing software/service revenue for both new and existing Altivate customers. Fully remote candidates can also be considered, and there is a possibility of travel to KSA for sales and marketing activities.

Job Summary: Accountable for sustaining and enhancing the continuous growth of Altivate's SAP business in India, KSA, and the wider Middle East region. In alignment with the CEO, the Sales Manager will ensure the sales team stays on track to meet and exceed Altivate's goals.

Duties and Responsibilities:
- Work hand-in-hand with the sales team to build and maintain strong relationships with key decision-making executives within the assigned industries in the region.
- Formulate deliberate sales plans and strategies for each region, customer profiles, targeted programs, application descriptions, forecast reports, and action items.
- Develop and maintain technical and marketing knowledge of the different SAP products.
- Maintain and continuously update in-depth knowledge of all key competitors.
- Assume full responsibility for quota attainment in alignment with Altivate's plans.

Skills, knowledge, capabilities, and experience required:
- 5+ years of experience in a sales/business development related discipline, specifically in ERP solutions and preferably in SAP products.
- A solid history of achieving large ERP closures.
- Excellent general soft skills including interpersonal, communication, problem-solving, presentation, negotiation, and customer-facing skills.
- Must be fluent in English.
- Willing to travel to Riyadh, KSA.
- Please make sure you answer our application questions to enable us to better understand the relevance of your experience at this stage.

Posted 3 months ago

Apply

3 - 6 years

18 - 20 Lacs

Chennai, Bengaluru

Work from Office


Role & responsibilities: Manages and optimizes the performance, security, and reliability of databases on the Snowflake Data Cloud platform, covering the areas below. Experience in cloud (Azure or GCP) and automation platforms (Terraform, Azure DevOps, Python or shell scripting) is a must. Good to have: DBA experience on other database platforms such as PostgreSQL, MySQL, and MongoDB. Intermediate-level understanding of Snowflake architecture and database concepts. Key areas: security and compliance, infrastructure, database administration tools, user management, storage, cost control, and updates.
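As a rough illustration of the cost-control and administration automation this role touches, here is a minimal sketch in Python, assuming the snowflake-connector-python package; the account, credentials, role, and warehouse names are placeholders, not values from the posting.

```python
# Minimal sketch: pull per-warehouse credit usage from ACCOUNT_USAGE as a
# simple cost-control check. Connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="dba_user",        # hypothetical user
    password="***",         # use a secrets manager in practice
    role="ACCOUNTADMIN",
    warehouse="ADMIN_WH",
)

query = """
    SELECT warehouse_name, SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
"""

cur = conn.cursor()
try:
    cur.execute(query)
    for warehouse, credits in cur:
        print(f"{warehouse}: {credits} credits in the last 7 days")
finally:
    cur.close()
    conn.close()
```

The same pattern (a scheduled query against ACCOUNT_USAGE views) could feed alerts on runaway warehouses or unused storage, which is where the cost-control duty above usually lands in practice.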

Posted 3 months ago

Apply

4 - 10 years

4 - 8 Lacs

Bengaluru

Work from Office


Job Title: AJO Specialist
Primary Skill: AJO

Must-Have Skills: Communication Skills: Strong ability to interact with customers independently with confidence, articulation, and clarity. Digital Marketing Experience: Expertise in Martech platform implementation (Adobe/SFDC) from solution design to implementation, specifically with Adobe Experience Platform (AEP) and Adobe Journey Optimizer (AJO). Database Skills: Proficiency in SQL, Oracle, and Teradata, including querying, working with aggregates, using filter conditions, creating joins, defining keys, and setting indexes. Data Model Design: Strong understanding of marketing database architecture and data modeling. Logical Programming Knowledge: Experience with JavaScript, Python, and PHP, and familiarity with HTML/CSS. Complex Data Models (Preferably XDM): Ability to understand and utilize data models for marketing campaigns. Journey Customization: Familiarity with establishing rules and conditional logic to customize journeys based on customer attributes and behaviors. Cross-functional Collaboration: Strong ability to partner and work across different business units and technology teams with excellent coordination and communication skills.

Good-to-Have Skills: Ability to own project deliverables end-to-end while working independently. Prior experience with CDP (Customer Data Platform) / Adobe Campaign. Data engineering skills.

Skills & Experience: 2+ years of implementation experience with Adobe Experience Cloud products, especially Adobe Experience Platform (AEP) and Journey Optimizer (AJO). 6+ years of overall experience in digital marketing and campaign management. Expertise in deploying, configuring, and optimizing Experience Platform services and Journey Optimizer features. Strong SQL skills for querying, data transformation, and cleansing. Hands-on experience in: developing custom applications, workflows, and integrations using Experience Platform APIs; configuring Experience Events (XDM schemas) for capturing data from various sources; creating audience segments using both custom and AI/ML-based segmentation; triggering journeys and activations based on segments and profiles; implementing journey automation, actions, and messaging; integrating with CRM, email, and other destination platforms. Strong development experience with JavaScript for API calls and Java for extending functionality with custom connectors or applications. Expertise in multi-channel data modeling using the Experience Data Model (XDM). Knowledge of configuring IMS authentication for external system connectivity. Experience in debugging, optimizing, and troubleshooting custom server-side applications. Strong understanding of multi-channel data management, segmentation, orchestration, automation, and personalization, and of data structures like JSON (for API representation), data tables (for tabular data processing), and streaming data flows. Experience with automation tools such as Postman for API testing and Git for source control. Excellent documentation and communication skills, with the ability to present technical recommendations effectively. Proven experience in troubleshooting API integrations, data ingestion, segmentation, and destination-related issues. Self-motivated problem solver, capable of managing resource constraints, shifting priorities, and tight deadlines. Continuous improvement mindset to optimize and enhance deployed solutions.

Tags: SQL, HTML & CSS, Adobe, PHP, Python
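The database-skills item above (joins, aggregates, filter conditions, keys, indexes) can be illustrated with a small, self-contained query. The sketch below uses Python's built-in sqlite3 engine so it runs anywhere, although the role itself targets Teradata/Oracle; all table and column names are made up.

```python
# Self-contained sketch of a join + filter + aggregate query of the kind the
# posting describes. Uses stdlib sqlite3 for portability; a real deployment
# would point the same SQL shape at Teradata or Oracle. Names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE journeys  (journey_id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers(customer_id),
                            channel TEXT, opened INTEGER);
    CREATE INDEX idx_journeys_customer ON journeys(customer_id);

    INSERT INTO customers VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO journeys  VALUES (10, 1, 'email', 1), (11, 1, 'push', 0),
                                 (12, 2, 'email', 1);
""")

# Join, filter, and aggregate: email open rate per region.
rows = conn.execute("""
    SELECT c.region, AVG(j.opened) AS open_rate, COUNT(*) AS sends
    FROM journeys j
    JOIN customers c ON c.customer_id = j.customer_id
    WHERE j.channel = 'email'
    GROUP BY c.region
    ORDER BY open_rate DESC
""").fetchall()

for region, open_rate, sends in rows:
    print(region, round(open_rate, 2), sends)
conn.close()
```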

Posted 3 months ago

Apply

3 - 6 years

7 - 11 Lacs

Chennai, Mumbai

Work from Office


The AI, Data, and Analytics (AIDA) organization, part of Pfizer Digital, is responsible for the development and management of all data and analytics tools and platforms across the enterprise - from global product development, to manufacturing, to commercial, to point of patient care across over 100 countries. One of the team's top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products which will serve as an enabler for the company's digital transformation to bring innovative therapeutics to patients.

Role Summary: We are looking for a technically skilled and experienced Reporting Engineering Senior Associate who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences. In this position, you will be responsible for developing crucial business operations reports and dashboard products that drive company performance through continuous monitoring, measuring, root-cause identification, and proactive identification of patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with the lead architect and lead engineers to develop reporting capabilities that elevate Customer Experience. It requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Senior Associate will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

Role Responsibilities: Engineering developer of business intelligence and data visualization products in service of the field force and HQ enabling functions. Act as a technical BI & Visualization developer on projects and collaborate with global team members (e.g. other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale. Build a thorough understanding of data, business, and analytic requirements (incl. BI Product Blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines. Deliver quality Functional Requirements and Solution Design, adhering to established standards and best practices. Follow Pfizer processes in Portfolio Management, Project Management, and the Product Management Playbook, following the Agile, Hybrid, or Enterprise Solution Life Cycle. Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS. Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g. AWS Redshift, MS SQL, Snowflake, Oracle, Teradata).

Qualifications: Bachelor's degree in a technical area such as computer science, engineering, or management information science. Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred. Domain knowledge in the pharmaceutical industry preferred. Good knowledge of data governance and data cataloging best practices. 2+ years of relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management. Strong business analysis acumen to meet or exceed business requirements following User-Centered Design (UCD). Working experience with testing of BI and analytics applications - unit testing (e.g. phased or Agile sprints or MVP), system integration testing (SIT), and user acceptance testing (UAT). Experience with technical solution management tools such as JIRA or GitHub. Stay abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools.

Technical Skillset: 2+ years of hands-on experience in developing BI capabilities using MicroStrategy. Proficiency in common BI tools, such as Tableau, Power BI, etc., is a plus. Understanding of dimensional data modelling principles (e.g. star schema). Develop using a Design System for reporting as well as an ad hoc analytics template. Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable. Experience with AWS services (EC2, EMR, RDS, Spark) is preferred. Solid understanding of Scrum/Agile is preferred, along with working knowledge of CI/CD, GitHub, and MLflow. Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred. Great communication skills. Great business influencing and stakeholder management skills. Information & Business Tech #LI-PFE

Posted 3 months ago

Apply

15 - 16 years

15 - 17 Lacs

Chennai, Mumbai

Work from Office


ROLE SUMMARY: The AI, Data, and Analytics (AIDA) organization, part of Pfizer Digital, is responsible for the development and management of all data and analytics tools and platforms across the enterprise - from global product development, to manufacturing, to commercial, to point of patient care across over 100 countries. One of the team's top priorities is the development of Business Intelligence (BI), Reporting, and Visualization products which will serve as an enabler for the company's digital transformation to bring innovative therapeutics to patients. We are looking for a technically skilled and experienced Reporting Engineering Manager who is passionate about developing BI and data visualization products for our Customer Facing and Sales Enablement colleagues, totaling over 20,000 individuals. This role works across multiple business segments globally to deliver best-in-class BI Reporting and Visualization capabilities that enable impactful business decisions and cohesive, high-engagement user experiences. In this position, you will be accountable for a thorough understanding of data, business, and analytic requirements in order to deliver high-impact, relevant interactive data visualization products that drive company performance through continuous monitoring, measuring, root-cause identification, and proactive identification of patterns and triggers across the company to optimize performance. This role will also drive best practices and standards for BI & Visualization, and will work closely with stakeholders to understand their needs and ensure that reporting assets are created with a focus on Customer Experience. It requires working with complex and advanced data environments, employing the right architecture to build scalable semantic layers and contemporary reporting visualizations. The Reporting Manager will ensure data quality and integrity by validating the accuracy of KPIs and insights, resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data- and analytics-driven solutions for the pharmaceutical industry.

ROLE RESPONSIBILITIES: Engineering expert in business intelligence and data visualization products in service of the field force and HQ enabling functions. Act as a technical BI & Visualization developer on projects and collaborate with global team members (e.g. other engineers, regional delivery and activation teams, vendors) to architect, design, and create BI & Visualization products at scale. Build a thorough understanding of data, business, and analytic requirements (incl. BI Product Blueprints such as SMART) to deliver high-impact, relevant data visualization products while respecting project or program budgets and timelines. Deliver quality Functional Requirements and Solution Design, adhering to established standards and best practices. Follow Pfizer processes in Portfolio Management, Project Management, and the Product Management Playbook, following the Agile, Hybrid, or Enterprise Solution Life Cycle. Extensive technical and implementation knowledge of a multitude of BI and visualization platforms, including but not limited to Tableau, MicroStrategy, Business Objects, and MS-SSRS. Experience with cloud-based architectures, cloud analytics products/solutions, and data products/solutions (e.g. AWS Redshift, MS SQL, Snowflake, Oracle, Teradata).

Qualifications: Bachelor's degree in a technical area such as computer science, engineering, or management information science. Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred. Domain knowledge in the pharmaceutical industry preferred. Good knowledge of data governance and data cataloging best practices. Relevant experience or knowledge in areas such as database management, data quality, master data management, metadata management, performance tuning, collaboration, and business process management. Strong business analysis acumen to meet or exceed business requirements following User-Centered Design (UCD). Strong experience with testing of BI and analytics applications - unit testing (e.g. phased or Agile sprints or MVP), system integration testing (SIT), and user acceptance testing (UAT). Experience with technical solution management tools such as JIRA or GitHub. Stay abreast of customer, industry, and technology trends in enterprise Business Intelligence (BI) and visualization tools.

Technical Skillset: 4+ years of hands-on experience in developing BI capabilities using MicroStrategy. Proficiency in common BI tools, such as Tableau, Power BI, etc., is a plus. Common Data Model (logical and physical) and conceptual data model validation to create a consumption layer for reporting (dimensional model, semantic layer, direct database aggregates, or OLAP cubes). Develop using a Design System for reporting as well as an ad hoc analytics template. BI product scalability and performance tuning; platform administration and security; BI platform tenant management (licensing, capacity, vendor access, vulnerability testing). Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable. Experience with AWS services (EC2, EMR, RDS, Spark) is preferred. Solid understanding of Scrum/Agile is preferred, along with working knowledge of CI/CD, GitHub, and MLflow. Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred. Great communication skills. Great business influencing and stakeholder management skills. Information & Business Tech #LI-PFE

Posted 3 months ago

Apply

4 - 8 years

11 - 15 Lacs

Chennai, Pune, Bengaluru

Hybrid


Greetings from TSIT Digital!! This is with regard to an excellent opportunity with us: if you have that unique and unlimited passion for building world-class enterprise software products that turn into actionable intelligence, then we have the right opportunity for you and your career. This is an opportunity for permanent employment with TSIT Digital.
What are we looking for: Ab Initio Developer
Experience: 4 - 8 years
Location: Pune/Chennai/Bangalore (work from the office 3 days a week)
Job Description: Minimum 5+ years of experience working as an Ab Initio Developer in the financial services industry.
Skills (must have): Strong Ab Initio, database (Oracle/Teradata/SQL Server/Hadoop), and Unix development experience.
Good to have: AWS

Posted 3 months ago

Apply

6 - 10 years

25 - 32 Lacs

Bengaluru, Gurgaon

Work from Office


Role: Java + Angular + Teradata
Location: Gurgaon/Bengaluru
Mode: 5 days WFO
Java with Angular, database: Teradata (hardcore developer who has worked extensively with Teradata as a database).
Bachelor's degree in computer science, software engineering, or a related field. 6+ years of experience as a Java full-stack developer with Angular experience and a strong portfolio of projects. Experience developing desktop and mobile applications. Familiarity with common stacks. Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery). Knowledge of multiple back-end languages (e.g. C#, Java, Python) and JavaScript frameworks (e.g. Angular, React, Node.js).

Posted 3 months ago

Apply

5 - 8 years

20 - 25 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


Job Title: Senior GCP Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 5 to 8 years
Must Have Skills: Data Engineering, Teradata, GCP/Azure
Good to Have Skills: Python
Roles and Responsibilities:
• Design and implement data pipelines using GCP services for one or more projects.
• Manage deployments of data applications and ensure efficient orchestration of services.
• Implement CI/CD pipelines using Jenkins or cloud-native tools to automate data pipeline deployment, testing, and integration with other services, ensuring quick iterations and deployments.
• Guide a team of data engineers in building and maintaining scalable, high-performance data pipelines.
• Build data pipelines and ETL/ELT processes leveraging Python, Beam, and SQL scripts.
• Willingness and ability to learn and adapt to new technologies as needed during client engagements.
• Continuously monitor and optimize data workflows for performance and cost-effectiveness.
• Design workflows to integrate data from various sources using GCP services and orchestrate complex tasks with Cloud Composer (Apache Airflow).
• Set up monitoring, logging, and alerting using Cloud Monitoring, Datadog, or other tools to ensure visibility into pipeline performance and quickly identify and resolve issues.
• Guide and mentor junior developers and data engineers, helping them overcome technical challenges and ensuring high-quality code and solutions.
• Work closely with application developers, data architects, and business stakeholders to define and deliver robust data-driven solutions.
• Work on data migration from various databases/legacy DW systems built on Oracle, Teradata, SQL Server, etc., to the GCP cloud data platform.
• Facilitate agile processes like sprint planning, daily scrums, and backlog grooming.
• Interact with client stakeholders on assigned data, BI, and analytics programs.
• Work closely with the program leadership team on stakeholder management, governance, and communication.
Location: Bangalore/Hyderabad/Chennai
CTC Range: 20 LPA - 24 LPA (Lakhs Per Annum)
Notice period: Immediate
Shift Timing: General
Mode of Interview: Virtual
Mode of Work: Hybrid
Jyoti, Team Lead, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, INDIA. jyotirekha@blackwhite.in | www.blackwhite.in
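To make the Beam/BigQuery pipeline work described above a little more concrete, here is a minimal batch-pipeline sketch assuming the apache-beam[gcp] package; the project, bucket, table, and schema are placeholder assumptions, not details from the posting.

```python
# Minimal batch pipeline sketch with Apache Beam: read CSV rows from GCS,
# transform them, and load them into BigQuery. All names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DataflowRunner",          # use "DirectRunner" for local testing
    project="my-gcp-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCsv" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv",
                                            skip_header_lines=1)
        | "Parse" >> beam.Map(parse_row)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

In a Cloud Composer setup, a pipeline like this would typically be parameterized and triggered from an Airflow DAG rather than run ad hoc.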

Posted 3 months ago

Apply

2 - 7 years

8 - 18 Lacs

Bengaluru

Work from Office


DESCRIPTION: AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers, including the public sector. Amazon Web Services is the global market leader and technology forerunner in the cloud business. As a member of the AWS Support team in Amazon Web Services, you will be at the forefront of this transformational technology, assisting a global list of companies and developers that are taking advantage of a growing set of services and features to run their mission-critical applications. As a Cloud Support Engineer, you will act as the 'Cloud Ambassador' across all the cloud products, arming our customers with the required tools and tactics to get the most out of their Product and Support investment. Would you like to use the latest cloud computing technologies? Do you have an interest in helping customers understand application architectures and integration approaches? Are you familiar with best practices for applications, servers and networks? Do you want to be part of a customer facing technology team in India helping to ensure the success of Amazon Web Services (AWS) as a leading technology organization? If you fit the description, you might be the person we are looking for! We are a team passionate about cloud computing, and believe that world class support is critical to customer success.

Key job responsibilities: - Diagnose and resolve issues related to Kafka performance, connectivity, and configuration. - Monitor Kafka clusters and perform regular health checks to ensure optimal performance. - Collaborate with development teams to identify root causes of problems and implement effective solutions. - Provide timely and effective support to customers via email, chat, and phone. - Create and maintain documentation for troubleshooting procedures and best practices. - Assist in the deployment and configuration of Kafka environments, including brokers, producers, and consumers. - Conduct training sessions and provide knowledge transfer to team members. - You will be continuously learning groundbreaking technologies, and developing new technical skills and other professional competencies. - You will act as an interviewer in hiring processes, and coach/mentor new team members.

A day in the life: • First and foremost this is a customer support role – in The Cloud. • On a typical day, a Support Engineer will be primarily responsible for solving customers' cases through a variety of customer contact channels which include telephone, email, and web/live chat. You will apply advanced troubleshooting techniques to provide tailored solutions for our customers and drive customer interactions by thoughtfully working with customers to dive deep into the root cause of an issue. • Apart from working on a broad spectrum of technical issues, an AWS Support Engineer may also coach/mentor new hires, develop and present training, partner with development teams on complex issues or contact deflection initiatives, participate in new hiring, write tools/scripts to help the team, or work with leadership on process improvement and strategic initiatives to ensure better CX and compliance with global AWS standards, practices and policies. • Career development: We promote advancement opportunities across the organization to help you meet your career goals. • Training: We have training programs to help you develop the skills required to be successful in your role. • We hire smart people who are keen to build a career with AWS, so we are more interested in the areas that you do know instead of those you haven't been exposed to yet. • Support engineers interested in travel have presented training or participated in focused summits across our sites or at specific AWS events. AWS Support is a 24/7/365 operation and shift work will be required in the afternoon, i.e. 1 PM to 10 PM IST.

About the team: Diverse Experiences - AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture - Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship & Career Growth - We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance - We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

BASIC QUALIFICATIONS: - Bachelor's degree OR equivalent experience in a technical position; requires a minimum of 2+ years' experience in a relevant technical position. - Exposure to database fundamentals and general troubleshooting (tuning and optimization, deadlocks, keys, normalization) in any relational database engine (MySQL, PostgreSQL, Oracle, SQL Server) OR exposure to search services fundamentals and troubleshooting (indices, JVM memory analysis, and CPU utilization) for key open source products like Elasticsearch and Solr OR exposure to streaming services like Kafka/Kinesis. - Experience in Business Analytics application, support, and troubleshooting concepts; experience with system administration and troubleshooting with Linux (Ubuntu, CentOS, RedHat) and/or Microsoft Windows Server and associated technologies (Active Directory); experience with networking and troubleshooting (TCP/IP, DNS, OSI model, routing, switching, firewalls, LAN/WAN, traceroute, iperf, dig, URL or related).

PREFERRED QUALIFICATIONS: - Experience in a customer support environment and experience in analyzing, troubleshooting, and providing solutions to technical issues. - Knowledge of data warehousing and ETL processes; understanding of cloud computing concepts; experience in scripting or developing in at least one of the following languages: Python, R, Ruby, Go, Java, .NET (C#), JavaScript. - Expertise in any one data warehouse technology (for example Redshift, Teradata, Exadata, or Snowflake) OR expertise in search services products like Elasticsearch/Solr; expertise in streaming services like Kafka/Kinesis.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
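As an illustration of the Kafka health checks named in the responsibilities above, here is a minimal consumer-lag check in Python, assuming the kafka-python package, a reachable broker, and hypothetical topic and consumer-group names.

```python
# Minimal consumer-lag check: compare a consumer group's committed offsets
# against the latest offsets on the broker. Broker address, topic, and group
# names are placeholder assumptions.
from kafka import KafkaConsumer, TopicPartition

consumer = KafkaConsumer(
    bootstrap_servers="localhost:9092",
    group_id="orders-service",        # hypothetical consumer group
    enable_auto_commit=False,
)

partitions = [TopicPartition("orders", p)
              for p in consumer.partitions_for_topic("orders")]
consumer.assign(partitions)

end_offsets = consumer.end_offsets(partitions)
for tp in partitions:
    committed = consumer.committed(tp) or 0
    lag = end_offsets[tp] - committed
    print(f"{tp.topic}[{tp.partition}] lag={lag}")

consumer.close()
```

Persistent, growing lag on a partition is a common first signal in the kind of performance and connectivity triage this role describes.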

Posted 3 months ago

Apply

3 - 6 years

25 - 30 Lacs

Noida

Work from Office


Analytics Associate - Zenon Analytics is a global boutique consulting firm and the leading AI partner for Fortune 500 firms. We partner with clients across the globe to identify their highest-value opportunities, address their most critical challenges, and transform their enterprises using advanced analytics.

In a nutshell: An Analytics Associate at Zenon Analytics is responsible for analyzing data and deriving actionable insights for timely decision making by our clients. A person in this role will be responsible for client engagements, understanding client requirements, problem structuring, extracting requisite data, performing extensive data analyses, and building statistical predictive models. The ideal candidate should have exceptional coding skills (Python/R) and strong analytical skills, including theoretical and practical knowledge of machine learning algorithms, as well as excellent soft skills to support client engagements. They will work closely with highly experienced leaders in designing, developing, and implementing cutting-edge analytics solutions for a variety of business problems.

What will you do: Engage with clients to understand their business needs from an analytical standpoint and accordingly define the problem statement(s) to focus on. Understand the data requirements to solve the problem and perform exhaustive exploratory analyses. Build statistical and/or predictive models on the data to assist in better business decision making. Test the performance of the models and deploy them into production by closely working with the client teams. Prepare client deliverables, followed by engaging with the clients to discuss results and recommendations. Be passionate about applying analytics to real-world business situations and producing measurable improvements for clients. Be comfortable with ambiguity and truly enjoy the challenge of creating new solutions for difficult Big Data problems. Bring in analytical expertise to assist with Business Development proposals.

What are we looking for: Strong academic background, preferably from Tier I institutes, with a major in Engineering/Mathematics/Statistics. Self-driven, with the ability to work in a fast-paced, dynamic environment. 3-6 years of experience in data analytics with practical exposure to building predictive models. Analytical thinker with strong problem structuring and solving abilities. Ability to work in a systematic manner with a focus on quality of work and efficiency. Strong communication skills. Data science expertise: hands-on experience in Python, R, and SQL; work with NLP on unstructured/semi-structured data; prior experience with pre-trained models such as BERT and GloVe; work on neural networks with experience in TensorFlow and PyTorch; model development using RNN, CNN, and LSTM; machine learning model development with hands-on experience in Random Forest, XGBoost, SVM, etc.; experience in model performance tuning.

Good to have: Financial services or healthcare industry experience will be a plus. Experience in UiPath AI Fabric. Work on AWS cloud. Experience in big data (Hadoop, Teradata); PySpark exposure. Experience in model governance and deployment.
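To make the model-building and tuning expectations above concrete, here is a minimal scikit-learn sketch of fitting and tuning a Random Forest on a toy dataset; the dataset and parameter grid are illustrative only, not part of the posting.

```python
# Minimal model-building and tuning sketch: Random Forest with a small grid
# search, evaluated on a held-out split. Dataset and grid are illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 8]},
    scoring="roc_auc",
    cv=5,
)
grid.fit(X_train, y_train)

test_auc = roc_auc_score(y_test, grid.predict_proba(X_test)[:, 1])
print("best params:", grid.best_params_)
print("test AUC:", round(test_auc, 3))
```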

Posted 3 months ago

Apply

3 - 6 years

27 - 32 Lacs

Noida

Work from Office


Associate (Analytics) - Zenon Analytics is a global boutique consulting firm and the leading AI partner for Fortune 500 firms. We partner with clients across the globe to identify their highest-value opportunities, address their most critical challenges, and transform their enterprises using advanced analytics.

In a nutshell: An Associate (Analytics) at Zenon Analytics is responsible for analysing data and deriving actionable insights for timely decision making by our clients. A person in this role will be responsible for client engagements, understanding client requirements, problem structuring, extracting requisite data, performing extensive data analyses, and building statistical predictive models. The ideal candidate should have exceptional coding skills (Python/R) and strong analytical skills, including theoretical and practical knowledge of putting ML models into production, as well as excellent soft skills to support client engagements. They will work closely with highly experienced leaders in designing, developing, and implementing cutting-edge analytics solutions for a variety of business problems.

What will you do: Engage with clients to understand their business needs from an analytical standpoint and accordingly define the problem statement(s) to focus on. Understand the data requirements to solve the problem and perform exhaustive exploratory analyses. Build statistical and/or predictive models on the data to assist in better business decision making. Test the performance of the models and deploy them into production by closely working with the client teams. Prepare client deliverables, followed by engaging with the clients to discuss results and recommendations. Be passionate about applying analytics to real-world business situations and producing measurable improvements for clients. Be comfortable with ambiguity and truly enjoy the challenge of creating new solutions for difficult Big Data problems. Bring in analytical expertise to assist with Business Development proposals.

What are we looking for: Strong academic background, preferably from Tier I institutes, with a major in Engineering/Mathematics/Statistics. Self-driven, with the ability to work in a fast-paced, dynamic environment. 3-6 years of experience in putting ML models into production. Analytical thinker with strong problem structuring and solving abilities. Ability to work in a systematic manner with a focus on quality of work and efficiency. Strong communication skills.
MLOps and data science expertise: hands-on experience in Python, R, and SQL. Knowledge of building traditional and advanced ML models for structured and unstructured data and putting models into production. Expertise in one of the API frameworks such as FastAPI or Flask.
Engineering: deep hands-on experience with one or more of the following cloud and AI technologies: cloud-native architectures (containers, Kubernetes, serverless); DataOps and MLOps (Kubeflow, Ray); tightly-coupled and loosely-coupled HPC frameworks; deep learning and ML frameworks (PyTorch, TensorFlow). Knowledge of at least one infrastructure stack (AWS, Azure, GCP, Docker, Kubernetes, Kafka, Spark, etc.). Experience in agile development, utilizing Git, MLflow, CI/CD, and MLOps frameworks.
Architecture: the ability to create and explain 3-tier architecture diagrams, system context diagrams, system interaction diagrams, etc. Model management, MLOps, model deployment approaches, and model monitoring and tuning with respect to at least one of the major cloud providers. Hands-on working experience with at least one MLOps platform such as AWS SageMaker, Dataiku, or DataRobot. Ability to design a solution approach and lead a team to deploy and maintain it in production.

Good to have: Financial services or healthcare industry experience will be a plus. Experience in UiPath AI Fabric. Work on AWS cloud. Experience in big data (Hadoop, Teradata); PySpark exposure.
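The API-framework and model-deployment skills listed above lend themselves to a small illustration. Below is a minimal sketch of serving a pickled model behind a FastAPI prediction endpoint; the model file, feature names, and service title are hypothetical, not taken from the posting.

```python
# Minimal model-serving sketch with FastAPI: load a pickled model at startup
# and expose a /predict endpoint. Artifact path and feature schema are
# illustrative assumptions (the model is assumed to expose predict_proba).
import pickle

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="churn-model")

with open("model.pkl", "rb") as f:      # hypothetical artifact from training
    model = pickle.load(f)

class Features(BaseModel):
    tenure_months: float
    monthly_spend: float
    support_tickets: int

@app.post("/predict")
def predict(features: Features):
    row = [[features.tenure_months, features.monthly_spend,
            features.support_tickets]]
    score = float(model.predict_proba(row)[0][1])
    return {"churn_probability": score}

# Run locally with: uvicorn serve:app --reload   (assuming this file is serve.py)
```

In a production MLOps setup, the same endpoint would typically be containerized and fronted by the orchestration and monitoring stack the posting mentions.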

Posted 3 months ago

Apply

14 - 20 years

27 - 42 Lacs

Hyderabad

Hybrid


In this role, you will: Define secure configuration baselines for database management system software, including but not limited to Oracle, Db2, SAP ASE, SQL Server, Db2 z/OS, MongoDB, PostgreSQL, Teradata, and Hadoop. Work with database technical subject matter experts to agree on secure configuration baselines. Work with database technical subject matter experts to define, develop, and implement checks for compliance scans. Work with database technical subject matter experts to provide remediation guidance for IT Service Owners. Work with the Configuration Baseline Management team to ensure they receive configuration compliance data. Interact with stakeholders across the organisation to understand their security needs and expectations. Define and maintain the capability strategy, supported by Enterprise Architecture, Security Architecture, and Control Owners, in response to business strategies, regulator expectations, technology and practice advancement, best practice, and threat actor evolution [will overlap with Architecture]. Ensure success with delivery partners (in alignment with support functions). Run/drive the respective Delivery forum, QBRs, SteerCos, and Capability PODs. Maintain and prioritise a capability backlog based on objectives and value released to identify what teams work on next. Support the prioritisation of backlogs from supporting technology and operations/service teams. Lead vendor relationships for owned technologies. Evaluate and adopt new technologies and practices which may impact the capability's needs and/or control environment. Monitor and communicate progress of capability performance through agreed indicators and metrics. Work closely with Control Owners: oversee Control Owner activity from a technical point of view, e.g. accurate assessment of control defect severities. Work closely with Service Owners: understand general performance of associated services, exceptions, customer feedback, and service uplift roadmaps. Work closely with Technology/Platform Owners: understand general performance of associated IT services, significant bugs, technology health, customer feedback, and technology uplift roadmaps (including technical debt resolution). Run a Pod per L2 capability with Architecture, Engineering, Service Delivery, Control Owner, Programme Manager, and Product Management. Own all medium-rated and below risk Control Issues, Audit points, and Regulatory findings.

To be successful in this role, you should meet the following requirements: Any graduation. Experience: 14+ years. Minimum 5 years of in-depth experience with multiple database technologies from the list of Oracle, Db2, SAP ASE, SQL Server, Db2 z/OS, MongoDB, PostgreSQL, Teradata, and Hadoop. Demonstrated experience with database platform security. Minimum 2 years of experience leading a technical team. Demonstrated understanding of and experience with Center for Internet Security (CIS) benchmarks. Strong stakeholder management skills, with demonstrated experience of understanding and meeting the needs of multiple stakeholders. Excellent communication skills, including the ability to translate complex technical concepts into business-friendly language. Customer-centric consultancy approach. Strong analytical and problem-solving skills. Ability to manage budgets and allocate resources effectively. Resilient and adaptive to changing situations, with a strong desire to delegate and empower the team.
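To illustrate the kind of compliance check this role defines, here is a minimal sketch that compares a few PostgreSQL settings against a CIS-style baseline using psycopg2; the baseline values, host, and credentials are illustrative assumptions rather than an official benchmark.

```python
# Sketch of an automated configuration-compliance check: verify a handful of
# PostgreSQL parameters against a CIS-style secure baseline. Baseline values
# and connection details are illustrative placeholders.
import psycopg2

BASELINE = {                      # assumed desired values, not an official list
    "log_connections": "on",
    "log_disconnections": "on",
    "ssl": "on",
}

conn = psycopg2.connect(host="db.example.internal", dbname="postgres",
                        user="scanner", password="***")
cur = conn.cursor()

findings = []
for setting, expected in BASELINE.items():
    cur.execute("SELECT setting FROM pg_settings WHERE name = %s", (setting,))
    row = cur.fetchone()
    actual = row[0] if row else "<missing>"
    if actual != expected:
        findings.append((setting, expected, actual))

for name, expected, actual in findings:
    print(f"NON-COMPLIANT: {name} expected={expected} actual={actual}")

cur.close()
conn.close()
```

A real compliance scan would cover the full benchmark per engine and feed its results to the Configuration Baseline Management team mentioned above.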

Posted 3 months ago

Apply

5 - 9 years

10 - 14 Lacs

Mumbai

Work from Office


The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role is involved in design, development, troubleshooting, and issue resolution, and in upgrading, enhancing, and optimizing the technical solution. It involves continuous integration and continuous deployment of changes to the business logic implementation as requirements evolve. It includes interactions with internal stakeholders and/or clients to explain technology solutions, and requires a clear understanding of the client's business requirements in order to guide the optimal design/solution to meet their needs. The ability to communicate to both technical and non-technical audiences is key.

Job Description:
Must-have skills: Database (SQL Server/Snowflake/Teradata/Redshift/Vertica/Oracle/BigQuery/Azure DW, etc.). ETL (Extract, Transform, Load) tool (Talend, Informatica, SSIS, DataStage, Matillion). Python, UNIX shell scripting, project resource management. Workflow orchestration (Tivoli, Tidal, Stonebranch). Client-facing skills.
Good-to-have skills: Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred.

Key responsibilities: Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation. Strong understanding of ETL processes as well as database skills and common IT offerings, i.e. storage, backups, and operating systems. Strong understanding of SQL and database programming languages. Strong knowledge of development methodologies and tools. Contribute to design and oversee code reviews for compliance with development standards. Design and implement the technical vision for existing clients. Able to convert documented requirements into technical solutions and implement them within the given timeline and with quality. Able to quickly identify solutions for production failures and fix them. Document project architecture, explain detailed design to the team, and create low-level to high-level design. Perform mid- to complex-level tasks independently. Support clients, data scientists, and analytical consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills. Ability to lead/work on 1-2 projects with a team size of 2-3 members. Code management, which includes code review and deployments.

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 3 months ago

Apply

5 - 10 years

10 - 20 Lacs

Chennai, Hyderabad

Work from Office


Required Skills: Proficiency in Python and SQL, with a strong focus on writing optimized queries and scripts. Extensive experience working with GCP services, especially BigQuery, Cloud Composer (Airflow), Dataproc, and GCS. Strong experience migrating data from Teradata to modern cloud data warehouses like BigQuery, ensuring seamless data transformation and load processes. Familiarity with ETL/ELT processes and data modeling concepts, with an emphasis on cloud data architectures. Understanding of CI/CD tools like Jenkins and GitLab for continuous integration and deployment. Basic exposure to distributed processing frameworks like Apache Spark. Good problem-solving skills and the ability to work in a collaborative team environment.
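To ground the Teradata-to-BigQuery migration skill above, here is a minimal sketch using the google-cloud-bigquery client to load exported files from GCS into a target table; the project, bucket, dataset, and table names are placeholders, not details from the posting.

```python
# Minimal sketch: load CSV extracts (e.g. exported from Teradata to GCS) into
# a BigQuery table with schema autodetection. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-migration-bucket/exports/orders_*.csv",
    "my-gcp-project.legacy_dw.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-gcp-project.legacy_dw.orders")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```

A production migration would normally pin an explicit schema and add validation queries rather than rely on autodetection, but the load step itself follows this pattern.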

Posted 3 months ago

Apply

2 - 6 years

7 - 17 Lacs

Hyderabad

Work from Office


About this role: Wells Fargo is seeking an Analytics Associate Manager.

In this role, you will: Supervise entry- to mid-level individual contributors in transactional or tactical reporting and analytical tasks to ensure timely completion, quality, and compliance. Evaluate and identify strategies for developing and utilizing technical and analytic resources to enable initiatives and projects that predict, improve, support, and measure business success. Manage and prioritize multiple projects and maintain necessary processes, controls, and procedures to ensure data accuracy and integrity. Leverage interpretation of data management and data governance regulations, policies, and compliance requirements. Provide analytical assistance for designing, testing, evaluating, implementing, measuring, monitoring, and supporting business strategies. Work with clients to define information needs to support business processes, initiatives, and projects. Present findings and solutions to contribute to business success. Collaborate and consult with peers, colleagues, and mid-level managers. Manage allocation of people and financial resources. Be responsible for selection, evaluation, mentoring, and development of staff to ensure client service obligations are met.

Required Qualifications: 2+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. 1+ years of leadership experience.

Desired Qualifications: Experience with SAS and GCP will be an added advantage. Experience in Agile methodology and leveraging Jira tools for workflow and productivity management. 2+ years of experience in one or a combination of the following: administrative support, team management, project management, business operations, or strategic planning in financial services, demonstrated through work or military experience. 2+ years of business systems analysis experience. 2+ years of data warehouse experience. 2+ years of data modelling experience. 2+ years of SQL/Teradata experience. 2+ years of Linux experience. 2+ years of Alteryx experience. 2+ years of team management experience. Strong analytical skills with high attention to detail and accuracy. Experience in an onshore/offshore support model. Strong presentation, communication, writing, and interpersonal skills. Experience working with business leaders and being able to translate needs into technical requirements. Understanding of the importance of control, governance, and audit procedures.

Job Expectations: Manage a team of analytics consultants to expand offshore support for the Data Enablement function within CAR (Conduct Analytics and Reporting). Lead the strategy and resolution of highly complex and unique challenges requiring analytical application of industry techniques and advanced data from multiple sources across the enterprise. Deliver solutions that are long-term, large-scale, and require vision, innovation, and coordination of highly complex activities. Provide vision, direction, and expertise to senior leadership on implementing innovative and significant business solutions that align to the enterprise. Lead the requirements design and elicitation for small and large data initiatives. Self-direct and lead others to complete multiple tasks and meet aggressive time frames. Be a strong influencer and consensus builder with a positive attitude, who motivates self and others. Collaborate across people, processes, and technology/tools within a highly intensive and complex data environment. Ensure adherence to compliance and legal regulations and policies on all projects managed. Strategically engage with all levels of professionals and managers across the enterprise and serve as an expert advisor to leadership. Communicate effectively, in both written and verbal formats. Be comfortable presenting and tailoring communication approaches based on the audience. Present findings and solutions to contribute to business success. Collaborate and consult with peers, colleagues, and mid-level managers.

Posted 3 months ago

Apply

4 - 7 years

7 - 13 Lacs

Chennai, Hyderabad

Work from Office


Role & responsibilities: Collaborate with senior engineers to design and implement scalable ETL/ELT data pipelines using Python and SQL on GCP platforms. Assist in building data warehouse solutions on BigQuery and optimizing data workflows for efficient processing and storage. Support data migration processes from legacy systems (e.g., Teradata, Hive) to BigQuery. Work closely with cross-functional teams to understand data requirements, perform data modeling, and develop curation layers for analytics and ML model deployment. Troubleshoot and resolve data processing issues to ensure data accuracy, consistency, and availability. Maintain code quality using Git for version control and collaborate on agile development processes using Jira.

Preferred candidate profile: Strong proficiency in Teradata for data engineering tasks. Strong understanding and experience with distributed computing principles and frameworks like Hadoop, Apache Spark, etc. Advanced experience with GCP services, including BigQuery, Dataflow, Cloud Composer (Airflow), and Dataproc. Expertise in data modeling, ETL/ELT pipeline development, and workflow orchestration using Airflow DAGs. Hands-on experience with data migration from legacy systems (Teradata, Hive) to cloud platforms (BigQuery). Familiarity with streaming data ingestion tools like Kafka and NiFi. Strong problem-solving skills and experience with performance optimization in large-scale data environments. Proficiency in CI/CD tools (Jenkins, GitLab) and version control systems (Git). GCP Professional Data Engineer certification.
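A minimal sketch of the Airflow-orchestrated ELT step described above, assuming Airflow 2.x with the Google provider package installed; the DAG id, SQL, and table names are illustrative placeholders.

```python
# Minimal Cloud Composer / Airflow DAG sketch: run a daily BigQuery SQL job
# that refreshes a curated table from a raw layer. Names and SQL are invented.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_curation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    refresh_curated = BigQueryInsertJobOperator(
        task_id="refresh_curated_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE curated.orders AS
                    SELECT order_id, customer_id, SUM(amount) AS total_amount
                    FROM raw.order_lines
                    GROUP BY order_id, customer_id
                """,
                "useLegacySql": False,
            }
        },
    )
```

Real pipelines would chain several such tasks (extract, load, curate, validate) and parameterize table names, but the orchestration pattern is the same.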

Posted 3 months ago

Apply

4 - 9 years

7 - 17 Lacs

Chennai, Hyderabad

Work from Office


Role & responsibilities: Immediate joiners or a notice period of up to 20 days. A strong background in Google Cloud Platform (GCP). Hands-on experience with Dataflow (Apache Beam) for stream and batch data processing. Expertise in BigQuery, including query optimization, partitioning, clustering, and cost management. Good experience in Teradata. Proficiency in SQL and Python for data processing and automation. Familiarity with Cloud Storage, Cloud Pub/Sub, and Cloud Functions. Familiarity with Cloud Composer (Apache Airflow) for workflow orchestration. Strong understanding of data modeling, schema design, and performance tuning. Knowledge of IAM roles, security best practices, and data governance in GCP. Experience working with CI/CD pipelines for deploying data workflows. Excellent problem-solving skills and ability to work in a fast-paced environment.
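To make the Dataflow (Apache Beam) requirement concrete, here is a minimal batch pipeline sketch that reads newline-delimited JSON from Cloud Storage, filters it, and writes to BigQuery; it could be submitted with the DataflowRunner. Bucket, table, and field names are assumptions for the example.

```python
# Sketch of a batch Apache Beam pipeline targeting Dataflow.
# Bucket, dataset/table, and field names are illustrative assumptions.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-analytics-project",
    region="us-central1",
    temp_location="gs://my-temp-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromText("gs://raw-events/2024/*.json")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeepValid" >> beam.Filter(lambda e: e.get("user_id") is not None)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-analytics-project:curated.events",
            schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

A streaming variant of the same pipeline would swap ReadFromText for ReadFromPubSub and add windowing, which is where the Pub/Sub familiarity listed above comes in.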

Posted 3 months ago

Apply

4 - 8 years

5 - 9 Lacs

Hyderabad

Work from Office


Role: SSIS + Power BI. Experience: 5+ years. Location: Manikonda SEZ, Hyderabad. Notice period: Immediate joiners. Job Description: This technically focused position is responsible for designing, developing, testing, and deploying business intelligence solutions across the enterprise. The BI Developer is responsible for application creation, administrative tasks, and supporting dashboard and report solution development. The Senior BI Developer will collaborate with Business Analysts, Data Scientists, and Project Managers, provide leadership regarding reporting and BI, and support and develop subject matter knowledge for reporting within the ITG space. The Senior BI Developer will ensure the effective design, build, deployment, documentation, and maintenance of reports against relational and multidimensional databases to support end-user processes. Description of Responsibility (Technical): This individual will lead all aspects of the BI development lifecycle, from requirements gathering to data modeling to report development. The team member will assume subject matter expert responsibilities in developing appropriate strategies, plans, tools, and approaches to ensure successful adoption of BI reports, dashboards, and scorecards. Think analytically and critically to lead reporting standardization and automation efforts. Present data formally and informally and facilitate discussion regarding data outputs. Generate and prepare reports, scorecards, and dashboards to support business objectives. Develop BI reporting architecture, reports, dashboards, guided analytics, and other BI solutions. Produce technical solution designs that meet business requirements in a manner consistent with company and industry standards. Participate in the development and support of reporting solutions, including program modifications, performance tuning, problem solving, debugging, and unit testing. Learn and abide by corporate data governance guidance. Participate in data modeling and loading as required by the BI tool in use. Education: Grow professionally, lead developers in all activities listed above, and provide hands-on technical leadership for development projects. Establish learning curricula or examples to serve as the basis for mentoring other teams. Relationship and Team Building: Exchange detailed information across the department, organization, and enterprise; communicate independently with a thorough understanding of organizational data, sharing technical and clinical expertise or providing direction/instruction. Provide assistance for BI reporting solutions to the rest of the ITS organization. This guidance takes a number of forms, including engaging in joint development with various application development teams, providing second-level support to Customer Service and primary support to the rest of the ITS organization, documenting and communicating standards and procedures, developing and delivering internal training and certification programs, and assisting with problem analysis/resolution. Develop and drive the implementation of management reporting best practices and standards for the team. Collaborate with other teams to integrate solutions where additional data and subject expertise is required. Contribute to multi-disciplinary teams using data to improve usage and operational workflow. Educational Requirements and Technical Experience: Requires a Bachelor's degree.
5+ years of data/process/analysis experience. Fully competent professional with in-depth knowledge of clinical data, analytics, and data science. Healthcare operations and clinical process workflow experience preferred. Desired Technical Skills: Demonstrated extensive experience in large-scale reporting development projects (ad hoc and standard queries). Demonstrated broad SQL development experience with a solid understanding of relational and multi-dimensional designs. Experience with Teradata and SQL Server. Prior certification in at least one BI tool. Expert knowledge of Microsoft Office. Experience in an Agile BI reporting development self-service environment is required. Experience with BI tool architectures, functions, and features desired. Detailed working knowledge of clinical and operational data within a healthcare system. Desired Professional Skills: Detailed working knowledge of the health care industry, applications, clinical workflow issues, regulatory compliance, and federal mandates preferred. Possesses interpersonal skills to work in a high-profile and fast-paced team. Technical aptitude to learn and adapt to new industry applications and tools. Works independently, receives minimal guidance, and acts as a mentor who provides direction to colleagues with less experience. Ability to communicate and facilitate topics clearly and effectively across department lines, working inside peer groups and receiving guidance from a supervisor. Demonstrated oral, written, and technical communication skills required. Demonstrated presentation/public speaking skills and experience required. Possesses good organizational as well as visual and verbal presentation skills. Ability to establish and consistently meet delivery dates. Mandatory Skills: SSIS, Power BI, SQL Development, BI Reporting Development, Data Modeling, Relational and Multidimensional Databases, Report Deployment, Agile Methodologies, Performance Tuning, Business Intelligence (BI) Tools (experience with BI tool architecture and features), Data Governance. Secondary Skills: Teradata, Data Science, Clinical Workflow Knowledge, Operational Data within Healthcare Systems, Agile BI Environment, BI Tool Certification (e.g., Power BI, SSIS), Team Leadership & Mentorship, Process Improvement, Client Interaction, Interpersonal Skills, Microsoft Office Suite, Healthcare Regulatory Compliance & Federal Mandates Knowledge. Keywords: SSIS, Power BI, BI Developer, SQL Server, Teradata, Data Modeling, Reporting Solutions, BI Dashboards, Agile BI, Data Governance, SQL Reporting, Data Science, Healthcare Operations, BI Tools, Performance Tuning, Data Integration, Clinical Data, Reporting Automation, BI Architecture, Mentorship & Leadership, Healthcare Compliance.
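Much of this role's work happens inside SSIS packages and Power BI itself, but the datasets behind those reports usually start as SQL like the hedged sketch below, which pulls an aggregate from SQL Server into pandas for validation before it is surfaced in a dashboard. The server, database, and table/column names are assumptions for the example.

```python
# Sketch: pulling a report-ready aggregate from SQL Server for validation.
# The connection string and table/column names are illustrative assumptions.
import pandas as pd
import pyodbc

QUERY = """
SELECT d.department_name,
       CAST(e.encounter_date AS DATE) AS encounter_day,
       COUNT(*) AS encounter_count
FROM dbo.encounters AS e
JOIN dbo.departments AS d ON d.department_id = e.department_id
GROUP BY d.department_name, CAST(e.encounter_date AS DATE)
ORDER BY encounter_day
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reporting-sql.example.com;DATABASE=ClinicalDW;Trusted_Connection=yes;"
)
df = pd.read_sql(QUERY, conn)   # inspect the aggregate before wiring it into a report
print(df.head())
conn.close()
```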

Posted 3 months ago

Apply

7 - 10 years

15 - 16 Lacs

Chennai

Work from Office


Job Summary: The dx data engineering team is involved in data analytics, ELT, and load semantics of in-house self-service products and certifies the product according to business needs and standards. The dx Data Engineer needs to work on developing and testing applications on backend systems/platforms. Job Description - Core Responsibilities: Expert level in DWH, ETL, RDBMS, SQL, Hadoop/Big Data, Spark, and data modelling. Expert level in the Big Data ecosystem, Cloud (AWS) technologies, and DevOps. Nice to have experience in Docker/Kubernetes/MinIO. Experience in handling enterprise data lake/data warehouse solutions. Responsible for taking accountability in delivering solutions based on data lineages. Supervise the team for best-in-class implementations. Responsible for planning, design, and implementation. Agile in critical problem-solving and solution orientation. Develop end-to-end business applications (backend). Works hand-in-hand with the core team. Good appetite for continuous improvement and innovation. Experience with agile methodology. Success in stakeholder management. Collaborate with product development teams. Consistent exercise of independent judgment and discretion in matters of significance. Proactive participation and effective communication in team discussions. Other duties and responsibilities as assigned. Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Skills: Apache Spark, AWS Cloud Computing, Big Data, Databricks Platform, Data Warehousing (DW), ETL Development, Innovation, Structured Query Language (SQL), Teradata Database. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details. Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Certifications (if applicable). Relevant Work Experience: 7-10 Years. Comcast is proud to be an equal opportunity workplace.
We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
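As a hedged illustration of the Spark-on-AWS data lake work this posting describes, the sketch below reads raw Parquet from S3, applies a simple curation step, and writes a partitioned curated layer back to S3. The bucket paths and column names are assumptions made for the example, not part of the role description.

```python
# Sketch: a small PySpark curation job over an S3-backed data lake.
# Bucket paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dx-curation-sketch").getOrCreate()

raw = spark.read.parquet("s3a://dx-raw-zone/usage_events/")

curated = (
    raw.filter(F.col("event_ts").isNotNull())                 # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))       # derive the partition key
       .groupBy("event_date", "account_id")
       .agg(F.count("*").alias("event_count"))
)

(
    curated.write
           .mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3a://dx-curated-zone/daily_usage/")
)

spark.stop()
```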

Posted 3 months ago

Apply

7 - 10 years

15 - 16 Lacs

Chennai

Work from Office


Job Summary: The dx data engineering team is involved in data analytics, ELT, and load semantics of in-house self-service products and certifies the product according to business needs and standards. Responsible for planning and designing new/existing applications. Assists with tracking performance metrics. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. Leads teams or projects and shares expertise. Job Description - Core Responsibilities: Expert level in DWH, ETL, RDBMS, SQL, Hadoop/Big Data, Spark, and data modelling. Expert level in the Big Data ecosystem, Cloud (AWS) technologies, and DevOps. Nice to have experience in Docker/Kubernetes/MinIO. Experience in handling enterprise data lake/data warehouse solutions. Responsible for taking accountability in delivering solutions based on data lineages. Supervise the team for best-in-class implementations. Responsible for planning, design, and implementation. Agile in critical problem-solving and solution orientation. Develop end-to-end business applications (backend). Works hand-in-hand with the core team. Good appetite for continuous improvement and innovation. Experience with agile methodology. Success in stakeholder management. Collaborate with product development teams. Consistent exercise of independent judgment and discretion in matters of significance. Proactive participation and effective communication in team discussions. Other duties and responsibilities as assigned. Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Skills: Agile Methodology, Apache Spark, AWS Cloud Computing, Big Data, Databricks Platform, Data Warehousing (DW), ETL Development, Python (Programming Language), Structured Query Language (SQL), Teradata Database. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details.
Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Certifications (if applicable). Relevant Work Experience: 7-10 Years. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
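Since this variant of the role lists Python and Teradata alongside Spark, a common task is offloading warehouse tables into the data lake; the sketch below reads a Teradata table over JDBC with Spark and lands it as Parquet. The JDBC URL, credentials, and table names are assumptions, and the Teradata JDBC driver jar would need to be available to the Spark job.

```python
# Sketch: offloading a Teradata table to Parquet with Spark over JDBC.
# URL, credentials, and table names are illustrative assumptions; the
# Teradata JDBC driver jar must be on the Spark classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("teradata-offload-sketch").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://tdprod.example.com/DATABASE=sales")
    .option("driver", "com.teradata.jdbc.TeraDriver")
    .option("dbtable", "sales.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Land the warehouse table in the curated zone of the data lake.
orders.write.mode("overwrite").parquet("s3a://dx-curated-zone/sales/orders/")
spark.stop()
```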

Posted 3 months ago

Apply

3 - 9 years

14 - 16 Lacs

Hyderabad

Work from Office


We are looking to hire Snowflake professionals in the following areas. Primary skill: Snowflake. Secondary: AWS, data warehousing, and ETL principles. Looking for Senior Data Engineers with strong communication, interpersonal, analytical, and problem-solving skills. At least 4+ years of total experience in IT (data warehousing and data analytics), out of which 3+ years working on Snowflake-related implementations. Design and develop data solutions within the Snowflake cloud data platform, including data warehousing, data lake, and data modelling solutions. Experience working with Snowflake functions; hands-on experience with Snowflake utilities, stage and file upload features, time travel, and fail-safe. Hands-on in complex SQL and parsing complex data sets. Creation of Snowflake tables/views/stored procedures/streams/tasks and aggregation. AWS Snowflake platform for migration (good to have experience with AWS components like S3, Glue, Redshift, Step Functions, EC2, and NoSQL databases). Handling large and complex datasets such as JSON, ORC, Parquet, PDF, XML, and CSV files. Data lakes, multi-dimensional models, and data dictionaries. Data warehousing concepts, data modelling, and metadata management. Knowledge of the SQL language and cloud-based technologies. Working knowledge of complex stored procedures and views. Snowflake modelling of roles, databases, and schemas. SQL performance measurement, query tuning, and database tuning. ETL tools with cloud-driven skills. Integration with third-party tools. Root cause analysis of models with solutions. Relational databases like Oracle, SQL Server, Teradata, Postgres, etc. Snowflake warehousing, architecture, processing, and administration. Enterprise-level technical exposure to Snowflake applications. Good to have: understanding of Snowflake security (RBAC) and Snowflake administration. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and ethical corporate culture.
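As a rough sketch of the Snowflake work listed above (stages, COPY INTO, and time travel), the example below uses the snowflake-connector-python package. The account, warehouse, database, stage, and table names are assumptions made for the example.

```python
# Sketch: loading staged files into Snowflake and querying with time travel.
# Account, warehouse, database, stage, and table names are illustrative assumptions.
import snowflake.connector

con = snowflake.connector.connect(
    account="xy12345.us-east-1",
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = con.cursor()
try:
    # Load semi-structured JSON files from a named internal stage.
    cur.execute("""
        COPY INTO RAW.CUSTOMER_EVENTS
        FROM @RAW.EVENTS_STAGE
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'CONTINUE'
    """)

    # Time travel: compare against the table state one hour ago.
    cur.execute("SELECT COUNT(*) FROM RAW.CUSTOMER_EVENTS AT(OFFSET => -3600)")
    print("Row count one hour ago:", cur.fetchone()[0])
finally:
    cur.close()
    con.close()
```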

Posted 3 months ago

Apply

10 - 15 years

20 - 25 Lacs

Pune

Work from Office


Minimum Experience: 10 Years. We are looking for a Big Data / Cloud Architect to become part of Atgeir's Advanced Data Analytics team. The desired candidate is a professional with a proven track record of working on Big Data and Cloud platforms. Required Skills: Work closely with customers, understand customer requirements and render those as architectural models that will operate at large scale and high performance, and advise customers on how to run these architectural models on traditional data platforms (Hadoop-based) as well as modern data platforms (cloud-based). Work alongside customers to build data management platforms using open-source technologies as well as cloud-native services. Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the Advanced Analytics Centre of Excellence (CoE) team at Atgeir. Highly technical and analytical, with 10 or more years of ETL and analytics systems development and deployment experience. Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams. Ability to understand complex business requirements and render them as prototype systems with quick turnaround time. Implementation and tuning experience in the Big Data ecosystem (such as Hadoop, Spark, Presto, Hive), databases (such as Oracle, MySQL, PostgreSQL, MS SQL Server), and data warehouses (such as Redshift, Teradata, etc.). Knowledge of foundation infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience with one of the clouds (GCP / Azure / AWS) and/or data cloud platforms (Databricks / Snowflake). Proven hands-on experience with at least one programming language among Python, Java, Go, and Scala. Willingness to work hands-on on projects. Ability to lead and guide large teams. Architect-level certification on one of the clouds will be an added advantage.

Posted 3 months ago

Apply

2 - 5 years

3 - 6 Lacs

Hyderabad

Work from Office


Detailed JD (Roles and Responsibilities): 6+ years of experience in Data Warehousing / BI Testing using any ETL tools. Understanding and validation of data across all layers from source to target. Extensive experience in writing and troubleshooting SQL queries. Understanding the business requirements so as to formulate the problems to solve and restrict the slice of data to be explored. Exposure to data warehousing and dimensional modelling concepts. Experience in understanding ETL source-to-target mapping documents. Experience in data validation using analytics tools. Hands-on experience and strong understanding of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC). Experience with defect tracking tools like Jira and Client ALM. Experience in writing test cases, test summaries, and test plans. Good knowledge of testing concepts for regression, integration, and production checkout activities. Hands-on experience with database SQL queries on Vertica and Teradata. Automation experience is an added advantage. MicroStrategy BI tool hands-on experience is an added advantage. Mandatory skills: SQL / MicroStrategy BI / ETL Testing. Desired/secondary skills: Agile.
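A typical source-to-target validation from this kind of ETL testing role can be sketched as below: row counts and a simple checksum compared between a staging query and a warehouse query, written in Python so the same pattern can run against Teradata or Vertica through the appropriate driver. The connection details and table/column names are assumptions for the example.

```python
# Sketch: a simple source-to-target reconciliation check for ETL testing.
# Connection details and table/column names are illustrative assumptions;
# swap in the Teradata or Vertica driver appropriate for the layer under test.
import teradatasql

SOURCE_SQL = "SELECT COUNT(*), COALESCE(SUM(order_amount), 0) FROM stg.orders"
TARGET_SQL = "SELECT COUNT(*), COALESCE(SUM(order_amount), 0) FROM dw.fact_orders"

def fetch_metrics(con, sql):
    """Return (row_count, amount_sum) for a reconciliation query."""
    with con.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()

with teradatasql.connect(host="tdprod.example.com", user="qa_user", password="***") as con:
    src_count, src_sum = fetch_metrics(con, SOURCE_SQL)
    tgt_count, tgt_sum = fetch_metrics(con, TARGET_SQL)

assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"
assert src_sum == tgt_sum, f"Amount checksum mismatch: {src_sum} vs {tgt_sum}"
print("Source-to-target reconciliation passed.")
```

The same checks are usually driven from the source-to-target mapping document, one query pair per mapped table or column, and wired into a regression suite.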

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies