
823 Teradata Jobs - Page 27

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8 - 12 years

32 - 37 Lacs

Mumbai

Work from Office


As a Data Owner Lead at JPMorgan Chase within the Banking Payments team, you will play a crucial role in accelerating product development, driving business growth, and improving the Chase customer experience through data. You will ensure data quality and protection, collaborating with product teams to deliver data that meets business objectives and analytics needs. You will also identify and mitigate data risks in compliance with Firmwide policies.

Job Responsibilities
- Develop and deliver product data plans to support strategic objectives and analytics.
- Provide expertise on data content and usage within the business.
- Identify and document critical data scope with proper metadata classification.
- Support analytics projects by identifying necessary data for integration.
- Document and coordinate resources for data quality requirements.
- Resolve data issues promptly and influence resources for resolution.
- Develop processes to monitor and mitigate data risks, including protection and quality.

Required Qualifications, Capabilities, and Skills
- 8+ years of industry experience in a data-related field.
- Experience managing delivery across multiple workstreams.
- Expertise in business or product data.
- Technical knowledge of data management, governance, and big data platforms.
- Ability to manage delivery timelines and ensure organizational goals are met.
- Proven ability to consult on data best practices and resolve data quality issues.
- Skilled in SQL and database technologies.

Preferred Qualifications, Capabilities, and Skills
- Knowledge of payments products or retail banking.
- Technical knowledge of data warehouses such as Snowflake and Teradata.
- Bachelor's degree required; Master's degree preferred.
- Skills in cloud technologies such as AWS.

Posted 1 month ago


3 - 8 years

6 - 10 Lacs

Hyderabad

Work from Office


We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Employee Platforms team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Hands-on software/data engineering experience with SQL, Spark/PySpark, Databricks, and the AWS cloud ecosystem
- Hands-on experience delivering systems, including design, development, testing, and operational stability
- Experience programming with at least one modern language such as Java, Python, PySpark, or Terraform
- Experience with REST APIs, microservices, and distributed systems
- Experience working with modern data lakehouse platforms such as Databricks
- Expertise in application development, automated testing, and ensuring operational stability
- Experience with SQL/relational data models, e.g., Teradata, Oracle, SQL Server
- Knowledge of all aspects of the Software Development Life Cycle and Agile methodologies
- Understanding of CI/CD build and deploy pipelines, application resiliency, and security

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- Exposure to cloud technologies
- Knowledge of emerging tech such as public cloud, artificial intelligence, and machine learning
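The SQL/relational-model requirement above is the kind of skill that reduces to routine aggregation queries. A minimal, self-contained sketch using SQLite as a stand-in for the warehouses the posting names (Teradata, Oracle, SQL Server); the payments table and its columns are invented for illustration:

```python
import sqlite3

# Hypothetical "payments" table used only for illustration; the posting
# names SQL/relational skills but gives no schema, so every name here
# is an assumption.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [(1, 120.0, "settled"), (2, 75.5, "failed"), (3, 40.0, "settled")],
)

# Aggregate payment count and volume per status -- the kind of routine
# reporting query the role describes.
rows = conn.execute(
    "SELECT status, COUNT(*), SUM(amount) FROM payments "
    "GROUP BY status ORDER BY status"
).fetchall()
print(rows)  # [('failed', 1, 75.5), ('settled', 2, 160.0)]
```

The same GROUP BY pattern carries over unchanged to the warehouse dialects the posting mentions.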

Posted 1 month ago


3 - 5 years

12 - 17 Lacs

Hyderabad

Work from Office


Data Solution Consultant (Informatica) | Goodyear
Location: Gachibowli, Hyderabad, TG, IN (IN - Hyderabad, Telangana)
Goodyear Talent Acquisition Representative: Maria Monica Canding
Relocation Assistance Available: No

Job Responsibilities:
- You are responsible for designing and building data products, legal data layers, data streams, algorithms, and reporting systems (e.g., dashboards, front ends).
- You ensure the correct design of solutions, performance, and scalability while considering appropriate cost control.
- You link data product design with DevOps and infrastructure.
- You act as a reference within and outside the Analytics team.
- You serve as a technical partner to Data Engineers regarding digital product implementation.

Qualifications:
- You have a Bachelor's degree in Computer Science, Engineering, Management Information Systems, or a related discipline, or 10 or more years of experience in Information Technology in lieu of a degree.
- You have 5 or more years of experience in Information Technology.
- You have an in-depth understanding of database structure principles.
- You have experience gathering and analyzing system requirements.
- You have knowledge of data mining and segmentation techniques.
- You have expertise in SQL and Oracle.
- You are familiar with data visualization tools (e.g., Tableau, Cognos, SAP Analytics Cloud).
- You possess proven analytical skills and a problem-solving attitude.
- You have a proven ability to work with distributed systems.
- You are able to develop creative solutions to problems.
- You have knowledge and strong skills with SQL and NoSQL databases and applications, such as Teradata, Redshift, MongoDB, or equivalent.

Goodyear is an Equal Employment Opportunity and Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to that individual's race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender identity, age, physical or mental disability, ethnicity, citizenship, or any other characteristic protected by law.

Posted 1 month ago


8 - 10 years

0 Lacs

Mumbai Metropolitan Region

Hybrid


Senior Developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Good knowledge of data warehouse tools is needed to understand the existing system. The candidate should also have experience with data lakes, Teradata, and Snowflake, and should be good at Terraform.

- 8-10 years of experience in designing and developing Python and PySpark applications
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of all data lake concepts and ability to work on data migration projects
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues
- Expertise in practices like Agile, peer reviews, and CI/CD pipelines
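As a rough illustration of the cleansing step in the Glue/PySpark pipelines this role describes, here is a pure-Python stand-in; in a real job this logic would run as a PySpark transform on AWS Glue, and all field names are hypothetical:

```python
# Invented raw records standing in for rows read from S3/Athena.
raw_records = [
    {"id": "1", "amount": "120.50", "region": "south"},
    {"id": "2", "amount": "bad", "region": "north"},
    {"id": "3", "amount": "75.00", "region": "south"},
]

def clean(record):
    """Cast id/amount to numeric types, dropping records that fail the cast."""
    try:
        return {"id": int(record["id"]),
                "amount": float(record["amount"]),
                "region": record["region"]}
    except ValueError:
        return None  # malformed row is rejected, mirroring a filter step

cleaned = [r for r in (clean(rec) for rec in raw_records) if r is not None]
print(cleaned)  # two rows survive; the "bad" amount is dropped
```

In PySpark the same idea would be expressed with a cast plus a null filter over a DataFrame rather than a Python loop.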

Posted 1 month ago


2 - 5 years

13 - 14 Lacs

Chennai

Work from Office


Job Summary
Responsible for designing, building, and overseeing the deployment and operation of technology architecture, solutions, and software to capture, manage, store, and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route appropriately, and store using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data and to maintain, defend, and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems, and applications development to ensure compatibility and operability of data connections, flows, and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Works with data modelers/analysts to understand the business problems they are trying to solve, then creates or augments data assets to feed their analysis. Works with moderate guidance in own area of knowledge.

Job Description - Core Responsibilities
- Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize, and transform data that helps generate insights and address reporting needs.
- Focuses on ensuring data quality during ingest and processing as well as the final load to the target tables.
- Creates standard ingestion frameworks for structured and unstructured data, as well as checking and reporting on the quality of the data being processed.
- Creates standard methods for end users/downstream applications to consume data, including but not limited to database views, extracts, and Application Programming Interfaces.
- Develops and maintains information systems (e.g., data warehouses, data lakes), including data access Application Programming Interfaces.
- Participates in the implementation of solutions via data architecture, data engineering, or data manipulation on both on-prem platforms like Kubernetes and Teradata and cloud platforms like Databricks.
- Determines the appropriate storage platform across different on-prem (MinIO and Teradata) and cloud (AWS S3, Redshift) options depending on the privacy, access, and sensitivity requirements.
- Understands the data lineage from source to the final semantic layer, along with the transformation rules applied, to enable faster troubleshooting and impact analysis during changes.
- Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality as well as process optimization.
- Handles data migrations/conversions as data platforms evolve and new standards are defined.
- Preemptively recognizes and resolves technical issues utilizing knowledge of policies and processes.
- Understands data sensitivity and customer data privacy rules and regulations and applies them consistently in all Information Lifecycle Management activities.
- Identifies and reacts to system notifications and logs to ensure quality standards for databases and applications.
- Solves abstract problems beyond a single development language or situation by reusing data files and flags already set.
- Solves critical issues and shares knowledge, such as trends, aggregates, and quantity/volume, regarding specific data sources.
- Consistent exercise of independent judgment and discretion in matters of significance.
- Regular, consistent, and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary.
- Other duties and responsibilities as assigned.

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff - be enthusiastic learners, users, and advocates of our game-changing technology, products, and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs, and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors, and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications.

We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance, and always-on tools that are personalized to meet the needs of your reality to help support you physically, financially, and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Certifications (if applicable)

Relevant Work Experience: 2-5 Years

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
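The "ensuring data quality during ingest" responsibility above can be sketched minimally: reject rows with null or duplicate keys and report the rejection count. Everything here (the `ingest` helper, the key name) is illustrative, not Comcast's actual framework:

```python
def ingest(rows, key="id"):
    """Accept rows with a unique, non-null key; count the rest as rejected."""
    seen, accepted, rejected = set(), [], 0
    for row in rows:
        if row.get(key) is None or row[key] in seen:
            rejected += 1          # null or duplicate key fails the check
            continue
        seen.add(row[key])
        accepted.append(row)
    return accepted, rejected

batch = [{"id": 1}, {"id": 1}, {"id": None}, {"id": 2}]
accepted, rejected = ingest(batch)
print(len(accepted), rejected)  # 2 2
```

A production ingestion framework would route the rejected rows to a quarantine table and publish the counts as quality metrics rather than discarding them.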

Posted 1 month ago


4 years

0 Lacs

Bengaluru, Karnataka

Work from Office


About this role: Wells Fargo is seeking a...

In this role, you will:
- Participate in low-risk initiatives within Risk Analytics
- Review process production and model documentation in alignment with policy, analyzing trends in the current population
- Receive direction from manager
- Exercise judgment within Risk Analytics while developing understanding of analytic models, policies, and procedures
- Provide monthly, quarterly, and annual reports to manager and experienced managers

Required Qualifications:
- 6+ months of Risk Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Required Qualifications for Europe, Middle East & Africa only:
- Experience in Risk Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- 4+ years of experience with SQL, Teradata, and/or Hadoop
- 4+ years of experience with BI tools such as Tableau, Power BI, or Alteryx
- 3+ years of experience in risk (includes compliance, financial crimes, operational, audit, legal, credit risk, market risk)
- Experience researching and resolving data problems and working with technology teams on remediation of data issues
- Demonstrated strong analytical skills with high attention to detail and accuracy
- Excellent verbal, written, and listening communication skills

Job Expectations:
- Participate in complex initiatives related to business analysis and modeling, including those that are cross-functional with broad impact, and act as a key participant in data aggregation and monitoring for Risk Analytics
- Fully understand Data Quality checks, methodology, and dimensions for data completeness and accuracy, and ensure that policies and procedures are followed
- Become an SME in the DQ check elements and the technology infrastructure utilized, and fully understand the metadata and lineage from DQ report to source data
- Escalate potential risks, issues, or calendar/timeliness risks in a timely manner to management/Data Management SharePoint
- Ensure the organization and storage of DQ check artifacts, files, and evidence are effective and efficient
- Perform deep-dive analytics (both ad hoc and structured) and provide reporting or results to both internal and external stakeholders
- Design and build rich data visualizations to communicate complex ideas and automate reporting and controls
- Create and interpret Business Intelligence data (reporting, basic analytics, predictive analytics, and prescriptive analytics) combined with business knowledge to draw supportable conclusions about current and future risk levels
- Identify and implement areas of opportunity for quality assurance, data validation, analytics, and data aggregation to improve overall reporting efficiencies
- Create and execute UAT test cases, log defects, and manage defects through closure
- Collaborate and consult with peers and managers at all levels of experience to resolve production, project, and regulatory issues and achieve risk analytics and common modeling goals

Posting End Date: 28 May 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities: To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy: Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Posted 1 month ago


0.0 years

0 Lacs

Gurugram, Haryana

On-site


Location: Gurugram, Haryana, India
Job Id: GGN00001797 | Information Technology | Full-Time | Posted 05/08/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.

Job overview and responsibilities
The United Data Engineering team designs, develops, and maintains massively scaling technology solutions that are brought to life with innovative architectures, data analytics, and digital solutions. The team is building a modern data technology platform in the cloud with advanced DevOps and Machine Learning capabilities. The Data Engineering team at United Airlines is on a transformational journey to unlock the full potential of enterprise data; build a dynamic, diverse, and inclusive culture; and develop a modern cloud-based data lake architecture to scale our applications and drive growth using data and machine learning. Our objective is to enable the enterprise to unleash the potential of data through innovation and agile thinking, and to execute on an effective data strategy to transform business processes, rapidly accelerate time to market, and enable insightful decision making. United Airlines is seeking talented people to join the Data Engineering team, which is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus.

- Partner with various teams to define and execute data acquisition, transformation, and processing, and make data actionable for operational and analytics initiatives that create sustainable revenue and share growth
- Design, develop, and implement streaming and near-real-time data pipelines that feed systems that are the operational backbone of our business
- Utilize programming languages like Java, Scala, and Python with RDBMS, NoSQL databases, and cloud-based data warehousing like AWS Redshift
- Execute unit tests and validate expected results to ensure accuracy and integrity of data and applications through analysis, coding, writing clear documentation, and problem resolution
- Drive the adoption of data processing and analysis using AWS services and help cross-train other members of the team
- Leverage strategic and analytical skills to understand and solve customer- and business-centric questions
- Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners
- Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business
- Develop and implement innovative solutions leading to automation
- Mentor and train junior engineers
- Use Agile methodologies to manage projects
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
- Expand and share your passion by staying on top of tech trends, experimenting with and learning new technologies, and mentoring other members of the engineering community
- Work with a team of developers with deep experience in digital technology, machine learning, distributed microservices, and full-stack systems

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications - Required
- Bachelor's degree in computer science or a related STEM field
- Experience with relational database systems like MS SQL Server, Oracle, Teradata
- Excellent O/S knowledge and experience in Linux or Windows with basic knowledge of the other; MCSE/RHCE or equivalent level of knowledge preferred
- Experience implementing and supporting AWS-based instances and services (e.g., EC2, S3, EBS, ELB, RDS, IAM, Route 53, CloudFront, ElastiCache, WAF)
- Scripting ability in one or more of Python, Bash, Perl; Git for version control useful
- Working with or supporting containerized environments (ECS/EKS/Kubernetes/Docker)
- Agile engineering practices
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

Qualifications - Preferred
- Master's degree in computer science or a related STEM field
- Experience with cloud-based systems like AWS, Azure
- Strong experience with continuous integration and delivery using Agile methodologies
- AWS Certified Developer - Associate or Professional
- AWS Certified Solutions Architect - Associate or Professional
- AWS Certified Specialty certification (Big Data Analytics, Machine Learning)
- Experience with data quality tools including Deequ or Apache Griffin
- Experience building PySpark-based services in a production environment
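The data-quality tools the posting mentions (Deequ, Apache Griffin) automate checks that reduce to simple column metrics. A hedged pure-Python sketch of two of them, completeness and uniqueness, over invented data:

```python
from collections import Counter

def completeness(rows, column):
    """Fraction of rows where the column is non-null."""
    return sum(r.get(column) is not None for r in rows) / len(rows)

def uniqueness(rows, column):
    """Fraction of non-null values that occur exactly once."""
    counts = Counter(r[column] for r in rows if r.get(column) is not None)
    return sum(1 for c in counts.values() if c == 1) / sum(counts.values())

# Invented sample: aircraft tail numbers with one duplicate and one null.
rows = [{"tail": "N101"}, {"tail": "N101"}, {"tail": "N202"}, {"tail": None}]
print(completeness(rows, "tail"))  # 0.75
print(uniqueness(rows, "tail"))    # 0.333... (only N202 occurs once)
```

In Deequ the same checks are declared as constraints on a Spark DataFrame and evaluated distributedly; the metric definitions are the same idea.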

Posted 1 month ago


8 - 11 years

7 - 17 Lacs

Hyderabad

Hybrid


Role & responsibilities
- 8+ years of strong ETL Informatica experience
- Should have Oracle, Hadoop, and MongoDB experience
- Strong SQL/Unix knowledge
- Experience working with RDBMS, preferably Teradata
- Good to have Big Data/Hadoop experience
- Good to have Python or other programming knowledge

Posted 1 month ago


8 - 13 years

30 - 35 Lacs

Chennai

Work from Office


- 8+ years of Data Engineering experience
- 5+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
- Strong understanding of key GCP services, especially those related to data processing (batch/real time): BigQuery, Cloud Scheduler, Airflow, Postgres, Dataflow, Pub/Sub, Cloud Logging, and Cloud Monitoring
- Experience with infrastructure as code: Terraform/GitHub
- Experience in design, development, and implementation of data pipelines using data warehousing applications
- Experience integrating various data sources like Oracle, Teradata, DB2, BigQuery, and flat files
- Hands-on experience in performance tuning and debugging ETL jobs
- Involvement in review meetings and coordination with the team on job design and fine-tuning job performance
- BE/BTech in CS/IT or equivalent degree in a relevant domain
- Proven experience in data engineering (IBM DataStage or equivalent cloud ETL preferred)
- Strong knowledge of ETL processes, data integration, and data management
- Proficiency in programming languages such as SQL, Python, or Java
- Familiarity with cloud computing platforms and data storage solutions
- Excellent problem-solving skills and ability to work independently as well as in a team
- Professional certification in GCP (e.g., Professional Data Engineer)
- Experience in Mainframe-to-GCP migration considered a plus
- Agile methodology practitioner

Responsibilities:
- Gather business requirements and design ETL systems to meet application needs
- Work closely with stakeholders to gather requirements and provide technical guidance on the migration process
- Collaborate with the data engineering teams to understand the existing JCL/ETL jobs and their dependencies
- Design and implement migration strategies for moving ETL to GCP-native services
- Develop, test, and deploy data pipelines and workflows, ensuring scalability, reliability, and performance
- Troubleshoot and resolve any production issues or challenges that arise during the migration
- Deliver product features as part of the roadmap and business needs
- Lead effort estimation for new developments and enhancements
- Co-create test plans and lead execution and code deployment
- Lead performance tuning to optimize job execution
- Validate, run, schedule, and monitor jobs using DataStage Director
- Debug ticketing issues in the production phase
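One pattern that recurs in the ETL migration work described above is the incremental (high-watermark) load: each run pulls only rows newer than the last watermark. A minimal sketch with invented timestamps and table contents:

```python
def incremental_extract(rows, watermark):
    """Return rows updated after `watermark`, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Hypothetical source table; in BigQuery this would be a WHERE clause
# on a partition/timestamp column instead of a Python filter.
table = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20},
         {"id": 3, "updated_at": 30}]
fresh, wm = incremental_extract(table, watermark=15)
print([r["id"] for r in fresh], wm)  # [2, 3] 30
```

Persisting the returned watermark between runs (e.g., in a control table) is what makes the pipeline restartable and idempotent.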

Posted 1 month ago


2 - 7 years

7 - 17 Lacs

Bengaluru

Work from Office


About this role: Wells Fargo is seeking a Risk Analytics Consultant.

In this role, you will:
- Participate in less complex analysis and modeling initiatives, and identify opportunities for process production, data reconciliation, and model documentation improvements within Risk Management
- Review and analyze programming models to extract data, and manipulate databases to provide statistical and financial modeling; exercise independent judgment to guide new and existing projects with medium-risk deliverables
- Coordinate and consolidate the production of monthly, quarterly, and annual performance reports for more experienced management
- Present recommendations for resolving data reconciliation, production, and database issues
- Exercise independent judgment while developing expertise in policy governance, risk projects, and regulatory requests
- Collaborate and consult with peers, managers, experienced managers, and compliance, including various lines of business

Required Qualifications:
- 2+ years of Risk Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Financial services experience in lending, underwriting, or servicing
- Excellent understanding of capital markets, focusing on counterparty credit risk, CCAR, and stress testing
- Experience in programming languages such as SAS, SQL, Teradata
- Knowledge and understanding of databases and data mining techniques
- Ability to articulate complex concepts in a clear manner
- Ability to interact with all levels of an organization and collaborate to build successful relationships
- Ability to be flexible and adjust plans quickly to meet changing business needs
- Strong organizational, multi-tasking, and prioritizing skills
- Ability to take initiative and work independently with minimal supervision in a structured environment
- Exposure to Wells Fargo origination, servicing, and reporting platforms such as MSP, SHAW, and AFS
- Exposure to Wealth Management systems Cattalos, STOC, ClientLink, and Optimist

Job Expectations:
- Support the WIM CRO Credit Risk team with risk analytics activities across consumer, mortgage, and custom credit/commercial products
- Leverage innovative tools to collect, analyze, and interpret large amounts of data across multiple data environments
- Identify trends and patterns to inform sound business decisions
- Effectively utilize analytical tools to create visualizations and share insights
- Learn about many aspects of the WIM business through partnerships with various groups, including credit policy, underwriting, and servicing
- Combine technical skills with business skills to help support the team with end-to-end quality assurance initiatives
- Provide quantitative support of credit policy through analysis and impact assessment of prospective quality assurance findings
- Develop recurring reporting and ad hoc analysis packages to monitor various product sets across WIM for management

Posted 1 month ago

Apply

2 - 5 years

0 Lacs

Chennai, Tamil Nadu, India

Linkedin logo

When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What You’ll Be Doing… We’re seeking a skilled Data Engineering Analyst to join our high-performing team and propel our telecom business forward. You’ll contribute to building cutting-edge data products and assets for our wireless and wireline operations, spanning areas like consumer analytics, network performance, and service assurance. In this role, you will develop deep expertise in various telecom domains. As part of the Data Architecture & Strategy team, you’ll collaborate closely with IT and business stakeholders to design and implement user-friendly, robust data product solutions. This includes incorporating data classification and governance principles. 
Your responsibilities encompass:
• Collaborate with stakeholders to understand data requirements and translate them into efficient data models
• Design, develop, and implement data architecture solutions on GCP and Teradata to support our Telecom business
• Design data ingestion for both real-time and batch processing, ensuring efficient and scalable data acquisition for an effective data warehouse
• Maintain meticulous documentation, including data design specifications, functional test cases, data lineage, and other relevant artifacts for all data product solution assets
• Implement data architecture standards, as set by the data architecture team
• Proactively identify opportunities for automation and performance optimization within your scope of work
• Collaborate effectively within a product-oriented organization, providing data expertise and solutions across multiple business units
• Cultivate strong cross-functional relationships and establish yourself as a subject matter expert in data and analytics within the organization

What We're Looking For...
You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.
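As an illustration of the batch side of the ingestion work described above, here is a minimal Python sketch of micro-batching a record stream before loading it to a warehouse. All names are illustrative assumptions, not Verizon's actual stack.

```python
from typing import Dict, Iterable, Iterator, List

def micro_batches(records: Iterable[Dict], batch_size: int) -> Iterator[List[Dict]]:
    """Group a (possibly unbounded) record stream into fixed-size batches
    so each batch can be loaded to the warehouse as one unit."""
    batch: List[Dict] = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Example: five usage events loaded in batches of two
events = [{"id": i} for i in range(5)]
batches = list(micro_batches(events, 2))
```

The same helper works for both a finite file (batch) and a long-lived stream (near real time), which is why micro-batching is a common bridge between the two modes.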
You'll need to have:
• Bachelor's degree with four or more years of work experience
• Four or more years of relevant work experience
• Expertise in building complex SQL for data analysis, to understand and design data solutions
• Experience with ETL, data warehouse concepts, and the data management life cycle
• Experience in creating technical documentation such as source-to-target mappings, source contracts, SLAs, etc.
• Experience in any DBMS, preferably GCP/BigQuery
• Experience in creating data models using the Erwin tool
• Experience in shell scripting and Python
• Understanding of Git version control and basic Git commands
• Understanding of data quality concepts

Even better if you have one or more of the following:
• Certification as a GCP Data Engineer
• Understanding of NoSQL databases like Cassandra, MongoDB, etc.
• Accuracy and attention to detail
• Good problem-solving, analytical, and research capabilities
• Good verbal and written communication
• Experience presenting to leaders and influencing stakeholders

Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.

Posted 2 months ago

Apply

5 - 8 years

0 Lacs

Pune, Maharashtra, India

Hybrid

Linkedin logo

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
• Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
• Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
• Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
• Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Preferred Education: Master's Degree

Required Technical And Professional Expertise
• Design and implement efficient database schemas and data models using Teradata
• Optimize SQL queries and stored procedures for performance
• Perform database administration tasks, including installation, configuration, and maintenance of Teradata systems

Preferred Technical And Professional Experience
• You thrive on teamwork and have excellent verbal and written communication skills
• Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
• Ability to communicate results to technical and non-technical audiences

Posted 2 months ago

Apply

5 - 10 years

0 - 2 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Direct Responsibilities
• Analyze and interpret requirement and issue specifications received from business analysts or production teams
• Work with analysts to ensure correct understanding and implementation of specifications
• Work in collaboration with the project analyst to meet the client's expectations
• Take charge of development work according to the priorities defined by the product owner
• Propose technical solutions adapted to the business needs; develop technical requirements
• Design and develop IT solutions based on the specifications received
• Package and deploy on non-production environments, and monitor production releases and deployments
• Participate in testing support (system, user acceptance, regression)
• Bring a high level of quality to developments, in terms of maintainability, testability, and performance
• Participate in code reviews set up within the program

Contributing Responsibilities
• Participate in transversal, capability-building efforts for the bank
• Implement best practices and a strong coding and development culture
• Work closely as "one team" with all stakeholders to provide high-quality delivery
• Work on data-driven issues, innovative technologies (Java 17 or 21 with the Quarkus framework, Kubernetes, Kogito), and the Finance & Risk functional areas

Technical & Behavioral Competencies
Technical (Mandatory Skills):
• Teradata development (intermediate to expert)
• Knowledge of Linux shell
• SQL queries
• Unit testing

(Optional Skills):
• Management of temporality in Teradata
• Knowledge of the Linux environment

Behavior Skills:
• Ability to work independently and collaborate as part of a team
• Rigorous and disciplined, with deep attention to quality of work (a software craftsmanship approach is welcome)
• Result-oriented, with the ability to meet and respect deadlines
• Curious, with the ability to learn and adapt to technological change
• Good communication skills
• Excellent analytical and problem-solving skills
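The mandatory unit-testing skill above can be sketched in Python: a pure function that mirrors some (entirely hypothetical) Teradata view logic, covered by `unittest` cases. The function name and fee rule are invented for illustration only.

```python
import unittest

def net_amount(gross: float, fee_rate: float) -> float:
    """Mirror of a hypothetical view rule: gross amount minus a proportional fee,
    rounded to two decimals as the database column would store it."""
    if not 0 <= fee_rate < 1:
        raise ValueError("fee_rate must be in [0, 1)")
    return round(gross * (1 - fee_rate), 2)

class NetAmountTest(unittest.TestCase):
    def test_typical_fee(self):
        self.assertEqual(net_amount(100.0, 0.015), 98.5)

    def test_zero_fee(self):
        self.assertEqual(net_amount(250.0, 0.0), 250.0)

    def test_invalid_rate(self):
        with self.assertRaises(ValueError):
            net_amount(100.0, 1.5)
```

Keeping transformation rules in testable functions like this is one common way to bring "maintainability, testability and performance" to database-heavy code.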

Posted 2 months ago

Apply

8 - 10 years

5 - 6 Lacs

Chennai

Work from Office

Naukri logo

Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted, and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing, and security, it brings deep expertise to all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Roles & Responsibilities:
• Understanding and experience of the Agile development approach and tools
• Extensive experience with Oracle and/or SQL Server database technologies, in particular ETL/ELT design, development, and testing in databases like Oracle and SQL Server, and data warehouse appliances like Netezza, Teradata, and Vertica

Requirements:
• Minimum 8-10 years of total IT experience building ETL pipelines, processes, and performance with Talend ETL data integration
• Experience working with Oracle, Netezza, and/or other OLAP database systems
• Good knowledge of data warehousing concepts and analytical skills
• Good knowledge of Talend Cloud is an added advantage
• Competence with Unix/Linux shell scripting
• Job scheduling experience using TAC
• Strong skills in writing SQL queries
• Knowledge of MicroStrategy or any reporting tool is an advantage
• Understanding of national and international legislation pertaining to Government/Public Sector IT, e.g. data protection
• Experience managing a team of ~10 Talend developers; strong team management skills
• Experience estimating medium or large projects with Talend or related technologies
• Good communication skills, with client/stakeholder management skills

Our Offering:
• Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
• Wellbeing programs and work-life balance, with integration and passion-sharing events
• Attractive salary and company initiative benefits
• Courses and conferences
• Hybrid work culture

Let's grow together.

Posted 2 months ago

Apply

6 - 10 years

8 - 12 Lacs

Pune, Mumbai, Bengaluru

Work from Office

Naukri logo

The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role will be involved in design, development, troubleshooting, and issue resolution, as well as upgrading, enhancing, and optimizing the technical solution. It involves continuous integration and continuous deployment of changes to the business logic implementation, and interaction with internal stakeholders and/or clients to explain technology solutions, with a clear understanding of the client's business requirements through which to guide the optimal design/solution to meet their needs. The ability to communicate to both technical and non-technical audiences is key.

Job Description:

Must-Have Skills:
• Databases (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
• An ETL (Extract, Transform, Load) tool (Talend, Informatica, SSIS, DataStage, Matillion)
• Python and UNIX shell scripting
• Project and resource management
• Workflow orchestration (Tivoli, Tidal, Stonebranch)
• Client-facing skills

Good-to-Have Skills:
• Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred

Key Responsibilities:
• Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation
• Strong understanding of ETL processes, as well as database skills and common IT offerings, i.e. storage, backups, and operating systems
• Strong understanding of SQL and database programming languages
• Strong knowledge of development methodologies and tools
• Contribute to design, and oversee code reviews for compliance with development standards
• Design and implement the technical vision for existing clients
• Convert documented requirements into technical solutions and implement them within the given timeline without quality issues
• Quickly identify solutions for production failures and fix them
• Document project architecture, explain detailed design to the team, and create low-level to high-level designs
• Perform mid- to complex-level tasks independently
• Support clients, data scientists, and analytical consultants working on the marketing solution
• Work with cross-functional internal teams and external clients
• Strong project management and organization skills
• Ability to lead/work on 1-2 projects with a team size of 2-3 members
• Code management, including code review and deployments

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
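The workflow-orchestration tools named in this listing (Tivoli, Tidal, Stonebranch) all share one core idea: run jobs in dependency order. A toy Python sketch of that idea, with made-up job names, using the standard-library `graphlib`:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def run_pipeline(jobs, deps):
    """Run callables in dependency order, like a minimal job scheduler.
    `deps` maps each job name to the set of jobs it depends on."""
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = jobs[name]()  # a real scheduler would add retries, logging, SLAs
    return order, results

# Hypothetical three-step marketing-data pipeline
jobs = {
    "extract":   lambda: [3, 1, 2],
    "transform": lambda: sorted([3, 1, 2]),
    "load":      lambda: "loaded",
}
deps = {"transform": {"extract"}, "load": {"transform"}}
order, results = run_pipeline(jobs, deps)
```

Real schedulers add retries, calendars, and alerting on top, but the topological ordering is the part every tool has in common.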

Posted 2 months ago

Apply

6 - 10 years

8 - 12 Lacs

Mumbai

Work from Office

Naukri logo

The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role will be involved in design, development, troubleshooting, and issue resolution, as well as upgrading, enhancing, and optimizing the technical solution. It involves continuous integration and continuous deployment of changes to the business logic implementation, and interaction with internal stakeholders and/or clients to explain technology solutions, with a clear understanding of the client's business requirements through which to guide the optimal design/solution to meet their needs. The ability to communicate to both technical and non-technical audiences is key.

Job Description:

Must-Have Skills:
• Databases (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
• An ETL (Extract, Transform, Load) tool (Talend, Informatica, SSIS, DataStage, Matillion)
• Python and UNIX shell scripting
• Project and resource management
• Workflow orchestration (Tivoli, Tidal, Stonebranch)
• Client-facing skills

Good-to-Have Skills:
• Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred

Key Responsibilities:
• Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation
• Strong understanding of ETL processes, as well as database skills and common IT offerings, i.e. storage, backups, and operating systems
• Strong understanding of SQL and database programming languages
• Strong knowledge of development methodologies and tools
• Contribute to design, and oversee code reviews for compliance with development standards
• Design and implement the technical vision for existing clients
• Convert documented requirements into technical solutions and implement them within the given timeline without quality issues
• Quickly identify solutions for production failures and fix them
• Document project architecture, explain detailed design to the team, and create low-level to high-level designs
• Perform mid- to complex-level tasks independently
• Support clients, data scientists, and analytical consultants working on the marketing solution
• Work with cross-functional internal teams and external clients
• Strong project management and organization skills
• Ability to lead/work on 1-2 projects with a team size of 2-3 members
• Code management, including code review and deployments

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 2 months ago

Apply

3 - 5 years

12 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

We are looking for a skilled Senior Technical Consultant with 3-5 years of experience and expertise in SQL, SSIS, Python, and PySpark to join our team. The ideal candidate will be proficient in building scalable interfaces, performance tuning, and data cleansing and validation strategies by leveraging the defined tech stack for data processing and data movement.

What you'll do:
• Advanced execution and data management: oversee and manage intricate project tasks, providing insights and direction related to advanced data ingestion, transformation, validation, and publishing
• Review and analyse the data provided by the customer, along with its technical/functional intent and interdependencies
• Engage proactively with functional teams, ensuring a thorough understanding of end-to-end data flows as related to the technical integration
• Build data ingress or egress pipelines, handle huge volumes of data, and develop data transformation functions using languages and tools such as SSIS, Python, PySpark, and SQL
• Integrate various data sources such as Teradata, SAP ERP, SQL Server, Oracle, Sybase, ODBC connectors, and flat files through API or batch
• Production deployment and hypercare: assist with production deployment tasks; assist with triage of issues, testing, and identifying root causes; carry out timely response to and resolution of batch automation disruptions, in order to meet customer SLAs with accurate and on-time results
• Technical leadership and coding oversight: guide and review the code developed by junior consultants, ensuring alignment with best practices; incorporate o9 ways of working and embed industry standards for smoother project executions

What you should have:
• 3+ years of experience in data architecture, data engineering, or a related field, with a strong focus on data modelling, ETL processes, and cloud-based data platforms
• Hands-on experience with SSIS packages, Python, PySpark, and SQL, along with workflow management tools like Airflow and SSIS
• Experience working with Parquet, JSON, RESTful APIs, HDFS, Delta Lake, and query frameworks like Hive and Presto
• Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
• Working experience with version control platforms, e.g. GitHub, Azure DevOps
• Familiarity with Agile methodology
• Proactive mindset and the right attitude to embrace the agility of learning
• Excellent verbal and written communication skills

Good to have:
• Hands-on experience with Delta Lake
• Experience with supply chain planning applications
• Experience with Amazon Web Services (AWS), Azure, or Google Cloud infrastructures

What we'll do for you:
• Competitive salary and benefits
• Stock options for eligible candidates
• High-growth organization: very strong entrepreneurial culture and no corporate politics
• Support network: work with a team you can learn from every day
• Diversity: we pride ourselves on our international working environment
• Social: fun after-work activities like Friday Socials; if you're in the office, feel free to join these events in person
• Food and drink: enjoy healthy snacks, fresh fruit, teas, and coffees on us
• Work-life balance: https://youtube/IHSZeUPATBA?feature=shared
• Feel part of a team: https://youtube/QbjtgaCyhes?feature=shared

More about o9:
Our platform, the o9 Digital Brain, is the premier AI-powered, cloud-native platform driving the digital transformations of major global enterprises including Google, Walmart, ABInBev, Starbucks, and many others. Our headquarters are located in Dallas, with offices in Amsterdam, Paris, London, Barcelona, Madrid, Sao Paolo, Bengaluru, Tokyo, Seoul, Milan, Stockholm, Sydney, Shanghai, Singapore, Munich, and Toronto. o9 is an equal opportunity employer and seeks applicants of diverse backgrounds and hires without regard to race, colour, gender, religion, national origin, citizenship, age, sexual orientation, or any other characteristic protected by law.
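The data cleansing and validation strategies this role calls for often start with simple per-record rules. A minimal Python sketch over JSON input (field names and rules are hypothetical examples, not o9's actual schema):

```python
import json

# Hypothetical required fields for a demand record and their expected types
REQUIRED = {"sku": str, "qty": int, "site": str}

def validate(record: dict):
    """Return (is_valid, reasons) for one record -- a toy cleansing rule set."""
    reasons = []
    for field, ftype in REQUIRED.items():
        if field not in record:
            reasons.append(f"missing {field}")
        elif not isinstance(record[field], ftype):
            reasons.append(f"bad type for {field}")
    if isinstance(record.get("qty"), int) and record["qty"] < 0:
        reasons.append("negative qty")
    return (not reasons, reasons)

raw = '[{"sku":"A1","qty":5,"site":"BLR"},{"sku":"A2","qty":-3,"site":"BLR"},{"qty":1,"site":"MAA"}]'
rows = json.loads(raw)
good = [r for r in rows if validate(r)[0]]
```

In a real pipeline the same checks would run inside PySpark or SQL, with the `reasons` routed to a rejects table for triage.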

Posted 2 months ago

Apply

3 - 8 years

12 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

We're on the lookout for the brightest, most committed individuals to join us on our mission. Along the journey, we'll provide you with a nurturing environment where you can be part of something truly extraordinary and make a real difference for companies and the planet.

What you'll do:
• Advanced execution and data management: oversee and manage intricate project tasks, providing insights and direction related to advanced data ingestion, transformation, validation, and publishing
• Review and analyse the data provided by the customer, along with its technical/functional intent and interdependencies
• Engage proactively with functional teams, ensuring a thorough understanding of end-to-end data flows as related to the technical integration
• Build data ingress or egress pipelines, handle huge volumes of data, and develop data transformation functions using languages and tools such as SSIS, Python, PySpark, and SQL
• Integrate various data sources such as Teradata, SAP ERP, SQL Server, Oracle, Sybase, ODBC connectors, and flat files through API or batch
• Production deployment and hypercare: assist with production deployment tasks; assist with triage of issues, testing, and identifying root causes; carry out timely response to and resolution of batch automation disruptions, in order to meet customer SLAs with accurate and on-time results
• Technical leadership and coding oversight: guide and review the code developed by junior consultants, ensuring alignment with best practices; incorporate o9 ways of working and embed industry standards for smoother project executions

What you should have:
• 3+ years of experience in data architecture, data engineering, or a related field, with a strong focus on data modelling, ETL processes, and cloud-based data platforms
• Hands-on experience with Python, PySpark, and SQL, along with workflow management tools like Airflow and SSIS
• Experience working with Parquet, JSON, RESTful APIs, HDFS, Delta Lake, and query frameworks like Hive and Presto
• Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
• Working experience with version control platforms, e.g. GitHub, Azure DevOps
• Familiarity with Agile methodology
• Proactive mindset and the right attitude to embrace the agility of learning
• Excellent verbal and written communication skills

Good to have:
• Hands-on experience with Delta Lake
• Experience with supply chain planning applications
• Experience with Amazon Web Services (AWS), Azure, or Google Cloud infrastructures

What we'll do for you:
• Competitive salary, with stock options for eligible candidates
• Flat organization with a very strong entrepreneurial culture (and no corporate politics)
• Great people and unlimited fun at work
• The possibility to make a difference in a scale-up environment
• The opportunity to travel onsite in specific phases, depending on project requirements
• Support network: work with a team you can learn from every day
• Diversity: we pride ourselves on our international working environment

Posted 2 months ago

Apply

6 - 11 years

17 - 30 Lacs

Delhi NCR, Bengaluru, Hyderabad

Hybrid

Naukri logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Microsoft ETL Lead Engineer (Database Design | Agile Development Process | Release Management)!

Position Overview: We are currently seeking a highly experienced ETL engineer with hands-on experience in Microsoft ETL (Extract, Transform, Load) technologies. The ideal candidate will have a deep understanding of ETL processes, data warehousing, and data integration, with a proven track record of leading successful ETL implementations. As a principal ETL engineer, you will play a pivotal role in architecting, designing, and implementing ETL solutions to meet our organization's data needs.

Key Responsibilities:
• Lead the design and development of ETL processes using Microsoft ETL technologies, such as SSIS (SQL Server Integration Services); mandatory hands-on experience (70% development, 30% leadership)
• Collaborate with stakeholders to gather and analyze requirements for data integration and transformation
• Design and implement data quality checks and error handling mechanisms within ETL processes
• Lead a team of ETL developers, providing technical guidance, mentorship, and oversight
• Perform code reviews and ensure adherence to best practices and coding standards
• Troubleshoot and resolve issues related to data integration, ETL performance, and data quality
• Work closely with database administrators, data architects, and business analysts to ensure alignment of ETL solutions with business requirements
• Stay up to date with the latest trends and advancements in ETL technologies and best practices
• Identify and resolve performance bottlenecks; implement best practices for database performance tuning and optimization
• Ensure data integrity, security, and availability
• Create and maintain documentation for database designs, configurations, and procedures
• Ensure compliance with data privacy and security regulations

Qualifications:
Education:
• Bachelor's degree in Computer Science, Information Technology, or a related field

Experience:
• Experience in designing, developing, and implementing ETL solutions using Microsoft ETL technologies, particularly SSIS
• Strong understanding of data warehousing concepts, dimensional modeling, and ETL design patterns
• Proficiency in SQL and experience working with relational databases, preferably Microsoft SQL Server
• Experience leading ETL development teams and managing end-to-end ETL projects
• Proven track record of delivering high-quality ETL solutions on time and within budget
• Experience with other Microsoft data platform technologies (e.g., SSAS, SSRS) is a plus
• Familiarity with version control systems (e.g., Git)
• Knowledge of containerization and orchestration (e.g., Docker, Kubernetes) is a plus

Soft Skills:
o Strong analytical and problem-solving skills
o Excellent communication and collaboration abilities
o Ability to work independently and as part of a team

Preferred Qualifications:
o Experience with cloud-based database services (e.g., AWS RDS, Google Cloud SQL)
o Knowledge of other database systems (e.g., PGSQL, Oracle)
o Familiarity with Agile development methodologies

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
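The data-quality checks and error-handling mechanisms this role centers on mirror SSIS's error-output pattern: rows that fail a transform are diverted to an error path instead of failing the whole load. A language-neutral sketch in Python (row shapes and the transform are illustrative only):

```python
def transform_with_error_output(rows, transform):
    """Split rows into a clean output and an error output,
    in the spirit of an SSIS error path / rejects table."""
    ok, rejected = [], []
    for row in rows:
        try:
            ok.append(transform(row))
        except (KeyError, ValueError) as exc:
            # keep the failing row plus the reason, for later triage
            rejected.append({"row": row, "error": str(exc)})
    return ok, rejected

# Hypothetical input: one good row, one unparseable value, one missing field
rows = [{"amount": "10.5"}, {"amount": "oops"}, {}]
ok, rejected = transform_with_error_output(
    rows, lambda r: {"amount": float(r["amount"])}
)
```

The rejects list plays the role of an error output destination, so a bad row degrades data quality metrics rather than aborting the batch.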

Posted 2 months ago

Apply

10 - 16 years

20 - 30 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Naukri logo

Position: Power BI Lead Engineer
Experience: 10+ years
Work mode: Hybrid
Shift timings: 2 PM to 11 PM
Location: Bangalore, Chennai, Hyderabad, and Pune

Responsibilities:
• Lead the offshore Power BI development team
• Oversee the development and testing of Power BI reports and dashboards
• Ensure adherence to project timelines and quality standards
• Collaborate with the onshore architect and project manager
• Provide technical guidance and support to the offshore team
• Work with the data engineers to validate the data within Snowflake and Teradata

Skills to Have:
• Strong leadership and communication skills
• Extensive experience with Power BI development
• Proficiency in DAX, Power Query, and data modeling
• Experience with Agile development methodologies
• Ability to manage and mentor a team
• Testing and validation of the user stories developed by engineers
• Removing bottlenecks and technical impediments for the project

Technologies (Must Have):
• Power BI (Desktop, Service), DAX, Power Query (M), Copilot
• Power BI Report Builder
• SQL
• Python
• Snowflake (basic knowledge is a must)
• MicroStrategy, WebFOCUS (good to have)

Technologies (Good to Have):
• Azure (or any other cloud platform)
• Exposure to Power BI REST APIs
• Power BI external tools and custom visuals

Posted 2 months ago

Apply

10 - 20 years

25 - 32 Lacs

Bengaluru

Hybrid

Naukri logo

Title: Database Administrator
Location: Bengaluru (Hybrid)
Experience: 10+ years

Responsibilities:
  • Strong expertise in writing and optimizing Teradata SQL queries, TPT scripts, etc.
  • Manage production and development database performance.
  • Review Teradata system reports and provide performance assessment reports with recommendations to optimize the system.
  • Investigate and quantify opportunities from performance assessment reports and apply best practices in each area.
  • Monitor Teradata system performance using the Viewpoint tool and its portlets.
  • Review poorly performing queries generated from BI/ETL tools and provide best-practice recommendations on how to simplify and restructure views and apply PPI or other index changes.
  • Closely monitor the performance of the various workgroups on the system and make sure data is available to the business per SLA requirements.
  • Optimal index analysis: review index usage on tables and recommend adding or dropping indexes for optimal data access.
  • Review uncompressed tables, analyse their usage, and implement compression (using algorithms such as MVC, BLC, and ALC) to save space and reduce I/O activity.
  • Optimize locking statements in views, macros, and queries to eliminate blocking contention.
  • Review the spool limits for users and recommend optimal limits for ad-hoc users, to keep runaway queries from over-consuming system resources.
  • Check for mismatched data types in the system and make them consistent to avoid costly translations during query processing.
  • Review SET tables and check for options to convert them to MULTISET, avoiding the costly duplicate-row check.
  • Review large-scan tables on the system and analyze them for PPI, MLPPI, compression, secondary indexes, and join indexes.
  • Analyze various applications, understand their space requirements, and segregate disk space into perm, spool, and temp space.
  • Set up the database hierarchy, including database creation and management of objects such as users, roles, profiles, tables, and views.
  • Maintain profiles, roles, access rights, and permissions for Teradata user groups and objects.
  • Generate periodic performance reports using PDCR and identify system performance bottlenecks.
  • Establish PDCR canary performance baselines and use standard canary queries to identify variance from the baseline.
  • Make effective use of TASM and priority distribution: penalize resource-intensive queries, give high priority to business-critical workloads, throttle different workloads for optimal throughput, and provide performance reports to check workload-management health.

Qualifications we seek in you!
Minimum qualifications:
  • 6-12 years of Teradata performance DBA experience.
  • Experience reviewing poorly performing queries and providing best-practice recommendations on how to simplify and restructure views and apply PPI or other index changes.
  • Statistics management and optimization.
  • Exposure to a DWH environment (knowledge of ETL/DI/BI reporting).
  • Exposure to troubleshooting TPT, FastLoad, MultiLoad, FastExport, BTEQ, and TPump errors; should be good at error handling.
  • Experience fine-tuning application parameters and session counts to ensure optimal functioning of the application.
  • Well conversant with ticketing systems, production change requests, and Teradata incident management.
  • Should be good at automating various processes.
  • Ability to write efficient SQL, and exposure to query tuning.
  • Preferably understands normalization and de-normalization concepts.
  • Preferably has exposure to visualization tools like Tableau and Power BI.
  • Preferably has good working knowledge of UNIX shell and Python scripting.
  • Good to have: exposure to FSLDM.
  • Good to have: exposure to the GCFR framework.

If interested, kindly send your resume to nitish.sethi@portraypeople.com or call 9717411774.
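The index-analysis responsibility above is easiest to see in a query plan taken before and after adding an index. A minimal sketch, using SQLite purely as a stand-in (the table and index names are hypothetical; on Teradata you would read the EXPLAIN output and Viewpoint/DBQL data instead):

```python
import sqlite3

# Hypothetical transactions table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (txn_id INTEGER, store_id INTEGER, amount REAL)")

query = "SELECT * FROM txn WHERE store_id = 7"

# Before the index: the planner has no choice but a full-table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[-1][-1]

conn.execute("CREATE INDEX ix_txn_store ON txn (store_id)")

# After the index: the same predicate becomes an index search.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[-1][-1]

print(plan_before)  # reports a SCAN of txn
print(plan_after)   # reports a SEARCH using ix_txn_store
```

The DBA trade-off described in the posting is that each extra index speeds some access paths while adding maintenance cost to every load, which is why the role pairs "adding" with "dropping" indexes.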

Posted 2 months ago

Apply

5 - 7 years

10 - 14 Lacs

Gurgaon

Work from Office


Skills: Program Management; project management tools such as Jira, Asana, and Smartsheet; data management and storage systems such as Databricks, Snowflake, and Teradata. Required candidate profile: project management methodologies, including Scrum, Kanban, and Waterfall; ability to test API endpoints and build dashboards. Experience: 5 to 7 years total (5+ years in Program Management).

Posted 2 months ago

Apply

4 - 6 years

7 - 9 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office


Skills: Snowflake; Teradata/Oracle/SQL; scripting in UNIX and Python; cloud computing architecture; ETL tools; Git; Snowflake architecture, warehouse implementation, and management. Required candidate profile: Notice period: Immediate to 45 days. Location: Hyderabad, Bengaluru, Chennai, Bhubaneswar.

Posted 2 months ago

Apply

4 - 8 years

7 - 17 Lacs

Hyderabad

Work from Office


About this role: Wells Fargo is seeking a Senior Analytics Consultant.

In this role, you will:
  • Consult on, review, and research moderately complex business, operational, and technical challenges that require an in-depth evaluation of variable data factors
  • Perform moderately complex data analysis to support and drive strategic initiatives and business needs
  • Develop a deep understanding of technical systems and business processes to extract data-driven insights while identifying opportunities for engineering enhancements
  • Lead or participate in large cross-group projects
  • Mentor less experienced staff
  • Collaborate and consult with peers, colleagues, external contractors, and mid-level managers to resolve issues and achieve goals
  • Leverage a solid understanding of compliance and risk-management requirements for the supported area

Required Qualifications:
  • 4+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
  • 4+ years of experience in analytics and reporting, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • SQL and Teradata experience
  • Testing or quality assurance experience
  • SDLC (System Development Life Cycle) experience
  • System Integration Testing (SIT), ETL testing, and report testing
  • Strong analytical skills with high attention to detail and accuracy
  • Experience in an onshore/offshore support model
  • Strong presentation, communication, writing, and interpersonal skills
  • Experience in Agile methodology, and in leveraging Jira for workflow and productivity management
  • 4+ years of data-management experience, including data modeling, data integrity, data quality, and best-practice design concepts
  • ISTQB certification

Job Expectations:
  • Knowledge of Conduct Management data, such as the Enterprise Allegations Platform (EAP) and/or the Allegation Lifecycle/Methodology
  • Experience with Tableau/Power BI reporting tools
  • Strong analytical skills with high attention to detail and accuracy

Posted 2 months ago

Apply

2 - 7 years

7 - 17 Lacs

Hyderabad

Work from Office


About this role: Wells Fargo is seeking an Analytics Consultant.

In this role, you will:
  • Consult with business lines and enterprise functions on less complex research
  • Use functional knowledge to assist in non-model quantitative tools that support strategic decision making
  • Perform analysis of findings and trends using statistical analysis, and document the process
  • Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance
  • Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
  • Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing
  • Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
  • Understand compliance and risk-management requirements for the supported area
  • Ensure adherence to data-management and data-governance regulations and policies
  • Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
  • Collaborate and consult with more experienced consultants and with partners in technology and other business groups

Required Qualifications:
  • 2+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
  • Experience in analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • Excellent verbal, written, and interpersonal communication skills
  • Strong knowledge of enterprise risk programs and applicability of the risk-management framework (three lines of defense)
  • Experience identifying internal and external data from multiple sources across the business
  • Experience with SQL, Teradata, or SAS, and with database management systems such as Teradata and MS SQL Server
  • Experience in risk (including compliance, financial crimes, operational, audit, legal, credit risk, and market risk)
  • Experience with data visualization and business intelligence tools
  • Advanced Microsoft Office (Word, Excel, Outlook, and PowerPoint) skills
  • Demonstrated strong analytical skills with high attention to detail and accuracy
  • Strong presentation skills and the ability to translate and present data in a manner that educates, enhances understanding, and influences decisions, with a bias for simplicity
  • Strong writing skills: proven ability to translate data sets and the conclusions drawn from analysis into business/executive format and language
  • Ability to support multiple projects with tight timelines
  • Metadata management, data lineage, data element mapping, and data documentation experience
  • Experience researching and resolving data problems, and working with technology teams on remediation of data issues
  • Hands-on proficiency with Python, Power BI (Power Query, DAX, Power Apps), Tableau, or SAS
  • Knowledge of defect-management tools like HP ALM
  • Knowledge of data governance

Job Expectations:
  • Ensure adherence to data-management and data-governance regulations and policies
  • Extract and analyze data from multiple technology systems, platforms, and related data sources to identify factors that pose a risk to the firm
  • Consult with business lines and enterprise functions on less complex research
  • Understand compliance and risk-management requirements for sanctions compliance and data management
  • Perform analysis of findings and trends using statistical analysis, and document the process
  • Have a solid background in reporting, understand and utilize relational databases and data warehouses, and be effective in querying and reporting on large and complex data sets
  • Excel at telling stories with data, present information in visually compelling ways that appeal to executive audiences, and be well versed in the development and delivery of reporting solutions
  • Build easy-to-use visualizations and perform data analysis to generate meaningful business insights from complex datasets for global stakeholders
  • Test key reports and produce process documentation
  • Present recommendations to maximize operational efficiency, quality, and compliance
  • Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
  • Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff

Posted 2 months ago

Apply

Exploring Teradata Jobs in India

Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.

Top Hiring Locations in India

  1. Bengaluru
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Teradata professionals.

Average Salary Range

The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.

Related Skills

In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.

Interview Questions

  • What is Teradata and how is it different from other database management systems? (basic)
  • Can you explain the difference between a join and a merge in Teradata? (medium)
  • How would you optimize a Teradata query for performance? (medium)
  • What are fallback tables in Teradata and why are they important? (advanced)
  • How do you handle duplicate records in Teradata? (basic)
  • What is the purpose of a collect statistics statement in Teradata? (medium)
  • Explain the concept of indexing in Teradata. (medium)
  • How does Teradata handle concurrency control? (advanced)
  • Can you describe the process of data distribution in Teradata? (medium)
  • What are the different types of locks in Teradata and how are they used? (advanced)
  • How would you troubleshoot performance issues in a Teradata system? (medium)
  • What is a Teradata View and how is it different from a Table? (basic)
  • How do you handle NULL values in Teradata? (basic)
  • Can you explain the difference between FastLoad and MultiLoad in Teradata? (medium)
  • What is the Teradata Parallel Transporter? (advanced)
  • How do you perform data migration in Teradata? (medium)
  • Explain the concept of fallback protection in Teradata. (advanced)
  • What are the different types of Teradata macros and how are they used? (advanced)
  • How do you monitor and manage Teradata performance? (medium)
  • What is the purpose of the Teradata QueryGrid? (advanced)
  • How do you optimize the storage of data in Teradata? (medium)
  • Can you explain the concept of Teradata indexing strategies? (advanced)
  • How do you handle data security in Teradata? (medium)
  • What are the best practices for Teradata database design? (medium)
  • How do you ensure data integrity in a Teradata system? (medium)
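Several of the questions above, duplicate handling in particular, share one canonical answer pattern: number the rows within each key with a window function and keep only the first. A minimal sketch, run here against SQLite for convenience (the table and column names are hypothetical; in Teradata the subquery can be collapsed with the QUALIFY clause shown in the comment):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT, updated TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Asha", "2024-01-01"),
     (1, "Asha", "2024-03-01"),   # duplicate customer, newer row
     (2, "Ravi", "2024-02-01")],
)

# Keep only the newest row per customer_id. Teradata shorthand:
#   SELECT * FROM customers
#   QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated DESC) = 1;
rows = conn.execute("""
    SELECT customer_id, name, updated
    FROM (SELECT c.*,
                 ROW_NUMBER() OVER (PARTITION BY customer_id
                                    ORDER BY updated DESC) AS rn
          FROM customers c)
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
print(rows)  # [(1, 'Asha', '2024-03-01'), (2, 'Ravi', '2024-02-01')]
```

Being able to explain why the filter must live outside the subquery (or in QUALIFY), rather than in the inner WHERE clause, is usually what interviewers are probing for.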

Closing Remark

As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!
