
596 Teradata Jobs - Page 15

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

5.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Hybrid

Source: Naukri

We're Hiring: Teradata Developer | Bangalore | 5.0-8.0 Years Experience

Are you a Teradata expert looking for your next opportunity in a fast-paced and collaborative environment? We're looking for skilled professionals who can join within 1 month; immediate joiners preferred!

Location: Bangalore (Work from Office)
Experience: 5.0-8.0 years
Position: Teradata Developer

Key Responsibilities:
- Hands-on experience with Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT)
- Strong command of SQL and ETL development
- Deep understanding of Teradata architecture, star/snowflake schemas, and data modeling
- Proven experience in SQL optimization and performance tuning
- Good knowledge of data warehousing concepts and best practices
- Experience in partitioning strategies, workload management, and system security
- Ability to troubleshoot and resolve performance bottlenecks
- Collaborate effectively with business analysts to understand data requirements
- Create and maintain detailed technical documentation

Who Should Apply:
- Only candidates who can join within 1 month
- Professionals with strong Teradata knowledge and solid data warehousing fundamentals
- Immediate joiners based in, or willing to relocate to, Bangalore

Interested? Apply now or email vijay.s@xebia.com. Let's build data-driven solutions together!

#Teradata #BangaloreJobs #ImmediateJoiners #HiringNow #ETL #SQL #DataEngineering #DataWarehouse
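The star/snowflake modeling this posting calls for can be illustrated with a minimal, self-contained sketch. All table names, columns, and data below are invented for illustration, and SQLite is used here for portability rather than Teradata:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimension tables,
# the shape of model the posting's "star/snowflake schema" refers to.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_id INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL
);
INSERT INTO dim_date VALUES (1, 2024), (2, 2025);
INSERT INTO dim_product VALUES (10, 'widget'), (20, 'gadget');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (2, 10, 150.0), (2, 20, 75.0);
""")

# Typical star-schema query: aggregate the fact table, grouped by
# dimension attributes reached through surrogate-key joins.
rows = cur.execute("""
    SELECT d.year, p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.year, p.name
    ORDER BY d.year, p.name
""").fetchall()
print(rows)  # [(2024, 'widget', 100.0), (2025, 'gadget', 75.0), (2025, 'widget', 150.0)]
```

In a Teradata environment the same query shape applies, with performance tuning then revolving around primary-index choice and partitioning of the fact table.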

Posted 2 weeks ago

Apply

0 years

0 Lacs

Khairatabad, Telangana, India

On-site

Source: LinkedIn

Location: IN - Hyderabad, Telangana
Goodyear Talent Acquisition Representative: Maria Monica Canding
Sponsorship Available: No
Relocation Assistance Available: No

Job Responsibilities:
- Design and build data products, legal data layers, data streams, algorithms, and reporting systems (e.g., dashboards, front ends).
- Ensure the correct design of solutions, performance, and scalability while considering appropriate cost control.
- Link data product design with DevOps and infrastructure.
- Act as a reference within and outside the Analytics team.
- Serve as a technical partner to Data Engineers regarding digital product implementation.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Management Information Systems, or a related discipline, or 10 or more years of experience in Information Technology in lieu of a degree.
- 5 or more years of experience in Information Technology.
- In-depth understanding of database structure principles.
- Experience gathering and analyzing system requirements.
- Knowledge of data mining and segmentation techniques.
- Expertise in SQL and Oracle.
- Familiarity with data visualization tools (e.g., Tableau, Cognos, SAP Analytics Cloud).
- Proven analytical skills and a problem-solving attitude.
- Proven ability to work with distributed systems.
- Ability to develop creative solutions to problems.
- Knowledge and strong skills with SQL and NoSQL databases and applications such as Teradata, Redshift, MongoDB, or equivalent.

Goodyear is an Equal Employment Opportunity and Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to that individual's race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender identity, age, physical or mental disability, ethnicity, citizenship, or any other characteristic protected by law.

Goodyear is one of the world's largest tire companies. It employs about 74,000 people and manufactures its products in 57 facilities in 23 countries around the world. Its two Innovation Centers in Akron, Ohio and Colmar-Berg, Luxembourg strive to develop state-of-the-art products and services that set the technology and performance standard for the industry. For more information about Goodyear and its products, go to www.goodyear.com/corporate.

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:
We are seeking a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate should be adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with:
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test, and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 4+ years of relevant experience, preferably in R&D-based teams.
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices.
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation).
- Hands-on experience with Java, JSP, and related areas.
- Proficiency with Docker and Unix or Linux platforms.
- Experience with the Spring Framework or Spring Boot is a plus.
- Expertise in designing and deploying scalable solutions in public cloud environments.
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies.
- Familiarity with software configuration management, defect tracking, and peer review tools.
- Excellent debugging skills to troubleshoot and resolve issues effectively.
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation.

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline.
- Practical experience in development and support structures.
- Knowledge of cloud environments, particularly AWS.
- Proficiency in SQL.

Why We Think You'll Love Teradata:
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Greetings from TCS! TCS is hiring for Informatica PowerCenter - Teradata.

Desired Experience Range: 4 to 8 Years
Job Location: Pune/Mumbai/Chennai

Must-Have:
1. Informatica PowerCenter
2. DB experience
3. SQL
4. Unix commands
5. PL/SQL
6. Teradata
7. Oracle

Good-to-Have:
1. Insurance domain knowledge
2. Good communication skills
3. Ability to work individually
4. Good team player/multi-tasking
5. Ability to adapt to new technologies

Thank You

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Source: LinkedIn

Primary:
- Technical data analysis for the business requirement
- Data platform solution design of the data flow from source
- Mapping of source data elements to the target table
- SQL (Teradata, Oracle, DB2, MSSQL, DAS)
- ETL/EDW/Informatica
- Data lake/Azure Data Warehouse architecture
- Data modelling/data architecture

Secondary:
- Banking domain experience

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana

On-site

Source: Indeed

Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002011
Department: Marketing / Loyalty / Mileage Plus / Alliances
Job Type: Full-Time
Posted Date: 06/02/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
United's Kinective Media Data Engineering team designs, develops, and maintains massively scaling ad-technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose, "Connecting people. Uniting the world.", drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.

Job overview and responsibilities
The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial projects with a digital focus. The Data Engineer will partner with various teams to define and execute data acquisition, transformation, and processing, and will make data actionable for operational and analytics initiatives that create sustainable revenue and share growth.

- Execute unit tests and validate expected results to ensure the accuracy and integrity of data and applications through analysis, coding, clear documentation, and problem resolution.
- Drive the adoption of data processing and analysis within the AWS environment and help cross-train other members of the team.
- Leverage strategic and analytical skills to understand and solve customer- and business-centric questions.
- Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners.
- Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business.
- Develop and implement innovative solutions leading to automation.
- Use Agile methodologies to manage projects.
- Mentor and train junior engineers.

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications

Required:
- BS/BA in computer science or a related STEM field
- 2+ years of IT experience in software development
- 2+ years of development experience using Java, Python, or Scala
- 2+ years of experience with Big Data technologies like PySpark, Hadoop, Hive, HBase, Kafka, NiFi
- 2+ years of experience with database systems like Redshift, MS SQL Server, Oracle, Teradata
- Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights
- Individuals who have a natural curiosity and desire to solve problems are encouraged to apply
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

Preferred:
- Master's in computer science or a related STEM field
- Experience with cloud-based systems like AWS, Azure, or Google Cloud
- Certified Developer/Architect on AWS
- Strong experience with continuous integration and delivery using Agile methodologies
- Data engineering experience in the transportation/airline industry
- Strong problem-solving skills
- Strong knowledge of Big Data

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Karnataka, India

On-site

Source: LinkedIn

Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:
We are seeking a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate should be adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with:
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test, and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 2+ years of relevant experience, preferably in R&D-based teams.
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices.
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation).
- Hands-on experience with Java, JSP, and related areas.
- Proficiency with Docker and Unix or Linux platforms.
- Experience with the Spring Framework or Spring Boot is a plus.
- Expertise in designing and deploying scalable solutions in public cloud environments.
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies.
- Familiarity with software configuration management, defect tracking, and peer review tools.
- Excellent debugging skills to troubleshoot and resolve issues effectively.
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation.

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline.
- Practical experience in development and support structures.
- Knowledge of cloud environments, particularly AWS.
- Proficiency in SQL.

Why We Think You'll Love Teradata:
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

TCS is hiring for Teradata Developer

Location: Chennai, Bangalore, Pune, Gurgaon, Hyderabad
Years of Experience: 7-10 years (precise)
Notice Period: 0-30 days (precise)

Responsibilities:
- Hands-on experience writing Teradata SQL.
- Should be an expert in Teradata utilities.
- Should be good in Unix shell scripting.
- Sound knowledge of the financial services logical data model (Teradata FSLDM).
- Work closely with business users to convert business requirements into technical requirements.
- Should be able to create mapping documents for FSLDM and downstream data marts.
- Must have worked with ETL, preferably in DataStage.
- Prepare test cases for unit testing/SIT/regression.
- Design and document development standards.
- Should have knowledge of data warehouse concepts.
- Hands-on experience with the Control-M scheduling tool.
- Should be able to take ownership and deliver independently.

Kindly share your updated CV if it matches the above requirements.

Thanks & Regards,
Shilpa Silonee
BFSI A&I TAG Team

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Evernorth:
Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable, and simple health care, we solve the problems others don't, won't, or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview:
Work with data governance products to assist Product Owners in guiding functionality and content development with our business and engineering partners. Work daily with our delivery team partners to build trust in data by ensuring quality and accuracy across the organization. Work daily with partners to understand use cases and the quality checks necessary to govern the data. Work daily with our data stewards and data owners to onboard them and train them on data governance platforms and concepts.

Qualifications

Required Skills:
- Ability to lead functional development and tool onboarding meetings, documenting decisions and issues
- Ability to translate business data governance requirements into technical requirements to meet multiple business needs
- Strong analytical, creative problem-solving, and process management skills
- Ability to work with various data assets and provide insight into the data
- Ability to plan and prioritize own work effectively to achieve set end results
- Excellent attention to detail
- Excellent oral, written, and presentation communication skills
- Ability to accept ambiguity; openness to new ideas; willingness to test, fail, and attempt new processes
- Ability to influence peers and subordinates to modify behaviors and provide support for the implementation and adoption of data governance and metadata strategy and practices
- Ability to work independently and as part of a team, with a positive attitude toward team goals
- Ability to deal with multiple time zones
- Proactive, flexible attitude with a desire to learn new skills in the growing field of data governance
- Agile experience a plus

Required Experience & Education:
- 8-11 years of experience
- Experience (3-4+ years of working experience) with data governance constructs, including technical and business metadata
- Understanding of structured and semi-structured data; data quality knowledge
- Strong ability to drive connections with a wide variety of data platforms, including AWS (Databricks, Teradata, Snowflake, Kafka), Azure, DB2, and SQL Server
- Knowledge of firewalls is a plus
- Experience using a BI reporting tool is a plus
- Experience with Collibra data governance products, healthcare industry data, and/or data governance regulations and controls is a plus

Tooling:
Excel; Agile tools such as Jira or Rally; SQL; Tableau and other reporting tools. Experience with data governance tools (Collibra DQ and Collibra Data Intelligence), Alation, or similar is desired.

Location & Hours of Work:
Full-time position, working 40 hours per week, with expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement:
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform, and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services:
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care, and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Company Description:
Seosaph-infotech is a rapidly growing company in customized software development, providing advanced technology solutions and trusted services across multiple business verticals. In just two years, Seosaph-infotech has delivered exceptional solutions to industries such as finance, healthcare, and e-commerce, establishing itself as a reliable IT partner for businesses seeking to enhance their technological capabilities.

Responsibilities:
- Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse, or other cloud data warehousing technologies.
- Govern data design/modelling documentation of metadata (business definitions of entities and attributes) and the construction of database objects, for baseline and investment-funded projects, as assigned.
- Develop a deep understanding of business domains like Customer, Sales, Finance, and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Drive collaborative reviews of data model design, code, data, and security features to drive data product development.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; the SAP data model.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
- Partner with the data stewards team for data discovery and action by business customers and stakeholders.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Support data lineage and mapping of source system data to canonical data stores.
- Create Source-to-Target Mappings (STTM) for ETL and BI.

Required:
- Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models, CPG domains).
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and with deployment and CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and of retail data like IRI and Nielsen.

Location: Remote. (ref:hirist.tech)

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:
We are seeking a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate should be adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with:
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test, and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 2+ years of relevant experience, preferably in R&D-based teams.
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices.
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation).
- Hands-on experience with Java, JSP, and related areas.
- Proficiency with Docker and Unix or Linux platforms.
- Experience with the Spring Framework or Spring Boot is a plus.
- Expertise in designing and deploying scalable solutions in public cloud environments.
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies.
- Familiarity with software configuration management, defect tracking, and peer review tools.
- Excellent debugging skills to troubleshoot and resolve issues effectively.
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation.

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline.
- Practical experience in development and support structures.
- Knowledge of cloud environments, particularly AWS.
- Proficiency in SQL.

Why We Think You'll Love Teradata:
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 2 weeks ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Amgen Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. About The Role Role Description: We are seeking a Reference Data Management Senior Analyst who as the Reference Data Product team member of the Enterprise Data Management organization, will be responsible for managing and promoting the use of reference data, partnering with business Subject Mater Experts on creation of vocabularies / taxonomies and ontologies, and developing analytic solutions using semantic technologies . Roles & Responsibilities: Work with Reference Data Product Owner, external resources and other engineers as part of the product team Develop and maintain semantically appropriate concepts Identify and address conceptual gaps in both content and taxonomy Maintain ontology source vocabularies for new or edited codes Support product teams to help them leverage taxonomic solutions Analyze the data from public/internal datasets. Develop a Data Model/schema for taxonomy. Create a taxonomy in Semaphore Ontology Editor. Perform Bulk-import data templates into Semaphore to add/update terms in taxonomies. Prepare SPARQL queries to generate adhoc reports. Perform Gap Analysis on current and updated data Maintain taxonomies in Semaphore through Change Management process. 
Develop and optimize automated data ingestion pipelines through Python/PySpark when APIs are available Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs Identify and resolve complex data-related challenges Participate in sprint planning meetings and provide estimations on technical implementation. Basic Qualifications and Experience: Master’s degree with 6 years of experience in Business, Engineering, IT or related field OR Bachelor’s degree with 8 years of experience in Business, Engineering, IT or related field OR Diploma with 9+ years of experience in Business, Engineering, IT or related field Functional Skills: Must-Have Skills: Knowledge of controlled vocabularies, classification, ontology and taxonomy Experience in ontology development using Semaphore, or a similar tool Hands-on experience writing SPARQL queries on graph data Excellent problem-solving skills and the ability to work with large, complex datasets Understanding of data modeling, data warehousing, and data integration concepts Good-to-Have Skills: Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience using cloud services such as AWS, Azure, or GCP Experience working in a Product Teams environment Knowledge of Python/R, Databricks, cloud data platforms Knowledge of NLP (Natural Language Processing) and AI (Artificial Intelligence) for extracting and standardizing controlled vocabularies. Strong understanding of data governance frameworks, tools, and best practices Professional Certifications: Databricks Certificate preferred SAFe® Practitioner Certificate preferred Any Data Analysis certification (SQL, Python) Any cloud certification (AWS or Azure) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions.
Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 weeks ago


7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Overview At ReKnew, our mission is to empower enterprises to revitalize their core business and organization by positioning themselves for the new world of AI. We're a startup founded by seasoned practitioners, supported by expert advisors, and built on decades of experience in enterprise technology, data, analytics, AI, digital, and automation across diverse industries. We're actively seeking top talent to join us in this mission. Job Description We're seeking a highly skilled Senior Data Engineer with deep expertise in AWS-based data solutions. In this role, you'll be responsible for designing, building, and optimizing large-scale data pipelines and frameworks that power analytics and machine learning workloads. You'll lead the modernization of legacy systems by migrating workloads from platforms like Teradata to AWS-native big data environments such as EMR, Glue, and Redshift. A strong emphasis is placed on reusability, automation, observability, performance optimization, and managing schema evolution in dynamic data lake environments . Key Responsibilities Migration & Modernization: Build reusable accelerators and frameworks to migrate data from legacy platforms (e.g., Teradata) to AWS-native architectures such as EMR, Glue, and Redshift. Data Pipeline Development: Design and implement robust ETL/ELT pipelines using Python, PySpark, and SQL on AWS big data platforms. Code Quality & Testing: Drive development standards with test-driven development (TDD), unit testing, and automated validation of data pipelines. Monitoring & Observability: Build operational tooling and dashboards for pipeline observability, including tracking key metrics like latency, throughput, data quality, and cost. Cloud-Native Engineering: Architect scalable, secure data workflows using AWS services such as Glue, Lambda, Step Functions, S3, and Athena. 
Collaboration: Partner with internal product teams, data scientists, and external stakeholders to clarify requirements and drive solutions aligned with business goals. Architecture & Integration: Work with enterprise architects to evolve data architecture while securely integrating AWS systems with on-premise or hybrid environments. This includes strategic adoption of data lake table formats like Delta Lake, Apache Iceberg, or Apache Hudi for schema management and ACID capabilities. ML Support & Experimentation: Enable data scientists to operationalize machine learning models by providing clean, well-governed datasets at scale. Documentation & Enablement: Document solutions thoroughly and provide technical guidance and knowledge sharing to internal engineering teams. Team Training & Mentoring: Act as a subject matter expert, providing guidance, training, and mentorship to junior and mid-level data engineers, fostering a culture of continuous learning and best practices within the team. Qualifications Experience: 7+ years in technology roles, with at least 5+ years specifically in data engineering, software development, and distributed systems. Programming: Expert in Python and PySpark (Scala is a plus). Deep understanding of software engineering best practices. AWS Expertise: 3+ years of hands-on experience in the AWS data ecosystem. Proficient in AWS Glue, S3, Redshift, EMR, Athena, Step Functions, and Lambda. Experience with AWS Lake Formation and data cataloging tools is a plus. AWS Data Analytics or Solutions Architect certification is a strong plus. Big Data & MPP Systems: Strong grasp of distributed data processing. Experience with MPP data warehouses like Redshift, Snowflake, or Databricks on AWS. Hands-on experience with Delta Lake, Apache Iceberg, or Apache Hudi for building reliable data lakes with schema evolution, ACID transactions, and time travel capabilities. 
DevOps & Tooling: Experience with version control (e.g., GitHub/CodeCommit) and CI/CD tools (e.g., CodePipeline, Jenkins). Familiarity with containerization and deployment in Kubernetes or ECS. Data Quality & Governance: Experience with data profiling, data lineage, and relevant tools. Understanding of metadata management and data security best practices. Bonus: Experience supporting machine learning or data science workflows. Familiarity with BI tools such as QuickSight, PowerBI, or Tableau.
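The posting above emphasizes managing schema evolution in data lake table formats such as Delta Lake, Apache Iceberg, or Apache Hudi. As a conceptual sketch only (plain Python illustrating the additive-evolution rule those formats enforce natively, not the actual Delta/Iceberg API), schema merging behaves roughly like this:

```python
# Conceptual sketch of additive schema evolution, the behavior that
# table formats like Delta Lake or Iceberg provide natively. Plain
# Python for illustration; this is not the Delta/Iceberg API.

def evolve_schema(table_schema: dict, incoming: dict) -> dict:
    """Merge an incoming batch's schema into the table schema.

    New columns are appended; existing columns must keep their type,
    otherwise the batch is rejected (no implicit type changes).
    """
    merged = dict(table_schema)
    for column, dtype in incoming.items():
        if column in merged:
            if merged[column] != dtype:
                raise ValueError(
                    f"type conflict on {column!r}: {merged[column]} vs {dtype}"
                )
        else:
            merged[column] = dtype  # additive evolution: new column allowed
    return merged

table = {"id": "bigint", "name": "string"}
batch = {"id": "bigint", "name": "string", "loaded_at": "timestamp"}
print(evolve_schema(table, batch))
```

In Delta Lake this rule corresponds to writing with schema merging enabled, while a type conflict fails the write; the sketch captures only that contract.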

Posted 2 weeks ago


8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office


What you'll do DocuSign is seeking a talented and results-oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, develop solutions and load tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive can-do attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow -- a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This position is an individual contributor role reporting to the Manager, Data Engineering. Responsibilities Design, develop and maintain scalable and efficient data pipelines Analyze and develop data quality and validation procedures Work with stakeholders to understand the data requirements and provide solutions Troubleshoot and resolve data issues in a timely manner Learn and leverage available AI tools for increased developer productivity Collaborate with cross-functional teams to ingest data from various sources Evaluate and improve data architecture and processes continuously Own, monitor, and improve solutions to ensure SLAs are met Develop and maintain documentation for data infrastructure and processes Execute projects using Agile Scrum methodologies and be a team player Job Designation Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required.
(Frequency: Minimum 2 days per week; may vary by team but will be weekly in-office expectation) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law. What you bring Basic Bachelor's Degree in Computer Science, Data Analytics, Information Systems, etc Experience developing data pipelines in one of the following languages: Python or Java 8+ years dimensional and relational data modeling experience Excellent SQL and database management skills Preferred 8+ years in data warehouse engineering (OLAP) Snowflake, BigQuery, Teradata 8+ years with transactional databases (OLTP) Oracle, SQL Server, MySQL 8+ years with big data, Hadoop, Data Lake, Spark in a cloud environment (AWS) 8+ years with commercial ETL tools (dbt, Matillion, etc.) 8+ years delivering ETL solutions from source systems, databases, APIs, flat-files, JSON Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS) Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot. Good understanding of Gen AI application frameworks.
Knowledge on any agentic platforms Experience building BI Dashboards with tools like Tableau Experience in the financial domain, master data management (MDM), sales and marketing, accounts payable, accounts receivable, invoicing Experience managing work assignments using tools like Jira and Confluence Experience with Scrum/Agile methodologies Ability to work independently and as part of a team Excellent analytical, problem-solving and communication skills Life at Docusign Working here Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live. Accommodation Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. Please contact us for assistance.
Applicant and Candidate Privacy Notice #LI-Hybrid #LI-SA4
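The DocuSign role above centers on loading tested data sets into the Snowflake Enterprise Data Warehouse so that re-runs meet SLAs without corrupting data. As a hedged sketch of one such idempotent load step, with Python's built-in sqlite3 standing in for Snowflake and the table and column names invented for illustration, an upsert-style load might look like:

```python
# Idempotent load sketch: sqlite3 stands in for the warehouse, and
# INSERT ... ON CONFLICT mimics a MERGE/upsert so re-running the
# pipeline updates rows instead of duplicating them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT)"
)

def load(rows):
    """Upsert (customer_id, name) rows; safe to re-run on the same batch."""
    conn.executemany(
        """INSERT INTO dim_customer (customer_id, name) VALUES (?, ?)
           ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name""",
        rows,
    )
    conn.commit()

load([(1, "Acme"), (2, "Globex")])
load([(1, "Acme Corp"), (2, "Globex")])  # re-run: updates, no duplicates

count, = conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()
name, = conn.execute(
    "SELECT name FROM dim_customer WHERE customer_id = 1"
).fetchone()
print(count, name)
```

In Snowflake the equivalent would be a MERGE statement (or dbt incremental model) keyed on the same business key; the sqlite form only illustrates the idempotency contract.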

Posted 2 weeks ago


5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office


What you'll do Docusign is seeking a talented and results-oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, develop solutions and load tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive can-do attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow -- a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This position is an individual contributor role reporting to the Manager, Data Engineering. Responsibilities Design, develop and maintain scalable and efficient data pipelines Analyze and develop data quality and validation procedures. Work with stakeholders to understand the data requirements and provide solutions Troubleshoot and resolve data issues in a timely manner Learn and leverage available AI tools for increased developer productivity Collaborate with cross-functional teams to ingest data from various sources Evaluate and improve data architecture and processes continuously Own, monitor, and improve solutions to ensure SLAs are met Develop and maintain documentation for data infrastructure and processes Execute projects using Agile Scrum methodologies and be a team player Job Designation Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required.
(Frequency: Minimum 2 days per week; may vary by team but will be weekly in-office expectation) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law. What you bring Basic Bachelor's Degree in Computer Science, Data Analytics, Information Systems, etc Experience developing data pipelines in one of the following languages: Python or Java 5+ years dimensional and relational data modeling experience Excellent SQL and database management skills Preferred 5+ years in data warehouse engineering (OLAP) Snowflake, BigQuery, Teradata, Redshift 5+ years with transactional databases (OLTP) Oracle, SQL Server, MySQL 5+ years with big data, Hadoop, Data Lake, Spark in a cloud environment (AWS) 5+ years with commercial ETL tools (dbt, Matillion, etc.) 5+ years delivering ETL solutions from source systems, databases, APIs, flat-files, JSON Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS) Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot. Good understanding of Gen AI application frameworks.
Knowledge on any agentic platforms Experience building BI Dashboards with tools like Tableau Experience in the financial domain, sales and marketing, accounts payable, accounts receivable, invoicing Experience managing work assignments using tools like Jira and Confluence Experience with Scrum/Agile methodologies Ability to work independently and as part of a team Excellent analytical, problem-solving and communication skills Life at Docusign Working here Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live. Accommodation Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. Please contact us for assistance.
Applicant and Candidate Privacy Notice #LI-Hybrid #LI-SA4

Posted 2 weeks ago


4.0 - 9.0 years

4 - 7 Lacs

Gurugram

Work from Office


Job Overview: We are seeking a dynamic Consultant to join our data and analytics team, delivering innovative solutions with a focus on the life sciences industry. The ideal candidate will bring current, hands-on expertise in data warehousing (Snowflake, Redshift, Databricks or similar), master data management (MDM), and report development (Power BI, Tableau, Sigma or similar), leveraging cloud platforms (AWS, Azure, GCP). This role involves leading a small team of 2-3 developers, actively contributing to technical delivery, and engaging with clients in an onshore/offshore model. We are particularly excited to find someone passionate about applying Generative AI (Gen AI) to transform the life sciences space, with a preferred understanding of healthcare concepts and data. Key Responsibilities: Hands-On Technical Delivery: Actively design, develop, and optimize data warehouse solutions using Snowflake, Redshift, and Databricks, ensuring high performance and scalability. Reporting & Visualization: Build and refine advanced dashboards and reports using Power BI, Tableau, Sigma, and web-based platforms to deliver actionable insights. Cloud Expertise: Implement and manage data solutions on AWS, Azure, and GCP, maintaining cutting-edge technical proficiency. Master Data Management: Execute MDM processes to ensure data quality, governance, and integration, with a focus on life sciences applications. Team Leadership: Lead and mentor a small team of 2-3 developers, providing technical guidance, code reviews, and workload coordination. Gen AI Exploration: Drive the application of Generative AI techniques to solve challenges in the life sciences domain, such as drug discovery, patient analytics, or personalized medicine. Client Collaboration: Work closely with clients to gather requirements, propose solutions, and deliver results that align with business and scientific objectives.
Project Contribution: Support project execution within Agile frameworks, collaborating with onshore and offshore teams to meet deadlines. Innovation: Contribute to internal initiatives, particularly those exploring Gen AI and healthcare-focused analytics. Key Qualifications: Technical Skills: 4+ years in data engineering, analytics, or a related technical field, with at least 2 years in a leadership or managerial role. Strong proficiency in technologies like Redshift, Teradata, Databricks, Snowflake, or similar solutions Experience handling huge volumes of data and setting up large-scale solutions using tools like Airflow, Airbyte, dbt, etc. Proficiency with cloud platforms like AWS, Azure, or Google Cloud, including services like S3, EC2, and Lambda. Proficiency with database technologies such as MySQL, PostgreSQL, SnowSQL, etc. Familiarity with back-end technologies like Node.js, Python (Django/Flask), Ruby on Rails, or Java (Spring Boot). Familiarity with front-end technologies such as HTML5, CSS3, JavaScript, and frameworks like React.js, Angular, or Vue.js. Experience with API design and development (RESTful and/or GraphQL). Knowledge of version control systems like Git and collaboration platforms such as GitHub, GitLab, or Bitbucket. Experience working with US-based pharma clients and datasets would be preferred. Leadership and Management: Strong leadership skills with experience in building and managing technical teams. Excellent problem-solving abilities and a strategic mindset. Strong knowledge of master data management principles and practices Excellent project management skills, with experience in Agile/Scrum frameworks What We Offer: Opportunity to work on transformative healthcare projects. A collaborative and inclusive work environment. Competitive salary, performance-based bonuses, and professional development opportunities. Work on cutting-edge cloud and Gen AI solutions

Posted 2 weeks ago


5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Issue Remediation Senior Analyst – C11 About us: Analytics & Information Management (AIM) is a global community that is driving data-driven transformation across Citi in multiple functions with the objective to create actionable intelligence for our business leaders. We are a fast-growing organization working with Citi businesses and functions across the world. Remediation & Remuneration COE The Remediation team is responsible for cross-functional coordination of customer-facing remediation efforts. It provides oversight, prioritization, and scheduling of remediation activities with remediation partner teams including Technology, FSO, Analytics groups, Shared Services (mail vendor) and Controllers. The R&R AIM Team works as the data analytics partner for the Issue Remediation Business Team. Job responsibilities: The R&R team manages the analysis of customer remediation issues across the globe, currently in the retail consumer bank. The critical areas of work are divided into: Remediation analysis: Execution of the comprehensive data remediation approach on customer issues due to gaps observed in policies and governance, self-identified, or through IA. Impact assessment: Identification of the size of the customer population and the dollar amount impacted due to these issues. Issue Management & Root cause analysis: Identifying the issues and reasons for the issues by leveraging analytical methods. Audit Support: Tracking implementation plans and providing data evidence and artifacts for audit completion Expertise Required: Tools and Platforms Proficient in SAS, SQL, RDBMS, Teradata, Unix Proficient in MS Excel, PowerPoint, and VBA Jira, Bitbucket Mainframes Exposure to Big data, Python Domain Skills Good understanding of banking domain and consumer products (Retail Banking, Deposit, Loans, Wealth management, Mortgage, Insurance, etc.)
(Preferred) Knowledge of Finance Regulations, Understanding of Retail Business/Banking Domain Analytical Skills Ability to identify, clearly articulate and solve complex business problems and present them to the management in a structured and simpler form Data analysis, Data profiling, Data Management skills MIS reporting and generating actionable Business Insights Coming up with automated techniques to reduce redundancy, remove false positives and enhance optimization Identification of control gaps and providing recommendations as per data strategy (Preferred) - Risk & control Metrics & Audit Framework Exposure Interpersonal Skills Should have excellent communication and interpersonal skills Good process/project management skills Ability to work well across multiple functional areas Ability to thrive in a dynamic and fast-paced environment Identification and implementation of new collaboration ideas Contribute to organizational initiatives in wide-ranging areas including competency development, training, organizational building activities, etc.
Proactive approach to solving problems and an eye for detail; identifying process gaps in solution implementation and suggesting alternatives Other Info: Education Level: Master’s / Advanced Degree in Information Technology/ Computer Applications/ Engineering/ MBA from a premier institute Overall experience of 5-8 years, with at least 2 years of experience in the Banking Industry delivering data solutions Job Category: Decision Management Schedule: Full-time Shift: Regular Local Working Hours (aligned with NAM working hours) ------------------------------------------------------ Job Family Group: Decision Management ------------------------------------------------------ Job Family: Data/Information Management ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi”) invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting
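The Citi role above describes impact assessment: sizing the impacted customer population and the dollar amount tied to a remediation issue. As a hedged, minimal sketch (pure Python with hypothetical flagged records; the real work would run in SAS/SQL against Teradata), the aggregation amounts to:

```python
# Impact-assessment sketch: given records flagged by a remediation
# issue, size the distinct customer population and total dollar impact.
# Records are hypothetical; production analysis would use SAS/SQL.

def impact_assessment(records):
    """records: iterable of (customer_id, amount) pairs flagged by an issue."""
    customers = set()
    total = 0.0
    for customer_id, amount in records:
        customers.add(customer_id)  # distinct customers, not row count
        total += amount
    return {"impacted_customers": len(customers), "dollar_impact": round(total, 2)}

flagged = [("C001", 12.50), ("C002", 3.75), ("C001", 8.00)]
print(impact_assessment(flagged))
```

The distinct-customer count versus raw row count distinction is the usual pitfall here, since one customer can be hit by the same issue on several accounts or transactions.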

Posted 2 weeks ago


3.0 - 8.0 years

13 - 14 Lacs

Hyderabad, Chennai

Work from Office


Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and encouraging team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We develop a growing internal community and are committed to creating a workplace that looks like the world that we serve. Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension / Retirement benefits Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and overall well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee). The impact you will have in this role: The Data Quality and Integration role is a highly technical position - considered a technical expert in system implementation - with an emphasis on providing design, ETL, data quality and warehouse modeling expertise. This role will be accountable for knowledge of capital development efforts. Performs at an experienced level the technical design of application components, builds applications, interfaces between applications, and understands data security, retention, and recovery. Can research technologies independently and recommend appropriate solutions.
Contributes to technology-specific best-practice standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability and scalability; provides expertise on significant application components, vendor products, programming languages, databases, operating systems, etc.; completes the plan by building components, testing, configuring, tuning, and deploying solutions. The Software Engineer (SE) for Data Quality and Integration applies specific technical knowledge of data quality and data integration to assist in the design and construction of critical systems. The SE works as part of an AD project squad and may interact with the business, Functional Architects, and domain experts on related integrating systems. The SE will contribute to the design of components or individual programs and participate fully in construction and testing. This involves working with the Senior Application Architect and other technical contributors at all levels. This position contributes expertise to project teams through all phases, including post-deployment support. This means researching specific technologies and applications, contributing to the solution design, supporting development teams, testing, troubleshooting, and providing production support. The SE must possess experience in integrating large volumes of data efficiently and in a timely manner. This position requires working closely with the functional and governance functions and more senior technical resources, reviewing technical designs and specifications, and contributing to cost estimates and schedules. What You'll Do: Technology Expertise - is a domain expert on one or more of programming languages, vendor products (specifically Informatica Data Quality and Informatica Data Integration Hub), DTCC applications, data structures, and business lines.
Platforms - works with Infrastructure partners to stand up development, testing, and production environments. Requirements Elaboration - works with the Functional Architect to ensure designs satisfy functional requirements. Data Modeling - reviews and extends data models. Data Quality Concepts - experience in Data Profiling, Scorecards, Monitoring, Matching, Cleansing. Is aware of frameworks that promote concepts of isolation, extensibility, and extendibility. System Performance - contributes to solutions that satisfy performance requirements; constructs test cases and strategies that account for performance requirements; tunes application performance issues. Security - implements solutions and completes test plans, mentoring other team members in standard practices. Standards - is aware of technology standards and understands that technical solutions need to be consistent with them. Documentation - develops and maintains system documentation. Is familiar with different software development methodologies (Waterfall, Agile, Scrum, Kanban). Aligns risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalates appropriately. Educational background and work experience that include mathematics and conversion of expressions into run-time executable code. Ensures own and team's practices support success across all geographic locations. Mitigates risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior. Helps roll out standards and policies to other team members. Financial Industry Experience, including Trades, Clearing, and Settlement. Education: Bachelor's degree or equivalent experience. Minimum of 3+ years in Data Quality and Integration.
Basic understanding of Logical Data Modeling and Database design is a plus. Technical experience with multiple database platforms: Sybase, Oracle, DB2, and distributed databases like Teradata / Greenplum / Redshift / Snowflake containing high volumes of data. Knowledge of data management processes and standard methodologies preferred. Proficiency with Microsoft Office tools required. Supports the team in managing client expectations and resolving issues on time. Strong technical skills highly preferred, along with strong analytical skills. Actual salary is determined based on the role, location, individual experience, skills, and other considerations. Please contact us to request accommodation.
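The data quality concepts this posting names (profiling, matching, cleansing) can be sketched in a few lines. This is a generic, hand-rolled illustration, not Informatica Data Quality code; the record fields, sample values, and function names are hypothetical.

```python
from collections import Counter

# Hypothetical sample records; in practice these would come from a database extract.
records = [
    {"trade_id": "T1", "cusip": "037833100", "amount": 100.0},
    {"trade_id": "T2", "cusip": None,        "amount": 250.5},
    {"trade_id": "T3", "cusip": "037833100", "amount": 250.5},
    {"trade_id": "T3", "cusip": "594918104", "amount": 75.0},   # duplicate key
]

def profile(rows, column):
    """Basic profiling: completeness (non-null rate) and cardinality for one column."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "completeness": len(non_null) / len(values),
        "distinct": len(set(non_null)),
    }

def duplicate_keys(rows, key):
    """Matching step: flag key values that occur more than once."""
    counts = Counter(r[key] for r in rows)
    return sorted(k for k, c in counts.items() if c > 1)

cusip_profile = profile(records, "cusip")   # {'completeness': 0.75, 'distinct': 2}
dupes = duplicate_keys(records, "trade_id") # ['T3']
```

A real data quality tool wraps the same measurements in scorecards and monitoring over time; the underlying checks are this simple.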

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

SUMMARY Job Role: Snowflake Data Engineering Professional Location: Bangalore Experience: The ideal candidate should possess at least 8 years of experience in Snowflake with a focus on Data Engineering. Primary Skills: Proficiency in Snowflake, DBT, and AWS. Good-to-have Skills: Familiarity with Fivetran (HVR) and Python. Responsibilities: Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize and troubleshoot existing data workflows to ensure efficiency and reliability. Implement best practices for data management and governance. Stay updated with the latest industry trends and technologies to continuously improve the data infrastructure. Required Skills: Strong experience in data modeling, ETL processes, and data warehousing. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Skills: Knowledge of Fivetran (HVR) and Python. Familiarity with data integration tools and techniques. Ability to work in a fast-paced and agile environment. Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. 8 years of relevant experience in Snowflake with Data Engineering. Proficiency in Snowflake, DBT, and AWS. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities.
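The incremental-pipeline pattern such DBT roles revolve around can be sketched outside DBT itself. This hand-rolled illustration uses SQLite in place of Snowflake, and all table and column names are hypothetical; a DBT incremental model's is_incremental() filter generates comparable SQL against the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, loaded_at INTEGER, val TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, loaded_at INTEGER, val TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?, ?)",
                 [(1, 100, "a"), (2, 101, "b"), (3, 102, "c")])

def incremental_load(conn):
    """Insert only source rows newer than the target's high-water mark,
    mirroring what a DBT incremental model's WHERE filter does."""
    (max_ts,) = conn.execute(
        "SELECT COALESCE(MAX(loaded_at), -1) FROM tgt").fetchone()
    conn.execute(
        "INSERT INTO tgt SELECT * FROM src WHERE loaded_at > ?", (max_ts,))
    conn.commit()

incremental_load(conn)          # first run loads all 3 rows
conn.execute("INSERT INTO src VALUES (4, 103, 'd')")
incremental_load(conn)          # second run picks up only the new row
count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]  # 4
```

The point of the pattern is that repeated runs stay cheap: only rows past the high-water mark are scanned in, which is also why DBT materializes such models with MERGE or INSERT rather than a full rebuild.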

Posted 2 weeks ago

Apply

0 years

0 Lacs

Thiruvananthapuram

On-site

GlassDoor logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS: Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires), China (Dalian), India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum), Philippines (Manila), Poland (Warsaw and Wroclaw), UK (Manchester, Liverpool). Careers in EY Global Delivery Services: Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams. From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines. Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact.
We are looking for highly motivated, articulate individuals who have the skills to navigate the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span across these technology areas/service lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end-to-end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience design, UI development, Design Thinking, Architecture & Design, Full stack development (.Net/ Java/ SharePoint/ Power Platform), and Emerging Technologies like Blockchain, IoT, AR/VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit cost proposition. Testing Services: We are the yardstick of quality for software products. We break something to make the product stronger and more successful. We provide the entire gamut of testing services, including Business / User acceptance testing. Hence this is a team with all-round skills such as functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI). Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM Products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformations.
We develop solutions using various languages such as SQL or PL/SQL, Java, JavaScript, Python, IBM Maximo and other Oracle utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services from strategy and architecture to production deployment. EY supports clients in three main areas: Technology implementation support, Enterprise and Industry application implementation, and Governance Risk Compliance (GRC) Technology. Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business / digital transformation like Deposit system replacements, lending / leasing modernization, Cloud-native architecture (Containerization) etc. Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency, effectiveness and manage the overall impact on bottom line profitability by leveraging the technology, data and digital teams.
We do many operational efficiency programs and Technology Enabled Transformation to re-platform their front and Back offices with emerging technologies like AI, ML, Blockchain etc. Insurance Transformation: The current changing Macroeconomic trends continue to challenge Insurers globally. However, with disruptive technologies – including IoT, autonomous vehicles, Blockchain etc, we help companies through these challenges and create innovative strategies to transform their business through technology enabled transformation programs. We provide end to end services to Global P&C (General), Life and Health Insurers, Reinsurers and Insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative – from day one. Technology Risk: A practice that is a unique, industry-focused business unit that provides a broad range of integrated services where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. 
Behavioral Competencies: Adapts to the team and fosters a collaborative approach. Brings an innovative approach to the project when required. Shows passion and curiosity, a desire to learn, and can think digitally. Agile mindset and ability to multi-task. Must have an eye for detail. Skills needed: Should have an understanding and/or experience of software development best practices and the software development life cycle. Understanding of one or more programming languages such as Java/ .Net/ Python, data analytics, or databases such as SQL/ Oracle/ Teradata, etc. An internship in a relevant technology domain will be an added advantage. Qualification: BE / B.Tech (IT/ Computer Science/ Circuit branches). Should have secured 60% and above. No active backlogs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

0 years

6 - 9 Lacs

Chennai

Remote

GlassDoor logo

Chennai, India Hyderabad, India Bangalore, India Job ID: R-1067512 Apply prior to the end date: August 8th, 2025 When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What you’ll be doing… We’re seeking a skilled Data Engineering Analyst to join our high-performing team and propel our telecom business forward. You’ll contribute to building cutting-edge data products and assets for our wireless and wireline operations, spanning areas like consumer analytics, network performance, and service assurance. In this role, you will develop deep expertise in various telecom domains. As part of the Data Architecture & Strategy team, you’ll collaborate closely with IT and business stakeholders to design and implement user-friendly, robust data product solutions. This includes incorporating data classification and governance principles. Your responsibilities encompass: Collaborating with stakeholders to understand data requirements and translating them into efficient data models. Designing, developing, and implementing data architecture solutions on GCP and Teradata to support our Telecom business. Designing data ingestion for both real-time and batch processing, ensuring efficient and scalable data acquisition for creating an effective data warehouse. Maintaining meticulous documentation, including data design specifications, functional test cases, data lineage, and other relevant artifacts for all data product solution assets.
Implementing data architecture standards, as set by the data architecture team. Proactively identifying opportunities for automation and performance optimization within your scope of work. Collaborating effectively within a product-oriented organization, providing data expertise and solutions across multiple business units. Cultivating strong cross-functional relationships and establishing yourself as a subject matter expert in data and analytics within the organization. What we’re looking for... You’re curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems. You’ll need to have… Bachelor’s degree with four or more years of work experience. Four or more years of relevant work experience. Expertise in building complex SQL to do data analysis, to understand and design data solutions. Experience with ETL, Data Warehouse concepts and the Data Management life cycle. Experience in creating technical documentation such as Source-to-Target mappings, Source contracts, SLAs, etc. Experience in any DBMS, preferably GCP/BigQuery. Experience in creating data models using the Erwin tool. Experience in shell scripting and Python. Understanding of git version control and basic git commands. Understanding of Data Quality concepts. Even better if you have one or more of the following… Certification as a GCP Data Engineer. Understanding of NoSQL databases like Cassandra, MongoDB, etc. Accuracy and attention to detail. Good problem solving, analytical, and research capabilities. Good verbal and written communication. Experience presenting to leaders and influencing stakeholders. #AI&D Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer.
We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics. Shaping the future. Connect with the best and brightest to help innovate and operate some of the world’s largest platforms and networks.

Posted 2 weeks ago

Apply

0 years

6 - 9 Lacs

Chennai

Remote

GlassDoor logo

Chennai, India Hyderabad, India Bangalore, India Job ID: R-1058889 Apply prior to the end date: August 8th, 2025 When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What you’ll be doing... We’re seeking a skilled Data Engineering Analyst to join our high-performing team and propel our telecom business forward. You’ll contribute to building cutting-edge data products and assets for our wireless and wireline operations, spanning areas like consumer analytics, network performance, and service assurance. In this role, you will develop deep expertise in various telecom domains. As part of the Data Architecture & Strategy team, you’ll collaborate closely with IT and business stakeholders to design and implement user-friendly, robust data product solutions. This includes incorporating data classification and governance principles. Your responsibilities encompass Collaborating with stakeholders to understand data requirements and translate them into efficient data models Designing, developing, and implementing data architecture solutions on GCP and Teradata to support our Telecom business. Designing data ingestion for both real-time and batch processing, ensuring efficient and scalable data acquisition for creating an effective data warehouse. 
Maintaining meticulous documentation, including data design specifications, functional test cases, data lineage, and other relevant artifacts for all data product solution assets. Implementing data architecture standards, as set by the data architecture team. Proactively identifying opportunities for automation and performance optimization within your scope of work. Collaborating effectively within a product-oriented organization, providing data expertise and solutions across multiple business units. Cultivating strong cross-functional relationships and establishing yourself as a subject matter expert in data and analytics within the organization. What we’re looking for... You’re curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems. You’ll need to have… Bachelor’s degree with four or more years of work experience. Four or more years of relevant work experience. Expertise in building complex SQL to do data analysis, to understand and design data solutions. Experience with ETL, Data Warehouse concepts and the Data Management life cycle. Experience in creating technical documentation such as Source-to-Target mappings, Source contracts, SLAs, etc. Experience in any DBMS, preferably GCP/BigQuery. Experience in creating data models using the Erwin tool. Experience in shell scripting and Python. Knowledge of git version control and basic git commands. Knowledge of Data Quality concepts. Even better if you have… Certification as a GCP Data Engineer. Understanding of NoSQL databases like Cassandra, MongoDB, etc. Accuracy and attention to detail. Good problem solving, analytical, and research capabilities. Good verbal and written communication. Experience presenting to leaders and influencing stakeholders. If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.
#AI&D Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics. Shaping the future. Connect with the best and brightest to help innovate and operate some of the world’s largest platforms and networks.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Spaulding Ridge is an advisory and IT implementation firm. We help global organizations get financial clarity into the complex, daily sales and operational decisions that impact profitable revenue generation, efficient operational performance, and reliable financial management. At Spaulding Ridge, we believe all business is personal. Core to our values are our relationships with our clients, our business partners, our team, and the global community. Our employees dedicate their time to helping our clients transform their business, from strategy through implementation and business transformation. What You Will Do And Learn: As a Data Architect / Manager in Data Solutions, you’ll be responsible for designing, implementing, and testing proposed modern analytic solutions. Working closely with our client partners and architects, you’ll develop relationships with key technical resources while delivering tangible business outcomes. Manage the data engineering lifecycle, including research, proof of concepts, architecture, design, development, test, deployment, and maintenance. Collaborate with team members to design and implement technology that aligns with client business objectives. Build proofs of concept for a modern analytics stack supporting a variety of cloud-based business systems for potential clients. Team management experience and the ability to manage, mentor, and develop the talent of assigned junior resources. Create actionable recommendations based on identified platform, structural, and/or logic problems. Communicate and demonstrate a clear understanding of client business needs, goals, and objectives. Collaborate with other architects on solution designs and recommendations.
Qualifications: 8+ years’ experience developing industry-leading business intelligence and analytic solutions. Must have thorough knowledge of data warehouse concepts and dimensional modelling. Must have experience in writing advanced SQL. Must have at least 5+ years of hands-on experience with DBT (Data Build Tool); the most recent hands-on experience must be on DBT. Must have experience working with DBT on one or more of the modern databases like Snowflake / Amazon Redshift / BigQuery / Databricks / etc. Hands-on experience with Snowflake would carry higher weightage. Snowflake SnowPro Core certification would carry higher weightage. Experience working in AWS, Azure, GCP or a similar cloud data platform would be an added advantage. Hands-on experience on Azure would carry higher weightage. Must have experience in setting up DBT projects. Must have experience in understanding / creating / modifying & optimizing YML files within DBT. Must have experience in implementing and managing data models using DBT, ensuring efficient and scalable data transformations. Must have experience with the various materialization techniques within DBT. Must have experience in writing and executing DBT test cases. Must have experience in setting up DBT environments. Must have experience in setting up DBT jobs. Must have experience with writing DBT Jinja and macros. Must have experience in creating DBT snapshots. Must have experience in creating and managing incremental models using DBT. Must have experience with DBT Docs. Should have a good understanding of DBT seeds. Must have experience with DBT deployment. Must have experience architecting data pipelines using DBT, utilizing advanced DBT features. Proficiency in version control systems and CI/CD. Must have hands-on experience configuring DBT with one or more version control systems like Azure DevOps / GitHub / GitLab / etc.
Must have experience in PR approval workflows. Participate in code reviews and best practices for SQL and DBT development. Experience working with visualization tools such as Tableau, PowerBI, Looker and other similar analytic tools would be an added advantage. 2+ years of Business Data Analyst experience. 2+ years of experience writing business requirements, use cases and/or user stories for data warehouse or data mart initiatives. Understanding and experience of ETL/ELT is an added advantage. 2+ years of consulting experience working on project-based delivery using the Software Development Life Cycle (SDLC). 2+ years of experience with relational databases (Postgres, MySQL, SQL Server, Oracle, Teradata, etc.). 2+ years of experience creating functional test cases and supporting user acceptance testing. 2+ years of experience in Agile/Kanban/DevOps delivery. Outstanding analytical, communication, and interpersonal skills. Ability to manage projects and teams against planned work. Responsible for managing the day-to-day client relationship on projects. Spaulding Ridge’s Commitment to an Inclusive Workplace: When we engage the expertise, insights, and creativity of people from all walks of life, we become a better organization, we deliver superior services to clients, and we transform our communities and world for the better. At Spaulding Ridge, we believe our team should reflect the rich diversity of society, and we take seriously the responsibility to cultivate a workplace where every bandmate feels accepted, respected, and valued for who they are. We do this by creating a culture of trust and belonging, through practices and policies that support inclusion, and through our employee-led Employee Resource Groups (ERGs): CRE (Cultural Race and Ethnicity), Women Elevate, PROUD and Mental Wellness Alliance. The company is committed to offering Equal Employment Opportunity and to providing reasonable accommodation to applicants with physical and/or mental disabilities.
If you are interested in applying for employment with Spaulding Ridge and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to our VP of Human Resources, Cara Halladay (challaday@spauldingridge.com). Requests for reasonable accommodation will be considered on a case-by-case basis. Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, gender, sexual orientation, gender identity, protected veteran status or disability.

Posted 2 weeks ago

Apply

0 years

7 - 9 Lacs

Noida

On-site

GlassDoor logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS: Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires), China (Dalian), India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum), Philippines (Manila), Poland (Warsaw and Wroclaw), UK (Manchester, Liverpool). Careers in EY Global Delivery Services: Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams. From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines. Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact.
We are looking for highly motivated, articulate individuals who have the skills to navigate the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span across these technology areas/service lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built, end-to-end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience Design, UI development, Design Thinking, Architecture & Design, full-stack development (.Net/Java/SharePoint/Power Platform), and emerging technologies like Blockchain, IoT, AR/VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit cost proposition. Testing Services: We are the yardstick of quality for software products. We break things to make the product stronger and more successful. We provide the entire gamut of testing services, including Business/User acceptance testing, so this is a team with all-round skills: functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise, combined with technical skills in data, cloud, advanced analytics and artificial intelligence, differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI). Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformations.
We develop solutions using various languages such as SQL, PL/SQL, Java, JavaScript, Python, IBM Maximo and other Oracle utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services, from strategy and architecture to production deployment. EY supports clients in three main areas: technology implementation support; enterprise and industry application implementation; and Governance, Risk and Compliance (GRC) technology. Banking and Capital Market Services: Banking and capital market services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their core banking platforms and support their business/digital transformations, such as deposit system replacements, lending/leasing modernization, and cloud-native architecture (containerization). Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency and effectiveness and manage the overall impact on bottom-line profitability by leveraging our technology, data and digital teams.
We run many operational efficiency programs and technology-enabled transformations to re-platform clients’ front and back offices with emerging technologies like AI, ML and Blockchain. Insurance Transformation: Changing macroeconomic trends continue to challenge insurers globally. With disruptive technologies, including IoT, autonomous vehicles and Blockchain, we help companies through these challenges and create innovative strategies to transform their business through technology-enabled transformation programs. We provide end-to-end services to global P&C (General), Life and Health insurers, reinsurers and insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy have put cybersecurity at the top of the agenda for senior management, boards of directors and regulators. We help our clients understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally enabled initiative from day one. Technology Risk: A unique, industry-focused business unit that provides a broad range of integrated services, where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team.
Behavioral Competencies:

  • Adaptive to the team and fosters a collaborative approach
  • Brings an innovative approach to the project when required
  • Shows passion and curiosity, a desire to learn, and can think digital
  • Agile mindset and the ability to multi-task
  • An eye for detail

Skills needed:

  • Understanding and/or experience of software development best practices and the software development life cycle
  • Understanding of one or more programming languages such as Java/.Net/Python, data analytics, or databases such as SQL/Oracle/Teradata
  • An internship in a relevant technology domain is an added advantage

Qualification:

  • BE/B.Tech (IT/Computer Science/Circuit branches)
  • Must have secured 60% or above
  • No active backlogs

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

0 years

6 - 9 Lacs

Calcutta

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires) China (Dalian) India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum) Philippines (Manila) Poland (Warsaw and Wroclaw) UK (Manchester, Liverpool) Careers in EY Global Delivery Services Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact. 
We are looking for highly motivated, articulate individuals who have the skills to navigate the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span across these technology areas/service lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built, end-to-end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience Design, UI development, Design Thinking, Architecture & Design, full-stack development (.Net/Java/SharePoint/Power Platform), and emerging technologies like Blockchain, IoT, AR/VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit cost proposition. Testing Services: We are the yardstick of quality for software products. We break things to make the product stronger and more successful. We provide the entire gamut of testing services, including Business/User acceptance testing, so this is a team with all-round skills: functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise, combined with technical skills in data, cloud, advanced analytics and artificial intelligence, differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI). Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformations.
We develop solutions using various languages such as SQL, PL/SQL, Java, JavaScript, Python, IBM Maximo and other Oracle utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services, from strategy and architecture to production deployment. EY supports clients in three main areas: technology implementation support; enterprise and industry application implementation; and Governance, Risk and Compliance (GRC) technology. Banking and Capital Market Services: Banking and capital market services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their core banking platforms and support their business/digital transformations, such as deposit system replacements, lending/leasing modernization, and cloud-native architecture (containerization). Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency and effectiveness and manage the overall impact on bottom-line profitability by leveraging our technology, data and digital teams.
We run many operational efficiency programs and technology-enabled transformations to re-platform clients’ front and back offices with emerging technologies like AI, ML and Blockchain. Insurance Transformation: Changing macroeconomic trends continue to challenge insurers globally. With disruptive technologies, including IoT, autonomous vehicles and Blockchain, we help companies through these challenges and create innovative strategies to transform their business through technology-enabled transformation programs. We provide end-to-end services to global P&C (General), Life and Health insurers, reinsurers and insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy have put cybersecurity at the top of the agenda for senior management, boards of directors and regulators. We help our clients understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally enabled initiative from day one. Technology Risk: A unique, industry-focused business unit that provides a broad range of integrated services, where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team.
Behavioral Competencies:

  • Adaptive to the team and fosters a collaborative approach
  • Brings an innovative approach to the project when required
  • Shows passion and curiosity, a desire to learn, and can think digital
  • Agile mindset and the ability to multi-task
  • An eye for detail

Skills needed:

  • Understanding and/or experience of software development best practices and the software development life cycle
  • Understanding of one or more programming languages such as Java/.Net/Python, data analytics, or databases such as SQL/Oracle/Teradata
  • An internship in a relevant technology domain is an added advantage

Qualification:

  • BE/B.Tech (IT/Computer Science/Circuit branches)
  • Must have secured 60% or above
  • No active backlogs

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

Exploring Teradata Jobs in India

Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.

Top Hiring Locations in India

  1. Bengaluru
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Teradata professionals.

Average Salary Range

The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.

Related Skills

In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.
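To make these skills concrete, here is a minimal sketch of a warehouse-style query — a fact table joined to its dimension with an aggregate, the bread and butter of star-schema work. It uses standard SQL run against SQLite as a stand-in for Teradata; the table and column names (`fact_sales`, `dim_store`) are invented for illustration:

```python
import sqlite3

# In-memory SQLite database standing in for a Teradata warehouse.
# Table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE dim_store (store_id INTEGER PRIMARY KEY, city TEXT)")
cur.execute("CREATE TABLE fact_sales (sale_id INTEGER, store_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_store VALUES (?, ?)",
                [(1, "Bengaluru"), (2, "Pune")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(101, 1, 500.0), (102, 1, 250.0), (103, 2, 300.0)])

# A star-schema style query: join the fact table to its dimension
# and aggregate sales per city, largest first.
rows = cur.execute("""
    SELECT d.city, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_store d ON d.store_id = f.store_id
    GROUP BY d.city
    ORDER BY total_sales DESC
""").fetchall()

print(rows)  # [('Bengaluru', 750.0), ('Pune', 300.0)]
```

On an actual Teradata system, the same query shape applies, but performance would additionally depend on the tables' primary indexes and collected statistics.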

Interview Questions

  • What is Teradata and how is it different from other database management systems? (basic)
  • Can you explain the difference between a join and a merge in Teradata? (medium)
  • How would you optimize a Teradata query for performance? (medium)
  • What are fallback tables in Teradata and why are they important? (advanced)
  • How do you handle duplicate records in Teradata? (basic)
  • What is the purpose of a collect statistics statement in Teradata? (medium)
  • Explain the concept of indexing in Teradata. (medium)
  • How does Teradata handle concurrency control? (advanced)
  • Can you describe the process of data distribution in Teradata? (medium)
  • What are the different types of locks in Teradata and how are they used? (advanced)
  • How would you troubleshoot performance issues in a Teradata system? (medium)
  • What is a Teradata View and how is it different from a Table? (basic)
  • How do you handle NULL values in Teradata? (basic)
  • Can you explain the difference between FastLoad and MultiLoad in Teradata? (medium)
  • What is the Teradata Parallel Transporter? (advanced)
  • How do you perform data migration in Teradata? (medium)
  • Explain the concept of fallback protection in Teradata. (advanced)
  • What are the different types of Teradata macros and how are they used? (advanced)
  • How do you monitor and manage Teradata performance? (medium)
  • What is the purpose of the Teradata QueryGrid? (advanced)
  • How do you optimize the storage of data in Teradata? (medium)
  • Can you explain the concept of Teradata indexing strategies? (advanced)
  • How do you handle data security in Teradata? (medium)
  • What are the best practices for Teradata database design? (medium)
  • How do you ensure data integrity in a Teradata system? (medium)
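Several of the questions above (duplicate records, data integrity) can be rehearsed with plain SQL. Below is a sketch of the classic deduplication pattern — keep one row per key using `ROW_NUMBER()` — run here on SQLite (3.25+) as a stand-in for Teradata. In Teradata you would typically filter the window function with a `QUALIFY` clause instead of a wrapping subquery. The `customers` table and its columns are invented for illustration:

```python
import sqlite3

# SQLite stands in for Teradata here; the table is invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (cust_id INTEGER, email TEXT, loaded_at TEXT)")
cur.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [
        (1, "a@example.com", "2024-01-01"),
        (1, "a@example.com", "2024-02-01"),  # duplicate of cust_id 1
        (2, "b@example.com", "2024-01-15"),
    ],
)

# Keep the most recently loaded row per cust_id. In Teradata, the inner
# query plus a `QUALIFY rn = 1` clause would replace the outer SELECT.
rows = cur.execute("""
    SELECT cust_id, email, loaded_at FROM (
        SELECT c.*,
               ROW_NUMBER() OVER (
                   PARTITION BY cust_id ORDER BY loaded_at DESC
               ) AS rn
        FROM customers c
    )
    WHERE rn = 1
    ORDER BY cust_id
""").fetchall()

print(rows)  # [(1, 'a@example.com', '2024-02-01'), (2, 'b@example.com', '2024-01-15')]
```

Being able to explain why the `PARTITION BY` key and the `ORDER BY` tiebreaker were chosen is usually what interviewers are probing for with the duplicate-handling question.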

Closing Remark

As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies