
1357 Teradata Jobs - Page 32

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

0 years

2 - 3 Lacs

Hyderābād

On-site

Job Description:

About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview
DAIT (Data Analytics and Insights Technology) provides end-to-end technology solutions for multiple lines of business.

Job Description
The individual will be part of the Production Support L2 team (Batch Operations), with technical expertise in Hadoop, Teradata, DataStage, Autosys and Linux. The role is responsible for platform stability; proactive application and job monitoring; issue management, resolution and triage; reporting; and timely escalation. It also covers break-fix activities: reviewing Root Cause Analysis, making small code changes, reviewing unit test results, and helping deploy to production following the release management and code deployment process. The ideal candidate is highly self-motivated and proactive, with attention to detail and strong documentation and communication skills for interacting with partners such as TI, Application teams, other production support teams (CCO, L1, L2, L3) and business stakeholders as required, and is able to propose process improvements that increase platform stability and resiliency.

Responsibilities
- Monitor and support applications to meet 100% of SLAs
- Provide on-call support
- Triage production tickets and issues
- Prepare Root Cause Analysis (RCA) documents
- Partner with the Application team, CCO, L1 and L2 support teams to resolve issues
- Prepare and/or review impact analyses based on issue analysis
- Handle Batch Ops (L1/L2) and L3 support workloads
- Write scripts to automate mundane daily BAU tasks (see the sketch after this listing)
- Provide support after office hours and on weekends, and stay on call when the business needs it
- Identify root causes in code, and perform break-fix activities in code and/or the database
- Work on additional projects to improve production efficiency and reduce risk

Requirements
Education: B.E./B.Tech/M.E./M.Tech/B.Sc./M.Sc./BCA/MCA (IT/CS specialization preferred)
Certifications, if any: BFSI domain certifications (not mandatory)
Experience range: 6-10 years

Foundational skills:
- Experience in Big Data (Hadoop)
- Experience in UNIX and shell scripting
- Experience in ETL (DataStage/Informatica)
- Experience in databases (Oracle/Exadata), Teradata, DB2
- Experience in job scheduling tools such as Autosys
- Awareness of ITIL concepts such as Incident and Problem Management
- Experience in application development or production support, preferably in batch processing, scheduling, monitoring and triaging

Desired skills:
- Experience with Hadoop architecture, Hive, Impala; coding in Python
- Experience in DataStage 11.7 and above
- Working experience with SQL, Teradata, Oracle, DB2

Work Timings: 06:30 a.m. to 03:30 p.m. and 11:30 a.m. to 08:30 p.m.
Job Location: Chennai
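To make the "write scripts to automate mundane daily BAU tasks" item concrete, here is a minimal sketch of a failure check built around the Autosys `autorep` report command. This is not this employer's actual tooling: the job name, mail relay, and alert address are hypothetical, and the coarse status parsing would need adapting to the real report layout.

```python
# Minimal sketch (illustrative only): poll an Autosys job's status report
# and raise an email alert on failure, as a daily BAU helper would.
import smtplib
import subprocess
from email.message import EmailMessage

JOB_NAME = "DAILY_LOAD_BOX"          # hypothetical Autosys box job
ALERT_TO = "batch-ops@example.com"   # hypothetical distribution list

def job_failed(job: str) -> bool:
    # `autorep -J <job>` prints a status report; a status of FA means the
    # job failed. A coarse substring check stands in for real column parsing.
    report = subprocess.run(
        ["autorep", "-J", job], capture_output=True, text=True, check=True
    ).stdout
    return " FA " in report

def send_alert(subject: str, body: str) -> None:
    msg = EmailMessage()
    msg["Subject"], msg["From"], msg["To"] = subject, ALERT_TO, ALERT_TO
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    if job_failed(JOB_NAME):
        send_alert(f"{JOB_NAME} failed", "Start triage and prepare the RCA.")
```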

Posted 1 month ago

Apply

0 years

3 - 9 Lacs

Indore

On-site

AV-281592 | Indore, Madhya Pradesh, India | Full-time, Permanent | Global Business Services | DHL INFORMATION SERVICES (INDIA) LLP

Your IT Future, Delivered: Solutions Architect

With a global team of 5600+ IT professionals, DHL IT Services connects people and keeps the global economy running by continuously innovating and creating sustainable digital solutions. We work beyond global borders and push boundaries across all dimensions of logistics. You can leave your mark shaping the technology backbone of the world's biggest logistics company. All our locations have earned #GreatPlaceToWork certification, reflecting our commitment to exceptional employee experiences. Digitalization. Simply delivered.

At IT Services, we are passionate about solution architecture in the data warehouse and business intelligence space. Our Customer Service Complex Data Solution team is continuously expanding. No matter your level of proficiency as a Solutions Architect, you can always grow within our diverse environment. #DHL #DHLITServices #GreatPlace #ppmt #Kart #cscombine

Grow together. We strive to deliver efficient and optimized business solutions in the area of Customer Service Complex Data Solutions for our business. You will work as a Solutions Architect for existing and new applications, providing end-to-end architecture expertise on a wide range of technologies such as Azure Cloud, Python, Snowflake, Teradata, Power BI, Matillion and many more. You will be our main architect providing guidance and direction on the implementation of application solutioning and design, analytics, data warehousing and reporting products. You will ensure that the analytics and reporting solutions meet the required performance benchmarks and adhere to standards and guidelines. You will guide the development team with technical expertise to ensure business requirements are implemented as expected; this means you sometimes have to get down to coding and provide a solution or high-level approach to give direction to the dev team. You will work with project teams to ensure business requirements are delivered with the end-to-end solution and application/data architecture in mind. You will get to work with complex data structures that need your expertise in data modelling and design, and you will be involved in optimizing the performance and resource utilization of existing solutions. As a senior member of the team, you will collaborate with business users on requirements, ensure that requirements are well defined before assigning them for development, and lead discussions with the business during UAT defect reviews. You will work on the latest technologies, such as Snowflake, Matillion, Teradata, ERWIN, microservices, data pipelines, Jenkins, Jira/Confluence, Splunk and more. You will get ample opportunities to grow within the organization and, with a focus on continuous learning, the opportunity to work with and learn many different technologies.

Ready to embark on the journey? Here's what we are looking for: As a Solutions Architect, you are well versed in architecture design and software development, especially in Python, with familiarity with development frameworks as well as analytics and problem-solving skills. Excellent skills in relating the latest technology to business knowledge of the customer service domain are a huge plus. Very good knowledge of data modeling is an integral part of this role, as is experience implementing customer-facing applications. Experience working in an Agile/Scrum team is useful. You are a business intelligence technology aficionado with a good understanding of the latest analytics skill sets; experience implementing MVPs and POCs through rapid prototyping, including in the AI space of new technology adoption, is good to have. You are able to work independently and to prioritize and organize your tasks under time and workload pressure. Working in a multinational environment, you can expect cross-region collaboration with teams around the globe, so being advanced in spoken and written English will certainly be useful. Basic certification in or knowledge of AWS, Azure, Snowflake, Teradata or Power BI is a plus.

An array of benefits for you:
- Hybrid work arrangements to balance in-office collaboration and home flexibility.
- Annual leave: 42 days off apart from public/national holidays.
- Medical insurance: self + spouse + 2 children, with an option for voluntary parental insurance (parents/parents-in-law) at a nominal premium covering pre-existing diseases.
- In-house training programs: professional and technical training certifications.

Posted 1 month ago

Apply

0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Data Modeller JD

We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices.

Data Modelling:
- Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards.
- Create conceptual, logical, and physical data models that support the bank's strategic objectives (a small runnable sketch follows this listing).
- Ensure data models are optimized for performance, security, and scalability to support business operations and analytics.

Collaboration with Data Architect:
- Work closely with the Data Architect to establish the overall data architecture strategy and framework.
- Contribute to the definition of data model structures within a data mesh environment.

Data Quality and Governance:
- Ensure data quality and integrity in the data models by implementing best practices in data governance.
- Assist in the establishment of data management policies and standards.
- Conduct regular data audits and reviews to ensure data accuracy and consistency across systems.

Tooling:
- Data modelling tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar.
- Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j Graph.
- Data warehousing technologies: Snowflake, Teradata, or similar.
- ETL tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar.
- Big data technologies: Hadoop, Spark (optional but preferred).
- Cloud: experience with data modelling on cloud platforms, e.g. Microsoft Azure (Synapse, Data Factory).
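As a flavour of the physical modelling the role calls for, here is a minimal sketch of a simplified CASA star schema, built in an in-memory SQLite database so the DDL can be exercised anywhere. The table and column names are illustrative assumptions, not any bank's actual model.

```python
# Minimal sketch of a simplified CASA star schema (illustrative names only),
# created in in-memory SQLite so it runs without a database server.
import sqlite3

DDL = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL,       -- business/natural key
    full_name    TEXT NOT NULL,
    segment      TEXT                 -- e.g. retail, premium
);
CREATE TABLE dim_account (
    account_key  INTEGER PRIMARY KEY,
    account_no   TEXT NOT NULL,
    product_type TEXT NOT NULL CHECK (product_type IN ('CASA','LOAN','CREDIT')),
    open_date    TEXT NOT NULL
);
CREATE TABLE fact_daily_balance (
    balance_date TEXT    NOT NULL,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    account_key  INTEGER NOT NULL REFERENCES dim_account(account_key),
    eod_balance  NUMERIC NOT NULL,    -- end-of-day balance
    PRIMARY KEY (balance_date, account_key)
);
"""

con = sqlite3.connect(":memory:")
con.executescript(DDL)
print([r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")])
```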

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
Software Developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today - planes and trains take off on time, bank transactions complete in the blink of an eye and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of.

Your Role and Responsibilities
The candidate is responsible for DB2 installation and configuration in the following environments: on-premises, multi-cloud, Red Hat OpenShift clusters, HADR, and both non-DPF and DPF.
- Migrate other databases to Db2 (e.g. Teradata, Snowflake, SAP or Cloudera to Db2)
- Create high-level and detail-level designs, and maintain product roadmaps covering both modernization and leveraging cloud solutions
- Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Perform health checks of the databases at the database and system level, make recommendations, and deliver tuning
- Deploy DB2 databases as containers within Red Hat OpenShift clusters; configure containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with overall enterprise data strategy and business objectives
- Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources)
- Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse
- Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams
- Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms
- Participate in the development of architecture governance processes and promote best practices across the organization
- Communicate complex technical concepts to both technical and non-technical stakeholders

Required Technical and Professional Expertise
- 8+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL and Python
- Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (target database Db2)
- Experience deploying DB2 databases as containers within Red Hat OpenShift clusters, and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Excellent communication, collaboration, problem-solving, and leadership skills

Preferred Technical and Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes
- Experience integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures)
- Experience with NoSQL databases (e.g., MongoDB, Cassandra)

Posted 1 month ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Req ID: 328481

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting and Tidal: ETL code development, unit testing, source code control, technical specification writing and production implementation
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal and Linux
- Interact with BAs and other teams for requirement clarification, query resolution, testing and sign-offs
- Develop software that conforms to a design model within its constraints
- Prepare documentation for design, source code and unit test plans
- Ability to work as part of a global development team
- Should have good knowledge of the healthcare domain and data warehousing concepts

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Posted 1 month ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Req ID: 328478

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting and Tidal: ETL code development, unit testing, source code control, technical specification writing and production implementation
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal and Linux
- Interact with BAs and other teams for requirement clarification, query resolution, testing and sign-offs
- Develop software that conforms to a design model within its constraints
- Prepare documentation for design, source code and unit test plans
- Ability to work as part of a global development team
- Should have good knowledge of the healthcare domain and data warehousing concepts

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Posted 1 month ago

Apply

4.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Data Scientist with Gen AI
Shift time: 2.00 PM to 10.00 PM
Location: Bangalore/Hyderabad/Chennai
Experience: 4 to 12 years
Work mode: Hybrid (3 days work from office)
Notice period: Immediate to 10 days
Mandatory skills: Data Scientist, Gen AI, RAG, LLM, (Python OR Java), (Oracle OR SQL)

Required Skills
- B.S/B.Tech in a Science, Technology, Engineering, Mathematics (STEM) or Economics related field of study
- 3 or more years with relational or NoSQL databases (Oracle, Teradata, SQL Server, Hadoop, ELK, etc.)
- 3 or more years working with languages such as R, Python or Java
- 3 or more years working with Gen AI (LLMs, RAG patterns, foundational models; a retrieval sketch follows this listing)
- 3 or more years working with advanced statistical methods such as regressions, classifiers, recommenders, anomaly detection, optimization algorithms, tree methods, neural nets, etc.
- Experience presenting and delivering results to business partners
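Since the posting centers on RAG patterns, here is a minimal sketch of the retrieval step: embed a query, rank a small document store by cosine similarity, and assemble an augmented prompt. The hash-seeded "embedding" is a deliberate toy stand-in for a real embedding model, which is the main assumption here; the documents and query are invented.

```python
# Minimal RAG retrieval sketch: a toy embedding (hash-based, NOT a real
# model) plus cosine-similarity ranking and prompt assembly.
import hashlib
import numpy as np

def toy_embed(text: str, dim: int = 64) -> np.ndarray:
    # Stand-in for a real embedding model: a deterministic pseudo-random
    # unit vector seeded by the text's hash. Replace with an actual encoder.
    seed = int.from_bytes(hashlib.sha256(text.lower().encode()).digest()[:8], "big")
    v = np.random.default_rng(seed).standard_normal(dim)
    return v / np.linalg.norm(v)

docs = [
    "Teradata Vantage supports SQL analytics at scale.",
    "RAG augments an LLM prompt with retrieved context.",
    "Oracle PL/SQL is used for stored procedures.",
]
doc_vecs = np.stack([toy_embed(d) for d in docs])

def retrieve(query: str, k: int = 2) -> list:
    # Dot product of unit vectors == cosine similarity; take top-k docs.
    sims = doc_vecs @ toy_embed(query)
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]

query = "How does retrieval-augmented generation work?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # this prompt would then be sent to the LLM
```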

Posted 1 month ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Be able to align data models with business goals and enterprise architecture
- Collaborate with Data Architects, Engineers, Business Analysts, and Leadership teams
- Lead data modelling and governance discussions and decision-making across cross-functional teams
- Proactively identify data inconsistencies, integrity issues, and optimization opportunities
- Design scalable and future-proof data models
- Define and enforce enterprise data modelling standards and best practices
- Work in Agile environments (Scrum, Kanban)
- Identify impacted applications, size capabilities, and create new capabilities
- Lead complex initiatives with multiple cross-application impacts, ensuring seamless integration
- Drive innovation, optimize processes, and deliver high-quality architecture solutions
- Understand business objectives, review business scenarios, and plan acceptance criteria for proposed solution architecture
- Discuss capabilities with individual applications, resolve dependencies and conflicts, and reach agreements on proposed high-level approaches and solutions
- Participate in Architecture Review, present solutions, and review other solutions
- Work with Enterprise Architects to learn and adopt standards and best practices
- Design solutions adhering to applicable rules and compliance requirements
- Stay updated with the latest technology trends to solve business problems with minimal change or impact
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- Undergraduate degree or equivalent experience
- 8+ years of proven experience in a similar role, leading and mentoring a team of architects and technical leads
- Extensive experience with relational, dimensional, and NoSQL data modelling
- Experience in driving innovation, optimizing processes, and delivering high-quality solutions
- Experience in large-scale OLAP, OLTP, and hybrid data processing systems
- Experience in complex initiatives with multiple cross-application impacts
- Expert in Erwin for conceptual, logical, and physical data modelling
- Expertise in relational databases, SQL, indexing and partitioning for databases like Teradata, Snowflake, Azure Synapse or traditional RDBMS
- Expertise in ETL/ELT architecture, data pipelines, and integration strategies
- Expertise in data normalization, denormalization and performance optimization
- Exposure to cloud platforms, tools, and AI-based solutions
- Solid knowledge of 3NF, star schema, snowflake schema, and Data Vault
- Exposure to Java, Python, Spring, Spring Boot, SQL, MongoDB, Kafka, React JS, Dynatrace and Power BI
- Knowledge of Azure Platform as a Service (PaaS) offerings (Azure Functions, App Service, Event Grid)
- Good knowledge of the latest happenings in the technology world
- Advanced SQL skills for complex queries, stored procedures, indexing, partitioning, macros, recursive queries, query tuning and OLAP functions (a recursive-query sketch follows this listing)
- Understanding of data privacy regulations, Master Data Management, and data quality
- Proven excellent communication and leadership skills
- Proven ability to think from a long-term perspective and arrive at intentional and strategic architecture
- Proven ability to provide consistent solutions across Lines of Business (LOB)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
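The advanced-SQL bullet above mentions recursive queries; here is a minimal runnable sketch of a recursive CTE that walks a manager-to-report hierarchy. It executes in in-memory SQLite purely for portability (Teradata and Snowflake use essentially the same WITH RECURSIVE form); the schema and data are invented for illustration.

```python
# Minimal recursive-CTE sketch: walk a manager -> report hierarchy.
# Runs in in-memory SQLite; the table and rows are invented examples.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER);
INSERT INTO employee VALUES
  (1, 'Asha', NULL),   -- root of the hierarchy
  (2, 'Bharat', 1),
  (3, 'Chen',   1),
  (4, 'Divya',  2);
""")

rows = con.execute("""
WITH RECURSIVE chain(id, name, depth) AS (
    SELECT id, name, 0 FROM employee WHERE manager_id IS NULL
    UNION ALL
    SELECT e.id, e.name, c.depth + 1
    FROM employee e JOIN chain c ON e.manager_id = c.id
)
SELECT name, depth FROM chain ORDER BY depth, name;
""").fetchall()

for name, depth in rows:
    print("  " * depth + name)   # indented org tree
```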

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

There is a job opening for a Data Analyst at Tata Consultancy Services.

Experience: 5+ years
Location: Mumbai

Skills: Teradata/SQL/Python with data transformation, implementation, and data management frameworks

JD:
- Understand and clarify the business need/opportunity/problem
- Collect and prepare data as needed for analysis
- Perform data mining and exploratory data analysis to identify patterns, trends, and outliers within datasets (see the sketch after this listing)
- Create visualizations (charts, graphs, dashboards) to communicate findings
- Develop reports and presentations to present findings to stakeholders
- Develop recommendations based on data findings to improve business performance
- Communicate findings and recommendations to stakeholders
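To illustrate the exploratory-analysis and outlier-identification duties, here is a minimal pandas sketch on synthetic data. The column names and the simple IQR rule are illustrative choices, not TCS's actual method.

```python
# Minimal EDA sketch: summary statistics plus IQR-based outlier flagging
# on synthetic data (column names and thresholds are illustrative).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "region": rng.choice(["N", "S", "E", "W"], size=500),
    "daily_sales": rng.normal(1000, 150, size=500),
})
df.loc[::97, "daily_sales"] *= 5  # inject a few anomalies

print(df["daily_sales"].describe())  # quick distribution summary

q1, q3 = df["daily_sales"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["daily_sales"] < q1 - 1.5 * iqr) |
              (df["daily_sales"] > q3 + 1.5 * iqr)]
print(f"{len(outliers)} outlier rows")
print(outliers.groupby("region")["daily_sales"].mean())  # pattern by region
```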

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Consumer and Community Banking - Data Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job Responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies

Required Qualifications, Capabilities, and Skills
- Formal training or certification on software engineering concepts and 3+ years applied experience
- Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in AWS services, especially Aurora Postgres RDS
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and their IT systems

Preferred Qualifications, Capabilities, and Skills
- Experience in re-engineering and migrating on-premises data solutions to and for the cloud
- Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure
- Experience in building on emerging cloud serverless managed services, to minimize/eliminate physical/virtual server footprint
- Advanced in Java, plus Python (nice to have)

Posted 1 month ago

Apply

6.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions.

Key Responsibilities:
- Design, implement, and optimize scalable data pipelines and services using GCP
- Build and manage cloud-native applications deployed via Cloud Run
- Develop complex and performance-optimized SQL queries for analytics and data transformation
- Manage and automate data storage, retrieval, and archival using Cloud Storage
- Implement event-driven architectures using Google Pub/Sub (see the sketch after this listing)
- Work with large datasets in BigQuery, including ETL/ELT design and query optimization
- Ensure security, monitoring, and compliance of cloud-based systems
- Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions

Required Skills & Experience:
- 3+ years of experience working with Google Cloud Platform (GCP)
- Strong proficiency in SQL coding, query tuning, and handling complex data transformations
- Hands-on experience with BigQuery, Cloud Run, Cloud Storage, and Pub/Sub
- Understanding of data pipeline and ETL/ELT workflows in cloud environments
- Familiarity with containerized services and CI/CD pipelines
- Experience in scripting languages (e.g., Python, Shell) is a plus
- Strong analytical and problem-solving skills
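For a taste of the event-driven pattern this role describes, here is a minimal sketch that publishes an event to Pub/Sub and runs a BigQuery aggregation using the official google-cloud client libraries. The project, topic, and table names are placeholders, ambient GCP credentials are assumed, and real code would add error handling.

```python
# Minimal sketch: publish an event to Pub/Sub, then aggregate in BigQuery.
# Requires google-cloud-pubsub and google-cloud-bigquery; the project,
# topic, and table names below are placeholders.
import json
from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"   # placeholder
TOPIC = "order-events"   # placeholder

def publish_event(payload: dict) -> None:
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, TOPIC)
    future = publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    print("published message id:", future.result())  # blocks until acked

def daily_totals() -> None:
    client = bigquery.Client(project=PROJECT)
    query = """
        SELECT DATE(event_ts) AS day, SUM(amount) AS total
        FROM `my-project.sales.orders`   -- placeholder table
        GROUP BY day
        ORDER BY day DESC
        LIMIT 7
    """
    for row in client.query(query).result():
        print(row.day, row.total)

if __name__ == "__main__":
    publish_event({"order_id": 123, "amount": 42.5})
    daily_totals()
```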

Posted 1 month ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Chennai

Work from Office

Job Title: GCP Teradata Engineer
Location: Chennai, Bangalore, Hyderabad
Experience: 4-6 years

Job Summary:
We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions.

Key Responsibilities:
- Design, implement, and optimize scalable data pipelines and services using GCP
- Build and manage cloud-native applications deployed via Cloud Run
- Develop complex and performance-optimized SQL queries for analytics and data transformation
- Manage and automate data storage, retrieval, and archival using Cloud Storage
- Implement event-driven architectures using Google Pub/Sub
- Work with large datasets in BigQuery, including ETL/ELT design and query optimization
- Ensure security, monitoring, and compliance of cloud-based systems
- Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions

Required Skills & Experience:
- 4 years of experience working with Google Cloud Platform (GCP)
- Strong proficiency in SQL coding, query tuning, and handling complex data transformations
- Hands-on experience with BigQuery, Cloud Run, Cloud Storage, and Pub/Sub
- Understanding of data pipeline and ETL/ELT workflows in cloud environments
- Familiarity with containerized services and CI/CD pipelines
- Experience in scripting languages (e.g., Python, Shell) is a plus
- Strong analytical and problem-solving skills

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Analyze information and make evaluative judgements to recommend solutions and improvements
- Conduct testing and debugging, utilize script tools, and write basic code for design specifications
- Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
- Develop working knowledge of Citi's information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Job Description:
The role requires very advanced skills and very good working knowledge in SQL with Tableau (BI visualization). The resource should be capable of performance tuning SQL and ready to serve in an L3 support role for all data-related concerns and reporting server issues.
- Very good knowledge of developing Tableau dashboards and reports is required
- Past experience with databases such as Netezza (NPS), Teradata, Oracle, Big Data Hadoop, and Unix scripting
- Utilize knowledge of applications development procedures and concepts and other technical aspects to identify and define requirements to enhance the system
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Experience with the SDLC deployment cycle and Agile methodology
- Good conceptual and working knowledge of DevOps tools such as Jira, Bitbucket, Jenkins, etc.
- The candidate will support releases, including weekend release windows, and should be willing to upskill in new tech stacks as project needs require

Qualifications:
- 2-4 years of relevant experience as a Tableau Developer with strong SQL query writing skills
- Strong knowledge of basic/advanced SQL in any database
- Tableau BI visualization tool experience is required
- Database knowledge of Netezza (NPS), Teradata, Oracle, Big Data Hadoop, and Unix scripting
- Experience in programming/debugging used in business applications
- Working knowledge of industry practice and standards
- Comprehensive knowledge of the specific business area for application development
- Working knowledge of program languages
- Consistently demonstrates clear and concise written and verbal communication
- Professional SQL/Tableau certifications are good to have

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Design, develop, and maintain applications using COBOL, JCL, and SQL. Collaborate with cross-functional teams to define, design, and ship new features. Troubleshoot and resolve application issues and bugs. Design, implement, and optimize SQL queries and database structures.

Required Candidate Profile
Ensure the performance, quality, and responsiveness of applications using COBOL and JCL. Develop, maintain, and support mainframe applications using COBOL and JCL.

Location: Hyderabad, Pune, Bengaluru, Chennai

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

An ETL Tester (4+ years required) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems, which may be in the cloud or on premises. They work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes, and they understand cloud architecture well enough to design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities:
- Strong in data warehouse testing - ETL and BI
- Strong database knowledge: Oracle / SQL Server / Teradata / Snowflake
- Strong SQL skills, with experience writing complex data validation SQL (see the sketch after this listing)
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in automation testing and data validation using Python
- Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle/SQL Server/Teradata/Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
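Here is a minimal sketch of the kind of data-validation SQL the role mentions: reconciling row counts, aggregates, and keys between a source and a target table. It runs against in-memory SQLite with invented tables so it is self-contained; real ETL tests would point the same style of queries at Oracle/Teradata/Snowflake connections.

```python
# Minimal ETL-validation sketch: row-count, aggregate, and key
# reconciliation between source and target (invented schema, SQLite).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount NUMERIC);
CREATE TABLE tgt_orders (order_id INTEGER, amount NUMERIC);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
""")

checks = {
    "row_count_match": """
        SELECT (SELECT COUNT(*) FROM src_orders) =
               (SELECT COUNT(*) FROM tgt_orders)""",
    "amount_sum_match": """
        SELECT (SELECT SUM(amount) FROM src_orders) =
               (SELECT SUM(amount) FROM tgt_orders)""",
    "no_missing_keys": """
        SELECT NOT EXISTS (
            SELECT order_id FROM src_orders
            EXCEPT
            SELECT order_id FROM tgt_orders)""",
}

for name, sql in checks.items():
    passed = bool(con.execute(sql).fetchone()[0])
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```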

Posted 1 month ago

Apply

8.0 - 12.0 years

22 - 27 Lacs

Indore, Chennai

Work from Office

We are hiring a Senior Python DevOps Engineer to develop scalable apps using Flask/FastAPI (see the sketch after this listing), automate CI/CD, manage cloud and ML workflows, and support containerized deployments in OpenShift environments.

Required Candidate Profile
8+ years in Python DevOps with expertise in Flask, FastAPI, CI/CD, cloud, ML workflows, and OpenShift. Skilled in automation, backend optimization, and global team collaboration.
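For flavour, here is a minimal FastAPI sketch of the kind of service the role mentions; the routes and app name are illustrative, not this employer's codebase.

```python
# Minimal FastAPI sketch (illustrative routes only).
# Run with: uvicorn app:app --reload
from fastapi import FastAPI

app = FastAPI(title="demo-service")

@app.get("/health")
def health() -> dict:
    # A liveness/readiness probe in an OpenShift deployment could target this.
    return {"status": "ok"}

@app.get("/items/{item_id}")
def read_item(item_id: int, q: str = "") -> dict:
    # Path and query parameters are validated from the type hints.
    return {"item_id": item_id, "q": q}
```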

Posted 1 month ago

Apply

3.0 - 10.0 years

15 - 16 Lacs

Hyderabad

Work from Office

As a Software Engineer III at JPMorgan Chase within Consumer and Community Banking - Data Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years applied experience
- Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in AWS services, especially Aurora Postgres RDS
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and their IT systems

Preferred qualifications, capabilities, and skills
- Experience in re-engineering and migrating on-premises data solutions to and for the cloud
- Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure
- Experience in building on emerging cloud serverless managed services, to minimize/eliminate physical/virtual server footprint
- Advanced in Java, plus Python (nice to have)

Posted 1 month ago

Apply

3.0 - 10.0 years

15 - 16 Lacs

Hyderabad

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Becomes a technical mentor in the team

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years applied experience
- Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in AWS services, especially Aurora Postgres RDS
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and their IT systems

Preferred qualifications, capabilities, and skills
- Experience in re-engineering and migrating on-premises data solutions to and for the cloud
- Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure
- Experience in building on emerging cloud serverless managed services, to minimize/eliminate physical/virtual server footprint
- Advanced in Java, plus Python (nice to have)

Posted 1 month ago

Apply

6.0 - 8.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across the AWS, Azure and Google Cloud platforms. Core responsibilities include, though are not limited to, the following:
- Managing Teradata's as-a-Service offering on public cloud (AWS/Azure/GC) as the Cloud Ops Administrator
- Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support
- Supporting the onsite teams with customer migrations from on premises to the cloud
- Implementing security best practices and analyzing partner compatibility
- Managing and coordinating all activities necessary to implement Changes in the environment
- Ensuring Change status, progress and issues are communicated to the appropriate groups
- Viewing and implementing the process lifecycle and reporting to upper management
- Evaluating performance metrics against the critical success factors and assuring actions to streamline the process
- Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan
- Documenting closure activities in the Change record and completing the Change record
- Escalating any deviations from plans to the appropriate TLs/Managers
- Providing input for the ongoing improvement of the Change Management process
- Managing and supporting 24x7 VaaS environments for multiple customers
- Devising and implementing security and operations best practices
- Implementing development and production environments for the data warehousing cloud environment
- Backup, archive and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GC resources
- Ensuring SLAs are met while implementing changes, and ensuring all scheduled changes are implemented within the prescribed window
- Acting as the first level of escalation and of help/support for team members

Who You'll Work With
This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents and Changes across the Azure and Google Cloud platforms, reporting into the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
- Minimum 6-8 years of IT experience in a Systems Administrator / Engineer role
- Minimum 4 years of cloud hands-on experience (Azure/AWS/GCP)
- Cloud certification; ITIL or other relevant certifications are desirable
- Day-to-day operations with ServiceNow / an ITSM tool
- Must be willing to provide 24x7 on-call support on a rotational basis with the team
- Must be willing to travel - both short-term and long-term

What You'll Bring
- 4-year engineering degree or 3-year Masters of Computer Application
- Excellent oral and written communication skills in the English language
- Teradata/DBMS experience: hands-on experience with Teradata administration and a strong understanding of cloud capabilities and limitations (a connectivity-check sketch follows this listing)
- Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service and Software as a Service cloud delivery models; and the current competitive landscape
- Implement and support new and existing customers on VaaS infrastructure
- Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution
- Good knowledge of cloud services for compute, storage, network and OS for at least one of the following cloud platforms: Azure
- Experience managing responsibilities as a shift lead
- Experience with enterprise VPNs and Azure virtual LAN with a data center
- Knowledge of monitoring, logging and cost management tools
- Hands-on experience with database architecture/modeling, RDBMS and NoSQL
- Good understanding of data archive/restore policies
- Teradata basics; VMware certification is an added advantage
- Working experience in Linux administration and shell scripting
- Working experience with any of the RDBMSs such as Oracle, DB2, Netezza, Teradata, SQL Server or MySQL

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
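For a flavour of routine Teradata health checks from a cloud-ops seat, here is a minimal sketch using the teradatasql Python driver. The host and credentials are placeholders, and the DBC dictionary query reflects common usage rather than any environment-specific standard; real checks would cover far more (sessions, space, AMP status).

```python
# Minimal connectivity/health-check sketch using the teradatasql driver.
# Host and credentials are placeholders; use real secrets handling.
import teradatasql

CONN_ARGS = {
    "host": "vantage.example.com",   # placeholder
    "user": "dbc",                   # placeholder
    "password": "********",          # placeholder
}

def health_check() -> None:
    with teradatasql.connect(**CONN_ARGS) as con:
        with con.cursor() as cur:
            cur.execute("SELECT CURRENT_TIMESTAMP")
            print("server time:", cur.fetchone()[0])
            # Database version from the DBC data dictionary view
            cur.execute(
                "SELECT InfoData FROM DBC.DBCInfoV WHERE InfoKey = 'VERSION'"
            )
            print("version:", cur.fetchone()[0])

if __name__ == "__main__":
    health_check()
```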

Posted 1 month ago

Apply

6.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across the AWS, Azure and Google Cloud platforms. Core responsibilities include, though are not limited to, the following:
- Managing Teradata's as-a-Service offering on public cloud (AWS/Azure/GC) as the Cloud Ops Administrator
- Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support
- Supporting the onsite teams with customer migrations from on premises to the cloud
- Implementing security best practices and analyzing partner compatibility
- Managing and coordinating all activities necessary to implement Changes in the environment
- Ensuring Change status, progress and issues are communicated to the appropriate groups
- Viewing and implementing the process lifecycle and reporting to upper management
- Evaluating performance metrics against the critical success factors and assuring actions to streamline the process
- Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan
- Documenting closure activities in the Change record and completing the Change record
- Escalating any deviations from plans to the appropriate TLs/Managers
- Providing input for the ongoing improvement of the Change Management process
- Managing and supporting 24x7 VaaS environments for multiple customers
- Devising and implementing security and operations best practices
- Implementing development and production environments for the data warehousing cloud environment
- Backup, archive and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GC resources
- Ensuring SLAs are met while implementing changes, and ensuring all scheduled changes are implemented within the prescribed window
- Acting as the first level of escalation and of help/support for team members

Who You'll Work With
This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents and Changes across the Azure and Google Cloud platforms, reporting into the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
- Minimum 6-8 years of IT experience in a Systems Administrator / Engineer role
- Minimum 4 years of cloud hands-on experience (Azure/AWS/GCP)
- Cloud certification; ITIL or other relevant certifications are desirable
- Day-to-day operations with ServiceNow / an ITSM tool
- Must be willing to provide 24x7 on-call support on a rotational basis with the team
- Must be willing to travel - both short-term and long-term

What You'll Bring
- 4-year engineering degree or 3-year Masters of Computer Application
- Excellent oral and written communication skills in the English language
- Teradata/DBMS experience: hands-on experience with Teradata administration and a strong understanding of cloud capabilities and limitations
- Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service and Software as a Service cloud delivery models; and the current competitive landscape
- Implement and support new and existing customers on VaaS infrastructure
- Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution
- Good knowledge of cloud services for compute, storage, network and OS for at least one of the following cloud platforms: Azure
- Experience managing responsibilities as a shift lead
- Experience with enterprise VPNs and Azure virtual LAN with a data center
- Knowledge of monitoring, logging and cost management tools
- Hands-on experience with database architecture/modeling, RDBMS and NoSQL
- Good understanding of data archive/restore policies
- Teradata basics; VMware certification is an added advantage
- Working experience in Linux administration and shell scripting
- Working experience with any of the RDBMSs such as Oracle, DB2, Netezza, Teradata, SQL Server or MySQL

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Teradata SME

We are seeking a highly experienced and knowledgeable Teradata Subject Matter Expert (SME) to provide deep technical expertise and strategic guidance on our existing Teradata data warehouse environment, with a focus on its integration, migration, and potential modernization within the Google Cloud Platform (GCP). You will be the go-to person for complex Teradata-related challenges, optimization initiatives, and architectural decisions, particularly as they relate to our cloud strategy on GCP. You will collaborate with data engineers, cloud architects, analysts, and business stakeholders to ensure our data landscape effectively leverages both Teradata and GCP capabilities.

Responsibilities
- Serve as the primary point of contact and expert resource for all Teradata-related technical inquiries and issues, including those related to GCP integration
- Provide deep technical expertise in Teradata architecture, utilities, performance tuning, and query optimization, with an understanding of how these aspects translate to or interact with GCP services
- Lead efforts to integrate Teradata with GCP services for data ingestion, processing, and analysis
- Provide guidance and expertise on potential migration strategies from Teradata to GCP data warehousing solutions like BigQuery (an illustrative sketch follows this posting)
- Optimize Teradata performance in the context of data pipelines that may involve GCP components
- Troubleshoot and resolve complex Teradata system and application issues, considering potential interactions with GCP
- Develop and maintain best practices, standards, and documentation for Teradata development and administration, with a focus on cloud integration scenarios
- Collaborate with cloud architects and data engineers to design hybrid data solutions leveraging both Teradata and GCP
- Provide guidance and mentorship to team members on Teradata best practices and techniques within a cloud-focused context
- Participate in capacity planning and forecasting for the Teradata environment, considering its future within our GCP strategy
- Evaluate and recommend Teradata upgrades, patches, and new features, assessing their compatibility and value within a GCP ecosystem
- Ensure adherence to data governance policies and security standards across both Teradata and GCP environments
- Stay current with the latest Teradata features, trends, and best practices, as well as relevant GCP data warehousing and integration services

Qualifications we seek in you!

Minimum Qualifications / Skills
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- Extensive and deep experience (typically 8+ years) working with Teradata data warehouse systems
- Expert-level knowledge of Teradata architecture, including MPP concepts, BYNET, and storage management
- Proven ability to write and optimize complex SQL queries in Teradata
- Strong experience with Teradata utilities (e.g., BTEQ, FastLoad, MultiLoad, TPump)
- Deep understanding of Teradata performance tuning techniques, including workload management and query optimization
- Experience with Teradata data modeling principles and best practices
- Excellent analytical, problem-solving, and troubleshooting skills specific to Teradata environments, with an aptitude for understanding cloud integration
- Strong communication, collaboration, and interpersonal skills, with the ability to explain complex technical concepts clearly, including those bridging Teradata and GCP
- Familiarity with Google Cloud Platform (GCP) and its core data services (e.g., BigQuery, Cloud Storage, Dataflow)

Preferred Qualifications / Skills
- Teradata certifications
- Google Cloud certifications (e.g., Cloud Architect, Data Engineer)
- Experience with Teradata Viewpoint and other monitoring tools
- Knowledge of data integration tools (e.g., Informatica, Talend) and their interaction with both Teradata and GCP
- Experience with workload management and prioritization in Teradata, and how it might be approached in GCP
- Familiarity with data security concepts and implementation within both Teradata and GCP
- Experience migrating data to or from Teradata, especially to GCP
- Exposure to cloud-based data warehousing solutions (specifically BigQuery) and their architectural differences from Teradata
- Scripting skills (e.g., Shell, Python) for automation of tasks across both Teradata and GCP

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a ‘starter kit,’ paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 16, 2025, 11:49:57 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
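As a hedged illustration of the Teradata-to-BigQuery migration work this role describes (not Genpact’s actual tooling): a minimal hop that extracts a slice from Teradata with the teradatasql driver and loads it into BigQuery with the official google-cloud-bigquery client. The project ID, table names, and credentials are hypothetical.

    # Illustrative Teradata-to-BigQuery hop (placeholder names throughout).
    # Requires:  pip install teradatasql google-cloud-bigquery
    import teradatasql
    from google.cloud import bigquery

    # 1) Extract a slice from Teradata; CAST keeps values JSON-serializable.
    with teradatasql.connect(host="td.example.com",
                             user="etl", password="***") as con:
        with con.cursor() as cur:
            cur.execute("""
                SELECT customer_id, region, CAST(revenue AS FLOAT)
                FROM   sales.daily_summary
            """)
            rows = [{"customer_id": c, "region": g, "revenue": v}
                    for c, g, v in cur.fetchall()]

    # 2) Load into BigQuery. Streaming inserts keep the sketch short; batch
    #    loads (load_table_from_file / load_table_from_dataframe) are the
    #    better fit at real migration volumes.
    client = bigquery.Client(project="my-gcp-project")
    errors = client.insert_rows_json("analytics_ds.daily_summary", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

At scale, the same shape is usually driven through Teradata export utilities plus Cloud Storage staging rather than row-at-a-time transfers; the sketch only shows where the two platforms meet.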

Posted 1 month ago

Apply

5.0 years

1 - 9 Lacs

Hyderābād

On-site

We have an opportunity to impact your career and provide an adventure where you can push the limits of what’s possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure, high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Serves as a technical mentor within the team

Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 5+ years of applied experience
- Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in AWS services, especially Aurora Postgres RDS (see the sketch after this posting)
- Proficiency in automation and continuous delivery methods
- Proficiency in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
- Experience re-engineering and migrating on-premises data solutions to and for the cloud
- Experience with Infrastructure as Code (Terraform) for cloud-based data infrastructure
- Experience building on emerging cloud serverless managed services to minimize or eliminate the physical/virtual server footprint
- Advanced Java, plus Python (nice to have)
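As a hedged illustration of the Aurora Postgres side of the role: Aurora PostgreSQL is wire-compatible with standard Postgres drivers, so an operational-stability probe can be a few lines of Python. The cluster endpoint, audit table, and one-hour freshness window below are hypothetical.

    # Pipeline-freshness probe against an Aurora PostgreSQL cluster endpoint
    # (illustrative only).  Requires:  pip install psycopg2-binary
    import psycopg2

    conn = psycopg2.connect(
        host="my-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",  # hypothetical
        dbname="pipelines", user="etl_monitor", password="***",
    )
    with conn, conn.cursor() as cur:
        # Alert when the newest audited batch is more than an hour old.
        cur.execute("""
            SELECT COALESCE(MAX(loaded_at), 'epoch'::timestamptz)
                     < now() - interval '1 hour' AS stale,
                   MAX(loaded_at)
            FROM   batch_audit
        """)
        stale, last_load = cur.fetchone()
        if stale:
            raise RuntimeError(f"Stale pipeline: last load at {last_load}")
    conn.close()

Checks like this are the kind of recurring-issue automation the responsibilities call out: a failed probe can page before downstream consumers notice missing data.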

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION

We have an opportunity to impact your career and provide an adventure where you can push the limits of what’s possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure, high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Serves as a technical mentor within the team

Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 5+ years of applied experience
- Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in AWS services, especially Aurora Postgres RDS
- Proficiency in automation and continuous delivery methods
- Proficiency in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
- Experience re-engineering and migrating on-premises data solutions to and for the cloud
- Experience with Infrastructure as Code (Terraform) for cloud-based data infrastructure
- Experience building on emerging cloud serverless managed services to minimize or eliminate the physical/virtual server footprint
- Advanced Java, plus Python (nice to have)

Posted 1 month ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Overview

Deputy Director - Data Engineering

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics, and new product development. PepsiCo’s Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company
- Take responsibility for day-to-day data collection, transportation, maintenance/curation of, and access to the PepsiCo corporate data asset
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, or other stakeholders
- Increase awareness about available data and democratize access to it across the company

As a data engineering lead, you will be the key technical expert overseeing PepsiCo’s data product build and operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You’ll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo’s flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You’ll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities
- Serve as the data engineering lead for D&Ai data modernization (MDIP)
- Ideally, be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job; the candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on product and project requirements
- Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work, as well as empowering them to realize their full potential
- Design, structure, and store data in unified data models and link them together to make the data reusable for downstream products
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL (a migration sketch follows this posting)
- Enable and accelerate standards-based development, prioritizing reuse of code; adopt test-driven development, unit testing, and test automation with end-to-end observability of data
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance, and cost
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards
- Define and manage SLAs for data products and processes running in production
- Create documentation for learnings and knowledge transfer to internal associates

Qualifications
- 12+ years of overall technology and data management experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture
- 8+ years of experience with data lakehouse, data warehousing, and data analytics tools
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL, or any other popular RDBMS
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks
- 4+ years of cloud data engineering experience in Azure or AWS; fluent with Azure cloud services (Azure Data Engineering certification is a plus)
- Experience with integration of multi-cloud services with on-premises technologies
- Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines
- Experience with data profiling and data quality tools like Great Expectations
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets
- Experience with at least one business intelligence tool such as Power BI or Tableau
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes
- Experience with version control systems like ADO and GitHub, and with CI/CD tools for DevOps automation and deployments
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools
- Experience with statistical/ML techniques is a plus
- Experience building solutions in the retail or supply chain space is a plus
- Understanding of metadata management, data lineage, and data glossaries is a plus
- BA/BS in Computer Science, Math, Physics, or another technical field
- Flexibility to work an alternative work schedule (Monday to Friday, Tuesday to Saturday, or Sunday to Thursday) depending on product and project coverage requirements; candidates are expected to be in the office at the assigned location at least 3 days a week, with in-office days coordinated with the immediate supervisor

Skills, Abilities, Knowledge:
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management
- Proven track record of leading and mentoring data teams
- Strong change manager; comfortable with change, especially that which arises through company growth
- Ability to understand and translate business requirements into data and technical requirements
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously
- Positive and flexible attitude that enables adjusting to different needs in an ever-changing environment
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs
- Fosters a team culture of accountability, communication, and self-management
- Proactively drives impact and engagement while bringing others along
- Consistently attains/exceeds individual and team goals
- Ability to lead others without direct authority in a matrixed environment
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations
- Domain knowledge of the CPG industry with a supply chain/GTM background is preferred
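One way the Teradata-to-Databricks accelerators mentioned in the responsibilities could look, as a minimal PySpark sketch: read a table over JDBC and land it as a Delta table. The JDBC URL, driver availability on the cluster, and table names are assumptions for illustration, not PepsiCo’s implementation.

    # One reusable migration step: Teradata table -> Delta table (sketch).
    # Assumes the Teradata JDBC driver jar is on the cluster classpath.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("td-to-databricks").getOrCreate()

    df = (spark.read.format("jdbc")
          .option("url", "jdbc:teradata://td.example.com/DATABASE=sales")
          .option("driver", "com.teradata.jdbc.TeraDriver")
          .option("dbtable", "sales.daily_summary")
          .option("user", "etl")
          .option("password", "***")
          .load())

    # Land in the bronze layer as Delta so downstream products can reuse
    # the unified model.
    (df.write.format("delta")
       .mode("overwrite")
       .saveAsTable("bronze.daily_summary"))

Parameterizing the source table and target name is what turns a one-off copy like this into the reusable accelerator the posting describes.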

Posted 1 month ago

Apply

15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction

Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities

This candidate is responsible for:
- DB2 installation and configuration in the following environments: on-prem, multi-cloud, Red Hat OpenShift clusters, HADR, and both non-DPF and DPF
- Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level designs and detail-level designs, and maintaining product roadmaps that include both modernization and leveraging cloud solutions
- Designing scalable, performant, and cost-effective data architectures within the lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Performing health checks of the databases, making recommendations, and delivering tuning at the database and system level (a health-check sketch follows this posting)
- Deploying DB2 databases as containers within Red Hat OpenShift clusters, and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Leading the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives
- Defining and optimizing the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources)
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse
- Acting as a subject matter expert on lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams
- Mentoring junior architects and engineers, fostering their growth and knowledge of modern data platforms
- Participating in the development of architecture governance processes and promoting best practices across the organization
- Communicating complex technical concepts to both technical and non-technical stakeholders

Required Technical And Professional Expertise
- 15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modeling (dimensional, normalized, and NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); and big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (with Db2 as the target)
- Experience deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Excellent communication, collaboration, problem-solving, and leadership skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB2 replication processes
- Experience integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures)
- Experience with NoSQL databases (e.g., MongoDB, Cassandra)
- Experience with data modeling tools (e.g., ER/Studio, ERwin)
- Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA)
- Soft skills
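A hedged sketch of the kind of scripted health check the responsibilities describe, using IBM’s ibm_db Python driver against the SYSIBMADM.SNAPDB administrative view; the connection string values are placeholders.

    # Minimal Db2 health-check sketch.  Requires:  pip install ibm_db
    import ibm_db

    conn = ibm_db.connect(
        "DATABASE=BLUDB;HOSTNAME=db2.example.com;PORT=50000;"
        "PROTOCOL=TCPIP;UID=db2inst1;PWD=***;", "", "")

    # SYSIBMADM.SNAPDB surfaces database snapshot monitor data.
    stmt = ibm_db.exec_immediate(
        conn, "SELECT db_name, db_status FROM SYSIBMADM.SNAPDB")
    row = ibm_db.fetch_assoc(stmt)
    while row:
        print(row["DB_NAME"], row["DB_STATUS"])
        row = ibm_db.fetch_assoc(stmt)
    ibm_db.close(conn)

A fuller check would also look at buffer pool hit ratios, lock waits, and tablespace utilization before turning the findings into tuning recommendations, as the role outlines.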

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
