
1363 Teradata Jobs - Page 17

JobPe aggregates results for easy application access, but you apply directly on the employer's job portal.

3.0 - 4.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Our Company

Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do

This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across the AWS, Azure, and Google Cloud platforms. Core responsibilities include, but are not limited to:
- Managing Teradata's as-a-Service offering on public cloud (AWS/Azure/GCP) as the Cloud Ops Administrator.
- Delivery in the areas of cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support.
- Supporting the onsite teams with customer migrations from on-premises to cloud.
- Implementing security best practices and analyzing partner compatibility.
- Managing and coordinating all activities necessary to implement Changes in the environment.
- Ensuring Change status, progress, and issues are communicated to the appropriate groups.
- Reviewing and implementing the process lifecycle and reporting to upper management.
- Evaluating performance metrics against the critical success factors and ensuring actions to streamline the process.
- Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan.
- Documenting closure activities in the Change record and completing the Change record.
- Escalating any deviations from plan to the appropriate TLs/Managers.
- Providing input for the ongoing improvement of the Change Management process.
- Managing and supporting 24x7 VaaS environments for multiple customers.
- Devising and implementing security and operations best practices.
- Implementing development and production environments for the data warehousing cloud environment.
- Backup, archive, and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GCP resources (a minimal snapshot-audit sketch follows this posting).
- Ensuring SLAs are met while implementing Changes, and that all scheduled Changes are implemented within the prescribed window.
- Acting as the first level of escalation and first level of help/support for team members.

Who You'll Work With

This role will mainly work as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents, and Changes across the Azure and Google Cloud platforms. It reports to the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
- Minimum 3-4 years of IT experience in a Systems Administrator/Engineer role.
- Minimum 4 years of hands-on cloud experience (Azure/AWS/GCP).
- Cloud certification; ITIL or other relevant certifications are desirable.
- Day-to-day operations experience with ServiceNow or another ITSM tool.
- Must be willing to provide 24x7 on-call support on a rotational basis with the team.
- Must be willing to travel, both short-term and long-term.

What You'll Bring
- 4-year engineering degree or 3-year Master of Computer Applications.
- Excellent oral and written communication skills in English.
- Teradata/DBMS experience: hands-on Teradata administration and a strong understanding of cloud capabilities and limitations.
- Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service, and Software as a Service delivery models; and the current competitive landscape.
- Implementing and supporting new and existing customers on VaaS infrastructure.
- Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution.
- Good knowledge of cloud services for compute, storage, network, and OS for at least one of the following cloud platforms: Azure.
- Experience managing responsibilities as a shift lead.
- Experience with enterprise VPN and Azure virtual LAN with a data center.
- Knowledge of monitoring, logging, and cost management tools.
- Hands-on experience with database architecture/modeling, RDBMS, and NoSQL; a good understanding of data archive/restore policies.
- Basic Teradata knowledge; VMware certification is an added advantage.
- Working experience in Linux administration and shell scripting, and with an RDBMS such as Oracle, DB2, Netezza, Teradata, SQL Server, or MySQL.

Why We Think You'll Love Teradata

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
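As context for the backup and recovery responsibilities named above, here is a minimal, hypothetical sketch of auditing EBS snapshot age with boto3. It is not Teradata's tooling; the 7-day freshness threshold and the idea of flagging stale snapshots are assumptions for illustration only.

```python
# Minimal sketch (assumptions: AWS credentials configured, boto3 installed;
# the 7-day freshness threshold is illustrative, not from the posting).
from datetime import datetime, timedelta, timezone

import boto3

def stale_snapshots(max_age_days: int = 7) -> list[str]:
    """Return IDs of EBS snapshots owned by this account older than max_age_days."""
    ec2 = boto3.client("ec2")
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    stale = []
    # describe_snapshots is paginated; iterate over all pages.
    paginator = ec2.get_paginator("describe_snapshots")
    for page in paginator.paginate(OwnerIds=["self"]):
        for snap in page["Snapshots"]:
            if snap["StartTime"] < cutoff:
                stale.append(snap["SnapshotId"])
    return stale

if __name__ == "__main__":
    for snapshot_id in stale_snapshots():
        print(f"Stale snapshot: {snapshot_id}")
```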

Posted 3 weeks ago

Apply

175.0 years

8 - 9 Lacs

Gurgaon

Remote

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?

What is Amex's objective for a digital workplace? The Digital Workplace AI/ML Platform with GenAI capabilities aims to bring together the data from all Unified Workspace, Collaboration, and Colleague Servicing platforms, combining this with HR, Information Security, and network data to provide real-time, meaningful insights in areas such as user experience, health scoring, productivity, and overall IT visibility. As the Senior Engineer 2 of the Digital Workplace AI/ML Platform, you will be responsible for leading the engineering teams that develop the GenAI-based data and cloud platform and enhance it to provide personalization capabilities, analytics, engineering automations, and best practices. Our winning aspiration is to deliver the best Colleague digital experience. We simplify work and raise productivity by empowering Colleagues with the best digital tools and services.

Opportunity for Impact

Digital Workplace at American Express is entering a new phase of technology transformation, driven by opportunities to improve Colleague experience, raise productivity and collaboration, and drive operational efficiency of all service and infrastructure operations. If you have the talent and desire to deliver innovative products and services at a rapid pace, with hands-on experience and strategic thinking in areas of productivity and collaboration software suites, endpoint computing and security, mobile platforms, data management and analytics, and software engineering, join our leadership team to help with our transformation journey.

Role and Responsibilities

The data platform with GenAI capabilities is central to the future of how we work and improve colleague experience while identifying opportunities for improvement. As the leader of this group, you will:
- Create and manage a complex distributed data engineering pipeline at scale and manage its availability, throughput, and security.
- Lead engineering teams and partner with Product Management to enhance the quality and outcomes of the data products, including data analytics and visualization.
- Lead the solutioning of software infrastructure, authentication/authorization (OKTA, JWT, SSL, A2A), CI/CD pipelines, and DevOps; the engineering skill set is needed to run the GenAI/AI/ML models on the cloud platform to derive meaningful insights and predictive analysis from search and chatbots.
- Work and collaborate with Product Management and Digital Workplace teams to influence key decisions on the architecture and implementation of a scalable, reliable, and cost-effective AI/ML platform.
- Bring thought leadership to advance the overall state of technology and customer focus for the platform.
- Manage delivery milestones, deployment cycles, and delivery of the overall software, data engineering, and platform roadmap.
- Bring enthusiasm for staying up to date with the latest advancements in AI, NLP, and large language models. A portfolio showcasing previous language model projects, contributions to open-source projects, or relevant publications is a plus.
- Build, inspire, and grow talented engineering teams that are responsible for designing and scaling solutions using DevOps and analytics skills.
- Build a culture of innovation, ownership, accountability, and customer focus.
- Contribute to the American Express Data Strategy, working with other Technology teams to drive enterprise solutions, define best practice at a company level, and further develop skills and experience outside Digital Workplace.
- Partner with the Digital Workplace technology teams to develop the AI/ML platform strategy across all products and channels.
- Participate actively and constructively in agile team meetings and application reviews.
- Work directly with, and learn from, the business, product, and engineering leaders across the organization.
- Strengthen collaboration with industry partners and suppliers for more robust data solutions and market research into innovative solutions in this space.

Professional Qualifications:
- Demonstrated experience leading engineering teams in a remote and distributed product engineering setup.
- Extensive software development experience in Java, Spring, Spring Boot, React, databases (Postgres, SQL DB), RESTful APIs, container-based applications, and Python.
- Demonstrated experience in data migration, integration, etc.
- Extensive knowledge of DevOps best practices (CI/CD, GitHub Actions), observability, databases, caches, and software design.
- Strong experience in scalability and large-scale distributed system design to handle millions of requests, including reliability engineering and platform monitoring.
- Expertise in pre-processing and cleaning large datasets while ingesting data from multiple sources within the enterprise (a minimal cleaning sketch follows this posting).
- Experience with data structures, algorithms, and software design; exposure to data science, including predictive modelling.
- Willingness to learn GenAI, AI, LLMs, RAG, and NLP as used in multilingual conversational systems, and to go the extra mile to solve real-world user commands and requests by identifying the right LLM models, tooling, and frameworks.
- Willingness to deploy language models in production environments and integrate them into applications, platforms, or services.
- Ability to review architecture and provide technical guidance for engineers.
- Good to have: experience with various data architectures, the latest tools, and current and future trends in the data engineering space, especially Big Data, streaming, and cloud technologies like GCP, AWS, and Azure.
- Good to have: experience with Big Data technologies (Spark, Kafka, Hive, etc.) and at least one Big Data implementation on platforms like Cornerstone or Teradata.
- Experience with visualization tools like ElasticSearch, Tableau, Power BI, etc.
- Experience with complex, high-volume, multi-dimensional data ingestion, including unstructured, structured, and streaming datasets.
- Ability to learn new tools and paradigms in data engineering and science.
- Well versed in Agile, SAFe, and program management methods.
- Bachelor's degree, with a preference for Computer Science; Master's/PhD is a plus.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
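To make the "pre-processing and cleaning large datasets" qualification above concrete, here is a minimal, hypothetical pandas sketch. The file name, column names, and cleaning rules are illustrative assumptions, not taken from the posting.

```python
# Minimal sketch (assumptions: pandas installed; file name, column names,
# and cleaning rules are illustrative, not from the posting).
import pandas as pd

def clean_events(path: str) -> pd.DataFrame:
    """Load a raw event extract and apply typical ingestion-time cleaning."""
    df = pd.read_csv(path)
    # Normalize column names so downstream code can rely on them.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    # Parse timestamps; coerce bad values to NaT instead of failing the load.
    df["event_time"] = pd.to_datetime(df["event_time"], errors="coerce")
    df = df.dropna(subset=["event_time", "user_id"])
    # Normalize free-text categories before downstream aggregation.
    df["device_type"] = df["device_type"].str.strip().str.lower()
    return df

if __name__ == "__main__":
    print(clean_events("events.csv").head())
```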

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

Our Company:

At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:

In this role you will lead a critical and highly visible function within the Teradata Vantage platform. You will be given the opportunity to autonomously deliver the technical direction of the service and the feature roadmap. You will work with extraordinary talent and have the opportunity to shape the team to best execute on the product.

Job Responsibilities:
- Design, develop, and scale intelligent software systems that power autonomous AI agents capable of reasoning, planning, acting, and learning in real-world environments.
- Lead the implementation of core agentic AI components, including agent memory, context-aware planning, multi-step tool use, and self-reflective behavior loops (a minimal tool-calling loop is sketched after this posting).
- Architect robust, cloud-native backends that support high-throughput agent pipelines across AWS, Azure, or GCP, with best-in-class observability and fault tolerance.
- Build seamless integrations with large language models (LLMs) such as GPT-4, Claude, Gemini, or open-source models, using advanced techniques like function calling, dynamic prompting, and multi-agent orchestration.
- Develop scalable APIs and services to connect agents with internal tools, vector databases, RAG pipelines, and external APIs.
- Own technical delivery of major agent-related features, leading design reviews, code quality standards, and engineering best practices.
- Collaborate cross-functionally with researchers, ML engineers, product managers, and UX teams to translate ideas into intelligent, performant, and production-ready systems.
- Define and implement testing strategies to validate agentic behavior in both deterministic and probabilistic conditions.
- Guide junior engineers and peers by mentoring, unblocking challenges, and championing a culture of technical excellence.
- Continuously evaluate emerging frameworks, libraries, and research to drive innovation in our agentic AI stack.

What makes you a qualified candidate:
- 5+ years of hands-on experience in backend development, distributed systems, or AI infrastructure, with a proven track record of delivering in high-scale environments.
- Expertise in building and deploying AI-integrated software, particularly with LLMs and frameworks like LangChain, AutoGen, CrewAI, Semantic Kernel, or custom orchestrators.
- Strong development skills in Python (preferred), Go, Java, or similar languages used in intelligent system design.
- Practical knowledge of agentic AI principles, including task decomposition, autonomous decision-making, memory/context management, and multi-agent collaboration.
- Familiarity with vector databases (Pinecone, Weaviate, FAISS) and embedding models for semantic search and retrieval-augmented generation (RAG).
- Deep experience working with cloud platforms (AWS, Azure, or GCP), containerized environments (Docker/Kubernetes), and infrastructure-as-code (Terraform, Ansible).
- Demonstrated ability to design clean APIs, modular microservices, and resilient, maintainable backend systems.
- Clear communication, with the ability to simplify complex AI system behaviors into actionable architecture.
- Passion for AI and a hunger to build systems that push the boundaries of autonomous software.
- Strong understanding of Agile software development, CI/CD practices, and collaborative team workflows.

What you will bring:
- BS or MS degree in Computer Science, Artificial Intelligence, Software Engineering, or a related technical field.
- A solid foundation in software engineering principles, including system design, data structures, algorithms, and distributed computing.
- Proven ability to work in a fast-paced, innovation-driven environment where engineers take full ownership from concept to deployment.
- Experience deploying and operating intelligent systems in production environments with live users and evolving requirements.
- Curiosity, creativity, and the mindset of a builder: someone who thrives at the intersection of AI research and real-world impact.
- Desire to help shape the future of software agents by building scalable, reliable, and intelligent backends that unlock new capabilities in autonomy and adaptability.

Why we think you will love Teradata:

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

#LI-NT1
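As a rough illustration of the multi-step tool use named above, here is a minimal, hypothetical Python sketch of an agent loop. The `call_llm` function stands in for any chat-completion API, and the JSON action protocol and the single `search_docs` tool are assumptions for illustration, not Teradata's implementation.

```python
# Minimal agent-loop sketch (assumptions: call_llm is a stand-in for any
# chat-completion API; the JSON action protocol below is illustrative).
import json

def call_llm(messages: list[dict]) -> str:
    """Hypothetical LLM call; replace with a real chat-completion client."""
    raise NotImplementedError

def search_docs(query: str) -> str:
    """Hypothetical tool the agent may invoke."""
    return f"Top result for {query!r}"

TOOLS = {"search_docs": search_docs}

def run_agent(goal: str, max_steps: int = 5) -> str:
    # The transcript doubles as the agent's short-term memory.
    messages = [
        {"role": "system", "content":
            'Answer the goal. To use a tool, reply with JSON: '
            '{"tool": "search_docs", "input": "..."}. '
            'Otherwise reply with {"final": "..."}.'},
        {"role": "user", "content": goal},
    ]
    for _ in range(max_steps):
        reply = json.loads(call_llm(messages))
        if "final" in reply:
            return reply["final"]
        # Execute the requested tool and feed the observation back.
        observation = TOOLS[reply["tool"]](reply["input"])
        messages.append({"role": "assistant", "content": json.dumps(reply)})
        messages.append({"role": "user", "content": f"Observation: {observation}"})
    return "Step budget exhausted."
```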

Posted 3 weeks ago

Apply

10.0 years

1 - 2 Lacs

Hyderābād

On-site

As a Senior Lead Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you will be a key member of the Data Product Solutions Architecture Team. Your role involves designing, developing, and implementing analytical data solutions that align with the organization's strategic goals. You will leverage your expertise in data architecture, data modeling, data migrations, and data integration, collaborating with cross-functional teams to achieve target-state architecture goals.

Job Responsibilities:
- Represent the Data Product Solutions Architecture team in various forums, advising on data product solutions.
- Lead the design and maintenance of scalable data solutions, including data lakes and warehouses.
- Collaborate with cross-functional teams to ensure data product solutions support business needs and enable data-driven decision-making.
- Evaluate and select data technologies, driving the adoption of emerging technologies.
- Develop architectural models using ArchiMate, the C4 Model, etc., and other artifacts to support data initiatives.
- Serve as a subject matter expert in specific areas; contribute to the data engineering community and advocate for firmwide data practices.
- Engage in hands-on coding and design to implement production solutions, and optimize system performance by resolving inefficiencies.
- Influence product design and technical operations.
- Develop multi-year roadmaps aligned with business and data technology strategies.
- Design reusable data frameworks using new technologies.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field with 10+ years of experience.
- 5+ years as a Data Product Solution Architect or in a similar role leading technologists to manage, anticipate, and solve complex technical items within your domain of expertise.
- Hands-on experience in system design, application development, and operational stability.
- Expertise in architecture disciplines and programming languages.
- Deep knowledge of data architecture, modeling, integration, cloud data services, data-domain-driven design, best practices, and industry trends in data engineering.
- Practical experience with AWS, big data technologies, and data engineering disciplines; advanced experience in one or more disciplines, e.g., streaming, ETL/ELT, or event processing.
- Proficiency in SQL and data warehousing solutions using Teradata or similar cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
- Strong problem-solving, communication, and interpersonal skills.
- Ability to evaluate and recommend technologies for future-state architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience, especially in card and banking.
- Experience with modern data processing technologies such as Kafka streaming, DBT, Spark, Python, Java, Airflow, etc., using a data mesh and data lake.
- Business architecture knowledge and experience with architecture assessment frameworks.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 7 Lacs

Hyderābād

Remote

Our Company

Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
- Participate in monthly close activities, including preparing journal entries and conducting analysis of GL activity.
- Review, record, and assess investment and equity eliminations to the consolidated financial statements.
- Prepare balance sheet reconciliations, investigate discrepancies, and ensure financial data integrity; recommend reclassifications and adjustments as needed.
- Participate in quarterly balance sheet review and flux analysis.
- Support corporate departments such as HR, Marketing, Tax, and Treasury, as well as internal and external audit requests.
- Interpret financial reports to communicate key findings to stakeholders.
- Document systems and processes and ensure compliance with SOX requirements.
- Handle ad hoc reporting and projects as needed.

Who You'll Work With

The GL Staff Accountant will be part of the Corporate Accounting team within the Teradata Corporate Controller's group based out of Atlanta, Georgia. The Corporate Accounting team is responsible for the administration and performance of a wide array of general ledger accounting functions. Team members are expected to support a variety of corporate departments and interact with various levels of management. This role will work closely with the corporate and regional accounting teams and report directly to the Corporate Accounting Manager. We're seeking a detail-oriented and motivated accountant to join our team. The role will play a key part in supporting month-end processes, maintaining accurate financial records, and contributing to the overall success of the accounting operations. Candidates must be comfortable executing and troubleshooting on their own while hitting tight deadlines.

What Makes You a Qualified Candidate
- Advanced problem-solving skills, with the ability to interpret data to reach conclusive results and the critical thinking to ensure accuracy and GAAP compliance.
- A basic understanding of GAAP accounting principles.
- A strong team player who collaborates well with others of different personalities and backgrounds.
- A quick learner with the desire to improve skill sets; flexible and willing to adapt to an ever-changing business environment.
- Strong time management skills, with the ability to effectively prioritize tasks, work well under pressure, and meet tight deadlines.
- Comfortable working in a fully remote environment.
- Bachelor's degree in accounting, finance, or a related field.
- 5 to 7 years of relevant accounting experience.
- Well-developed Microsoft Office skills, including intermediate Excel proficiency.
- Strong oral and written communication skills.

What You'll Bring
- Experience with Oracle Cloud ERP, specifically the general ledger, and with Smart View.
- Experience working in accounting for publicly traded companies with international operations.
- Experience preparing and booking journal entries.
- Power BI experience or similar business intelligence tools.

Why We Think You'll Love Teradata

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

#LI-NT1

Posted 3 weeks ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Software Engineering Lead Analyst - HIH - Evernorth

ABOUT EVERNORTH: Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview

Responsible for the overall design and development of data integration code for the engineering team/asset. Responsible for providing technical knowledge on integration and ETL processes to build and establish coding standards. Create a working agreement within the team and with the other stakeholders involved. Partner and align with the Enterprise Architect, Product Owner, Production Support, Analysts, the PVS Testing team, and data engineers to build solutions that conform to defined standards.

Responsibilities
- Design, develop, and unit test ETL jobs that read and write data from database tables, flat files, datasets, IBM MQs, Kafka topics, S3 files, etc.
- Carry out data profiling of source data and generate logical data models (as required or applicable).
- Define, document, and complete System Requirements Specifications, including functional requirements, context diagrams, non-functional requirements, and business rules (as applicable for sprints to be completed).
- Create source-to-target mapping documents as required and applicable.
- Support definition of business requirements, including assisting the Product Owner in writing user stories and acceptance criteria for user stories.
- Support other scrum team members during activities such as the design of test scenarios and test cases (as required or applicable).
- Develop and identify data requirements for unit, systems, and integration tests.

Qualifications

Required Skills:
- Adept working experience in the design and development of performance-efficient ETL flows dealing with millions of rows in volume (a small sketch of such a flow follows this posting).
- Experience working in a SAFe Agile Scrum project delivery model.
- Good at writing complex SQL queries to pull data out of RDBMS databases like Oracle, SQL Server, DB2, Teradata, etc.
- Good working knowledge of Unix scripts and of batch job scheduling software such as CA ESP.
- Experienced in using CI/CD methodologies.

Education:
- Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus.

Certifications:
- Relevant certifications in AWS are a plus.

Location & Hours of Work

Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement

Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services

Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
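As a rough illustration of the high-volume ETL flows described above, here is a minimal, hypothetical Python sketch that extracts a large table in bounded-memory chunks. SQLite stands in for an RDBMS like Oracle or Teradata; the table, columns, and cleaning rule are illustrative assumptions.

```python
# Minimal chunked extract-transform-load sketch (assumptions: SQLite stands in
# for an RDBMS like Oracle or Teradata; table and column names are illustrative).
import sqlite3

import pandas as pd

def etl(source_db: str, out_path: str, chunk_rows: int = 50_000) -> None:
    """Stream a large table out of the source in chunks to bound memory use."""
    conn = sqlite3.connect(source_db)
    query = "SELECT claim_id, member_id, amount, service_date FROM claims"
    first = True
    for chunk in pd.read_sql_query(query, conn, chunksize=chunk_rows):
        # Transform: standardize dates and drop rows failing a basic rule.
        chunk["service_date"] = pd.to_datetime(chunk["service_date"], errors="coerce")
        chunk = chunk[chunk["amount"] >= 0].dropna(subset=["service_date"])
        # Load: append to a flat file a downstream job can pick up.
        chunk.to_csv(out_path, mode="w" if first else "a", header=first, index=False)
        first = False
    conn.close()

if __name__ == "__main__":
    etl("claims.db", "claims_clean.csv")
```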

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

Our Company:

At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:

In this role you will lead a critical and highly visible function within the Teradata Vantage platform. You will be given the opportunity to autonomously deliver the technical direction of the service and the feature roadmap. You will work with extraordinary talent and have the opportunity to shape the team to best execute on the product.

Job Responsibilities:
- Contribute to the design and development of software components that enable autonomous AI agents to reason, plan, act, and learn.
- Implement features that support agentic capabilities such as tool use, memory, long-term planning, task execution, and self-reflection.
- Integrate with state-of-the-art LLMs (e.g., OpenAI, Anthropic) and agent orchestration frameworks (e.g., LangChain, CrewAI, AutoGen).
- Write, test, and debug Python or other backend code that powers agent workflows in cloud-native environments.
- Assist in building interfaces between agents and tools such as APIs, databases, search engines, and user-facing applications.
- Participate in Agile development workflows (sprint planning, code reviews, retrospectives) and collaborate across AI, product, and engineering teams.
- Work with structured and unstructured data to support context-aware reasoning and decision-making in agents.
- Contribute to the development of internal testing tools and evaluation methods for autonomous agent behaviors.
- Support the deployment of agent systems in scalable environments using Kubernetes, Docker, and cloud services (AWS, Azure, or GCP).
- Maintain documentation of agent capabilities, system design, and integration strategies.

What makes you a qualified candidate:
- 1-3 years of experience (or a strong academic/internship background) in software development with a focus on AI/ML, intelligent systems, or cloud-native applications.
- Strong interest in agentic AI: systems where AI agents can autonomously interpret goals, take actions, and learn from outcomes.
- Familiarity with one or more programming languages: Python (preferred), Go, Java, or C++.
- Exposure to LLMs and AI APIs (e.g., OpenAI, Hugging Face, Claude) and an understanding of prompting, embeddings, or vector databases (a small embedding-retrieval sketch follows this posting).
- Basic understanding of how AI agents use tools and memory to solve complex tasks.
- Experience with RESTful APIs, backend service design, and microservice architectures.
- Familiarity with modern cloud infrastructure (AWS, GCP, or Azure), especially for deploying containerized applications.
- Ability to write clean, testable code and follow best practices in code reviews and version control.
- Clear communication, with a willingness to ask questions, propose ideas, and collaborate across teams.
- Passion for learning emerging AI techniques and contributing to real-world agentic applications.

What you will bring:
- BS degree in Computer Science, Artificial Intelligence, Software Engineering, or a related technical field.
- Strong problem-solving skills and a solid foundation in data structures, algorithms, and object-oriented programming.
- Experience with cloud tools and platforms (AWS, GCP, Azure), especially Kubernetes and Docker.
- Exposure to agentic frameworks such as LangChain, AutoGen, CrewAI, or similar tools.
- Curiosity and excitement about building intelligent systems that act autonomously and adaptively.
- A mindset geared toward experimentation, continuous learning, and pushing the boundaries of AI capabilities.

Why we think you will love Teradata:

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

#LI-NT1
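To ground the embeddings and retrieval concepts named above, here is a minimal, hypothetical Python sketch of similarity-based retrieval, the "R" in RAG. The `embed` function stands in for any embedding-model API; everything else is plain Python and an assumption for illustration.

```python
# Minimal retrieval sketch (assumption: embed() stands in for any embedding
# model API; documents and ranking parameters are illustrative).
import math

def embed(text: str) -> list[float]:
    """Hypothetical embedding call; replace with a real embedding model."""
    raise NotImplementedError

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 3) -> list[str]:
    """Rank documents by embedding similarity to the query."""
    query_vec = embed(query)
    scored = [(cosine(query_vec, embed(doc)), doc) for doc in docs]
    return [doc for _, doc in sorted(scored, reverse=True)[:k]]
```

A vector database such as Pinecone, Weaviate, or FAISS replaces the linear scan here with an approximate nearest-neighbor index, but the similarity idea is the same.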

Posted 3 weeks ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

Responsibilities

What you'll be doing...
- Publishing various insights and inferences for technical and senior leadership to make informed decisions.
- Collecting, processing, and performing statistical analysis on large datasets to discover useful information, suggest conclusions, and support decision-making.
- Identifying, defining, and scoping moderately complex data analytics problems in the Enterprise Cyber Security domain.
- Developing cross-domain strategies for increased network security and resiliency of critical infrastructure, working with researchers in other disciplines.
- Designing, developing and maintaining applications and databases by evaluating business needs, analyzing requirements and developing software systems.
- Researching, developing, designing and implementing machine learning algorithms for cyber threat detection in Enterprise Security and IAM functions, transforming data points into objectives (a minimal anomaly-detection sketch follows this posting).
- Executing the full software development life cycle (SDLC): concept, design, build, deploy, test, release and support.
- Managing daily activities, including but not limited to attending project calls to groom new user stories; acting as a liaison between business and technical teams; collecting, organizing, and interpreting data using statistical tools; and developing user interface components using programming languages and visualization techniques.
- Handling all aspects of a project, from analysis and testing through implementation and post-launch support.

What we're looking for...
- Experience with SQL Server, Teradata, or DB2 databases.
- Experience with advanced analytics using R or Python for data analysis.
- Fundamental knowledge of, and/or experience applying, algorithms in one or more of the following machine learning areas: anomaly detection, one/few-shot learning, deep learning, unsupervised feature learning, ensemble methods, probabilistic graphical models, and/or reinforcement learning.
- Experience with visualization software like Tableau, Qlik, Looker or Thoughtspot to tell data-driven stories to business users at all levels.
- Broad knowledge of IT security, such as endpoint, network and cloud security.
- Developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux); ensuring that these components and the overall application are robust and easy to maintain; and coordinating with the rest of the team working on different layers of the infrastructure. Duties include designing software solutions to meet project requirements, maintaining and refactoring existing code, writing tests, and fixing bugs.
- Ability to communicate comprehensive knowledge effectively across multi-disciplinary teams and to non-cyber experts, with the interpersonal skills necessary to collaborate effectively in a team environment.
- Following appropriate systems life cycle methodologies, Agile and Waterfall, for quality and maintainability, and communicating status to IT management.
- Staying abreast of changes and advances in data warehousing technology; playing detective by digging deep into the data warehouse to ensure new data requirements are not already available for the business to access and, if not, determining how the new data will fit in, be ingested, and be exposed in a usable manner.

You'll need to have:
- A Bachelor's degree with two or more years of work experience.
- Two or more years of professional experience in data analytics, business analysis or a comparable analytics position.
- Ability to write SQL against a relational database in order to analyze and test data.
- Two or more years of professional experience in the IT security domain.
- Familiarity with RESTful APIs and application development.
- Experience with popular React.js workflows (such as Flux or Redux).
- Exposure to Threat, Risk and Vulnerability Management is an added advantage.

Even better if you have one or more of the following:
- A Bachelor's degree in Computer Science/Information Systems or an equivalent combination of education and work experience.
- Strong verbal and written communication skills and the ability to work in a team environment.
- Familiarity with modern front-end build pipelines and tools.
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

When you join Verizon, you'll be doing work that matters alongside other talented people, transforming the way people, businesses and things connect with each other. Beyond powering America's fastest and most reliable network, we're leading the way in broadband, cloud and security solutions, Internet of Things and innovating in areas such as video entertainment. Of course, we will offer you great pay and benefits, but we're about more than that. Verizon is a place where you can craft your own path to greatness. Whether you think in code, words, pictures or numbers, find your future at Verizon.

Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
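As a rough illustration of the anomaly-detection work named above, here is a minimal, hypothetical scikit-learn sketch. The login-event features, the injected outliers, and the contamination rate are illustrative assumptions, not Verizon's data or method.

```python
# Minimal anomaly-detection sketch (assumptions: scikit-learn and NumPy
# installed; the login-event features below are illustrative toy data).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Toy feature matrix: [failed_logins_per_hour, bytes_out_mb] per account.
normal = rng.normal(loc=[2.0, 50.0], scale=[1.0, 10.0], size=(500, 2))
suspicious = np.array([[40.0, 900.0], [25.0, 5.0]])  # injected outliers
events = np.vstack([normal, suspicious])

# Fit an unsupervised model; 'contamination' is the assumed outlier share.
model = IsolationForest(contamination=0.01, random_state=0).fit(events)
labels = model.predict(events)  # -1 = anomaly, 1 = normal

print("Flagged rows:", np.where(labels == -1)[0])
```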

Posted 3 weeks ago

Apply

3.0 - 5.0 years

1 - 4 Lacs

Noida

On-site

Req ID: 328437

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Modeller to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Description

Mandatory Skills: ER Studio, Oracle, Teradata
- Creating and updating data models and defining information requirements for small to medium size projects.
- Creating ETL specifications / source-to-target mappings based on project requirements.
- Generating the data definition language (DDL) used to create database schemas and tables, and creating optimal database views aligned with business and technical needs (a small DDL-generation sketch follows this posting).
- Working with assigned technical teams to ensure correct deployment of DDL, and synchronizing models to ensure that database structures match models.
- Conducting business and data analysis.
- Working independently on projects with guidance from the Project Leader.
- Domain proficiency with Healthcare Plan/Payer.

Qualifications:
- Previous experience working with business and technical teams, compiling business definitions for enterprise data model attributes.
- 3-5 years' experience in a high-tech environment in technical or business application analysis, or an equivalent combination of experience and education.
- Physical data modeling, business and data analysis, technical specification development, and enterprise data mapping.
- Experience with relational databases, Teradata preferred.
- Understanding of data warehouses, ETL technology, and business intelligence reporting.
- Understanding of enterprise logical models (EDM) containing all entities and their relationships, with a complete set of documentation using industry-standard tools and techniques.
- Bachelor's degree in a related technical field of study that includes basic data modeling.

Required Skills:
- Excellent written and verbal communication skills; the candidate must be able to interact effectively with both technical and business users.
- Physical data modeling, business and data analysis, technical specification development, and data mapping.
- Experience using PowerDesigner or another data modeling tool such as erwin.
- Advanced SQL expertise.
- Linux operating system.

Nice to have:
- Experience with the Healthcare Payer domain.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
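To illustrate the DDL-generation responsibility named above, here is a minimal, hypothetical Python sketch that renders CREATE TABLE statements from a simple logical model. The model dictionary, column types, and primary keys are illustrative assumptions, not a real project model.

```python
# Minimal DDL-generation sketch (assumptions: the model dict and the
# column types are illustrative, not a real healthcare-payer model).
MODEL = {
    "member": {
        "member_id": "INTEGER NOT NULL",
        "plan_code": "VARCHAR(10) NOT NULL",
        "enroll_date": "DATE",
    }
}

def render_ddl(model: dict, primary_keys: dict) -> str:
    """Render CREATE TABLE statements from a simple logical model."""
    statements = []
    for table, columns in model.items():
        cols = ",\n".join(f"    {name} {dtype}" for name, dtype in columns.items())
        pk = f",\n    PRIMARY KEY ({primary_keys[table]})"
        statements.append(f"CREATE TABLE {table} (\n{cols}{pk}\n);")
    return "\n\n".join(statements)

if __name__ == "__main__":
    print(render_ddl(MODEL, {"member": "member_id"}))
```

Dedicated tools such as ER Studio or PowerDesigner generate DDL from the graphical model in the same spirit, with far richer type and constraint handling.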

Posted 3 weeks ago

Apply

3.0 years

1 - 9 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Design, develop, and implement data models and ETL processes for Power BI solutions.
- Understand and create test scripts for data validation as data moves through various lifecycles in cloud-based technologies (a minimal validation sketch follows this posting).
- Work closely with business partners and data SMEs to understand Healthcare Quality Measures and the related business requirements.
- Conduct data validation after major and minor enhancements, and determine the best data validation techniques to implement.
- Communicate effectively with leadership and analysts across teams.
- Troubleshoot and resolve issues with jobs, pipelines, and overhead.
- Ensure data accuracy and integrity between sources and consumers.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Graduate degree or equivalent (B.Tech./MCA preferred) with 3+ years of overall work experience.
- 3+ years of advanced understanding of at least one programming language: Python, Spark, or Scala.
- Experience working with cloud technologies, preferably Snowflake, ADF and Databricks.
- Experience working with Agile methodology (preferably in Rally).
- Knowledge of Unix shell scripting for automating and scheduling batch jobs.
- Knowledge of configuration management: GitHub.
- Knowledge of relational databases: SQL Server, Oracle, Teradata, IBM DB2, MySQL.
- Knowledge of messaging queues: Kafka/ActiveMQ/RabbitMQ.
- Knowledge of CI/CD tools: Jenkins.
- Understanding of relational database models and entity-relationship diagrams.
- Proven solid communication and interpersonal skills, with excellent written and verbal communication and the ability to provide clear explanations and overviews of work efforts to others (internal and external).
- Proven solid facilitation, critical thinking, problem solving, decision making and analytical skills.
- Demonstrated ability to prioritize and manage multiple tasks.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
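As a rough illustration of the data-validation responsibility named above, here is a minimal, hypothetical PySpark sketch comparing a source and target after a pipeline run. The file paths and the `amount` reconciliation column are illustrative assumptions, not from the posting.

```python
# Minimal source-vs-target validation sketch (assumptions: PySpark installed;
# file paths and the amount column are illustrative, not from the posting).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("validate").getOrCreate()

source = spark.read.parquet("/data/raw/claims")
target = spark.read.parquet("/data/curated/claims")

# Check 1: no rows lost or duplicated in flight.
assert source.count() == target.count(), "row count mismatch"

# Check 2: a reconciliation total survives the transformation.
src_total = source.agg(F.sum("amount")).first()[0]
tgt_total = target.agg(F.sum("amount")).first()[0]
assert src_total == tgt_total, f"amount drift: {src_total} vs {tgt_total}"

print("Validation passed")
```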

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida

On-site

Req ID: 328482

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an ETL Informatica/IICS Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal.
- Handle ETL code development, unit testing, source code control, technical specification writing, and production implementation.
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux.
- Interact with BAs and other teams for requirement clarification, query resolution, testing, and sign-offs.
- Develop software that conforms to a design model within its constraints.
- Prepare documentation for design, source code, and unit test plans.
- Ability to work as part of a global development team.
- Should have good knowledge of the healthcare domain and data warehousing concepts.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Andhra Pradesh

On-site

QA & Testing Lead Analyst

Position Overview

The QA & Testing Lead Analyst will act as a member of our PBM Technology QA Team and will provide manual and automated testing for our proprietary applications. You will collaborate with engineers to understand risks and to communicate found issues, and work within cross-functional feature teams, collaborating closely with engineers, designers, and product managers, as well as the other QA engineers.

Responsibilities
- Write test strategy and test case documents derived from user stories for one or more features. Test cases should include positive and negative scenarios as well as test data setup/configuration and expected results.
- Contribute to other testing activities, such as stress, load, and performance testing, where required.
- Design, build, and maintain efficient, reusable, and reliable test framework components as part of framework improvement and enhancement.
- Provide estimates for testing effort based on user stories as part of sprint planning.
- Contribute to and participate in other Agile scrum activities, such as daily standups, backlog grooming, demos, and retrospectives.
- Ensure the best possible performance, quality, and responsiveness of the applications.
- Help maintain code quality, organization, and automation.
- Implement test automation programs using Java, Ruby, SQL, and scripting languages with no guidance (a minimal automation sketch follows this posting).
- Able to work on projects individually and directly with clients.

Qualifications

Required Skills:
- Experience maintaining large-scale QA platforms; capable of writing test automation programs using frameworks.
- Testing and triaging of defects and issues; knowledge of defect-tracking/task tools such as Jira and Confluence.
- QA analysis across UI and DWBI testing: Tableau, SQL, Oracle, DB2, MongoDB, Teradata, Hadoop, JMeter, Java, Selenium, Cucumber, Python.
- Knowledge of build automation and deployment tools such as Jenkins, as well as source code repository tools such as Git.
- Experience with other forms of testing, including stress/load, performance, security (nice to have), and browser compatibility testing.
- Strong written and verbal communication skills, with the ability to interact with all levels of the organization; strong influencing/negotiation, interpersonal/relationship, and time and project management skills.

Required Experience & Education:
- 6+ years of experience.
- Experience with an onshore/offshore model.
- Proven experience with Java, Selenium, TestNG, Cucumber, strong SQL, and test automation; AWS is a plus.
- College degree (Bachelor) in related technical/business areas or equivalent work experience.

Desired Experience:
- Healthcare experience, including Disease Management.

Location & Hours of Work

Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement

Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services

Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
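As a rough illustration of the UI test automation named above, here is a minimal, hypothetical Selenium sketch in Python (Python and Selenium both appear in the skills list). The URL, element names, and page title are illustrative assumptions, not the actual application under test.

```python
# Minimal UI test-automation sketch (assumptions: selenium installed and a
# Chrome driver available; the URL and element names are illustrative).
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_page_renders():
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")  # hypothetical app under test
        # Positive scenario: the form's key elements are present and visible.
        assert driver.find_element(By.NAME, "username").is_displayed()
        assert driver.find_element(By.NAME, "password").is_displayed()
        assert "Login" in driver.title
    finally:
        driver.quit()

if __name__ == "__main__":
    test_login_page_renders()
    print("UI smoke test passed")
```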

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Overview Deputy Director - Data Engineering PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics, and new product development. PepsiCo’s Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. What PepsiCo Data Management and Operations does: Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company. Responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders. Increase awareness about available data and democratize access to it across the company. As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Responsibilities Data engineering lead role for D&Ai data modernization (MDIP) Ideally Candidate must be flexible to work an alternative schedule either on tradition work week from Monday to Friday; or Tuesday to Saturday or Sunday to Thursday depending upon coverage requirements of the job. The can didate can work with immediate supervisor to change the work schedule on rotational basis depending on the product and project requirements. Responsibilities Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work as well as empowering them to realize their full potential. Design, structure and store data into unified data models and link them together to make the data reusable for downstream products. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL. 
Enable and accelerate standards-based development, prioritizing reuse of code; adopt test-driven development, unit testing and test automation with end-to-end observability of data. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance and cost. Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application following well-architected design standards. Define and manage SLAs for data products and processes running in production. Create documentation for learnings and knowledge transfer to internal associates.

Qualifications 12+ years of engineering and data management experience; 12+ years of overall technology experience that includes at least 5+ years of hands-on software development, data engineering, and systems architecture. 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools. 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL or any other popular RDBMS. 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks. 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services; Azure Data Engineering certification is a plus. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one business intelligence tool such as Power BI or Tableau. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like ADO, GitHub and CI/CD tools for DevOps automation and deployments. Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. BA/BS in Computer Science, Math, Physics, or other technical fields. The candidate must be flexible to work an alternative work schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with the immediate supervisor.

Skills, Abilities, Knowledge: Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior level management. Proven track record of leading and mentoring data teams. Strong change manager; comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements.
High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment. Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs. Foster a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attain/exceed individual and team goals. Ability to lead others without direct authority in a matrixed environment. Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations. Domain Knowledge in CPG industry with Supply chain/GTM background is preferred.
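
To make the "reusable accelerators" idea above concrete, here is a minimal PySpark sketch of one Teradata-to-Databricks migration step. The host, table, and credential values are illustrative assumptions, not PepsiCo's actual environment, and a real accelerator would add incremental loads, schema mapping, and validation.

    # A minimal sketch of one legacy-warehouse-to-lakehouse copy step.
    # Requires the Teradata JDBC driver jar on the Spark classpath.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("td_to_lakehouse").getOrCreate()

    src = (spark.read.format("jdbc")
           .option("url", "jdbc:teradata://td-host/DATABASE=sales")  # hypothetical host
           .option("dbtable", "sales.daily_orders")                  # hypothetical table
           .option("user", "etl_user")
           .option("password", "***")
           .load())

    # Land the table in Delta format, partitioned for downstream reuse.
    (src.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("bronze.daily_orders"))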

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more.

Who You Are Role purpose: Working for Business Intelligence requires a good understanding of the business context and the business requirements. The focus of the role is testing of the BI Application, which is the data provisioning layer and the direct interface to our BI customers. The major focus is to assure the quality of backend development for this layer, which means being responsible for the development of the required data structures and data marts, their provisioning with data, and their transfer into regular operation. Business Intelligence testing initiatives help companies gain deeper and better insights so they can manage or make decisions based on hard facts or data. Simply put, a BI testing project is a testing project too. That means the typical stages of testing are applicable here too, whether it is the performance you are testing or functional end-to-end testing: test planning, test strategy, test design, test execution (once again, you are going to need some querying interface, such as Teradata SQL Assistant, to run your queries), and defect reporting, closure etc.

BI Testing Strategy The goal of testing BI applications is to achieve credible data. And data credibility can be attained by making the testing cycle effective. A comprehensive test strategy is the stepping stone of an effective test cycle. The strategy should cover test planning for each stage, every time the data moves, and state the responsibilities of each stakeholder, e.g. business analysts, infrastructure team, QA team, DBAs, developers and business users. To ensure testing readiness from all aspects, the key areas the strategy should focus on are: scope of testing (describe testing techniques and types to be used); test environment set up; test data availability (it is recommended to have production-like data covering all/critical business scenarios); and data quality and performance acceptance criteria.

What's In It For You Core competencies, knowledge and experience: Manual testing, DWH testing, SQL. Good understanding of a range of data manipulation and analysis techniques. Detail conscious, problem solver & innovative thinker. Ability to work under pressure to tight deadlines.
Should be able to interact with the client team and share ideas. Relevant work experience of 4-5 years. Experience: Verify that transformations of data are applied correctly at the data mart level. Check that the target and source databases are connected well and there are no access issues. While loading the data, check the performance of the session. Verify you can fail the calling parent task if the child task fails. Verify that the logs are updated. Verify mapping and workflow parameters are configured accurately. Verify data completeness. Make sure data transformations are correct as per the applied business logic. Make sure there is no data loss during the data integration process and handshaking between sources.

VOIS Equal Opportunity Employer Commitment VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!
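
As one concrete illustration of the test-execution stage described above, the sketch below runs a paired source/target check through the teradatasql driver. The connection details, table names, and check are assumptions for illustration, not the actual VOIS test suite.

    # A minimal sketch of a source-vs-target completeness check.
    import teradatasql  # official Teradata SQL Driver for Python

    CHECKS = [
        ("row_count_match",
         "SELECT COUNT(*) FROM stg_orders",    # hypothetical source table
         "SELECT COUNT(*) FROM mart_orders"),  # hypothetical data mart target
    ]

    with teradatasql.connect(host="tdhost", user="qa_user", password="***") as con:
        cur = con.cursor()
        for name, src_sql, tgt_sql in CHECKS:
            cur.execute(src_sql)
            src = cur.fetchone()[0]
            cur.execute(tgt_sql)
            tgt = cur.fetchone()[0]
            print(f"{name}: source={src} target={tgt} -> {'PASS' if src == tgt else 'FAIL'}")

The same pattern extends to null-key, duplicate, and transformation checks by adding entries to CHECKS.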

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Find your future at United! We’re reinventing what our industry looks like, and what an airline can be – from the planes we fly to the people who fly them. When you join us, you’re joining a global team of 100,000+ connected by a shared passion with a wide spectrum of experience and skills to lead the way forward. Achieving our ambitions starts with supporting yours. Evolve your career and find your next opportunity. Get the care you need with industry-leading health plans and best-in-class programs to support your emotional, physical, and financial wellness. Expand your horizons with travel across the world’s biggest route network. Connect outside your team through employee-led Business Resource Groups. Create what’s next with us. Let’s define tomorrow together.

Job Overview And Responsibilities The team is currently looking for an entry-level data scientist who has a passion for data and analytics, with the willingness to dig deep into details as well as the ability to assess the big picture. An in-depth understanding of modeling techniques and the ability to perform in a team as well as individually are paramount to success in this role. The person must be able to extract insights from large volumes of data and clearly communicate relevant triggers and recommendations. High-level responsibilities of the role include: Execute solutions to business problems using data analysis, data mining, optimization tools, statistical modeling and machine learning techniques. Continuously develop and demonstrate improved analysis methodologies. You will have strong modelling skills and be comfortable owning data and working from concept through to execution. Understand a business problem and the available data, and identify what analytics and modeling techniques can be applied to answer a business question. This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications Required Bachelor's degree required. Master's degree in a quantitative field like Math, Statistics, Operations Research and/or MBA preferred. At least 2+ years of experience in modeling/machine learning required. Proven comfort and an intellectual curiosity for working with very large sets of data, and hands-on knowledge in predictive modeling. Strong knowledge of either R or Python. Be experienced in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships. Understanding of how and when to apply predictive and machine learning techniques like logistic regression, random forest, GBM, Neural Nets, SVM etc. is required. Be proficient in using database querying tools and able to write complex queries and procedures using Teradata SQL and/or Microsoft TSQL. Hands-on experience in using Big Data ecosystems (Hadoop/Spark), API gateways and non-structured data will be a plus. Be able to communicate complex quantitative analysis and algorithms in a clear, precise and actionable manner. Exhibit written and spoken English fluency. About 4-6 weeks of travel to the US per year is required. Must be legally authorized to work in India for any employer without sponsorship. Must be fluent in English (written and spoken). Successful completion of interview required to meet job qualification. Reliable, punctual attendance is an essential function of the position. GGN00001901
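
For readers unfamiliar with the techniques the posting names, here is a minimal scikit-learn sketch of the logistic-regression workflow, using synthetic data as a stand-in for any airline dataset.

    # A minimal sketch: fit a classifier and report test-set AUC.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 8))  # stand-in features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("test AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))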

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities Design, develop, and implement data models and ETL processes for Power BI solutions. Be able to understand and create test scripts for data validation as it moves through various lifecycles in cloud-based technologies. Be able to work closely with business partners and data SMEs to understand Healthcare Quality Measures and their related business requirements. Conduct data validation after major/minor enhancements in the project and determine the best data validation techniques to implement. Communicate effectively with leadership and analysts across teams. Troubleshoot and resolve issues with jobs/pipelines/overhead. Ensure data accuracy and integrity between sources and consumers. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications Graduate degree or equivalent (B.Tech./MCA preferred) with overall 3+ years of work experience. 3+ years of advanced understanding of at least one programming language - Python, Spark, Scala. Experience working with cloud technologies, preferably Snowflake, ADF and Databricks. Experience working with Agile methodology (preferably in Rally). Knowledge of Unix shell scripting for automation & scheduling of batch jobs. Knowledge of configuration management - GitHub. Knowledge of relational databases - SQL Server, Oracle, Teradata, IBM DB2, MySQL. Knowledge of messaging queues - Kafka/ActiveMQ/RabbitMQ. Knowledge of CI/CD tools - Jenkins. Understanding of the relational database model & entity relation diagrams. Proven solid communication and interpersonal skills. Proven excellent written and verbal communication skills with the ability to provide clear explanations and overviews to others (internal and external) of their work efforts. Proven solid facilitation, critical thinking, problem solving, decision making and analytical skills. Demonstrated ability to prioritize and manage multiple tasks.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
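
As an illustration of the data-validation test scripts mentioned above, a minimal pytest-style sketch follows. The file and column names are hypothetical; production checks would run against Snowflake or Databricks rather than local files.

    # A minimal sketch of post-load validation tests (run with pytest).
    import pandas as pd

    def test_row_counts_match():
        src = pd.read_parquet("source_extract.parquet")  # hypothetical extract
        tgt = pd.read_parquet("target_load.parquet")     # hypothetical load
        assert len(src) == len(tgt), f"count mismatch: {len(src)} vs {len(tgt)}"

    def test_no_null_member_ids():
        tgt = pd.read_parquet("target_load.parquet")
        assert tgt["member_id"].notna().all(), "null member_id found"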

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Citi: Citi, the leading global bank, has approximately 200 million customer accounts and does business in more than 160 countries and jurisdictions. Citi provides consumers, corporations, governments and institutions with a broad range of financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management. Our core activities are safeguarding assets, lending money, making payments and accessing the capital markets on behalf of our clients. Citi’s Mission and Value Proposition explains what we do and Citi Leadership Standards explain how we do it. Our mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. We strive to earn and maintain our clients’ and the public’s trust by constantly adhering to the highest ethical standards and making a positive impact on the communities we serve. Our Leadership Standards are a common set of skills and expected behaviors that illustrate how our employees should work every day to be successful and strengthen our ability to execute against our strategic priorities. Diversity is a key business imperative and a source of strength at Citi. We serve clients from every walk of life, every background and every origin. Our goal is to have our workforce reflect this same diversity at all levels. Citi has made it a priority to foster a culture where the best people want to work, where individuals are promoted based on merit, where we value and demand respect for others and where opportunities to develop are widely available to all.

Job Description: Citi is looking for an energetic, motivated professional to join its Fraud Analytics team as a Spec Analytics Analyst 1. In this role, you will be tasked to prevent financial crime by working with the strategy development team to test, validate and execute various strategy changes and model deployments. Primary responsibilities include: Strong knowledge of system & user functionalities of various fraud orchestration engines such as Actimize (RCM/ActOne), SAS Fraud Manager, LexisNexis/ThreatMetrix etc. Business knowledge of financial crime risk management and digital fraud is good to have. Ensure seamless implementation of fraud rule (Actimize Policy Manager) and various traditional and machine learning model adds, deletes, and edits, ensuring all procedures and audit requirements are followed. The associate will be required to understand the logic (often in SQL) being adjusted and why the change is being made, ensuring 100% execution accuracy. Developing policy rules/fraud strategies on the Actimize Policy Manager UI and promoting the same to the Prod and Prod test environments. Collaborate with fraud strategy development teammates to provide feedback and perform validation on rule performance, trends and volumes. Focus on development of tactical and strategic solutions with high visibility for key stakeholders. Perform gap analysis to identify system weaknesses and work with various system partners to actively close such gaps. Assist with developing and supporting fraud rule performance management tracking. Objectives will include establishing consistent standards for key fraud metrics such as hit rate, false positive rate and ROI, and running regular reports to gather the same. Ensure rules are developed in a fashion that maximizes the capabilities of our decisioning platforms.
This will thus require developing a deep understanding of the platforms, their features and limitations. Periodic development of MIS reports and dashboards to communicate results on pre- and post-fraud-strategy changes. Partner with various stakeholders during strategy development and execution to identify testing and validation requirements. BAU: analysis of bad (fraud) merchants on the basis of fraud trends via SAS. BAU: analysis of the manual check process via a Python model and SAS. Contribute to organizational initiatives in wide-ranging areas including competency development, training, organizational building activities etc.

Job Skills/Qualifications: Bachelor’s degree, preferably in a quantitative discipline. Minimum 5 years of hands-on experience in system execution, system testing and/or analytics. Minimum 5 years’ working experience and good programming knowledge in SAS/SQL (Oracle/Teradata) is required. Good at enhancing SQL queries according to the requirements for data validation and analysis. Model execution and governance experience in any domain will be preferable. The candidate should have demonstrable analytic and problem-solving skills and the ability to deliver projects in a fast-paced environment. Strong verbal and written communication skills; experience with stakeholder management. Should also have a basic knowledge of reporting and visualization tools such as Tableau. Actimize knowledge (Policy Manager, platform list, profile reader and writer) would be a great add-on. Strategy execution or control background preferred. Ability to work independently. Risk and control mindset: ability to ask incisive questions, assess materiality and escalate issues. Ability to handle large volumes of transactional data. Strong project management and organizational skills and team-handling capability; ability to multi-task and meet deadlines.

------------------------------------------------------ Job Family Group: Decision Management ------------------------------------------------------ Job Family: Business Analysis ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
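
To illustrate the kind of rule logic (often in SQL) the associate would review before promotion, here is a minimal hedged sketch. The tables, thresholds, and helper are invented for illustration and do not reflect Citi's actual Actimize rules.

    # A minimal sketch of dry-running a candidate fraud rule to gauge hit volume.
    RULE_SQL = """
    SELECT t.account_id, t.txn_id, t.amount
    FROM   txn_stream t                                   -- hypothetical tables
    JOIN   account_profile p ON p.account_id = t.account_id
    WHERE  t.amount > 5 * p.avg_txn_amount_90d            -- spike vs. 90-day baseline
      AND  t.merchant_country <> p.home_country           -- cross-border anomaly
    """

    def dry_run_rule(conn):
        # Report hit volume on historical data before the rule goes live.
        cur = conn.cursor()
        cur.execute(RULE_SQL)
        hits = cur.fetchall()
        print(f"rule fired on {len(hits)} transactions in the test window")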

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Developer at Barclays where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable and secure infrastructure, ensuring seamless delivery of our digital solutions. To be successful as a Developer you should have experience with: Ab Initio experience. SQL / RDBMS knowledge. Unix / Python wrapper scripts. Experience in Oracle, Teradata. Some other highly valued skills/key accountabilities include: AWS (architecture, Glue, S3, Iceberg), DBT, Snowflake / Databricks. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. The role is based out of Pune.

Purpose of the role To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientists to build and deploy machine learning models.

Analyst Expectations To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise, and a thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team’s operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how own sub-function integrates with the function, alongside knowledge of the organisation’s products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
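
As a concrete example of the Unix/Python wrapper scripting the role lists, here is a minimal sketch of a job wrapper; the wrapped shell script is a hypothetical stand-in for an Ab Initio graph or BTEQ load step.

    # A minimal sketch of a pipeline-step wrapper with logging and fail-fast exit.
    import logging
    import subprocess
    import sys

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

    def run_step(name: str, cmd: list[str]) -> None:
        logging.info("starting step %s", name)
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            logging.error("step %s failed: %s", name, result.stderr.strip())
            sys.exit(result.returncode)  # fail fast so the scheduler can alert
        logging.info("step %s finished OK", name)

    if __name__ == "__main__":
        run_step("load_orders", ["sh", "load_orders.sh"])  # hypothetical load script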

Posted 3 weeks ago

Apply


6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Overview Of The Role Citi, the leading global bank, has approximately 200 million customer accounts and does business in more than 160 countries and jurisdictions. Citi provides consumers, corporations, governments, and institutions with a broad range of financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management. As a bank with a brain and a soul, Citi creates economic value that is systemically responsible and in our clients’ best interests. As a financial institution that touches every region of the world and every sector that shapes your daily life, our Enterprise Operations & Technology teams are charged with a mission that rivals any large tech company. Our technology solutions are the foundations of everything we do, from keeping the bank safe, managing global resources, and providing the technical tools our workers need to be successful, to designing our digital architecture and ensuring our platforms provide a first-class customer experience. We reimagine client and partner experiences to deliver excellence through secure, reliable, and efficient services. Our commitment to diversity includes a workforce that represents the clients we serve from all walks of life, backgrounds, and origins. We foster an environment where the best people want to work. We value and demand respect for others, promote individuals based on merit, and ensure opportunities for personal development are widely available to all. Ideal candidates are innovators with well-rounded backgrounds who bring their authentic selves to work and complement our culture of delivering results with pride. If you are a problem solver who seeks passion in your work, come join us. We’ll enable growth and progress together.

Description: This is a Non-Production Management Technical Lead (L2 SRE) position in North America DevOps, supporting Global Consumer Group applications. GCG Production Management is undergoing a transformation, expanding the support model to incorporate Service Reliability Engineering principles. In support of this transformation, this role is a blend of traditional ITIL-based Production Management and Service Reliability Engineering. The ideal candidate for this position will have experience and broad knowledge of North America Consumer applications, along with an interest in learning new technologies, including the use of automation and artificial intelligence technologies to avoid system problems, automate manual activities, and drive improved system & application service levels. The work is supported by contractors offshore and onshore, who provide 7x24 service for North America. This is a technical leadership position, requiring strong organizational and communication skills in addition to analytical and troubleshooting talent. Partnership with Development Teams, Technology teams in CTI, and other Production Management teams is a critical component of this position and required daily. This position will provide the technical leadership for GCG applications and work with other peers in the DevOps team to drive stability. Collaborate with the app dev community, CTI partners, TPM and other stakeholders to identify and create the value chain, and identify and conduct POCs to plug the gap areas.

Primary Responsibilities: Provides expertise related to various Distributed Consumer Applications across multiple Lines of Business in North America. Primary point of contact for the assigned LOB domain.
Enable Production Management processes in the non-production environment to provide environment stability. Execute robust service readiness. Facilitate standard toolset adoption for all services in the domain. Works as an L2 expert to support Problem Management, Risk Management, Change Management and the CI/CD enablement pipeline for the identified SRE function. Has overall accountability for non-production stability in their area/domain. Partners with Level 1 and Level 3 support teams to improve resolution rates, efficiency targets, and organizational Service Level Agreements. Partners with SRE enablement, and eventually works as an SRE, to identify the key areas and provide SRE recommendations from UAT to PERF and PROD for key business transactions supported. Knowledge of technologies like OSE, Kubernetes, APIGEE, Platform services, DataPower, Google Cloud, AWS, CI/CD pipelines, ITIL and Service Management. Identifies and leads the implementation of Service Automation to reduce cost, reduce risk, improve efficiency, and enable Service Management to keep up with the ever-increasing volume and fast pace of newer technologies. Continually evolve the working practices within, and services provided by, Production Management to improve efficiency and productivity. Ability to conduct the blameless problem management/post-mortem phase of major incidents, develop executive briefings, assess major incident impacts, and drive service improvements to prevent a repeat of an incident. Provide expert support to the L1 technical recovery center on the P1/P2 incidents assigned for Apigee, DataPower, AWS, legacy tech stacks and other applications as the need may be, and join the triaging call to provide direction and take it to resolution. Create PMRs for P1/P2 incidents and close out the actions. Identify risks in the non-production estate, classify them, work with peers and team members, create Service Improvement plans and drive them to closure. Create operational readiness documents for major initiatives and provide handover to the production team in a seamless manner. Work with the SRE team to create a proactive analysis of the UAT and PERF view before handing over to production management. Accountable for end-to-end service health of the NAM Core space. Overall accountable for patching, changes, infra changes, certificates and other KTLO activities in their assigned domain. Overall accountability for monitoring and its usage by its stakeholders; work with the monitoring team on setup. Represent the DevOps team in various digital forums and facilitate the generation of reports and presentations. Be proficient in various technologies of OSE, Apigee, AWS and other new-age technologies. Adopt automation laid down by Production Management automation and AIOps. Support and achieve successful internal audits.

Qualifications: 6-10 years of development or production support experience with North America Consumer applications. Experience or familiarity with cloud technology is a plus. Solid ITIL Foundation understanding. Engineering background in system administration, development, DevOps, or an equivalent field, preferably with experience in Distributed Consumer applications. Experience/familiarity with automation technologies, advanced analytics, and predictive modelling. Ability to develop and manage relationships at all levels. Experience with databases, e.g., Oracle, MSSQL, MongoDB, Teradata, DB2. Experience in programming in one of the following languages: UNIX shell scripting, Java, etc.
Competent with cloud concepts, e.g., APIs, web services and microservices. Strong analytical, algorithmic, and critical thinking skills.

Core Competencies/Skills: Fluent English. Strong analytical skills, strong critical thinking skills and the ability to logically break down tasks into smaller manageable parts. Solid understanding of systems and application design. Systematic problem-solving approach. Effective communication skills and a sense of ownership and drive. Adaptable and can work with large, complex and multi-team-owned services. Extremely organized, detail oriented and thorough in every aspect. Able to balance multiple tasks and projects effectively while adapting to new variables. Utilizes creative and innovative thinking while also adhering to a powerful sense of ownership, customer service and integrity demonstrated through clear communication. Drive, self-motivation and eagerness to learn.

Education: Bachelor’s/University degree or equivalent experience. Certification in Site Reliability Engineering, Salesforce or a cloud-based certification like AWS.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Systems & Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
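
To ground the monitoring accountability described above, here is a minimal sketch of service-health instrumentation using the prometheus_client library. The metric names and probe are illustrative assumptions, not Citi's actual tooling.

    # A minimal sketch: expose health-check metrics on a Prometheus scrape endpoint.
    import time
    from prometheus_client import Counter, Histogram, start_http_server

    CHECKS = Counter("app_health_checks_total", "Health checks run", ["status"])
    LATENCY = Histogram("app_health_check_seconds", "Health check latency")

    def check_service() -> bool:
        return True  # placeholder for a real API/DB/Apigee probe

    if __name__ == "__main__":
        start_http_server(9100)  # Prometheus scrapes this port
        while True:
            with LATENCY.time():
                ok = check_service()
            CHECKS.labels(status="ok" if ok else "fail").inc()
            time.sleep(30)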

Posted 3 weeks ago

Apply

1.0 - 2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description WHO WE ARE Come join our Technology Team and start reimagining the future of the automotive aftermarket. We are a highly motivated tech-focused organization, excited to be amid dynamic innovation and transformational change. Driven by Advance’s top-down commitment to empowering our team members, we are focused on delighting our Customers with Care and Speed, through delivery of world-class technology solutions and products. We value and cultivate our culture by seeking to always be collaborative, intellectually curious, fun, open, and diverse. You will be a key member of a growing and passionate group focused on collaborating across business and technology resources to drive forward key programs and projects building enterprise capabilities across Advance Auto Parts.

The Opportunity Join the AAP team and start reimagining the future of automotive retail. Disrupt the way consumers buy auto parts and take on the industry’s biggest challengers to execute on AAP's top-down commitment to digital expansion. As a member of the Advance Auto Parts team, you will have an opportunity to disrupt a $150B auto parts industry to bring better and faster solutions to customers. You will be part of a team helping the company live its mission of “Advancing a World in Motion”. The role is part of a merit-based organization with a culture of professional growth and development, and an emphasis on the latest tools, platforms and technologies.

Responsibilities Help in the migration and modernization of data platforms, moving applications and pipelines to Google Cloud-based solutions. Ensure data security and governance, enforcing compliance with industry standards and regulations. Help manage and scale data pipelines from internal and external data sources to support new product launches and ensure high data quality. Help develop automation and monitoring frameworks to capture key metrics and operational KPIs for pipeline performance. Collaborate with internal teams, including data science and product teams, to drive solutioning and proof-of-concept (PoC) discussions. Help develop and optimize procedures to transition data into production. Create and maintain technical documentation for sharing knowledge. Help develop reusable packages and libraries to enhance development efficiency. Help develop real-time and batch data processing solutions, integrating structured and unstructured data sources.

Required Qualification We are looking for a candidate with 1-2 years of experience in data engineering and application development. They must have a graduate degree in Computer Science or a related field of study. They must have experience with programming languages such as Python, Java (with strong data structures & algorithms), Spark, and Scala. Expertise in Python and Spark is a must. Exposure to AWS and cloud technologies. Experience in data platform engineering, with a focus on cloud transformation and modernization. Hands-on experience building large, scaled data pipelines in cloud environments and handling data at petabyte scale. Experience with CI/CD pipeline management in GCP DevOps. Understanding of data governance, security, and compliance best practices. Experience working in an Agile development environment. Prior experience in migrating applications from legacy platforms to the cloud. Knowledge of Terraform or Infrastructure-as-Code (IaC) for cloud resource management. Familiarity with Kafka, Event Hubs, or other real-time data streaming solutions.
Experience with legacy RDBMS (Oracle, DB2, Teradata) & DataStage/Talend. Background supporting data science models in production. California Residents Click Below For Privacy Notice https://jobs.advanceautoparts.com/us/en/disclosures We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, status as a veteran, disability or any other federal, state or local protected class.
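
As an illustration of the real-time processing responsibilities above, a minimal Spark Structured Streaming sketch follows. The Kafka broker, topic, and lake paths are hypothetical.

    # A minimal sketch of a Kafka-to-data-lake ingestion stream.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rt_ingest").getOrCreate()

    events = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
              .option("subscribe", "store-events")               # hypothetical topic
              .load()
              .selectExpr("CAST(value AS STRING) AS payload", "timestamp"))

    # Land raw events in the bronze layer; checkpointing enables recovery on restart.
    (events.writeStream.format("parquet")
           .option("path", "/lake/bronze/store_events")
           .option("checkpointLocation", "/lake/_chk/store_events")
           .start()
           .awaitTermination())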

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Telangana, India

On-site

Ignite the Future of Language with AI at Teradata! What You'll Do: Shape the Way the World Understands Data At Teradata, we're not just managing data; we're unleashing its full potential. Our ClearScape Analytics™ platform and pioneering Enterprise Vector Store are empowering the world's largest enterprises to derive unprecedented value from their most complex data. We're rapidly pushing the boundaries of what's possible with Artificial Intelligence, especially in the exciting realm of autonomous and agentic systems. We're building intelligent systems that go far beyond automation: they observe, reason, adapt, and drive complex decision-making across large-scale enterprise environments. As a member of our AI engineering team, you'll play a critical role in designing and deploying advanced AI agents that integrate deeply with business operations, turning data into insight, action, and measurable outcomes. You'll work alongside a high-caliber team of AI researchers, engineers, and data scientists tackling some of the hardest problems in AI and enterprise software, from scalable multi-agent coordination and fine-tuned LLM applications to real-time monitoring, drift detection, and closed-loop retraining systems. If you're passionate about building intelligent systems that are not only powerful but observable, resilient, and production-ready, this role offers the opportunity to shape the future of enterprise AI from the ground up.

Who You'll Work With: Join Forces with the Best Imagine collaborating daily with some of the brightest minds in the company – individuals who champion diversity, equity, and inclusion as fundamental to our success. You'll be part of a cohesive force, laser-focused on delivering high-quality, critical, and highly visible AI/ML functionality within the Teradata Vantage platform. Your insights will directly shape the future of our intelligent data solutions. You'll report directly to the inspiring Sr. Manager, Software Engineering, who will champion your growth and empower your contributions.

What Makes You a Qualified Candidate: Skills in Action Experience working with modern data platforms like Teradata, Snowflake, and Databricks. Passion for staying current with AI research, especially in the areas of reasoning, planning, and autonomous systems. You are an excellent backend engineer who codes daily and owns systems end-to-end. Strong engineering background (Python/Java/Golang, API integration, backend frameworks). Strong system design skills and understanding of distributed systems. You're obsessive about reliability, debuggability, and ensuring AI systems behave deterministically when needed. Hands-on experience with machine learning & deep learning frameworks: TensorFlow, PyTorch, Scikit-learn. Hands-on experience with LLMs, agent frameworks (LangChain, AutoGPT, ReAct, etc.), and orchestration tools. Experience with AI observability tools and practices (e.g., logging, monitoring, tracing, metrics for AI agents or ML models). Solid understanding of model performance monitoring, drift detection, and responsible AI principles.

What You Bring: Passion and Potential A Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field – your academic foundation is key. A genuine excitement for AI and large language models (LLMs) is a significant advantage – you'll be working at the cutting edge! Design, develop, and deploy agentic systems integrated into the data platform.
3+ years of experience in software architecture, backend systems, or AI infrastructure. Experience in software development (Python, Go, or Java preferred). Familiarity with backend service development, APIs, and distributed systems. Interest or experience in LLMs, autonomous agents, or AI tooling. Familiarity with containerized environments (Docker, Kubernetes) and CI/CD pipelines. Experience with AI observability tools and practices (e.g., logging, monitoring, tracing, metrics for AI agents or ML models). Build dashboards and metrics pipelines to track key AI system indicators: latency, accuracy, tool invocation success, hallucination rate, and failure modes. Integrate observability tooling (e.g., OpenTelemetry, Prometheus, Grafana) with LLM-based workflows and agent pipelines. Strong knowledge of LLMs, RL, or cognitive architectures is highly desirable. Passion for building safe, human-aligned autonomous systems. Bonus: Research experience or contributions to open-source agentic frameworks. You're knowledgeable about open-source tools and technologies and know how to leverage and extend them to build innovative solutions.

Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
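
To make the observability requirements concrete, here is a minimal sketch of instrumenting an agent call for latency and failure-mode metrics. call_agent() is a stub standing in for a real LLM or agent invocation; a production stack would emit OpenTelemetry spans rather than log lines.

    # A minimal sketch of timing and status instrumentation around an agent call.
    import logging
    import time

    logging.basicConfig(level=logging.INFO)

    def call_agent(prompt: str) -> str:
        return "stub answer"  # placeholder for the real agent/LLM invocation

    def observed_call(prompt: str) -> str:
        start = time.perf_counter()
        status = "error"
        try:
            answer = call_agent(prompt)
            status = "ok"
            return answer
        finally:
            latency_ms = (time.perf_counter() - start) * 1000
            # Emit the indicators the posting names: latency and failure modes.
            logging.info("agent_call status=%s latency_ms=%.1f", status, latency_ms)

    print(observed_call("summarize yesterday's pipeline failures"))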

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our Company: At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers, and our customers’ customers, to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do: In this role you will lead a critical and highly visible function within the Teradata Vantage platform. You will be given the opportunity to autonomously deliver the technical direction of the service and the feature roadmap. You will work with extraordinary talent and have the opportunity to shape the team to best execute on the product.

Job Responsibilities: Contribute to the design and development of software components that enable autonomous AI agents to reason, plan, act, and learn. Implement features that support agentic capabilities such as tool use, memory, long-term planning, task execution, and self-reflection. Integrate with state-of-the-art LLMs (e.g., OpenAI, Anthropic) and agent orchestration frameworks (e.g., LangChain, CrewAI, AutoGen). Write, test, and debug Python or other backend code that powers agent workflows in cloud-native environments. Assist in building interfaces between agents and tools such as APIs, databases, search engines, and user-facing applications. Participate in Agile development workflows (sprint planning, code reviews, retrospectives) and collaborate across AI, product, and engineering teams. Work with structured and unstructured data to support context-aware reasoning and decision-making in agents. Contribute to the development of internal testing tools and evaluation methods for autonomous agent behaviors. Support the deployment of agent systems in scalable environments using Kubernetes, Docker, and cloud services (AWS, Azure, or GCP). Maintain documentation of agent capabilities, system design, and integration strategies.

What makes you a qualified candidate: 1–3 years of experience (or a strong academic/internship background) in software development with a focus on AI/ML, intelligent systems, or cloud-native applications. Strong interest in agentic AI: systems where AI agents can autonomously interpret goals, take actions, and learn from outcomes. Familiarity with one or more programming languages: Python (preferred), Go, Java, or C++. Exposure to LLMs and AI APIs (e.g., OpenAI, Hugging Face, Claude) and understanding of prompting, embeddings, or vector databases. Basic understanding of how AI agents use tools and memory to solve complex tasks. Experience with RESTful APIs, backend service design, and microservice architectures. Familiarity with modern cloud infrastructure (AWS, GCP, or Azure), especially in deploying containerized applications. Ability to write clean, testable code and follow best practices in code reviews and version control. Clear communicator, with a willingness to ask questions, propose ideas, and collaborate across teams. Passion for learning emerging AI techniques and contributing to real-world agentic applications.

What you will bring: BS degree in Computer Science, Artificial Intelligence, Software Engineering, or a related technical field. Strong problem-solving skills and a solid foundation in data structures, algorithms, and object-oriented programming.
Experience with cloud tools and platforms (AWS, GCP, Azure), especially Kubernetes and Docker. Exposure to agentic frameworks such as LangChain, AutoGen, CrewAI, or similar tools. Curiosity and excitement about building intelligent systems that act autonomously and adaptively. A mindset geared toward experimentation, continuous learning, and pushing the boundaries of AI capabilities.

Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
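
As a sketch of the agentic capabilities described (tool use, memory, task execution), the following minimal loop shows the overall shape of such a system. llm() is a stub in place of a real OpenAI or Anthropic call, and the single tool is invented for illustration.

    # A minimal sketch of a tool-using agent loop with a simple message memory.
    import json

    def llm(messages):
        # Placeholder: a real implementation would call an LLM API here.
        return json.dumps({"tool": "search", "args": {"q": "vector store"}})

    TOOLS = {"search": lambda q: f"top result for {q!r}"}  # hypothetical tool

    def run_agent(goal: str, max_steps: int = 3) -> str:
        memory = [{"role": "user", "content": goal}]       # agent memory
        for _ in range(max_steps):
            decision = json.loads(llm(memory))
            if decision.get("tool") not in TOOLS:
                return decision.get("answer", "")          # model answered directly
            result = TOOLS[decision["tool"]](**decision["args"])
            memory.append({"role": "tool", "content": result})
        return "step budget exhausted"

    print(run_agent("find docs on enterprise vector stores"))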

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics company. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
Participate in monthly close activities, including preparing journal entries and conducting analysis of GL activity
Review, record, and assess investment and equity eliminations in the consolidated financial statements
Prepare balance sheet reconciliations, investigate discrepancies, and ensure financial data integrity (see the sketch below)
Recommend reclassifications and adjustments as needed
Participate in quarterly balance sheet review and flux analysis
Support corporate departments such as HR, Marketing, Tax, Treasury, etc.
Support internal and external audit requests
Interpret financial reports to communicate key findings to stakeholders
Document systems and processes and ensure compliance with SOX requirements
Handle ad hoc reporting and projects as needed

Who You'll Work With
The GL Staff Accountant will be part of the Corporate Accounting team within the Teradata Corporate Controller's group based out of Atlanta, Georgia. The Corporate Accounting team is responsible for the administration and performance of a wide array of general ledger accounting functions. Team members are expected to support a variety of corporate departments and interact with various levels of management. This role will work closely with the corporate and regional accounting teams and report directly to the Corporate Accounting Manager.
We're seeking a detail-oriented and motivated accountant to join our team. The role will play a key part in supporting month-end processes, maintaining accurate financial records, and contributing to the overall success of accounting operations. Candidates must be comfortable executing and troubleshooting on their own while hitting tight deadlines.
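To make the reconciliation task concrete, here is a minimal sketch, not Teradata's process, that compares a general-ledger balance extract against a sub-ledger extract with pandas and flags items needing investigation. The data and column names (`account`, `gl_balance`, `subledger_balance`) are hypothetical placeholders.

```python
# Minimal balance-sheet reconciliation sketch: flag accounts where the
# general ledger and sub-ledger disagree. All names and values are made up.
import pandas as pd

gl = pd.DataFrame({"account": ["1000", "1100", "1200"],
                   "gl_balance": [50_000.0, 12_500.0, 7_200.0]})
sub = pd.DataFrame({"account": ["1000", "1100", "1300"],
                    "subledger_balance": [50_000.0, 12_450.0, 900.0]})

# Outer join keeps one-sided entries; the _merge indicator shows which side.
recon = gl.merge(sub, on="account", how="outer", indicator=True)
recon[["gl_balance", "subledger_balance"]] = (
    recon[["gl_balance", "subledger_balance"]].fillna(0.0)
)
recon["difference"] = recon["gl_balance"] - recon["subledger_balance"]

# Exceptions to investigate: nonzero differences or entries on one side only.
exceptions = recon[(recon["difference"].abs() > 0.005) |
                   (recon["_merge"] != "both")]
print(exceptions)
```

In practice the two extracts might come from Oracle Cloud ERP or Smart View exports rather than inline frames, but the join-and-compare pattern is the same.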
What Makes You a Qualified Candidate
Advanced problem-solving skills with the ability to interpret data to reach conclusive results
Apply critical thinking to analysis to ensure accuracy and GAAP compliance
Must possess a basic understanding of GAAP accounting principles
Must be a strong team player who collaborates well with others with different personalities and backgrounds
Will need to be a quick learner with the desire to improve skill sets
Needs to be flexible and willing to adapt to the ever-changing business environment
Will need strong time management skills with the ability to effectively prioritize tasks
Needs to be able to work well under pressure and to meet tight deadlines
Needs to be comfortable working in a fully remote environment
Bachelor's degree in accounting, finance, or related field
5 to 7 years of relevant accounting experience
Exceptional problem-solving skills
Well-developed Microsoft Office skills, including intermediate Excel proficiency
Strong oral and written communication skills

What You'll Bring
Experience with Oracle Cloud ERP, specifically general ledger
Smart View experience
Experience working in accounting for publicly traded companies with international operations
Experience preparing and booking journal entries
Power BI experience or similar Business Intelligence tools

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success.
We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work.
We focus on well-being because we care about our people and their ability to thrive both personally and professionally.
We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 3 weeks ago

Apply

0.0 years

13 Lacs

Bengaluru, Karnataka, India

On-site

Job Description:
Good knowledge of Snowflake architecture
Understanding of virtual warehouses: multi-cluster warehouses and autoscaling
Metadata and system objects: query history, grants to users, grants to roles, users
Micro-partitions, table clustering, auto-reclustering
Materialized views and their benefits
Data protection with Time Travel in Snowflake (extremely important)
Analyzing queries using Query Profile (extremely important), explain plan, cache architecture
Virtual warehouse (VW) named stages, direct loading, Snowpipe, data sharing, streams, JavaScript procedures, tasks
Strong ability to design and develop workflows in Snowflake on at least one cloud platform, preferably AWS
Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system
Knowledge of ETL activities such as data processing from multiple source systems is preferred
Extensive knowledge of query performance tuning
Apply knowledge of BI tools
Manage time effectively: accurately estimate effort for tasks and meet agreed-upon deadlines; effectively juggle ad hoc requests and longer-term projects

Snowflake performance specialist:
Familiar with zero-copy cloning and using Time Travel features to clone a table (a minimal sketch follows at the end of this listing)
Familiar with the Snowflake query profile: what each step does and how to identify performance bottlenecks from it
Understanding of when a table needs to be clustered; choosing the right cluster key as part of table design to help query optimization
Working with materialized views and the benefit-vs-cost scenario
How Snowflake micro-partitions are maintained and the performance implications with respect to micro-partition pruning
Horizontal vs. vertical scaling: when to do what
The concept of multi-cluster warehouses and autoscaling
Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization

Technical Requirements:
Mandatory skills: Snowflake
Desired skills: Teradata; Python (not mandatory)

Additional Responsibilities:
Domain: Data Warehousing, Business Intelligence
Precise work location: Bhubaneswar, Bangalore, Hyderabad, or Pune

Preferred Skills: Cloud Platform->Snowflake, Technology->OpenSystem->Python - OpenSystem
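To ground the zero-copy cloning and Time Travel items above, here is a minimal sketch using the snowflake-connector-python client. It is illustrative only: the connection parameters and object names (COMPUTE_WH, ANALYTICS_DB, SALES, and so on) are hypothetical placeholders, and error handling is elided.

```python
# Illustrative sketch: zero-copy cloning and Time Travel in Snowflake
# via snowflake-connector-python. All names here are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="COMPUTE_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Zero-copy clone: the new table initially shares the source table's
    # micro-partitions, so no data is physically copied until either side
    # changes.
    cur.execute("CREATE TABLE SALES_DEV CLONE SALES")

    # Time Travel: clone the table as it existed 30 minutes ago
    # (OFFSET is in seconds), e.g., to recover from a bad load.
    cur.execute("CREATE TABLE SALES_RESTORED CLONE SALES AT(OFFSET => -60*30)")

    # Time Travel also works directly in queries via AT/BEFORE.
    cur.execute("SELECT COUNT(*) FROM SALES AT(OFFSET => -60*30)")
    print("rows 30 minutes ago:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```

Because a clone shares micro-partitions with its source, it is near-instant and consumes storage only as the two tables diverge, which is why cloning is the usual way to spin up dev/test copies of production tables.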

Posted 3 weeks ago

Apply

Featured Companies