
1346 Teradata Jobs - Page 8

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an integral part of the American Airlines Tech Hub in Hyderabad, India, you will contribute to the innovative, tech-driven environment that shapes the future of travel. You will collaborate with source data application teams and product owners to develop and support analytics solutions that provide valuable insights for informed decision-making. By leveraging Azure products and services such as Azure Data Lake Storage, Azure Data Factory, and Azure Databricks, you will implement data migration and engineering solutions that enhance the airline's digital capabilities.

Your responsibilities span the development lifecycle, including design, cloud engineering, data modeling, testing, performance tuning, and deployment. Working within a DevOps team, you will take ownership of your product and contribute to the development of batch and streaming data pipelines using cloud technologies. Adherence to coding standards, best practices, and security guidelines will be crucial as you collaborate with a multidisciplinary team to deliver technical solutions effectively.

To excel in this role, you should have a Bachelor's degree in a relevant technical discipline or equivalent experience, along with a minimum of 1 year of software solution development experience using agile methodologies. Proficiency in SQL for data analytics and prior experience with cloud development, particularly in Microsoft Azure, will be advantageous. Preferred qualifications include additional years of software development and data analytics experience, as well as familiarity with tools such as Azure EventHub, Azure Power BI, and Teradata Vantage. Expertise in the Azure technology stack, practical knowledge of Azure cloud services, and relevant certifications such as the Azure Development Track and Spark Certification will further support your success. A combination of development, administration, and support experience across scripting languages, data platforms, and BI analytics tools will be beneficial as you drive data management and governance initiatives within the organization.

Effective verbal and written communication skills are essential for engaging with stakeholders across the organization. Your physical abilities should enable you to perform the essential functions of the role safely and successfully, with or without reasonable accommodations as required by law.

At American Airlines, diversity and inclusion are integral to our workforce, fostering an inclusive environment where employees can thrive and contribute to the airline's success. Join us and embark on a journey where your technical expertise and innovative spirit will play a pivotal role in shaping the future of travel. Feel free to be yourself as you contribute to the seamless operation of the world's largest airline, caring for people on life's journey.
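The pipeline work this listing describes typically means Azure Databricks jobs reading from Azure Data Lake Storage. As a minimal sketch of such a batch ingestion step, with hypothetical storage paths and an invented flight_id column (nothing here comes from the posting itself):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("flight-batch-ingest").getOrCreate()

# Hypothetical ADLS Gen2 paths; real storage account and container names would differ.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/flights/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/flights_daily/"

# Read the raw CSV drop and stamp each row with its load date.
df = (
    spark.read.option("header", "true").csv(raw_path)
    .withColumn("load_date", F.current_date())
)

# Basic quality gate before publishing to the curated zone (flight_id is invented).
clean = df.dropDuplicates(["flight_id"]).filter(F.col("flight_id").isNotNull())

# Delta output assumes a Databricks (or other Delta-enabled) runtime.
clean.write.format("delta").mode("append").partitionBy("load_date").save(curated_path)
```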

Posted 1 week ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description: The Smart Cube, a WNS company, is a trusted partner for high-performing intelligence that answers critical business questions, and we work with our clients to figure out how to implement the answers, faster.

Job Description

Roles and Responsibilities: Assistant Managers must understand client objectives and collaborate with the Project Lead to design effective analytical frameworks. They should translate requirements into clear deliverables with defined priorities and constraints. Responsibilities include managing data preparation, performing quality checks, and ensuring analysis readiness. They should implement analytical techniques and machine learning methods such as regression, decision trees, segmentation, forecasting, and algorithms like Random Forest, SVM, and ANN (see the sketch after this description). They are expected to perform sanity checks and quality control of their own work as well as that of junior analysts to ensure accuracy. The ability to interpret results in a business context and identify actionable insights is critical. Assistant Managers should handle client communications independently and interact with onsite leads, discussing deliverables and addressing queries over calls or video conferences. They are responsible for managing the entire project lifecycle from initiation to delivery, ensuring timelines and budgets are met. This includes translating business requirements into technical specifications, managing data teams, ensuring data integrity, and facilitating clear communication between business and technical stakeholders. They should lead process improvements in analytics and act as project leads for cross-functional coordination.

Client Management: They serve as client leads, maintaining strong relationships and making key decisions. They participate in deliverable discussions and guide project teams on next steps and execution strategy.

Technical Requirements: Assistant Managers must know how to connect databases with Knime (e.g., Snowflake, SQL) and understand SQL concepts such as joins and unions. They should be able to read and write data to and from databases and use macros and schedulers to automate workflows. They must design and manage Knime ETL workflows to support BI tools and ensure end-to-end data validation and documentation. Proficiency in PowerBI is required for building dashboards and supporting data-driven decision-making. They must be capable of leading analytics projects using PowerBI, Python, and SQL to generate insights. Visualizing key findings using PowerPoint or BI tools like Tableau or Qlikview is essential.

Ideal Candidate: Candidates should have 4-7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG; experience in other B2C domains is acceptable. They must be skilled in handling large datasets using Python, R, or SAS and have worked with multiple analytics or machine learning techniques. Comfort with client interactions and working independently is expected, along with a good understanding of consumer sectors such as Retail, CPG, or Telecom. They should have experience with various data formats and platforms, including flat files, RDBMS, Knime workflows and server, SQL Server, Teradata, Hadoop, and Spark, on-prem or in the cloud. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting (e.g., ARIMA), and other ML models is required.

Other Skills: Strong written and verbal communication is essential. They should be capable of creating client-ready deliverables using Excel and PowerPoint. Knowledge of optimization methods, supply chain concepts, VBA, Excel Macros, Tableau, and Qlikview will be an added advantage.

Qualifications: Engineers from top-tier institutes (IITs, DCE/NSIT, NITs) or postgraduates in Maths/Statistics/OR from top-tier colleges/universities; MBA from top-tier B-schools.
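For the machine-learning methods the listing names (regression, decision trees, Random Forest), a minimal scikit-learn sketch shows the typical shape of such work; the file name and feature columns are hypothetical:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical customer-level extract; a real CRM dataset would differ.
df = pd.read_csv("customers.csv")
X = df[["recency_days", "frequency", "monetary_value"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Random Forest is one of the techniques the role calls out.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```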

Posted 1 week ago

Apply

4.0 - 11.0 years

6 - 13 Lacs

Pune

Work from Office

Join us as a Senior ETL Developer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions.

To be successful as a Senior ETL Developer you should have experience with: Ab Initio; SQL/RDBMS knowledge; Unix/Python wrapper scripts; Oracle and Teradata. Some other highly valued skills include: AWS architecture, Glue, S3, Iceberg; DBT; Snowflake/Databricks. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. The role is based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations: Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, and take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within your own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate an understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex or sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
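The "Unix/Python wrapper script" skill the role lists usually means orchestrating ETL steps with logging and exit-code handling. A minimal sketch, assuming a hypothetical BTEQ script name and that the bteq utility is on the PATH:

```python
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def run_step(name, cmd, script=None):
    """Run one ETL step, optionally feeding it a script on stdin; abort on failure."""
    logging.info("starting step: %s", name)
    stdin = open(script) if script else None
    try:
        result = subprocess.run(cmd, stdin=stdin, capture_output=True, text=True)
    finally:
        if stdin:
            stdin.close()
    if result.returncode != 0:
        logging.error("step %s failed: %s", name, result.stderr.strip())
        sys.exit(result.returncode)
    logging.info("step %s finished", name)

if __name__ == "__main__":
    # Hypothetical script name; a real job might chain Ab Initio graphs or BTEQ scripts.
    run_step("load_orders", ["bteq"], script="load_orders.btq")
```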

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 12 Lacs

Mumbai

Work from Office

Job Title: Analyst, D&A Solutions. Job Code: 10679. Country: IN. City: Mumbai. Skill Category: Finance Middle Office.

Description: We are committed to providing equal opportunities throughout employment, including in the recruitment, training, and development of employees. We prohibit discrimination in the workplace whether on grounds of gender, marital or domestic partnership status, pregnancy, carer's responsibilities, sexual orientation, gender identity, gender expression, race, color, national or ethnic origins, religious belief, disability, or age. *Applying for this role does not amount to a job offer or create an obligation on Nomura to provide a job offer. The expression "Nomura" refers to Nomura Services India Private Limited together with its affiliates.

Job Responsibilities: Data engineering and technical skills, with state-of-the-art expertise in designing and developing solutions for Data, Cloud & AI Governance. Contribute to the solutions required for enabling Data, Cloud & AI Governance. Use your data engineering expertise plus FS domain and functional knowledge, problem-solving skills, and independent thinking to create Nomura's Data, Cloud & AI Governance solutions. Be a team player and an individual contributor: work with the Chief Data Office (CDO) and other business functions as part of larger deliveries, as well as working independently or in small teams to continuously deliver business value for the CDO. Risk and control mindset: conversant with Data, Cloud, and AI risks and able to interpret policies and frameworks for governance, with a good understanding of responsible AI development and the emerging risks of Generative AI.

Qualification & Core Skills Requirements: Preferably a bachelor's or master's degree in Computer Science, Statistics, or similar, or at least a 4-year bachelor's degree. Proficiency with SQL (MySQL, Postgres, Teradata, etc.). Experience with data processing tools and technologies (ETL tools, Python, etc.). Good understanding of fundamental data concepts (data modelling, data pipelines, data quality, lineage, etc.) and data warehousing methodologies. Exposure to cloud technologies (AWS, Snowflake, etc.) is an added advantage. Understanding of FS industry fundamentals and business problems to find new ways to leverage data. Intellectual curiosity to solve data-driven problems. Able to collaborate with and virtually manage multicultural, multidisciplined, globally dispersed teams. Strong communication skills, with the ability to explain highly technical problems in simple layman's terms. Ability to work independently with guidance from senior members. Knowledge of SDLC standards, including data integration, application security, and scalable design.
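Given the emphasis on data quality within governance, here is a minimal sketch of rule-based data-quality checks in pandas; the file and column names are purely illustrative, not from the posting:

```python
import pandas as pd

# Hypothetical trade extract; the column names are illustrative only.
df = pd.read_csv("trades.csv")

checks = {
    "no_null_trade_id": df["trade_id"].notna().all(),
    "unique_trade_id": df["trade_id"].is_unique,
    "positive_notional": (df["notional"] > 0).all(),
    "known_currencies": df["currency"].isin(["USD", "EUR", "JPY", "INR"]).all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```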

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Job Summary: The Technical Support Engineer (TSE) acts as a Starburst SME for a book of enterprise accounts. The TSE is responsible for answering all technical questions within both standard and custom deployment environments and assisting with supported LTS upgrades. The TSE is also responsible for peer training and development, personal continued education, and contributing to our reference documentation. They will coordinate closely with Support leadership, Engineering, Product, and Accounts teams to ensure our customers receive a value-driven enterprise experience. A TSE is able to work independently, with minimal guidance, and demonstrates an expert degree of proficiency in both SEP and Galaxy.

Responsibilities:
- Technical Support: Provide support for standard and custom deployments. Answer break/fix and non-break/fix technical questions through the SFDC ticketing system. Efficiently reproduce reported issues by leveraging tools (minikube, minitrino, docker-compose, etc.), identify root causes, and provide solutions. Open SEP and Galaxy bug reports in Jira and feature requests in Aha!
- LTS Upgrades: Provide upgrade support upon customer request. The customer must be on a supported LTS version at the time of request; the TSE must communicate unsupported LTS requests to the Account team, as these require PS services.
- Monthly Technical Check-ins: Conduct regularly scheduled technical check-ins with each BU. Discuss open support tickets, provide updates on product bugs, and provide best-practice recommendations based on your observations and ticket trends. Responsible for ensuring customer environments are on supported LTS versions.
- Knowledge Sharing/Technical Enablement: Knowledge exchange and continued technical enablement are crucial for the development of our team and the customer experience. It's essential that we keep our product expertise and documentation current and that all team members have access to information. Contribute to our reference documentation, lead peer training, act as a consultant to our content teams, and own your personal technical education journey.
- Project Involvement: Contribute to or drive components of departmental and cross-functional initiatives.
- Partner with Leadership: Identify areas of opportunity, with potential solutions, for inefficiencies or obstacles within the team and cross-functionally. Provide feedback to your manager on continued education opportunities, project ideas, etc.

Requirements: 5+ years of support experience; 3+ years of Big Data, Docker, Kubernetes, and cloud technologies experience.

Skills: Big Data (Teradata, Hadoop, Data Lakes, Spark); Docker and Kubernetes; cloud technologies (AWS, Azure, GCP); security, covering authentication (LDAP, OAuth 2.0) and authorization technologies; SSL/TLS; Linux skills; DBMS concepts/SQL exposure. Languages: SQL, Java, Python, Bash.

Benefits - Why Join Us? Work with a globally recognized team on cutting-edge AI/ML projects. Be a part of a culture that values curiosity, continuous learning, and impact. Opportunity to collaborate with Fortune 500 clients and industry leaders. Grow your career by becoming an instructor and sharing your knowledge worldwide.
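Reproducing customer issues in a Trino-based product like SEP often starts with the smallest possible query through the Trino Python client. A minimal sketch, assuming a hypothetical local coordinator and catalog:

```python
from trino.dbapi import connect

# Hypothetical coordinator endpoint and catalog; adjust to the cluster under test.
conn = connect(
    host="localhost", port=8080, user="support", catalog="hive", schema="default"
)
cur = conn.cursor()

# Reproduce a reported issue with a trivial metadata query before digging deeper.
cur.execute("SELECT table_name FROM information_schema.tables LIMIT 5")
for row in cur.fetchall():
    print(row)
```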

Posted 1 week ago

Apply

3.0 - 8.0 years

25 - 30 Lacs

Hyderabad

Work from Office

As a Software Engineer III at JPMorgan Chase within Consumer and Community Banking - Data Technology, you serve as a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities: Execute creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Develop secure, high-quality production code, and review and debug code written by others. Identify opportunities to eliminate or automate the remediation of recurring issues to improve the overall operational stability of software applications and systems. Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture. Lead communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies.

Required qualifications, capabilities, and skills: Formal training or certification on software engineering concepts and 3+ years of applied experience. Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake. Hands-on practical experience delivering system design, application development, testing, and operational stability. Proficiency in AWS services, especially Aurora Postgres RDS. Proficiency in automation and continuous delivery methods. Proficient in all aspects of the Software Development Life Cycle. Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security. Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile). In-depth knowledge of the financial services industry and its IT systems.

Preferred qualifications, capabilities, and skills: Experience in re-engineering and migrating on-premises data solutions to and for the cloud. Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure. Experience in building on emerging cloud serverless managed services to minimize or eliminate the physical/virtual server footprint. Advanced Java, plus Python (nice to have).
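For the data pipeline platforms the role names (Teradata, Snowflake), here is a minimal sketch of a Snowflake bulk-load step using the official Python connector; the account, stage, and table names are hypothetical:

```python
import snowflake.connector

# Hypothetical account, warehouse, stage, and table names.
con = snowflake.connector.connect(
    account="example-account",
    user="etl_user",
    password="secret",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = con.cursor()
try:
    # COPY INTO is Snowflake's bulk-load path from a named stage.
    cur.execute("COPY INTO ORDERS FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
    print(cur.fetchall())
finally:
    cur.close()
    con.close()
```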

Posted 1 week ago

Apply

0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Job Details: Experience - 4 to 12 yrs. Mandatory Skills - Data Science, Classical Machine Learning, Python. Location - Mumbai, Pune, Bangalore, Chennai, Hyderabad, and Kolkata. Notice - Immediate to 30 days.

Data Scientist - Python and Notebooks

Responsibilities: We are seeking a talented Data Scientist to join our team and drive data-driven decision-making across our organization. The ideal candidate will have a strong background in statistical analysis, machine learning, and data visualization, with experience working with large datasets in a Teradata environment. Design and implement end-to-end data science projects, from problem definition to model deployment. Develop and apply advanced machine learning algorithms and statistical models to solve complex business problems. Collaborate with cross-functional teams to identify opportunities for data-driven improvements. Conduct exploratory data analysis and feature engineering to prepare data for modeling. Create and maintain dashboards and reports to communicate insights to stakeholders. Optimize data collection procedures and ensure data quality. Stay current with the latest advancements in data science and machine learning techniques. Implement and maintain on-premise AI/ML solutions. Apply Explainable AI techniques to enhance model interpretability and transparency.

Requirements: Strong proficiency in Python, R, and SQL. Experience with Teradata and data lake/lakehouse architectures. Expertise in machine learning algorithms, statistical modeling, and data visualization. Familiarity with big data technologies (e.g., Hadoop, Spark). Excellent problem-solving and communication skills. Experience with version control systems (e.g., Git). Experience with on-premise AI/ML solutions. Knowledge of Explainable AI methods and their practical applications.
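The Explainable AI requirement above is commonly met with feature-attribution tools such as SHAP. A minimal sketch, assuming a hypothetical feature file and target column (the posting does not specify tooling):

```python
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical feature file and target column; a real project might pull from Teradata.
df = pd.read_csv("features.csv")
X, y = df.drop(columns=["target"]), df["target"]

model = GradientBoostingRegressor().fit(X, y)

# Explainable AI step: attribute each prediction to its input features.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X)
```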

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 5+ Years Location: Pune Employment Type: Full-Time Job Summary: We are looking for a highly skilled and detail-oriented Data Engineer / Data Operations Specialist with 5+ years of experience in shell scripting, data pipeline orchestration, and database scripting. The ideal candidate will have strong hands-on expertise in Linux environments, Teradata tools, and a good understanding of DAG-based workflows. Key Responsibilities: Develop and maintain shell/Korn shell (KSH) scripts to automate data processing and system tasks. Write and optimize SQL queries (preferably BigQuery, but other DBs acceptable). Design, monitor, and troubleshoot data pipelines using Airflow or equivalent scheduling/orchestration tools. Handle large data loads using Teradata tools such as BTEQ, MLOAD, FLOAD, and FastExport. Ensure smooth data flow across different layers by writing robust ETL/ELT processes. Collaborate with data analysts, engineers, and DevOps teams to resolve pipeline issues. Continuously improve system performance and data accuracy. Required Skills: Linux Shell Scripting / KSH – Advanced SQL Expertise – Preferably BigQuery (but strong SQL logic is mandatory) Data Pipeline Knowledge – Understanding of DAGs; hands-on with Apache Airflow or similar tools Teradata Scripting – BTEQ, MLOAD, FLOAD, FastExport experience is essential Logical & Analytical Skills – Strong problem-solving and debugging capability Command Line Proficiency – Comfortable working in CLI-based environments Good to Have: Exposure to cloud platforms like GCP or AWS Familiarity with Git and CI/CD practices Basic knowledge of Python or any scripting language Education: Bachelor's degree in Computer Science, Information Systems, or related field
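The DAG-based orchestration plus Teradata utility work described above might look like the following minimal Airflow sketch (Airflow 2.4+ syntax). The DAG id and script paths are hypothetical, and each task simply shells out to a Teradata utility (FastExport, MultiLoad, BTEQ):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG id and script paths; each task shells out to a Teradata utility.
with DAG(
    dag_id="teradata_daily_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="fastexport_source",
        bash_command="fexp < /opt/etl/scripts/export_orders.fx",
    )
    load = BashOperator(
        task_id="mload_target",
        bash_command="mload < /opt/etl/scripts/load_orders.ml",
    )
    validate = BashOperator(
        task_id="bteq_validate",
        bash_command="bteq < /opt/etl/scripts/validate_counts.btq",
    )
    extract >> load >> validate
```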

Posted 1 week ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Nice to meet you! We’re a leader in data and AI. Through our software and services, we inspire customers around the world to transform data into intelligence - and questions into answers. We’re also a debt-free multi-billion-dollar organization on our path to IPO-readiness. If you're looking for a dynamic, fulfilling career coupled with flexibility and a world-class employee experience, you'll find it here.

About the Job: The role is based in the Pune R&D Center, at the SAS R&D Pune facility, in the SAS9 team. We are looking for an SDET to benchmark, test, and validate SAS9 ACCESS engines for various databases like Oracle, Teradata, MSSQL, Postgres, MySQL, etc. The platforms supported are Windows and various Unix platforms. Executing and designing automation of the functional test sets for the SAS9 ACCESS platform is also part of this role. You will be joining a friendly team with a broad range of experience to develop and maintain SAS9 solutions.

Primary Responsibilities: Validation and testing of SAS ACCESS engines of SAS9 software on Windows and Unix platforms. Configuration of testing environments and support for SAS9 testing against various database software. Demonstrate aptitude for problem solving and debugging of complex software systems. Automate the deployment process as applicable using internal tools, Jenkins, Python, Java/shell scripts, etc. Ability to install and administer Postgres, Oracle, MySQL, SAS, or related DBMS is good to have. Design, implement, and maintain automated test frameworks, tools, and scripts that address specific needs. Design and implement test plans (cases, scenarios, usage). Create test strategies, test scenarios, and test ideas with clear intent. Understand complex usage concepts and assess applications’ ability to fulfill them. Identify risks, issues, potential defects, or defects in any phase of the project life cycle, managing them through closure.

Requirements: A minimum of 2-4 years of experience in all facets of software testing in a product-based software development environment. At least 1.5 years of working experience with Windows and Unix operating systems. Knowledge of a programming/scripting language such as Java, shell script, or Python for automating the deployment process as applicable. Ability to install and administer any of Postgres, Hadoop, Oracle, MySQL, SAS, or related DBMS. Independently execute testware and debug the errors reported. The position requires a strong ability to think and act like an end user who would be using SAS software to store data in databases. Basic understanding of cloud computing concepts and hypervisors. Must be familiar with all phases of the Software Testing Life Cycle (STLC). Ability to make recommendations based on a solid understanding of problem resolution, troubleshooting, the development environment, and functional interaction.

Mandatory Technical Skills: In addition to the requirements above, good logical problem-solving capability and being a quick self-learner. Experience with Postgres, Oracle, or other RDBMS and persistent stores. Experience working with Jira and agile methodology.

Diverse and Inclusive: At SAS, it’s not about fitting into our culture – it’s about adding to it. We believe our people make the difference. Our diverse workforce brings together unique talents and inspires teams to create amazing software that reflects the diversity of our users and customers. Our commitment to diversity is a priority to our leadership, all the way up to the top, and it’s essential to who we are. To put it plainly: you are welcome here. #SAS
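The SAS ACCESS engine tests themselves are internal, but the shape of an automated database validation is generic. A minimal pytest sketch of a write/read round-trip against Postgres, with hypothetical connection details:

```python
import psycopg2
import pytest

# Hypothetical connection details for a local test database.
@pytest.fixture
def conn():
    c = psycopg2.connect(host="localhost", dbname="testdb", user="tester", password="secret")
    yield c
    c.close()

def test_round_trip(conn):
    """Write a row, read it back, and compare: the basic shape of an engine check."""
    with conn.cursor() as cur:
        cur.execute("CREATE TEMP TABLE t (id INT, val TEXT)")
        cur.execute("INSERT INTO t VALUES (1, 'alpha')")
        cur.execute("SELECT id, val FROM t")
        assert cur.fetchall() == [(1, "alpha")]
```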

Posted 1 week ago

Apply

2.0 - 4.0 years

2 - 4 Lacs

Chennai, Tamil Nadu, India

On-site

Responsibilities:

ETL Development & Implementation: Design, develop, implement, and maintain complex ETL (Extract, Transform, Load) processes using IBM InfoSphere DataStage to facilitate data integration, migration, and warehousing. Data Warehousing with Teradata: Work extensively with Teradata as a data warehousing platform, including database design, query optimization, and data loading strategies. Analysis & Design: Conduct detailed analysis of business requirements for data integration, translating them into technical designs and specifications for DataStage jobs and Teradata schemas. Performance Tuning & Optimization: Optimize DataStage job performance and Teradata SQL queries to ensure efficient data processing, loading, and retrieval for large datasets. Troubleshooting & Support: Provide expert-level troubleshooting and ongoing support for DataStage jobs, Teradata database issues, and overall data pipeline operations, ensuring smooth functioning and high availability. Operational Contribution: Actively contribute to the operational efficiency and effectiveness of data systems built on or integrated with DataStage and Teradata. Data Quality & Governance: Implement data quality checks and contribute to data governance initiatives within the ETL processes and Teradata environment. Collaboration: Collaborate closely with data architects, business analysts, source system owners, and other stakeholders to ensure seamless data integration and successful project delivery.

Required Skills: Proficiency in IBM InfoSphere DataStage. Proficiency in Teradata database and data warehousing concepts. Ability to perform analysis, development, implementation, and troubleshooting within the DataStage and Teradata domains. Strong understanding of ETL methodologies and data integration patterns. Problem-solving skills for complex data-related challenges. Ability to contribute to business objectives through effective data solutions.
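Teradata query tuning of the kind described usually starts with EXPLAIN output. A minimal sketch using the teradatasql driver, with hypothetical host, credentials, and table names:

```python
import teradatasql

# Hypothetical connection details; EXPLAIN is Teradata's standard plan-inspection tool.
with teradatasql.connect(host="tdprod", user="etl_user", password="secret") as con:
    cur = con.cursor()
    cur.execute(
        "EXPLAIN SELECT o.order_id, c.customer_name "
        "FROM orders o JOIN customers c ON o.customer_id = c.customer_id"
    )
    # Each fetched row is one line of the optimizer's plan narrative.
    for (step,) in cur.fetchall():
        print(step)
```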

Posted 1 week ago

Apply

7.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Domain: Insurance & Finance. Experience: 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry. Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis. Advanced SQL skills; proficiency in Python is a strong plus. Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration. Experience working in hybrid onshore-offshore team environments. Deep understanding of data modeling concepts and experience working with relational and dimensional models. Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences. A strong understanding of statistical concepts, probability, accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios. Strong understanding of life insurance products and business processes across the policy lifecycle. Investment Principles: knowledge of different asset classes, investment strategies, and financial markets. Quantitative Finance: understanding of financial modeling, risk management, and derivatives. Regulatory Framework: awareness of relevant financial regulations and compliance requirements.

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous, and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes, and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver:
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team Management - Productivity, efficiency, absenteeism.
3. Capability Development - Triages completed, Technical Test performance.

Mandatory Skills: Teradata. Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: GCP Data Engineer Location: Hyderabad/Chennai Experience: 3+ Years Skills: GCP, Pyspark, DAG, Airflow, Python, Teradata (Good to Have) Certifications: GCP preferred Regards, Manvendra Singh manvendra.singh1@incedoinc.com

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Skill required: Marketing Operations - Marketing Automation & Technology Execution. Designation: Marketing Platform Auto Analyst. Qualifications: BTech/MCA/BCA. Years of Experience: 3 to 5 years. Language Ability: English (International) - Advanced.

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song - all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com.

What would you do? This is a key role where the campaign specialist is the end-to-end campaign executor of build and segmentation during the campaign execution lifecycle and is responsible for delivering agreed activities for campaign deployment. He/she will support delivery via a designated marketing automation tool such as UNICA. The role requires a high level of expertise in consumer segmentation and loyalty tools, an eye for detail, and quality output. Experience working on the UNICA platform is desirable, along with the ability to master the current UNICA CRM environment and learn new CRM technologies as they roll out. Responsible for list extraction for different types of campaigns on the UNICA platform; SQL knowledge and experience is also required. Experience with campaign creation and the ability to build campaigns on different automation platforms as per the client's BRD document. Understanding of offers, collaterals, segments, and collateral-mapping concepts. Maintain the campaign calendar with real-time status of campaigns and go-live status. Ensure timely completion of tasks and requests. Ensure accurate reporting at the required frequency. Risk and issue management: escalate risks as per the defined escalation matrix. Identify gaps and areas for improvement in execution processes and propose improvement solutions. Apply learning and industry-standard best practices from experience.

What are we looking for? Experience working in the campaign ecosystem, especially email, SMS, or direct mail channels. Execution experience on the UNICA platform or a similar marketing automation platform. Hands-on experience with SQL to perform data extraction using relevant tools. Execution experience in database marketing and experience working in high-pressure environments. 2-5 years of experience in marketing technology and operations focusing on the execution of marketing campaigns on behalf of the client. A Bachelor's degree in Computer Science, Computer Engineering, or Computer Information Systems will be preferred. Understanding of integrated marketing and customer data as it relates to targeting, segmentation, test/control design, and campaign analytics. Understanding of marketing operations, processes, and business requirements. Comfortable operating in a fast-paced, deadline-driven environment with rapidly changing priorities and a high volume of projects. Strong written and verbal communication skills with strong analytical and problem-solving skills. Exposure to Responsys, Salesforce, Marketo, Eloqua, Teradata, and Adobe Campaign Classic. Knowledge of HTML, CSS, and JavaScript. Understanding of email, social, mobile, and display best practices.

Roles and Responsibilities: In this role you are required to analyze and solve lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. In this role you may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as part of a team, with a focused scope of work.
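The list-extraction work described above boils down to segmentation SQL run against a campaign mart. A minimal sketch using SQLite as a stand-in engine (real marts would be Teradata or similar); the table and column names are invented:

```python
import sqlite3  # stand-in engine; campaign marts are usually Teradata or Oracle

# Hypothetical segment pull of the kind a campaign list extraction performs;
# assumes a populated customers table in campaign_mart.db.
SEGMENT_SQL = """
SELECT customer_id, email
FROM customers
WHERE loyalty_tier = 'GOLD'
  AND opted_in_email = 1
  AND last_purchase_date >= DATE('now', '-90 days')
"""

conn = sqlite3.connect("campaign_mart.db")
for customer_id, email in conn.execute(SEGMENT_SQL):
    print(customer_id, email)
conn.close()
```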

Posted 1 week ago

Apply

6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, and hone your technical skills while becoming adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities: Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft's cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in the Analytics portfolio, including Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases (SQL DB, Cosmos DB, PostgreSQL). Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications (preferred): 6+ years of technical pre-sales, technical consulting, technology delivery, or related experience, or equivalent experience. 4+ years of experience with cloud and hybrid or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management. Proficient in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2. Or: 5+ years of technical pre-sales or technical consulting experience; or a Bachelor's degree in Computer Science, Information Technology, or a related field and 4+ years of technical pre-sales or technical consulting experience; or a Master's degree in Computer Science, Information Technology, or a related field and 3+ years of technical pre-sales or technical consulting experience; or equivalent experience. Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) across data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 week ago

Apply

8.0 years

4 - 9 Lacs

Bengaluru

On-site

Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel, data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and to shape brand-new methodologies, tools, statistical methods, and models. What's more, we collaborate with leading academics, industry experts, and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data. Join us as a Senior Advisor on our Data Science team in Bangalore to do the best work of your career and make a profound social impact.

What you'll achieve - You will: Develop Gen AI-based solutions to tackle real-world challenges using extensive datasets of text, images, and more. Design and manage experiments; research new algorithms and optimization methods. Build and maintain data pipelines and platforms to operationalize machine learning models at scale. Demonstrate a passion for blending software development with Gen AI and ML.

Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:

Essential Requirements: Design, implement, test, and maintain ML solutions within Dell's services organization. Engage in design discussions and code reviews, and interact with various stakeholders. Collaborate across functions to influence business solutions with technical expertise. Thrive in a startup-like environment, focusing on high-priority tasks.

Desired Requirements: Proficiency in data science platforms (Domino Data Lab, Microsoft Azure, AWS, Google Cloud). Deep knowledge of ML, data mining, statistics, NLP, or related fields. Experience in object-oriented programming (C#, Java) and familiarity with Python, Spark, TensorFlow, XGBoost. Experience in productionizing ML models and scaling them for low-latency environments. Proficient in data mining, ETL, SQL OLAP, Teradata, Hadoop. 8+ years of related experience with a Bachelor's degree; or 6+ years with a Master's; or 3+ years with a PhD; or equivalent experience.

Who we are: We believe that each of us has the power to make an impact. That's why we put our team members at the center of everything we do. If you're looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we're looking for you. Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live, and play. Join us to build a future that works for everyone, because Progress Takes All of Us. Application closing date: 20th Aug 2025. Dell Technologies is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. Job ID: R274037

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Overview: As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Govern data design/modeling, including documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications: 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture. Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 2+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment/CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI).
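Source-to-target mappings like those mentioned above are often captured in a simple structured form before being handed to ETL developers. A minimal, hypothetical sketch; the source systems, columns, and rules are all invented for illustration:

```python
# A minimal, hypothetical source-to-target mapping; systems, columns, and rules are invented.
stm = [
    {
        "source": "SAP.VBAK.NETWR",
        "target": "DW.FACT_SALES.NET_REVENUE",
        "transform": "CAST(NETWR AS DECIMAL(18,2))",
        "rule": "reject if NULL",
    },
    {
        "source": "SAP.VBAK.AUDAT",
        "target": "DW.FACT_SALES.ORDER_DATE_KEY",
        "transform": "TO_NUMBER(TO_CHAR(AUDAT, 'YYYYMMDD'))",
        "rule": "default 19000101 if NULL",
    },
]

# Print the mapping in the source -> target form reviewers typically expect.
for row in stm:
    print(f"{row['source']} -> {row['target']}: {row['transform']} ({row['rule']})")
```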

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our Company Teradata is the connected multi-cloud data platform for enterprise analytics company. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment. Ignite the Future of Language with AI at Teradata! What You'll Do: Shape the Way the World Understands Data Are you ready to be at the forefront of a revolution? At Teradata, we're not just managing data; we're unlocking its hidden potential through the power of Artificial Intelligence and Machine Learning. As a key member of our innovative AI/ML team, you'll be architecting, building, and deploying cutting-edge software solutions that will literally transform languages within the Teradata Vantage platform – a cornerstone of our strategic vision and a powerhouse in the analytics world. Dive deep into the performance DNA of AI/ML applications. You'll be the detective, identifying and crushing bottlenecks to ensure our solutions not only scale massively but also deliver lightning-fast results. Your mission? To champion quality at every stage, tackling the unique and exhilarating challenges presented by AI/ML in the cloud. Become an integral part of a brilliant, tightly-knit team where collaboration isn't just a buzzword – it's how we create world-class, enterprise-grade software that pushes boundaries. You'll be a knowledge champion, diving into the intricacies of our domain, crafting compelling documentation, and sharing your expertise to inspire other teams. Your proficiency in Python, Java, Go, C++, along with Angular for frontend development, will be instrumental in delivering high-impact, full stack software that performs seamlessly, ensures long-term durability, optimizes costs, and upholds the highest standards of security. Unleash your inner API artisan! We're looking for someone with a genuine passion for crafting incredibly simple yet powerfully functional APIs that will be the backbone of our intelligent systems. Ready to paint the web with pixel-perfect magic? We're looking for a full stack developer who lives for clean code but dreams in UI – if crafting seamless, stunning frontends is your jam, this is your playground! Step into an agile, dynamic environment that feels like a startup but with the backing of an industry leader. You'll thrive on rapidly evolving business needs, directly impacting our trajectory and delivering quality solutions with speed and precision. Get ready to explore uncharted territories, creatively solve complex puzzles, and directly contribute to groundbreaking advancements. Who You'll Work With: Join Forces with the Best Imagine collaborating daily with some of the brightest minds in the company – individuals who champion diversity, equity, and inclusion as fundamental to our success. You'll be part of a cohesive force, laser-focused on delivering high-quality, critical, and highly visible AI/ML functionality within the Teradata Vantage platform. Your insights will directly shape the future of our intelligent data solutions. You'll report directly to the inspiring Sr. Manager, Software Engineering, who will champion your growth and empower your contributions. 
What Makes You a Qualified Candidate: Skills That Deliver Impact
- 5+ years of industry experience developing and operating software systems that handle massive scale
- Mastery of Java with the Spring Framework and Angular, amplified by expertise in AI/ML, Kubernetes, microservices architecture, and DevOps methodologies
- Bonus points for proficiency in Go, Python, FastAPI, or other object-oriented languages – the more versatile your tech stack, the stronger your impact
- A strong command of AI/ML algorithms, methodologies, tools, and the best practices for building robust AI/ML systems
- Rock-solid foundational knowledge of data structures and algorithms
- Skill in full-stack development with a strong focus on test-first TDD practices and comprehensive unit testing across the entire application stack
- A strong advantage: hands-on experience with AI/ML orchestration tools such as LangChain and MLflow, streamlining the training, evaluation, and deployment of AI/ML models
- A demonstrated interest in AI observability, particularly in monitoring and mitigating model drift to ensure sustained accuracy and reliability
- Knowledge of containerization and orchestration tools like Docker and Kubernetes – a significant plus in our cloud-native world
- Analytical and problem-solving skills sharp enough to cut through any challenge
- A good grasp of designing complex systems, balancing scalability with simplicity in architecture and implementation
- A team player with experience in group software development and fluency with version control tools, especially Git
- Legendary debugging skills – you can track down and squash bugs with finesse
- Excellent oral and written communication skills, capable of producing clear and concise runbooks and technical documentation for both technical and non-technical audiences
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL and MySQL is a plus

What You Bring: Passion and Product Thinking
- A Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field – your academic foundation is key
- A genuine excitement for AI and large language models (LLMs) – you'll be working at the cutting edge
- Experience with analytics – a huge plus in our data-driven environment
- You thrive in ambiguity, tackling undefined problems with an abstract and innovative mindset
- Experience driving product vision to deliver long-term value for our customers
- Readiness to own the entire development lifecycle, from initial requirements to deployment and ongoing support
- Knowledge of open-source tools and technologies, and how to leverage and extend them to build innovative solutions
- Passion for AI/ML, especially in building smart, agent-driven interfaces that feel human
- An ownership mindset – you build, deploy, iterate, and scale with a long-term view

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work.
We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Senior Advisor, Data Science

Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and shape brand new methodologies, tools, statistical methods, and models. What's more, we collaborate with leading academics, industry experts, and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data. Join us as a Senior Advisor on our Data Science team in Bangalore to do the best work of your career and make a profound social impact.

What You'll Achieve
You will:
- Develop Gen AI-based solutions to tackle real-world challenges using extensive datasets of text, images, and more
- Design and manage experiments; research new algorithms and optimization methods
- Build and maintain data pipelines and platforms to operationalize machine learning models at scale
- Demonstrate a passion for blending software development with Gen AI and ML

Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:

Essential Requirements
- Design, implement, test, and maintain ML solutions within Dell's services organization
- Engage in design discussions, code reviews, and interactions with various stakeholders
- Collaborate across functions to influence business solutions with technical expertise
- Thrive in a startup-like environment, focusing on high-priority tasks

Desired Requirements
- Proficiency with data science platforms (Domino Data Lab, Microsoft Azure, AWS, Google Cloud)
- Deep knowledge of ML, data mining, statistics, NLP, or related fields
- Experience in object-oriented programming (C#, Java) and familiarity with Python, Spark, TensorFlow, XGBoost
- Experience productionizing ML models and scaling them for low-latency environments
- Proficiency in data mining, ETL, SQL OLAP, Teradata, Hadoop
- 8+ years of related experience with a bachelor's degree; or 6+ years with a master's; or 3+ years with a PhD; or equivalent experience

Who We Are
We believe that each of us has the power to make an impact. That's why we put our team members at the center of everything we do. If you're looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we're looking for you.

Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live and play. Join us to build a future that works for everyone because Progress Takes All of Us.

Application closing date: 20th Aug 2025

Dell Technologies is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment.
Read the full Equal Employment Opportunity Policy here.

Job ID: R274037

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be responsible for developing and maintaining data pipelines and ETL processes in Teradata and Snowflake databases. Your main focus will be on optimizing database performance, ensuring data quality, designing data models and schemas, collaborating with Power BI developers to provide data solutions, and validating data within the databases.

To excel in this role, you must possess expertise in Teradata and Snowflake database development, strong SQL and data warehousing skills, exposure to Power BI, hands-on experience with ETL tools and processes, and knowledge of data modeling and schema design. The must-have technologies for this position are Teradata, Snowflake, SQL, and ETL tools. Skills in Python, cloud data platforms, or other cloud technologies would be beneficial.

This position requires a minimum of 5-6 years of relevant experience. Please note that if you are selected after the initial L1 interview, you may be required to come to one of the listed locations (Chennai, Bengaluru, Hyderabad) for the L2 interview. The shift timing for this role is from 2 pm to 11 pm.
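As a rough illustration of the ETL work described above, an incremental load in Teradata or Snowflake commonly ends in an upsert of this shape. The sketch assumes hypothetical staging and target tables (stg.orders_delta, dw.orders); none of these names come from the posting.

    -- Hypothetical incremental upsert from a staging table into a target;
    -- all object and column names are illustrative.
    MERGE INTO dw.orders AS tgt
    USING stg.orders_delta AS src
        ON tgt.order_id = src.order_id
    WHEN MATCHED THEN UPDATE SET
        order_status = src.order_status,
        updated_ts   = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
        (order_id, customer_id, order_status, updated_ts)
        VALUES (src.order_id, src.customer_id, src.order_status, CURRENT_TIMESTAMP);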

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and shape brand new methodologies, tools, statistical methods and models. What's more, we collaborate with leading academics, industry experts, and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data. Join us to do the best work of your career and make a profound social impact as a Senior Data Engineering Advisor on our Data Engineering Team in Bangalore.

As a Senior Data Engineering Advisor, you will be responsible for developing technical tools and programs to automate the data management process, integrating medium to large structured data sets. You will have the opportunity to partner with Data Scientists, Architects, or Businesses to design strategic projects and improve complex processes.

You will:
- Design and build analytics solutions that deliver transformative insights from extremely large data sets
- Design, develop, and implement web applications for self-service delivery of analytics
- Design and develop APIs, database objects, machine learning algorithms, and the necessary server-side code to support applications
- Work closely with team members to quickly integrate new components and features into the current application ecosystem
- Continuously evaluate industry trends for opportunities to utilize new technologies and methodologies, and implement these into the solution stack as appropriate

Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:

Essential Requirements
- Bachelor's or advanced degree in Computer Science, Applied Mathematics, Engineering, or a related field, with 8 to 12 years of experience using data technologies to deliver cutting-edge business intelligence solutions
- Strong communication and presentation skills, with the ability to articulate big data platforms and solution designs to both technical and non-technical stakeholders
- Knowledge of key data engineering technologies, including big data tools, cloud services, and object-oriented or object-function scripting languages
- Working knowledge of statistics and machine learning concepts
- Demonstrated ability to write, tune, and debug performant SQL
- Experience with related technologies such as C#, .NET, HTML5, JavaScript, SQL Server, Teradata, Hadoop, Spark, and R
- Demonstrated ability to create rich web interfaces using a modern client-side framework
- Strong knowledge of computer science fundamentals and a demonstrated ability to apply them effectively in the real world

Desirable Requirements
- Bachelor's degree

Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live, and play. If you're looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we're looking for you. Join us to build a future that works for everyone because Progress Takes All of Us.
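The "write, tune, and debug performant SQL" requirement above often comes down to rewrites like the one below, shown as a minimal sketch over a hypothetical orders table: replacing a per-row correlated subquery with a single windowed pass.

    -- Slow form (re-evaluates the subquery for every row):
    --   SELECT * FROM orders o
    --   WHERE order_ts = (SELECT MAX(order_ts) FROM orders i
    --                     WHERE i.customer_id = o.customer_id);
    -- Windowed rewrite: one scan, rank rows per customer, keep the latest.
    SELECT order_id, customer_id, order_ts
    FROM (
        SELECT order_id, customer_id, order_ts,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY order_ts DESC) AS rn
        FROM orders
    ) latest
    WHERE rn = 1;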

Posted 1 week ago

Apply

6.0 - 11.0 years

0 Lacs

Bengaluru

Hybrid

Dear Candidate,

Greetings of the day!! We are hiring for one of our leading clients.

Designation: Sr. Software Engineer / Sr. Developer / Sr. Consultant
Location: Bangalore
Mode: Hybrid
Experience: 6+ years
Notice Period: Immediate joiner / serving notice period / 15 days
Mandatory Skills: Teradata

Requirements:
i) Overall 6+ years of experience
ii) Strong SQL skills, with at least 5 of the total 6+ years in general SQL and at least 1-2 years of hands-on, Teradata-specific usage
iii) Good Unix skills for running SQL through Unix wrapper scripts
iv) Good problem-solving skills and excellent communication skills

Interested candidates can share an updated resume at sheeza.ahmed@thehrsolutions.in or call 9760984797.

Thanks & Regards
Sheeza Ahmed
Sr. HR Consultant
HR SOLUTIONS
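To make the Teradata-specific SQL expectation concrete: a distinctly Teradata idiom is the session-scoped volatile table, the kind of statement a Unix wrapper script would submit through a utility such as BTEQ. The sketch below is illustrative only; every object name in it is invented.

    -- Volatile tables live only for the session, so the wrapper script
    -- needs no cleanup DDL afterwards; all names here are hypothetical.
    CREATE VOLATILE TABLE vt_active_custs AS (
        SELECT customer_id
        FROM sales.orders
        WHERE order_date >= CURRENT_DATE - 30
        GROUP BY customer_id
    ) WITH DATA
    ON COMMIT PRESERVE ROWS;

    SELECT c.customer_id, c.region
    FROM dim.customer AS c
    JOIN vt_active_custs AS v
        ON v.customer_id = c.customer_id;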

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site

Job Summary:
We are seeking a skilled and experienced Teradata Developer with a strong background in data warehousing to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining Teradata-based solutions that support business intelligence and analytics initiatives.

Key Responsibilities:
- Design, develop, and optimize Teradata SQL scripts, procedures, and views
- Implement and maintain ETL processes using Teradata utilities (e.g., BTEQ, FastLoad, MultiLoad, TPT)
- Collaborate with data architects and business analysts to understand data requirements and translate them into technical solutions
- Perform data modeling, schema design, and performance tuning
- Ensure data quality, integrity, and consistency across data warehouse systems
- Troubleshoot and resolve issues related to data loads, queries, and performance
- Participate in code reviews, testing, and deployment activities
- Document technical specifications and maintain system documentation

Required Skills & Qualifications:
- Minimum 5 years of experience in Teradata development
- Strong proficiency in Teradata SQL and utilities
- Solid understanding of data warehousing concepts, architecture, and best practices
- Experience with ETL tools and data integration techniques
- Knowledge of performance tuning and query optimization in Teradata
- Familiarity with data modeling tools (e.g., Erwin, PowerDesigner)
- Experience with Unix/Linux scripting is a plus
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities

Preferred Qualifications:
- Experience with cloud data platforms (e.g., AWS, Azure, GCP)
- Knowledge of BI tools like Tableau, Power BI, or MicroStrategy
- Familiarity with Agile methodologies and DevOps practices

Job Type: Full-time
Schedule: Day shift
Work Location: In person
Application Deadline: 22/07/2025
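For a concrete, hedged picture of the utilities listed above: a minimal BTEQ load step is ordinary SQL driven by BTEQ control commands, as in the sketch below. The logon file path and every database object are hypothetical; exiting on a nonzero ERRORCODE is the standard BTEQ error-handling pattern.

    .RUN FILE = /secure/logon.bteq
    /* Hypothetical nightly staging load; credentials come from the
       logon file above, and all table names are illustrative. */
    DELETE FROM stg.daily_sales;
    .IF ERRORCODE <> 0 THEN .QUIT 1
    INSERT INTO stg.daily_sales
    SELECT * FROM landing.daily_sales_raw;
    .IF ERRORCODE <> 0 THEN .QUIT 2
    .LOGOFF
    .QUIT 0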

Posted 1 week ago

Apply

0.0 - 2.0 years

8 - 10 Lacs

Gurgaon

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
At United, we care about our customers. To be the best airline in aviation history, we need to deliver the best service to our customers. And it takes a whole team of dedicated customer-focused advocates to make it happen! From our Contact Center to customer analytics, insights, innovation, and everything in between, the Customer Experience team delivers outstanding service and helps us run a more customer-centric and dependable airline.

Job overview and responsibilities
United Airlines is set to expand its fleet size and route network in the near future. The goal of this team is to help empower stakeholders across operating teams with timely and relevant data-driven insights to help meet the challenges of scaling the airline, while continuing to maintain dependability and customer focus.
- Extract data from a variety of sources to assist in building operational dashboards and reports
- Leverage data from a variety of sources to analyze business performance and provide actionable insights
- Support creation of presentations for United leadership and external stakeholders
- Execute solutions to business problems using data mining and data analysis

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications
What's needed to succeed (Minimum Qualifications):
- Bachelor's degree in a quantitative field like Math, Statistics, Computer Science, Engineering, or a related field
- 0 to 2 years of experience in analytics
- Analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information
- Attention to accuracy and detail
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of an interview is required to meet job qualifications
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master's degree
- Airline experience or knowledge of airline operations
- Familiarity with database querying tools and the ability to write complex queries and procedures using Teradata SQL and/or Microsoft T-SQL
- Familiarity with reporting tools such as Spotfire, Tableau, or Power BI
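For illustration, the kind of Teradata SQL this team writes against operational data might look like the sketch below: a daily on-time departure summary. The schema (ops.flight_departures and its columns) is hypothetical, not United's.

    -- Hypothetical daily on-time summary per departure station;
    -- table and column names are invented for illustration.
    SELECT dep_station,
           flight_date,
           COUNT(*) AS departures,
           AVG(CASE WHEN dep_delay_min <= 0 THEN 1.0 ELSE 0.0 END) AS on_time_rate
    FROM ops.flight_departures
    WHERE flight_date BETWEEN DATE '2025-06-01' AND DATE '2025-06-30'
    GROUP BY dep_station, flight_date
    ORDER BY dep_station, flight_date;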

Posted 1 week ago

Apply

5.0 years

2 - 8 Lacs

Mohali

On-site

Job Information
Date Opened: 07/18/2025
Job Type: Full time
Industry: Education
Work Experience: 5+ years
City: S.A.S.Nagar (Mohali)
State/Province: Punjab
Country: India
Zip/Postal Code: 160062

Job Description
Job Summary
The Technical Support Engineer (TSE) acts as a Starburst SME for a book of enterprise accounts. The TSE is responsible for answering all technical questions within both standard and custom deployment environments and assisting with supported LTS upgrades. The TSE is also responsible for peer training and development, personal continued education, and contributing to our reference documentation. They will coordinate closely with Support leadership, Engineering, Product and Accounts teams to ensure our customers receive a value driven enterprise experience. A TSE is able to work independently, with minimal guidance, and demonstrates an expert degree of proficiency in both SEP and Galaxy.

Responsibilities
Technical Support:
- Provide support for standard and custom deployments
- Answer break/fix and non-break/fix technical questions through the SFDC ticketing system
- Efficiently reproduce reported issues by leveraging tools (minikube, minitrino, docker-compose, etc.), identify root causes, and provide solutions
- Open SEP and Galaxy bug reports in Jira and feature requests in Aha!

LTS Upgrades:
- Provide upgrade support upon customer request
- The customer must be on a supported LTS version at the time of request
- Communicate unsupported LTS requests to the Account team, as these require PS services

Monthly Technical Check-ins:
- Conduct regularly scheduled technical check-ins with each BU
- Discuss open support tickets, provide updates on product bugs, and provide best-practice recommendations based on your observations and ticket trends
- Ensure customer environments are on supported LTS versions

Knowledge Sharing/Technical Enablement:
Knowledge exchange and continued technical enablement are crucial for the development of our team and the customer experience. It's essential that we keep our product expertise and documentation current and that all team members have access to information.
- Contribute to our reference documentation
- Lead peer training
- Act as a consultant to our content teams
- Own your personal technical education journey

Project Involvement:
- Contribute to or drive components of departmental and cross-functional initiatives

Partner with Leadership:
- Identify areas of opportunity, with potential solutions, for inefficiencies or obstacles within the team and cross-functionally
- Provide feedback to your manager on continued education opportunities, project ideas, etc.

Requirements
- 5+ years of support experience
- 3+ years of experience with Big Data, Docker, Kubernetes, and cloud technologies

Skills
- Big Data (Teradata, Hadoop, Data Lakes, Spark)
- Docker and Kubernetes
- Cloud technologies (AWS, Azure, GCP)
- Security: authentication (LDAP, OAuth 2.0) and authorization technologies
- SSL/TLS
- Linux skills
- DBMS concepts/SQL exposure
- Languages: SQL, Java, Python, Bash

Benefits
Why Join Us?
- Work with a globally recognized team on cutting-edge AI/ML projects
- Be a part of a culture that values curiosity, continuous learning, and impact
- Opportunity to collaborate with Fortune 500 clients and industry leaders
- Grow your career by becoming an instructor and sharing your knowledge worldwide
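Because SEP and Galaxy are built on Trino, the SQL a TSE reproduces customer issues against is frequently federated, joining tables from different catalogs in a single statement. The sketch below is illustrative only; the catalog, schema, and table names are all hypothetical.

    -- Hypothetical federated join across a data-lake catalog and a
    -- CRM database catalog in one Trino/Starburst query.
    SELECT c.segment,
           SUM(o.total_amount) AS revenue
    FROM lake.sales.orders AS o
    JOIN crm.public.customers AS c
        ON o.customer_id = c.customer_id
    GROUP BY c.segment
    ORDER BY revenue DESC
    LIMIT 10;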

Posted 1 week ago

Apply