0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
- 5+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience managing a data or BI team
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience hiring, developing, and promoting engineering talent
- Experience communicating to senior management and customers, verbally and in writing

We are seeking an ambitious Data Engineering Manager to join our Metrics and Data Platform team. The Metrics and Data Platform team plays a critical role in enabling Amazon Music's business decisions and data-driven software development by collecting and providing behavioral and operational metrics to our internal teams. We maintain a scalable and robust data platform to support Amazon Music's rapid growth, and we collaborate closely with data producers and data consumers to accelerate innovation using data.

As a Data Engineering Manager, you will manage a team of talented Data Engineers. Your team collects billions of events a day, manages petabyte-scale datasets on Redshift and S3, and develops data pipelines with Spark, SQL, EMR, and Airflow. You will collaborate with product and technical stakeholders to solve challenging data modeling, data availability, data quality, and data governance problems.

At Amazon Music, engineering managers are the primary drivers of their team's roadmap, priorities, and goals. You will be deeply involved in your team's execution, helping to remove obstacles and accelerate progress. A successful candidate will be customer obsessed, highly analytical and detail oriented, able to work effectively in a data-heavy organization, and adept at leading multiple complex workstreams at once.

Key job responsibilities
- Hiring, motivating, mentoring, and growing a high-performing engineering team
- Owning and managing big data pipelines, Amazon Music's foundational datasets, and the quality and operational performance of those datasets
- Collaborating with cross-functional teams and customers, including business analysts, marketing, product managers, technical program managers, and software engineers/managers
- Defining and managing your team's roadmap, priorities, and goals in partnership with Product, stakeholders, and leaders
- Ensuring timely execution of team priorities and goals by proactively identifying risks and removing blockers
- Recognizing and recommending process and engineering improvements that reduce failures and improve efficiency
- Clearly communicating business updates, verbally and in writing, to both technical and non-technical stakeholders, peers, and leadership
- Effectively influencing other teams' priorities and managing escalations
- Owning and improving business and operational metrics of your team's software
- Ensuring team compliance with policies (e.g., information security, data handling, service level agreements)
- Identifying ways to leverage GenAI to reduce operational overhead and improve execution velocity
- Introducing ideas to evolve and modernize our data model to address customer pain points and improve query performance

About the team
Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators.
From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, Amazon Music is innovating at some of the most exciting intersections of music and culture. We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all the music in shuffle mode, and top ad-free podcasts, included with their membership; customers can upgrade to Amazon Music Unlimited for unlimited, on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale. Learn more at https://www.amazon.com/music.

- Experience with AWS tools and technologies (Redshift, S3, EC2)
- Experience in processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark, or a Hadoop-based big data solution)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
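The posting above describes pipelines that process events landed in S3 with Spark on EMR, orchestrated by Airflow. Purely as an illustration of that pattern (and not Amazon Music's actual code), here is a minimal Airflow DAG sketch; the DAG id, bucket, and script path are hypothetical placeholders.

```python
# Illustrative only: a minimal Airflow DAG that submits a daily Spark job.
# The pipeline name, S3 paths, and spark-submit target are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_listening_events",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",             # one run per day of event data
    catchup=False,
) as dag:
    # Submit a PySpark job (for example, on an EMR edge node) that aggregates
    # raw events landed in S3 and loads the result into a warehouse table.
    aggregate_events = BashOperator(
        task_id="aggregate_events",
        bash_command=(
            "spark-submit s3://example-bucket/jobs/aggregate_events.py "
            "--ds {{ ds }}"                 # Airflow-templated run date
        ),
    )
```

In practice the single BashOperator would be replaced by several tasks (ingest, transform, load, data quality check) wired together with dependencies.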
Posted 1 month ago
0 years
0 - 0 Lacs
Chennai, Tamil Nadu
Work from Office
- Create COBOL, DB2, Informatica, Linux, Teradata and Oracle code artifacts
- Meet with various IT groups (other departments and computer operations staff) to address issues/concerns
- Interact closely with the Business Analysis team, ETL team and BI Reporting teams to ensure understanding of proper use of data architecture
- Analyze requirements to create technical designs, data models and migration strategies
- Design, build, and maintain physical databases, dimensional data models, OLAP cubes, ETL layer design and data integration strategies
- Evaluate and influence selection of data warehouse and business intelligence software
- Collaborate with technology stakeholders to define and implement actionable metrics, KPIs and data visualizations
- Lead technical design and implementation of dashboards and reporting capabilities
- Implement data quality, data integrity, and data standardization efforts across products and databases, enabling key business processes and applications
- Recommend improvements to enhance existing ETL and data integration processes to enable performance and overall scalability

Job Types: Full-time, Permanent, Fresher
Pay: ₹18,455.00 - ₹28,755.00 per month
Benefits: Provident Fund
Schedule: Day shift / Morning shift / Rotational shift
Supplemental Pay: Yearly bonus
Work Location: In person
Posted 1 month ago
5 - 7 years
6 - 11 Lacs
Chennai
Work from Office
Job Description
Role: Data Engineer
Experience level: 5 to 7 years
Location: Chennai

Can you say "Yes, I have!" to the following?
- Good understanding of distributed system architecture, data lake design and best practices
- Working knowledge of cloud-based deployments in AWS, Azure or GCP
- Coding proficiency in at least one programming language (Scala, Python, Java)
- Experience in Airflow is preferred
- Experience in data warehousing and relational database architectures (Oracle, SQL, DB2, Teradata)
- Expertise in Big Data storage and processing platforms (Hadoop, Spark, Hive, HBase)

Skills:
- Problem solver, fast learner, energetic and enthusiastic
- Self-motivated and highly professional, with the ability to lead, and take ownership and responsibility
- Adaptable and flexible to business demands

Can you say "Yes, I will!" to the following?
- Lead analytical projects and deliver value to customers
- Coordinate individual teams to fulfil client requirements and manage deliverables
- Communicate and present complex concepts to business audiences
- Manage and strategize business from an analytics point of view
- Travel to client locations when necessary
- Design algorithms for product development and build analytics-based products
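As a flavor of the data lake and Spark work listed above, the sketch below shows a small PySpark job that reads raw files from a lake path, applies a Spark SQL transformation, and writes a partitioned curated table. It is illustrative only; the paths, table names, and columns are invented.

```python
# Illustrative PySpark job: raw zone -> curated zone in a data lake.
# Paths, schema, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = spark.read.parquet("s3a://example-lake/raw/orders/")   # raw zone
raw.createOrReplaceTempView("orders_raw")

# Spark SQL transformation: basic cleansing plus a daily aggregation.
curated = spark.sql("""
    SELECT order_date,
           customer_id,
           SUM(amount) AS total_amount
    FROM orders_raw
    WHERE amount IS NOT NULL
    GROUP BY order_date, customer_id
""")

# A partitioned write keeps downstream queries cheap via partition pruning.
(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://example-lake/curated/orders_daily/"))

spark.stop()
```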
Posted 1 month ago
2 - 6 years
7 - 17 Lacs
Hyderabad
Work from Office
About this role: Wells Fargo is seeking an Analytics Consultant. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow.

In this role, you will:
- Consult with business line and enterprise functions on less complex research
- Use functional knowledge to assist in non-model quantitative tools that support strategic decision making
- Perform analysis of findings and trends using statistical analysis and document the process
- Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance
- Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
- Participate in all group technology efforts including design and implementation of database structures, analytics software, storage, and processing
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
- Understand compliance and risk management requirements for the supported area
- Ensure adherence to data management or data governance regulations and policies
- Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups

Required Qualifications:
- 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Responsible for maintaining partner relationships and ensuring high-quality team deliverables and SLAs
- Working closely with the US partners on a daily basis, interacting closely with multiple business partners and program managers
- Working independently and fostering a culture of healthy and efficient working for the team
- Designing and solving complex business problems using analytical techniques and tools
- Involved directly in the technical build-out and/or support of databases, query tools, reporting tools, BI tools, dashboards, etc. that enable analysis, modeling, and/or advanced data visualization, including development of Business Objects reports using multiple databases
- Recommends potential data sources; compiles and mines data from multiple, cross-business sources
- Works with typically very large data sets, both structured and unstructured, and from multiple sources
- Develops specific, customized reports, ad hoc analyses, and/or data visualizations, formatted with business user-friendly techniques to drive adoption, such as Excel macros/pivoting/filtering, PowerPoint slides and presentations, and clear verbal and e-mail communications
- Works with senior consultants or directly with partners, responsible for identifying and defining business requirements and translating business needs into moderately complex analyses and recommendations
- Works with local and international colleagues and with internal customers, responsible for identifying and defining business requirements and catering to the team's business needs
- Ensures adherence to data management/data governance regulations and policies
- Applies knowledge of business, customers, and/or products/services/portfolios to synthesize data to 'form a story' and align information to compare and contrast with the industry perspective
- Ability to work overlap hours with the US team
- 2+ years of experience in one or more of the following: modeling, forecasting, decision trees, and other statistical and performance analytics
- 2+ years of experience in one or more of the following: Tableau or Power BI, and paginated reports
- 2+ years of experience with Python and SQL
- 2+ years of experience in developing and creating BI dashboards, working on end-to-end reports, and deriving insights from data
- Excellent verbal, written, and interpersonal communication skills
- Extensive knowledge and understanding of research and analysis
- Strong analytical skills with high attention to detail and accuracy
- Collaborative, team-focused attitude
- Experience with Teradata/Oracle databases
- Experience with Power Automate and GitHub
- Domain knowledge within banking
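To make the "customized reports and ad hoc analyses" work above concrete, here is a minimal, self-contained sketch of the kind of pivot an analytics consultant might produce in Python. The dataset and column names are invented for the example; real work would pull from Teradata or Oracle via SQL before the reshaping step.

```python
# Illustrative ad hoc analysis: pivot raw records into a business-friendly summary.
# The DataFrame below stands in for data normally pulled via SQL from a warehouse.
import pandas as pd

records = pd.DataFrame({
    "month":   ["2024-01", "2024-01", "2024-02", "2024-02"],
    "segment": ["Retail", "Commercial", "Retail", "Commercial"],
    "revenue": [120.0, 340.0, 150.0, 310.0],
})

# Pivot to one row per month, one column per segment -- the shape business
# partners usually want for an Excel or PowerPoint readout.
summary = records.pivot_table(index="month", columns="segment",
                              values="revenue", aggfunc="sum")
print(summary)
```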
Posted 1 month ago
4 - 9 years
7 - 17 Lacs
Bengaluru
Work from Office
About this role: Wells Fargo is seeking a...

In this role, you will:
- Participate in low-risk initiatives within Risk Analytics
- Review process production and model documentation in alignment with policy, analyzing trends in the current population
- Receive direction from your manager
- Exercise judgment within Risk Analytics while developing an understanding of analytic models, policies, and procedures
- Provide monthly, quarterly, and annual reports to your manager and experienced managers

Required Qualifications:
- 6+ months of Risk Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Required Qualifications for Europe, Middle East & Africa only:
- Experience in Risk Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- 4+ years of experience with SQL, Teradata, and/or Hadoop
- 4+ years of experience with BI tools such as Tableau, Power BI, or Alteryx
- 3+ years of experience in risk (including compliance, financial crimes, operational, audit, legal, credit risk, market risk)
- Experience researching and resolving data problems and working with technology teams on remediation of data issues
- Demonstrated strong analytical skills with high attention to detail and accuracy
- Excellent verbal, written, and listening communication skills

Job Expectations:
- Participate in complex initiatives related to business analysis and modeling, including those that are cross-functional with broad impact, and act as a key participant in data aggregation and monitoring for Risk Analytics
- Fully understand data quality checks, methodology, and the dimensions for data completeness and accuracy, and ensure that policies and procedures are followed
- Become a SME in the reporting and DQ check elements and the technology infrastructure utilized, and fully understand the metadata and lineage from DQ report to source data
- Escalate potential risks, issues, or calendar/timeliness concerns in a timely manner to management and the Data Management SharePoint
- Ensure the organization and storage of DQ check artifacts, files, and evidence is effective, efficient, and sensible
- Perform deep-dive analytics (both ad hoc and structured) and provide reporting or results to both internal and external stakeholders
- Design and build rich data visualizations to communicate complex ideas and automate reporting and controls
- Create and interpret Business Intelligence data (reporting, basic analytics, predictive analytics, and prescriptive analytics) combined with business knowledge to draw supportable conclusions about current and future risk levels
- Identify and implement areas of opportunity for quality assurance, data validation, analytics, and data aggregation to improve overall reporting efficiency
- Create and execute UAT test cases, logging defects and managing them through to closure
- Collaborate and consult with peers and with less experienced to more experienced managers to resolve production, project, and regulatory issues and achieve shared risk analytics and modeling goals
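The data quality checks referenced above (completeness and accuracy dimensions, with escalation when thresholds are missed) often reduce to simple rule evaluations over a dataset. The sketch below is an illustrative, self-contained example in Python, not Wells Fargo's actual DQ framework; the columns, rules, and 95% threshold are assumptions.

```python
# Illustrative data quality checks over a small sample dataset.
# Thresholds, rules, and columns are assumptions for the example.
import pandas as pd

accounts = pd.DataFrame({
    "account_id": [1, 2, 3, 4],
    "balance":    [100.0, None, 250.0, -5.0],
})

# Completeness dimension: share of non-null balances must meet a threshold.
completeness = accounts["balance"].notna().mean()

# Accuracy (validity) rule: balances should be non-negative.
accuracy = (accounts["balance"].dropna() >= 0).mean()

results = {"completeness": completeness, "accuracy": accuracy}
failures = {dim: score for dim, score in results.items() if score < 0.95}
status = "ESCALATE" if failures else "PASS"   # failed checks would be escalated
print(results, status, failures)
```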
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu
Work from Office
Job Description
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Job Description - Grade Specific
The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable and high-quality data solutions to support the organization's data-driven objectives.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, SPARK, SPARK Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 1 month ago
2 - 7 years
4 - 9 Lacs
Pune
Work from Office
Role Overview: We are seeking an experienced ETL Developer with strong expertise in Informatica PowerCenter and Teradata to design and implement robust data integration solutions. This role involves end-to-end ownership of ETL workflows, performance optimization, and close collaboration with business and technical stakeholders to support enterprise data warehouse initiatives.

Key Responsibilities:

ETL Development (Informatica PowerCenter):
- Design, develop, and implement scalable ETL processes using Informatica PowerCenter
- Extract, transform, and load data from multiple source systems into the Teradata data warehouse
- Create, manage, and optimize ETL workflows and mappings

Teradata Database Management:
- Create and manage tables, indexes, stored procedures, and other database objects in Teradata
- Ensure optimal database performance and maintain scalable data structures

Data Mapping and Transformation:
- Develop data mapping specifications and define transformation logic
- Implement data cleansing, validation, and transformation rules within ETL processes (see the illustrative sketch below)

Performance Tuning:
- Optimize ETL performance by tuning SQL queries, mappings, and workflows
- Identify and resolve performance bottlenecks in the ETL and data integration pipeline

Documentation:
- Maintain detailed documentation for ETL jobs, data mappings, SQL scripts, and Teradata configurations
- Ensure adherence to coding standards and best practices

Collaboration & Quality Assurance:
- Work closely with data architects, business analysts, and cross-functional teams to understand data requirements and ensure accurate data delivery
- Conduct unit, system, and integration testing to validate ETL workflows
- Troubleshoot and resolve data-related issues in a timely and efficient manner

Required Skills:
- Hands-on experience in ETL development using Informatica PowerCenter
- Strong knowledge of Teradata and its ecosystem
- Proficient in SQL, with experience in query optimization and performance tuning
- Solid understanding of data modeling, data warehousing concepts, and ETL architecture
- Ability to create detailed and clear technical documentation
- Familiarity with data quality, validation, and …
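The cleansing and validation rules mentioned above are normally built as PowerCenter expressions and lookups; purely to illustrate the kind of logic involved, the sketch below expresses one such rule in plain Python. The field names, formats, and reject behavior are hypothetical, not the employer's actual specification.

```python
# Illustrative cleansing/validation logic of the kind implemented in ETL mappings.
# Field names, date formats, and reject rules are hypothetical.
from datetime import datetime

def cleanse_customer(row):
    """Standardize a source record; return None to route it to a reject file."""
    # Standardization rule: trim whitespace and upper-case the customer name.
    name = (row.get("cust_name") or "").strip().upper()
    if not name:
        return None                      # validation: name is mandatory

    # Parse the source date format into the warehouse's ISO format.
    try:
        dob = datetime.strptime(row["dob"], "%d/%m/%Y").date().isoformat()
    except (KeyError, ValueError):
        dob = None                       # unparseable dates load as NULL

    return {"cust_name": name, "dob": dob}

print(cleanse_customer({"cust_name": "  jane doe ", "dob": "07/04/1990"}))
```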
Posted 1 month ago
7 years
0 Lacs
Chennai, Tamil Nadu, India
Job Summary: We are looking for an experienced Senior Software Engineer with deep expertise in Spark SQL / SQL development to lead the design, development, and optimization of complex database systems. As a Senior Spark SQL/SQL Developer, you will play a key role in creating and maintaining high-performance, scalable database solutions that meet business requirements and support critical applications. You will collaborate with engineering teams, mentor junior developers, and drive improvements in database architecture and performance.

Key Responsibilities:
- Design, develop, and optimize complex Spark SQL / SQL queries, stored procedures, views, and triggers for high-performance systems
- Lead the design and implementation of scalable database architectures to meet business needs
- Perform advanced query optimization and troubleshooting to ensure database performance, efficiency, and reliability (see the illustrative sketch below)
- Mentor junior developers and provide guidance on best practices for SQL development, performance tuning, and database design
- Collaborate with cross-functional teams, including software engineers, product managers, and system architects, to understand requirements and deliver robust database solutions
- Conduct code reviews to ensure code quality, performance standards, and compliance with database design principles
- Develop and implement strategies for data security, backup, disaster recovery, and high availability
- Monitor and maintain database performance, ensuring minimal downtime and optimal resource utilization
- Contribute to long-term technical strategies for database management and integration with other systems
- Write and maintain comprehensive documentation on database systems, queries, and architecture

Required Skills & Qualifications:
- Experience: 7+ years of hands-on experience as a SQL developer, in data engineering, or in a related field
- Expert-level proficiency in Spark SQL and extensive experience with big data (Hive), MPP (Teradata), and relational databases such as SQL Server, MySQL, or Oracle
- Strong experience in database design, optimization, and troubleshooting
- Deep knowledge of query optimization, indexing, and performance tuning techniques
- Strong understanding of database architecture, scalability, and high availability strategies
- Experience with large-scale, high-transaction databases and data warehousing
- Strong problem-solving skills with the ability to analyze complex data issues and provide effective solutions
- Data testing and data reconciliation
- Ability to mentor and guide junior developers and promote best practices in SQL development
- Proficiency in database migration, version control, and integration with applications
- Excellent communication and collaboration skills, with the ability to interact with both technical and non-technical stakeholders

Preferred Qualifications:
- Experience with NoSQL databases (e.g., MongoDB, Cassandra) and cloud-based databases (e.g., AWS RDS, Azure SQL Database)
- Familiarity with data analytics, ETL processes, and data pipelines
- Experience with automation tools, CI/CD pipelines, and agile methodologies
- Familiarity with programming languages such as Python, Java, or C#

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
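One common Spark SQL optimization of the kind described above is broadcasting a small dimension table so a join avoids a full shuffle. The sketch below is illustrative only; the tables are generated in-line so the example is self-contained, and the names have no connection to the employer's systems.

```python
# Illustrative Spark SQL tuning: broadcast a small dimension table so the join
# uses a BroadcastHashJoin instead of a shuffle-heavy sort-merge join.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark_sql_tuning_demo").getOrCreate()

fact = spark.range(0, 1_000_000).withColumnRenamed("id", "customer_id")
dim = spark.createDataFrame(
    [(i, f"segment_{i % 3}") for i in range(100)],
    ["customer_id", "segment"],
)
fact.createOrReplaceTempView("transactions")
dim.createOrReplaceTempView("customer_dim")

# The BROADCAST hint tells the optimizer the dimension fits in executor memory;
# EXPLAIN should then show a broadcast join in the physical plan.
joined = spark.sql("""
    SELECT /*+ BROADCAST(customer_dim) */ d.segment, COUNT(*) AS txn_count
    FROM transactions t
    JOIN customer_dim d ON t.customer_id = d.customer_id
    GROUP BY d.segment
""")
joined.explain()

spark.stop()
```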
Posted 1 month ago
1 years
0 Lacs
Gurugram, Haryana, India
Hybrid
You Lead the Way. We've Got Your Back.

With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you will learn and grow as we help you create a career journey that is unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you will be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we will do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

The Global Servicing (GS) organization delivers extraordinary customer care to Card Members, merchants and commercial clients around the world, while providing world-class credit, collections and fraud services. The GS Servicing Insights & MIS team (part of Global Servicing Enablement, GSE) is the primary point of contact for all GS information needs and is responsible for Executive Decision Support through advanced analytics and MIS. The team has a global footprint and this position will be based out of the American Express Service Center in Gurgaon, India.

Purpose of the Role: MIS and Analytics to support GS

Responsibilities:
· Providing Analytical & Decision Support across GS through advanced analytics (from sourcing to staging data, generating insights to exposing them for consumption via reporting platforms/strategy implementation)
· Enabling business user self-service through creation of MIS capabilities
· Systematically identify out-of-pattern activities in a timely manner and address information gaps by providing insightful analytics
· Working independently, assuming responsibility for the development, validation and implementation of projects
· Participate on global teams evaluating processes and making suggestions for process and system improvements
· Interacting with all levels of the organization across multiple time zones.
Critical Factors to Success:
· Ensure timely and accurate MIS based on customer requirements
· Centrally manage MIS and key operational metrics and address functional data needs across operations and support teams
· Provide an analytical and decision support framework and address information gaps through insightful analytics and developing lead indicators
· Build collaborative relationships across GS groups and participate on global teams evaluating processes and making suggestions for process and system improvements
· Put enterprise thinking first, connect the role's agenda to enterprise priorities and balance the needs of customers, partners, colleagues & shareholders

Past Experience:
· Preferably a minimum of 2 years' experience, with at least 1 year in Quantitative Business Analysis/Data Science and experience in handling large data sets

Academic Background:
· Bachelor's degree or equivalent, preferably in a quantitative field
· A post-graduate degree in a quantitative field will be an added advantage

Functional Skills/Capabilities:
· Must possess strong quantitative and analytical skills and be a conceptual and innovative thinker
· Project management skills and the ability to identify and translate business information needs into insights and information cubes for ease of consumption in reporting and analytics
· Proven thought leadership, strong communication, and relationship management skills
· Ability to work on multiple projects simultaneously, with flexibility and adaptability to work within tight deadlines and changing priorities
· Data presentation and visualization skills

Technical Skills/Capabilities:
· Excellent programming skills in Hive/SAS/SQL/Teradata are essential, with a good understanding of Big Data ecosystems
· Exposure to visualization using Business Intelligence software like Tableau or QlikView will be an added advantage

Knowledge of Platforms:
· Advanced knowledge of Microsoft Excel and PowerPoint, Word, Access and Project

Behavioral Skills/Capabilities:
Set The Agenda: Define What Winning Looks Like, Put Enterprise Thinking First, Lead with an External Perspective
Bring Others With You: Build the Best Team, Seek & Provide Coaching Feedback, Make Collaboration Essential
Do It The Right Way: Communicate Frequently, Candidly & Clearly, Make Decisions Quickly & Effectively, Live the Blue Box Values, Great Leadership Demands Courage

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law.
Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 month ago
5 - 8 years
0 Lacs
Pune, Maharashtra, India
Entity: Technology
Job Family Group: IT&S Group

Job Description:
Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvements. Will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. Will work closely with the Product Owner to understand requirements / user stories and have the ability to plan and estimate the time taken to deliver the user stories. Will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. Will be very highly skilled and experienced in the use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL, and Athena.

Years of Experience: 13-15

Essential domain expertise:
- Experience in Big Data technologies – AWS, Redshift, Glue, PySpark
- Experience of MPP (Massively Parallel Processing) databases helpful – e.g. Teradata, Netezza
- Challenges involved in Big Data – large table sizes (e.g. depth/width), even distribution of data
- Experience of programming – SQL, Python
- Data modelling experience/awareness – Third Normal Form, Dimensional Modelling
- Data pipelining skills – data blending, etc.
- Visualisation experience – Tableau, PBI, etc.
- Data management experience – e.g. Data Quality, Security, etc.
- Experience of working in a cloud environment – AWS
- Development/delivery methodologies – Agile, SDLC
- Experience working in a geographically disparate team

Travel Requirement: Up to 10% travel should be expected with this role

Relocation Assistance: This role is eligible for relocation within country

Remote Type: This position is a hybrid of office/remote working

Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms (Inactive), Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us.
If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 month ago
4 - 8 years
18 - 30 Lacs
Bengaluru
Work from Office
Only immediate joiners based in Bangalore should apply. Candidates must work from the client office 3 days a week in Whitefield. No relocation candidates.

Skills and Knowledge:
- Hands-on experience with Teradata: proficiency in Teradata utilities (BTEQ, FastLoad, MultiLoad, and TPT for data loading), SQL, and data warehousing concepts
- Deep knowledge of Teradata architecture: understanding of Teradata's architecture, including star schema, snowflake schema, and data modeling techniques
- SQL optimization skills: ability to write and optimize complex SQL queries for performance (see the illustrative sketch below)
- ETL process development: experience in designing and implementing ETL processes using Teradata utilities
- Teradata performance tuning: knowledge of performance tuning techniques, workload management, and partitioning strategies
- Data warehousing concepts: strong understanding of data warehouse concepts and best practices
- Ensuring database security and availability: implementing measures to protect data integrity and maintain system uptime
- Database modeling and design: creating and managing database structures, including star schema and snowflake schema
- Troubleshooting and resolving performance issues: identifying and addressing performance bottlenecks in the Teradata system
- Collaborating with business analysts: understanding data requirements and translating them into technical solutions
- Creating and maintaining technical documentation: documenting database designs, ETL processes, and other technical aspects
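Teradata performance work of the kind listed above usually starts with the primary index choice, fresh statistics, and an EXPLAIN of the query plan. The sketch below shows that flow issued through the open-source teradatasql Python driver; it is illustrative only, the host, credentials, database, and table are placeholders, and exact COLLECT STATISTICS syntax and connect arguments can vary by Teradata release and driver version.

```python
# Illustrative Teradata tuning workflow via the teradatasql driver.
# Connection details and the sandbox table are placeholders.
import teradatasql

DDL = """
CREATE TABLE sandbox.sales_fact (
    sale_id     BIGINT,
    customer_id INTEGER,
    sale_date   DATE,
    amount      DECIMAL(12,2)
) PRIMARY INDEX (customer_id)  -- the PI drives row distribution across AMPs
"""

con = teradatasql.connect(host="td.example.com", user="etl_user", password="***")
try:
    cur = con.cursor()
    cur.execute(DDL)
    # Fresh statistics help the optimizer choose good join and retrieval plans.
    cur.execute("COLLECT STATISTICS COLUMN (customer_id) ON sandbox.sales_fact")
    # EXPLAIN exposes the access path (single-AMP PI access vs. full-table scan).
    cur.execute("EXPLAIN SELECT * FROM sandbox.sales_fact WHERE customer_id = 42")
    for row in cur.fetchall():
        print(row[0])
finally:
    con.close()
```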
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for designing, building and overseeing the deployment and operation of technology architecture, solutions and software to capture, manage, store and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route appropriately and store using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning and big-data techniques to cleanse, organize and transform data and to maintain, defend and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems and applications development to ensure compatibility and operability of data connections, flows and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Works with data modelers/analysts to understand the business problems they are trying to solve, then creates or augments data assets to feed their analysis. Works with moderate guidance in own area of knowledge.

Job Description

Core Responsibilities
- Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize and transform data that helps generate insights and address reporting needs.
- Focuses on ensuring data quality during ingest, processing, as well as final load to the target tables.
- Creates standard ingestion frameworks for structured and unstructured data as well as checking and reporting on the quality of the data being processed.
- Creates standard methods for end users / downstream applications to consume data, including but not limited to database views, extracts and Application Programming Interfaces.
- Develops and maintains information systems (e.g., data warehouses, data lakes) including data access Application Programming Interfaces.
- Participates in the implementation of solutions via data architecture, data engineering, or data manipulation on both on-prem platforms like Kubernetes and Teradata as well as Cloud platforms like Databricks.
- Determines the appropriate storage platform across different on-prem (MinIO and Teradata) and Cloud (AWS S3, Redshift) options depending on the privacy, access and sensitivity requirements.
- Understands the data lineage from source to the final semantic layer along with the transformation rules applied, to enable faster troubleshooting and impact analysis during changes.
- Collaborates with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality as well as process optimization.
- Handles data migrations/conversions as data platforms evolve and new standards are defined.
- Preemptively recognizes and resolves technical issues utilizing knowledge of policies and processes.
- Understands data sensitivity and customer data privacy rules and regulations and applies them consistently in all Information Lifecycle Management activities.
- Identifies and reacts to system notifications and logs to ensure quality standards for databases and applications.
- Solves abstract problems beyond a single development language or situation by reusing data files and flags already set.
- Solves critical issues and shares knowledge such as trends, aggregates, and quantity/volume regarding specific data sources.
- Consistent exercise of independent judgment and discretion in matters of significance.
- Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary.
- Other duties and responsibilities as assigned.

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality – to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.
Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Relevant Work Experience: 2-5 Years
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu
Work from Office
1. Design and build data cleansing and imputation, map to a standard data model, transform to satisfy business rules and statistical computations, and validate data content.
2. Develop, modify, and maintain Python and Unix scripts and complex SQL.
3. Tune the performance of existing code, avoiding bottlenecks and improving performance.
4. Build an end-to-end data flow from sources to entirely curated and enhanced data sets.
5. Develop automated Python jobs for ingesting data from various source systems (see the illustrative sketch below).
6. Provide technical expertise in areas of architecture, design, and implementation.
7. Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team.
8. Write SQL queries for data validation.
9. Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse.
10. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality.
11. Optimize and tune ETL processes for performance and scalability.
12. Develop and maintain documentation for ETL processes, data flows, and data mappings.
13. Monitor and troubleshoot ETL processes to ensure data accuracy and availability.
14. Implement data validation and error handling mechanisms.
15. Work with large data sets and ensure data integrity and consistency.

Skills: Python; ETL tools like Informatica, Talend, SSIS, or similar; SQL and MySQL, with expertise in Oracle, SQL Server, and Teradata; DevOps and GitLab; experience in AWS Glue or Azure Data Factory.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
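A stripped-down version of the "automated Python job for ingesting data" described above might look like the following. This is a sketch only: SQLite from the standard library stands in for the real targets (Oracle, SQL Server, Teradata) purely so the example is self-contained, and the feed, table, and cleansing rule are invented.

```python
# Illustrative mini ETL: extract a CSV feed -> transform -> load into a SQL table.
# SQLite stands in for the real warehouse so the sketch runs anywhere.
import csv
import io
import sqlite3

SOURCE_CSV = io.StringIO("id,amount\n1,100.5\n2,\n3,42.0\n")  # stand-in source feed

def extract(fh):
    return list(csv.DictReader(fh))

def transform(rows):
    # Cleansing rule: drop rows with missing amounts and cast types.
    return [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(records, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)
    conn.commit()

with sqlite3.connect(":memory:") as conn:
    load(transform(extract(SOURCE_CSV)), conn)
    # Data validation query of the kind mentioned in the posting.
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
```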
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka
Work from Office
1. Design and build data cleansing and imputation, map to a standard data model, transform to satisfy business rules and statistical computations, and validate data content.
2. Develop, modify, and maintain Python and Unix scripts and complex SQL.
3. Tune the performance of existing code, avoiding bottlenecks and improving performance.
4. Build an end-to-end data flow from sources to entirely curated and enhanced data sets.
5. Develop automated Python jobs for ingesting data from various source systems.
6. Provide technical expertise in areas of architecture, design, and implementation.
7. Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team.
8. Write SQL queries for data validation.
9. Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse.
10. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality.
11. Optimize and tune ETL processes for performance and scalability.
12. Develop and maintain documentation for ETL processes, data flows, and data mappings.
13. Monitor and troubleshoot ETL processes to ensure data accuracy and availability.
14. Implement data validation and error handling mechanisms.
15. Work with large data sets and ensure data integrity and consistency.

Skills: Python; ETL tools like Informatica, Talend, SSIS, or similar; SQL and MySQL, with expertise in Oracle, SQL Server, and Teradata; DevOps and GitLab; experience in AWS Glue or Azure Data Factory.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 month ago
5 - 8 years
0 Lacs
Hyderabad, Telangana, India
Hybrid
Our company: At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do: We are looking for a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate should be adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with: On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 4+ years of relevant experience, preferably in R&D-based teams
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation)
- Hands-on experience with Java, JSP, and related areas
- Proficiency in Docker and Unix or Linux platforms
- Experience with Spring Framework or Spring Boot is a plus
- Expertise in designing and deploying scalable solutions in public cloud environments
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies
- Familiarity with software configuration management tools, defect tracking tools, and peer review tools
- Excellent debugging skills to troubleshoot and resolve issues effectively
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline
- Practical experience in development and support structures
- Knowledge of cloud environments, particularly AWS
- Proficiency in SQL

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.
Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 1 month ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
Hybrid
When you join Verizon

You want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love: driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You'll Be Doing

As part of DMBM-BI, you will be helping to create and deliver a comprehensive measurement and reporting approach for all of Verizon Consumer Groups. In this role, you will interact with cross-functional teams working throughout Verizon bringing new experiences to life for our Customers. You will help in measurement and reporting for cross-functional teams as they plan, build, and launch world-class experiences. You will help translate raw data into actionable insights and better experiences for our customers. Your deep knowledge of measurement solutions will help to determine the best approaches for implementations that best meet business needs.

- Working closely with the NBx/Pega Business teams to deliver reporting stories each release and, where required, build new dashboards in Tableau or Qlik Sense
- Contributing to requirement sessions with key stakeholders and actively participating in grooming sessions with business teams
- Defining new metrics and business KPIs
- Creating wireframes and mockups of reporting dashboards
- Documenting all validated standards and processes to ensure accuracy across the enterprise
- Collaborating with cross-functional teams to resolve NBx proposition anomalies and actively contributing to production defect resolutions

What we're looking for

You are a strong collaborator who can effectively own and prioritize multiple work streams and adapt during sometimes pressured situations. You display initiative and resourcefulness in achieving goals but are comfortable brainstorming and sharing ideas in a team environment. You will have excellent communication skills and the ability to speak effectively to internal and external stakeholders. You can partner across multiple business and technology teams. You should have strong Business Intelligence and analytics experience in the CX (Customer Experience) area/root cause analytics with attention to detail, be adaptable to change and tight deadlines, and be focused on quality. Ability to mine, extract, transform, and load large data sets, and create concise readouts and analyses based on the actionable insights found in the data.

- Bachelor's degree and six or more years of work experience
- Six or more years of relevant work experience
- Experience with SQL and SQL performance tuning
- Experience with Tableau and Qlik Sense
- Experience with data modeling for different data sources in Tableau or Qlik Sense
- Knowledge of Google Suite and database management systems
- Experience with dashboard creation with insightful visualization
- Knowledge of OneJira or any ticketing tool
Even better if you have one or more of the following:
- Experience with third-party reporting tools (e.g., ThoughtSpot, IBM Cognos, Looker)
- Exposure to HiveQL, GCP BigQuery, Teradata, and Oracle databases
- Basic knowledge of programming languages (e.g., VBA/Python)
- Ability to derive insights from data and recommend action
- Knowledge of the end-to-end ETL process

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above.

Where you'll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Locations
Chennai, India
Hyderabad, India
Posted 1 month ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Dear Tech Professional,

Greetings from Tata Consultancy Services. TCS has always been in the spotlight for being adept in "the next big technologies". What we can offer you is a space to explore varied technologies and quench your techie soul.

What we are looking for: AWS Snowflake Developer
Years of experience: 8-13 years
Job location: Chennai

Job description:
• Experience with the SQL language and cloud-based technologies
• Data warehousing concepts, data modelling, metadata management
• Data lakes, multi-dimensional models, data dictionaries
• Migration to the AWS Snowflake platform
• Performance tuning and setting up resource monitors
• Snowflake modelling – roles, databases, schemas (hands-on, must have)
• SQL performance measuring, query tuning, and database tuning
• ETL tools with cloud-driven skills
• Integration with third-party tools
• Ability to build analytical solutions and models
• Coding in languages such as Python
• Root cause analysis of models with solutions
• Hadoop, Spark, and other warehousing tools
• Managing sets of XML, JSON, and CSV from disparate sources
• SQL-based databases like Oracle, SQL Server, Teradata, etc.
• Snowflake warehousing, architecture, processing, administration (must have)
• Data ingestion into Snowflake (must have) – see the illustrative sketch below
• Enterprise-level technical exposure to Snowflake applications

Regards,
Prashaanthini
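Data ingestion into Snowflake of the kind listed above is typically a stage plus a COPY INTO statement. The sketch below shows that flow via the snowflake-connector-python package; it is illustrative only, and the account locator, credentials, warehouse, stage, and table names are all placeholders.

```python
# Illustrative Snowflake ingestion: load staged CSV files into a raw table.
# All identifiers and credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account locator
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls CSV files from an external stage into the raw table;
    # ON_ERROR controls how bad rows are handled.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
    print(cur.fetchall())   # per-file load results (rows loaded, errors seen)
finally:
    conn.close()
```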
Posted 1 month ago
2 - 5 years
0 Lacs
Chennai, Tamil Nadu, India
When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing…

As part of DMBM-BI, you will be helping to create and deliver a comprehensive measurement and reporting approach for all of Verizon Consumer Groups. In this role, you will interact with cross-functional teams working throughout Verizon, bringing new experiences to life for our customers. You will help in measurement and reporting for cross-functional teams as they plan, build, and launch world-class experiences. You will help translate raw data into actionable insights and better experiences for our customers. Your deep knowledge of measurement solutions will help to determine the best approaches for implementations that best meet business needs.

- Working closely with the NBx/Pega business teams to deliver reporting stories each release and, where required, build new dashboards in Tableau or Qlik Sense
- Contributing to requirement sessions with key stakeholders and actively participating in grooming sessions with business teams
- Defining new metrics and business KPIs
- Creating wireframes and mockups of reporting dashboards
- Documenting all validated standards and processes to ensure accuracy across the enterprise
- Collaborating with cross-functional teams to resolve NBx proposition anomalies and actively contributing to production defect resolutions

What We’re Looking For…

You are a strong collaborator who can effectively own and prioritize multiple work streams and adapt during sometimes pressured situations. You display initiative and resourcefulness in achieving goals but are comfortable brainstorming and sharing ideas in a team environment. You will have excellent communication skills and the ability to speak effectively to internal and external stakeholders. You can partner across multiple business and technology teams. You should have strong Business Intelligence and analytics experience in the CX (Customer Experience) area / root cause analytics with attention to detail, be adaptable to change and tight deadlines, and be focused on quality. You can mine, extract, transform, and load large data sets, and create concise readouts and analyses based on the actionable insights found in the data.

You’ll need to have:
- Bachelor’s degree and six or more years of work experience
- Six or more years of relevant work experience
- Experience with SQL and SQL performance tuning
- Experience with Tableau and Qlik Sense
- Experience with data modeling for different data sources in Tableau or Qlik Sense
- Knowledge of Google Suite and database management systems
- Experience with dashboard creation with insightful visualization
- Knowledge of OneJira or any ticketing tool

Even better if you have one or more of the following:
- Experience with third-party reporting tools (e.g., ThoughtSpot, IBM Cognos, Looker tools)
- Exposure to HiveQL, GCP BigQuery, Teradata, and Oracle databases
- Basic knowledge of programming languages (e.g., VBA/Python)
- Ability to derive insights from data and recommend action
- Knowledge of the end-to-end ETL process

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.

Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
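SQL performance tuning appears again in this listing. As a small, generic illustration only (the tables and columns are invented, the rewrite assumes store_dim has one row per store_id, and whether it actually helps depends on the engine and its explain plan), one classic tuning pattern is aggregating a large fact table before joining it to a dimension:

```python
import time

# Two functionally equivalent queries over hypothetical tables. The first
# joins the raw fact table to the dimension and then aggregates; the second
# aggregates the fact table first, so far fewer rows reach the join.
NAIVE_QUERY = """
    SELECT d.region, SUM(f.amount) AS total_amount
    FROM   sales_fact f
    JOIN   store_dim  d ON d.store_id = f.store_id
    GROUP  BY d.region
"""

PRE_AGGREGATED_QUERY = """
    SELECT d.region, SUM(s.store_amount) AS total_amount
    FROM  (SELECT store_id, SUM(amount) AS store_amount
           FROM   sales_fact
           GROUP  BY store_id) s
    JOIN   store_dim d ON d.store_id = s.store_id
    GROUP  BY d.region
"""


def time_query(cursor, sql: str) -> float:
    """Run a query through any DB-API cursor and return elapsed seconds."""
    start = time.perf_counter()
    cursor.execute(sql)
    cursor.fetchall()
    return time.perf_counter() - start
```

The helper accepts any DB-API cursor, so it could be pointed at whichever database driver is in use to compare the two variants before and after the rewrite.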
Posted 1 month ago
0 - 2 years
0 Lacs
Chennai, Tamil Nadu, India
When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What you’ll be doing…

You’ll be ensuring that analytics and AI/ML models drive continuous business improvements, bringing more profitable growth and an even better customer experience. You’ll execute the analytics strategy and modeling strategy and champion that strategy across the business, in order to create buy-in for data-driven decision making. Your end goal is to recommend improvements that will go straight to the bottom line, by helping us serve our customers.

- Building predictive models using AI and ML algorithms
- Building AI-driven audience strategies for marketing campaigns and offers
- Creating and automating sales, service, network and marketing intervention streams driven by model outputs to drive growth, mitigate churn and improve CX
- Building dashboards to measure the effectiveness of the intervention programs

What we’re looking for...

You are a master at analyzing big data. You thrive in an environment where enormous volumes of data are generated at rapid speed. You’re a creative thinker who likes to explore and uncover the issues. You are decisive. Communicating what you’ve uncovered in a way that can be easily understood by others is one of your strengths. You are great at influencing up, down, and across groups.

You’ll Need To Have
- Bachelor’s degree and one or more years of relevant work experience
- Experience in applying statistical ideas and methods to data sets to answer business problems
- Experience in building predictive models using machine learning algorithms
- Experience with visual science and dashboard design principles

Even Better If You Have
- Degree in mathematics, statistics, physics, engineering, computer science, or economics
- Expertise with Tableau or similar visual analysis tools, optimization, analytics and large data sets, developing visually compelling interactive dashboards
- Strong understanding of database concepts (Oracle, MS SQL, generic SQL, etc.)
- Strong understanding of visualization tools like Tableau
- Strong understanding of data warehouse and data lake architecture (GCP, Teradata, Hadoop)
- Strong understanding of ML tools including R, Python, SAS, SPSS, DataRobot, H2O
- Strong understanding of AI and Gen AI tools
- Good understanding of third-party analytic tools
- Experience with general-purpose programming languages (Java, .Net, Python, Perl, etc.)
- Experience with shell scripting tools in Windows, Linux/Unix is a plus
- Experience with data aggregating tools such as SPLUNK is a plus

Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
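This role centers on predictive models for growth and churn mitigation. As a purely illustrative sketch (the data file and feature names are hypothetical, and real work here would involve far more feature engineering, validation, and monitoring), a baseline churn classifier in Python/scikit-learn could be set up like this:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical training extract: one row per customer with a churn label.
df = pd.read_csv("customer_features.csv")
features = ["tenure_months", "monthly_spend", "support_calls_90d",
            "data_usage_gb", "plan_changes_12m"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42,
    stratify=df["churned"])

model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)

# Score the holdout set; AUC is a common first check for churn models.
scores = model.predict_proba(X_test)[:, 1]
print("holdout AUC:", roc_auc_score(y_test, scores))

# Scores like these would then feed the intervention streams and
# effectiveness dashboards described in the posting.
```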
Posted 1 month ago
2 - 5 years
0 Lacs
Pune, Maharashtra, India
Join us as a Test Automation Engineer at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Test Automation Engineer you should have experience in:
- Programming Languages: Java and Python used in test automation; basic Unix knowledge
- Experience with Service Virtualization will be an added advantage
- Hands-on Automation Frameworks: Selenium WebDriver, TestNG, Cucumber
- Test Automation Design Patterns: Page Object Model, Factory Pattern (see the sketch after this listing)
- CI/CD Tools: GitLab, Jenkins
- Version Control Systems: GitLab, Bitbucket
- API Testing: Rest Assured, Insomnia; JSON and XML [Karate to be validated]
- Agile Methodologies: knowledge of Kanban/Scrum practices, JIRA for tracking

Some other highly valued skills may include:
- Database Testing: SQL, Oracle, MongoDB, Teradata database validation
- Cloud Technologies: familiarity with AWS cloud platforms for testing

You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role

To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.

Accountabilities
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for end results of a team’s operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
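As referenced in the listing above, Selenium WebDriver and the Page Object Model are among the core skills for this role. A minimal, hypothetical Page Object in Python is sketched below; the URL and locators are invented for illustration, and the posting's stack also includes Java-based tooling:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


class LoginPage:
    """Page Object wrapping a hypothetical login screen."""

    URL = "https://example.test/login"  # placeholder URL
    USERNAME = (By.ID, "username")      # locators are assumptions
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, user: str, password: str):
        wait = WebDriverWait(self.driver, 10)
        wait.until(EC.visibility_of_element_located(self.USERNAME)).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
        return self


if __name__ == "__main__":
    driver = webdriver.Chrome()  # assumes a local Chrome/ChromeDriver setup
    try:
        LoginPage(driver).open().login("qa_user", "secret")
    finally:
        driver.quit()
```

Keeping locators and interactions inside the page class is the point of the pattern: tests read as business steps, and a UI change touches one class rather than every test.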
Posted 1 month ago
0 - 2 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Test Automation Engineer at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Test Automation Engineer you should have experience in:
- Programming Languages: Java and Python used in test automation; basic Unix knowledge
- Experience with Service Virtualization will be an added advantage
- Hands-on Automation Frameworks: Selenium WebDriver, TestNG, Cucumber
- Test Automation Design Patterns: Page Object Model, Factory Pattern
- CI/CD Tools: GitLab, Jenkins
- Version Control Systems: GitLab, Bitbucket
- API Testing: Rest Assured, Insomnia; JSON and XML [Karate to be validated] (a Python-based sketch follows this listing)
- Agile Methodologies: knowledge of Kanban/Scrum practices, JIRA for tracking

Some other highly valued skills may include:
- Database Testing: SQL, Oracle, MongoDB, Teradata database validation
- Cloud Technologies: familiarity with AWS cloud platforms for testing

You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role

To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.

Accountabilities
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for end results of a team’s operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
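API testing also features in this listing. The named tools (Rest Assured, Karate) are Java-based, so purely as an analogous illustration in Python, a small pytest-style contract check against a hypothetical JSON endpoint could look like this:

```python
import requests

BASE_URL = "https://api.example.test"  # placeholder endpoint for illustration


def test_get_account_returns_expected_fields():
    # A typical contract-style assertion: status code, content type,
    # and a handful of required JSON fields.
    resp = requests.get(f"{BASE_URL}/accounts/12345", timeout=10)

    assert resp.status_code == 200
    assert resp.headers["Content-Type"].startswith("application/json")

    body = resp.json()
    for field in ("accountId", "status", "currency"):
        assert field in body, f"missing field: {field}"
```

Run with `pytest`; in a CI/CD pipeline such checks would typically execute against a virtualized service rather than a live backend.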
Posted 1 month ago
2 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
Employment Type: Permanent
Closing Date: 11 May 2025 11:59pm
Job Title: Software Engineer - Informatica

Job Summary

As a Software Engineer, you thrive on working with your team to design, build and deliver innovative software products and solutions that delight our customers. You apply broad knowledge in software application layer solutions and the software development lifecycle to experiment, solve problems and own solutions that transform epics into new product features and capabilities. Your continuous learning and improvement mindset and collaboration skills are critical to success in this role, as you continue to deepen your knowledge and expertise in the Software Engineering domain.

Job Description

Who We Are

We're an iconic Aussie brand with a global footprint. Our purpose is to build a connected future so everyone can thrive. We're all about providing the best experience and delivering the best tech on the best network. This includes making Telstra the place you want to work. For you, that means having a career that grows with you and working with a team powered by human connection that prioritizes wellbeing and choice.

Focus of the Role

As a Senior Software Engineer, you excel at understanding and translating customer needs into innovative products and capabilities. You leverage your deep technical expertise and experience in software application layer solutions to develop and deliver scalable design blueprints throughout the entire software development life cycle. Your commitment to continuous learning and improvement, along with your collaboration and influencing skills, are essential for success in this role.

Responsibilities

As a Senior Software Engineer, you will utilize your extensive experience and technical knowledge in the Software Engineering domain to:
- Design, develop, and maintain ETL workflows using Informatica PowerCenter and/or Informatica Cloud (IICS) to support enterprise data integration needs.
- Lead the architecture and design of data integration solutions using Informatica PowerCenter, IICS, and other Informatica suite tools.
- Collaborate with business analysts and data architects to understand requirements and translate them into scalable ETL solutions.
- Perform data extraction, transformation, and loading from various sources including flat files, databases, and cloud platforms.
- Optimize ETL processes for performance, reliability, and scalability.
- Write and optimize complex SQL queries for data transformation, validation, and troubleshooting.
- Perform unit testing and ensure data quality, accuracy, and integrity throughout the ETL lifecycle.
- Troubleshoot and resolve data-related issues and support production workflows as required.
- Document technical specifications and maintain code version control.
- Work in an Agile/Scrum environment and actively participate in sprint planning and reviews.
- Provide technical leadership across multiple projects and guide development teams on best practices and standards.
- Collaborate with business stakeholders, solution architects, and analysts to design robust and future-ready data solutions.
- Evaluate emerging technologies and propose improvements to the current data integration landscape.
- Ensure high availability and performance of Informatica environments; work closely with infrastructure and platform teams.
- Design and implement data governance, metadata management, and data quality frameworks.

Essential Skills
- ETL Tools: Informatica PowerCenter, Informatica Cloud (IICS)
- Databases: Oracle, SQL Server, Snowflake, MySQL, Teradata
- Languages: SQL, PL/SQL, Shell Scripting
- Performance Tuning: session/workflow optimization, pushdown optimization
- Data Integration: batch and real-time data processing, data migration
- Version Control: Git, SVN
- Testing & Debugging: unit testing, data validation, error handling
- Documentation: technical specs, mapping documents, job scheduling

We're amongst the top 2% of companies globally in the CDP Global Climate Change Index 2023, being awarded an A rating. If you want to work for a company that cares about sustainability, we want to hear from you. As part of your application with Telstra, you may receive communications from us on +61 440 135 548 (for job applications in Australia) and +1 (623) 400-7726 (for job applications in the Philippines and India).

When you join our team, you become part of a welcoming and inclusive community where everyone is respected, valued and celebrated. We actively seek individuals from various backgrounds, ethnicities, genders and abilities because we know that diversity not only strengthens our team but also enriches our work. We have zero tolerance for harassment of any kind, and we prioritize creating a workplace culture where everyone is safe and can thrive. As part of the hiring process, all identified candidates will undergo a background check, and the results will play a role in the final decision regarding your application.
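Informatica itself is configured through its own tooling rather than hand-written code, but the role also calls for SQL-based data validation and troubleshooting around ETL loads. As a hedged, generic sketch (the tables, connections, and failure handling are assumptions, and any DB-API driver for Oracle, SQL Server, Snowflake, or Teradata would slot in the same way), a simple source-to-target row-count reconciliation might look like this:

```python
def reconcile_row_counts(source_cur, target_cur,
                         source_table: str, target_table: str) -> bool:
    """Compare row counts between a source and a target table using any
    two DB-API cursors. Returns True when the counts match exactly."""
    source_cur.execute(f"SELECT COUNT(*) FROM {source_table}")
    target_cur.execute(f"SELECT COUNT(*) FROM {target_table}")

    source_count = source_cur.fetchone()[0]
    target_count = target_cur.fetchone()[0]

    if source_count != target_count:
        print(f"MISMATCH: {source_table}={source_count}, "
              f"{target_table}={target_count}")
        return False

    print(f"OK: {source_count} rows in both {source_table} and {target_table}")
    return True

# Usage (illustrative): pass cursors from the relevant database drivers,
# then fail the workflow or raise an alert when this returns False.
```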
Posted 1 month ago
6 - 8 years
8 - 10 Lacs
Hyderabad
Work from Office
Position Overview

The job profile for this position is Software Engineering Lead Analyst, which is a Band 3 Contributor Career Track role. Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think this open position is right for you, we encourage you to apply! Our people make all the difference in our success.

We are looking for exceptional software engineers/developers in our PBM Technology organization. The Ab Initio ETL Developer is responsible for utilizing agile development methodologies to analyze, develop, and process medical claims data. This role is expected to understand user stories, identify appropriate designs, code to the design specifications, and complete appropriate unit testing to ensure quality solution delivery. In this role you are expected to work closely with developers, technical project managers, principal engineers and business stakeholders to ensure that application solutions meet business/customer requirements.

Responsibilities
- Provide expertise in technical analysis and solving issues during project delivery.
- Perform unit testing and debugging; set test conditions based upon code specifications.
- Prepare for code reviews and test case reviews, and ensure code developed meets the requirements.
- Review code developed by other IT developers.
- Ensure quality, standards, and version control by properly following the process.
- Work proactively and independently to address project requirements, and articulate issues/challenges at the appropriate time to address project delivery risks.
- Be hands-on in the design and development of robust solutions to hard problems, while considering scale, security, reliability, and cost.
- Support other product delivery partners in the successful build, test, and release of solutions.
- Work with distributed requirements and technical stakeholders to complete shared design and development.
- Support the full software lifecycle of design, development, testing, and support for technical delivery.
- Work with both onsite (Scrum Master, Product, QA and Developers) and offshore QA team members in properly defining testable scenarios based on requirements/acceptance criteria.
- Participate in daily team standup meetings where you'll give and receive updates on the current backlog and challenges.
- Participate in code reviews; ensure code quality and deliverables.
- Maximize the efficiency (operational, performance, and cost) of the application assets.

Qualifications

Required Skills:
- Strong knowledge and hands-on experience in SQL, Unix shell scripting, and Oracle and Teradata databases
- Mainframe experience a plus
- Knowledge of medical claims data and processing (claims adjustments, ICD codes, procedures, revenues, etc.)
- Excellent analytical and interpersonal skills along with excellent oral and written communication skills

Required Experience & Education:
- 6-8 years of overall professional experience
- 3+ years of experience in Ab Initio ETL
- Experience with vendor management in an onshore/offshore model
- Proven experience with architecture, design, and development of large-scale enterprise application solutions
- Master's degree or foreign degree equivalent in Information Technology, Business Information Systems, Technology Management, or a related field of study
- Industry certifications such as PMP, Scrum Master, or Six Sigma Green Belt are a plus

Desired Experience:
- Healthcare experience including Disease Management
- Coaching of team members
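The role stresses unit testing of ETL logic around medical claims data. Purely as an illustration (the adjustment rule, field names, and helper function below are invented for the example, not taken from the posting or from Ab Initio), a pytest-style unit test of a small claim-transformation helper might look like this:

```python
# Hypothetical transformation: apply an adjustment factor to a claim's
# billed amount and flag claims with unknown ICD codes for review.
KNOWN_ICD_CODES = {"E11.9", "I10", "J45.909"}  # illustrative subset


def transform_claim(claim: dict, adjustment_factor: float = 1.0) -> dict:
    adjusted = round(claim["billed_amount"] * adjustment_factor, 2)
    return {
        **claim,
        "adjusted_amount": adjusted,
        "needs_review": claim["icd_code"] not in KNOWN_ICD_CODES,
    }


def test_transform_claim_applies_adjustment_and_flags_unknown_codes():
    claim = {"claim_id": "C-001", "billed_amount": 200.0, "icd_code": "Z99.9"}

    result = transform_claim(claim, adjustment_factor=0.85)

    assert result["adjusted_amount"] == 170.0     # 200.0 * 0.85
    assert result["needs_review"] is True         # Z99.9 not in the known set
    assert result["claim_id"] == "C-001"          # untouched fields pass through
```

Setting explicit test conditions against the code specification, as in the assertions above, is the pattern the posting's "unit testing and debugging" responsibility describes.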
Posted 1 month ago
5 - 8 years
0 Lacs
Navi Mumbai, Maharashtra, India
Hybrid
Our Company

At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do
- Be an individual contributor and part of our Data Science practice in GSIH India.
- Be proactive, an SME in your skills, and a great team member, specializing in AI/ML.
- Work in close coordination with Data Science teams in other GSIH centers as well as international Data Science teams to serve global customers.
- Work with other GSIH and Teradata teams to produce packaged solutions, in addition to supporting Data Science projects and PoCs/PoVs.
- Work in a team to define and execute data science solutions that address client use cases and business requirements.
- Define activities, scope, and timelines on data science projects.
- Deliver projects following our internal frameworks and best practices.
- Discover, interpret and document unique insights in large-scale distributed datasets through exploratory analysis and the application of advanced analytical methodologies.
- Create statistical, machine learning and deep learning models in multiple technologies using best-in-class data science approaches.
- Prepare and conduct presentations and client workshops, communicating past experiences in sales and speaking opportunities.
- Guide junior members of the delivery team during projects.
- Teradata Vantage is a skill that needs to be acquired after joining Teradata; the candidate will work on Teradata ClearScape Analytics offerings.

Who You’ll Work With
- Work in close coordination with Data Science teams in other GSIH centers as well as international Data Science teams to serve global customers.
- Work with other GSIH and Teradata teams to produce packaged solutions and AaaS offerings, in addition to supporting analytics projects and PoCs/PoVs.
- Work as an individual contributor to define and execute Advanced Analytics solutions that address client use cases and business requirements.
- Define activities, scope, and timelines for Advanced Analytics projects.
- Deliver projects following our internal frameworks and best practices.
- Be an AI/ML expert advising customers on defining the analytics roadmap, ecosystem, and architecture for complex enterprise systems.
- Be experienced in implementing AI/ML solutions in an Agile environment.
- Be a quick learner with the desire to improve skill sets.
- As this is a project-oriented position in either onsite or offsite situations, take accountability for managing client expectations while delivering the services and solutions associated with the Teradata database.
- This position will combine direct client consulting engagement activities, new solution development and analytics positioning responsibilities.

What Makes You a Qualified Candidate
- Bachelor’s, Master’s or PhD degree in a related discipline (Mathematics, Statistics, Computer Science, or Data Science)
- 8 years of related work and/or research experience in quantitative roles
- In-depth knowledge in at least 2 of the following data science domains (an illustrative Text Mining sketch follows this listing): Text Mining / NLP; Graph and Network Analysis; Deep Learning / Gen AI; Geospatial Analysis; Signal Processing, Image, Video; Predictive Modeling and Recommender Systems
- Extensive knowledge of at least one open-source scientific language such as Python
- Knowledge of working with AI/ML modules of cloud providers is a plus
- Fluency in SQL and good knowledge of relational databases
- Passionate about asking and answering questions in large, distributed datasets
- Experience with at least one general-purpose, high-level programming language such as C/C++, Java, PHP, or Python

What You’ll Bring
- Experience implementing end-to-end, large-scale Machine Learning/AI projects
- Experience in Python and SQL
- Clear and confident communication with excellent presentation skills and the ability to translate technology concepts into concise, business-focused messages in both technical and sales presentations
- Strong interpersonal and communication skills
- Proficiency in both written and spoken business English
- Proven skills in creative problem-solving of complex and advanced technical subject matter
- A solid understanding of the value of data and how technology enables companies to compete better in the marketplace
- Ability to consult in a client-facing environment, linking analytical solutions to the value and competitive advantage that technology can help a client deliver
- Prior experience of working in a multi-country organization
- Ability to collaborate with an interdisciplinary team to solve problems
- A self-starter attitude, intellectual curiosity and a passion for analytics and solving real-world problems
- Willingness to travel up to 40-70%

Why We Think You’ll Love Teradata

We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer.
We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
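Text Mining / NLP is one of the data science domains the qualifications above list. A deliberately small scikit-learn baseline is sketched below for illustration; the example sentences and labels are invented, and production work on Teradata Vantage / ClearScape Analytics would typically use the platform's in-database analytics rather than this toy pipeline:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented corpus: customer comments labelled as complaint (1) or not (0).
texts = [
    "my bill is wrong again and support never answers",
    "great coverage on my last trip, very happy",
    "the app keeps crashing when I try to pay",
    "upgrade was smooth and the new plan is cheaper",
]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a linear classifier: a common text-mining baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Should lean toward the complaint class (1) given the overlap with complaints.
print(model.predict(["billing error and no response from support"]))
```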
Posted 1 month ago
2 - 7 years
5 - 10 Lacs
Bengaluru
Work from Office
Provide input into the design and development of systems and applications to meet technical/business requirements in accordance with ANZ's technical standards. Produce re-usable data assets that stakeholders can leverage to derive value through identification of key trends, insights, and reporting. Provide quality assurance by following technical governance and quality control standards. Contribute to the management of regular jobs to ensure all data assets are produced according to their schedule and any issues are promptly identified, whilst contributing to continuous improvement for quality control. Provide timely operational support for specific systems, which may include after-hours pager support for production systems using the Bank's standard change and problem management tools. Ensure all services comply with ANZ Group, Information & Insights strategy, policies, processes, and standards and with external regulatory requirements. Provide guidance and support to peers through code reviews.

What will you bring?

The must-have knowledge, skills, and experience (KSE) the role requires are:
- Strong interpersonal skills with the ability to consult and influence key stakeholders effectively.
- Excellent communication skills, including written, verbal, and presentation capabilities.
- Proven problem-solving abilities, with a track record of working independently and collaboratively in team environments.
- Demonstrated ability to set clear, achievable goals, manage priorities, and consistently meet deadlines.
- Ability to constructively challenge and improve existing processes and procedures.
- Hands-on experience with database platforms such as Teradata, Oracle, or Microsoft SQL Server.
- Strong programming expertise with Python and Apache Spark.
- Mandatory experience with Apache Airflow for workflow orchestration.

The good-to-have knowledge, skills, and experience (KSE) the role requires are:
- Working knowledge and experience with Docker for containerization.
- Experience working with Astronomer for Airflow.
- Experience delivering work and leading others in an Agile context.
- Experience delivering data outcomes that drive decision making in a financial services or banking organisation.
- Experience with APIs, containerisation, automation and CI/CD is highly advantageous.
- Previous data warehousing and ELT experience across a broad technology stack.
- Apache Airflow Certification from Astronomer is an added advantage.
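Apache Airflow experience is mandatory for this role. A minimal DAG is sketched below purely for illustration; it assumes Airflow 2.x, and the DAG name, task names, and extract/load callables are invented placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: in practice this would pull from Teradata/Oracle/SQL Server.
    print("extracting source data")


def load():
    # Placeholder: in practice this would publish the curated data asset.
    print("loading curated data asset")


with DAG(
    dag_id="daily_data_asset_refresh",   # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # simple linear dependency
```

Scheduling, retries, and alerting on such DAGs is what keeps the "regular jobs" described above producing data assets on time.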
Posted 1 month ago
Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.
Cities such as Bengaluru, Chennai, Hyderabad, Pune, and Navi Mumbai, which feature throughout the listings above, are known for their thriving tech industries and have a high demand for Teradata professionals.
The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.
In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.
In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.
As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!
Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.