2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred technical and professional experience:
Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks.
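As an illustration of the Snowflake validation and tuning work this posting describes, here is a minimal, hedged sketch using the snowflake-connector-python package; the connection parameters, table and column names are all hypothetical, not taken from the posting.

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical connection details; real values would come from a secrets manager.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Basic completeness check on a freshly loaded staging table (names are illustrative).
        cur.execute("""
            SELECT COUNT(*) AS row_count,
                   COUNT_IF(customer_id IS NULL) AS missing_customer_ids
            FROM   stg_orders
        """)
        row_count, missing = cur.fetchone()
        print(f"loaded {row_count} rows, {missing} missing customer_id values")
    finally:
        conn.close()

In practice a check like this would usually sit inside the Talend job or an orchestration step rather than run as a standalone script.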
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Quality Lead
Department: Data Governance / IT
Location: Pune / Bangalore
Experience: 6-8 yrs
Notice period: 30 days

Key Responsibilities:
Lead the development and implementation of enterprise-wide Data Quality.
Define and monitor key data quality metrics across various business domains.
Collaborate with IT and data governance teams to establish and enforce data governance policies and frameworks.
Conduct regular data quality assessments to identify gaps and areas for improvement.
Implement data cleansing, validation, and enrichment processes to enhance data accuracy and reliability.

Preferred Skills:
Experience with tools like Informatica, Talend, Collibra, or similar.
Familiarity with regulatory requirements.
Certification in Data Management or Data Governance.
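As a hedged illustration of the data quality metrics such a role would define, the sketch below computes completeness and duplication rates over a pandas DataFrame; the file name, business key and columns are illustrative assumptions.

    import pandas as pd

    # Illustrative customer extract; a real assessment would read from the warehouse.
    df = pd.read_csv("customers.csv")

    metrics = {
        # Share of non-null values per column (completeness).
        "completeness": (1 - df.isna().mean()).round(3).to_dict(),
        # Proportion of duplicate rows on the business key (uniqueness).
        "duplicate_rate": round(df.duplicated(subset=["customer_id"]).mean(), 3),
        "row_count": len(df),
    }
    print(metrics)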
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In This Role, Your Responsibilities May Include
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Preferred Education
Master's Degree

Required Technical And Professional Expertise
Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred Technical And Professional Experience
Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role
The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology/MCA
5 to 8 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Basics of job schedulers like Autosys. Basics of entitlement management
Certification on any of the above topics would be an advantage.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: ------------------------------------------------------ Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting.
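Since the role above centres on building data pipelines with platforms such as Apache Spark, here is a minimal, hedged PySpark sketch of the kind of ingest-transform-write step it describes; the file paths, schema and column names are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Ingest a raw CSV feed (path and columns are illustrative).
    raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

    # Basic cleansing and typing.
    clean = (raw
             .dropDuplicates(["order_id"])
             .withColumn("order_ts", F.to_timestamp("order_ts"))
             .filter(F.col("amount").cast("double") > 0))

    # Analytical aggregate for downstream consumers.
    daily = (clean.groupBy(F.to_date("order_ts").alias("order_date"))
                  .agg(F.sum("amount").alias("total_amount"),
                       F.count(F.lit(1)).alias("order_count")))

    # Persist as Parquet for analytical consumers.
    daily.write.mode("overwrite").parquet("/data/curated/daily_orders")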
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role
The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology/MCA
3 to 4 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines. Proficiency in at least one of the data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Exposure to 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Basics of job schedulers like Autosys. Basics of entitlement management
Certification on any of the above topics would be an advantage.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting.
Posted 1 week ago
3.0 years
0 Lacs
India
Remote
🚀 We're Hiring: Data Engineer | Join Our Team!
Location: Remote

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

🧠 What You'll Do
🔹 Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, and Azure Data Factory
🔹 Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
🔹 Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

🎯 Tech Stack
☁️ Azure | 🧱 Databricks | 🐍 PySpark | 📊 SQL

👤 What We're Looking For
✅ 3+ years of experience in data engineering or analytics engineering
✅ Hands-on with cloud data platforms and large-scale data processing
✅ Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
Minimum 3 years of experience in modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks etc. Azure experience is preferred over other cloud platforms.
5 years of proven experience with SQL, schema design and dimensional data modelling.
Solid knowledge of data warehouse best practices, development standards and methodologies.
Experience with ETL/ELT tools like ADF, Informatica, Talend etc., and data warehousing technologies like Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery etc.
Strong experience with big data tools (Databricks, Spark etc.) and programming skills in PySpark and Spark SQL.
An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment.
Excellent communication and teamwork abilities.

Nice-to-Have Skills:
Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge.
SAP ECC/S/4 and HANA knowledge.
Intermediate knowledge of Power BI.
Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.
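As a hedged sketch of the lakehouse-style pipeline work mentioned above, the transformation below deduplicates a raw JSON feed and writes it as a partitioned Delta table; the storage paths and schema are hypothetical, and the Delta format is assumed to be available (it is bundled with Databricks runtimes).

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events_lakehouse").getOrCreate()

    # Illustrative raw landing zone in ADLS; the path is hypothetical.
    raw = spark.read.json("abfss://raw@mylake.dfs.core.windows.net/events/")

    bronze_to_silver = (raw
        .dropDuplicates(["event_id"])
        .withColumn("event_date", F.to_date("event_ts"))
        .filter(F.col("event_type").isNotNull()))

    # Write the cleansed layer as a partitioned Delta table.
    (bronze_to_silver.write
        .format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("abfss://silver@mylake.dfs.core.windows.net/events/"))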
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Title: Assistant Manager - Data Engineer Location: Andheri (Mumbai) Job Type: Full-Time Department: IT Position Overview: The Assistant Manager - Data Engineer will play a pivotal role in the design, development, and maintenance of data pipelines that ensure the efficiency, scalability, and reliability of our data infrastructure. This role will involve optimizing and automating ETL/ELT processes, as well as developing and refining databases, data warehouses, and data lakes. As an Assistant Manager, you will also mentor junior engineers and collaborate closely with cross-functional teams to support business goals and drive data excellence. Key Responsibilities: Data Pipeline Development: Design, build, and maintain efficient, scalable, and reliable data pipelines to support data analytics, reporting, and business intelligence initiatives. Database and Data Warehouse Management: Develop, optimize, and manage databases, data warehouses, and data lakes to enhance data accessibility and business decision-making. ETL/ELT Optimization: Automate and optimize data extraction, transformation, and loading (ETL/ELT) processes, ensuring efficient data flow and improved system performance. Data Modeling & Architecture: Develop and maintain data models to enable structured data storage, analysis, and reporting in alignment with business needs. Workflow Management Systems: Implement, optimize, and maintain workflow management tools (e.g., Apache Airflow, Talend) to streamline data engineering tasks and improve operational efficiency. Team Leadership & Mentorship: Guide, mentor, and support junior data engineers to enhance their skills and contribute effectively to projects. Collaboration with Cross-Functional Teams: Work closely with data scientists, analysts, business stakeholders, and IT teams to understand requirements and deliver solutions that align with business objectives. Performance Optimization: Continuously monitor and optimize data pipelines and storage solutions to ensure maximum performance and cost efficiency. Documentation & Process Improvement: Create and maintain documentation for data models, workflows, and systems. Contribute to the continuous improvement of data engineering practices. Qualifications: Educational Background: B.E., B.Tech., MCA Professional Experience: At least 5 to 7 years of experience in a data engineering or similar role, with hands-on experience in building and optimizing data pipelines, ETL processes, and database management. Technical Skills: Proficiency in Python and SQL for data processing, transformation, and querying. Experience with modern data warehousing solutions (e.g., Amazon Redshift, Snowflake, Google BigQuery, Azure Data Lake). Strong background in data modeling (dimensional, relational, star/snowflake schema). Hands-on experience with ETL tools (e.g., Apache Airflow, Talend, Informatica) and workflow management systems . Familiarity with cloud platforms (AWS, Azure, Google Cloud) and distributed data processing frameworks (e.g., Apache Spark). Data Visualization & Exploration: Familiarity with data visualization tools (e.g., Tableau, Power BI) for analysis and reporting. Leadership Skills: Demonstrated ability to manage and mentor a team of junior data engineers while fostering a collaborative and innovative work environment. Problem-Solving & Analytical Skills: Strong analytical and troubleshooting skills with the ability to optimize complex data systems for performance and scalability. 
Experience in Pharma/Healthcare (preferred but not required): Knowledge of the pharmaceutical industry and experience with data in regulated environments.

Desired Skills:
Familiarity with industry-specific data standards and regulations.
Experience working with machine learning models or data science pipelines is a plus.
Strong communication skills with the ability to present technical data to non-technical stakeholders.

Why Join Us:
Impactful Work: Contribute to the pharmaceutical industry by improving data-driven decisions that impact public health.
Career Growth: Opportunities to develop professionally in a fast-growing industry and company.
Collaborative Environment: Work with a dynamic and talented team of engineers, data scientists, and business stakeholders.
Competitive Benefits: Competitive salary, health benefits and more.
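The Assistant Manager posting above calls out workflow management tools such as Apache Airflow for orchestrating ETL tasks. The following is a minimal, hedged DAG sketch; the DAG id, schedule and task logic are illustrative, and the schedule argument assumes Airflow 2.4 or later.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder for pulling a feed file or querying a source system.
        print("extracting source data")

    def transform_and_load():
        # Placeholder for cleansing and loading into the warehouse.
        print("transforming and loading")

    with DAG(
        dag_id="daily_sales_etl",          # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # use schedule_interval on older Airflow versions
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)
        extract_task >> load_task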
Posted 1 week ago
5.0 - 8.0 years
9 - 9 Lacs
Hyderābād
On-site
Title: Data Integration Developer – Assistant Manager
Department: Alpha Data Platform
Reports To: Data Integration Lead, Engineering

Summary: State Street Global Alpha Data Platform lets you load, enrich and aggregate investment data. Alpha clients will be able to manage multi-asset class data from any service provider or data vendor for a more holistic and integrated view of their holdings. This platform reflects State Street's years of experience servicing complex instruments for our global client base and our investments in building advanced data management technologies. Reporting to the Alpha Development delivery manager in <
Posted 1 week ago
4.0 years
6 - 9 Lacs
Hyderābād
On-site
About Citco
Citco is a global leader in fund services, corporate governance and related asset services with staff across 80 offices worldwide. With more than $1.7 trillion in assets under administration, we deliver end-to-end solutions and exceptional service to meet our clients' needs. For more information about Citco, please visit www.citco.com

About the Team & Business Line:
Citco Fund Services is a division of the Citco Group of Companies and is the largest independent administrator of Hedge Funds in the world. Our continuous investment in learning and technology solutions means our people are equipped to deliver a seamless client experience. This position reports into the Loan Services Business Line. As a core member of our Loan Services Data and Reporting team, you will be working with some of the industry's most accomplished professionals to deliver award-winning services for complex fund structures that our clients can depend upon.

Job Duties in Brief – Your Role:
Develop and execute database queries and conduct data analyses
Create scripts to analyze and modify data, import/export scripts and execute stored procedures
Model data by writing SQL queries/Python code to support data integration and dashboard requirements
Develop data pipelines that provide fast, optimized, and robust end-to-end solutions
Leverage and contribute to designing/building relational database schemas for analytics
Handle and manipulate data in various structures and repositories (data cube, data mart, data warehouse, data lake)
Analyze, implement and contribute to building APIs to improve the data integration pipeline
Perform data preparation tasks including data cleaning, normalization, deduplication, type conversion etc.
Perform data integration by extracting, transforming and loading (ETL) data from various sources
Identify opportunities to improve processes and strategies with technology solutions and identify development needs in order to improve and streamline operations
Create tabular reports, matrix reports, parameterized reports, and visual reports/dashboards in a reporting application such as Power BI Desktop/Cloud or QLIK; integrating PBI/QLIK reports into other applications using embedded analytics like Power BI service (SaaS), or by API automation, is also an advantage
Implement NLP techniques for text representation, semantic extraction, data structures and modelling
Contribute to deployment and maintenance of machine learning solutions in production environments
Build and design cloud applications using Microsoft Azure/AWS cloud technologies

About You: Background / Qualifications
Bachelor's Degree in technology/related field or equivalent work experience
4+ years of SQL and/or Python experience is a must
Strong knowledge of data concepts and tools and experience with RDBMS such as MS SQL Server, Oracle etc.
Well-versed with concepts and techniques of Business Intelligence and Data Warehousing
Strong database design and SQL skills
objects development, performance tuning and data analysis In-depth understanding of database management systems, OLAP & ETL frameworks Familiarity or hands on experience working with REST or SOAP APIs Well versed with concepts for API Management and Integration with various data sources in cloud platforms, to help with connecting to traditional SQL and new age data sources, such as Snowflake Familiarity with Machine Learning concepts like feature selection/deep learning/AI and ML/DL frameworks (like Tensorflow or PyTorch) and libraries (like scikit-learn, StatsModels) is an advantage Familiarity with BI technologies (e.g. Microsoft Power BI, Oracle BI) is an advantage Hands-on experience at least in one ETL tool (SSIS, Informatica, Talend, Glue, Azure Data factory) and associated data integration principles is an advantage Minimum 1+ year experience with Cloud platform technologies (AWS/Azure), including Azure Machine Learning is desirable. Following AWS experience is a plus: Implementing identity and access management (IAM) policies Managing user accounts with IAM Knowledge of writing infrastructure as code (IaC) using CloudFormation or Terraform. Implementing cloud storage using Amazon Simple Storage Service (S3) Experience with serverless approaches using AWS Lambda, e.g. AWS (SAM) Configuring Amazon Elastic Compute Cloud (EC2) Instances Previous Work Experience: Experience querying databases and strong programming skills: Python, SQL, PySpark etc. Prior experience supporting ETL production environments & web technologies such as XML is an advatange Previous working experience on Azure Data Services including ADF, ADLS, Blob, Data Bricks, Hive, Python, Spark and/or features of Azure ML Studio, ML Services and ML Ops is an advantage Experience with dashboard and reporting applications like Qlik, Tableau, Power BI Other: Well rounded individual possessing a high degree of initiative Proactive person willing to accept responsibility with very little hand-holding A strong analytical and logical mindset Demonstrated proficiency in interpersonal and communication skills including oral and written English. Ability to work in fast paced, complex Business & IT environments Knowledge of Loan Servicing and/or Loan Administration is an advantage Understanding of Agile/Scrum methodology as it relates to the software development lifecycle What We Offer: A rewarding and challenging environment that spans multiple geographies and multiple business lines Great working environment, competitive salary and benefits, and opportunities for educational support Be part of an industry leading global organisation, renowned for excellence Opportunities for personal and professional career development Our Benefits Your well-being is of paramount importance to us, and central to our success. We provide a range of benefits, training and education support, and flexible working arrangements to help you achieve success in your career while balancing personal needs. Ask us about specific benefits in your location. We embrace diversity, prioritizing the hiring of people from diverse backgrounds. Our inclusive culture is a source of pride and strength, fostering innovation and mutual respect. Citco welcomes and encourages applications from people with disabilities. Accommodations are available upon request for candidates taking part in all aspects of the selection .
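For the data preparation steps listed in the Citco role above (cleaning, normalization, deduplication, type conversion), here is a minimal, hedged pandas sketch; the feed file, columns and target table are hypothetical, and an in-memory SQLite database stands in for the reporting store.

    import sqlite3
    import pandas as pd

    # Illustrative loan positions feed.
    df = pd.read_csv("loan_positions.csv")

    # Type conversion, normalization and deduplication.
    df["as_of_date"] = pd.to_datetime(df["as_of_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df["fund_code"] = df["fund_code"].str.strip().str.upper()
    df = (df.dropna(subset=["as_of_date", "amount"])
            .drop_duplicates(subset=["fund_code", "as_of_date"]))

    # Load the cleaned data into a table a Power BI / QLIK report could read from.
    with sqlite3.connect(":memory:") as conn:
        df.to_sql("loan_positions_clean", conn, if_exists="replace", index=False)
        print(pd.read_sql("SELECT COUNT(*) AS rows FROM loan_positions_clean", conn))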
Posted 1 week ago
5.0 years
6 - 9 Lacs
Gurgaon
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Senior Analyst, Big Data Analytics & Engineering
Overview: Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India)

About Mastercard: Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all.

Position Overview: This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard's internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact.

Role Responsibilities:
Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation.
Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts.
Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations.
Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You: Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions. Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI. Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making. Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams. Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value. Education: Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred. Why Us? At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives. Location: Gurgaon/Pune, India Employment Type: Full-Time Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
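For the pipeline-plus-analysis work the Mastercard posting describes, here is a small, hedged pandas sketch of turning raw records into a summary an analyst might feed to a dashboard; the transactions file, columns and growth metric are all illustrative assumptions, not anything specified in the posting.

    import pandas as pd

    # Hypothetical transaction extract produced by an upstream pipeline.
    tx = pd.read_csv("transactions.csv", parse_dates=["txn_date"])

    summary = (tx
        .assign(month=tx["txn_date"].dt.to_period("M"))
        .groupby(["client_id", "month"], as_index=False)
        .agg(total_spend=("amount", "sum"),
             txn_count=("amount", "size")))

    # Illustrative value metric: month-over-month spend growth per client.
    summary["spend_growth_pct"] = (summary
        .sort_values("month")
        .groupby("client_id")["total_spend"]
        .pct_change() * 100).round(2)

    # Export a flat extract that a Tableau / Power BI dashboard could consume.
    summary.to_csv("value_quantification_summary.csv", index=False)
    print(summary.head())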
Posted 1 week ago
5.0 years
5 - 8 Lacs
Chennai
On-site
Candidate should be having 5+ years of experience in Talend. Should be expert in using all components in Talend. Good and experienced in SQL Good analytical skill in resolving UAT/PROD issues and provide solution. Should have experience in various file formats like .xls, .csv, xml, JSON etc., Should have experience in resolving performance related issues and exception handling techniques. Should have good PLSQL skills. Basics in Unix Fundamental knowledge of Linux and/or Unix About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
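The Talend developer role above asks for experience with multiple feed formats (.xls, .csv, XML, JSON) and with exception handling. Below is a small, hedged Python sketch showing one common way such dispatch and error handling is structured; the file names are hypothetical, and reading Excel assumes an engine such as openpyxl is installed.

    import json
    from pathlib import Path
    import pandas as pd

    def load_feed(path: str) -> pd.DataFrame:
        """Load a feed file into a DataFrame, dispatching on its extension."""
        suffix = Path(path).suffix.lower()
        try:
            if suffix == ".csv":
                return pd.read_csv(path)
            if suffix in (".xls", ".xlsx"):
                return pd.read_excel(path)          # needs openpyxl/xlrd installed
            if suffix == ".json":
                with open(path, encoding="utf-8") as fh:
                    return pd.json_normalize(json.load(fh))
            if suffix == ".xml":
                return pd.read_xml(path)            # available in pandas 1.3+
            raise ValueError(f"unsupported feed format: {suffix}")
        except (pd.errors.ParserError, ValueError, OSError) as exc:
            # In a Talend job this would typically be routed to a reject/error flow.
            print(f"failed to load {path}: {exc}")
            raise

    # Example usage with a hypothetical file:
    # orders = load_feed("orders_20240101.csv")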
Posted 1 week ago
10.0 years
4 - 5 Lacs
Chennai
On-site
The Testing Sr Analyst is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new techniques and the improvement of processes and workflow for the area or function. Integrates subject matter and industry expertise within a defined area. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the function and overall business. Evaluates moderately complex and variable issues with substantial potential impact, where development of an approach/taking of an action involves weighing various alternatives and balancing potentially conflicting situations using multiple sources of information. Requires good analytical skills in order to filter, prioritize and validate potentially complex and dynamic material from multiple sources. Strong communication and diplomacy skills are required. Regularly assumes informal/formal leadership role within teams. Involved in coaching and training of new recruits. Significant impact in terms of project size, geography, etc. by influencing decisions through advice, counsel and/or facilitating services to others in area of specialization. Work and performance of all teams in the area are directly affected by the performance of the individual.

Responsibilities:
Supports initiatives related to the User Acceptance Testing (UAT) process and product rollout into production. Testing specialists work with technology project managers, UAT professionals and users to design and implement appropriate scripts/plans for an application testing strategy/approach.
Tests and analyzes a broad range of systems and applications to ensure they meet or exceed specified standards and end-user requirements.
Works closely with key stakeholders to understand business and functional requirements to develop test plans, test cases and scripts.
Works complex testing assignments.
Executes test scripts according to application requirements documentation.
Identifies defects and recommends appropriate course of action; performs root cause analyses.
Coordinates multiple testers and testing activities within a project.
Retests after corrections are made to ensure problems are resolved.
Documents, evaluates and researches test results for future replication.
Identifies, recommends and implements process improvements to enhance testing strategies.
Analyzes requirements and design aspects of projects.
Interfaces with client leads and development teams.
Exhibits sound understanding of concepts and principles in own technical area and a basic knowledge of these elements in other areas.
Requires in-depth understanding of how own area integrates within IT testing and has basic commercial awareness.
Makes evaluative judgments based on analysis of factual information in complicated and novel situations.
Participates in test strategy meetings. Has direct impact on the team and closely related teams by ensuring the quality of the tasks, services and information provided by self and others.
Requires sound and comprehensive communication and diplomacy skills to exchange complex information.
Provides metrics related to the cost, effort, and milestones of Quality activities on a project level.
Acts as advisor and mentor for junior members of the team. Regularly assumes informal/formal leadership role within teams.
Perform other duties and functions as assigned. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications:
10+ years Testing Analyst experience
Familiarity with the Software Development Lifecycle (SDLC) and how Quality Assurance methodology fits into the SDLC
Knowledge of relevant operating systems, languages and database tools
Knowledge of defect tracking systems and processes, including change management
Knowledge of automated regression testing tools
Experience of testing trading platforms or similar software
Ability to work under pressure during tight deadlines
Requires a methodical approach to testing and problem solving
Requires theoretical and analytical skills, with demonstrated ability in planning and operations
Excellent communication and stakeholder management skills with a proactive attitude, always seeking opportunities to add value
Specific software languages will be dependent on the area of business

Education: Bachelor's/University degree or equivalent experience

We are seeking a highly skilled ETL Automation Tester with strong expertise in complex SQL, file-to-database validation, and data quality assurance. The ideal candidate will have hands-on experience validating various feed file formats (.csv, .json, .xls), and be comfortable with automation frameworks and tools for enhancing test efficiency. This role involves close collaboration with developers, data engineers, and stakeholders to ensure the integrity, consistency, and quality of data across our systems. Experience in testing reporting systems with Cognos/Tableau is required.

Key Responsibilities:
Lead the end-to-end validation of ETL processes, including data extraction, transformation, and loading validation across large volumes of structured and semi-structured data.
Drive data quality assurance initiatives by defining test strategy, creating comprehensive test plans, and executing test cases based on data mapping documents and transformation logic.
Validate file-based feeds (.csv, .json, .xls, etc.) by ensuring accurate ingestion into target data warehouse environments.
Develop and optimize complex SQL queries to perform deep data audits, aggregation checks, and integrity validations across staging and warehouse layers.
Own the defect lifecycle using tools like JIRA, providing high-quality defect reporting and traceability across all testing cycles.
Collaborate with business analysts, developers, and data architects to ensure test alignment with business expectations and technical design.
Perform report-level validations in tools such as Cognos or Tableau, ensuring consistency between backend data and visual representations.
Mentor junior testers, review test artifacts, and guide the team in best practices for ETL testing and documentation.
Contribute to QA process improvements, testing templates, and governance initiatives to standardize data testing practices across projects.

Required Skills:
Strong hands-on experience in ETL and data warehouse testing.
Advanced proficiency in SQL and strong experience with RDBMS technologies (Oracle, SQL Server, PostgreSQL, etc.).
In-depth experience with file-to-database validation and knowledge of various data formats. Proven track record in test strategy design, test planning, and defect management for large-scale data migration or ETL initiatives. Experience with ETL tools like Talend, or custom data processing scripts (tool-specific expertise not mandatory). Strong understanding of data modeling concepts, referential integrity, and transformation rules. Familiarity with Agile methodologies and experience working in fast-paced environments with iterative delivery. Excellent communication, stakeholder management, and documentation skills. Good to Have: Exposure to BI tools like Cognos, Tableau, etc. for end-user report validation. Prior experience in validating front-end UI connected to data dashboards or reports. - Job Family Group: Technology - Job Family: Technology Quality - Time Type: Full time - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi . View Citi’s EEO Policy Statement and the Know Your Rights poster.
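For the file-to-database validation work described in the ETL Automation Tester posting above, here is a minimal, hedged sketch comparing row counts and an amount total between a .csv feed and its target table; the feed file, table and column names are hypothetical, and SQLite stands in for the real warehouse.

    import sqlite3
    import pandas as pd

    feed = pd.read_csv("trades_feed.csv")           # hypothetical source feed

    with sqlite3.connect("warehouse.db") as conn:   # stand-in for Oracle/SQL Server/etc.
        target_count = conn.execute("SELECT COUNT(*) FROM stg_trades").fetchone()[0]
        target_total = conn.execute(
            "SELECT COALESCE(SUM(notional), 0) FROM stg_trades").fetchone()[0]

    assert len(feed) == target_count, (
        f"row count mismatch: feed={len(feed)} target={target_count}")
    assert abs(feed["notional"].sum() - target_total) < 0.01, "notional totals do not reconcile"
    print("feed-to-table reconciliation passed")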
Posted 1 week ago
3.0 years
2 - 5 Lacs
Chennai
On-site
Req ID: 324657 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Analyst to join our team in Chennai, Tamil Nādu (IN-TN), India (IN). Key Responsibilities: Extract, transform, and load (ETL) data from various sources, ensuring data quality, integrity, and accuracy. Perform data cleansing, validation, and preprocessing to prepare structured and unstructured data for analysis. Develop and execute queries, scripts, and data manipulation tasks using SQL, Python, or other relevant tools. Analyze large datasets to identify trends, patterns, and correlations, drawing meaningful conclusions that inform business decisions. Create clear and concise data visualizations, dashboards, and reports to communicate findings effectively to stakeholders. Collaborate with clients and cross-functional teams to gather and understand data requirements, translating them into actionable insights. Work closely with other departments to support their data needs. Collaborate with Data Scientists and other analysts to support predictive modeling, machine learning, and statistical analysis. Continuously monitor data quality and proactively identify anomalies or discrepancies, recommending corrective actions. Stay up-to-date with industry trends, emerging technologies, and best practices to enhance analytical techniques. Assist in the identification and implementation of process improvements to streamline data workflows and analysis. Basic Qualifications: 3 + years of proficiency in data analysis tools such as [Tools - e.g., Excel, SQL, R, Python]. 3+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. 2+ years of experience leading a team supporting data related projects to develop end-to-end technical solutions. Undergraduate or Graduate degree preferred Ability to travel at least 25%. Preferred Skills: Strong proficiency in data analysis tools such as Python, SQL, Talend (any ETL). Experience with data visualization tools like PowerBI. Experience with cloud data platforms . Familiarity with ETL (Extract, Transform, Load) processes and tools. Knowledge of machine learning techniques and tools. Experience in a specific industry (e.g., financial services, healthcare, manufacturing) can be a plus. Understanding of data governance and data privacy regulations. Ability to query and manipulate databases and data warehouses. Excellent analytical and problem-solving skills. Strong communication skills with the ability to explain complex data insights to non-technical stakeholders. Detail-oriented with a commitment to accuracy. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. 
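As a hedged illustration of the trend analysis mentioned in the Data Analyst posting above, the snippet below aggregates a metric by month and flags unusual values with a simple z-score check; the data set, columns and threshold are illustrative assumptions.

    import pandas as pd

    sales = pd.read_csv("sales.csv", parse_dates=["order_date"])   # hypothetical extract

    monthly = (sales
        .assign(month=sales["order_date"].dt.to_period("M"))
        .groupby("month", as_index=False)["revenue"].sum())

    # Flag months whose revenue deviates more than 2 standard deviations from the mean.
    mean, std = monthly["revenue"].mean(), monthly["revenue"].std()
    monthly["anomaly"] = (monthly["revenue"] - mean).abs() > 2 * std

    print(monthly[monthly["anomaly"]])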
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities
ETL & BI Testing: Manage the testing of ETL processes, data pipelines, and BI reports to ensure accuracy and reliability.
Develop and execute test strategies, test plans, and test cases for data validation.
Perform data reconciliation, transformation validation, and SQL-based testing to ensure data correctness.
Validate reports and dashboards built using BI tools (Power BI, Tableau).
Automate ETL testing where applicable using Python, Selenium, or other automation tools.
Identify and log defects, track issues, and ensure timely resolution.
Collaborate with business stakeholders to understand data requirements and reporting needs.
Assist in documenting functional and non-functional requirements for data transformation and reporting.
Support data mapping, data profiling, and understanding business rules applied to datasets.
Participate in requirement-gathering sessions and provide inputs on data validation needs.

Required Skills & Experience
6–8 years of experience in ETL, Data Warehouse, and BI testing.
Strong experience with SQL, data validation techniques, and database testing.
Hands-on experience with ETL tools (Informatica, Talend, SSIS, or similar).
Proficiency in BI tools like Power BI, Tableau for report validation.
Good knowledge of data modeling, star schema, and OLAP concepts.
Mentor a team of ETL/BI testers and provide guidance on testing best practices.
Coordinate with developers, BAs, and business users to ensure end-to-end data validation.
Define QA processes, best practices, and automation strategies to improve testing efficiency.
Experience in data reconciliation, transformation logic validation, and data pipeline testing.
Experience in the Insurance domain is an added advantage.
Automation skills for data and report testing (Python, Selenium, or ETL testing frameworks) are a plus.
Experience in understanding and documenting business & data requirements.
Ability to work with business users to gather and analyze reporting needs.
Strong analytical and problem-solving skills.
Excellent communication and stakeholder management abilities.
Experience with Agile/Scrum methodologies and working in a cross-functional team.

Preferred Qualifications
Experience in cloud-based data platforms (AWS, Azure, GCP) is a plus.
ISTQB or equivalent certification in software testing is preferred.
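Since the ETL/BI testing lead role above involves data reconciliation and transformation validation, here is a minimal, hedged pytest-style check comparing per-group totals between a staging table and the reporting table behind a dashboard; the database file, tables and grouping column are hypothetical, with SQLite standing in for the warehouse.

    import sqlite3
    import pandas as pd

    def fetch_totals(conn, table):
        # Aggregate the measure by business key; names are illustrative.
        return pd.read_sql(
            f"SELECT policy_type, SUM(premium) AS total_premium "
            f"FROM {table} GROUP BY policy_type",
            conn).set_index("policy_type")

    def test_premium_totals_match():
        with sqlite3.connect("dw.db") as conn:      # stand-in for the real warehouse
            source = fetch_totals(conn, "stg_policies")
            target = fetch_totals(conn, "rpt_policies")
        # Every group present in source must reconcile with the report table.
        diffs = (source["total_premium"] - target["total_premium"]).abs()
        assert (diffs < 0.01).all(), f"premium totals diverge:\n{diffs[diffs >= 0.01]}"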
Posted 1 week ago
0 years
0 Lacs
Andhra Pradesh
On-site
DBT - Designing, developing, and technical architecture, data pipelines, and performance scaling using tools to integrate Talend data and ensure data quality in a big data environment. Very strong on PL/SQL - Queries, Procedures, JOINs. Snowflake SQL - Writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations. Good to have Talend knowledge and hands-on experience. Candidates who have worked in PROD support would be preferred. Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time travel, Optimizer, Metadata Manager, data sharing, and stored procedures. Perform data analysis, troubleshoot data issues, and provide technical support to end-users. Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Complex problem-solving capability and continuous improvement approach. Desirable to have Talend / Snowflake Certification. Excellent SQL coding skills, excellent communication, and documentation skills. Familiar with Agile delivery process. Must be analytical, creative, and self-motivated. Work effectively within a global team environment. Excellent communication skills. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 week ago
0 years
0 Lacs
Andhra Pradesh
On-site
Data Tester Job Description Highlights: 5 plus years experience in data testing ETL Testing: Validating the extraction, transformation, and loading (ETL) of data from various sources. Data Validation: Ensuring the accuracy, completeness, and integrity of data in databases and data warehouses. SQL Proficiency: Writing and executing SQL queries to fetch and analyze data. Data Modeling: Understanding data models, data mappings, and architectural documentation. Test Case Design: Creating test cases, test data, and executing test plans. Troubleshooting: Identifying and resolving data-related issues. Dashboard Testing: Validating dashboards for accuracy, functionality, and user experience. Collaboration: Working with developers and other stakeholders to ensure data quality and functionality. Primary Responsibilities Dashboard Testing Components: Functional Testing: Simulating user interactions and clicks to ensure dashboards are functioning correctly. Performance Testing: Evaluating dashboard responsiveness and load times. Data Quality Testing: Verifying that the data displayed on dashboards is accurate, complete, and consistent. Usability Testing: Assessing the ease of use and navigation of dashboards. Data Visualization Testing: Ensuring charts, graphs, and other visualizations are accurate and present data effectively. Security Testing: Verifying that dashboards are secure and protect sensitive data. Tools and Technologies: SQL: Used for querying and validating data. Hands on snowflake ETL Tools: Tools like Talend, Informatica, or Azure Data Factory used for data extraction, transformation, and loading. Data Visualization Tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards. Testing Frameworks: Frameworks like Selenium or JUnit used for automating testing tasks. Cloud Platforms: AWS platforms used for data storage and processing. Hands on Snowflake experience HealthCare Domain knowledge is plus point. Secondary Skills Automation framework, Life science domain experience. UI Testing, API Testing Any other ETL Tools About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 week ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Outcomes
Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements, create optimal architecture, and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards, debug, and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.
Measures Of Outcomes
Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues; number of data security incidents or compliance breaches.
Outputs Expected
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs.
Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface With Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.
Skill Examples
Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples
Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.
Skills: Scala, Python, PySpark
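Given the PySpark and SQL emphasis above, a representative pipeline skeleton might look like the following. This is only a sketch: the source paths, column names, and output location are assumed for illustration, and a running Spark environment is required.

```python
# A brief PySpark sketch of the ingest -> wrangle -> join -> load pattern this role calls for.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw data from two hypothetical sources.
orders = spark.read.option("header", True).csv("s3://raw-bucket/orders/")
customers = spark.read.parquet("s3://raw-bucket/customers/")

# Wrangle / transform: type casting, filtering, and date parsing.
orders_clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .filter(F.col("amount").isNotNull())
)

# Join and aggregate into a reporting-friendly shape.
daily_revenue = (
    orders_clean.join(customers, on="customer_id", how="left")
    .groupBy("order_date", "customer_segment")
    .agg(F.sum("amount").alias("revenue"), F.countDistinct("order_id").alias("orders"))
)

# Load: write partitioned Parquet for downstream consumption (e.g. a lakehouse table).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/daily_revenue/"
)
```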
Posted 1 week ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Data Tester
Job Description Highlights
5+ years of experience in data testing.
ETL Testing: validating the extraction, transformation, and loading (ETL) of data from various sources.
Data Validation: ensuring the accuracy, completeness, and integrity of data in databases and data warehouses.
SQL Proficiency: writing and executing SQL queries to fetch and analyze data.
Data Modeling: understanding data models, data mappings, and architectural documentation.
Test Case Design: creating test cases and test data, and executing test plans.
Troubleshooting: identifying and resolving data-related issues.
Dashboard Testing: validating dashboards for accuracy, functionality, and user experience.
Collaboration: working with developers and other stakeholders to ensure data quality and functionality.
Primary Responsibilities
Dashboard Testing Components:
Functional Testing: simulating user interactions and clicks to ensure dashboards are functioning correctly.
Performance Testing: evaluating dashboard responsiveness and load times.
Data Quality Testing: verifying that the data displayed on dashboards is accurate, complete, and consistent.
Usability Testing: assessing the ease of use and navigation of dashboards.
Data Visualization Testing: ensuring charts, graphs, and other visualizations are accurate and present data effectively.
Security Testing: verifying that dashboards are secure and protect sensitive data.
Tools And Technologies
SQL: used for querying and validating data; hands-on Snowflake experience.
ETL Tools: tools like Talend, Informatica, or Azure Data Factory used for data extraction, transformation, and loading.
Data Visualization Tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards.
Testing Frameworks: frameworks like Selenium or JUnit used for automating testing tasks.
Cloud Platforms: AWS platforms used for data storage and processing.
Healthcare domain knowledge is a plus.
Secondary Skills
Automation frameworks, life science domain experience, UI testing, API testing, and other ETL tools.
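The functional and usability checks listed above are often automated as a browser-level smoke test. The snippet below is a hedged example using Selenium WebDriver; the dashboard URL, expected title, and CSS selectors are invented placeholders, since a real Tableau or Power BI page needs its own locators and authentication flow.

```python
# Illustrative dashboard smoke test using Selenium WebDriver (placeholder URL/selectors).
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

DASHBOARD_URL = "https://bi.example.com/dashboards/sales-overview"  # placeholder

def test_dashboard_loads_and_renders_charts():
    driver = webdriver.Chrome()
    try:
        driver.get(DASHBOARD_URL)
        wait = WebDriverWait(driver, 30)

        # The dashboard title should appear once the page finishes loading.
        title = wait.until(
            EC.visibility_of_element_located((By.CSS_SELECTOR, ".dashboard-title"))
        )
        assert "Sales Overview" in title.text

        # Every KPI tile should render a non-empty value (placeholder selector).
        tiles = driver.find_elements(By.CSS_SELECTOR, ".kpi-tile .kpi-value")
        assert tiles, "No KPI tiles rendered"
        assert all(t.text.strip() for t in tiles), "Found an empty KPI tile"
    finally:
        driver.quit()
```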
Posted 1 week ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
QA Tester to ensure the accuracy and functionality of ETL jobs migrated from Talend to Python and of SQL queries converted to Snowflake. The role involves validating data integrity and performance and the seamless transition of ETL processes, working closely with developers skilled in Python and SQL.
Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.
Perform all testing activities for initiatives across one or more assigned projects, utilizing processes, methods, metrics, and software that ensure quality, reliability, and systems safety and security, including Hoverfly component and contract testing and embedded Cassandra component testing across multiple repositories.
Test strategy formulation will include decomposing the business and technical requirements into test case scenarios, defining test data requirements, managing test case creation, devising contingency plans, and other preparation activities, along with development of the test case execution plan, test case execution, issue management, and status metrics.
Work with a global team, responsible for directing/reviewing the test planning and execution work efforts of an offshore team. Communicate effectively with business units, IT Development, Project Management, and other support staff on testing timelines, deliverables, status, and other information.
Assist in the project quality reviews for your assigned applications. Assess risk to the project based on execution and validation and make appropriate recommendations. Interpret quality audits, drive improvements and change, and facilitate test methodology discussions across the business unit. Provide project implementation support on an as-needed basis and assist with application training of new resources.
Create and manage project plans and activity timelines; investigate, monitor, report, and drive solutions to issues. Act as a liaison between the Line of Business testing resources and the development team. Identify and create risk mitigation activities, and develop and implement process improvements.
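One concrete way to verify that a job migrated from Talend to Python still produces the same result is a parity check between the two outputs. The sketch below assumes both outputs have been exported to CSV and compares them with pandas; the file paths and business key are placeholders, and in practice both sides would more likely be queried directly from the databases or from Snowflake.

```python
# Sketch of a legacy-vs-migrated ETL parity check (placeholder paths and key column).
import hashlib
import pandas as pd

KEY = ["order_id"]  # hypothetical business key

def normalise(df: pd.DataFrame) -> pd.DataFrame:
    """Sort rows and columns so that equivalent datasets hash identically."""
    return df.reindex(sorted(df.columns), axis=1).sort_values(KEY).reset_index(drop=True)

def frame_hash(df: pd.DataFrame) -> str:
    return hashlib.sha256(df.to_csv(index=False).encode("utf-8")).hexdigest()

legacy = normalise(pd.read_csv("exports/legacy_talend_output.csv"))      # placeholder path
migrated = normalise(pd.read_csv("exports/migrated_python_output.csv"))  # placeholder path

print(f"Legacy rows:   {len(legacy)}")
print(f"Migrated rows: {len(migrated)}")
assert len(legacy) == len(migrated), "Row counts differ between legacy and migrated jobs"

if frame_hash(legacy) == frame_hash(migrated):
    print("PASS: migrated output matches the legacy Talend output exactly")
else:
    # Show which business keys disagree to speed up defect triage.
    diff = legacy.merge(migrated, on=KEY, how="outer", indicator=True,
                        suffixes=("_legacy", "_new"))
    print(diff[diff["_merge"] != "both"].head(20))
```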
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
4+ years of experience in Talend, SQL, and Unix.
Hands-on experience in developing ETL jobs and workflows using Talend.
Strong SQL skills, including the ability to write complex queries and optimize performance.
Basic knowledge of Unix/Linux shell scripting.
Familiarity with data integration processes and tools.
Strong communication and teamwork skills.
Experience with data warehousing concepts.
Good problem-solving and analytical skills.
Experience with data quality and validation tools.
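As an example of the kind of complex SQL this role expects, the snippet below keeps only the latest record per customer using a ROW_NUMBER() window function. It runs against an in-memory SQLite database purely so the example is self-contained; the table and columns are invented, and the same SQL pattern applies to most warehouse engines (SQLite needs version 3.25 or later for window functions).

```python
# Deduplicating to the latest record per key with a window function (illustrative data).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE customer_updates (customer_id INTEGER, email TEXT, updated_at TEXT);
    INSERT INTO customer_updates VALUES
      (1, 'old@example.com',  '2024-01-01'),
      (1, 'new@example.com',  '2024-06-15'),
      (2, 'only@example.com', '2024-03-10');
    """
)

LATEST_PER_CUSTOMER = """
SELECT customer_id, email, updated_at
FROM (
    SELECT
        customer_id,
        email,
        updated_at,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY updated_at DESC
        ) AS rn
    FROM customer_updates
) AS ranked
WHERE rn = 1
ORDER BY customer_id;
"""

for row in conn.execute(LATEST_PER_CUSTOMER):
    print(row)  # keeps only the most recent row per customer_id
conn.close()
```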
Posted 1 week ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding skills in Python, PySpark, and SQL. Works independently and demonstrates proficiency in at least one domain related to data, with a solid understanding of SCD concepts and data warehousing principles.
Outcomes
Collaborate closely with data analysts, data scientists, and other stakeholders to ensure data accessibility, quality, and security across various data sources. Design, develop, and maintain data pipelines that collect, process, and transform large volumes of data from various sources. Implement ETL (Extract, Transform, Load) processes to facilitate efficient data movement and transformation. Integrate data from multiple sources including databases, APIs, cloud services, and third-party data providers. Establish data quality checks and validation procedures to ensure data accuracy, completeness, and consistency. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
Measures Of Outcomes
Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues.
Outputs Expected
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements.
Documentation: Create documentation for personal work and review deliverable documents, including source-target mappings, test cases, and results.
Configuration: Follow configuration processes diligently.
Testing: Create and conduct unit tests for data pipelines and transformations to ensure data quality and correctness. Validate the accuracy and performance of data processes.
Domain Relevance: Develop features and components with a solid understanding of the business problems being addressed for the client. Understand data schemas in relation to domain-specific contexts such as EDI formats.
Defect Management: Raise, fix, and retest defects in accordance with project standards.
Estimation: Estimate time, effort, and resource dependencies for personal work.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities.
Design Understanding: Understand design and low-level design (LLD) and link it to requirements and user stories.
Certifications: Obtain relevant technology certifications to enhance skills and knowledge.
Skill Examples
Proficiency in SQL, Python, or other programming languages utilized for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning data processes. Proficiency in querying data warehouses.
Knowledge Examples
Knowledge of various ETL services provided by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF. Understanding of data warehousing principles and practices. Proficiency in SQL for analytics, including windowing functions. Familiarity with data schemas and models. Understanding of domain-related data and its implications.
Additional Comments
Responsibilities: Design, develop, and maintain data pipelines and architectures using Azure services. Collaborate with data scientists and analysts to meet data needs. Optimize data systems for performance and reliability. Monitor and troubleshoot data storage and processing issues. Ensure data security and compliance with company policies. Document data solutions and architecture for future reference. Stay updated with Azure data engineering best practices and tools.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience in data engineering. Proficiency in Azure Data Factory, Azure SQL Database, and Azure Databricks. Experience with data modeling and ETL processes. Strong understanding of database management and data warehousing concepts. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.
Skills: Azure Data Factory, Azure SQL Database, Azure Databricks, ETL, Data Modeling, SQL, Python, Big Data Technologies, Data Warehousing, Azure DevOps, Azure, AWS, AWS Cloud, Azure Cloud.
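Since the role highlights SCD concepts alongside Azure Databricks, a typical building block is a Delta Lake MERGE for incremental upserts. The sketch below shows a Type 1-style upsert (a Type 2 design would additionally maintain effective-date and current-flag columns); the storage paths and key column are placeholders, and it assumes a Spark session already configured with the delta-spark package.

```python
# Hedged sketch of an incremental upsert into a Delta table (SCD Type 1 behaviour).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer_upsert").getOrCreate()

# Incoming batch of changed customer records (placeholder source path).
updates = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/customers_delta/")

# Existing dimension table stored as Delta (placeholder target path).
target = DeltaTable.forPath(spark, "abfss://curated@account.dfs.core.windows.net/dim_customer/")

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # overwrite changed attributes (Type 1 behaviour)
    .whenNotMatchedInsertAll()   # brand-new customers are inserted
    .execute()
)
```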
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com. This position is based in Bengaluru and will require some on-site work.
Purpose And Scope
As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You'll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend, DataBricks) will be essential for testing data pipelines.
Essential Job Responsibilities
Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments.
Test Strategy And Planning: Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT).
Data Validation And Quality Assurance: Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards.
Regression Testing: Validate changes to data pipelines and analytics tools. Monitor performance metrics.
Test Case Design And Execution: Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation.
Data Security And Privacy: Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations.
Collaboration And Communication: Work with cross-functional teams. Communicate test progress and results.
Continuous Improvement And Technical Support: Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms.
Qualifications Required
Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
1-3+ years of proven experience as a Tester, Developer or Data Analyst within a pharmaceutical or similar regulatory environment.
1-3+ years of experience in BI development, ETL development, Qlik, Power BI (including DAX) and Power Automate (MS Flow) or Power BI alerts, or equivalent technologies.
Experience with Qlik Sense and QlikView, Tableau, and creating data models.
Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts).
Knowledge of SQL, ETL frameworks and data integration techniques.
Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical.
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full large, complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM).
Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems.
Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Preferred:
Experience working in the Pharma industry; understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
Certifications in BI tools or testing methodologies.
Knowledge of cloud-based BI solutions (e.g., Azure, AWS).
Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.
Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.
Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans.
Posted 2 weeks ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary
We are seeking a highly experienced and strategic Lead Data Architect with 8+ years of hands-on experience in designing and leading data architecture initiatives. This individual will play a critical role in building scalable, secure, and high-performance data solutions that support enterprise-wide analytics, reporting, and operational systems. The ideal candidate will be both technically proficient and business-savvy, capable of translating complex data needs into innovative architecture designs.
Key Responsibilities
Design and implement enterprise-wide data architecture to support business intelligence, advanced analytics, and operational data needs.
Define and enforce standards for data modeling, integration, quality, and governance.
Lead the adoption and integration of modern data platforms (data lakes, data warehouses, streaming, etc.).
Develop architecture blueprints, frameworks, and roadmaps aligned with business objectives.
Ensure data security, privacy, and regulatory compliance (e.g., GDPR, HIPAA).
Collaborate with business, engineering, and analytics teams to deliver high-impact data solutions.
Provide mentorship and technical leadership to data engineers and junior architects.
Evaluate emerging technologies and provide recommendations for future-state architectures.
Required Qualifications
8+ years of experience in data architecture, data engineering, or a similar senior technical role.
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Expertise in designing and managing large-scale data systems using cloud platforms (AWS, Azure, or GCP).
Strong proficiency in data modeling (relational, dimensional, NoSQL) and modern database systems (e.g., Snowflake, BigQuery, Redshift).
Hands-on experience with data integration tools (e.g., Apache NiFi, Talend, Informatica) and orchestration tools (e.g., Airflow).
In-depth knowledge of data governance, metadata management, and data cataloging solutions.
Experience with real-time and batch data processing frameworks, including streaming technologies like Kafka.
Excellent leadership, communication, and cross-functional collaboration skills.
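Since orchestration with Airflow is called out above, here is a hedged sketch of how such a pipeline might be sequenced as a DAG (assuming Airflow 2.4 or newer). The task names, schedule, and stubbed task bodies are assumptions for illustration only; in a real architecture the stubs would be replaced by ingestion, validation, and warehouse-load operators.

```python
# Illustrative Airflow DAG: ingest -> quality checks -> warehouse load.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_from_source(**context):
    print("Pull incremental data from the source system")

def run_quality_checks(**context):
    print("Validate row counts, nulls, and referential integrity")

def load_to_warehouse(**context):
    print("Load validated data into the warehouse (e.g. Snowflake or BigQuery)")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    ingest = PythonOperator(task_id="ingest_from_source", python_callable=ingest_from_source)
    validate = PythonOperator(task_id="run_quality_checks", python_callable=run_quality_checks)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    # Linear dependency chain; fan-out/fan-in would follow the same pattern.
    ingest >> validate >> load
```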
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
hackajob is collaborating with American Express to connect them with exceptional tech professionals for this role.
You Lead the Way. We've Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.
As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.
How will you make an impact in this role?
Build the NextGen data strategy, data virtualization, data lakes, and warehousing.
Transform and improve performance of existing reporting and analytics use cases with more efficient, state-of-the-art data engineering solutions.
Drive analytics development to realize the advanced analytics vision and strategy in a scalable, iterative manner.
Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering.
Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes.
Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness.
Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions.
Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions.
Minimum Qualifications
Bachelor's degree in computer science, Computer Science Engineering, or a related field required; advanced degree preferred.
5+ years of hands-on experience in implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
Proven experience in Business Intelligence, reporting on large datasets, data virtualization tools, Big Data, GCP, Java, and microservices.
Strong systems integration architecture skills and a high degree of technical expertise, ranging across a number of technologies, with a proven track record of turning new technologies into business solutions.
Should be good in one programming language (Python/Java) and have a good understanding of data structures.
GCP/cloud knowledge is an added advantage.
Good knowledge and understanding of Power BI, Tableau, and Looker.
Outstanding influence and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering a cross-communication process.
Experience managing in a fast-paced, complex, and dynamic global environment.
Preferred Qualifications
Bachelor's degree in computer science, Computer Science Engineering, or a related field required; advanced degree preferred.
5+ years of hands-on experience in implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
Proven experience in Business Intelligence, reporting on large datasets, Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, data virtualization tools, Oracle PL/SQL, Informatica, other ETL tools like Talend, and Java.
Should be good in one programming language (Python/Java) and good at data structures and reasoning.
GCP or other cloud knowledge is an added advantage.
Good knowledge and understanding of Power BI, Tableau, and Looker.
Strong systems integration architecture skills and a high degree of technical expertise, ranging across several technologies, with a proven track record of turning new technologies into business solutions.
Outstanding influence and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering a cross-communication process.
Benefits
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law.
Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 2 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
Pune
Hybrid
Job Title: Data Engineer
Location: Work from Office (Hybrid)
Job Location: Magarpatta, Pune
Shift timing: 11 am to 8 pm
Job responsibilities:
Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake.
Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance.
Technologies: SQL, Informatica PowerCenter, Big Data, Hive.
Requires a combination of skills across the following stack: SQL, Informatica PowerCenter, Hive, Talend.
Posted 2 weeks ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
Hiring for Talend roles is concentrated in cities with a high density of IT companies and organizations that frequently recruit for these skills.
The average salary range for Talend professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
Accenture
36723 Jobs | Dublin
Wipro
11788 Jobs | Bengaluru
EY
8277 Jobs | London
IBM
6362 Jobs | Armonk
Amazon
6322 Jobs | Seattle,WA
Oracle
5543 Jobs | Redwood City
Capgemini
5131 Jobs | Paris,France
Uplers
4724 Jobs | Ahmedabad
Infosys
4329 Jobs | Bangalore,Karnataka
Accenture in India
4290 Jobs | Dublin 2