7.0 - 12.0 years
0 - 2 Lacs
Bengaluru
Remote
Job Role: Senior Data Engineer - IRIS
Duration: 6-month contract
Timing: Night shift (6 pm to 3 am)
Key Skills Required:
- Programming: Python, SQL, Spark
- Cloud Platforms: Azure, Snowflake
- Data Tools: DBT, Erwin Data Modeler, Apache Airflow, API Integrations, ADF
- Governance: Data masking, metadata management, SOX compliance
- Soft Skills: Communication, problem-solving, stakeholder engagement
As an IRIS Data Engineer, you will work with Data Scientists and Data Architects to translate prototypes into scalable solutions.
Key Responsibilities:
1. Data Pipeline Design & Development: Design and build robust, scalable, high-quality data pipelines that support analytics and reporting needs. This includes integrating structured and unstructured data from various sources into data lakes and warehouses; building and maintaining scalable ETL/ELT pipelines for batch and streaming data using Azure Data Factory, Databricks, Snowflake, Azure SQL Server, and Control-M; collaborating with data scientists, analysts, and platform engineers to enable analytics and ML use cases; and designing, developing, and optimising DBT models to support scalable data transformations.
2. Cloud Platform Engineering: Operationalize data solutions on cloud platforms, integrating services such as Azure, Snowflake, and third-party technologies. Manage environments, performance tuning, and configuration for cloud-native data solutions.
3. Data Modeling & Architecture: Apply dimensional modeling, star schemas, and data warehousing techniques to support business intelligence and machine learning workflows. Collaborate with solution architects and analysts to ensure models meet business needs.
4. Data Governance & Security: Ensure data integrity, privacy, and compliance through governance practices and secure schema design. Implement data masking, access controls, and metadata management for sensitive datasets.
5. Collaboration & Agile Delivery: Work closely with cross-functional teams, including product owners, architects, and business stakeholders, to translate requirements into technical solutions. Participate in Agile ceremonies, sprint planning, and DevOps practices for continuous integration and deployment.
Technical Skills:
- 7+ years of data engineering or design experience designing, developing, and deploying scalable enterprise data analytics solutions, from source system through ingestion and reporting.
- 5+ years of experience in the ML lifecycle using Azure Kubernetes Service, Azure Container Instances, Azure Data Factory, Azure Monitor, and Azure Databricks, building datasets, ML pipelines, experiments, logging, and monitoring (including drift detection, model adaptation, and data collection).
- 5+ years of experience in data engineering using Snowflake.
- Experience designing, developing, and scaling complex data and feature pipelines feeding ML models, and evaluating their performance.
- Experience building and managing streaming and batch inferencing.
- Proficiency in SQL and at least one other programming language (e.g., R, Python, C++, Minitab, SAS, MATLAB, VBA); knowledge of optimization engines such as CPLEX or Gurobi is a plus.
- Strong experience with cloud platforms (AWS, Azure, etc.) and containerization technologies (Docker, Kubernetes).
- Experience with CI/CD tools such as GitHub Actions, GitLab, Jenkins, or similar.
- Familiarity with security best practices in DevOps and MLOps.
- Experience developing and maintaining APIs (e.g., REST).
- Agile/Scrum operating experience using Azure DevOps.
- Experience with the Microsoft cloud ML stack: Azure Databricks, Data Factory, Synapse, among others.
Professional Skills:
- Strong analytical and problem-solving skills and a passion for product development.
- Strong understanding of Agile methodologies; open to working in agile environments with multiple stakeholders.
- Professional attitude and service orientation; team player.
- Ability to translate business needs into potential analytics solutions.
- Strong work ethic: ability to work at an abstract level and gain consensus.
- Ability to build trust and rapport to create a comfortable and effective workplace.
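The batch pipeline duties above follow a common extract-clean-load shape. A minimal, self-contained sketch using Python's stdlib `sqlite3` in place of a real warehouse such as Snowflake (the table, fields, and sample rows are all hypothetical):

```python
import sqlite3

def run_batch_etl(raw_rows):
    """Extract raw order rows, clean and deduplicate them, load into a reporting table,
    and return a simple aggregate a downstream report could consume."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, amount REAL, region TEXT)")
    seen = set()
    for row in raw_rows:
        oid = row["order_id"].strip()       # normalise the business key
        if oid in seen:                     # deduplicate on the business key
            continue
        seen.add(oid)
        conn.execute(
            "INSERT INTO orders VALUES (?, ?, ?)",
            (oid, float(row["amount"]), row["region"].upper()),
        )
    return dict(conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region").fetchall())

raw = [
    {"order_id": " A1", "amount": "10.5", "region": "emea"},
    {"order_id": "A1", "amount": "10.5", "region": "emea"},   # duplicate, dropped
    {"order_id": "B2", "amount": "4.0", "region": "apac"},
]
totals = run_batch_etl(raw)   # {"EMEA": 10.5, "APAC": 4.0}
```

A production pipeline would swap the in-memory database for the warehouse connection and schedule the function from an orchestrator such as Airflow or ADF.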
Posted 6 days ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are hiring a Delphix Engineer with 3-12 years of experience for a 12-month full-time onsite role across Bengaluru, Chennai, Hyderabad, Pune, and Vadodara. The candidate must have strong hands-on experience with Test Data Management (TDM), particularly Delphix, along with data de-identification, masking, and synthetic data generation. The ideal engineer will work closely with consumers to enable fast, secure test data provisioning. Exposure to Python or .NET is a plus, as is knowledge of CI/CD pipelines and cloud-hosted platforms. Must be a proactive contributor and effective collaborator, comfortable in dynamic environments. Location - Bengaluru, Chennai, Hyderabad, Pune, Vadodara (Onsite)
Posted 1 week ago
5.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Skill Set:
- 7+ years of experience in Test Data Management, Data Governance, or Data Engineering.
- 3+ years of hands-on experience with Informatica Data Masking (TDM, ILM) tools.
- Strong hands-on experience with Informatica TDM (Test Data Management), ILM (Information Lifecycle Management), and Persistent Data Masking.
- Knowledge of Informatica Data Engineering Integration (DEI), Informatica Data Quality (IDQ), and PowerCenter for ETL integration with TDM.
- Experience with static and dynamic data masking across RDBMS, files, and applications.
- Proficiency in data sub-setting, synthetic data generation, cloning, and archiving using Informatica tools.
- Working knowledge of Informatica Developer, Metadata Manager, and Model Repository Service.
- Strong understanding of RDBMS (Oracle, SQL Server, DB2, PostgreSQL, MySQL) and ability to write complex SQL queries.
- Proven track record of working with data privacy frameworks and compliance standards.
- Knowledge of ETL processes, data warehousing concepts, and data migration testing.
- Familiarity with cloud platforms (AWS, Azure, GCP) and data security in cloud environments is a plus.
Mandatory Skills: Data-centric testing. Experience: 5-8 years.
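The data sub-setting skill listed above has one hard constraint: the subset must preserve referential integrity, so a child row never points at a parent that was left out. A toy illustration with stdlib `sqlite3` (schema and data are invented; Informatica TDM automates this across whole schemas):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id), total REAL);
    INSERT INTO customers VALUES (1, 'Asha', 'IN'), (2, 'Bob', 'US');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 2, 5.0), (12, 1, 1.5);
""")

def subset_by_region(conn, region):
    """Pick a driving subset of parent rows, then pull only the child rows that
    reference them, so the subset never contains orphan foreign keys.
    Assumes the region selects at least one customer."""
    ids = [r[0] for r in conn.execute(
        "SELECT id FROM customers WHERE region = ?", (region,))]
    marks = ",".join("?" * len(ids))
    customers = conn.execute(
        f"SELECT * FROM customers WHERE id IN ({marks})", ids).fetchall()
    orders = conn.execute(
        f"SELECT * FROM orders WHERE customer_id IN ({marks})", ids).fetchall()
    return customers, orders

cust, ords = subset_by_region(conn, "IN")
```

Real subsetting tools walk the full foreign-key graph (including many-to-many and self-referencing tables) rather than one hand-written parent/child pair.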
Posted 1 week ago
8.0 - 13.0 years
11 - 18 Lacs
Kolkata, Bengaluru, Delhi / NCR
Hybrid
Locations (PAN India): Bengaluru, Hyderabad, Pune, Chennai, Gurugram, Noida, Mumbai, Kolkata
Job Description
Role & Responsibilities:
- Provide 24x7 support for data security and encryption services in production systems.
- Manage Key Management Systems (KMS) and tokenization platforms (Thales, Fortanix, Protegrity).
- Perform encryption key lifecycle operations: generation, rotation, expiration, archival.
- Troubleshoot encryption/decryption failures, key access issues, and DB/storage integration problems.
- Onboard applications onto the Thales/Fortanix/Protegrity platforms.
- Work closely with Infra/Cloud/BAU teams to implement enterprise encryption strategies.
- Lead/participate in security audits, risk assessments, and compliance reporting.
- Document SOPs and runbooks, and mentor junior engineers.
Preferred Candidate Profile:
- 8-12 years of IT experience, with a minimum of 5 years in Data Security / Encryption / Tokenization.
- Hands-on with Thales CipherTrust Manager, Vormetric, Fortanix DSM, Protegrity.
- Strong understanding of cryptographic standards: AES, RSA, PKCS, KMIP.
- Exposure to Linux/Unix/Windows administration and shell scripting.
- Familiarity with AWS KMS, Azure Key Vault, Google Cloud KMS.
- Strong communication, leadership, and problem-solving skills.
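The key lifecycle operations named above (generation, rotation, archival) form a small state machine. The sketch below only models those states with invented names; a real KMS such as Thales CipherTrust or Fortanix DSM stores key material in an HSM and enforces access policy, none of which is shown here:

```python
import secrets
from datetime import datetime, timedelta, timezone

class KeyStore:
    """Toy key-lifecycle manager: generate, rotate, and archive data-encryption keys."""

    def __init__(self, rotation_days=90):
        self.rotation = timedelta(days=rotation_days)
        self.keys = {}        # key_id -> {"material", "created", "state"}
        self.active = None

    def generate(self, now):
        """Create a fresh key; the previous active key is archived (kept for
        decrypting old data only, never for new encryption)."""
        key_id = f"key-{len(self.keys) + 1}"
        self.keys[key_id] = {"material": secrets.token_bytes(32),
                             "created": now, "state": "active"}
        if self.active:
            self.keys[self.active]["state"] = "archived"
        self.active = key_id
        return key_id

    def rotate_if_due(self, now):
        """Rotate when the active key is older than the rotation policy allows."""
        if now - self.keys[self.active]["created"] >= self.rotation:
            return self.generate(now)
        return self.active

t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
ks = KeyStore()
first = ks.generate(t0)
same = ks.rotate_if_due(t0 + timedelta(days=30))     # not due yet
second = ks.rotate_if_due(t0 + timedelta(days=120))  # past the 90-day policy
```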
Posted 2 weeks ago
8.0 - 15.0 years
0 Lacs
Delhi
On-site
As an independent, customer-focused global provider of interventional cardiovascular medical technologies, Cordis has a rich history of over 60 years in pioneering breakthrough technologies. Our legacy includes the development of the first guiding catheters and coronary drug-eluting stents, solidifying our strong global footprint across 70 countries.
At Cordis, we value teamwork and empowerment. Our culture encourages you to take ownership and unleash your innovative potential. With diverse teams on a global scale, we promote inclusivity, embracing individual uniqueness and perspectives. We believe that our diverse experiences enhance the careers of our team members, our service to customers, and ultimately, the lives of our patients. If you are seeking a challenging role where you can directly impact the lives of millions, Cordis is the perfect place for you. Join us in our mission to improve the wellbeing of millions, together. We are the driving force behind those who are dedicated to saving lives.
Responsibilities:
- Develop data modeling solutions following industry best practices.
- Implement star and extended star schema modeling with SCD Type I and II.
- Conduct SQL performance measurement, query tuning, and database tuning.
- Utilize data masking, encryption, data wrangling, and data pipeline orchestration.
- Demonstrate expertise in Oracle ODI technology with hands-on experience in GoldenGate.
- Willingness to upskill in GoldenGate and APEX technologies to support growth strategies.
- Perform data integration with SFDC and other third-party tools.
- Optimize ADW/ODI environments to enable cost savings and timely data solutions for business needs.
- Develop scripts and programs to automate data operations.
- Preferred domain knowledge in healthcare manufacturing and distribution.
- Possess excellent analytical, problem-solving, and time management skills, and be a strong team player.
- Excellent communication skills.
Qualifications:
- Bachelor's degree or equivalent from an accredited institution; three years of progressive experience can be considered in lieu of each year of education.
- Minimum of 8 years of experience in Oracle Cloud technologies.
- Over 15 years of experience in data modeling/warehouse and database architecture design.
- Working knowledge of ERP systems such as SAP and JDE.
- Primary skill set required: ODI, ADW.
- Minimum 5 years of ODI experience.
- Strong teaming and collaboration skills, both internally and externally.
- Ability to develop ML/AI solutions using OAC/ADW technologies (OCI) and drive tangible business outcomes.
Cordis is proud to be an Equal Opportunity Employer, providing a workplace that is inclusive and supportive of individuals with disabilities and protected veterans.
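The SCD Type II modeling mentioned above keeps history by closing the current dimension row and opening a new version whenever a tracked attribute changes. A minimal sketch against stdlib `sqlite3` (the `dim_customer` table, attribute, and dates are hypothetical; an ODI mapping would generate equivalent logic):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id TEXT, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")

def scd2_upsert(conn, customer_id, city, as_of):
    """If the tracked attribute changed, close the current row and open a new version."""
    cur = conn.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,)).fetchone()
    if cur and cur[1] == city:
        return                      # no change: keep the current version
    if cur:
        conn.execute(               # close the old version at the change date
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 WHERE sk = ?",
            (as_of, cur[0]))
    conn.execute(                   # open the new current version
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, as_of))

scd2_upsert(conn, "C1", "Pune", "2024-01-01")
scd2_upsert(conn, "C1", "Pune", "2024-02-01")    # unchanged: no new row
scd2_upsert(conn, "C1", "Delhi", "2024-03-01")   # changed: old row closed, new row opened
rows = conn.execute(
    "SELECT city, valid_to, is_current FROM dim_customer ORDER BY sk").fetchall()
```

Fact tables then join on the surrogate key `sk`, so each fact is tied to the attribute values that were current when it occurred.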
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As an engineer in this role, you are expected to have a strong technical skill set across several areas: operating systems such as Windows, Unix, and Linux; databases such as MSSQL; Transact releases R22 and higher; frameworks such as TAFJ; and programming languages such as InfoBasic, Java, and Python (nice to have). Familiarity with user interfaces such as Classic, BrowserWeb, and Transact Explorer (nice to have), as well as scripting languages such as Bash and sh, is required. You should also be proficient in tools such as JIRA, Confluence, PuTTY, Tectia or WinSCP, and ServiceNow, along with DevOps tools including Design Studio, DSF Packager, GitHub, Maven, Jenkins, and UCD.
In terms of development skills, the engineering team collectively should possess expertise in application programming, basic Java programming, application menus, subroutines, version and enquiry design, single- and multi-threaded routines, local reference tables and fields, composite/tabbed screen design, single customer view in Transact, role-based home pages, and migration to Transact Explorer (nice to have).
Furthermore, experience in archival, data masking, purging, TAFJ installation and configuration, file-based and real-time interface development, DFE, the API Framework including IRIS, TOCFEE including TCServer, JMS, TWS, IF Events, MQ setup, the dormancy framework, and the extensibility framework (nice to have) is essential. You should also be proficient in system performance optimization, including COB, redesigning local code, DB indexes, general Transact WAS configurations, application configuration in WebSphere, application configuration in SQL, TAFJ configuration, Tivoli scripts, decommissioning modules, local development, and back to core (specific to the modules below).
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Salesforce Developer with 5+ years of experience, you will be responsible for developing and customizing Salesforce applications using Apex, Visualforce, Lightning Web Components (LWC), and configuration tools. Your primary focus will be on working with Salesforce CPQ, Revenue Cloud, or Sales Cloud, with a strong understanding of Product Catalogue implementation.
Your deep expertise with Gearset, SFDX, and Salesforce DevOps Center, or equivalent tooling, will be crucial in ensuring efficient development processes. Proficiency in CI/CD practices using Git-based version control and PR workflows will enable you to streamline deployment workflows effectively.
You will collaborate closely with business stakeholders to gather requirements, design scalable solutions, and implement features that align with business goals. Additionally, you will be responsible for conducting thorough testing, debugging, and deployment of enhancements or new features to ensure the quality of Salesforce applications.
Your role will also involve maintaining and documenting Salesforce best practices and technical architecture. You will leverage your experience in configuring and managing sandbox environments, as well as your understanding of Salesforce metadata architecture, change sets, and deployment automation, to drive successful project outcomes. Furthermore, your proven experience in migrating legacy automations to Flow-first architectures and familiarity with security hardening practices such as RBAC, CRUD/FLS validation, health checks, and IRPs will be valuable assets in this role.
Overall, as a Salesforce Developer, you will play a key role in designing, developing, and enhancing Salesforce applications to meet business requirements and contribute to the overall success of the organization.
Posted 2 weeks ago
15.0 - 20.0 years
3 - 7 Lacs
Coimbatore
Work from Office
Project Role: Security Engineer
Project Role Description: Apply security skills to design, build, and protect enterprise systems, applications, data, assets, and people. Provide services to safeguard information, infrastructures, applications, and business processes against cyber threats.
Must-have skills: Data Loss Prevention (DLP)
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Security Engineer, you will apply security skills to design, build, and protect enterprise systems, applications, data, assets, and people. A typical day involves collaborating with various teams to implement security measures, conducting assessments to identify vulnerabilities, and ensuring that all systems are fortified against potential cyber threats. You will also engage in continuous monitoring and improvement of security protocols to safeguard sensitive information and maintain compliance with industry standards.
Roles & Responsibilities:
- Act as an SME in DLP and data masking solution implementation and support.
- Collaborate with and manage the team to perform.
- Demonstrate excellent problem-solving skills and the ability to collaborate effectively with diverse stakeholders.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Conduct regular security assessments and audits to identify vulnerabilities and recommend improvements.
- Develop and implement security policies and procedures to ensure compliance with industry standards.
Professional & Technical Skills:
- Must-have skills: proficiency in Proofpoint and Microsoft Purview Data Loss Prevention (DLP) tools, and Varonis data discovery and data masking.
- Creation of DLP detection and prevention policies.
- DLP agent compliance and incident monitoring.
- DLP agent upgrades.
- Design and implementation of data masking solutions across enterprise-wide applications.
- Perform sensitive data discovery and analysis across enterprise data repositories.
- Create technical documentation and installation/administration manuals.
- Strong understanding of risk management and mitigation strategies.
- Experience with security frameworks and compliance standards such as ISO 27001, NIST, GDPR, HIPAA, or HITRUST.
- Familiarity with incident response and threat intelligence processes.
- Knowledge of network security protocols and technologies.
Additional Information:
- The candidate should have a minimum of 8 years of experience in Data Loss Prevention (DLP) and data discovery.
- Experience in the healthcare industry is good to have.
- Certifications in Proofpoint and Varonis are preferred.
- This position is based in Coimbatore.
- Willing to work US shifts, including support in late IST hours, and to work in office in line with current HR policies.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
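At their core, the sensitive-data-discovery duties above amount to scanning content against classification patterns. The sketch below uses toy regexes with invented names; production DLP policies (Purview, Proofpoint) combine far richer detectors with proximity rules and checksums:

```python
import re

# Toy detection patterns: illustrative only, not production-grade classifiers.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "pan_card": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),       # Indian PAN format
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),    # naive, no Luhn check
}

def scan(text):
    """Return the sorted list of sensitive-data classes that appear in a text blob."""
    return sorted(name for name, pat in PATTERNS.items() if pat.search(text))

hits = scan("Contact asha@example.com, PAN ABCDE1234F, ref 42")
```

A DLP policy would attach an action (block, quarantine, alert) to each detected class; here we only report the matches.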
Posted 3 weeks ago
15.0 - 20.0 years
3 - 7 Lacs
Chennai
Work from Office
Project Role: Security Engineer
Project Role Description: Apply security skills to design, build, and protect enterprise systems, applications, data, assets, and people. Provide services to safeguard information, infrastructures, applications, and business processes against cyber threats.
Must-have skills: Data Loss Prevention (DLP)
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Security Engineer, you will apply security skills to design, build, and protect enterprise systems, applications, data, assets, and people. A typical day involves collaborating with various teams to implement security measures, conducting assessments to identify vulnerabilities, and ensuring that all systems are fortified against potential cyber threats. You will also engage in continuous monitoring and improvement of security protocols to safeguard sensitive information and maintain compliance with industry standards.
Roles & Responsibilities:
- Act as an SME in DLP and data masking solution implementation and support.
- Collaborate with and manage the team to perform.
- Demonstrate excellent problem-solving skills and the ability to collaborate effectively with diverse stakeholders.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Conduct regular security assessments and audits to identify vulnerabilities and recommend improvements.
- Develop and implement security policies and procedures to ensure compliance with industry standards.
Professional & Technical Skills:
- Must-have skills: proficiency in Proofpoint and Microsoft Purview Data Loss Prevention (DLP) tools, and Varonis data discovery and data masking.
- Creation of DLP detection and prevention policies.
- DLP agent compliance and incident monitoring.
- DLP agent upgrades.
- Design and implementation of data masking solutions across enterprise-wide applications.
- Perform sensitive data discovery and analysis across enterprise data repositories.
- Create technical documentation and installation/administration manuals.
- Strong understanding of risk management and mitigation strategies.
- Experience with security frameworks and compliance standards such as ISO 27001, NIST, GDPR, HIPAA, or HITRUST.
- Familiarity with incident response and threat intelligence processes.
- Knowledge of network security protocols and technologies.
Additional Information:
- The candidate should have a minimum of 8 years of experience in Data Loss Prevention (DLP) and data discovery.
- Experience in the healthcare industry is good to have.
- Certifications in Proofpoint and Varonis are preferred.
- This position is based in Chennai.
- Willing to work US shifts, including support in late IST hours, and to work in office in line with current HR policies.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
Posted 3 weeks ago
5.0 - 7.0 years
9 - 14 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
AWS Storage & Compute Services:
- S3: Strong understanding of bucket policies, lifecycle rules, and data organization
- Glue: ETL jobs, crawlers, and data cataloging
- Athena: Querying data directly from S3 using SQL
- EMR: Big data processing with Spark/Hadoop
Lakehouse Technologies:
- Apache Hudi: Hands-on knowledge of transactional data lakes and schema evolution
Data Cataloging & Governance:
- AWS Glue Data Catalog
- Integration with Lake Formation
Security & Compliance:
- IAM roles & policies
- Encryption strategies
- Data masking & tokenization
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a Data Migration Specialist with expertise in Salesforce Field Service Lightning (FSL), you will be responsible for planning, executing, and validating complex data migrations from legacy systems to Salesforce. Your role will involve data mapping, cleansing, transformation, and loading, ensuring accuracy, performance, and compliance throughout the migration process.
To excel in this position, you should have at least 5 years of experience in Salesforce data migration, with a minimum of 2 years dedicated to Field Service Lightning projects. A strong understanding of the FSL data model and the relationships between entities such as Work Orders, Service Appointments, Technicians, Territories, and scheduling rules is essential for this role.
Hands-on experience with ETL tools such as Informatica, MuleSoft, Talend, or Boomi is required to manage the data migration process effectively. Proficiency in SOQL, SOSL, SQL, and advanced Excel for data analysis is crucial for successful data handling and transformation. You should also have experience handling large data volumes (LDV) within Salesforce and be familiar with performance tuning techniques tailored for migrations.
Knowledge of Salesforce APIs (SOAP, REST, Bulk API) and data load best practices will be advantageous for this role. Additionally, familiarity with data masking and anonymization techniques for sandbox refreshes is preferred. Holding a Salesforce Administrator or FSL Consultant certification will be considered a plus. Strong problem-solving skills, effective communication abilities, and meticulous documentation practices are key attributes that will contribute to your success in this position.
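Handling large data volumes through the Bulk API starts with chunking the record set, since each batch has a record cap. A minimal sketch (the batch size here is illustrative; the real limit depends on the API and job configuration):

```python
def make_batches(records, batch_size=10_000):
    """Split a large record set into fixed-size chunks for bulk loading.
    The final batch holds whatever remains and may be smaller."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

# 25,000 hypothetical records -> two full batches and one partial batch.
batches = make_batches(list(range(25_000)), batch_size=10_000)
```

Each chunk would then be submitted as one Bulk API batch, with per-batch success/error results collected for reconciliation.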
Posted 3 weeks ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Data Masking
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require a balance of technical expertise and leadership skills to drive the project forward successfully.
Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and adjust plans as necessary to meet deadlines.
Professional & Technical Skills:
- Must-have skills: proficiency in data masking.
- Strong understanding of data protection regulations and compliance standards.
- Experience with data encryption techniques and methodologies.
- Familiarity with data governance frameworks and best practices.
- Ability to implement data masking solutions in various environments.
Additional Information:
- The candidate should have a minimum of 5 years of experience in data masking.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
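One common masking technique implied by this role is deterministic masking: identical inputs map to identical opaque tokens, so joins across masked tables still line up, while the original value stays unrecoverable without the key. A stdlib HMAC sketch (the key, function name, and output format are invented for illustration):

```python
import hmac
import hashlib

SECRET = b"rotate-me"   # hypothetical masking key; kept in a vault in practice

def mask_email(value):
    """Deterministically mask an email address.
    Case-normalises first so 'A@X' and 'a@x' yield the same token."""
    digest = hmac.new(SECRET, value.lower().encode(), hashlib.sha256).hexdigest()[:12]
    return f"user_{digest}@masked.example"

a = mask_email("Asha@Example.com")
b = mask_email("asha@example.com")   # same person, different casing -> same token
```

Keyed HMAC (rather than a bare hash) matters: without the key, an attacker cannot rebuild the mapping by hashing a dictionary of known addresses.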
Posted 3 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are hiring a Delphix Engineer with 3–12 years of experience for a 12-month full-time onsite role across Bengaluru, Chennai, Hyderabad, Pune, and Vadodara. The candidate must have strong hands-on experience with Test Data Management (TDM), particularly Delphix, along with data de-identification, masking, and synthetic data generation. The ideal engineer will work closely with consumers to enable fast, secure test data provisioning. Exposure to Python or .NET is a plus, as is knowledge of CI/CD pipelines and cloud-hosted platforms. Must be a proactive contributor and effective collaborator, comfortable in dynamic environments. Location - Bengaluru, Chennai, Hyderabad, Pune, Vadodara (Onsite)
Posted 3 weeks ago
5.0 - 10.0 years
30 - 35 Lacs
Pune
Work from Office
About the Role
Job Title: QA Performance Testing, AVP
Location: Pune, India
Role Description
DB Anti-Financial Crime (AFC): Our AFC team is responsible for protecting Deutsche Bank from financial and reputational losses incurred by financial crimes by assessing, controlling, and mitigating risks. Risk types related to Anti-Financial Crime are consolidated in a comprehensive and effective risk management framework that covers Anti-Money Laundering, Sanctions & Embargoes, Anti-Bribery & Corruption, and Anti-Fraud & Investigations.
We are seeking a Senior Performance Test Engineer to lead performance validation efforts for critical transaction monitoring systems in a financial services environment. You will design and execute load, stress, and scalability tests using Micro Focus LoadRunner (VuGen) and manage large-scale synthetic data creation to mirror complex production data models. The role requires strong collaboration with development, infrastructure, and business teams to ensure system stability and performance under high transaction volumes. Experience in financial services performance testing, data modeling, and synthetic data generation is essential.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 years and above
Your key responsibilities:
- Design, develop, and execute performance test scripts using Micro Focus LoadRunner (VuGen).
- Model complex transaction data flows to simulate real-world banking transaction volumes and patterns.
- Collaborate with architecture, development, and infrastructure teams to define performance benchmarks and SLAs.
- Create and manage large synthetic datasets to accurately reflect production-like conditions.
- Analyze performance test results to identify bottlenecks and provide actionable recommendations.
- Conduct scalability, stress, endurance, and load tests in large-scale environments.
- Develop performance test strategies and contribute to test plans and test cases for performance validation.
- Document findings and report to stakeholders with clear metrics and analysis.
- Mentor and guide junior performance engineers when needed.
Your skills and experience:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in performance testing, with a focus on financial or transaction monitoring applications.
- Proven expertise with Micro Focus LoadRunner (VuGen) and strong scripting experience (C, Java, or protocol-specific scripting).
- Experience with large data models and an understanding of relational databases (Oracle, SQL Server, GCP BigQuery, CDSW, Hive, etc.).
- Strong understanding of synthetic data generation techniques and data masking principles.
- Knowledge of monitoring tools (e.g., Dynatrace, AppDynamics, Grafana, OEM, GCP console) for correlation and root cause analysis.
- Familiarity with transaction monitoring systems, regulatory compliance, and anti-money laundering (AML) use cases is a strong plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work on more than one initiative in parallel, with good time and priority management.
- Good communication, organizational, and test reporting skills.
Preferred:
- Experience working in cloud or hybrid cloud environments (GCP).
- Familiarity with Continuous Integration/Continuous Deployment (CI/CD) pipelines and automated performance testing frameworks.
- Hands-on experience using test and defect management tools (JIRA, HP ALM, Zephyr).
- ISTQB or performance testing certifications are a plus.
- Financial services domain knowledge.
- Experience working in and with a distributed vendor model.
How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
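Analyzing performance test results typically means reducing raw response times to percentile metrics before looking for bottlenecks. A stdlib-only sketch of that reduction (the sample latencies are invented; LoadRunner Analysis produces equivalent summaries):

```python
import statistics

def summarize_latencies(samples_ms):
    """Reduce raw response times (ms) to the headline metrics of a performance report.
    statistics.quantiles with n=100 yields 99 cut points: index 49 is p50, 94 is p95."""
    cuts = statistics.quantiles(samples_ms, n=100)
    return {
        "p50": cuts[49],
        "p95": cuts[94],
        "max": max(samples_ms),
        "mean": statistics.fmean(samples_ms),
    }

# One slow outlier (1100 ms) barely moves the median but dominates mean and p95,
# which is why percentile reporting is preferred over averages.
report = summarize_latencies([120, 130, 125, 1100, 140, 128, 122, 135, 131, 127])
```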
Posted 3 weeks ago
8.0 - 12.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Summary
The Test Data Management (TDM) Architect is responsible for end-to-end TDM strategy, architecture, and governance across enterprise applications. This role leads the design and implementation of scalable test data provisioning, masking, and synthetic data generation solutions, and manages test data across non-production environments, ensuring compliance, consistency, and readiness to support functional, integration, performance, and automation testing, especially in complex enterprise environments with integrated systems such as Salesforce, SAP, and others. The architect works closely with testing, development, compliance, DBA, and DevOps teams in a delivery environment.
Required Skills:
- 10+ years of experience in TDM or QA architecture.
- Proficiency in stakeholder and program management.
- Serve as a TDM SME for program-level and cross-functional initiatives.
- Conduct customer and stakeholder meetings, driving technical discussions.
- Strong capability in devising and implementing test data strategies and solutions.
- Develop and execute robust TDM strategies and solutions for large migration programs.
- Strong experience in handling Test Data Management projects.
- Understand the business requirements from a test data perspective.
- Experience in managing test data scenarios and requirement-gathering processes, as well as estimating the same.
- Experience in pre-sales, including crafting proposals, solutions, and estimates, and creating additional business in the Test Data Management area.
Technical Expertise:
- Strong expertise in data masking, data subsetting, discovery of sensitive data, synthetic data creation, data virtualization, and self-service portals.
- Hands-on experience with database technologies such as DB2, Oracle, and MS SQL for enterprise applications.
- Good understanding of data compliance and regulatory standards (PII/PHI/PCI, GDPR, HIPAA, etc.) and experience in implementing streamlined test data refresh processes.
- Advanced scripting skills using SQL, PL/SQL, SSIS, and JCL are advantageous.
- Integrating TDM practices into Agile/CI-CD environments.
- Proficiency and hands-on experience with Test Data Management tools such as Broadcom CA TDM, IBM Optim, GenRocket, Informatica TDM, Delphix, and K2View.
- Experience with REST/SOAP APIs for test data automation.
Posted 3 weeks ago
8.0 - 10.0 years
7 - 11 Lacs
pune
Work from Office
Role Purpose
The purpose of this role is to provide solutions and bridge the gap between technology and business know-how to deliver any client solution.
Do
1. Bridging the gap between project and support teams through techno-functional expertise
For a new business implementation project, drive the end-to-end process from business requirement management to integration, configuration, and production deployment.
Check the feasibility of new change requirements and provide an optimal solution to the client with clear timelines.
Provide techno-functional solution support for all new business implementations while building the entire system from scratch.
Support the solutioning team across architectural design, coding, testing, and implementation.
Understand the functional design as well as the technical design and architecture to be implemented on the ERP system.
Customize, extend, modify, localize, or integrate with the existing product by means of coding, testing, and production.
Implement the business processes and requirements on the underlying ERP technology to translate them into ERP solutions.
Write code as per development standards and decide upon the implementation methodology.
Provide product support and maintenance to clients for a specific ERP solution, and resolve day-to-day queries and technical problems that may arise.
Create and deploy automation tools and solutions to ensure process optimization and increased efficiency.
Sync the technical and functional requirements of the project, and provide solutioning and advice to the client or internal teams accordingly.
Support the on-site manager with the necessary details with regard to any change, and provide off-site support.
2. Skill upgradation and competency building
Clear Wipro exams and internal certifications from time to time to upgrade skills.
Attend trainings and seminars to sharpen knowledge in the functional/technical domain.
Write papers, articles, and case studies, and publish them on the intranet.
Mandatory Skills: Data Obfuscation.
Experience: 8-10 Years.
Posted 3 weeks ago
10.0 - 15.0 years
11 - 15 Lacs
bengaluru
Work from Office
Educational Requirements
Bachelor of Engineering
Service Line
Infosys Quality Engineering
Responsibilities
The Test Data Management (TDM) architect is responsible for end-to-end TDM strategy, architecture, and governance across enterprise applications. This role leads the design and implementation of scalable test data provisioning, masking, and synthetic data generation solutions, and manages test data across non-production environments, ensuring compliance, consistency, and readiness to support functional, integration, performance, and automation testing, especially in complex enterprise environments with integrated systems such as Salesforce, SAP, and others. The architect works closely with testing, development, compliance, DBA, and DevOps teams in a delivery environment.
Additional Responsibilities (Preferred Skills):
Experience with data virtualization.
Proficiency in scheduling tools like Control-M and ESP.
Salesforce data model and metadata familiarity.
SAP data model and table knowledge.
Experience with cloud platforms.
Knowledge of DevOps tooling (Jenkins, Git, Azure DevOps).
Expertise in data migration testing and ETL testing techniques.
Scripting knowledge (Python, Shell) for custom provisioning solutions.
Technical and Professional Requirements:
Strong expertise in data masking, data subsetting, discovery of sensitive data, synthetic data creation, data virtualization, and self-service portals.
Hands-on experience with database technologies such as DB2, Oracle, and MS SQL for enterprise applications.
Good understanding of data compliance and regulatory standards (PII/PHI/PCI, GDPR, HIPAA, etc.), and experience implementing streamlined test data refresh processes.
Advanced scripting skills using SQL, PL/SQL, SSIS, and JCL are advantageous.
Experience integrating TDM practices into Agile/CI-CD environments.
Proficiency and hands-on experience with Test Data Management tools such as Broadcom CA TDM, IBM Optim, GenRocket, Informatica TDM, Delphix, and K2View.
Experience with REST/SOAP APIs for test data automation.
Preferred Skills:
Technology->Open System->Open System - ALL->Python
Technology->Cloud Platform->Cloud Platform - ALL
Technology->Data Services Testing->Data Warehouse Testing->ETL tool
Technology->Data Services Testing->Test Data Management
Technology->DevOps->DevOps Architecture Consultancy
Technology->DevOps->Continuous Testing
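Data subsetting, listed among the technical requirements, means extracting a small but referentially consistent slice of production-shaped data. A toy sketch of the idea (table shapes and sizes are invented for illustration): pick a driver set of parent rows, then keep only the child rows that reference them, so no foreign key in the subset dangles.

```python
# Invented parent/child tables standing in for production-shaped data.
customers = [{"id": i} for i in range(1, 11)]
orders = [{"order_id": o, "customer_id": (o % 10) + 1} for o in range(1, 31)]

# Driver set: a small sample of customers anchors the subset.
sampled_ids = {c["id"] for c in customers[:3]}

# Keep only orders whose foreign key points into the sampled customers,
# so the subset stays referentially consistent.
subset_orders = [o for o in orders if o["customer_id"] in sampled_ids]
```

TDM tools generalize this by walking the full foreign-key graph rather than a single parent-child pair.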
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You will be working as an Oracle HCM Quality Analyst in Bangalore. As a manual QA professional, you should have a minimum of 4 years of hands-on testing experience. Your responsibilities will include creating test data based on model data, automating test data creation, using tools for data masking, understanding the scope of testing by meeting with system users, collaborating with software developers and project support teams, writing and executing test scripts, running manual and automated tests, and ensuring effective communication within the team. You should possess strong analytical skills to derive test scenarios for maximum test coverage. Experience in all phases of testing activities for UI, middleware, and API-based applications is required. Additionally, you must have hands-on experience in writing complex SQL queries, reading and interpreting JSON files, and working with Scrum teams. Experience with APM tools and log-tracing tools like Splunk is essential for this role. Nice-to-have skills include API test experience using tools like Postman or SoapUI, familiarity with automation tools, build tools like Maven/Gradle, Git, and CI/CD tools like Jenkins. Knowledge of tools such as JIRA with Adaptavist/Zephyr, AppDynamics or similar, and Splunk is a must. Familiarity with PostgreSQL, Selenium with Java, and other tools is considered advantageous. If you meet the above requirements and are keen to work in a hybrid environment with 3 days in the office, please apply for this position. Immediate joiners will be preferred for this role.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You should have at least 5 years of experience in HCM Release Management for Oracle HCM components such as EXTRACT, ALERT, BIP, and Reports. You must be proficient in utilizing Oracle CSM and FSM tools for automated release management. Additionally, prior experience in managing Oracle POD, instance strategy, P2T, T2T, and data masking is required. It is essential that you have experience in certificate management for real-time integrations and in automating user, role, and area-of-responsibility creation in Oracle HCM. Experience with LBAC feature implementation and integration with SSO tools like Okta is highly beneficial. You should have hands-on experience in configuring Oracle HCM role-based security across functional areas, including HCM and Recruiting. Proficiency in Oracle Cloud HCM security setup and modifications related to roles, permissions, and data security is necessary. This includes building custom roles based on the delivered roles provided in the Oracle product. Nice-to-have skills include prior experience in implementing Continuous Integration and Continuous Deployment with Oracle HCM and Oracle Integration Cloud, as well as automating user and AOR assignment with enterprise systems.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a Platform Transition Manager, you will be responsible for developing and maintaining project timelines, migration playbooks, and contingency plans. You will supervise all stages of environment setup, configuration, interface validation, and dry-run execution to ensure smooth transitions. It will be your duty to ensure that environments (UAT, Pre-Prod, Prod) are fully configured, data-masked, and validated for readiness before each phase. You will work closely with solution architects and development teams to configure OBP modules as per bank-specific use cases. Facilitating customization reviews, ensuring regulatory compliance, and alignment with internal policies will be part of your functional and technical oversight. Additionally, you will review and validate functional specs and mapping documents from legacy to OBP standards. Collaboration with business users, operations teams, and compliance units will be essential for gathering requirements and conducting solution walkthroughs. You will drive UAT planning, execution, and signoffs to ensure business readiness and smooth cutover. Supporting country-specific workshops and playback sessions during scoping studies to gather detailed functional and regulatory requirements will also be a key aspect of your role. In terms of vendor and third-party coordination, you will liaise with Oracle and Profinch implementation partners to track deliverables, manage issue logs, and ensure SLAs are met. Coordination of joint workshops with vendors to validate design decisions, data transformation rules, and exception handling flows will also fall under your responsibilities. Risk mitigation and controls will be a critical part of your role, where you will implement strong controls to manage data integrity, reconciliation accuracy, and fallback procedures. Proactively addressing gaps in audit, compliance, or operational processes identified during the transition will be crucial. 
Defining production fallback strategies and rollback scenarios for cutover readiness will also be part of your tasks. As part of knowledge management and team enablement, you will create and maintain platform documentation, process manuals, and user guides. Supporting internal teams through training sessions, walkthroughs, and post-go-live hypercare initiatives will be essential. You will maintain traceability matrices for migrated functionalities, ensuring all user journeys are covered end-to-end. During the post-migration stabilization and optimization phase, you will lead the stabilization post go-live, including defect triage, SLA monitoring, and performance tuning. Establishing production monitoring controls and dashboards to track volumes, transaction health, and alerts will be part of your responsibilities. Continuously identifying opportunities for streamlining operations, reducing manual interventions, and enhancing automation will also be crucial. Ensuring system readiness and production support enablement will be another key aspect of your role. You will validate production readiness through infrastructure sizing reviews, HA/DR testing, and system health checks. Defining SOPs for Level 1 and Level 2 production support teams and ensuring knowledge transfer is completed will also be part of your responsibilities. Participation in production release dry runs, cutover rehearsals, and post-release validations will be required. You will act as a senior escalation point during post-deployment hypercare and issue resolution cycles. Clear communication with business users and other teams, along with strong analytical and communication skills, will be necessary for success in this role. The leading financial institution in MENA is looking for individuals who think like challengers, startups, and innovators in the banking and finance sector.
If you are passionate about delivering superior service to clients, leading with innovation, and contributing to the community through responsible banking, this role offers you the opportunity to pioneer key innovations and developments in banking and financial services. Join us in our mission to inspire more people to Rise every day and make a meaningful impact in the industry.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled Data Governance Engineer to take charge of developing and overseeing robust data governance frameworks on Google Cloud Platform (GCP). Your role will involve leveraging your expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure the implementation of high-quality, secure, and compliant data practices aligned with organizational objectives. With a minimum of 4 years of experience in data governance, data management, or data security, you should possess hands-on proficiency with Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, Dataproc, and Google Data Catalog. Additionally, a strong command over metadata management, data lineage, and data quality tools like Collibra and Informatica is crucial. A deep understanding of data privacy laws and compliance frameworks, coupled with proficiency in SQL and Python for governance automation, is essential. Experience with RBAC, encryption, data masking techniques, and familiarity with ETL/ELT pipelines and data warehouse architectures will be advantageous. Your responsibilities will include developing and executing comprehensive data governance frameworks with a focus on metadata management, lineage tracking, and data quality. You will be tasked with defining, documenting, and enforcing data governance policies, access control mechanisms, and security standards using GCP-native services like IAM, DLP, and KMS. Managing metadata repositories using tools such as Collibra, Informatica, Alation, or Google Data Catalog will also be part of your role. Collaborating with data engineering and analytics teams to ensure compliance with regulatory standards like GDPR, CCPA, SOC 2, and automating processes for data classification, monitoring, and reporting using Python and SQL will be key responsibilities. 
Supporting data stewardship initiatives and optimizing ETL/ELT pipelines and data workflows to adhere to governance best practices will also be part of your role. At GlobalLogic, we offer a culture of caring, emphasizing inclusivity and personal growth. You will have access to continuous learning and development opportunities, engaging and meaningful work, and a healthy work-life balance. Join our high-trust organization where integrity is paramount, and collaborate with us to engineer innovative solutions that have a lasting impact on industries worldwide.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
We are looking for a skilled Data Governance Engineer to spearhead the development and supervision of robust data governance frameworks on Google Cloud Platform (GCP). You should have a deep understanding of data management, metadata frameworks, compliance, and security within cloud environments to ensure the adoption of high-quality, secure, and compliant data practices aligned with organizational objectives.
The ideal candidate should possess:
- Over 4 years of experience in data governance, data management, or data security.
- Hands-on expertise with Google Cloud Platform (GCP) tools like BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Proficiency in metadata management, data lineage, and data quality tools such as Collibra and Informatica.
- Comprehensive knowledge of data privacy laws and compliance frameworks.
- Strong skills in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.
Your main responsibilities will include:
- Developing and implementing comprehensive data governance frameworks emphasizing metadata management, lineage tracking, and data quality.
- Defining, documenting, and enforcing data governance policies, access control mechanisms, and security standards utilizing GCP-native services like IAM, DLP, and KMS.
- Managing metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborating with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automating processes for data classification, monitoring, and reporting using Python and SQL.
- Supporting data stewardship initiatives, including the creation of data dictionaries and governance documentation.
- Optimizing ETL/ELT pipelines and data workflows to adhere to governance best practices.
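The responsibility of automating data classification with Python can be illustrated with a simple rule-based column scanner. This is a hypothetical sketch with invented pattern names; production deployments would rely on services like Cloud DLP or Data Catalog rather than hand-written patterns:

```python
import re

# Hypothetical classification rules keyed by sensitivity label.
PII_PATTERNS = {
    "email": re.compile(r"email|e_mail"),
    "phone": re.compile(r"phone|mobile"),
    "national_id": re.compile(r"ssn|aadhaar|national_id"),
}

def classify_columns(columns):
    """Tag column names that look sensitive, for review by data stewards."""
    tags = {}
    for col in columns:
        for label, pattern in PII_PATTERNS.items():
            if pattern.search(col.lower()):
                tags[col] = label  # flag the column with its matched label
    return tags

tags = classify_columns(["customer_email", "order_total", "phone_number"])
```

The output of a scanner like this would feed a stewardship review queue rather than drive masking automatically, since name-based rules produce false positives and misses.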
At GlobalLogic, we offer:
- A culture of caring that prioritizes inclusivity, acceptance, and personal connections.
- Continuous learning and development opportunities to enhance your skills.
- Engagement in interesting and meaningful work with cutting-edge solutions.
- Balance and flexibility to help you integrate work and life effectively.
- A high-trust organization committed to integrity and ethical practices.
GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to world-renowned companies, focusing on creating innovative digital products and experiences. Join us to collaborate on transforming businesses through intelligent products, platforms, and services.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
You have an exciting opportunity to join as a DevSecOps engineer in Sydney. As a DevSecOps engineer, you should have 3+ years of extensive Python proficiency and 3+ years of Java experience. Your role will also require extensive exposure to technologies such as JavaScript, Jenkins, CodePipeline, CodeBuild, and the AWS ecosystem, including the AWS Well-Architected Framework, Trusted Advisor, GuardDuty, SCP, SSM, IAM, and WAF. It is essential for you to have a deep understanding of automation, quality engineering, architectural methodologies, principles, and solution design. Hands-on experience with Infrastructure-as-Code tools like CloudFormation and CDK is preferred for automating deployments in AWS. Moreover, familiarity with operational observability, including log aggregation, application performance monitoring, deploying auto-scaling and load-balanced/highly available applications, and managing certificates (client-server, mutual TLS, etc.) is crucial for this role. Your responsibilities will include improving the automation of security controls, working closely with the consumer showback team on defining processes and system requirements, and designing and implementing updates to the showback platform. You will collaborate with STO/account owners to uplift the security posture of consumer accounts, work with the onboarding team to ensure security standards and policies are correctly set up, and implement enterprise minimum security requirements from the Cloud Security LRP, including data masking, encryption monitoring, perimeter protections, ingress/egress uplift, and integration of SailPoint for SSO management. If you have any questions or need further clarification, feel free to ask.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also design and build ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, BigQuery, Cloud Data Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will use the GCP CLI/gsutil for operations, and scripting languages like Python and SQL to enhance data processing efficiency. Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will utilize GCP tools such as Cloud Data Catalog and Cloud KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
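Masking at the view layer, one of the data governance practices mentioned above, can be sketched in plain SQL: analysts query a view that redacts the sensitive part of a column while the base table stays restricted. SQLite stands in here for a cloud warehouse such as BigQuery, and the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a@x.com"), (2, "b@y.com")])

# Masking view: expose only the domain part of the email to analysts.
conn.execute("""
    CREATE VIEW customers_masked AS
    SELECT id,
           '***@' || substr(email, instr(email, '@') + 1) AS email
    FROM customers
""")
rows = conn.execute(
    "SELECT email FROM customers_masked ORDER BY id").fetchall()
```

In a warehouse, access controls would grant analysts the view but not the underlying table, so the masking cannot be bypassed.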
Posted 1 month ago
5.0 - 10.0 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
We are hiring for various skills for our client Ascendion, Bangalore. Shortlisted candidates will receive interview invites from the HR team. (Note: If you have already received a call from other vendors or from Ascendion, kindly do not apply.) If you're ready to take the next step in your career, share your profile with us at hiring@radonglobaltech.com
Job Title: Test Data Management Engineer (Delphix Specialist)
Company: Ascendion
Location: Hyderabad / Bangalore
Job Type: Full-time / Contract
Experience: 5-10 Years
Availability: Immediate to Quick Joiners Preferred
Job Summary: Ascendion is seeking a skilled Test Data Management Engineer with strong hands-on experience in Delphix and test data provisioning for healthcare systems. The ideal candidate will have expertise in data masking, synthetic data generation, and a deep understanding of TDM tools.
Key Responsibilities:
Work extensively with Delphix TDM tools for data masking and de-identification.
Handle test data provisioning activities, especially in healthcare environments.
Implement synthetic data generation strategies.
Align test data provisioning with stakeholder requirements and delivery roadmaps.
Collaborate across teams for fast and efficient data delivery.
Required Skills:
5+ years of experience with Test Data Management (TDM) tools, specifically Delphix.
Proven experience in data masking, de-identification, and test data provisioning.
At least 2 years of experience with synthetic data generation.
Nice to have:
Working knowledge of Python and .NET.
Exposure to cloud platforms, CI/CD pipelines, and data integration workflows is an added advantage.
Posted 1 month ago