1.0 - 4.0 years
1 - 5 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Informatica TDM:
1. Data Discovery, Data Subsetting, and Data Masking
2. Data Generation
3. Complex masking and generation rule creation
4. Performance tuning for Informatica mappings
5. Debugging with Informatica PowerCenter
6. Data Migration
Skills and Knowledge:
1. Informatica TDM development experience in data masking, discovery, data subsetting, and data generation is a must
2. Experience working with flat files, MS SQL, Oracle, and Snowflake
3. Debugging using Informatica PowerCenter
4. Experience in Tableau is an added advantage
5. Basic knowledge of IICS
6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR
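Informatica TDM implements the masking and generation capabilities above through its own engine and GUI, but the core idea behind consistent masking can be sketched in plain Python. This is a generic illustration, not Informatica's method; the file name, column name, and key below are assumptions. Deterministic substitution (same input, same masked output) is what keeps masked values consistent across related tables.

```python
import csv
import hashlib
import hmac

SECRET = b"rotate-me"  # illustrative key; in practice managed in a vault

def mask_value(value: str, domain: list[str]) -> str:
    """Deterministically map a value into a substitution domain.

    The same input always yields the same masked output, which keeps
    foreign-key relationships intact across masked tables.
    """
    digest = hmac.new(SECRET, value.encode(), hashlib.sha256).digest()
    index = int.from_bytes(digest[:4], "big") % len(domain)
    return domain[index]

FIRST_NAMES = ["Asha", "Ravi", "Meera", "Kiran", "Tara"]  # sample substitution set

# "customers.csv" and the "first_name" column are hypothetical.
with open("customers.csv", newline="") as src, \
     open("customers_masked.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["first_name"] = mask_value(row["first_name"], FIRST_NAMES)
        writer.writerow(row)
```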
Posted 3 months ago
7.0 - 11.0 years
19 - 30 Lacs
Pune
Work from Office
Seeking a Database Engineer with 10+ years of experience in PostgreSQL, Liquibase, data masking, and performance tuning, and strong expertise in real-time data sync and database optimization within financial systems. CC: recruitment@fortitudecareer.com. Flexi working / work from home.
Posted 3 months ago
6.0 - 10.0 years
1 - 5 Lacs
Pune
Work from Office
Job Information: Job Opening ID ZR_1661_JOB | Date Opened 17/12/2022 | Industry Technology | Job Type | Work Experience 6-10 years | Job Title Informatica TDM Developer | City Pune | Province Maharashtra | Country India | Postal Code 411001 | Number of Positions 4 | Location Pune, Bangalore, Hyderabad
Informatica TDM:
1. Data Discovery, Data Subsetting, and Data Masking
2. Data Generation
3. Complex masking and generation rule creation
4. Performance tuning for Informatica mappings
5. Debugging with Informatica PowerCenter
6. Data Migration
Skills and Knowledge:
1. Informatica TDM development experience in data masking, discovery, data subsetting, and data generation is a must
2. Experience working with flat files, MS SQL, Oracle, and Snowflake
3. Debugging using Informatica PowerCenter
4. Experience in Tableau is an added advantage
5. Basic knowledge of IICS
6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR
Posted 3 months ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
6-10 years of IT experience. Good Calypso knowledge (v13 and above). Excellent Core Java knowledge (JDK 7 and above). Good knowledge of OTC markets and derivatives. Excellent design skills and problem-solving capability. Exposure to CI/CD pipelines and automation frameworks. SQL and more-than-basic database operations. Good Unix and Python knowledge. Ability to identify and resolve complex problems and to work under tight timelines.
Posted 3 months ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate seamless flow of information across systems.
Responsibilities: Experience in data architecture and engineering. Proven expertise with the Snowflake data platform. Strong understanding of ETL/ELT processes and data integration. Experience with data modeling and data warehousing concepts. Familiarity with performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Cloud & Data Architecture: AWS, Snowflake. ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions. Big Data & Analytics: Athena, Presto, Hadoop. Database & Storage: SQL, SnowSQL. Security & Compliance: IAM, KMS, Data Masking.
Preferred technical and professional experience: Cloud Data Warehousing: Snowflake (data modeling, query optimization). Data Transformation: dbt (Data Build Tool) for ELT pipeline management. Metadata & Data Governance: Alation (data catalog, lineage, governance).
Posted 3 months ago
5.0 - 10.0 years
22 - 27 Lacs
Navi Mumbai
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate seamless flow of information across systems.
Responsibilities: Experience in data architecture and engineering. Proven expertise with the Snowflake data platform. Strong understanding of ETL/ELT processes and data integration. Experience with data modeling and data warehousing concepts. Familiarity with performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Cloud & Data Architecture: AWS, Snowflake. ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions. Big Data & Analytics: Athena, Presto, Hadoop. Database & Storage: SQL, SnowSQL. Security & Compliance: IAM, KMS, Data Masking.
Preferred technical and professional experience: Cloud Data Warehousing: Snowflake (data modeling, query optimization). Data Transformation: dbt (Data Build Tool) for ELT pipeline management. Metadata & Data Governance: Alation (data catalog, lineage, governance).
Posted 3 months ago
5 - 10 years
30 - 35 Lacs
Pune
Work from Office
About the Role
Job Title: QA Performance Testing, AVP
Location: Pune, India
Role Description
DB Anti-Financial Crime (AFC): Our AFC team is responsible for protecting Deutsche Bank from financial and reputational losses incurred by financial crimes by assessing, controlling, and mitigating risks. Risk types related to Anti-Financial Crime are consolidated in a comprehensive and effective risk management framework that covers Anti-Money Laundering, Sanctions & Embargoes, Anti-Bribery & Corruption, as well as Anti-Fraud & Investigations.
We are seeking a Senior Performance Test Engineer to lead performance validation efforts for critical transaction monitoring systems in a financial services environment. You will design and execute load, stress, and scalability tests using Micro Focus LoadRunner (VuGen) and manage large-scale synthetic data creation to mirror complex production data models. The role requires strong collaboration with development, infrastructure, and business teams to ensure system stability and performance under high transaction volumes. Experience in financial services performance testing, data modeling, and synthetic data generation is essential.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for 35 yrs. and above.
Your key responsibilities
Design, develop, and execute performance test scripts using Micro Focus LoadRunner (VuGen). Model complex transaction data flows to simulate real-world banking transaction volumes and patterns. Collaborate with architecture, development, and infrastructure teams to define performance benchmarks and SLAs. Create and manage large synthetic datasets to accurately reflect production-like conditions. Analyze performance test results to identify bottlenecks and provide actionable recommendations. Conduct scalability, stress, endurance, and load tests in large-scale environments. Develop performance test strategies and contribute to test plans and test cases for performance validation. Document findings and report to stakeholders with clear metrics and analysis. Mentor and guide junior performance engineers when needed.
Your skills and experience
Bachelor's degree in Computer Science, Information Systems, or a related field. 5+ years of experience in performance testing, with a focus on financial or transaction monitoring applications. Proven expertise with Micro Focus LoadRunner (VuGen) and strong scripting experience (C, Java, or protocol-specific scripting). Experience with large data models and understanding of relational databases (Oracle, SQL Server, GCP BigQuery, CDSW, Hive, etc.). Strong understanding of synthetic data generation techniques and data masking principles. Knowledge of monitoring tools (e.g., Dynatrace, AppDynamics, Grafana, OEM, GCP console) for correlation and root cause analysis. Familiarity with transaction monitoring systems, regulatory compliance, and anti-money laundering (AML) use cases is a strong plus. Excellent analytical, problem-solving, and communication skills. Ability to work in parallel on more than one initiative, with good time and priority management. Good communication, organizational, and test reporting skills.
Preferred
Experience working in cloud or hybrid cloud environments (GCP). Familiarity with Continuous Integration/Continuous Deployment (CI/CD) pipelines and automated performance testing frameworks. Hands-on experience using test and defect management tools (JIRA, HP ALM, Zephyr). ISTQB or performance testing certifications are a plus. Financial services domain knowledge. Experience working in and with a distributed vendor model.
How we'll support you
Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
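For context on the synthetic-data responsibility described in this posting, here is a minimal, hypothetical Python sketch of generating a production-like transaction dataset for load testing. The schema, row count, and distribution parameters are illustrative assumptions, not the bank's actual data model; a skewed log-normal amount distribution is one common way to mimic real payment workloads.

```python
import csv
import random
from datetime import datetime, timedelta

random.seed(42)  # reproducible runs make load-test results comparable

CURRENCIES = ["EUR", "USD", "GBP", "INR"]
N_ROWS = 1_000_000  # scale toward the target production volume

def synthetic_amount() -> float:
    # Many small retail payments, a long tail of large ones.
    return round(random.lognormvariate(4.0, 1.5), 2)

start = datetime(2024, 1, 1)
with open("synthetic_transactions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["txn_id", "timestamp", "account_id", "amount", "currency"])
    for i in range(N_ROWS):
        ts = start + timedelta(seconds=random.randint(0, 86_400 * 30))
        writer.writerow([
            f"TXN{i:010d}",
            ts.isoformat(),
            f"ACC{random.randint(1, 50_000):06d}",  # hot accounts repeat
            synthetic_amount(),
            random.choice(CURRENCIES),
        ])
```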
Posted 4 months ago
2.0 - 7.0 years
6 - 12 Lacs
Bengaluru
Work from Office
Job Description: Product Specialist - Core Banking Data Masking (Mage Data Product)
Location: Bengaluru (Work from Office) | Notice Period: Immediate Joiners or Max 30 Days | Job Type: Full-Time | Experience Level: 2-5 Years (Mid-Level / Senior) | Department: Data Security / IT Operations / Core Banking
About Yethi Consulting: Yethi Consulting is one of India's fastest-growing pure-play testing services providers, focused on the BFSI domain. We have partnered with 130+ banks across 22 countries, delivering quality assurance solutions to some of the world's leading financial institutions.
About Tenjin Products: Tenjin is our flagship product suite designed to automate testing and assurance processes for banking and financial systems, ensuring speed, accuracy, and compliance.
Job Summary: We are seeking a Product Specialist with hands-on experience in Mage Data for data masking in core banking environments. The ideal candidate will handle deployment, configuration, and operational support of data masking solutions, ensuring protection of sensitive data (PII, PCI, PHI) across banking and enterprise applications.
Key Responsibilities: Prepare environments for product installation. Install, configure, and validate Mage Data masking solutions across Dev/Test environments. Establish secure connectivity between Mage Data and Core Banking applications/databases (Oracle, etc.). Design and implement data masking templates for sensitive fields. Execute Dynamic Data Masking (DDM) after completing pre-check validations. Configure and monitor Database Activity Monitoring (DAM) alerts. Perform L1 troubleshooting for Mage Data masking processes in core banking systems. Collaborate with application owners, DBAs, and security teams to ensure compliance with data protection policies. Document all configurations, processes, and resolutions as per internal standards.
Required Skills & Qualifications: Hands-on experience with Mage Data and understanding of Core Banking Systems. Strong knowledge of data privacy regulations (GDPR, HIPAA, etc.) and data masking methods. Proficiency in database technologies: Oracle, MS SQL Server, MySQL, DB2. Experience with secure application connectivity and system integration. Ability to perform pre-check validations and implement DDM. Basic scripting knowledge (Shell, Python) for automation and troubleshooting. Strong communication and documentation skills.
Preferred Qualifications: Certifications in Data Security or Data Privacy (CIPP, CDPSE, etc.). Experience with Data Discovery, DAM tools, or other enterprise data protection technologies.
If your profile matches the above requirements and you are interested, kindly share your updated resume with Lokesh.ms@yethi.in
Posted Date not available
7.0 - 11.0 years
12 - 16 Lacs
Gurugram
Work from Office
About The Role
Project Role: Security Delivery Lead
Project Role Description: Leads the implementation and delivery of Security Services projects, leveraging our global delivery capability (method, tools, training, assets).
Must have skills: Data Loss Prevention (DLP)
Good to have skills: Data Masking, Data Obfuscation
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Obfuscation Specialist, you will design, implement, and manage data obfuscation strategies that protect sensitive and personally identifiable information (PII) across development, testing, and production environments.
Roles & Responsibilities: Design and implement data masking, anonymization, and tokenization techniques to protect sensitive information across systems. Apply static and dynamic data masking in databases, applications, and test environments. Collaborate with data governance teams to identify and classify sensitive data across structured (SQL, ERP, CRM) and unstructured (documents, logs) sources. Evaluate and deploy data obfuscation tools (e.g., Informatica, Delphix, IBM InfoSphere Optim, Microsoft Purview). Automate obfuscation processes using scripting tools (e.g., Python, PowerShell, Bash); a hedged example of such a script follows this posting. Ensure all obfuscation practices align with data protection regulations (e.g., GDPR, CCPA, HIPAA). Partner with developers, QA, DBAs, and data engineers to deliver masked datasets for non-production environments. Provide training and guidance on proper handling of masked or anonymized data.
Professional & Technical Skills: Experience in data masking, obfuscation, or data privacy engineering roles. Strong knowledge of data masking tools, obfuscation methods, and databases. Familiarity with data privacy laws (GDPR, CCPA, etc.). Strong analytical and investigation skills. Experience working with cloud platforms (Azure, AWS, GCP).
Additional Information: 7 or more years of experience implementing and managing data masking solutions. This position is based at our Bangalore office. A 15-year full-time education is required.
Qualification: 15 years full time education
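As a hedged illustration of the scripted-obfuscation responsibility above, the sketch below shows tokenization with a persistent mapping: unlike one-way hashing, the vault allows controlled re-identification where policy permits it. The vault path, record shape, and field name are hypothetical.

```python
import json
import secrets

VAULT_PATH = "token_vault.json"  # illustrative; a real vault is access-controlled

def load_vault() -> dict:
    try:
        with open(VAULT_PATH) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def tokenize(value: str, vault: dict) -> str:
    """Replace a sensitive value with a stable random token.

    Reversible only via the vault, supporting controlled
    re-identification when policy allows it.
    """
    if value not in vault:
        vault[value] = "TKN-" + secrets.token_hex(8)
    return vault[value]

vault = load_vault()
records = [{"email": "a.user@example.com"}, {"email": "a.user@example.com"}]
for rec in records:
    rec["email"] = tokenize(rec["email"], vault)

# Same input yields the same token, so joins across datasets still work.
assert records[0]["email"] == records[1]["email"]

with open(VAULT_PATH, "w") as f:
    json.dump(vault, f)
```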
Posted Date not available
6.0 - 10.0 years
1 - 5 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Informatica TDM:
1. Data Discovery, Data Subsetting, and Data Masking
2. Data Generation
3. Complex masking and generation rule creation
4. Performance tuning for Informatica mappings
5. Debugging with Informatica PowerCenter
6. Data Migration
Skills and Knowledge:
1. Informatica TDM development experience in data masking, discovery, data subsetting, and data generation is a must
2. Experience working with flat files, MS SQL, Oracle, and Snowflake
3. Debugging using Informatica PowerCenter
4. Experience in Tableau is an added advantage
5. Basic knowledge of IICS
6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR.
Posted Date not available
5.0 - 10.0 years
25 - 30 Lacs
New Delhi, Gurugram
Work from Office
As a Google DLP Specialist, you will be responsible for implementing, managing, and optimizing Google Cloud's Data Loss Prevention (DLP) services to protect sensitive information and ensure compliance with data privacy regulations. Responsibilities: Design, deploy, and maintain Google Cloud DLP solutions across cloud environments. Identify, classify, and monitor sensitive data (PII, PHI, PCI, etc.) within GCP resources. Develop and enforce DLP policies to prevent unauthorized data exposure. Integrate DLP with other security tools and workflows. Configure data masking, redaction, tokenization, and encryption as needed. Monitor DLP alerts and investigate incidents of potential data loss. Collaborate with compliance and IT teams to align DLP controls with regulatory requirements (GDPR, HIPAA, etc.).
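For reference, the redaction work described in these DLP postings maps onto the google-cloud-dlp Python client. The sketch below follows the documented deidentify_content pattern (replace detected info types with their type name); the project ID and input text are placeholders, and the exact request shape should be verified against current Google Cloud documentation.

```python
from google.cloud import dlp_v2

project_id = "my-test-project"  # placeholder project
client = dlp_v2.DlpServiceClient()
parent = f"projects/{project_id}/locations/global"

# Replace any detected email address with the literal "[EMAIL_ADDRESS]".
response = client.deidentify_content(
    request={
        "parent": parent,
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {"primitive_transformation": {"replace_with_info_type_config": {}}}
                ]
            }
        },
        "item": {"value": "Customer email: a.user@example.com"},
    }
)
print(response.item.value)  # e.g. "Customer email: [EMAIL_ADDRESS]"
```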
Posted Date not available
5.0 - 10.0 years
25 - 30 Lacs
Kolkata
Work from Office
As a Google DLP Specialist, you will be responsible for implementing, managing, and optimizing Google Cloud's Data Loss Prevention (DLP) services to protect sensitive information and ensure compliance with data privacy regulations. Responsibilities: Design, deploy, and maintain Google Cloud DLP solutions across cloud environments. Identify, classify, and monitor sensitive data (PII, PHI, PCI, etc.) within GCP resources. Develop and enforce DLP policies to prevent unauthorized data exposure. Integrate DLP with other security tools and workflows. Configure data masking, redaction, tokenization, and encryption as needed. Monitor DLP alerts and investigate incidents of potential data loss. Collaborate with compliance and IT teams to align DLP controls with regulatory requirements (GDPR, HIPAA, etc.).
Posted Date not available
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
As a Google DLP Specialist, you will be responsible for implementing, managing, and optimizing Google Cloud's Data Loss Prevention (DLP) services to protect sensitive information and ensure compliance with data privacy regulations. Responsibilities: Design, deploy, and maintain Google Cloud DLP solutions across cloud environments. Identify, classify, and monitor sensitive data (PII, PHI, PCI, etc.) within GCP resources. Develop and enforce DLP policies to prevent unauthorized data exposure. Integrate DLP with other security tools and workflows. Configure data masking, redaction, tokenization, and encryption as needed. Monitor DLP alerts and investigate incidents of potential data loss. Collaborate with compliance and IT teams to align DLP controls with regulatory requirements (GDPR, HIPAA, etc.).
Posted Date not available
5.0 - 10.0 years
25 - 30 Lacs
Mumbai, Pune
Work from Office
As a Google DLP Specialist, you will be responsible for implementing, managing, and optimizing Google Cloud's Data Loss Prevention (DLP) services to protect sensitive information and ensure compliance with data privacy regulations. Responsibilities: Design, deploy, and maintain Google Cloud DLP solutions across cloud environments. Identify, classify, and monitor sensitive data (PII, PHI, PCI, etc.) within GCP resources. Develop and enforce DLP policies to prevent unauthorized data exposure. Integrate DLP with other security tools and workflows. Configure data masking, redaction, tokenization, and encryption as needed. Monitor DLP alerts and investigate incidents of potential data loss. Collaborate with compliance and IT teams to align DLP controls with regulatory requirements (GDPR, HIPAA, etc.).
Posted Date not available
5.0 - 10.0 years
25 - 30 Lacs
Chennai
Work from Office
As a Google DLP Specialist, you will be responsible for implementing, managing, and optimizing Google Cloud's Data Loss Prevention (DLP) services to protect sensitive information and ensure compliance with data privacy regulations. Responsibilities: Design, deploy, and maintain Google Cloud DLP solutions across cloud environments. Identify, classify, and monitor sensitive data (PII, PHI, PCI, etc.) within GCP resources. Develop and enforce DLP policies to prevent unauthorized data exposure. Integrate DLP with other security tools and workflows. Configure data masking, redaction, tokenization, and encryption as needed. Monitor DLP alerts and investigate incidents of potential data loss. Collaborate with compliance and IT teams to align DLP controls with regulatory requirements (GDPR, HIPAA, etc.).
Posted Date not available
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
As a Google DLP Specialist, you will be responsible for implementing, managing, and optimizing Google Cloud's Data Loss Prevention (DLP) services to protect sensitive information and ensure compliance with data privacy regulations. Responsibilities: Design, deploy, and maintain Google Cloud DLP solutions across cloud environments. Identify, classify, and monitor sensitive data (PII, PHI, PCI, etc.) within GCP resources. Develop and enforce DLP policies to prevent unauthorized data exposure. Integrate DLP with other security tools and workflows. Configure data masking, redaction, tokenization, and encryption as needed. Monitor DLP alerts and investigate incidents of potential data loss. Collaborate with compliance and IT teams to align DLP controls with regulatory requirements (GDPR, HIPAA, etc.).
Posted Date not available
10.0 - 15.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Project description
The project will focus on ensuring data privacy and compliance in the client environment by implementing and managing data masking solutions using the Delphix platform. This role involves collaboration with cross-functional teams to secure sensitive data while maintaining data integrity for development, testing, and analytics.
Key Responsibilities
Design and Implementation: Develop and implement robust data masking solutions using the Delphix platform. Analyze data sets to identify sensitive information that requires masking. Create and maintain masking rules, algorithms, and templates for various data environments.
Data Security & Compliance: Ensure sensitive data complies with regulatory requirements, such as GDPR, HIPAA, PCI-DSS, and CCPA. Collaborate with security and compliance teams to establish masking policies and standards.
Testing & Validation: Conduct thorough testing to validate the effectiveness of masking solutions. Troubleshoot and resolve issues related to data masking processes.
Collaboration & Support: Work closely with database administrators, developers, and QA teams to integrate masking into workflows. Provide training and documentation on the Delphix platform and data masking techniques.
Monitoring & Optimization: Continuously monitor and optimize data masking performance. Stay updated on the latest trends and updates in data masking technologies.
Required Skills and Qualifications (must have): Bachelor's degree in Computer Science, Information Technology, or a related field, with 10+ years of IT industry experience. 8-10 years of experience in data masking, data privacy, or a related field. Expertise in the Delphix platform (4+ years), including data virtualization and masking. Strong understanding of database management systems (e.g., Oracle, SQL Server, MySQL); understanding of mainframe databases is an added advantage. Knowledge of data privacy regulations (GDPR, HIPAA, PCI-DSS, etc.). Proficiency in scripting languages such as Java, Python, Shell, or SQL. Excellent problem-solving and troubleshooting skills. Strong communication skills and ability to work in a collaborative team environment.
Preferred Qualifications (nice to have): Mainframe exposure. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with other data masking tools and technologies. Certification in data privacy or data security (e.g., CIPP, CDPSE).
Posted Date not available
5.0 - 10.0 years
7 - 11 Lacs
Noida
Work from Office
We help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services, through high-value, complex services such as Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation. Our employee value proposition (EVP) is about Being Your Best as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.
5+ years of experience in performance testing, with a focus on large-scale systems and applications. Experience in testing batch jobs and messaging queues (Kafka, RabbitMQ). Strong understanding of performance testing tools and technologies (e.g., LoadRunner, JMeter, Gatling, NeoLoad). Must have worked on data masking and data creation processes via a Test Data Management tool (CA/Broadcom). Knowledge of Unix commands to create shell scripts. Hands-on experience with job scheduling tools like Stonebranch and AutoSys. Knowledge of Jenkins and executing PT test cases via a DevOps pipeline. Experience with Agile methodologies and version control systems (e.g., Git). Excellent analytical and problem-solving skills. Effective communication and collaboration skills.
Nice to have: Experience with cloud-based performance testing and monitoring tools. Knowledge of programming languages (e.g., Java, Python). Familiarity with DevOps practices and continuous integration/continuous deployment (CI/CD) pipelines.
Mandatory Competencies: Performance Testing; Performance Tools: JMeter, Gatling; QE: Agile Methodology; Tech: Analytical/Problem Solving; Beh: Communication and Collaboration; QA/QE: QA Automation, Performance Testing (JMeter, LoadRunner, NeoLoad, etc.)
Posted Date not available
11.0 - 14.0 years
35 - 50 Lacs
Hyderabad
Work from Office
Job Summary
Requires a minimum of 5 to 15 years of experience managing Calypso software project implementations, covering interest rate derivatives, commodities, FX options, or equity derivative asset classes, as well as collateral management.
Responsibilities
Requires a minimum of 5 to 15 years of experience managing Calypso software project implementations. Experience with interest rate derivatives, commodities, FX options, or equity derivative asset classes, as well as collateral management. Experience managing Calypso software implementations with financial market utilities such as DTCC SDR, SWIFT, Fedwire, Bloomberg, AcadiaSoft, or TriResolve, and with internal systems such as General Ledger, market and credit risk, and data warehouses. Experience with General Ledger accounting, market data sources, LDAP and Single Sign-On portals, and payment processing.
Certifications Required
Calypso Certified Professional; Financial Risk Manager (FRM)
Posted Date not available
4.0 - 9.0 years
8 - 18 Lacs
Hyderabad, Bengaluru
Hybrid
Greetings from Tech Mahindra!! With reference to your profile on the Naukri portal, we are contacting you to share a better job opportunity for the role of Test Data Engineer with our own organization, Tech Mahindra.
COMPANY PROFILE: Tech Mahindra is an Indian multinational information technology services and consulting company. Website: www.techmahindra.com
We are looking for a Test Data Engineer for our organization. Job Details:
Job Description - Coordinate activities across multiple operations teams, including Storage, OS, and DBA teams. Provide required test data for Performance Test, User Acceptance Test, QA Test, Application Development, and other non-production environments. Expertise in databases: Postgres, Oracle, SQL Server, DB2. Experience in TDM masking, subsetting, and synthetic data concepts and tools. Work with application architects and QA teams to understand data requirements and ensure synthetic test data aligns with business rules. Develop data models and test data subsets from PostgreSQL databases using GenRocket and native SQL techniques. Manage data masking, data obfuscation, and data privacy rules for sensitive data.
Required Skills & Qualifications: 3+ years of experience in Test Data Management (TDM). Hands-on expertise in GenRocket: test data scenario design, receiver configuration, generators, data rules, and integration. Strong proficiency in PostgreSQL: schema design, SQL scripting, data extraction/transformation, and performance tuning. Experience with data masking, synthetic data generation, and data subsetting strategies. Knowledge of test automation frameworks and CI/CD tools (e.g., Jenkins, GitLab CI). Familiarity with data privacy and compliance standards (e.g., GDPR, HIPAA). Experience with REST/SOAP API testing and test data for microservices (desirable). Experience working with cloud-based databases or containerized environments. Exposure to other TDM tools like Delphix, Informatica, or K2View is a plus.
==============================================================
Experience: 4+ years | Education: Any | Work timings: Normal | Mode: Hybrid | Location: Hyderabad/Bangalore | No. of working days: 5
If interested, kindly forward your updated resume with the below details to: ps00874998@techmahindra.com
Total years of experience:
Relevant experience as Data Engineer:
Relevant experience in GenRocket:
Relevant experience in PostgreSQL:
Offer amount (if holding any offer):
Location of offer:
Reason for looking for another offer:
Notice period (if serving, LWD):
Current location:
Preferred location:
CTC:
Expected CTC:
When are you available for the interview? (Time/Date):
How soon can you join?
Best Regards, Prerna Sharma | Business Associate | RMG | Tech Mahindra | PS00874998@TechMahindra.com
Posted Date not available
4.0 - 9.0 years
12 - 20 Lacs
Hyderabad
Hybrid
Job Title: Test Data Engineer | Experience Level: 4+ years | Location: Hyderabad
Job Description - Coordinate activities across multiple operations teams, including Storage, OS, and DBA teams. Provide required test data for Performance Test, User Acceptance Test, QA Test, Application Development, and other non-production environments. Expertise in databases: Postgres, Oracle, SQL Server, DB2. Experience in TDM masking, subsetting, and synthetic data concepts and tools. Work with application architects and QA teams to understand data requirements and ensure synthetic test data aligns with business rules. Develop data models and test data subsets from PostgreSQL databases using GenRocket and native SQL techniques (a sketch of the native-SQL side follows below). Manage data masking, data obfuscation, and data privacy rules for sensitive data.
Required Skills & Qualifications: 3+ years of experience in Test Data Management (TDM). Hands-on expertise in GenRocket: test data scenario design, receiver configuration, generators, data rules, and integration. Strong proficiency in PostgreSQL: schema design, SQL scripting, data extraction/transformation, and performance tuning. Experience with data masking, synthetic data generation, and data subsetting strategies. Knowledge of test automation frameworks and CI/CD tools (e.g., Jenkins, GitLab CI). Familiarity with data privacy and compliance standards (e.g., GDPR, HIPAA). Experience with REST/SOAP API testing and test data for microservices (desirable). Experience working with cloud-based databases or containerized environments. Exposure to other TDM tools like Delphix, Informatica, or K2View is a plus.
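The JD pairs GenRocket with native SQL techniques for subsetting. Below is a hedged sketch of the native-SQL side, using psycopg2 and PostgreSQL's TABLESAMPLE to pull a referentially intact subset; the connection details and the customers/orders tables are hypothetical.

```python
import psycopg2

# Placeholder connection parameters for a non-production environment.
conn = psycopg2.connect(host="localhost", dbname="appdb_test",
                        user="tdm", password="...")

SUBSET_FRACTION = 0.01  # pull ~1% of customers plus their dependent rows

with conn, conn.cursor() as cur:
    # Sample parent rows first so child rows can follow their foreign keys.
    # psycopg2 interpolates parameters client-side, so %s works here.
    cur.execute("""
        CREATE TEMP TABLE subset_customers AS
        SELECT * FROM customers TABLESAMPLE BERNOULLI (%s);
    """, (SUBSET_FRACTION * 100,))
    # Preserve referential integrity: only orders whose customer made the cut.
    cur.execute("""
        CREATE TEMP TABLE subset_orders AS
        SELECT o.* FROM orders o
        JOIN subset_customers c ON c.customer_id = o.customer_id;
    """)
    cur.execute("SELECT count(*) FROM subset_orders")
    print("orders in subset:", cur.fetchone()[0])
    # Temp tables vanish when the session ends; in a real run, export them
    # (e.g., COPY TO) before closing the connection.
```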
Posted Date not available