12.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
About The Client
We’re hiring for a global digital solutions company known for its expertise in cloud, data engineering, and quality assurance, with a strong focus on innovation and agile delivery across the BFSI and Healthcare sectors.

About The Role
We are hiring for a leadership QA role responsible for managing both manual and automated testing across ETL pipelines, UI, and databases, while leading a team and collaborating cross-functionally across global teams. The role involves implementing best practices in testing, owning quality strategy documentation, and integrating with CI/CD tools.

Key Responsibilities
- Lead and manage a QA team (4+ members) handling ETL, UI, DB, and end-to-end testing.
- Analyze requirements; create detailed test cases and test data (manual & automated).
- Validate ETL workflows and transformation logic using advanced SQL (Snowflake/Postgres/MySQL).
- Create and maintain automation scripts using BDD (Gherkin/Behave, Pytest) in Python.
- Integrate automation frameworks with CI/CD tools like Jenkins, GitHub, and Rundeck.
- Develop and maintain documentation: test strategy, BRDs, defect reports, etc.
- Drive collaboration across DevOps, SRE, developers, and stakeholders.
- Provide testing metrics and improvement plans to senior leadership.

Must-Have Qualifications
- 9–12 years’ total QA experience, with 6+ years in ETL/UI/DB testing.
- 3+ years’ experience in automation testing (Selenium v3+, Playwright) with Python.
- Strong hands-on experience with SQL queries (preferably Snowflake/Postgres).
- Experience with AWS or other cloud platforms.
- Proven leadership managing at least 2 QA team members.
- Working experience with Agile methodologies (Scrum/Kanban).
- Excellent communication and stakeholder management skills.

Nice To Have
- Exposure to DBT, Informatica, or MS Power BI.
- Healthcare/Life Sciences domain knowledge.
- Front-end vs. back-end validation exposure.
- Willingness to work flexible hours with international teams.

Required Education: BCA / B.Sc. (Computer Science) / B.E. / B.Tech / MCA / M.E. / M.Tech
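The posting above pairs SQL-based ETL validation with Python test automation. Purely as an illustrative sketch (nothing here comes from the listing; the table names and row counts are invented), the kind of source-to-target reconciliation check such a team automates can be expressed as a plain Python function, demonstrated against an in-memory SQLite database:

```python
import sqlite3

def reconcile_row_counts(conn, source_table, target_table):
    """Compare row counts between a staging table and its loaded target.

    Returns (match, source_count, target_count) so a test framework
    (pytest, Behave step, etc.) can assert on the result.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src == tgt, src, tgt

# Hypothetical demo data: a staging table and its loaded target copy.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (id INTEGER);
    CREATE TABLE dw_orders  (id INTEGER);
    INSERT INTO stg_orders VALUES (1), (2), (3);
    INSERT INTO dw_orders  VALUES (1), (2), (3);
""")
ok, src, tgt = reconcile_row_counts(conn, "stg_orders", "dw_orders")
```

In a BDD setup, a Gherkin step like "Then the target row count matches the source" would wrap exactly this kind of comparison.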
Posted 1 week ago
0 years
0 Lacs
Salzburg, Austria
Remote
Approach People Recruitment
Location: Remote (open to Austrian nationals only)
Job Type: Part-time (20 hours per week)

Our client is a leader in AI innovation, dedicated to enhancing artificial intelligence to ensure it meets the highest standards of accuracy, cultural sensitivity, and functionality. Join them to help refine AI models that serve users globally.

Role Overview
As an AI Content Editor, you will test and challenge AI responses, identifying areas for improvement to help make AI smarter, more accurate, and culturally aware. This role is ideal for Austrian nationals with bilingual proficiency who want to contribute to cutting-edge technology.

Key Responsibilities
- Challenge AI models to assess and improve their accuracy and responsiveness.
- Identify and document issues related to language nuances and cultural context.
- Provide insights to improve language-specific functionality in AI systems.
- Collaborate with our team to refine AI technology through detailed feedback and testing.

Qualifications
- Nationality: Austrian national.
- Language skills: bilingual proficiency in English and German.
- Strong analytical and problem-solving skills.
- Passion for AI technology and innovation.
- Availability for 20 hours of remote work per week.
- Detail-oriented with a commitment to quality and precision.

What We Offer
- Flexible, remote work setup.
- Competitive pay for part-time engagement.
- The chance to directly impact the future of AI technology.

If you're an Austrian national with bilingual skills and a drive to enhance AI, we'd love to hear from you. Be a key player in the evolution of AI - apply now!

Sector: IT. Role: IT/Technology. Manages other people: No. Employment type: Permanent contract. Level: Entry level.
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Title: ADP Payroll Implementation
Location: Remote, Gurugram, Haryana
Duration: 24+ months

Key Responsibilities

ADP Payroll Implementation
- Lead implementation of ADP GlobalView or Celergo for international payroll, ensuring accurate and compliant country-specific rollouts.
- Partner with ADP, who will manage ongoing payroll operations, while focusing internally on enabling systems integration and readiness.

Project Management & Business Analysis
- Drive the payroll implementation lifecycle over the 18–24 month project, managing timelines, requirements, and stakeholders.
- Use Agile/Scrum methodology and maintain Jira backlogs, sprints, and user stories.
- Elicit and document business and system requirements from local stakeholders and payroll teams.

Integration Oversight
- Coordinate integration efforts between ADP and Oracle Fusion Cloud HCM (managed by ADP), and oversee delivery of internal third-party integrations (e.g., benefits and retirement vendors).
- Identify, define, and document integration requirements in Jira for internal development teams to execute.

Oracle Configuration Support
- Work with country teams to identify needs for changes to Oracle Absence Management and Time and Labor modules.
- Document required configuration changes and create user stories for execution by internal Scrum teams; this role does not perform configuration directly.

Testing & Validation
- Support and coordinate functional testing and user acceptance testing (UAT).
- Ensure test coverage for payroll-related integrations and configurations.

Training & Documentation
- Produce detailed documentation including process flows, integration logic, and user training materials.
- Ensure smooth handoffs to support teams and local stakeholders.

Post-Go-Live Support
- ADP will manage payroll operations; this role will monitor and escalate any issues requiring updates to integrations or configurations via the backlog.

Required Skills and Qualifications
- Bachelor’s degree in HR, Information Systems, Business, or a related field.
- 5+ years of experience leading payroll system implementations, including ADP GlobalView or Celergo.
- Familiarity with Oracle Fusion Cloud HCM, especially Absence Management and Time and Labor.
- Strong project management and business analysis experience in global environments.
- Agile/Scrum experience; strong proficiency with Jira required.
- Understanding of system integration design and lifecycle management.
- Excellent communication, stakeholder management, and documentation skills.

Preferred Qualifications
- Oracle certifications in Absence Management or Time and Labor.
- Experience with payroll delivery in emerging or equity markets.
- Familiarity with integration tools such as Oracle Integration Cloud or Informatica.
- Awareness of global compliance frameworks such as SOX, GDPR, and PDPB.

EoE
Posted 1 week ago
7.0 years
0 Lacs
India
Remote
Kindly share your resume to lakshmi.b@iclanz.com or hr@iclanz.com

Position: Lead Data Engineer - Health Care domain
Experience: 7+ years
Location: Hyderabad | Chennai | Remote

Summary: The Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities, maintaining existing systems and processes, developing new features, and reviewing, presenting, and implementing performance improvements.

Duties and Responsibilities
- Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers that use platforms such as Azure, Salesforce, and AWS technologies.
- Monitor active ETL jobs in production.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
- Discover efficiencies in shared data processes and batch schedules to help ensure no redundancy and smooth operations.
- Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the environment for current and future batch jobs.
- Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence and MDM solutions, including Data Lakes/Data Vaults.

This job has no supervisory responsibilities.

Required Skills
- Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field, AND 6+ years’ experience in business analytics, data science, software development, data modeling, or data engineering work.
- 5+ years’ experience with strong SQL query/development skills.
- Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
- Experience working in the healthcare industry with PHI/PII.
- Creative, lateral, and critical thinker; excellent communicator with well-developed interpersonal skills.
- Good at prioritizing tasks and time management; able to describe, create, and implement new solutions.
- Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
- Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).
- Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume).

Details Required for Submission:
- Requirement name:
- First name, last name:
- Email ID:
- Best number:
- Current organization / previous organization worked (last date):
- Currently working on a project:
- Total experience:
- Relevant experience and ratings (out of 10) for primary skills - Data Engineer / ETL / Healthcare (PHI/PII) / Fivetran / dbt:
- LinkedIn profile:
- Comfortable working 03:00 pm to 12:00 am IST?
- Communication:
- Education details (degree & pass-out year):
- Notice period:
- Vendor company name: iClanz Inc
- Expected salary:
- Current location / preferred location:
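The listing above emphasizes dbt and "checks and balances across all jobs" for data quality. As an illustrative sketch only (table and column names are invented, and real dbt generic tests are declared in YAML and compiled to SQL rather than written in Python), the `unique` and `not_null` checks dbt runs reduce to plain SQL like this:

```python
import sqlite3

def dbt_style_tests(conn, table, unique_col, not_null_cols):
    """Mimic dbt's built-in `unique` and `not_null` tests with plain SQL.

    Returns a dict of failing test names to offending row/group counts;
    an empty dict means all checks passed.
    """
    cur = conn.cursor()
    failures = {}
    # `unique`: count groups of the key column that occur more than once.
    dupes = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {unique_col} FROM {table} "
        f"GROUP BY {unique_col} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    if dupes:
        failures[f"unique:{unique_col}"] = dupes
    # `not_null`: count NULLs per checked column.
    for col in not_null_cols:
        nulls = cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        if nulls:
            failures[f"not_null:{col}"] = nulls
    return failures

# Hypothetical demo data with one duplicate id and one missing MRN.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (id INTEGER, mrn TEXT);
    INSERT INTO patients VALUES (1, 'A1'), (2, NULL), (2, 'A3');
""")
failures = dbt_style_tests(conn, "patients", "id", ["mrn"])
```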
Posted 1 week ago
10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
You are as unique as your background, experience and point of view. Here, you’ll be encouraged, empowered and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families and communities around the world.

Job Description:

Are you ready to shine? At Sun Life, we empower you to be your most brilliant self.

Who we are
Sun Life is a leading financial services company with 157 years of history that helps our clients achieve lifetime financial security and live healthier lives. We serve millions in Canada, the U.S., Asia, the U.K., and other parts of the world. We have a network of Sun Life advisors, third-party partners, and other distributors. Through them, we’re helping set our clients free to live their lives their way, from now through retirement. We’re working hard to support their wellness and health management goals, too. That way, they can enjoy what matters most to them - anything from running a marathon to helping their grandchildren learn to ride a bike.

To do this, we offer a broad range of protection and wealth products and services to individuals, businesses, and institutions, including:
- Insurance: life, health, wellness, disability, critical illness, stop-loss, and long-term care insurance
- Investments: mutual funds, segregated funds, annuities, and guaranteed investment products
- Advice: financial planning and retirement planning services
- Asset management: pooled funds, institutional portfolios, and pension funds

With innovative technology, a strong distribution network and long-standing relationships with some of the world’s largest employers, we are today providing financial security to millions of people globally. At Sun Life, our asset management business draws on the talent and experience of professionals from around the globe.

Sun Life Global Solutions (SLGS)
With 32 years of operations in the Philippines and 17 years in India, Sun Life Global Solutions (formerly Asia Service Centres), a microcosm of Sun Life, is poised to harness the regions’ potential in a significant way - from India and the Philippines to the world. We are architecting and executing a BOLDER vision: being a Digital and Innovation Hub, shaping the business, driving transformation and superior client experience by providing expert technology, business and knowledge services and advanced solutions. We help our clients achieve lifetime financial security and live healthier lives - our core purpose and mission. Drawing on our collaborative and inclusive culture, we are recognized as a ‘Great Place to Work’, one of the ‘Top 100 Best Places to Work for Women’, and stand among the ‘Top 11 Global Business Services Companies’ across India and the Philippines.

The technology function at Sun Life Global Solutions is geared towards growing our existing business, deepening our client understanding, managing new-age technology systems, and demonstrating thought leadership. We are committed to building greater domain expertise and engineering ability, delivering end-to-end solutions for our clients, and taking a lead in intelligent automation. Tech services at Sun Life Global Solutions have evolved in areas such as application development and management, support, testing, digital, data engineering and analytics, infrastructure services and project management. We are constantly expanding our strength in information technology and are looking for fresh talent who can bring ideas and values aligned with our digital strategy.

Our Client Impact strategy is motivated by the need to create an inclusive culture, empowered by highly engaged people. We are entering a new world that focuses on doing purpose-driven work. The kind that fills your day with excitement and determination, because when you love what you do, it never feels like work. We want to create an environment where you feel empowered to act and are surrounded by people who challenge you, support you and inspire you to become the best version of yourself. As an employer, we not only want to attract top talent, but we want you to have the best Sun Life experience. We strive to Shine Together, Make Life Brighter & Shape the Future!

What will you do?
Data is a strategic asset for Sun Life, and the Data Management Platform ensures that every piece of data is well-documented, easily discoverable, and adheres to data quality standards. Further, it enables modernizing data through consolidation, trusted views, advanced matching algorithms, and event-driven and API capabilities. At Sun Life, we aim to enable our business partners and IT colleagues to deliver innovative client services that are BOLDER and future-leaning.

As the Data Management Platform (DMP) Technical Lead, you will be responsible for embedding a world-class product development and engineering culture and organization. You will work with development, architecture and operations as well as platform teams to ensure we are delivering a best-in-class technology solution. You will work closely with the Business Platform Owner to ensure an integrated end-to-end view across people and technology for the Business Platform. You will also “defend the faith” and work with stakeholders across the enterprise to ensure we are developing the right solutions. In parallel, you will focus on building a high-performing team that will thrive in a fast-paced continuous-delivery engineering environment. The role involves architecting, designing, and delivering solutions in a tool stack including Informatica MDM SaaS, Informatica Data Quality, Collibra Data Governance, and other data tools.

Key responsibilities:
- Shape technical strategy (e.g., build vs. buy decisions, technical road-mapping) in collaboration with architects
- Evaluate and identify appropriate technology stacks, platforms and vendors, including web application frameworks and cloud providers, for solution development
- Attend team ceremonies as required; in particular, feature refinement and cross-team iteration reviews/demos
- Drive the resolution of technical impediments
- Own the success of foundational enablers
- Champion research and innovation
- Lead in scaled agile ceremonies and activities, such as quarterly reviews, quarterly increment planning and OKR writing
- Collaborate with the Platform Owner in the writing and prioritization of technical capabilities and enablers
- Present platform delivery metrics, OKR health and platform finance status to executive audiences
- Collaborate with other Technical Leads
- Create and maintain the technical roadmap for in-scope products and services at the platform/portfolio level

Key experience:
- B.E. / B.Tech or equivalent engineering degree; a Master’s degree or equivalent experience in marketing, business or finance is an added advantage
- 10+ years of experience in technical architecture, solution design, and platform engineering
- Strong experience in MDM, Data Quality and Data Governance practices; experience with a tool stack such as Informatica MDM SaaS, Informatica Data Quality, and Collibra is a plus
- Good experience with major cloud platforms and cloud data tools, including but not limited to AWS, Microsoft Azure, Kafka, CDC, Tableau, and data virtualization tools
- Good experience in ETL and BI solution development and tool stack; Informatica ETL experience is a plus
- Good experience in data architecture, SQL, NoSQL, REST APIs, data security, and AI concepts
- Familiarity with agile methodologies and data factory operations processes, including tools like Confluence, Jira and Miro
- Strong knowledge of industry standards and regulations related to data management, such as HIPAA, PCI-DSS, and GDPR
- Proven knowledge of working in financial services, preferably the insurance space
- Experience in senior engineering and technology roles working with teams to build and deliver digital products
- Experience providing guidance and insight to establish governance processes, direction, and control, ensuring objectives are achieved and risks are managed appropriately for product development
- A leader with a track record of onboarding and developing engineering and product teams
- Experience as a technology leader who has defined and implemented technical strategies within complex organizations and can influence and contribute to the higher-level engineering strategy
- Insight into the newest technologies and trends; an expert in product development with experience in code delivery and management of full-stack technology
- Experience in digital capabilities such as DevSecOps, CI/CD, and agile release management
- Wide experience and understanding of architecture in terms of solution, data, and integration
- Can provide direct day-to-day engineering and technology problem-solving, coaching, direction, and guidance to Technical Leads and Senior Technical Leads within their Platform
- Strong leadership skills with an ability to influence a diverse group of stakeholders
- Ability to influence technical strategy at the BG and Enterprise level
- Experience working in Agile teams with a strong understanding of agile ways of working
- Experience managing technical priorities within an evolving product backlog
- Understands how to decompose large technical initiatives into actionable technical enablers
- Experience in the continuous improvement of software development workflows and pipelines
- Proven leadership; ability to articulate ideas to both technical and non-technical audiences
- Ability to communicate strategy and objectives, and align organizations to a common goal
- Strong problem solver with the ability to lead the team to push the solution forward
- Ability to empower teams and encourage collaboration
- Ability to inspire people and teams, building momentum around a vision
- Critical thinker with a passion to challenge the status quo, find new solutions and drive out-of-the-box ideas
- Believes in a non-hierarchical culture of collaboration, transparency and trust across the team
- Experimental mindset to drive innovation and continuous improvement of the team

Job Category: Scrum/Coach
Posting End Date: 29/06/2025
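The platform description above mentions advanced matching for data consolidation. Purely as a toy illustration (this is not Informatica MDM's actual matching logic; the record fields and normalization rule are invented), the simplest form of MDM matching - deterministic match-key clustering - can be sketched as:

```python
import re

def match_key(record):
    """Toy deterministic MDM match key: letters-only lowercase name plus email."""
    name = re.sub(r"[^a-z]", "", record["name"].lower())
    email = record["email"].strip().lower()
    return f"{name}|{email}"

def cluster(records):
    """Group records sharing a match key - a stand-in for real MDM matching,
    which would also use probabilistic/fuzzy rules and survivorship logic."""
    groups = {}
    for r in records:
        groups.setdefault(match_key(r), []).append(r)
    return groups

# Hypothetical source records: the first two are the same person.
recs = [
    {"name": "Ana Lee",  "email": "ana@x.com"},
    {"name": "ANA  LEE", "email": "ana@x.com"},
    {"name": "Bo Chan",  "email": "bo@x.com"},
]
groups = cluster(recs)
```

Each resulting group is a candidate "golden record" cluster; a real platform would then apply survivorship rules to build the trusted view the posting describes.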
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer – C10/Officer (India)

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Develop and support scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
- Experience with relational databases and using SQL for data querying, transformation and manipulation
- Experience modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: hands-on experience building data pipelines; proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica
- Big Data: exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: experience developing Co>Op graphs and tuning them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: exposure to data validation, cleansing, enrichment and data controls
- Containerization: fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
- Others: basics of job schedulers like Autosys; basics of entitlement management

Certification in any of the above topics would be an advantage.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Digital Software Engineering
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
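The must-have ETL skill above is hands-on experience building data pipelines. As a minimal, generic illustration (not Citi's stack; the CSV shape, field names, and filter rule are invented), an extract-transform-load pass in plain Python might look like:

```python
import csv
import io

def run_pipeline(raw_csv):
    """Tiny ETL pass: parse CSV (extract), drop rows with an empty amount
    and derive an integer cents field (transform), return the clean rows
    (standing in for the load step)."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))          # extract
    clean = [r for r in rows if r["amount"].strip()]           # quality filter
    for r in clean:                                            # transform
        r["amount_cents"] = int(round(float(r["amount"]) * 100))
    return clean                                               # "load"

# Hypothetical input feed; row 2 is rejected for a missing amount.
raw = "id,amount\n1,10.50\n2,\n3,3.99\n"
loaded = run_pipeline(raw)
```

A production pipeline would replace each stage with the platform equivalents the posting names (an Ab Initio graph or Spark job, warehouse loads, scheduler hooks), but the extract/transform/load decomposition is the same.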
Posted 1 week ago
3.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Company Description Strategy (Nasdaq: MSTR) is at the forefront of transforming organizations into intelligent enterprises through data-driven innovation. We don't just follow trends—we set them and drive change. As a market leader in enterprise analytics and mobility software, we've pioneered BI and analytics space, empowering people to make better decisions and revolutionizing how businesses operate. But that's not all. Strategy is also leading to a groundbreaking shift in how companies approach their treasury reserve strategy, boldly adopting Bitcoin as a key asset. This visionary move is reshaping the financial landscape and solidifying our position as a forward-thinking, innovative force in the market. Four years after adopting the Bitcoin Standard, Strategy's stock has outperformed every company in S&P 500. Our people are the core of our success. At Strategy, you'll join a team of smart, creative minds working on dynamic projects with cutting-edge technologies. We thrive on curiosity, innovation, and a relentless pursuit of excellence. Our corporate values—bold, agile, engaged, impactful, and united—are the foundation of our culture. As we lead the charge into the new era of AI and financial innovation, we foster an environment where every employee's contributions are recognized and valued. Join us and be part of an organization that lives and breathes innovation every day. At Strategy, you're not just another employee; you're a crucial part of a mission to push the boundaries of analytics and redefine financial investment Job Description Reporting to the Senior Director of SaaS, the Salesforce Developer is responsible for customizing, developing and supporting solutions on the Salesforce and ServiceNow platform. The ideal candidate will strong understanding of the Salesforce.com and ServiceNow platform, and basic to intermediate understanding of integrations, single sign, security, etc. 
(Informatica, other ETL tools a huge plus), interest and ability to understand the problem to solve, solution design and critical path to develop. The candidate will have exceptional technical, analytical and problem-solving skills and be comfortable interacting with all levels of the organization. We are seeking a self-starter, with a bias towards action, who can recognize and make process improvement recommendations. Responsibilities: Perform day-to-day administration of the ServiceNow system, including making approved changes to forms, tables, reports, and workflows Create and customize reports, homepages, and dashboards in ServiceNow Ensure the ServiceNow platform and tools remain current by performing testing and installation of ServiceNow updates, patches, and new releases Create and configure Business Rules, UI Policies, UI Actions and ScriptsDesign and develop advanced ServiceNow customizations Troubleshoot multiple integrations with ServiceNow and Rally. Manage ServiceNow security by managing roles and access control lists Train personnel in ServiceNow use and processes to include creating supporting documentation including user and training guides Work directly with end users to resolve support issues within ServiceNow Oversee code reviews. Design, develop, configure, test and deploy solutions built on Salesforce platform Responsible for configuration, design, functionality and end-user support of the Force.com platform Implement solutions in an agile environment delivering high-quality code and configuration Develop and manage Workflows, Process Builder, Assignment rules, email templates, and all other declarative and programmatic features. Handle mass imports and exports of data. Customize custom objects, fields, reports, 3rd party apps etc. Manage Users, Profiles, Permission Sets, Security, and other administrations tasks. Lead testing of various functionalities, create test data, test plans and perform feature testing. 
Demo solutions to users; train and document as needed. Provide ongoing support and system administration to quickly fix production issues. Map functional requirements to Salesforce.com features and functionality. Implement change control and best practices with regard to system maintenance, configuration, development, testing, data integrity, etc. Hands-on Sales Cloud, ServiceNow and Salesforce Community experience. Have a programming background with the ability to develop custom code using Visualforce/Apex/Lightning/JavaScript to meet user requirements. Know when to use out-of-the-box functionality versus custom code. We Are Seeking Candidates With: Outstanding listening, analytical, organizational and time management skills. Excellent written and oral communication skills; demonstrates a high level of diplomacy and professionalism. Strong work ethic, hands-on, with a customer service mentality. Team player, ability to work cross-functionally, self-driven, motivated, and able to work under pressure. Able to work independently and lead projects of moderate complexity. Ability to identify areas for process improvement and recommend/implement solutions. Proven creativity and problem-solving skills; ability to work around obstacles and solve problems with minimal direction. Ability to develop effective relationships with business users, technical staff and executive management. Ability to prioritize work and meet deadlines in a fast-paced environment. Flexibility with a demonstrated ability to embrace change. Bachelor's Degree (BA/BS) in Computer Science or a similar technical degree, or equivalent experience. 3+ years of hands-on experience developing on Salesforce and ServiceNow. In-depth knowledge of Salesforce and ServiceNow programmatic features. Ability to dig into data, surface actionable insights and demonstrate sound judgement and decision-making skills. A problem-solver at heart.
Excellent written and oral communication skills; demonstrates a high level of diplomacy and professionalism. Strong work ethic, hands-on, with a customer service mentality. Team player, ability to work cross-functionally, self-driven, motivated, and able to work under pressure. Extensive experience in using Data Loader and other data loading tools. Additional background in ServiceNow, Community, CPQ, Marketo and other integrations a plus. Experience using MS Excel and database modeling. Ability to work independently and be proactive. Able to work under pressure, multi-task, and manage changing priorities and workload. Additional Information The recruitment process includes online assessments as a first step. We send them via e-mail; please also check your SPAM folder. We work from the Pune office 4 days a week.
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Bachelor's degree in computer science, engineering, or a related field; Master's degree preferred. Data: 6+ years of experience with data analytics and data warehousing, with sound knowledge of data warehousing concepts. SQL: 6+ years of hands-on experience with SQL and query optimization for data pipelines. ELT/ETL: 6+ years of experience in Informatica; 3+ years of experience in IICS/IDMC. Migration Experience: Experience with Informatica on-prem to IICS/IDMC migration. Cloud: 5+ years' experience working in an AWS cloud environment. Python: 5+ years of hands-on development experience with Python. Workflow: 4+ years of experience with orchestration and scheduling tools (e.g. Apache Airflow). Advanced Data Processing: Experience using data processing technologies such as Apache Spark or Kafka. Troubleshooting: Experience with troubleshooting and root cause analysis to determine and remediate potential issues. Communication: Excellent communication, problem-solving, organizational and analytical skills; able to work independently and to provide leadership to small teams of developers. Reporting: Experience with data reporting (e.g. MicroStrategy, Tableau, Looker) and data cataloging tools (e.g. Alation). Experience in design and implementation of ETL solutions with effective design and optimized performance, and ETL development following industry-standard recommendations for job recovery, failover, logging, and alerting mechanisms.
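The "query optimization for data pipelines" requirement above can be made concrete with a small, self-contained sketch: SQLite's EXPLAIN QUERY PLAN shows how adding an index turns a full table scan into an index search. The table, column, and index names here are invented for illustration, not taken from the posting.

```python
import sqlite3

# Hypothetical fact table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [(i, "APAC" if i % 2 else "EMEA", i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
before = plan(query)  # full table scan on the filter column
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = plan(query)   # the same query now searches via the index

print(before)
print(after)
```

The same idea carries over to production warehouses (Snowflake, Postgres): inspect the plan before and after a change rather than guessing.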
Posted 1 week ago
4.0 - 9.0 years
35 - 40 Lacs
Bengaluru
Work from Office
We're looking for a Solution Architect candidate with experience in pre-sales/technical sales or consulting experience with Data Management, MDM, or Data Governance to join our team in Bangalore (hybrid). You will report to the Technical Sales Manager. As a Solution Architect, you will engage with internal collaborators, customers and partners to develop technical account plans/strategy and advance sales engagements to technical closure. As an important member of the tech sales team, you will ensure success by capturing customer use cases and designing complex technical solutions on top of Informatica Data Management Cloud. You will educate customers and team members on the Informatica value proposition and participate in deep architectural discussions to ensure solutions are designed for successful deployment and use in the cloud. Within our technical sales community you will be an active, constant learner who stays up to date with industry trends. You will mentor others on the team and contribute to the development of shared selling assets, knowledge stores, and enablement materials. To ensure sustained revenue growth you will contribute to pipeline generation by delivering technical workshops, being active in social media promotions, developing content for external circulation, and participating in industry marketing events. Your Role Responsibilities: Here's What You'll Do Manage customer engagements independently. Share best practices, content, and tips and tricks within primary responsibilities. Stay current on certification of services required for responsibilities. Perform activities leading up to the delivery of a customer demo with little assistance, including discovery, technical qualification/fit, customer presentations, standard demos, and related customer-facing communication. Lead on RFP responses and POCs. Create customized demos.
Partner with the CSM team on nurture activities including technical advisory, workshops, etc. Provide customer feedback on product gaps. Support demos at marketing events. Conduct technical workshops with customers. What We'd Like to See: Basic certification on at least one cloud ecosystem and a data-related cloud technology. Intermediate knowledge of security for cloud computing. Advanced-level skills for Informatica services and product capabilities in your respective area of focus. Storytelling and presentation skills specific to use cases. Ability to engage and create relationships with influencers, coaches, decision makers, and partners. Intermediate technical knowledge of hybrid deployment of software solutions, data warehousing, database, or business intelligence software concepts and products. Role Essentials: 4+ years of relevant experience in Data Management, MDM, or Data Governance. 5 years of prior presales/technical sales or consulting experience. BA/BS or equivalent educational background. This is a hybrid remote/in-office role. Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans). Flexible time-off policy and hybrid working practices. Equity opportunities and an employee stock purchase program (ESPP). Comprehensive Mental Health and Employee Assistance Program (EAP) benefit.
Posted 1 week ago
2.0 - 7.0 years
10 - 14 Lacs
Bengaluru
Work from Office
We're looking for an MDM Support Engineer candidate with experience in Informatica MDM, SQL, and shell scripting to join our team in the Bangalore office. You will report to the Manager, Technical Support. Technology You'll Use: Informatica MDM OR CDI (cloud data integration) OR CAI (cloud application integration) AND scripting. Your Role Responsibilities: Here's What You'll Do As part of MDM Technical Support, you will ensure our customers' success and satisfaction with our products and contribute to their long-term loyalty. You will work with the MDM support team, QA, Engineering, Solutions Delivery, Sales, and Product Management to ensure that MDM is delivering good support to our customers. Additional responsibilities include the following: Manage customer support technical issues daily, including verifying issues, isolating and diagnosing the problem, and resolving the issue. Provide technical support to partners, sales engineers and post-sales consultants via telephone, email and the web. Reproduce product behaviours to determine the problem root-cause(s) and issue work-arounds and solutions. Coordinate with Quality Assurance and Engineering teams to report and solve product defects. Author, edit, and publish an online knowledge base of known issues/solutions. What We'd Like to See: Articulate well, with customer relationship skills: responsiveness, sensitivity, diplomacy. Comfortable working both independently and collaboratively. Your advanced problem-solving skills and technical aptitude allow you to adapt to new circumstances and learn when facing new problems and challenges. Applying your business knowledge and resource management skills, you meet requirements and set the example for good work procedures. In addition to the attributes mentioned, you'll also be able to: Inspire and motivate people to rally behind the vision and make it shareable by everyone. Role Essentials: A college degree in a computer science-related subject is mandatory.
Analysis, debugging and troubleshooting skills; a minimum of 2+ years of experience with Informatica MDM, CDI, or CAI is mandatory. Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans). Flexible time-off policy and hybrid working practices. Tuition reimbursement programme to support your professional and personal growth. Equity opportunities and an employee stock purchase program (ESPP). Comprehensive Mental Health and Employee Assistance Program (EAP) benefit.
Posted 1 week ago
2.0 - 4.0 years
3 - 6 Lacs
Hyderabad, Bengaluru
Work from Office
Paramatrix Technologies Pvt. Ltd is looking for a Data Modeller to join our dynamic team and embark on a rewarding career journey. As a Data Modeler, you will be responsible for designing and implementing data models, ensuring the integrity and performance of databases, and collaborating with other teams to understand data requirements. Your role is pivotal in creating efficient and effective data solutions that align with business objectives. Key Responsibilities: Data Modeling: Develop, design, and maintain conceptual, logical, and physical data models based on business requirements. Ensure that data models are scalable, flexible, and support future business needs. Database Design: Collaborate with database administrators and developers to implement and optimize database structures. Design and implement indexing strategies to improve database performance. Requirements Analysis: Work closely with business analysts and stakeholders to understand data requirements and translate them into data models. Documentation: Create and maintain comprehensive documentation for data models, ensuring clarity and accessibility for other team members. Data Governance: Implement and enforce data governance policies and best practices to ensure data quality and consistency. Collaborate with data stewards to define and manage data standards. Data Integration: Collaborate with ETL (Extract, Transform, Load) developers to ensure smooth data integration processes. Design and optimize data integration workflows. Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and reliability of data. Collaboration: Work closely with cross-functional teams, including business analysts, data scientists, and software developers, to ensure seamless integration of data models into applications.
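As a minimal illustration of turning a logical model into a physical one with an indexing strategy, the sketch below creates two related tables and indexes the foreign key. The schema is entirely hypothetical (a generic customer/orders model), shown here with SQLite only because it runs anywhere.

```python
import sqlite3

# Illustrative physical model for a simple customer/orders logical model;
# all table, column, and index names are invented for this example.
ddl = """
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT NOT NULL UNIQUE
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date  TEXT NOT NULL,
    total       REAL NOT NULL
);
-- Indexing strategy: index the foreign key, since most access paths
-- join orders back to customer.
CREATE INDEX idx_orders_customer ON orders (customer_id);
"""
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript(ddl)

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

The same separation (conceptual entities, logical keys, physical indexes) applies regardless of the target database engine.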
Posted 1 week ago
4.0 - 8.0 years
11 - 15 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
You'll do this by: Having good communication skills. Working directly with the client on any issue or concern. Core Skills: Informatica PC/Informatica Cloud (IICS) & Oracle, Snowflake, SQL. Desired Skills: Informatica PC/Informatica Cloud (IICS) & Oracle, Snowflake, SQL. Good to have: knowledge of Jira, GitHub and Jenkins. How we'd like you to lead: Direct interaction with the client on a daily basis for gathering requirements and for reviewing and verifying the developed code.
Posted 1 week ago
5.0 - 9.0 years
11 - 12 Lacs
Bengaluru
Work from Office
5 to 9 years' experience. Nice to have: worked in the HP ecosystem (FDL architecture). The Databricks + SQL combination is a must. EXPERIENCE: 6-8 Years. SKILLS: Primary Skill: Data Engineering. Sub Skill(s): Data Engineering. Additional Skill(s): Databricks, SQL.
Posted 1 week ago
8.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Total Yrs. of Experience*: 8-10 years. Detailed JD (Roles and Responsibilities). Mandatory skills*: Minimum 5 years of relevant Informatica technical experience with Enterprise Data Warehouse (DWH) design and development using Informatica. Design, develop, and optimize robust ETL/ELT pipelines using Informatica PowerCenter and other data integration tools. Manage and maintain IPP application-related components (BDM). Ability to do performance tuning of Informatica BDM jobs. Write complex SQL queries for data extraction, transformation, and analysis across relational databases. Architect and implement data solutions leveraging AWS services such as S3. Collaborate with business stakeholders to understand data requirements and deliver solutions. Monitor and troubleshoot data pipelines to ensure high availability and data quality. Implement best practices in data governance, security, and compliance. Create, maintain, and review design documentation. Implement and manage Change Requests pertaining to IPP application-related components (S3, BDM and MFT). Work with other relevant teams (SCP, IOC, Production Control, Infra, Security, Audit) as required. Work with platform and application teams to manage and close incidents/issues and implement changes, and to troubleshoot complex data issues and resolve them within SLA. Document and update Standard Operating Procedures (SOPs). Desired skills*: Architect and implement data solutions leveraging Python, Lambda, EMR. Build and maintain scalable data workflows and orchestration using Apache Airflow.
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
JD: 1) Must have 8+ years of experience in Python/Java, Spark, Scala, Hive, Microsoft Azure Cloud Services (Databricks platform and developer tools) 2) 4 to 8 years of experience in Data Warehouse, ETL, Snowflake and Report Testing 3) Strong in writing SQL scripts and database knowledge of Oracle, SQL Server, Snowflake 4) Hands-on working experience with any of the ETL tools, preferably Informatica, and Report/Dashboard tools 5) Ability to work independently; shift timings 12:30 pm to 9:30 pm. Good to have: Data Processing: ability to build optimized and clean ETL pipelines using Databricks flows, Scala, Python, Spark, SQL. Testing and Deployment: preparing pipelines for deployment. Data Modeling: knowledge of general data modeling concepts to model data into a lakehouse. Building custom utilities using Python for ETL automation. Experience working in an agile (Scrum) environment and usage of tools like JIRA. Must haves: Databricks: 4/5, PySpark with Scala: 4/5, SQL: 3/5.
Posted 1 week ago
5.0 - 9.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a skilled Data Conversion Test Lead / ETL Test Lead to lead testing efforts for large-scale data migration and ETL projects. The ideal candidate will have strong expertise in validating data transformations, data integrity, and migration processes, with the ability to manage teams and ensure test quality across multiple phases of the project lifecycle. Key Responsibilities: Lead testing for data conversion, migration, and ETL processes across multiple source and target systems. Analyze data mapping and transformation rules and translate them into comprehensive test cases and validation scenarios. Define test strategies and plans for data migration validation, including row-level and column-level comparisons, data quality checks, and exception handling. Collaborate with business analysts, developers, and data architects to understand source-to-target mappings. Validate data extraction, transformation, and loading processes to ensure accuracy and completeness. Develop and execute complex SQL queries for backend data validation. Coordinate with cross-functional teams during SIT, UAT, and production validation phases. Track defects and work with development teams to ensure timely resolution. Document and report testing progress, risks, and metrics to stakeholders and leadership. Mentor junior team members and enforce testing best practices. Required Skills: Proven experience as a Test Lead for data migration, conversion, and ETL testing projects. Strong hands-on experience with ETL tools such as Informatica, Talend, DataStage, or equivalent. Excellent SQL skills for data validation, profiling, and reconciliation. Experience in handling large volumes of data and performance testing of ETL jobs. Familiarity with data warehousing concepts, dimensional modeling, and data governance. Proficiency in test management and defect tracking tools like JIRA, TestRail, or Client ALM. Strong analytical thinking and problem-solving abilities. 
Excellent communication and coordination skills. Nice to Have: Experience with cloud-based data platforms (AWS Redshift, Azure Synapse, GCP BigQuery). Exposure to automation frameworks for data validation. Knowledge of industry-specific data models (Finance, Healthcare, Insurance, etc.). ISTQB or similar testing certification. Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
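The row-level and column-level reconciliation described in this role can be sketched in plain Python, with SQLite standing in for the source and target systems. All table names and data here are invented for illustration; real migrations would compare against the actual source-to-target mappings.

```python
import sqlite3

# Minimal sketch of source-to-target validation after a migration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, amount REAL);
CREATE TABLE tgt (id INTEGER, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 30.0);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def reconcile(conn):
    # Row-level check: record counts must match.
    src_n, tgt_n = (conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
                    for t in ("src", "tgt"))
    # Column-level check: aggregate checksums per column must match.
    src_sum, tgt_sum = (conn.execute(f"SELECT SUM(amount) FROM {t}").fetchone()[0]
                        for t in ("src", "tgt"))
    # Exception handling: surface rows present on one side only.
    orphans = conn.execute(
        "SELECT id FROM src EXCEPT SELECT id FROM tgt").fetchall()
    return src_n == tgt_n and src_sum == tgt_sum and not orphans

print(reconcile(conn))  # True when source and target agree
```

In practice the same three checks (counts, checksums, orphan rows) are expressed as SQL against the real source and target databases and wired into the test suite.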
Posted 1 week ago
9.0 - 12.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Experience: 9 - 12 Years Location: Bangalore / Hyderabad Notice Period: Immediate to 15 Days Overview We are looking for a highly experienced and strategic Snowflake Data Architect to lead the transformation and modernization of our data architecture. You will be responsible for designing scalable, high-performance data solutions and ensuring seamless data quality and integration across the organization. This role requires close collaboration with data modelers, business stakeholders, governance teams, and engineers to develop robust and efficient data architectures. This is an excellent opportunity to join a dynamic, innovation-driven environment with significant room for professional growth. We encourage initiative, creative problem-solving, and a proactive approach to optimizing our data ecosystem. Responsibilities Architect, design, and implement scalable data solutions using Snowflake. Build and maintain efficient data pipelines using SQL and ETL tools to integrate data from multiple ERP and other source systems. Leverage data mappings, modeling (2NF/3NF), and best practices to ensure consistent and accurate data structures. Collaborate with stakeholders to gather requirements and design data models that support business needs. Optimize and debug complex SQL queries and ensure performance tuning of pipelines. Create secure, reusable, and maintainable components for data ingestion and transformation workflows. Implement and maintain data quality frameworks, ensuring adherence to governance standards. Lead User Acceptance Testing (UAT) support and production deployment activities, and manage change requests. Produce comprehensive technical documentation for future reference and auditing purposes. Provide technical leadership in the use of cloud platforms (Snowflake, AWS) and support teams through knowledge transfer. Requirements Bachelor's degree in Computer Science, Information Technology, or a related field.
9 to 12 years of overall experience in data engineering and architecture roles. Strong, hands-on expertise in Snowflake with a solid understanding of its advanced features. Proficient in advanced SQL with extensive experience in data transformation and pipeline optimization. Deep understanding of data modeling techniques, especially 2NF/3NF normalization. Experience with cloud-native platforms, especially AWS (S3, Glue, Lambda, Step Functions), is highly desirable. Knowledge of ETL tools (Informatica, Talend, etc.) and working in agile environments. Familiarity with structured deployment workflows (e.g., Carrier CAB process). Strong debugging, troubleshooting, and analytical skills. Excellent communication and stakeholder management skills. Key Skills Snowflake (Advanced), SQL (Expert), Data Modeling (2NF/3NF), ETL Tools, AWS (S3, Glue, Lambda, Step Functions), Agile Development, Data Quality & Governance, Performance Optimization, Technical Documentation, Stakeholder Collaboration
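The 2NF/3NF normalization called out above can be shown in a few lines: a flat source extract that repeats customer attributes on every order row has a transitive dependency, and 3NF removes it by splitting the customer attributes into their own entity. The records and field names below are invented purely for illustration.

```python
# A flat source extract where customer attributes repeat on every order row
# (a transitive dependency: order -> customer_id -> customer_name).
flat = [
    {"order_id": 1, "customer_id": 10, "customer_name": "Acme",   "total": 99.0},
    {"order_id": 2, "customer_id": 10, "customer_name": "Acme",   "total": 45.0},
    {"order_id": 3, "customer_id": 11, "customer_name": "Globex", "total": 12.5},
]

# 3NF: move customer attributes into their own entity keyed by customer_id,
# leaving orders with only the foreign key.
customers = {r["customer_id"]: {"customer_id": r["customer_id"],
                                "customer_name": r["customer_name"]}
             for r in flat}
orders = [{"order_id": r["order_id"], "customer_id": r["customer_id"],
           "total": r["total"]} for r in flat]

print(len(customers), len(orders))  # 2 3
```

The duplicated customer name collapses into a single row per customer, which is exactly what keeps warehouse dimensions consistent when the name changes.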
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Kochi, Bengaluru
Work from Office
Overview Join a global organization with 82,000+ employees around the world, as an ETL Databricks Developer based in IQVIA Bangalore. You will be part of IQVIA's world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products. The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred. Candidates with Databricks certification are preferred. Key Responsibilities: Design, develop, and maintain data integration solutions using Databricks. Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions. Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes. Optimize and troubleshoot Databricks workflows and performance issues. Ensure data quality and integrity throughout the data lifecycle. Provide technical guidance and mentorship to junior developers. Stay updated with the latest industry trends and best practices in data integration and Databricks. Required Qualifications: Bachelor's degree in computer science or equivalent. Minimum of 5 years of hands-on experience with Databricks. Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS). Well-versed in data warehousing and data lake concepts. Proficient in SQL and Python for data manipulation and analysis. Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services. Excellent problem-solving skills. Strong communication and collaboration skills. Preferred Qualifications: Certified Databricks Engineer. Experience in the life sciences domain. Familiarity with Reltio or similar MDM (Master Data Management) tools. Experience with data governance and data security best practices.
Posted 1 week ago
2.0 - 6.0 years
3 - 8 Lacs
Pune, Sangli
Work from Office
We are looking for a Data Science Engineer with strong experience in ETL development and Talend to join our data and analytics team. The ideal candidate will be responsible for designing robust data pipelines, enabling analytics and AI solutions, and working on scalable data science projects that drive business value. Key Responsibilities: Design, build, and maintain ETL pipelines using Talend Data Integration. Extract data from multiple sources (databases, APIs, flat files) and load it into data warehouses or lakes. Ensure data integrity, quality, and performance tuning in ETL workflows. Implement job scheduling, logging, and exception handling using Talend and orchestration tools. Prepare and transform large datasets for analytics and machine learning use cases. Build and deploy data pipelines that feed predictive models and business intelligence platforms. Collaborate with data scientists to operationalize ML models and ensure they run efficiently at scale. Assist in feature engineering, data labeling, and model monitoring processes. Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. 3+ years of experience in ETL development, with at least 2 years using Talend. Proficiency in SQL and Python (for data transformation or automation). Hands-on experience with data integration, data modeling, and data warehousing. Strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud is a must. Familiarity with big data tools like Spark, Hadoop, or Kafka is a plus.
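The extract/transform/load pattern with logging and exception handling that a Talend job implements graphically can be sketched in plain Python. The source data, table names, and reject-routing logic below are invented for illustration; a Talend job would express the same flow as file input, tMap, and reject-link components.

```python
import csv
import io
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Stand-in for a flat-file source; in a Talend job this would be a file input.
SOURCE = "id,amount\n1,10.5\n2,not_a_number\n3,7.0\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    good, rejects = [], []
    for row in rows:
        try:
            good.append((int(row["id"]), float(row["amount"])))
        except ValueError:
            # Exception handling: route bad records to a reject flow
            # instead of failing the whole job.
            rejects.append(row)
            log.warning("rejected row %s", row)
    return good, rejects

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS facts (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO facts VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
good, rejects = transform(extract(SOURCE))
load(conn, good)
loaded = conn.execute("SELECT COUNT(*) FROM facts").fetchone()[0]
print(loaded, len(rejects))  # 2 1
```

Routing rejects rather than aborting is what keeps a nightly pipeline resilient; the reject list then feeds the data-quality reporting the posting mentions.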
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Customer and Employee experience. About the role: In this opportunity as Application Support Analyst, you will: Provide Informatica support: the engineer will be responsible for supporting Informatica development, extraction, and loading, fixing data discrepancies, and taking care of performance monitoring. Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes. Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs. Demonstrate a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management. Bring experience in supporting applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.JS, TypeScript, jQuery, Docker, AWS/Azure. About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes: 3 to 8+ years of experience as an Informatica Developer and in support, responsible for implementing ETL methodology in data extraction, transformation, and loading.
Knowledge of ETL design: designs new or changed mappings and workflows with the team and prepares technical specifications. Experience in creating ETL Mappings, Mapplets, Workflows, and Worklets using Informatica PowerCenter 10.x and preparing corresponding documentation. Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.). Able to perform source system analysis as required. Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse. Implements versioning of the ETL repository and supporting code as necessary. Develops stored procedures, database triggers and SQL queries where needed. Implements best practices and tunes SQL code for optimization. Loads data from SF Power Exchange to a relational database using Informatica. Works with XMLs, XML parsers, and Java and HTTP transformations within Informatica. Experience in integration of various data sources like Oracle, SQL Server, DB2 and flat files in various formats like fixed-width, CSV, Salesforce and Excel. Has in-depth knowledge and experience in implementing best practices for design and development of data warehouses using star schema and snowflake schema design concepts. Experience in performance tuning of sources, targets, mappings, transformations, and sessions. Carried out support and development activities in a relational database environment; designed tables, procedures/functions, packages, triggers and views in relational databases and used SQL proficiently in database programming using SNFL. What's in it For You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. 
About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
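The posting above calls for building type-2 (slowly changing) dimensions. As a minimal sketch of that logic, not the team's actual implementation: expire the current row when a tracked attribute changes, then insert a new current row. Table and column names here are hypothetical, and sqlite3 stands in for the warehouse database.

```python
import sqlite3

# Hypothetical customer dimension; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Pune', '2024-01-01', NULL, 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    """Type-2 update: close out the current row, then insert a new current row."""
    cur.execute("""
        UPDATE dim_customer
        SET valid_to = ?, is_current = 0
        WHERE customer_id = ? AND is_current = 1 AND city <> ?
    """, (load_date, customer_id, new_city))
    if cur.rowcount:  # insert only when the tracked attribute actually changed
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, load_date),
        )

apply_scd2(cur, 1, "Hyderabad", "2024-06-01")
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
print(rows)  # [('Pune', 0), ('Hyderabad', 1)]
```

In Informatica PowerCenter this update/insert pair would typically be expressed through a lookup plus an update-strategy transformation rather than hand-written SQL; the row-level outcome is the same.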
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world’s leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data- and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards an optimal customer and employee experience. About the role: In this opportunity as Application Support Analyst, you will: Provide Informatica support; the engineer will be responsible for supporting Informatica development, extractions, and loading, fixing data discrepancies, and taking care of performance monitoring. Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes. Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs. Demonstrate a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management. Bring experience in supporting applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure. About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes: 3 to 8+ years' experience as an Informatica Developer and Support engineer, responsible for implementing ETL methodology in data extraction, transformation, and loading. 
Knowledge of ETL design: designs new or changed mappings and workflows with the team and prepares technical specifications. Experience creating ETL Mappings, Mapplets, Workflows, and Worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation. Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schemas, etc.). Able to perform source system analysis as required. Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse. Implements versioning of the ETL repository and supporting code as necessary. Develops stored procedures, database triggers, and SQL queries where needed. Implements best practices and tunes SQL code for optimization. Loads data from Salesforce via PowerExchange to relational databases using Informatica. Works with XML, the XML parser, and the Java and HTTP transformations within Informatica. Experience integrating various data sources such as Oracle, SQL Server, DB2, Salesforce, and flat files in formats such as fixed width, CSV, and Excel. In-depth knowledge of and experience in implementing best practices for the design and development of data warehouses using star schema and snowflake schema design concepts. Experience in performance tuning of sources, targets, mappings, transformations, and sessions. Has carried out support and development activities in a relational database environment, designed tables, procedures/functions, packages, triggers, and views in relational databases, and used SQL proficiently in database programming.
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Pune
Work from Office
Responsibilities A day in the life of an Infoscion: Support services and products across several technical domains with the full spectrum of production support responsibilities. Uphold high standards for timely issue resolution. Ensure workflows, processes, tooling, and applications are of the highest quality standard. Contribute expertise to the management of existing and new IT products and services. Define workarounds for known errors and initiate process improvements. Maintain a knowledge database. Be flexible to work APAC/EMEA hours. Technical and Professional Requirements: Ideally 4-6 years of hands-on experience in an IT support role within the financial services industry. Proficient with software development tools such as Unix, shell scripting, Oracle, Sybase, MSSQL, Autosys, Informatica, Splunk, AppDynamics, and Java. Good knowledge of ITIL processes; knowledge of cloud technologies (Azure), RESTful APIs, microservices, and Compliance and Legal IT systems is an added advantage. Ability to solve complex issues, with strength in problem-statement analysis and solution design thinking. Track record of influencing senior IT stakeholders and business partners. Confident communicator who can explain technology to non-technical audiences. Capable of understanding client needs and translating them into products and services. Preferred Skills: Application Support Technology->Architecture->Architecture - ITIL Technology->Open System->Shell scripting Technology->Open System->UNIX Technology->Infrastructure- Batch Scheduler->Autosys Technology->Oracle->PL/SQL Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies and knowledge of quality processes. Basics of the business domain, sufficient to understand the business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical, and debugging skills. Educational Requirements: Master of Engineering, MBA, MCA, MTech, Bachelor of Engineering, BCA, BTech. Service Line: Cloud & Infrastructure Services. * Location of posting is subject to business requirements.
Posted 1 week ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centers. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise. Responsibilities Key Responsibilities Governance Leadership: Define and implement the enterprise BI governance strategy, policies, and operating model. Drive consistent governance processes across sectors and global capability centers. Set standards for the BI solution lifecycle, metadata management, report rationalization, and data access controls. Stakeholder Management: Serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities. Lead governance councils, working groups, and decision forums to drive adoption and compliance. Policy and Compliance: Establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control. Implement approval and exception processes for BI development outside the COE. Demand and Intake Governance: Lead the governance of BI demand intake and prioritization processes. Ensure transparency and traceability of BI requests and outcomes across business units. Metrics and Continuous Improvement: Define KPIs and dashboards to monitor BI governance maturity and compliance. Identify areas for process optimization and lead continuous improvement efforts. Qualifications Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity. 
Domain Expertise: Strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks. Strategic Mindset: Proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals. Operational Excellence: Experience managing cross-functional governance processes and balancing centralized control with local flexibility. Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Hands-on experience with Informatica PowerCenter or IICS. Worked on various basic and advanced transformations in Informatica, across multiple source systems. Experienced in creating source-to-target mapping documents. Gathers requirements from stakeholders and converts them into detailed design documents. Able to debug issues in Informatica IICS and PowerCenter. Worked on API and REST V2 connectors. Hands-on experience in Oracle/SQL.
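Debugging the source-to-target mappings this posting describes usually starts with reconciliation: comparing row counts and column aggregates between source and target to locate discrepancies. A minimal sketch of that check, with hypothetical table names and sqlite3 standing in for the actual source and target databases:

```python
import sqlite3

# Illustrative only: in practice the source (e.g. Oracle) and the target
# warehouse would be separate connections; here both are in-memory tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.5), (3, 75.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.5), (3, 75.0)])

def reconcile(cur, src, tgt):
    """Compare row counts and amount totals between source and target tables."""
    src_count, src_sum = cur.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_count, tgt_sum = cur.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return src_count == tgt_count and src_sum == tgt_sum

print(reconcile(cur, "src_orders", "tgt_orders"))  # True
```

A mismatch in the count points to dropped or duplicated rows in the load; a matching count with a mismatched sum points to a transformation-logic discrepancy.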
Posted 1 week ago
The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for Informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect
In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!