
5357 Informatica Jobs - Page 19

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

20 - 30 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Role: Lead Data Engineer
Experience: 10+ years
Location: Pune, Bengaluru, Hyderabad, Chennai, Gurugram, Noida
Work Mode: Hybrid (3 days work from office)
Key Skills: Snowflake, SQL, Data Engineering, ETL, Any Cloud (GCP/AWS/Azure)

Must-Have Skills:
- Proficient in Snowflake and SQL: 4+ years of experience in Snowflake and 8+ years in SQL
- At least 10+ years of experience in data engineering development projects
- At least 6+ years of experience in data engineering on cloud technology
- Strong expertise with the Snowflake data warehouse platform, including architecture, features, and best practices
- Hands-on experience with ETL and data engineering tools
- Design, develop, and maintain efficient ETL/ELT pipelines using Snowflake and related data engineering tools
- Optimize Snowflake data warehouses for performance, cost, and scalability
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver data solutions
- Implement data modeling and schema design best practices in Snowflake
- Good communication skills are a must

Good to have:
- Knowledge of DNA/Fiserv core banking system
- Knowledge of data governance, security, and compliance standards
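The ETL/ELT pipeline work named in this posting often boils down to incremental upserts into warehouse tables. As a minimal, hedged sketch (SQLite stands in for Snowflake, where the same step would typically be a MERGE; all table and column names are hypothetical):

```python
import sqlite3

# Hypothetical incremental-load step: upsert staged rows into a target table.
# SQLite stands in for Snowflake; in Snowflake this would usually be a MERGE.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.execute("CREATE TABLE stg_customer (id INTEGER, name TEXT, city TEXT)")

conn.execute("INSERT INTO dim_customer VALUES (1, 'Asha', 'Pune')")
conn.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)",
                 [(1, "Asha", "Hyderabad"), (2, "Ravi", "Chennai")])

# Upsert: insert new ids, overwrite changed attributes for existing ids.
# (WHERE TRUE disambiguates INSERT ... SELECT with ON CONFLICT in SQLite.)
conn.execute("""
    INSERT INTO dim_customer (id, name, city)
    SELECT id, name, city FROM stg_customer WHERE TRUE
    ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city
""")

rows = conn.execute("SELECT id, city FROM dim_customer ORDER BY id").fetchall()
print(rows)  # [(1, 'Hyderabad'), (2, 'Chennai')]
```

The same upsert-from-staging pattern underlies most of the "efficient ETL/ELT pipelines" the role asks for, whatever the actual tooling.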

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad

Work from Office

The Minimum Qualifications:
- 7-9 years of experience with data analytics, data modelling, and database design
- 3+ years of coding and scripting (Python, Java, Scala) and design experience
- 3+ years of experience with the Spark framework
- 5+ years of experience with ELT methodologies and tools
- 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL
- Knowledge of Informatica PowerCenter and Informatica IDMC
- Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake
- Strong data analysis skills for extracting insights from financial data
- Proficiency in reporting tools (e.g., Power BI, Tableau)

The Ideal Qualifications (Technical Skills):
- Domain knowledge of Investment Management operations, including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing
- Familiarity with regulatory requirements and compliance standards in the investment management industry
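The SQL mastery this posting asks for typically includes analytical queries such as window functions. A small, hedged illustration (SQLite via Python's stdlib; the securities table and its contents are invented for the example):

```python
import sqlite3

# Hypothetical example: rank securities by price within each asset class
# using a window function -- the kind of analytical SQL the role calls for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (security TEXT, asset_class TEXT, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?, ?)", [
    ("BondA", "Fixed Income", 101.2),
    ("BondB", "Fixed Income", 99.8),
    ("EqX", "Equity", 55.0),
    ("EqY", "Equity", 80.5),
])

ranked = conn.execute("""
    SELECT security,
           RANK() OVER (PARTITION BY asset_class ORDER BY price DESC) AS rnk
    FROM prices
""").fetchall()

# Top-priced security in each asset class
top = [s for s, r in ranked if r == 1]
print(sorted(top))  # ['BondA', 'EqY']
```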

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery)
- Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications
- Experience in performance tuning of data processes
- Expertise in designing and optimizing data warehouses for cost efficiency
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets
- Capacity to clearly explain and communicate design and development aspects to customers
- Ability to estimate time and resource requirements for developing and debugging features or components

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLF
- Proficiency in SQL for analytics, including windowing functions
- Understanding of data schemas and models relevant to various business contexts
- Familiarity with domain-related data and its implications
- Expertise in data warehousing optimization techniques
- Knowledge of data security concepts and best practices
- Familiarity with design patterns and frameworks in data engineering

Additional Comments (Skills):
- Cloud platforms (AWS, MS Azure, GCP, etc.)
- Containerization and orchestration (Docker, Kubernetes, etc.)
- API development
- Data pipeline construction using languages like Python, PySpark, and SQL
- Data streaming (Kafka, Azure Event Hub, etc.)
- Data parsing (Akka, MinIO, etc.)
- Database management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.)
- Agile methodology and tooling (Git, Jenkins, or Azure DevOps, etc.)
- JS connectors/frameworks for frontend/backend
- Collaboration and communication skills
- AWS Cloud, Azure Cloud, Docker, Kubernetes
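The ingest, wrangle, transform, and join steps this role description names can be sketched in plain Python. This is a toy illustration only; in practice the work would use PySpark, Glue, or Dataproc, and the file contents and column names here are hypothetical:

```python
import csv
import io

# Hypothetical order feed with one malformed amount, plus a customer lookup.
orders_csv = "order_id,cust_id,amount\n1,10,250\n2,11,bad\n3,10,100\n"
customers = {10: "Asha", 11: "Ravi"}

# Ingest: parse the raw feed into records.
rows = list(csv.DictReader(io.StringIO(orders_csv)))

# Wrangle: drop rows whose amount is not numeric.
clean = [r for r in rows if r["amount"].isdigit()]

# Transform + join: total spend per customer name.
totals = {}
for r in clean:
    name = customers[int(r["cust_id"])]
    totals[name] = totals.get(name, 0) + int(r["amount"])

print(totals)  # {'Asha': 350}
```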

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Noida, Pune, Bengaluru

Work from Office

Description: We are seeking a proficient Data Governance Engineer to lead the development and management of robust data governance frameworks on Google Cloud Platform (GCP). The ideal candidate will bring in-depth expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational goals.

Requirements:
- 4+ years of experience in data governance, data management, or data security
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Dataproc, and Google Data Catalog
- Strong command of metadata management, data lineage, and data quality tools (e.g., Collibra, Informatica)
- Deep understanding of data privacy laws and compliance frameworks
- Proficiency in SQL and Python for governance automation
- Experience with RBAC, encryption, and data masking techniques
- Familiarity with ETL/ELT pipelines and data warehouse architectures

Job Responsibilities:
- Develop and implement comprehensive data governance frameworks, focusing on metadata management, lineage tracking, and data quality.
- Define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS.
- Manage metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborate with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automate processes for data classification, monitoring, and reporting using Python and SQL.
- Support data stewardship initiatives, including the development of data dictionaries and governance documentation.
- Optimize ETL/ELT pipelines and data workflows to meet governance best practices.

What We Offer:
- Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skills training.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can enjoy coffee or tea with colleagues over a game, plus discounts at popular stores and restaurants!
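One responsibility above is automating data classification and masking with Python. A hedged, rule-based sketch of that idea follows; the PII patterns are illustrative only, not production-grade detection:

```python
import re

# Illustrative masking rules. Real governance tooling (DLP, Informatica, etc.)
# would use far more robust classifiers; these patterns are toy examples.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{10}\b"),
}

def mask(text: str) -> str:
    """Replace each detected PII value with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

record = "Contact asha@example.com or 9876543210 for access."
print(mask(record))  # Contact <email> or <phone> for access.
```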

Posted 1 week ago

Apply

8.0 - 13.0 years

40 - 45 Lacs

Hyderabad

Work from Office

About the job: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions.

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives. As part of the Digital M&S Foundations organization, the data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational and dimensional databases. These solutions support Manufacturing and Supply data and analytical products and other business interests.

What you will be doing:
- Be responsible for the development of the conceptual, logical, and physical data models in line with the architecture and platforms strategy
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices
- Work independently and collaboratively with the M&S teams
- Demonstrate strong expertise in one of the following functional business areas of M&S: Manufacturing, Quality, or Supply Chain

Main Responsibilities:
- Design and implement business data models in line with data foundations strategy and standards
- Work with business and application/solution teams to understand requirements, build data flows, and develop conceptual/logical/physical data models
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, and analytic models
- Perform hands-on data modeling, design, configuration, and performance tuning
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks

Skills:
- Bachelor's or Master's degree in computer/data engineering, or related technical experience
- 8+ years of hands-on relational, dimensional, and/or analytic experience, including 5+ years of hands-on experience with data from core manufacturing and supply chain systems such as SAP, Quality Management, LIMS, MES, and Planning
- Hands-on experience programming in SQL
- Experience with data warehouse (Snowflake), data lake (AWS-based), and enterprise big data platforms in a pharmaceutical company
- Good knowledge of metadata management, data modeling, and related tools: Snowflake, Informatica, DBT
- Experience with Agile
- Good communication and presentation skills

Why choose us:
- Bring the miracles of science to life alongside a supportive, future-focused team.
- Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally.
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks of gender-neutral parental leave.
- Work in an international environment, collaborating with diverse business teams and vendors, in a dynamic team, fully empowered to propose and implement innovative ideas.

Pursue Progress. Discover Extraordinary. Progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people, chasing change, embracing new ideas, and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
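The dimensional modeling this role centers on usually means star schemas: a fact table keyed to dimension tables. A minimal sketch, loosely styled after a manufacturing domain (SQLite via Python's stdlib; every table and column name here is hypothetical):

```python
import sqlite3

# Toy star schema: one fact table (yield measurements) joined to dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_site  (site_id INTEGER PRIMARY KEY, site_name TEXT);
    CREATE TABLE dim_batch (batch_id INTEGER PRIMARY KEY, product TEXT);
    CREATE TABLE fact_yield (
        site_id  INTEGER REFERENCES dim_site(site_id),
        batch_id INTEGER REFERENCES dim_batch(batch_id),
        units    INTEGER
    );
    INSERT INTO dim_site  VALUES (1, 'Hyderabad');
    INSERT INTO dim_batch VALUES (7, 'VaccineX');
    INSERT INTO fact_yield VALUES (1, 7, 5000), (1, 7, 4500);
""")

# Analytical query: total units produced per site.
total = conn.execute("""
    SELECT s.site_name, SUM(f.units)
    FROM fact_yield f JOIN dim_site s USING (site_id)
    GROUP BY s.site_name
""").fetchone()
print(total)  # ('Hyderabad', 9500)
```

Facts stay narrow and additive; descriptive attributes live in the dimensions, which is what makes this layout efficient to query and extend.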

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 19 Lacs

Pune

Hybrid

Bitwise is hiring IDMC Data Engineers. Please find below the detailed JD for your reference. Kindly share your updated CV at chetana.kashid@bitwiseglobal.com.

Mandatory skills:
1. Strong experience with IDMC-CDI
2. Strong experience with Informatica PowerCenter
3. Good understanding of SQL concepts
4. Good understanding of ETL concepts
5. Experience with relational databases

Good to have:
1. IDMC certification
2. Experience with DataStage
3. Basic Unix commands and shell scripting
4. Good communication skills
5. Proactive nature and positive attitude

Posted 1 week ago

Apply

5.0 - 8.0 years

22 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Summary: We are looking for an experienced Informatica Cloud Developer with 5-8 years of hands-on experience in data integration, ETL, and Informatica Intelligent Cloud Services (IICS). The ideal candidate should have a strong understanding of cloud-based data architecture and ETL/ELT best practices, and be able to build and manage scalable data pipelines using Informatica Cloud.

Key Responsibilities:
- Design, develop, and deploy data integration workflows using Informatica Intelligent Cloud Services (IICS).
- Migrate and optimize ETL processes from on-premises to cloud environments.
- Build scalable and reusable data pipelines for ingestion, transformation, and loading into cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery).
- Collaborate with data architects and business analysts to gather data integration requirements.
- Implement performance tuning and error handling in ETL jobs.
- Monitor and troubleshoot ETL jobs to ensure smooth data flow and accurate reporting.
- Work closely with cross-functional teams, including Data Engineering, BI, and Application teams.
- Ensure data quality and integrity throughout the integration process.
- Prepare technical documentation, unit test cases, and deployment guidelines.

Required Skills:
- 5-8 years of experience in ETL development with a strong focus on Informatica Cloud (IICS).
- Proficiency in developing Cloud Data Integration (CDI) and Application Integration solutions.
- Solid experience working with REST/SOAP APIs and JSON/XML formats.
- Good understanding of SQL and relational databases (e.g., Oracle, SQL Server, PostgreSQL).
- Hands-on experience with at least one cloud data platform, such as Snowflake, Azure Synapse, AWS Redshift, or GCP BigQuery.
- Experience with job scheduling and monitoring tools.
- Exposure to version control tools (e.g., Git) and CI/CD practices.
- Strong problem-solving, analytical, and communication skills.

Preferred Qualifications:
- Informatica IICS or PowerCenter certifications.
- Knowledge of data warehousing concepts, data lakes, and cloud data architecture.
- Experience with Python, shell scripting, or automation frameworks is a plus.
- Prior experience in Agile/Scrum delivery environments.

Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
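A recurring task behind the REST/JSON integration skills listed above is flattening a nested API payload into rows for loading. A hedged sketch (the payload shape and field names are invented, not an actual IICS or vendor API response):

```python
import json

# Hypothetical nested API response, as a raw JSON string.
payload = json.loads("""
{
  "records": [
    {"id": "A1", "attrs": {"status": "active", "region": "APAC"}},
    {"id": "B2", "attrs": {"status": "closed", "region": "EMEA"}}
  ]
}
""")

# Flatten nested records into tuples matching a target table's columns.
rows = [
    (rec["id"], rec["attrs"]["status"], rec["attrs"]["region"])
    for rec in payload["records"]
]
print(rows)  # [('A1', 'active', 'APAC'), ('B2', 'closed', 'EMEA')]
```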

Posted 1 week ago

Apply

7.0 - 12.0 years

6 - 10 Lacs

Chennai

Work from Office

At Alight, we believe a company's success starts with its people. At our core, we Champion People, help our colleagues Grow with Purpose and, true to our name, we encourage colleagues to Be Alight.

Our Values: Champion People: be empathetic and help create a place where everyone belongs. Grow with Purpose: be inspired by our higher calling of improving lives. Be Alight: act with integrity, be real, and empower others.

It's why we're so driven to connect passion with purpose. Alight helps clients gain a benefits advantage while building a healthy and financially secure workforce by unifying the benefits ecosystem across health, wealth, wellbeing, absence management, and navigation. With a comprehensive total rewards package, continuing education and training, and tremendous potential with a growing global organization, Alight is the perfect place to put your passion to work. Join our team if you Champion People, want to Grow with Purpose through acting with integrity, and if you embody the meaning of Be Alight. Learn more at careers.alight.com.

Alight is seeking a skilled and passionate Senior ETL Software Developer to join our team. As the Senior ETL Developer, you will be a member of a team responsible for various stages of software development, including understanding business requirements, coding, testing, documentation, deployment, and production support. As part of the ETL development team, you will focus on delivering high-quality, enterprise-caliber systems on Informatica PowerCenter, with sources and targets including flat files, MS Dynamics CRM, and Microsoft SQL Server. Your primary role will involve participating in full life-cycle data integration development projects.

Qualifications (Knowledge & Experience):
- 7+ years of data integration, data warehousing, or data conversion experience
- 4+ years of SQL writing and optimization experience
- 5+ years of Informatica PowerCenter experience
- 3.5+ years working with Microsoft SQL Server Management Studio
- Experience with XML file data integration
- Experience with UNIX shell scripting
- Experience with Microsoft Dynamics or another CRM system preferred
- Strong understanding of using ETL tools to integrate internal and third-party systems
- Excellent analytical and critical thinking skills
- Strong interpersonal skills with the ability to work effectively with diverse and remote teams
- Experience in agile processes and development task estimation
- Strong sense of responsibility for deliverables
- Ability to work in a small team with moderate supervision

Responsibility Areas:
- Design software solutions for small- to medium-complexity requirements independently, adhering to existing standards
- Develop high-priority and highly complex code for systems based on functional specifications, detailed design, maintainability, and coding and efficiency standards, working independently
- Estimate and evaluate risks, and prioritize technical tasks based on requirements
- Collaborate actively with the ETL Lead, Product Owners, Quality Assurance, and stakeholders to ensure high-quality project delivery
- Conduct and supervise formal code reviews to ensure compliance with standards
- Appropriately utilize system design, development, and process standards
- Write and execute unit test cases to verify basic functionality, both for your own code and that of your peers
- Create, maintain, and publish system-level documentation, including system diagrams, with minimal guidance
- Ensure clarity, conciseness, and completeness of requirements before starting development, collaborating with Business Analysts and stakeholders to evaluate feasibility
- Take primary accountability for meeting non-functional requirements

Education: Bachelor's degree (preferred concentrations in Computer Science, MIS, or Engineering) or equivalent work experience. Master's degree in a related area preferred. Computer application certifications, as applicable.

Alight requires all virtual interviews to be conducted on video.
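The XML file data integration experience requested above typically means extracting fields from an XML feed into rows for a downstream load. A small illustrative sketch using Python's stdlib parser (the document layout and tag names are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical XML feed, e.g. an extract destined for SQL Server.
doc = """
<employees>
  <employee id="e1"><name>Asha</name><dept>Benefits</dept></employee>
  <employee id="e2"><name>Ravi</name><dept>Payroll</dept></employee>
</employees>
"""

root = ET.fromstring(doc)
# Pull attribute and child-element values into load-ready tuples.
rows = [
    (e.get("id"), e.findtext("name"), e.findtext("dept"))
    for e in root.iter("employee")
]
print(rows)  # [('e1', 'Asha', 'Benefits'), ('e2', 'Ravi', 'Payroll')]
```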
Flexible Working: So that you can be your best at work and home, we consider flexible working arrangements wherever possible. Alight has been a leader in flexible workspaces and a Top 100 Company for Remote Jobs five years in a row.

Benefits: We offer programs and plans for a healthy mind, body, wallet, and life, because it's important our benefits care for the whole person. Options include a variety of health coverage options; wellbeing and support programs; retirement; vacation and sick leave; maternity, paternity, and adoption leave; continuing education and training; and several voluntary benefit options.

By applying for a position with Alight, you understand that, should you be made an offer, it will be contingent on your undergoing and successfully completing a background check consistent with Alight's employment policies. Background checks may include some or all of the following, based on the nature of the position: SSN/SIN validation, education verification, employment verification, criminal check, search against global sanctions and government watch lists, credit check, and/or drug test. You will be notified during the hiring process which checks are required for the position.

Our Commitment to Inclusion: We celebrate differences and believe in fostering an environment where everyone feels valued, respected, and supported. We know that diverse teams are stronger, more innovative, and more successful. At Alight, we welcome and embrace all individuals, regardless of their background, and are dedicated to creating a culture that enables every employee to thrive. Join us in building a brighter, more inclusive future.

Authorization to Work in the Employing Country: Applicants for employment in the country in which they are applying (Employing Country) must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the Employing Country and with Alight.

Note: this job description does not restrict management's right to assign or reassign duties and responsibilities of this job to other entities, including but not limited to subsidiaries, partners, or purchasers of Alight business units. We offer you a competitive total rewards package, continuing education and training, and tremendous potential with a growing worldwide organization.

DISCLAIMER: Nothing in this job description restricts management's right to assign or reassign duties and responsibilities of this job to other entities, including but not limited to subsidiaries, partners, or purchasers of Alight business units.

Posted 1 week ago

Apply

1.0 - 6.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Our Partner's Irish operations were established in Dublin in 1997. Our Partner has a Fund/Asset Management Company and a Life Insurance Company, each focused on the research, development, and discovery of innovative investment products and services, which it then delivers to its European retail client base through UCITS and non-UCITS mutual funds as well as unit-linked life insurance products. These products are distributed primarily through the Group's banking entities and their Family Banker sales networks.

The Data Operations Associate is responsible for assisting in the delivery of quality, integrity, and efficacy of asset data across our information environment, and for executing data quality routines. This individual will proactively monitor, measure, track, and report on data ingestion and operations. The Specialist will also assist in the technical execution of data quality runbooks, including the escalation of identified issues to the respective business owners and/or vendors as directed by the Senior Data Quality Engineer.

Roles and Responsibilities:
- Ensure proper monitoring, alerting, and tracking of data ingestion processes using DataDog.
- Document data operations and monitoring solutions in clear and concise guides.
- Engage with the Operations team to remediate data operation errors for key Our Partner systems.
- Execute data quality runbooks when required, under the guidance of the Data Operations team.
- Develop subject matter expertise to enhance our data offering to clients.
- Assist in the creation/maintenance of data catalogues and business dictionaries/glossaries in Informatica.
- Run data quality rules and notify key stakeholders.
- Contribute to team effort by accomplishing related results as needed.

Key Requirements:
- 1 year of work experience in the data management or financial services industry.
- Keen interest in the finance domain and knowledge of financial markets.
- Experience using Microsoft tools such as Excel, SharePoint, and Office 365.
- Basic knowledge of SQL is required.
- Knowledge of Informatica and/or DataDog is a clear advantage.
- Fluency in English, with excellent writing and communication skills.
- Bachelor's degree in Computer Science, Data Analysis, or an equivalent relevant qualification.
- Prior exposure to project management and stakeholder management responsibilities.

Joining Our Partner will give you a fantastic opportunity to work in the most innovative space in an already innovative, fast-growing company, rapidly adding achievements to your portfolio and playing a pivotal role in the growth of the organisation.
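A core duty in this role is executing data quality rules and escalating failures. As a toy sketch of that idea (the rules, fields, and sample records are hypothetical; real checks would live in Informatica or monitoring tooling such as DataDog):

```python
# Hypothetical asset records; real data would come from upstream feeds.
records = [
    {"isin": "IE00ABCD1234", "price": 101.5},
    {"isin": "", "price": 99.0},
    {"isin": "IE00WXYZ9876", "price": -3.0},
]

def quality_issues(rec):
    """Apply simple data quality rules; return a list of rule violations."""
    issues = []
    if not rec["isin"]:
        issues.append("missing ISIN")
    if rec["price"] <= 0:
        issues.append("non-positive price")
    return issues

# Collect failing records (by index) for escalation to business owners.
failures = {}
for i, rec in enumerate(records):
    issues = quality_issues(rec)
    if issues:
        failures[i] = issues

print(failures)  # {1: ['missing ISIN'], 2: ['non-positive price']}
```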

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Mumbai

Work from Office

Company Description Strategy (Nasdaq: MSTR) is at the forefront of transforming organizations into intelligent enterprises through data-driven innovation. We dont just follow trends we set them and drive change. As a market leader in enterprise analytics and mobility software, weve pioneered BI and analytics space, empowering people to make better decisions and revolutionizing how businesses operate. But thats not all. Strategy is also leading to a groundbreaking shift in how companies approach their treasury reserve strategy, boldly adopting Bitcoin as a key asset. This visionary move is reshaping the financial landscape and solidifying our position as a forward-thinking, innovative force in the market. Four years after adopting the Bitcoin Standard, Strategys stock has outperformed every company in S&P 500. Our people are the core of our success. At Strategy, youll join a team of smart, creative minds working on dynamic projects with cutting-edge technologies. We thrive on curiosity, innovation, and a relentless pursuit of excellence. Our corporate values bold, agile, engaged, impactful, and united are the foundation of our culture. As we lead the charge into the new era of AI and financial innovation, we foster an environment where every employees contributions are recognized and valued. Join us and be part of an organization that lives and breathes innovation every day. At Strategy, youre not just another employee; youre a crucial part of a mission to push the boundaries of analytics and redefine financial investment. Job Description Reporting to the Senior Director of SaaS, the Salesforce Developer is responsible for customizing, developing and supporting solutions on the Salesforce and ServiceNow platform. The ideal candidate will strong understanding of the Salesforce.com and ServiceNow platform, and basic to intermediate understanding of integrations, single sign, security, etc. 
(Informatica, other ETL tools a huge plus), interest and ability to understand the problem to solve, solution design and critical path to develop. The candidate will have exceptional technical, analytical and problem-solving skills and be comfortable interacting with all levels of the organization. We are seeking a self-starter, with a bias towards action, who can recognize and make process improvement recommendations. Responsibilities: Perform day-to-day administration of the ServiceNow system, including making approved changes to forms, tables, reports, and workflows Create and customize reports, homepages, and dashboards in ServiceNow Ensure the ServiceNow platform and tools remain current by performing testing and installation of ServiceNow updates, patches, and new releases Create and configure Business Rules, UI Policies, UI Actions and ScriptsDesign and develop advanced ServiceNow customizations Troubleshoot multiple integrations with ServiceNow and Rally. Manage ServiceNow security by managing roles and access control lists Train personnel in ServiceNow use and processes to include creating supporting documentation including user and training guides Work directly with end users to resolve support issues within ServiceNow Oversee code reviews. Design, develop, configure, test and deploy solutions built on Salesforce platform Responsible for configuration, design, functionality and end-user support of the Force.com platform Implement solutions in an agile environment delivering high-quality code and configuration Develop and manage Workflows, Process Builder, Assignment rules, email templates, and all other declarative and programmatic features. Handle mass imports and exports of data. Customize custom objects, fields, reports, 3rd party apps etc. Manage Users, Profiles, Permission Sets, Security, and other administrations tasks. Lead testing of various functionalities, create test data, test plans and perform feature testing. 
Demo solutions to users, and train and document as needed. Provide ongoing support and system administration to quickly fix production issues. Map functional requirements to Salesforce.com features and functionality. Implement change control and best practices with regard to system maintenance, configuration, development, testing, data integrity, etc. Hands-on Sales Cloud, ServiceNow, and Salesforce Community experience. A programming background with the ability to develop custom code using Visualforce / Apex / Lightning / JavaScript to meet user requirements. Knowing when to use out-of-the-box functionality versus custom code.

We are seeking candidates with: Outstanding listening, analytical, organizational, and time management skills. Excellent written and oral communication skills; a high level of diplomacy and professionalism. Strong work ethic, hands-on, with a customer service mentality. Team player with the ability to work cross-functionally; self-driven, motivated, and able to work under pressure. Able to work independently and lead projects of moderate complexity. Ability to identify areas for process improvement and recommend/implement solutions. Proven creativity and problem-solving skills; ability to work around obstacles and solve problems with minimal direction. Ability to develop effective relationships with business users, technical staff, and executive management. Ability to prioritize work and meet deadlines in a fast-paced environment. Flexibility with a demonstrated ability to embrace change. Bachelor's degree (BA/BS) in Computer Science or a similar technical degree, or equivalent experience. 3+ years of hands-on experience developing on Salesforce and ServiceNow. In-depth knowledge of Salesforce and ServiceNow programmatic features. Ability to dig into data, surface actionable insights, and demonstrate sound judgment and decision-making skills. A problem-solver at heart.
Excellent written and oral communication skills; a high level of diplomacy and professionalism. Strong work ethic, hands-on, with a customer service mentality. Team player, able to work cross-functionally, self-driven, motivated, and able to work under pressure. Extensive experience using Data Loader and other data-loading tools. Additional background in ServiceNow, Community, CPQ, Marketo, and other integrations a plus. Experience using MS Excel and database modeling. Ability to work independently and be proactive. Able to work under pressure, multi-task, and manage changing priorities and workload.

Additional Information: The recruitment process includes online assessments as a first step. We send them via e-mail; please also check your SPAM folder. We work from the Pune office 4 times a week.

Posted 1 week ago

Apply

12.0 - 17.0 years

30 - 37 Lacs

Pune

Work from Office

This role will serve as a strategic leader within our Data & Analytics team, responsible for the development and delivery of scalable, secure, and high-performance data pipelines and analytical models. This individual will play a critical role in shaping the enterprise data ecosystem, driving innovation across analytics and data integration initiatives, and ensuring operational excellence in data engineering practices. You will lead a global team of data engineers to support enterprise-wide analytics across Sales, Marketing, Services, Customer Success, Finance, HR, Product, and Engineering. As a technical leader, you will balance hands-on contributions with strategic direction, governance, and execution oversight. You'll collaborate closely with enterprise architects, data & reporting analysts, data scientists, and business stakeholders to enable actionable insights and build a modern data infrastructure that supports our company's growth. This role goes beyond traditional BI; it includes developing predictive models and uncovering insights with Gen AI-powered features to drive data-driven decision making. You'll use tools like Python and Snowflake Cortex, alongside SQL, DBT, Airflow, and Snowflake, to deliver scalable solutions that influence strategy and execution across the enterprise.

In this role, you will... Provide strategic and technical leadership to the data engineering function, including architecture, development, quality, and delivery of data solutions. Lead and mentor a global team of engineers to build and support robust, performant data pipelines using Snowflake, DBT, Python, and AWS-based infrastructure. Partner with business stakeholders, Data Governance, and IT teams to define and execute a multi-year roadmap for data platform modernization and analytics enablement. Own the ETL/ELT development lifecycle, including design, orchestration, configuration, automation, testing, and support, ensuring adherence to SDLC, architectural standards, and security practices.
Establish and enforce scalable best practices for data ingestion, transformation, modeling, documentation, and operational support. Define current and future state BI and data engineering architectures to support enterprise reporting, self-service analytics, and advanced data science use cases. Drive adoption of modern tools, techniques, and reusable frameworks that increase engineering velocity and reliability (e.g., CI/CD for data pipelines, automated testing, monitoring). Proactively identify areas for data quality, performance, and process improvement across the data pipeline ecosystem, and lead remediation efforts. Act as an escalation point for complex production issues and strategic project deliverables. Serve as a technical thought leader and advocate for data engineering practices across the organization.

You have what it takes if you have... 12+ years of experience in data engineering, data integration, or data architecture roles, including 4+ years leading high-performing teams in enterprise environments. Deep expertise in modern cloud data platforms (Snowflake, Azure, and/or AWS Redshift) and ETL/ELT tools such as DBT, Informatica, Talend, or equivalent. Strong hands-on experience designing and delivering scalable, secure, and resilient data pipelines and APIs in cloud environments (AWS preferred). Proficiency in SQL and Python for data manipulation, pipeline development, and automation. Solid understanding of data warehousing concepts (e.g., CDC, SCD types, dimensional modeling), data lake architectures, and real-time streaming. Proven ability to balance tactical execution with long-term strategic planning and stakeholder alignment. Experience working in Agile/Scrum environments using tools like Jira, Git, and Confluence. Demonstrated success collaborating across global teams and partnering with business, IT, and analytics leaders. Strong communication, documentation, and stakeholder management skills.
Excellent communication skills and experience working cross-functionally with business and IT teams. Strong domain knowledge in CRM and marketing tech applications (Salesforce, Marketo, Gong, Gainsight). Bachelor's degree in Computer Science, Data Science, or equivalent.

Extra dose of awesome if you have... Knowledge of Tableau, Looker, or similar BI tools. Experience with predictive analytics in go-to-market functions (e.g., campaign attribution, customer lifetime value modeling, lead prioritization, churn). Experience with Oracle ERP, CPQ, and quote-to-cash processes in a SaaS / recurring-revenue company. Ability to work seamlessly as part of a multi-site, multicultural development and testing team, onshore and offshore, internal and external resources.
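The posting above asks for a solid understanding of warehousing concepts such as SCD types. As an illustration only, a Slowly Changing Dimension Type 2 update (expire the current row and append a new current row whenever a tracked attribute changes) can be sketched in plain Python; the function and field names (`valid_from`, `is_current`, etc.) are hypothetical conventions, not tied to any specific platform:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """SCD Type 2 sketch: when a tracked attribute changes for an
    existing key, close out the current row (set valid_to / clear
    is_current) and append a fresh current row; new keys are inserted."""
    today = today or date.today().isoformat()
    out = list(dimension)
    current = {r[key]: r for r in out if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None:
            # brand-new key: insert as the current version
            out.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        elif any(rec[c] != cur[c] for c in tracked):
            # tracked attribute changed: expire old row, append new current row
            cur["valid_to"] = today
            cur["is_current"] = False
            out.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
    return out
```

In Snowflake or DBT this bookkeeping would normally be expressed as a `MERGE` statement or a DBT snapshot rather than row-by-row Python; the sketch only shows what the history tracking involves.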

Posted 1 week ago

Apply

3.0 - 7.0 years

25 - 30 Lacs

Ahmedabad

Work from Office

Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX, and our deep expertise in BFSI, healthcare, and life sciences to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents. We enable #HumanFirstDigital.

Tools & Technologies (total experience: 3-7 years): 3-5 years of hands-on experience in ETL processes. At least 3-4 years of experience with ETL/data integration tools like Pervasive, Informatica, Talend, iC4. Hands-on experience with all aspects of designing, developing, testing, and implementing ETL solutions using Pentaho/Kettle. Experience with Oracle DB, MS SQL Server, MySQL, or other database technology. Proficient in writing highly optimized SQL queries and algorithms for data processing. Ability to perform complex application testing, deployment, maintenance, and evolution activities by correcting programming errors, responding to scope changes, and coding application enhancements.
Passionate about complex data structures and problem solving. Knowledge of Azure services is an added advantage. Experience in C#.NET development and the Microsoft web development stack is an added advantage.

Other skills: Self-managed and results-oriented with a sense of ownership is required. Excellent analytical, debugging, and problem-solving skills are required. Experience with Agile/Scrum development methodologies is a plus.

Role in Project: ETL Developer / Production Support. Project/Work Details: Design and develop solutions to resolve defects and production problems, including the ability to document, communicate, and share specifications for solutions. Design data warehousing systems that meet specific business needs and work with a development team to build the warehouse. Successfully implement development processes, coding best practices, and code reviews.

Our Commitment to Diversity & Inclusion: Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: Group Health Insurance covering a family of 4; Term Insurance and Accident Insurance; Paid Holidays & Earned Leaves; Paid Parental Leave; Learning & Career Development; Employee Wellness. Job Location: Ahmedabad, India
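The requirements above center on ETL transformations and optimized data processing. One of the most common ETL building blocks is a lookup transformation (for example, a stream-lookup-style step in Pentaho/Kettle): hash the smaller table once, then enrich each source row. This minimal Python sketch uses hypothetical column names purely for illustration:

```python
def lookup_join(source_rows, lookup_rows, on, default=None):
    """Lookup-transformation sketch: enrich each source row with the
    matching row from a lookup table keyed on `on`; unmatched rows
    fall back to `default` values when provided."""
    index = {r[on]: r for r in lookup_rows}  # hash the smaller side once
    enriched = []
    for row in source_rows:
        match = index.get(row[on])
        extra = match if match is not None else (default or {})
        enriched.append({**row, **extra})
    return enriched
```

Building the hash index once makes the join O(source + lookup) instead of scanning the lookup table per row, which is the same reasoning behind pushing such joins into optimized SQL when both sides live in the database.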

Posted 1 week ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX, and our deep expertise in BFSI, healthcare, and life sciences to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents. We enable #HumanFirstDigital.

Job Title: MDM Consultant (Informatica MDM or Reltio). Location: Preferably Bengaluru or any Apexon location. Job Type: Full-time. Experience: 5 to 9 years (minimum 4+ years in MDM implementation). Work Mode: [Hybrid / Remote]. Reports To: MDM Lead / Data Architect / Project Manager.

Job Summary: We are looking for a highly skilled and hands-on MDM Consultant with a strong background in Informatica MDM and/or Reltio MDM to design, implement, and support enterprise Master Data Management solutions. The ideal candidate should have 4+ years of MDM-specific experience; a strong foundation in data modeling, match/merge rules, governance, and integration; and must be flexible and eager to learn other MDM tools as required.

Key Responsibilities: Lead or support the development and implementation of MDM solutions using Informatica MDM (Hub/IDD) and/or Reltio MDM. Perform MDM hub configurations, data modeling, match/merge rule tuning, survivorship rule setup, and trust configuration.
Configure and enhance IDD (Informatica Data Director) and/or Reltio UI for data stewardship, workflows, and validations. Develop data mappings, transformation logic, and data validations aligned with business rules. Interpret business requirements, define technical architectures, and convert them into robust MDM solutions. Collaborate with business analysts, data stewards, source system owners, and data governance teams. Define and implement data security, role-based access, and audit mechanisms. Participate in use case design, test scenario definition, and support during testing and deployment phases. Work closely with data architects to define and refine data architecture for master data domains. Support performance tuning, incident resolution, and continuous improvement initiatives in MDM environments. Required Skills & Qualifications: Minimum 4+ years of hands-on experience in MDM implementation and development. Strong expertise in Informatica MDM core components (Hub, IDD, SIF APIs). Proficient in SQL and experience with relational databases (Oracle, SQL Server, PostgreSQL, etc.). Solid understanding of MDM concepts like golden record creation, data mastering, survivorship, and hierarchies. Experience in Reltio MDM is a plus; willingness to learn Reltio or other MDM tools is essential. Strong knowledge of match/merge tuning, data stewardship processes, and MDM best practices. Ability to document and communicate technical architectures and standards effectively. Hands-on experience with data integration and ETL tools like Informatica PowerCenter, IICS, or others. Familiarity with data governance principles and experience collaborating with governance teams. Desirable Skills: Experience in cloud-based MDM solutions (e.g., Reltio, Informatica MDM SaaS, Azure/AWS/GCP). Exposure to DevOps/CI-CD practices and tools (Git, Jenkins, etc.). Knowledge of data quality tools (Informatica DQ/IDQ, Talend, etc.). Understanding of Agile/Scrum methodologies. 
Soft Skills: Strong communication and stakeholder management abilities. Analytical thinker with attention to detail and a problem-solving mindset. Collaborative and proactive team player. Flexible and eager to upskill in new technologies and tools in the MDM ecosystem.

Our Commitment to Diversity & Inclusion: Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: Group Health Insurance covering a family of 4; Term Insurance and Accident Insurance; Paid Holidays & Earned Leaves; Paid Parental Leave; Learning & Career Development; Employee Wellness. Job Location: Bengaluru, India
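The match/merge and survivorship configuration described in this posting boils down to per-attribute rules that decide which source's value survives into the golden record. Here is a heavily simplified, tool-agnostic sketch in Python; the source names and rules are made up for illustration, and real Informatica MDM or Reltio survivorship also weighs trust scores, recency, and completeness:

```python
def survivorship(records, rules):
    """Survivorship sketch: for each attribute, pick the non-empty value
    from the highest-priority source listed in `rules`, producing a
    single golden record from a cluster of matched duplicates."""
    golden = {}
    for attr, source_priority in rules.items():
        candidates = [r for r in records if r.get(attr) not in (None, "")]
        ranked = sorted(
            candidates,
            key=lambda r: source_priority.index(r["source"])
            if r["source"] in source_priority else len(source_priority),
        )
        if ranked:
            golden[attr] = ranked[0][attr]  # winner per attribute
    return golden
```

Note that the winner is chosen per attribute, not per record: the golden record can mix the CRM's email with the ERP's phone, which is exactly what attribute-level survivorship rules are for.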

Posted 1 week ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Mumbai

Work from Office

Job Title: Data Management Analyst

Short Description: Promote and ensure high-quality, fit-for-purpose data as part of the Chief Data Office, shaping industry-leading data management strategies.

Posting Description: The successful candidate will join a team focused on delivering business impact through transformation of Data Quality management. The role will require working with stakeholders across the CIB, other LOBs, and Corporate Functions to ensure fit-for-purpose data, looking to leverage best practice across the industry and other J.P. Morgan business units. This role provides an outstanding opportunity for the selected candidate to join a high-profile team at JPMorgan Chase and take a partnership role, supporting one of the Bank's most significant change programs.

Job responsibilities: Demonstrate a good understanding of data governance, data quality, and data lineage. Implement and support Data Quality (DQ) practices across the CIB. Govern and triage DQ issues as they progress through the lifecycle. Accelerate DQ-issue root cause analysis and resolution through effective program governance, process mapping, and data deep-dives. Discuss and agree technical resolutions with technology teams to remediate DQ issues. Discover and document data lineage to trace the end-to-end data journey from point of creation to consumption. Set up data profiling and DQ rules leveraging DQ tools like Collibra, Informatica, and other emerging tools. Leverage productivity tools such as Alteryx and visualization tools such as Tableau to analyze large datasets and draw inferences. Collaborate and build strong partnerships with business stakeholders and technology teams to support data quality efforts. Demonstrate teamwork by collaborating with others to integrate ideas and achieve common goals.

Required qualifications, capabilities & skills: 5+ years of experience in Financial Services with a Data Quality / Data Lineage / Business Analysis background. Excellent analytical and problem-solving skills.
Capacity to think laterally and convey an understanding of the big picture. Proficiency in manipulating and analyzing large data sets in Excel (pivots, formulas, charts, etc.). Excellent communication, presentation (both oral and written), and influencing skills; the candidate will be dealing with stakeholders of varying degrees of seniority across Corporate/Firmwide, Finance, Risk, Compliance, Operations, and Technology teams. Self-starter, able to work autonomously, with strong time management skills; efficient at multi-tasking and able to work under pressure to deliver multiple business demands on time, to a high standard. Basic understanding of the company's business practices and familiarity with the company's products and services.

Preferred qualifications, capabilities & skills: Experience in a CIB and CDO/Data Quality organization. Experience in querying databases/data stores for data analysis. Experience in producing PowerPoint presentations for senior audiences. Exposure to tools/technologies such as Alteryx / Tableau / SQL / Informatica / Collibra.
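Setting up data profiling and DQ rules, as described in the responsibilities above, typically means running validity checks over a column and reporting pass rates. This is a toy sketch in Python with illustrative rule names; tools like Informatica DQ or Collibra express the same idea as declarative rules feeding scorecards:

```python
def profile_column(values, rules):
    """DQ profiling sketch: evaluate each named rule over the non-null
    values of a column and report its pass rate, plus a completeness
    metric (share of non-null values) for the column itself."""
    results = {}
    non_null = [v for v in values if v is not None]
    for name, check in rules.items():
        passed = sum(1 for v in non_null if check(v))
        results[name] = passed / len(non_null) if non_null else 0.0
    results["completeness"] = len(non_null) / len(values) if values else 0.0
    return results
```

Validity rules are scored over non-null values only, so completeness and validity stay independent signals; conflating them is a common scorecard mistake.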

Posted 1 week ago

Apply

3.0 - 8.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Build Your Career at Informatica

We're looking for a diverse group of collaborators who believe data has the power to improve society. Adventurous minds who value solving some of the world's most challenging problems. Here, employees are encouraged to push their boldest ideas forward, united by a passion to create a world where data improves the quality of life for people and businesses everywhere.

Enablement Program Manager - Remote

We're looking for an Enablement Program Manager candidate with experience in program management and curriculum and course design to join our team remotely. You will report to the Field Enablement Director.

Technology You'll Use: Learning design using ADDIE/SAM or a similar methodology.

Your Role Responsibilities: Here's What You'll Do. You will support the development and execution of internal tech/product enablement programs and product go-to-market (GTM) strategy, guided by major solution use cases. This pivotal role involves oversight of internal enablement projects focused on Technical Sales, business development, partners, and Account Executives. You will ensure our teams are equipped with the knowledge and skills to maximize the value of our technology. Sample projects include asynchronous onboarding design, product releases, use case training, and portfolio curriculum development. Growth projects might include test-out functionality and role-specific learning outcomes.
Program Management: Determine and confirm all components of the program and how they work together to meet the program goals. Develop and maintain cross-functional relationships with stakeholders, team members, managers, and SMEs (Subject Matter Experts). Create timelines and drive programs to successful completion as defined by success criteria.

Curriculum and Course Design: Conduct needs analysis to determine leveraged areas of impact and the appropriate learning approach. Develop standards for the development and deployment of online courses, curriculum, and live events for a variety of internal roles. Work with stakeholders and SMEs to define objectives, sharpen content, and measure impact.

Tools, Templates and Checklists: Review current product enablement materials and compare against internal outcome needs and external benchmarks. Build prototypes and refine with feedback. Create professional materials that are accessible.

Training Materials: Design and develop creative and engaging materials appropriate for a given objective. Design with a deep understanding of audience diversity. Evaluate and assess learning impact on learners and the business.

What We'd Like to See: 3+ years in technical product enablement. Scrum Product Owner certification. Deep understanding of the tech industry and trends. Adobe Creative Suite, LMS/LXP, and/or e-learning authoring tool experience. Master's degree in Technology, Learning Design, Learning Tech, Education, Business, or a similar field.

Role Essentials: 2-5 years of proven enablement program design, development, and execution. 2 or more years of experience using ADDIE/SAM or a similar methodology. 3+ years of experience integrating theory and practice to deliver pedagogically sound learning experiences. Ability to work collaboratively and cross-functionally to drive alignment with multiple stakeholders. Experience translating complex concepts into easy-to-understand, digestible formats.

Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and 401k plan or international pension/retirement plans). Flexible time-off policy and hybrid working practices. Equity opportunities and an employee stock purchase program (ESPP). Comprehensive Mental Health and Employee Assistance Program (EAP) benefit.

Our DATA values are our north star, and we are passionate about building and delivering solutions that accelerate data innovations. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture. Informatica (NYSE: INFA), a leader in enterprise AI-powered cloud data management, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud that manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in approximately 100 countries and more than 80 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, X, and Facebook. Informatica. Where data and AI come to life.

Posted 1 week ago

Apply

6.0 - 12.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are seeking a dynamic and experienced Senior Business Analyst specializing in Informatica to join our innovative team. In this role, you will contribute to developing innovative solutions in the Life Sciences Technology domain. You will manage stakeholder expectations, coordinate with various teams, and play a crucial part in optimizing data quality and governance. We encourage you to apply and be part of our mission to enhance client experiences through data-driven decisions.

Responsibilities: Manage stakeholder expectations and coordinate multiple stakeholders and third-party vendors. Optimize data quality and enhance data governance to adapt to changing business needs. Engage regularly with client managers and development teams, with intermittent interaction with the client side. Work closely with end users and stakeholders, including complicated stakeholders. Collaborate with the client's Product Manager or Product Owner to own the roadmap and pitch ideas to the customer. Define new opportunities, realize backlog items, and plan product/project promotions. Implement a product mindset and take ownership of successful products to drive the project forward.

Requirements: 6-12 years of experience in business analysis. Demonstrated experience as a business analyst in the Life Sciences Technology domain. Familiarity with Informatica. Knowledge of ETL/ELT solutions. Demonstrated stakeholder management skills. Proven track record of successful product ownership. Understanding of metrics and data-driven decision making. Proficiency in Agile methodologies.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

Required Qualifications: 3-5 years of production support experience on Informatica/Python/AWS technologies and applications. Must have a good understanding of, and technical knowledge on, Informatica architecture and client components such as Workflow Manager, Mapping Designer, Workflow Monitor, and Repository Manager. Excellent knowledge of AWS/Python concepts. Informatica-to-cloud migration. Hands-on expertise in debugging Informatica ETL mappings to narrow down issues. Hands-on experience with ETL transformations such as Lookup, Joiner, Source Qualifier, and Normalizer. Hands-on experience dealing with various types of sources such as flat files, mainframes, XML files, and databases. Experience with AWS environments, data pipelines, RDS, and reporting tools. Hands-on experience in Unix scripting/file operations. Strong knowledge of SQL/PL-SQL and Oracle databases; able to debug complex queries. Good understanding of scheduling tools such as TWS/Tidal/others. Worked at least 2 years on ServiceNow for application incident management and problem management in a 24x7 model. Strong communication skills, both written and verbal, with the ability to follow processes.

Preferred qualifications: Experience working with US clients and business partners. Exposure to the BFSI domain is good to have. Experience in mainframe technologies will be a plus.

Job responsibilities: Provide production support for the Informatica/Python/AWS suite of applications in a 24x7 environment. Good understanding of Informatica/Python and AWS environments; able to handle batch recoveries and provide batch support. Assess and recommend solutions for permanent fixes to improve application stability and resiliency. Ability to handle production incident bridge calls for P1 and high-priority P2 incidents. Strong analytical capability to do independent ticket analysis and resolution. Incident management, problem management, and change management.
Ability to do root cause analysis, recap issues and problems in an email, and communicate to all stakeholders and cross-functional teams. Ability to drive production issue bridges and work across teams to collate impacts and send leadership communications. Propose solutions to perform complex troubleshooting. Able to identify areas of improvement and re-engineering scope. Help other team members resolve their technical issues. Mentor interns and new joiners. Monitor and report issues and work with the required team/vendor for quick resolution.
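Batch recovery, mentioned in the responsibilities above, often begins with a bounded re-run of the failed step before escalating to an incident bridge. A minimal, tool-agnostic sketch in Python (the names are illustrative; actual workflow recovery in Informatica or a scheduler such as Tidal uses the tool's own restart and checkpoint semantics):

```python
import time

def run_with_retries(step, attempts=3, delay=0.0):
    """Retry sketch: re-run a failing batch step a bounded number of
    times, pausing between tries, before surfacing the failure for
    manual root cause analysis."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:  # in practice, catch specific transient errors
            last_error = exc
            if attempt < attempts:
                time.sleep(delay)  # back off before the next try
    raise RuntimeError(f"step failed after {attempts} attempts") from last_error
```

Bounding the retries matters in production support: unbounded re-runs can mask a systemic failure that should instead be escalated with a root cause write-up.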

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Job Description

Company Description

Strategy (Nasdaq: MSTR) is at the forefront of transforming organizations into intelligent enterprises through data-driven innovation. We don't just follow trends; we set them and drive change. As a market leader in enterprise analytics and mobility software, we've pioneered the BI and analytics space, empowering people to make better decisions and revolutionizing how businesses operate. But that's not all. Strategy is also leading a groundbreaking shift in how companies approach their treasury reserve strategy, boldly adopting Bitcoin as a key asset. This visionary move is reshaping the financial landscape and solidifying our position as a forward-thinking, innovative force in the market. Four years after adopting the Bitcoin Standard, Strategy's stock has outperformed every company in the S&P 500.

Our people are the core of our success. At Strategy, you'll join a team of smart, creative minds working on dynamic projects with cutting-edge technologies. We thrive on curiosity, innovation, and a relentless pursuit of excellence. Our corporate values of bold, agile, engaged, impactful, and united are the foundation of our culture. As we lead the charge into the new era of AI and financial innovation, we foster an environment where every employee's contributions are recognized and valued. Join us and be part of an organization that lives and breathes innovation every day. At Strategy, you're not just another employee; you're a crucial part of a mission to push the boundaries of analytics and redefine financial investment.

Reporting to the Senior Director of SaaS, the Salesforce Developer is responsible for customizing, developing, and supporting solutions on the Salesforce and ServiceNow platforms. The ideal candidate will have a strong understanding of the Salesforce.com and ServiceNow platforms, and a basic to intermediate understanding of integrations, single sign-on, security, etc.
(Informatica and other ETL tools a huge plus), along with the interest and ability to understand the problem to solve, the solution design, and the critical path to develop. The candidate will have exceptional technical, analytical, and problem-solving skills and be comfortable interacting with all levels of the organization. We are seeking a self-starter with a bias towards action who can recognize and make process improvement recommendations.

Responsibilities: Perform day-to-day administration of the ServiceNow system, including making approved changes to forms, tables, reports, and workflows. Create and customize reports, homepages, and dashboards in ServiceNow. Ensure the ServiceNow platform and tools remain current by performing testing and installation of ServiceNow updates, patches, and new releases. Create and configure Business Rules, UI Policies, UI Actions, and Scripts. Design and develop advanced ServiceNow customizations. Troubleshoot multiple integrations with ServiceNow and Rally. Manage ServiceNow security by managing roles and access control lists. Train personnel in ServiceNow use and processes, including creating supporting documentation such as user and training guides. Work directly with end users to resolve support issues within ServiceNow. Oversee code reviews. Design, develop, configure, test, and deploy solutions built on the Salesforce platform. Take responsibility for the configuration, design, functionality, and end-user support of the Force.com platform. Implement solutions in an agile environment, delivering high-quality code and configuration. Develop and manage Workflows, Process Builder, Assignment rules, email templates, and all other declarative and programmatic features. Handle mass imports and exports of data. Customize custom objects, fields, reports, 3rd-party apps, etc. Manage Users, Profiles, Permission Sets, Security, and other administration tasks. Lead testing of various functionalities; create test data and test plans and perform feature testing.
Demo solutions to users; train and document as needed
Provide ongoing support and system administration to quickly fix production issues
Map functional requirements to Salesforce.com features and functionality
Implement change control and best practices with regard to system maintenance, configuration, development, testing, data integrity, etc.
Hands-on Sales Cloud, ServiceNow and Salesforce Community experience
Have a programming background with the ability to develop custom code using Visualforce / Apex / Lightning / JavaScript to meet user requirements
Know when to use out-of-the-box functionality versus custom code

We are seeking candidates with:
Outstanding listening, analytical, organizational and time management skills
Excellent written and oral communication skills; a high level of diplomacy and professionalism
Strong work ethic, hands-on, with a customer service mentality
Team player; ability to work cross-functionally; self-driven, motivated, and able to work under pressure
Able to work independently and lead projects of moderate complexity
Ability to identify areas for process improvement and recommend/implement solutions
Proven creativity and problem-solving skills; ability to work around obstacles and solve problems with minimal direction
Ability to develop effective relationships with business users, technical staff and executive management
Ability to prioritize work and meet deadlines in a fast-paced environment
Flexibility, with a demonstrated ability to embrace change
Bachelor's degree (BA/BS) in Computer Science or a similar technical degree, or equivalent experience
3+ years of hands-on experience developing on Salesforce and ServiceNow
In-depth knowledge of Salesforce and ServiceNow programmatic features
Ability to dig into data, surface actionable insights, and demonstrate sound judgement and decision-making skills; a problem-solver at heart
Extensive experience using Data Loader and other data loading tools
Additional background in ServiceNow, Community, CPQ, Marketo and other integrations a plus
Experience using MS Excel and database modeling
Ability to work independently and be proactive
Able to work under pressure, multi-task, and manage changing priorities and workload

Additional Information: The recruitment process includes online assessments as a first step. We send them via e-mail; please also check your SPAM folder. We work from the Pune office 4 times a week.

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Pay and Benefits:
Competitive compensation, including base pay and annual incentive
Comprehensive health and life insurance and well-being benefits
Pension
Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee).

The impact you will have in this role: The Application Support Engineering role specializes in maintaining and providing technical support for all applications that are beyond the development stage and are running in the daily operations of the firm. Works closely with development teams, infrastructure partners, and internal/external clients to surface and resolve technical support incidents.

Your Primary Responsibilities:
Verify analysis performed by team members and implement changes required to prevent recurrence of incidents
Resolve critical application alerts in a timely fashion, including production defects; provide business impact and analysis to teams; handle minor enhancements as needed
Review and update knowledge articles and runbooks with application development teams to confirm information is up to date
Collaborate with internal teams to provide answers to application issues and escalate as needed
Validate and submit responses to requests for information from ongoing audits
**NOTE: The Primary Responsibilities of this role are not limited to the details above.

Qualifications:
Minimum of 4+ years of related experience in Application Support
Bachelor's degree (preferred) or equivalent experience
Solid experience in Application Support
Hands-on experience in Unix, Linux, Windows, SQL/PLSQL
Familiarity working with relational databases (DB2, Oracle, Snowflake)
Monitoring and data tools experience (Splunk, Dynatrace, ThousandEyes, Grafana, Selenium, HiPam, IBM Zolda)
Cloud technologies: AWS services (S3, EC2, Lambda, SQS, IAM roles), Azure, OpenShift, RDS Aurora, PostgreSQL
Scheduling tool experience (CA AutoSys, Control-M)
Scripting languages (Bash, Python, Ruby, Shell, Perl, JavaScript)
Hands-on experience with ETL tools (Informatica Data Hub/IDQ, Talend)
Strong problem-solving skills with the ability to think creatively.
Please contact us to request accommodation.
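For the scripting side of this support role, here is a minimal Python sketch of a log-triage helper of the kind an application support engineer might write. The log format, component names, and alert threshold are all hypothetical, not from the posting:

```python
import re
from collections import Counter

def summarize_errors(log_lines, threshold=3):
    """Count ERROR-level events per component and flag any component
    whose error count meets the threshold (a hypothetical alert rule)."""
    pattern = re.compile(r"ERROR\s+\[(?P<component>[\w.-]+)\]")
    counts = Counter(
        m.group("component")
        for line in log_lines
        if (m := pattern.search(line))
    )
    alerts = sorted(c for c, n in counts.items() if n >= threshold)
    return counts, alerts

logs = [
    "2025-07-23 10:00:01 INFO  [billing] nightly batch started",
    "2025-07-23 10:00:05 ERROR [billing] DB2 connection reset",
    "2025-07-23 10:00:06 ERROR [billing] DB2 connection reset",
    "2025-07-23 10:00:07 ERROR [billing] DB2 connection reset",
    "2025-07-23 10:01:00 ERROR [auth] token refresh failed",
]
counts, alerts = summarize_errors(logs)
print(alerts)  # components breaching the threshold: ['billing']
```

In practice this kind of logic usually lives inside a Splunk alert or a cron-driven script rather than a standalone program, but the triage pattern (parse, aggregate, threshold) is the same.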

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Business Intelligence Engineer | Bangalore & Hyderabad | 25 LPA | Full time

Summary: We are seeking a highly motivated and self-driven Business Intelligence Engineer with 5+ years of experience in enterprise and/or hosted applications development and demonstrable, proven expertise in cloud application development, design, and implementation, along with broad experience in various types of databases and distributed, web-based reporting & analytics applications.

Technical Skills and Key Interests:
Minimum Qualifications:
Bachelor's in computer science or equivalent work experience
Minimum 2 years of experience in development of reports, views, summarizations, and dashboards in at least one reporting tool such as SAP BO, Tableau, Cognos, Power BI, or TIBCO Jaspersoft
Experienced in Jaspersoft, or willing to learn TIBCO Jaspersoft report development (willing to grow from an elementary to an advanced level of expertise)
Experienced in ETL development in one or more technologies such as Informatica, Hadoop, or Spark (overall ETL + report development experience combined should be 5 years or more)

Additional Technical Skills (value-add):
Experience in Java, the Spring framework, and object-oriented design methodologies
Experience in using AI tools for problem solving and development
Hands-on experience building cloud-native applications for one or more cloud providers, e.g., AWS or Google Cloud

Responsibilities: The area of development would be report development in TIBCO Jaspersoft and ETL (SQL, PySpark). If you have experience in other reporting tools, you should be willing to learn TIBCO Jaspersoft in the initial period of your tenure with the team. Development would involve one or both of these areas depending on business priorities.
You will be responsible for designing and implementing product enhancements, designing product functions, troubleshooting and resolving product defects, and unit and integration testing. You will be responsible for enhancing and building features end to end for a reporting-based application, involving sourcing data from relational and non-relational databases, transforming it using complex SQL queries, and developing reports. Active interaction with internal customers, other developers, Quality Assurance, and Business System Analysts is an integral part of the role.

Some of the key tasks you will perform include:
Participate in project/work planning sessions to analyze and understand requirements, to the level of being able to contribute to their creation in collaboration with capability/product and/or business owners
Develop and integrate applications per specification and translate technical requirements into application code and modules
Approach development work with a DevOps and continuous integration mindset
Ensure consistency with cloud architectural guiding principles for assigned projects
Develop prototype or "proof of concept" implementations of projects where the technical solution is unknown or unproven
Be proactive in raising problems, identifying solutions and giving/receiving feedback
Assist in identifying and correcting software performance bottlenecks
Work in a highly collaborative and dynamic agile team environment with multiple levels of technology staff across various geographical locations
Provide technical expertise and peer code reviews to other team members, and assist team leads and project managers in work breakdown and story planning

Other specialized knowledge and skills required:
Must be able to work independently as well as collaboratively
Comfortable working in an offshore-onshore model
Proven strong analytical, design and troubleshooting skills
Highly accountable for meeting all commitments and deadlines
Effective communication skills, both written and verbal, for technical and non-technical audiences
Drive for continuous process improvement
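To make the report-development side of this role concrete, here is a minimal Python sketch of the kind of aggregation a report query performs before the result reaches a dashboard. The field names and figures are illustrative only, not from the posting:

```python
from collections import defaultdict

# Stand-in for a report query: summarize order amounts per region,
# the shape of result a Jaspersoft or Power BI report would render.
orders = [
    {"region": "South", "amount": 120.0},
    {"region": "North", "amount": 80.0},
    {"region": "South", "amount": 50.0},
]

def totals_by_region(rows):
    """Group rows by region and sum their amounts."""
    out = defaultdict(float)
    for row in rows:
        out[row["region"]] += row["amount"]
    return dict(out)

print(totals_by_region(orders))  # {'South': 170.0, 'North': 80.0}
```

In the actual stack this aggregation would typically be a SQL GROUP BY or a PySpark `groupBy().sum()` pushed down to the database, with the reporting tool handling only presentation.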

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Chennai

Work from Office

ServiceNow ITOM & AIOps Administrator
We're Hiring: ServiceNow ITOM & AIOps Administrator
Location: Abu Dhabi
Notice Period: Immediate to 30 Days
Job Overview: Join our team to enhance IT operations and service reliability across hybrid environments using ServiceNow ITOM & AIOps solutions.
5+ years of hands-on experience with ServiceNow ITOM
Skilled in Discovery, Service Mapping & Event Management
Strong scripting & integrations (JavaScript, REST APIs)
Familiarity with AIOps tools & hybrid cloud setups
Preferred: ServiceNow ITOM/AIOps certification
Apply Now: Send your resume.
We craft, deploy, and manage bespoke services in CRM, data and AI, cybersecurity and consulting.

Posted 1 week ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Position Title: ETL-Associate Architect (IND) - GR-39166-67754 Job Family: IFT > IT Architecture Shift: Job Description: JOB DESCRIPTION Job Title Associate Architect (IND) Requirement Type Full-Time Employee Job Location Bangalore Requirement Level Associate Architect (IND) Hiring Manager Jayanth Chandran Primary Skill ETL, SQL Business EDA - Data Skill Category Generic

ABOUT ELEVANCE HEALTH Elevance Health is a leading health company in America dedicated to improving lives and communities and making healthcare simpler. It is the largest managed health care company in the Blue Cross Blue Shield (BCBS) Association, serving more than 45 million lives across 14 states. A regular in the Fortune 500 list, Elevance Health ranked 20th in 2022. Gail Boudreaux, President and CEO of Elevance Health, has been a consistent name in Fortune's list of most powerful women and currently ranks 4th on that list.

ABOUT CARELON Carelon Global Solutions (CGS) is a healthcare solutions company that is simplifying complex operational processes to improve the health of the healthcare system. Previously known as Legato Health Technologies, Carelon Global Solutions (hereinafter, CGS) underwent a name change and joined the Carelon family of brands in January 2023 as a fully owned subsidiary of Elevance Health (previously Anthem Inc.). CGS brings together a global team of like-minded innovators who manage and optimize operational processes for health plans as well as providers. Our brightest minds, housed across our global headquarters in Indianapolis as well as Bengaluru, Hyderabad and Gurugram in India, Manila in the Philippines, Limerick in Ireland and San Juan in Puerto Rico, bring with them innovative capabilities and an unmatched depth of experience. This global team uniquely positions CGS to enable scalable, next-generation platforms and specialized digital tools that make healthcare operations more practical, effective and efficient.
OUR MISSION & VALUES Our Mission: Improving Lives and Communities. Simplifying Healthcare. Expecting More. Our Values: Leadership | Community | Integrity | Agility | Diversity

JOB POSITION Associate Architect

JOB RESPONSIBILITY
Design and implement scalable, high-performance ETL solutions for data ingestion, transformation, and loading
Define and maintain data architecture standards, best practices, and governance policies
Collaborate with data engineers, analysts, and business stakeholders to understand data requirements
Optimize existing ETL pipelines for performance, reliability, and scalability
Ensure data quality, consistency, and security across all data flows
Lead the evaluation and selection of ETL tools and technologies
Provide technical leadership and mentorship to junior data engineers
Document data flows, architecture diagrams, and technical specifications
Good to have: experience with Snowflake and Oracle

QUALIFICATION Bachelor's degree

EXPERIENCE
Must have good knowledge of Provider systems
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
8+ years of experience in data engineering or ETL development
Strong expertise in ETL tools such as Informatica, Talend, Apache NiFi, SSIS, or similar
Proficiency in SQL and experience with relational and NoSQL databases
Experience with cloud platforms like AWS, Azure, or Google Cloud
Familiarity with data modeling, data warehousing (e.g., Snowflake, Redshift), and big data technologies (e.g., Hadoop, Spark)
Strong problem-solving and communication skills
Must have good business communication skills

SKILLS AND COMPETENCIES
Committed and accountable
Communication (verbal & written); ability to communicate status to stakeholders in a timely manner
Collaboration and leadership; ability to collaborate with global teams

THE CARELON PROMISE Aligning with our brand belief that limitless minds are our biggest asset, we offer a world of limitless opportunities to our associates. It is our strong belief that one is committed to a role when it is not just what the role entails, but also what lies in its periphery that completes the value circle for an associate. This world of limitless opportunities thrives in an environment that fosters growth and well-being, and gives you purpose and the feeling of belonging.

LIFE @ CARELON
Extensive focus on learning and development
An inspiring culture built on innovation, creativity, and freedom
Holistic well-being
Comprehensive range of rewards and recognitions
Competitive health and medical insurance coverage
Best-in-class amenities and workspaces
Policies designed with associates at the center

EQUAL OPPORTUNITY EMPLOYER Reasonable Accommodation: Our inclusive culture empowers Carelon to deliver the best results for our customers. We not only celebrate the diversity of our workforce, but we also celebrate the diverse ways we work. If you have a disability and need accommodation such as an interpreter or a different interview format, please ask for the Reasonable Accommodation Request Form. Job Type: Full time

Posted 1 week ago

Apply

15.0 - 16.0 years

45 - 50 Lacs

Chennai

Work from Office

Job Summary: We are seeking a Data Architect to design and implement scalable, secure, and efficient data solutions that support Convey Health Solutions' business objectives. This role will focus on data modeling, cloud data platforms, ETL processes, and analytics solutions, ensuring compliance with healthcare regulations (HIPAA, CMS guidelines). The ideal candidate will collaborate with data engineers, BI analysts, and business stakeholders to drive data-driven decision-making.

Key Responsibilities:
Enterprise Data Architecture: Design and maintain the overall data architecture to support Convey Health Solutions' data-driven initiatives.
Cloud & Data Warehousing: Architect cloud-based data solutions (AWS, Azure, Snowflake, BigQuery) to optimize scalability, security, and performance.
Data Modeling: Develop logical and physical data models for structured and unstructured data, supporting analytics, reporting, and operational processes.
ETL & Data Integration: Define strategies for data ingestion, transformation, and integration, leveraging ETL tools like Informatica, Talend, dbt, or Apache Airflow.
Data Governance & Compliance: Ensure data quality, security, and compliance with HIPAA, CMS, and SOC 2 standards.
Performance Optimization: Optimize database performance, indexing strategies, and query performance for real-time analytics.
Collaboration: Partner with data engineers, software developers, and business teams to align data architecture with business objectives.
Technology Innovation: Stay up to date with emerging data technologies, AI/ML applications, and industry trends in healthcare data analytics.

Required Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
Experience: 7+ years of experience in data architecture, data engineering, or related roles.
Technical Skills:
Strong expertise in SQL, NoSQL, and data modeling techniques
Hands-on experience with cloud data platforms (AWS Redshift, Snowflake, Google BigQuery, Azure Synapse)
Experience with ETL frameworks (Informatica, Talend, dbt, Apache Airflow, etc.)
Knowledge of big data technologies (Spark, Hadoop, Databricks)
Strong understanding of data security and compliance (HIPAA, CMS, SOC 2, GDPR)

Soft Skills: Strong analytical, problem-solving, and communication skills. Ability to work in a collaborative, agile environment.

Preferred Qualifications:
Experience in healthcare data management, claims processing, risk adjustment, or pharmacy benefit management (PBM)
Familiarity with AI/ML applications in healthcare analytics
Certifications in cloud data platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.)
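The orchestration side of ETL tools like Apache Airflow comes down to dependency ordering between tasks. A minimal sketch using Python's standard-library `graphlib` shows the idea; the task names are hypothetical, not from this posting:

```python
from graphlib import TopologicalSorter

# Hypothetical healthcare-style pipeline: each task maps to the set of
# tasks it depends on, mirroring how an Airflow DAG wires
# extract -> transform -> load -> report.
pipeline = {
    "extract_claims": set(),
    "extract_members": set(),
    "transform_claims": {"extract_claims", "extract_members"},
    "load_warehouse": {"transform_claims"},
    "refresh_reports": {"load_warehouse"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

A real Airflow DAG adds scheduling, retries, and parallelism on top, but the correctness property an architect reviews is exactly this ordering: no task may run before its upstream dependencies complete.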

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 12 Lacs

Pune

Work from Office

Solid experience working with Snowflake, including data modeling, query optimization, and data integration. Proficiency in designing and implementing complex ETL pipelines and data workflows. Strong understanding of cloud-based data platforms.
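A core pattern behind the ETL pipeline work described above is the idempotent upsert (in Snowflake, typically a MERGE statement: reloading the same batch must not duplicate rows). The sketch below uses SQLite's ON CONFLICT upsert as a runnable stand-in, since it needs no warehouse connection; the table and column names are illustrative only:

```python
import sqlite3

# In-memory database standing in for a warehouse dimension table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)"
)

def upsert(rows):
    """Insert new rows; on key collision, update in place (MERGE-like)."""
    conn.executemany(
        """INSERT INTO dim_customer (id, name, city) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name,
                                         city = excluded.city""",
        rows,
    )

upsert([(1, "Asha", "Pune"), (2, "Ravi", "Mumbai")])
upsert([(2, "Ravi", "Delhi")])  # replayed batch with a changed city

rows = conn.execute("SELECT id, city FROM dim_customer ORDER BY id").fetchall()
print(rows)  # [(1, 'Pune'), (2, 'Delhi')] -- no duplicate for id 2
```

The same load can be replayed safely, which is what makes retry-driven pipelines reliable; in Snowflake the equivalent is `MERGE INTO dim_customer USING staged_batch ...`.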

Posted 1 week ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Pune

Hybrid

Job Description We are hiring an ETL Engineer with GCP Location: India (Pune) Exp: 3 - 7 Years Required Skills and Qualifications: 3+ years of experience in Data Engineering roles. Strong hands-on experience with Google Cloud Platform (GCP) data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage. Mandatory expertise in Informatica. Mandatory strong proficiency in SQL and PL/SQL for data manipulation, stored procedures, functions, and complex query writing. Ability to optimize BigQuery queries for performance and cost. Familiarity with version control systems (e.g., Git). Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively in an agile environment. Bachelor's degree in Computer Science, Engineering, or a related field.
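Optimizing BigQuery queries for cost, as this posting asks, largely means reducing bytes scanned under on-demand pricing (e.g., via partition pruning and selecting only needed columns). A rough, self-contained estimator; the per-TiB rate is an assumption, so check current GCP pricing:

```python
def bq_scan_cost_usd(bytes_scanned, usd_per_tib=6.25):
    """Rough on-demand cost estimate for a BigQuery query.
    The default per-TiB rate is an assumption, not an official figure."""
    tib = bytes_scanned / 2**40  # bytes -> TiB
    return round(tib * usd_per_tib, 4)

# The same logical query, before and after partition pruning:
full_scan = bq_scan_cost_usd(500 * 2**30)  # scans 500 GiB
pruned = bq_scan_cost_usd(20 * 2**30)      # scans only 20 GiB
print(full_scan, pruned)  # 3.0518 0.1221
```

In practice you would read the bytes-scanned figure from the query plan (dry-run estimate) rather than guessing, but the arithmetic above is why pruning a scan from 500 GiB to 20 GiB cuts the query's cost by the same 25x factor.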

Posted 1 week ago

Apply