5.0 years
6 - 9 Lacs
Chennai
On-site
As a Technical Anchor in Ford Credit IT, you will join a team that develops enterprise-scale applications and builds SaaS products on Salesforce Service Cloud/Auto Cloud. You will work on a balanced product team to define, design, develop, and deploy Salesforce Service Cloud/Auto Cloud solutions, including Form Data Models, Customer Data Platform (CDP), Interaction Studio, Journey Builder, Automation Studio, Email and Mobile Studio, Contact Builder, data extensions, data sync, sitemaps, and content blocks.

Responsibilities:
- Productize (build/author) a document generation product as a SaaS (Software as a Service) offering hosted on MuleSoft and Google Cloud Platform (GCP).
- Build and maintain digital expertise by researching the latest industry trends and standards, driving innovation through PoCs and experiments.
- Develop Salesforce Service Cloud/Auto Cloud applications.
- Evaluate potential solutions using both technical and commercial criteria that support the established cost and service requirements, with a continuous-improvement and innovative mindset.
- Develop and automate unit and integration test scripts.
- Integrate with MuleSoft applications for integrations around Sales/Service Cloud with Ford Credit systems.
- Act as a mentor for less experienced developers, through both your technical knowledge and your ability to inspire a team to build extraordinary impact together.
- Understand the depth of user stories and provide accurate estimates.
- Automate performance monitoring and notification in the event of failures using best practices and tools.
- Research new technologies; influence and implement enterprise technology shifts and new trends impacting Ford application delivery.
- Perform code deployments using the CI/CD Salesforce Sales Cloud and MuleSoft pipeline, with Copado for Service Cloud Salesforce deployments.
- Participate in a highly collaborative environment.
- DevOps: Continuous Integration and Continuous Deployment (CI/CD); security (SAST/DAST); monitoring/logging/tracing tools (Splunk, etc.).
- Experience deploying via source control using Visual Studio Code, GitHub repositories, and Copado.
- Strong sense of code quality, with the ability to review code using SonarQube and Checkmarx, rework it, and deliver quality code.
- Build reusable components using Lightning Web Components (LWC), AMPscript, Server-Side JavaScript (SSJS), and SQL.
- Integrate Salesforce Marketing Cloud with external systems using SFMC APIs.
- Follow enterprise architecture processes and advise teams on cloud design, development, architecture, and service blueprints.
- Engage in Agile practices including, but not limited to, stand-ups, backlog grooming, sprint demos, and journey mapping.

Requirements:
- 5+ years of experience architecting and implementing fault-tolerant, highly available Service/Auto Cloud APIs (REST/SOAP) and Platform Events (Pub/Sub).
- Salesforce Service/Auto Cloud Developer/Consultant or Salesforce Application Architect certifications are an added advantage.
- SQL knowledge, with experience writing database scripts using DDL and queries using DML.
- Experience in SRE with Copado, and the ability to architect services with observability, traceability, and monitoring in mind.
- At least 4 years of experience in the Agile Scrum software development process.
- Ability to work on a team in a diverse, multi-stakeholder environment.
- Experience in, and desire to work in, a global delivery environment.
- Excellent communication skills, with the ability to adapt your communication style to the audience.
- Demonstrated ability to drive development of highly technical technology services and capabilities.
- Experience deploying via source control using change sets and CI/CD pipelines.
- B.E. / B.Tech / M.C.A.
- Minimum 7 years of experience developing Salesforce Service/Auto Cloud customizations.
- Extensive experience with AMPscript, Apex, JavaScript, Lightning components (Aura Components and Lightning Web Components), OmniScript, and Velocity.
- Must have experience with Contact Builder, data extensions, data sync, sitemaps, and content blocks.
- Lead Service/Auto Cloud data modeling and architecture, including data extension modeling and cross-product data architecture and mapping.
- Ability to integrate MuleSoft, Informatica, GraphQL, Medallia, and Emplifi.
- Ability to create flows, modify objects, create custom objects, write Apex and triggers, and integrate API services using an IDE.
- Demonstrated ability to drive development of highly technical technology services and capabilities.
- Experience with the Salesforce Apex Data Loader and the Salesforce web services APIs: Platform Events, Change Data Capture, REST, and Pub/Sub.
- Strong sense of code quality, with the ability to review code using SonarQube and Checkmarx, rework it, and deliver quality code.
- Demonstrated experience with Customer Data Platform (CDP), Interaction Studio, Journey Builder, Automation Studio, and Email and Mobile Studio.
- Demonstrated experience establishing and maintaining data structures, data extensions, and automations within Salesforce Service/Auto Cloud.
- Experience in enterprise data analytics, reporting, and monitoring using Splunk, Dynatrace, healthnut, etc.
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary: We are seeking a detail-oriented and analytical Data Analyst to join our team. The ideal candidate will be responsible for collecting, processing, and analyzing large datasets to uncover insights that drive strategic decision-making. You will work closely with cross-functional teams to identify trends, create visualizations, and deliver actionable recommendations that support business goals.

Key Responsibilities:
- Drive business excellence by identifying opportunities for process optimization, automation, and standardization through data insights.
- Design, develop, and maintain robust ETL pipelines and SQL queries to ingest, transform, and load data from diverse sources.
- Build and maintain Excel-based dashboards, models, and reports; automate repetitive tasks using Excel macros, Power Query, or scripting tools.
- Ensure data quality, integrity, and consistency through profiling, cleansing, validation, and regular monitoring.
- Translate business questions into analytical problems and deliver actionable insights using statistical techniques and data visualization tools.
- Collaborate with cross-functional teams (e.g., marketing, finance, operations) to define data requirements and address business challenges.
- Develop and implement efficient data collection strategies and systems to optimize accuracy and performance.
- Monitor and troubleshoot data workflows, resolving issues and ensuring compliance with data privacy and security regulations.
- Document data processes, definitions, and business rules to support transparency, reuse, and continuous improvement.
- Support continuous improvement initiatives by providing data-driven recommendations that enhance operational efficiency and decision-making.
- Contribute to the development and implementation of best practices in data management, reporting, and analytics aligned with business goals.

Person Profile:
- Qualification: Bachelor's / Master's degree in Computer Science, Information Systems, Statistics, or a related field.
- Experience: 2-5 years.

Desired Certification & Must-Haves:
- 3-5 years of experience in data analysis, preferably in the pharmaceutical industry.
- Advanced proficiency in SQL (joins, CTEs, window functions, optimization; see the sketch after this listing) and expert-level Excel skills (pivot tables, advanced formulas, VBA/macros).
- Strong understanding of data warehousing, relational databases, and ETL tools (e.g., SSIS, Talend, Informatica).
- Proficiency in data visualization tools (e.g., Power BI, Tableau) and statistical analysis techniques.
- Solid analytical and problem-solving skills, with attention to detail and the ability to manage complex data sets and multiple priorities.
- Excellent communication and documentation skills to convey insights to technical and non-technical stakeholders.
- Familiarity with data modelling, database management, and large-scale data manipulation and cleansing.
- Demonstrated ability to work collaboratively in Agile/Scrum environments and adapt to evolving business needs.
- Strong focus on process optimization, continuous improvement, and operational efficiency.
- Experience implementing best practices for data governance, quality assurance, and compliance.
- Ability to identify and drive initiatives that enhance business performance through data-driven decision-making.
- Exposure to business domains such as finance, operations, or marketing analytics, with a strategic mindset.
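To make the advanced-SQL expectation above concrete, here is a minimal, self-contained sketch of a CTE combined with a window function. It uses SQLite (bundled with Python, window functions need SQLite 3.25+) so it runs anywhere; the sales table and its columns are invented for the example and are not part of the posting.

```python
import sqlite3

# Build a tiny in-memory table to query against.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('North', '2024-01', 120), ('North', '2024-02', 150),
        ('South', '2024-01', 90),  ('South', '2024-02', 110);
""")

query = """
WITH monthly AS (                       -- CTE: isolate the aggregation step
    SELECT region, month, SUM(revenue) AS revenue
    FROM sales
    GROUP BY region, month
)
SELECT region, month, revenue,
       SUM(revenue) OVER (PARTITION BY region
                          ORDER BY month) AS running_total  -- window function
FROM monthly
ORDER BY region, month;
"""
for row in conn.execute(query):
    print(row)
```

The CTE isolates the aggregation step, and the window function adds a per-region running total without collapsing rows, which is the shape of most "advanced SQL" reporting problems.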
Posted 1 week ago
3.0 - 7.0 years
0 - 1 Lacs
Pune, Ahmedabad, Bengaluru
Work from Office
Job Title: Reltio MDM Developer
Location: Remote
Experience Required: 2+ Years

Key Responsibilities:
- Design, configure, and implement Reltio MDM solutions based on business and technical requirements.
- Develop and enhance Reltio data models, including entities, attributes, relationships, and match/merge rules.
- Configure survivorship rules, reference data, workflows, and validation rules within the platform.
- Build seamless integrations between Reltio and external systems using REST APIs, ETL tools (e.g., Informatica, Talend), or middleware solutions (e.g., MuleSoft); a REST sketch follows this listing.
- Monitor, troubleshoot, and optimize data load and synchronization processes.
- Support data governance initiatives, including data quality profiling, standardization, and issue resolution.
- Collaborate with business stakeholders, data stewards, and analysts to refine requirements and address data integrity concerns.
- Ensure performance tuning and adherence to Reltio best practices for configuration and deployment.

Required Skills:
- Minimum 2+ years of hands-on experience working with the Reltio Cloud MDM platform.
- Strong grasp of MDM principles, data modeling concepts, and entity relationship management.
- Experience configuring Reltio L3, match/merge logic, and survivorship strategies.
- Proficiency with REST APIs, JSON, and XML for integration and data exchange.
- Working experience with integration tools like Talend, Informatica, or MuleSoft.
- Solid debugging and troubleshooting skills related to data quality, transformations, and API communication.
- Familiarity with data governance frameworks and compliance standards.

Nice to Have:
- Experience implementing Reltio UI configurations or custom UI components.
- Exposure to data analytics and reporting tools.
- Knowledge of cloud platforms (e.g., AWS, Azure) for hosting or extending MDM functionality.
- Familiarity with Agile methodologies and tools like JIRA or Confluence.
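As an illustration of the REST-based Reltio integration work described above, the sketch below creates an entity in a tenant from Python. This is a sketch under stated assumptions, not a definitive implementation: the auth endpoint, client credentials, tenant URL, entity type, and attribute names are all placeholders, and the exact paths and attribute schema come from your tenant's L3 configuration and the Reltio API documentation.

```python
import requests

# Assumed endpoints; verify against your environment and the Reltio docs.
TOKEN_URL = "https://auth.reltio.com/oauth/token"
API_URL = "https://dev.reltio.com/reltio/api/MyTenant/entities"  # hypothetical tenant

# Obtain an OAuth token (client id/secret and grant type vary per tenant).
token = requests.post(
    TOKEN_URL,
    data={"grant_type": "password", "username": "svc-user", "password": "***"},
    auth=("<client-id>", "<client-secret>"),
).json()["access_token"]

# The entities endpoint accepts a JSON array; attribute names follow the
# tenant's L3 data model, so these are illustrative only.
entity = [{
    "type": "configuration/entityTypes/Individual",
    "attributes": {
        "FirstName": [{"value": "Asha"}],
        "LastName": [{"value": "Patel"}],
    },
}]

resp = requests.post(API_URL, json=entity,
                     headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.json())  # URIs of created or matched-and-merged entities
```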
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

- Area: MDM CoE, MDM
- Grade: Associate/Sr. Associate
- # of people: 9
- Skill Set: Informatica MDM
- Location: Gurgaon / Bangalore
- Years of Experience: 4-7
- Comments: Should be able to lead MDM delivery as a solution architect and contribute to BD.

Mandatory skill sets: Informatica, MDM
Preferred skill sets: Informatica, MDM
Years of experience required: 4-8
Qualifications: BTech/MBA/MTech/MCA
Required Skills: Informatica MDM
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Posted 1 week ago
15.0 years
0 Lacs
Calcutta
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Informatica Intelligent Cloud Services
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica Intelligent Cloud Services.
- Strong understanding of application development methodologies.
- Experience with cloud-based application deployment and management.
- Familiarity with data integration and transformation processes.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Intelligent Cloud Services.
- This position is based at our Kolkata office.
- A 15-year full-time education is required.
Posted 1 week ago
30.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Position Overview

ABOUT APOLLO
Apollo is a high-growth, global alternative asset manager. In our asset management business, we seek to provide our clients excess return at every point along the risk-reward spectrum, from investment grade to private equity, with a focus on three investing strategies: yield, hybrid, and equity. For more than three decades, our investing expertise across our fully integrated platform has served the financial return needs of our clients and provided businesses with innovative capital solutions for growth. Through Athene, our retirement services business, we specialize in helping clients achieve financial security by providing a suite of retirement savings products and acting as a solutions provider to institutions. Our patient, creative, and knowledgeable approach to investing aligns our clients, the businesses we invest in, our employees, and the communities we impact, to expand opportunity and achieve positive outcomes.

OUR PURPOSE AND CORE VALUES
Our clients rely on our investment acumen to help secure their future. We must never lose our focus and determination to be the best investors and most trusted partners on their behalf. We strive to be:
- The leading provider of retirement income solutions to institutions, companies, and individuals.
- The leading provider of capital solutions to companies. Our breadth and scale enable us to deliver capital for even the largest projects, and our small-firm mindset ensures we will be a thoughtful and dedicated partner to these organizations. We are committed to helping them build stronger businesses.
- A leading contributor to addressing some of the biggest issues facing the world today, such as energy transition, accelerating the adoption of new technologies, and social impact, where innovative approaches to investing can make a positive difference.

We are building a unique firm of extraordinary colleagues who:
- Outperform expectations
- Challenge convention
- Champion opportunity
- Lead responsibly
- Drive collaboration

As One Apollo team, we believe that doing great work and having fun go hand in hand, and we are proud of what we can achieve together.

Our Benefits
Apollo relies on its people to keep it a leader in alternative investment management, and the firm's benefit programs are crafted to offer meaningful coverage for both you and your family. Please reach out to your Human Capital Business Partner for more detailed information on specific benefits.

Position Overview
At Apollo, we're a global team of alternative investment managers passionate about delivering uncommon value to our investors and shareholders. With over 30 years of proven expertise across Private Equity, Credit, and Real Estate, regions, and industries, we're known for our integrated businesses, our strong investment performance, our value-oriented philosophy, and our people. The Client and Innovation Engineering team is responsible for designing and delivering digital products to our institutional and wealth management clients and sales team. We are a product-driven, developer-focused team; our goal is to simplify our engineering process and meet our business objectives. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

Primary Responsibilities
Apollo is seeking a hands-on, business-oriented Lead Data Engineer to lead the technology efforts focused on supporting data-driven distribution processes.
The ideal candidate will bring strong experience in data engineering within asset and/or wealth management, combined with excellent technical acumen and a passion for building scalable, secure, and high-performance data solutions. This role will partner closely with Distribution Data Enablement, Sales & Marketing, Operations, and Finance teams to execute key initiatives aligned with Apollo's target operating model. You will play a critical role in building and evolving our data products and infrastructure. You will learn new technologies, constantly upgrading your skill set and the products you work on to be on par with the best in the industry. You will innovate and solve technical challenges with a long-term vision.

- Design, build, and maintain scalable and efficient cloud-based data pipelines and integration workflows using Azure Data Factory (ADF), DBT, Snowflake, FiveTran, and related tools (a minimal sketch follows this listing).
- Collaborate closely with business stakeholders to understand data needs and translate them into effective technical solutions, including developing relational and dimensional data models.
- Implement and optimize end-to-end ETL/ELT processes to support enterprise data needs.
- Design and implement pipeline controls, conduct data quality assessments, and enforce data governance best practices to ensure accuracy and integrity.
- Monitor, troubleshoot, and resolve issues across data pipelines to ensure stability, reliability, and performance.
- Partner with cross-functional teams to support analytics, reporting, and operational data needs.
- Stay current with industry trends and emerging technologies to continuously improve our data architecture.
- Support master data management (MDM) initiatives and contribute to overall data strategy and architecture.

Qualifications & Experience
- 8+ years of professional experience in data engineering or a related field, ideally within financial services or asset/wealth management.
- Proven expertise in Azure-based data engineering tools, including ADF, DBT, Snowflake, and FiveTran.
- Programming skills in Python (or Scala/Java) for data transformation and automation.
- Solid understanding of modern data modeling (relational, dimensional, and star schema).
- Experience with MDM platforms and frameworks is highly desirable.
- Familiarity with additional ETL/ELT tools (e.g., Talend, Informatica, SSIS) is a plus.
- Comfortable working in a fast-paced, agile environment with rapidly changing priorities.
- Strong communication skills, with the ability to translate complex technical topics into business-friendly language.
- A degree in Computer Science, Engineering, or a related field is preferred.
- A strong analytical mindset with a passion for solving complex problems.
- A team player who is proactive, accountable, and detail-oriented.
- A leader who sets high standards and delivers high-quality outcomes.
- An innovator who keeps up with industry trends and continually seeks opportunities to improve.
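To ground the ELT responsibilities above, here is a minimal sketch of one idempotent load-and-merge step in Snowflake, driven from Python with the snowflake-connector-python package. The account, stage, and table names are placeholders rather than Apollo objects, and in a DBT-based stack the MERGE logic would normally live in a model rather than inline SQL.

```python
import snowflake.connector

# Connection parameters are placeholders; in production they come from a
# secrets manager, not source code.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CURATED",
)
cur = conn.cursor()

# Stage the day's files from an external stage (assumed to exist).
cur.execute("""
    COPY INTO RAW.TRADES_STG
    FROM @RAW.S3_TRADES_STAGE/2024-06-01/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Upsert into the curated table so re-runs are idempotent.
cur.execute("""
    MERGE INTO CURATED.FACT_TRADES t
    USING RAW.TRADES_STG s ON t.trade_id = s.trade_id
    WHEN MATCHED THEN UPDATE
        SET t.amount = s.amount, t.updated_at = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN INSERT (trade_id, amount, updated_at)
        VALUES (s.trade_id, s.amount, CURRENT_TIMESTAMP())
""")
conn.close()
```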
Posted 1 week ago
3.0 years
0 Lacs
Andhra Pradesh, India
On-site
Key Responsibilities
- Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
- Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
- Create Datadog monitors for job status (success/failure), job duration, resource utilization, and error trends (see the sketch after this listing).
- Work closely with Data Engineering teams to onboard new pipelines and ensure observability best practices.
- Integrate Datadog with related tools.
- Conduct root cause analysis of ETL failures and performance bottlenecks.
- Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
- Document incident handling procedures and contribute to improving overall ETL monitoring maturity.
- Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications
- 3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
- Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
- Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
- Familiarity with SQL and querying large datasets.
- Experience with Python, shell scripting, or Bash for automation and log parsing.
- Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
- Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications
- Experience with distributed tracing and APM in Datadog.
- Prior experience monitoring Spark, Kafka, or streaming pipelines.
- Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident management workflows.
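As a concrete example of the "create Datadog monitors" responsibility, the sketch below defines a metric monitor on ETL job duration through Datadog's Monitors API (v1). The metric name, tags, thresholds, and notification handle are illustrative assumptions; real API/application keys and metric names come from your Datadog organization.

```python
import requests

DD_API = "https://api.datadoghq.com/api/v1/monitor"
headers = {
    "DD-API-KEY": "<api-key>",          # placeholder credentials
    "DD-APPLICATION-KEY": "<app-key>",
}

monitor = {
    "name": "ETL job duration too high",
    "type": "metric alert",
    # Alert when the 15-minute average duration of any tagged job
    # exceeds one hour; "by {job_name}" makes it a multi-alert per job.
    "query": "avg(last_15m):avg:etl.job.duration{team:data-eng} by {job_name} > 3600",
    "message": "ETL job {{job_name.name}} is running long. @slack-data-oncall",
    "options": {"thresholds": {"critical": 3600, "warning": 2700}},
}

resp = requests.post(DD_API, json=monitor, headers=headers)
resp.raise_for_status()
print(resp.json()["id"])  # id of the created monitor
```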
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a skilled Salesforce Architect to lead the design and execution of impactful integration strategies that drive seamless collaboration between Salesforce and other systems, ensuring optimal functionality and efficiency across our organizational platforms.

Responsibilities
- Design and architect end-to-end integration solutions for Salesforce, ensuring seamless data flow and system interoperability.
- Collaborate with stakeholders to understand business requirements and translate them into scalable and efficient integration designs.
- Define integration patterns, protocols, and standards for integrating Salesforce with various systems, both internal and external.
- Assess and recommend integration tools, middleware, and technologies to facilitate data synchronization and communication between Salesforce and other applications.
- Develop integration strategies and roadmaps to support the organization's overall business objectives and technology landscape.
- Lead and guide development teams in implementing integration solutions, providing technical expertise and best practices.
- Conduct thorough testing and quality assurance activities to ensure the reliability, performance, and security of integrations.
- Monitor and troubleshoot integration processes, identifying and resolving issues to minimize downtime and data discrepancies.
- Collaborate with cross-functional teams, including developers, administrators, and business analysts, to ensure successful integration outcomes.
- Stay up to date with the latest Salesforce integration features, updates, and industry trends, and evaluate their applicability to the organization's integration needs.
- Provide guidance and support during the full project lifecycle, from requirements gathering and design to deployment and post-implementation support.
- Configure and integrate Salesforce External Objects to enable real-time access to external data sources.
- Leverage out-of-the-box (OOTB) integration capabilities of Salesforce, such as Salesforce Connect, Platform Events, and REST/SOAP APIs (a short publish sketch follows this listing).
- Document integration architectures, technical specifications, and configurations for future reference and knowledge sharing.

Requirements
- Extensive experience in designing and architecting complex integration solutions involving Salesforce.
- Strong knowledge of Salesforce integration technologies and tools, such as MuleSoft, Informatica Cloud, Salesforce Connect, Platform Events, and RESTful APIs.
- Proficiency in integration patterns, message formats (JSON, XML), protocols (SOAP, HTTP, FTP), and security standards (OAuth, SSL).
- In-depth understanding of enterprise integration patterns and best practices, including ETL, data mapping, and data synchronization.
- Familiarity with integration challenges related to cloud platforms, on-premises systems, databases, and third-party applications.
- Excellent problem-solving and analytical skills, with the ability to identify and resolve complex integration issues.
- Strong communication and collaboration skills to effectively interact with stakeholders at various levels of the organization.
- Experience leading and guiding development teams, providing technical leadership and mentorship.
- Salesforce certifications, such as Salesforce Certified Integration Architecture Designer, are highly desirable.
- Knowledge of configuring Salesforce External Objects and leveraging OOTB integration capabilities such as the Canvas SDK, etc.
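To illustrate the OOTB integration surface mentioned above, here is a hedged sketch of publishing a Salesforce Platform Event through the standard REST API from Python. The org URL, API version, event name, and fields are assumptions for the example; the /services/data/vXX.X/sobjects/ pattern is the standard Salesforce REST shape.

```python
import requests

# Placeholders: the instance URL and token come from your org's OAuth flow.
INSTANCE = "https://mycompany.my.salesforce.com"
TOKEN = "<oauth-access-token>"

# Hypothetical event (Order_Update__e) with hypothetical custom fields.
event = {"OrderId__c": "801xx0000000001", "Status__c": "SHIPPED"}

resp = requests.post(
    f"{INSTANCE}/services/data/v59.0/sobjects/Order_Update__e",
    json=event,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
print(resp.json())  # {"id": ..., "success": true} once the event is queued
```

Because Platform Events ride the event bus, subscribers (flows, Apex triggers, or external CometD/Pub-Sub API clients) receive the message asynchronously, which is what makes this pattern useful for decoupled integrations.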
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content, wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Job Description: We are seeking an experienced Developer with expertise in ETL and AWS Glue and Data Engineer experience, combined with strong skills in Java and SQL. The ideal candidate will have 5-8 years of experience designing, developing, and implementing ETL processes and data integration solutions. Responsibilities include developing ETL pipelines using AWS Glue (a minimal job sketch follows this listing), managing data workflows with Informatica, and writing complex SQL queries. Strong problem-solving abilities and experience with data warehousing are essential.

Key Skills:
- Proficiency in AWS Glue and Informatica ETL tools
- Strong Java programming skills
- Advanced SQL querying and optimization
- Experience with data integration and data warehousing
- Excellent problem-solving and analytical skills

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.
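For orientation, a minimal AWS Glue job of the kind this role would build might look like the sketch below. Glue ETL scripts are authored in Python (PySpark) or Scala; this uses the Python flavor, and the catalog database, table, and S3 bucket names are placeholders. The awsglue libraries are only available inside a Glue job run, not on a plain local machine.

```python
from awsglue.transforms import ApplyMapping
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
glue = GlueContext(sc)
job = Job(glue)
job.init("orders-etl")  # job name is illustrative

# Read from the Glue Data Catalog (hypothetical database/table).
orders = glue.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Rename and cast columns on the way through.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amt", "double", "amount", "double")],
)

# Write curated Parquet to S3 (hypothetical bucket).
glue.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```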
Posted 1 week ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Data Engineer

Position Summary: The Data Engineer is responsible for building and maintaining data pipelines, ensuring the smooth operation of data systems, and optimizing workflows to meet business requirements. This role will support data integration and processing for various applications.

Minimum Qualifications: 6+ years of overall IT experience, with a minimum of 4 years of work experience in the tech skills below.

Tech Skills:
- Proficient in Python scripting and PySpark for data processing tasks (see the sketch after this listing).
- Strong SQL capabilities, with hands-on experience managing big data using ETL tools like Informatica.
- Experience with the AWS cloud platform and its data services, including S3, Redshift, Lambda, EMR, Airflow, Postgres, SNS, and EventBridge.
- Skilled in Bash shell scripting.
- Understanding of data lakehouse architecture, particularly the Iceberg format, is a plus.

Preferred:
- Experience with Kafka and MuleSoft API.
- Understanding of healthcare data systems is a plus.
- Experience in Agile methodologies.
- Strong analytical and problem-solving skills.
- Effective communication and teamwork abilities.

Responsibilities:
- Develop and maintain data pipelines and ETL processes to manage large-scale datasets.
- Collaborate on designing and testing data architectures to align with business needs.
- Implement and optimize data models for efficient querying and reporting.
- Assist in the development and maintenance of data quality checks and monitoring processes.
- Support the creation of data solutions that enable analytical capabilities.
- Contribute to aligning data architecture with overall organizational solutions.
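A minimal sketch of the PySpark processing this posting describes: read raw files, standardize types, aggregate, and write partitioned Parquet. The paths and column names are placeholders, and the healthcare-flavored fields are an assumption chosen to match the "healthcare data systems" preference above.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-pipeline").getOrCreate()

# Read raw CSVs and standardize types (hypothetical bucket and columns).
claims = (
    spark.read.option("header", True).csv("s3://example-bucket/raw/claims/")
    .withColumn("claim_amount", F.col("claim_amount").cast("double"))
    .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
)

# Aggregate to a daily per-provider grain.
daily = (
    claims.groupBy("provider_id", "service_date")
    .agg(F.sum("claim_amount").alias("total_amount"),
         F.count("*").alias("claim_count"))
)

# Write partitioned Parquet for downstream query engines.
daily.write.mode("overwrite").partitionBy("service_date") \
     .parquet("s3://example-bucket/curated/daily_claims/")
```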
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content, wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Job Description: We are seeking an experienced Developer with expertise in ETL and AWS Glue and Data Engineer experience, combined with strong skills in Java and SQL. The ideal candidate will have 5-8 years of experience designing, developing, and implementing ETL processes and data integration solutions. Responsibilities include developing ETL pipelines using AWS Glue, managing data workflows with Informatica, and writing complex SQL queries. Strong problem-solving abilities and experience with data warehousing are essential.

Key Skills:
- Proficiency in AWS Glue and Informatica ETL tools
- Strong Java programming skills
- Advanced SQL querying and optimization
- Experience with data integration and data warehousing
- Excellent problem-solving and analytical skills

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Data Architect (C2)

Job Summary
- The Data Architect will provide technical expertise in the analysis, design, development, rollout, and maintenance of enterprise data models and solutions.
- Provides technical expertise in needs identification, data modelling, data movement, and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective.
- Understands and leverages best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
- Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
- Leadership not only in the conventional sense: within a team we expect people to be leaders. Candidates should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Essential Duties
- Design and develop conceptual/logical/physical data models for building large-scale data lake and data warehouse solutions.
- Understanding of data integration processes (batch or real-time) using tools such as Informatica PowerCenter and/or Cloud, Microsoft SSIS, MuleSoft, DataStage, Sqoop, etc.
- Create functional and technical documentation, e.g., data integration architecture documentation, data models, data dictionaries, data integration specifications, and data testing plans.
- Collaborate with business users to analyse and test requirements.
- Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for data architecture.
- Assist with and support setting the data architecture direction (including data movement approach, architecture/technology strategy, and any other data-related considerations to ensure business value), ensuring data architecture deliverables are developed, ensuring compliance with standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
- Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.

Education & Experience
- 5-10 years of enterprise data modelling experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.).
- Expert proficiency in data contracts, data modelling, and Data Vault 2.0.
- Experience with major database platforms (e.g., Oracle, SQL Server, Teradata, etc.).
- Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.); a star-schema sketch follows this listing.
- 3-5 years of management experience required.
- 3-5 years of consulting experience preferred.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Experience in data analysis and profiling.
- Strong knowledge of data warehousing and OLTP systems from a modelling and integration perspective.
- Strong understanding of data integration best practices and concepts.
- Strong development experience under Unix and/or Windows environments.
- Strong SQL skills required; scripting (e.g., PL/SQL) preferred.
- Strong knowledge of all phases of the system development life cycle.
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data.
- Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, Google Cloud).

Preferred Skills & Experience
- Comprehensive understanding of relational databases and technical documentation.
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions.
- Ability to transform business requirements into technical requirement documents.
- Ability to run conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two together.
- Can create documentation and presentations such that they stand on their own.
- Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered.
- Ability to work independently on projects as well as collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team building, interpersonal, analytical, problem identification, and resolution skills.
- Experience working with multi-level business communities.
- Can effectively utilise SQL and/or an available BI tool to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM.
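To anchor the dimensional-modeling expectations above, here is a minimal star-schema sketch (one date dimension plus a sales fact) expressed as DDL and run through SQLite from Python so it is self-contained. All names are illustrative, not a prescribed model; a production design would carry more dimensions and conformed surrogate keys.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: one row per calendar day, keyed by a surrogate integer.
    CREATE TABLE dim_date (
        date_key   INTEGER PRIMARY KEY,   -- e.g. 20240601
        full_date  TEXT NOT NULL,
        month      INTEGER NOT NULL,
        year       INTEGER NOT NULL
    );

    -- Fact: additive measures at the (date, product) grain.
    CREATE TABLE fact_sales (
        date_key   INTEGER NOT NULL REFERENCES dim_date(date_key),
        product_id TEXT NOT NULL,         -- business key kept degenerate for brevity
        quantity   INTEGER NOT NULL,
        net_amount REAL NOT NULL
    );
""")
print("star schema created")
```

The fact table stays narrow and additive while descriptive attributes live in the dimension, which is what lets BI tools slice measures by any dimension attribute with a single join.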
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
- Work in close partnership with the business leadership team to execute the analytics agenda.
- Identify and incubate best-in-class external partners to drive delivery on strategic projects.
- Develop custom models/algorithms to uncover signals, patterns, and trends to drive long-term business performance.
- Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver.

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Using data analysis to make recommendations to senior leaders.
- Technical experience in roles in best-in-class analytics practices.
- Experience deploying new analytical approaches in a complex and highly matrixed organization.
- Savvy in the usage of analytics techniques to create business impact.

As the Data & Analytics Manager for the Consumer tower, you will be involved in driving the Data & Analytics strategic vision and roadmap, building momentum by rallying the rest of the organization, implementing identified data and analytics priorities to deliver strong business value across all levels of the organization at the right cost structure, and leading the development of a cutting-edge solution that aims to bring BI and AI to the marketing space.

How You Will Contribute
You will:
- Be responsible for the analytics engagement with the Marketing function at MDLZ and support the Consumer D&A Lead in driving the D&A agenda and roadmap for the market by collaborating closely with senior business stakeholders to deliver strong business value across functions in the organization.
- Consult, influence, and collaborate with business stakeholders to craft analytics methodologies and solutions relevant to their needs and use cases, applying techno-functional expertise in data and AI as part of the solution-forming process.
- Oversee the day-to-day technical development of the solution, creating strong alignment in the working team to achieve goals.
- Collaborate across Business and MDS functions to build and develop demand by maximizing our D&A resources across Data Management, Analytics Products, and Data Science. This requires strong collaboration and influencing skills to drive adoption, relevancy, and business impact with speed. Minimizing complexity, establishing the right ways of working to accelerate the path to value by being choiceful and creative, and having a growth mindset every day will be essential.
- Validate the weekly progress of the technical teams and lead business user tests, ensuring product quality before the product gets into the hands of the internal customer.
- Create value from business-driven data and analytics initiatives at scale.

An important task for the Data & Analytics Manager is to support the business and other stakeholders in solving their business problems via relevant data and analytics. This means supporting the business during various stages. In the inspiration phase, they help the business identify the right use cases and support prioritization based on feasibility and value.
In the ideation phase, they help with the development of a minimum viable product and business case. During the implementation phase, they make sure the service or product is adopted (by the employees), embedded in the workflow (process), and measured for impact. They also help uncover and translate business requirements and stakeholder needs. This translation needs to be done in such a way that the technical specialists in the D&A team can understand it (Data Management, Analytics Products, Partners, and/or Data Science resources). This requires an understanding of the business objectives, goals, and domain expertise, as well as data, analytics, and technology concepts, methods, and techniques. It also requires strong soft skills with a focus on communication. The role will lead analytics delivery for Marketing initiatives and BI development (reports, dashboards, visualization) and/or data governance (stewardship best practices).

Data & Analytics Skills
Must have a good understanding of the concepts, methods, and techniques used in:
- Analytics, for example diagnostic, descriptive, predictive, and prescriptive.
- AI, for example machine learning and natural language processing.
- Data management, for example data integration (ETL) or metadata.
- Data architecture, for example the difference between a data warehouse, a data lake, or a data hub.
- Data modelling, for the creation of the right reusable data assets.
- Data governance, for example MDM, data quality, and data stewardship practices.
- Statistical skills, for example understanding the difference between correlation and causation.

Technology Skills
Good understanding of the tools and technologies in the D&A team:
- Programming languages like SQL, Python, or R, and notebooks like RStudio or Jupyter.
- Data integration tools like Informatica or Talend.
- Analytics and Business Intelligence tools like Microsoft Power BI or Tableau.

Soft Skills
- Leadership with a high level of self-initiative and drive, for example leading the discussions on the D&A agenda in the BU and building a combined vision across multiple stakeholders.
- Communication, for example conveying information to diverse audiences in a way that is easily understood and actionable.
- Facilitation and conflict resolution, for example hosting sessions to elicit ideas from others, understand their issues, and encourage group participation.
- Creative thinking and being comfortable with unknown or uncharted territories, for example framing new concepts for business teams and brainstorming with business users about future products and services.
- Teamwork, for example working with both business domain teams as well as D&A teams and MDS stakeholders.
- Collaboration, for example fostering group problem solving and solution creation with business and technical team members.
- Relationship management, for example creating relationships and building trust with internal and external stakeholders quickly.
- Storytelling, for example creating a consistent, clear storyline for better understanding.
- Influencing, for example asserting ideas and persuading others to gain support across an organization or to adopt new behaviors.

Domain Skills
Must have a good understanding of the business process and associated data:
- Business acumen, for example understanding business concepts, practices, and business domain language to engage in problem-solving sessions and discuss business issues in stakeholder language.
- Relevant experience in Data and Analytics; CPG or FMCG experience is preferred.
- Business process transformation, for example the ability to understand how D&A can help redesign the way work is done.
- Business data: Nielsen/Circana or other EPOS/retail sales data sources; Kantar/GfK or other household panel sources.

Other Skills
- Agility and a growth mindset will be crucial.
- Project management capabilities, including the ability to manage risks, for example understanding project management concepts to organize one's own work, and the ability to collaborate with project managers to align business expectations with the D&A team's delivery capabilities.
- Vendor negotiation and effort estimation skills, for example to engage the right partner skills at the right cost based on the complexity and importance of the initiatives to be delivered or supported.
- Business case development, for example to help develop support for experimenting with selected use cases or to measure the impact and business value created. For this, the role can collaborate with business analysts in sales, marketing, RGM, finance, or supply chain.
- Decision modelling, for example supporting decision makers and improving complicated decisions that involve trade-offs among alternative courses of action by using decision-problem models.
- UX/design, for example creating products and visualizations that are easy to work with and that support the activities required by the end users.

Within-country relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
Posted 1 week ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role
Grade Level (for internal use): 09

S&P Global – Corporate

About the Role: Software Developer II - Oracle EPM

The Team: Join the Corporate Finance IT EPM Team, responsible for developing and managing Oracle Enterprise Performance Management (EPM) applications. Our work supports Financial Reporting, Revenue, Corporate, Statutory, and Tax reporting, as well as Master Data management (EDMCS), Consolidations (FCCS), Reconciliations (ARCS), and Financial Close processes in a techno-functional project environment.

Responsibilities and Impact: You will serve as an Administrator for the Oracle EPM suite, working closely with the EPM development team to enhance system processes and the user experience. This role is essential for overseeing accounting period close and consolidation processes, ensuring compliance with SOX policies and procedures. Your expertise in reporting, reconciliation, and audit requests will support our global finance operations effectively.

- Administer the EPM Production environment, assisting global users with financial analysis.
- Serve as primary Admin on the Oracle EPM Financial Consolidation and Close Cloud Service (FCCS) application.
- Manage data load schedules from ERP and ensure data integrity through rigorous reconciliation processes.
- Manage the estimate data flows from Anaplan (estimating/budgeting system) to EPM via Informatica.
- Support the categorization, data mapping, and governance for financial account requests, controlling reporting structure changes.
- Conduct UAT testing and approvals for system enhancements.
- Collaborate with internal and external partners to enhance system stability, performance, and functionality.
- Utilize cutting-edge technologies and automation initiatives to enhance system functionality.
- Provide ad-hoc support for timely closure of accounting books and resolve issues efficiently.
- Maintain thorough documentation and work on process enhancements, incorporating automation tools where applicable.
- Maintain data security access in all EPM pods and Anaplan models.

What We're Looking For
Basic Required Qualifications:
- Certified Chartered Accountant or Cost Accountant degree or equivalent preferred.
- Over 3 years of experience in finance and accounting operations, including record-to-report functions.
- Proficiency in reporting tools and experience with Oracle EPM systems or equivalent; experience with the Oracle EPM system, the HFM application, or an equivalent is preferred.
- Strong communication skills for collaboration across teams and management.
- Ability to manage workload efficiently, meet deadlines, and adapt to changing priorities.
- Experience in cloud platform transitions and system integration.
- Assertive problem-solving skills and the ability to work independently.
- Knowledge of all Microsoft Office products, specifically Outlook, Excel, and Word.
- Must be able to work independently, be accountable for processes/tasks performed, and understand when to escalate issues to management.
- Flexible to work shifting schedules, primarily to match extended US working hours (EST time zone), and to render overtime when there is a strong business need, such as monthly closing of financial books or preparation of financial or reporting statements.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day.
We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Job ID: 315305
Posted On: 2025-05-15
Location: Hyderabad, Telangana, India
Posted 1 week ago
130.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
R3 Senior Manager – Data and Analytics Architect

The Opportunity: Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centres focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview: We are seeking a highly motivated and hands-on Data & Analytics Architect to join our Strategy & Architecture team within CDNA. This mid-level role will play a critical part in designing scalable, reusable, and secure data and analytics solutions across the enterprise. You will work under the guidance of a senior architect and be directly involved in the implementation of architectural patterns, reference solutions, and technical best practices. This is a highly technical role, ideal for someone who enjoys problem-solving, building frameworks, and working in a fast-paced, collaborative environment.

What Will You Do In This Role
- Partner with senior architects to define and implement modern data architecture patterns and reusable frameworks.
- Design and develop reference implementations for ingestion, transformation, governance, and analytics using tools such as Databricks (must-have), Informatica, AWS Glue, S3, Redshift, and DBT.
- Contribute to the development of a consistent and governed semantic layer, ensuring alignment in business logic, definitions, and metrics across the enterprise.
- Work closely with product line teams to ensure architectural compliance, scalability, and interoperability.
- Build and optimize batch and real-time data pipelines, applying best practices in data modeling, transformation, and metadata management.
- Contribute to architecture governance processes, participate in design reviews, and document architectural decisions.
- Support mentoring of junior engineers and help foster a strong technical culture within the India-based team.

What Should You Have
- Bachelor's degree in Information Technology, Computer Science, or any technology stream.
- 5-8 years of experience in data architecture, data engineering, or analytics solution delivery.
- Proven hands-on experience with Databricks (must), Informatica, the AWS data ecosystem (S3, Glue, Redshift, etc.), and DBT.
Solid understanding of semantic layer design, including canonical data models and standardized metric logic for enterprise reporting and analytics.
Proficient in SQL, Python, or Scala.
Strong grasp of data modeling techniques (relational, dimensional, NoSQL), ETL/ELT design, and streaming data frameworks.
Knowledge of data governance, data security, lineage, and compliance best practices.
Strong collaboration and communication skills across global and distributed teams.
Experience with Dataiku or similar data science/analytics platforms is a plus.
Exposure to AI/ML and GenAI use cases is advantageous.
Background in pharmaceutical, healthcare, or life sciences industries is preferred.
Familiarity with API design, data services, and event-driven architecture is beneficial.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today.

#HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Enterprise Architecture (BEA), Business Process Modeling, Data Modeling, Emerging Technologies, Requirements Management, Solution Architecture, Stakeholder Relationship Management, Strategic Planning, System Designs
Preferred Skills:
Job Posting End Date: 06/15/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R341138
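As a loose illustration of the reference-implementation work this role describes, the sketch below shows a minimal Databricks-style ingestion and transformation step in PySpark. It is a hypothetical example only: the S3 path, table names, column names, and the net_revenue metric are all invented, and the pattern assumes a Delta-enabled Spark environment such as a Databricks cluster.

```python
# Hypothetical Databricks-style ingestion/transformation sketch (PySpark).
# On Databricks the `spark` session already exists; the builder below
# simply keeps the script self-contained.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("reference-ingestion").getOrCreate()

# Invented raw landing path and schema.
raw = spark.read.option("header", "true").csv("s3://example-bucket/landing/orders/")

# Standardize types and derive the governed metric once, centrally.
curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn(
        "net_revenue",
        F.col("gross_amount").cast("double") - F.col("discount_amount").cast("double"),
    )
    .dropDuplicates(["order_id"])
)

# Publish a table that DBT models and BI tools can build on
# (assumes a `curated` schema/database already exists).
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```

Deriving the metric once in the curated layer is what keeps downstream DBT models and dashboards aligned on a single definition, which is the point of the governed semantic layer described above.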
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Specialist – Data Visualization

Our Human Health Digital, Data and Analytics (HHDDA) team is innovating how we understand our patients and their needs. Working cross-functionally, we are inventing new ways of communicating, measuring, and interacting with our customers and patients, leveraging digital, data and analytics.

Are you passionate about helping people see and understand data? You will take part in an exciting journey to help transform our organization to be the premier data-driven company. As a Specialist in Data Visualization, you will focus on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Responsibilities
Develop user-centric, scalable visualization and analytical solutions, leveraging complex data sources to create intuitive and insightful dashboards. Apply best practices in visualization design to enhance user experience and business impact.
Drive business engagement, collaborating with stakeholders to define key metrics, KPIs, and reporting needs (a brief code sketch appears at the end of this listing). Facilitate workshops to develop user stories, wireframes, and interactive visualizations.
Partner with data engineering, data science, and IT teams to develop scalable, business-friendly reporting solutions.
Ensure adherence to data governance, privacy, and security best practices.
Identify opportunities for automation, streamlining manual reporting processes through modern visualization technologies and self-service analytics enablement.
Provide thought leadership, driving knowledge-sharing within the Data & Analytics organization while staying ahead of industry trends to enhance visualization capabilities.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
Ensure timely delivery of high-quality outputs, while creating and maintaining SOPs, KPI libraries, and other essential governance documents.

Required Experience And Skills
5+ years of experience in business intelligence, insight generation, business analytics, data visualization, infographics, and interactive visual storytelling.
Hands-on expertise in BI and visualization tools such as Power BI, MicroStrategy, and ThoughtSpot.
Solid understanding of data engineering and modeling, including ETL workflows, Dataiku, Databricks, Informatica, and database technologies like Redshift and Snowflake, with programming skills in SQL and Python.
Deep knowledge of pharma commercial data sources, including IQVIA, APLD, Claims, Payer, Salesforce, Financials, Veeva, Komodo, IPSOS, and other industry datasets, to drive strategic insights.
Experience in pharmaceutical commercial analytics, including Field Force Effectiveness, customer engagement, market performance assessment, as well as web, campaign, and digital engagement analytics.
Strong problem-solving, communication, and project management skills, with the ability to translate complex data into actionable insights and navigate complex matrix environments efficiently.
Strong product management mindset, ensuring analytical solutions are scalable, user-centric, and aligned with business needs, with experience in defining product roadmaps and managing solution lifecycles.
Expertise in agile ways of working, including Agile/Scrum methodologies, iterative development, and continuous improvement in data visualization and analytics solutions.

Our Human Health Division maintains a "patient first, profits later" ideology. The organization is comprised of sales, marketing, market access, digital analytics, and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.

We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another's thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Data Visualization, Requirements Management, User Experience (UX) Design
Preferred Skills:
Job Posting End Date: 06/30/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R335067
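Much of this role centers on defining a KPI once and reusing it across dashboards. As a loose, hypothetical sketch (the columns and the calls-per-rep metric are invented for illustration), preparing such a KPI table in Python with pandas might look like this:

```python
# Hypothetical sketch: shape a dashboard-ready KPI table in pandas.
import pandas as pd

activity = pd.DataFrame({
    "rep_id": ["R1", "R1", "R2", "R2", "R2"],
    "region": ["North", "North", "South", "South", "South"],
    "calls": [4, 6, 3, 5, 7],
})

# One governed definition of the KPI, reused by every dashboard that needs it.
kpi = (
    activity.groupby("region", as_index=False)
    .agg(total_calls=("calls", "sum"), reps=("rep_id", "nunique"))
)
kpi["calls_per_rep"] = kpi["total_calls"] / kpi["reps"]
print(kpi)
```

In practice the same aggregation would usually live in the semantic layer or a governed dataset feeding Power BI, rather than in ad hoc scripts.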
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Job Description:
We are seeking an experienced Developer with expertise in ETL, AWS Glue, and data engineering, combined with strong skills in Java and SQL. The ideal candidate will have 5-8 years of experience designing, developing, and implementing ETL processes and data integration solutions. Responsibilities include developing ETL pipelines using AWS Glue (a brief code sketch follows this listing), managing data workflows with Informatica, and writing complex SQL queries. Strong problem-solving abilities and experience with data warehousing are essential.

Key Skills:
Proficiency in AWS Glue and Informatica ETL tools
Strong Java programming skills
Advanced SQL querying and optimization
Experience with data integration and data warehousing
Excellent problem-solving and analytical skills

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.
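For context on the Glue work mentioned above, the sketch below shows the standard skeleton of an AWS Glue ETL script in Python (Glue also supports Scala). The catalog database, table, mappings, and S3 path are placeholders, and the script only runs inside a Glue job environment where the awsglue libraries are available.

```python
# Standard AWS Glue ETL script skeleton (runs only inside a Glue job,
# where the awsglue libraries are provided). Names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])  # passed by Glue at run time

glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from a hypothetical Glue Data Catalog table.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Rename/cast columns into the warehouse-friendly layout.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "string", "amount", "double"),
    ],
)

# Land the transformed data in S3 as Parquet for downstream loads.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()
```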
Posted 1 week ago
1.0 - 6.0 years
3 - 8 Lacs
Hyderabad
Work from Office
What you will do
In this role with the Veeva Vault team, you will design, develop, and maintain software applications in Amgen's Veeva Vault eTMF. You will ensure system availability and performance, collaborating with product managers, designers, and engineers to create scalable solutions. Your tasks include automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:
Possess strong rapid prototyping skills and quickly translate concepts into working code.
Lead day-to-day operations and maintenance of Amgen's Veeva Vault eTMF and hosted applications.
Stay updated with the latest trends, advancements, and standard practices for the Veeva Vault Platform ecosystem.
Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
Develop and implement unit tests, integration tests, and other testing strategies to ensure the quality of the software, following the IS change control and GxP Validation process while exhibiting expertise in Risk Based Validation methodology.
Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
Maintain detailed documentation of software designs, code, and development processes.
Work on integrating with other systems and platforms to ensure seamless data flow and functionality (a brief code sketch appears at the end of this listing).
Stay up to date on Veeva Vault features, new releases, and best practices around Veeva Platform governance.

Basic Qualifications:
Master's degree and 1 to 3 years of Computer Science, IT or related field experience, OR
Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience, OR
Diploma and 7 to 9 years of Computer Science, IT or related field experience

Functional Skills:
Must-Have Skills:
Experience with Veeva Vault eTMF, including Veeva configuration settings and custom builds.
Strong knowledge of information systems and network technologies.
6-8 years of experience working in the global pharmaceutical industry.
Experience in building configured and custom solutions on the Veeva Vault Platform.
Experience in managing systems, and implementing and validating projects in GxP regulated environments.
Extensive expertise in SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
Proficiency in programming languages such as Python, JavaScript etc.
Strong understanding of software development methodologies, including Agile and Scrum.
Experience with version control systems such as Git.

Preferred Qualifications:
Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL etc.)
Proficiency in programming languages such as Python, JavaScript or other programming languages
Outstanding written and verbal communication skills, and ability to translate technical concepts for non-technical audiences.
Experience with ETL tools (Informatica, Databricks).
Experience with API integrations such as MuleSoft.
Solid understanding of and proficiency in writing SQL queries.
Hands-on experience with reporting tools such as Tableau, Spotfire & Power BI.

Professional Certifications:
Veeva Vault Platform Administrator or Equivalent Vault Certification (Must-Have)
SAFe for Teams (Preferred)

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills. Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
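Integration work against Veeva Vault typically goes through its REST API. The sketch below is a minimal, hypothetical example of authenticating and running a VQL query with Python's requests library; the vault domain, API version, credentials, and query fields are placeholders, so verify endpoint details against the Vault API documentation for your version.

```python
# Minimal sketch of a Veeva Vault REST API call (auth + VQL query).
# Endpoint shapes follow Vault's public REST API; the vault domain,
# API version, and credentials below are placeholders, not real values.
import requests

VAULT = "https://example.veevavault.com"
API = f"{VAULT}/api/v24.1"

# 1. Authenticate to obtain a session ID.
auth = requests.post(
    f"{API}/auth",
    data={"username": "svc_user@example.com", "password": "change-me"},
    timeout=30,
)
auth.raise_for_status()
session_id = auth.json()["sessionId"]

# 2. Run a VQL query, e.g. list eTMF documents (fields illustrative).
#    Vault expects the raw session ID in the Authorization header.
resp = requests.post(
    f"{API}/query",
    headers={"Authorization": session_id, "Accept": "application/json"},
    data={"q": "SELECT id, name__v FROM documents LIMIT 10"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("data", []):
    print(record)
```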
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Position: Lead Data Engineer
Experience: 7+ Years
Location: Remote

Summary
We are looking for a Lead Data Engineer responsible for ETL processes and documentation in building scalable data warehouses and analytics capabilities. This role involves maintaining existing systems, developing new features, and implementing performance improvements.

Key Responsibilities
Build ETL pipelines using Fivetran and dbt for internal and client projects across platforms like Azure, Salesforce, and AWS.
Monitor active production ETL jobs.
Create and maintain data lineage documentation to ensure complete system traceability.
Develop design/mapping documents for clear and testable development, QA, and UAT.
Evaluate and implement new data integration tools based on current and future requirements.
Identify and eliminate process redundancies to streamline data operations.
Work with the Data Quality Analyst to implement validation checks across ETL jobs (a brief code sketch follows this listing).
Design and implement large-scale data warehouses, BI solutions, and Master Data Management (MDM) systems, including Data Lakes/Data Vaults.

Required Skills & Qualifications
Bachelor's degree in Computer Science, Software Engineering, Math, or a related field.
6+ years of experience in data engineering, business analytics, or software development.
5+ years of experience with strong SQL development skills.
Hands-on experience in Snowflake and Azure Data Factory (ADF).
Proficient in ETL toolsets such as Informatica, Talend, dbt, and ADF.
Experience with PHI/PII data and working in the healthcare domain is preferred.
Strong analytical and critical thinking skills.
Excellent written and verbal communication.
Ability to manage time and prioritize tasks effectively.
Familiarity with scripting and open-source platforms (e.g., Python, Java, Linux, Apache, Chef).
Experience with BI tools like Power BI, Tableau, or Cognos.
Exposure to Big Data technologies: Snowflake (Snowpark), Apache Spark, Hadoop, Hive, Sqoop, Pig, Flume, HBase, MapReduce.
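As a small illustration of the validation-check responsibility above, the sketch below compares staged and published row counts in Snowflake using the snowflake-connector-python package. Account details, table names, and the pass/fail rule are invented for the example.

```python
# Minimal sketch of a post-load validation check for an ETL job, assuming
# Snowflake as the warehouse. All connection and table values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="etl_monitor",
    password="change-me",
    warehouse="ANALYTICS_WH",
    database="DW",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Compare staged row counts against the published target table.
    cur.execute("SELECT COUNT(*) FROM staging.orders")
    staged = cur.fetchone()[0]
    cur.execute("SELECT COUNT(*) FROM curated.orders")
    published = cur.fetchone()[0]

    # Fail loudly so the orchestrator (e.g. ADF or Airflow) marks the run red.
    if staged != published:
        raise ValueError(f"Row count mismatch: staged={staged}, published={published}")
    print("Validation passed:", published, "rows")
finally:
    conn.close()
```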
Posted 1 week ago
4.0 years
0 Lacs
India
Remote
Job Title: Salesforce/Informatica BI Analyst
Location: Remote (India)
Type: Full-time | Immediate Joiners Preferred

About the Role:
We are looking for a hands-on Salesforce/Informatica BI Analyst to support critical integrations and data flows within our pharmaceutical business line. This role requires a solid understanding of Salesforce Health Cloud, Informatica, and cloud-based platforms such as AWS, Google Cloud (OPH), and Airflow. You'll be responsible for managing data pipelines, troubleshooting environments via SSH, and ensuring seamless data migration across systems to support analytics and operations.

Key Responsibilities:
Work with Salesforce Health Cloud and Informatica to manage data integration workflows.
Enable and monitor data flows to/from Informatica using secure environments, including SSH access and virtual machines.
Troubleshoot and resolve data or infrastructure issues across cloud and on-premise systems.
Support migration of data outputs into Airflow pipelines and into the AWS ecosystem (a brief code sketch follows this listing).
Collaborate on integrations with Google Cloud (OPH) and manage OAPI/pharma-related data workflows.
Work closely with engineering and analytics teams to support business intelligence needs.

Qualifications:
4+ years of experience with Salesforce Health Cloud and Informatica.
Experience managing data pipelines, ETL processes, and real-time data transfers.
Strong hands-on experience with SSH, VMs, and Linux-based troubleshooting.
Familiarity with Airflow, AWS (S3, EC2, RDS, etc.), and Google Cloud Platform (GCP).
Understanding of pharma/OAPI industry data structures is a strong plus.
Excellent problem-solving, communication, and cross-functional collaboration skills.

Nice to Have:
Prior experience in the life sciences/pharma industry.
Exposure to HIPAA-compliant environments and data privacy best practices.
Knowledge of scripting (Python, Shell) to automate tasks.

Join us to work at the intersection of healthcare, data, and technology—making a real impact on lives through insights and innovation.
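To illustrate the Airflow-and-AWS migration duty above, here is a minimal, hypothetical Airflow DAG that uploads an upstream export file to S3 with boto3. The DAG id, file path, bucket, and schedule are placeholders; note that the schedule argument is named schedule_interval on Airflow versions before 2.4.

```python
# Hypothetical Airflow DAG: hand an upstream export off to S3.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def upload_export_to_s3():
    # The export file would be produced upstream (e.g. by an Informatica task).
    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="/data/exports/health_cloud_extract.csv",  # placeholder path
        Bucket="example-pharma-data",                       # placeholder bucket
        Key="landing/health_cloud_extract.csv",
    )


with DAG(
    dag_id="health_cloud_to_aws",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # `schedule_interval` on Airflow < 2.4
    catchup=False,
) as dag:
    PythonOperator(
        task_id="upload_export_to_s3",
        python_callable=upload_export_to_s3,
    )
```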
Posted 1 week ago
4.0 years
0 Lacs
Mohali district, India
On-site
Job Title: Salesforce Technical Consultant
Experience: 4+ Years
Location: Mohali / Jaipur / Pune
Employment Type: Full-Time

Job Overview:
We are looking for an accomplished Salesforce Technical Consultant to join our dynamic team. With over 4 years of experience, you will be a key player in designing, developing, and integrating cutting-edge Salesforce solutions. This role demands a visionary technical leader who excels in Salesforce architecture, fosters innovation, and delivers scalable, high-performing solutions that align with business goals.

Key Responsibilities:
Technical Leadership
● Architect, design, and deploy sophisticated Salesforce solutions tailored to business needs.
● Provide strategic guidance and technical expertise on Salesforce best practices.
● Lead technical teams, ensuring successful project delivery and knowledge sharing.
Custom Development
● Develop and customize Salesforce applications using Apex, Visualforce, Lightning Web Components (LWC), and APIs.
● Build scalable, reusable, and maintainable components aligned with client requirements.
Integration Expertise
● Design and implement seamless integrations between Salesforce and external systems using REST/SOAP APIs and middleware tools like MuleSoft, Dell Boomi, or Informatica (a brief code sketch follows this listing).
● Ensure data integrity and synchronization across multiple platforms.
Technical Analysis & Solutioning
● Analyze complex business requirements and convert them into technical deliverables.
● Identify risks, propose mitigation strategies, and ensure robust solution design.
Collaboration & Mentorship
● Collaborate with cross-functional teams, including functional consultants and business analysts, to align technical solutions with business objectives.
● Mentor junior developers, fostering a culture of learning and growth.
Performance Optimization
● Conduct system performance audits, ensuring scalability, reliability, and adherence to Salesforce best practices.
● Optimize existing implementations for peak performance and minimal downtime.

Required Skills & Expertise:
Salesforce Development
● Mastery of Apex, Visualforce, Lightning Web Components (LWC), SOQL, and SOSL.
● Proficiency with Salesforce tools like Flows, Process Builder, and Triggers.
Integration Knowledge
● Extensive experience with API integrations and middleware platforms (MuleSoft, Informatica, Dell Boomi).
● Knowledge of Salesforce Connect, External Objects, and integration patterns.
Certifications (Preferred):
● Salesforce Certified Platform Developer I & II
● Salesforce Certified Technical Architect (CTA)
● Salesforce Certified Integration Architecture Designer
● Salesforce Certified Data Architecture and Management Designer
Technical Proficiency
● Advanced knowledge of Salesforce data modeling, security, and sharing rules.
● Experience with version control systems like Git and CI/CD pipelines.
● Familiarity with additional programming languages such as JavaScript, Java, or Python (a plus).
Other Skills:
● Exceptional problem-solving and debugging capabilities.
● Strong communication skills to effectively collaborate with both technical and non-technical stakeholders.
Education:
● Bachelor's degree in Computer Science, Information Technology, or a related field.
● A master's degree is a plus.
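As a minimal sketch of the REST integration pattern this role calls for, the example below authenticates to Salesforce with the username-password OAuth flow (chosen for brevity; JWT or client-credentials flows are generally preferred in production) and runs a SOQL query. The consumer key/secret, credentials, and API version are placeholders.

```python
# Hypothetical Salesforce REST integration from an external system.
import requests

token_resp = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "YOUR_CONSUMER_KEY",
        "client_secret": "YOUR_CONSUMER_SECRET",
        "username": "integration@example.com",
        "password": "password+security_token",  # password with appended token
    },
    timeout=30,
)
token_resp.raise_for_status()
auth = token_resp.json()

# Query Accounts through the standard REST query endpoint (SOQL).
resp = requests.get(
    f"{auth['instance_url']}/services/data/v58.0/query",
    headers={"Authorization": f"Bearer {auth['access_token']}"},
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```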
Posted 1 week ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
As a Technical Anchor working in Ford Credit IT, you will join a team that supports developing enterprise-scale applications and building SaaS products in Salesforce Service Cloud/Auto Cloud.
Work on a balanced product team to define, design, develop, and deploy Salesforce Service Cloud/Auto Cloud solutions, developing Form Data Models, Customer Data Platforms (CDP)/Interaction Studio/Journey Builder/Automation Studio/Email and Mobile Studio, Contact Builder, data extensions, data sync, sitemaps, and content blocks.
Productize (build/author) a document generation product as a SaaS (Software as a Service) product hosted on MuleSoft and Google Cloud Platform (GCP).
Build and maintain digital expertise by researching the latest industry trends and standards, driving innovation through PoCs and experiments.
Develop Salesforce Service Cloud/Auto Cloud applications.
Evaluate potential solutions using both technical and commercial criteria that support the established cost and service requirements, with a continuous improvement and innovative mindset.
Develop and automate unit and integration test scripts.
Integrate with MuleSoft applications for integrations around Sales/Service Clouds with Ford Credit systems.
Act as a mentor for less experienced developers through both your technical knowledge and your ability to inspire a team to build extraordinary impact together.
Understand the depth of user stories and provide accurate estimates.
Automate performance monitoring and notification in the event of failures using best practices and tools.
Research new technologies, and influence and implement enterprise technology shifts and new trends impacting Ford application delivery.
Do code deployments using the CI/CD Salesforce Sales Cloud and MuleSoft pipeline, with Service Cloud Copado Salesforce deployment.
Participate in a highly collaborative environment.
DevOps: Continuous Integration and Continuous Deployment (CI/CD), Security (SAST/DAST), Monitoring/logging/tracing tools (Splunk etc.)
Experience deploying using source control with Visual Studio Code/GitHub repos/Copado.
Strong sense of code, with the ability to review code using SonarQube and Checkmarx, rework it, and deliver quality code.
Build reusable components using LWC components, AMPscript, Server-Side JavaScript (SSJS), and SQL.
Integrate Salesforce Marketing Cloud with external systems using SFMC APIs (a brief code sketch follows this listing).
Follow enterprise architecture processes and advise teams on cloud design, development, architecture, and service blueprints.
Engage in Agile practices including, but not limited to, stand-ups, backlog grooming, sprint demos, and journey mapping.

Responsibilities
B.E. / B.Tech / M.C.A
Minimum 7 years of experience developing Salesforce Service/Auto Cloud customizations.
Extensive experience in AMPscript, Apex, JavaScript, Lightning components, Aura Components and Lightning Web Components, OmniScript, and Velocity.
Must have experience in Contact Builder, data extensions, data sync, sitemaps, and content blocks.
Lead Service/Auto Cloud data modeling and architecture, including data extension modeling and cross-product data architecture and mapping.
Ability to integrate MuleSoft, Informatica, GraphQL, Medallia, and Emplifi.
Ability to create flows, modify objects, create custom objects, write Apex and triggers, and integrate API services using an IDE.
Demonstrated ability to drive development of highly technical technology services and capabilities.
Experience with the Salesforce.com Apex Data Loader and Salesforce.com web services APIs/Platform Events/Change Data Capture/REST/Pub-Sub.
Strong sense of code, with the ability to review code using SonarQube and Checkmarx, rework it, and deliver quality code.
Demonstrated experience in Customer Data Platforms (CDP)/Interaction Studio/Journey Builder/Automation Studio/Email and Mobile Studio.
Demonstrated experience establishing and maintaining data structures, data extensions, and automations within Salesforce Service/Auto Cloud.
Experience in enterprise data analytics, reporting, and monitoring using Splunk, Dynatrace, healthnut, etc.

Qualifications
5+ years of experience in architecting and implementing fault-tolerant, highly available Service/Auto Cloud API/REST/SOAP and Platform Events (Pub/Sub).
Salesforce Service/Auto Cloud Developer/Consultant/Salesforce Application Architect certifications will be an added advantage.
Should have SQL knowledge and experience writing database scripts using DDL or queries using DML.
Experience in SRE in Copado and ability to architect services considering observability, traceability, and monitoring aspects.
At least 4 years of experience in the Agile Scrum software development process.
Ability to work in a team in a diverse/multiple-stakeholder environment.
Experience and desire to work in a global delivery environment.
Excellent communication skills with the ability to adapt your communication style to the audience.
Demonstrated ability to drive development of highly technical technology services and capabilities.
Experience deploying using source control with change sets and CI/CD pipelines.
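As a hypothetical sketch of the SFMC API integration mentioned above, the example below requests an OAuth token from Marketing Cloud and upserts a row into a data extension through the REST data events endpoint. The subdomain, client credentials, data extension key, and field names are placeholders; confirm endpoint details against the Marketing Cloud API documentation.

```python
# Hypothetical SFMC REST integration: get a token, upsert a data extension row.
import requests

AUTH_BASE = "https://YOUR_SUBDOMAIN.auth.marketingcloudapis.com"

token_resp = requests.post(
    f"{AUTH_BASE}/v2/token",
    json={
        "grant_type": "client_credentials",
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
    },
    timeout=30,
)
token_resp.raise_for_status()
auth = token_resp.json()

# Upsert one row into a data extension by its external key.
# `rest_instance_url` in the token response includes a trailing slash.
resp = requests.post(
    f"{auth['rest_instance_url']}hub/v1/dataevents/key:Customer_DE/rowset",
    headers={"Authorization": f"Bearer {auth['access_token']}"},
    json=[
        {
            "keys": {"SubscriberKey": "0012345"},
            "values": {"Email": "user@example.com", "Status": "Active"},
        }
    ],
    timeout=30,
)
resp.raise_for_status()
```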
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT (a brief code sketch follows this listing). Design and optimize high-performance data architectures.
Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
Expertise in Snowflake for data warehousing and ELT processes.
Strong proficiency in SQL for relational databases and writing complex queries.
Experience with Informatica PowerCenter for data integration and ETL development.
Experience using Power BI for data visualization and business intelligence reporting.
Experience with Fivetran for automated ELT pipelines.
Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
Strong data analysis, requirement gathering, and mapping skills.
Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
Proficiency in Python for data processing (other languages like Java and Scala are a plus).
Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Informatica, data integration, data engineering, SQL, GCP, DBT, Power BI, Snowflake, Fivetran, business intelligence, Python, ETL, Airflow, data modeling, Azure, Luigi, workflow management tools, data analysis, Azkaban, data warehousing, AWS, Informatica administration, cloud services
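For the SCD Type-2 modeling called out above, dbt snapshots generate warehouse SQL along roughly these lines. The sketch below expresses the two-step pattern (expire changed current rows, then insert fresh versions) as raw Snowflake SQL run from Python; every table, column, and connection value is invented for illustration.

```python
# Hypothetical SCD Type-2 load against Snowflake; dbt snapshots produce
# comparable logic. All names and credentials are placeholders.
import snowflake.connector

EXPIRE_CHANGED = """
UPDATE dim_customer d
SET valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
WHERE d.is_current
  AND EXISTS (
    SELECT 1 FROM staging.customer s
    WHERE s.customer_id = d.customer_id
      AND (s.name <> d.name OR s.segment <> d.segment)
  );
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, name, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM staging.customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;  -- brand-new keys, plus rows expired above
"""

conn = snowflake.connector.connect(account="example-account", user="etl",
                                   password="change-me", database="DW",
                                   schema="MARTS", warehouse="ETL_WH")
try:
    for stmt in (EXPIRE_CHANGED, INSERT_NEW_VERSIONS):
        conn.cursor().execute(stmt)
finally:
    conn.close()
```

The expire step closes out any current row whose tracked attributes changed; the insert step then creates a fresh current version for those keys and for keys never seen before, preserving full history.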
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
Expertise in Snowflake for data warehousing and ELT processes.
Strong proficiency in SQL for relational databases and writing complex queries.
Experience with Informatica PowerCenter for data integration and ETL development.
Experience using Power BI for data visualization and business intelligence reporting.
Experience with Fivetran for automated ELT pipelines.
Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
Strong data analysis, requirement gathering, and mapping skills.
Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
Proficiency in Python for data processing (other languages like Java and Scala are a plus).
Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Informatica, data integration, data engineering, SQL, GCP, DBT, Power BI, Snowflake, Fivetran, business intelligence, Python, ETL, Airflow, data modeling, Azure, Luigi, workflow management tools, data analysis, Azkaban, data warehousing, AWS, Informatica administration, cloud services
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
Expertise in Snowflake for data warehousing and ELT processes.
Strong proficiency in SQL for relational databases and writing complex queries.
Experience with Informatica PowerCenter for data integration and ETL development.
Experience using Power BI for data visualization and business intelligence reporting.
Experience with Fivetran for automated ELT pipelines.
Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
Strong data analysis, requirement gathering, and mapping skills.
Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
Proficiency in Python for data processing (other languages like Java and Scala are a plus).
Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Informatica, data integration, data engineering, SQL, GCP, DBT, Power BI, Snowflake, Fivetran, business intelligence, Python, ETL, Airflow, data modeling, Azure, Luigi, workflow management tools, data analysis, Azkaban, data warehousing, AWS, Informatica administration, cloud services
Posted 1 week ago
The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for Informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect
In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!