10.0 - 15.0 years
14 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
The Data Excellence Data Architect is a demonstrated expert in the technical and functional aspects of customer and partner engagements that lead to the successful delivery of data management projects. The Data Architect plays a critical role in setting customers up for success by prescriptively helping to shape, and then execute, work in the Salesforce data space. This role also provides subject matter expertise on data management solutions and ensures successful project delivery, which includes helping identify and proactively manage risk areas and seeing implementation issues through to complete resolution. The architect must be able to configure and drive solutions that meet the customer's business and technical requirements. Additionally, this role helps align the development of client-specific implementation proposals, SOWs, and staffing plans; engages with SMEs across the organization to gain consensus on an acceptable proposal; develops best practices within the data excellence community; and develops shared assets.

Responsibilities:
- Serve as the Subject Matter Expert for the Salesforce data excellence practice.
- Be recognized as a valuable and trusted advisor by our customers and other members of the Salesforce community, and continue to build a reputation for excellence in professional services.
- Lead development of multi-year data platform capability roadmaps for internal business units such as Marketing, Sales, Services, and Finance.
- Facilitate enterprise information data strategy development, opportunity identification, business cases, technology adoption opportunities, operating model development, and innovation opportunities.
- Maximize the value derived from data analytics by leveraging data assets through data exploitation, envisioning data-enabled strategies, and enabling business outcomes through analytics, data analytics governance, and enterprise information policy.
- Translate business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses.
- Define the data architecture framework, standards, and principles, including modeling, metadata, security, reference data (such as product codes and client categories), and master data (such as clients, vendors, materials, and employees).
- Define data flows, i.e., which parts of the organization generate data, which require data to function, how data flows are managed, and how data changes in transition.
- Design and implement effective data solutions and models to store and retrieve data from different data sources.
- Prepare accurate dataset, architecture, and identity mapping designs for execution and management purposes.
- Examine and identify data structural necessities by evaluating client operations, applications, and programming.
- Research and properly evaluate new sources of information and new technologies to determine possible solutions and limitations in reliability or usability.
- Assess data implementation procedures to ensure they comply with internal and external regulations.
- Lead or participate in architecture governance, compliance, and security activities (architectural reviews, technology sourcing) to ensure technology solutions are consistent with the target-state architecture.
- Partner with stakeholders early in the project lifecycle to identify business, information, technical, and security architecture issues, and act as a strategic consultant throughout the technology lifecycle.
- Oversee the migration of data from legacy systems to new solutions.

Preferred Qualifications and Skills:
- BA/BS degree or foreign equivalent.
- Overall 10+ years of experience in the marketing data / data management space.
- Minimum 1 year of hands-on, full-lifecycle CDP implementation experience on platforms such as Salesforce CDP (formerly 360 Audiences), Tealium AudienceStream, Adobe AEP, Segment, Arm Treasure Data, BlueShift, SessionM, RedPoint, etc.
- 5+ years of experience with data management, data transformation, and ETL, preferably using cloud-based tools/infrastructure.
- Experience with data architecture (ideally with marketing data) using batch and/or real-time ingestion.
- Relevant Salesforce experience in Sales/Service Cloud as well as Marketing Cloud; related certifications are a plus (Marketing Cloud Consultant, Administrator, Advanced Administrator, Service Cloud Consultant, Sales Cloud Consultant, etc.).
- Experience with technologies and processes for marketing, personalization, and data orchestration.
- Experience with master data management (MDM), data governance, data security, data quality, and related tools desired.
- Demonstrated deep data integration and/or migration experience with Salesforce.com and other cloud-enabled tools.
- Demonstrated expertise in complex SQL statements and RDBMS systems such as Oracle, Microsoft SQL Server, and Postgres.
- Demonstrated experience with complex coding through ETL tools such as Informatica, SSIS, Pentaho, and Talend.
- Knowledge of data governance and data privacy concepts and regulations is a plus.

Required Skills:
- Ability to work independently and be a self-starter.
- Comfort with, and the ability to learn, new technologies quickly and thoroughly.
- Specializes in gathering and analyzing information related to data integration, subscriber management, and identity resolution.
- Excellent analytical and problem-solving skills.
- Demonstrated ability to influence a group audience, facilitate solutions, and lead discussions such as implementation methodology, road-mapping, enterprise transformation strategy, and executive-level requirement-gathering sessions.
- Travel to client site (up to 50%).
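As an illustrative aside (not part of the posting), the identity resolution skill called for above amounts to merging customer records that share an identifier into one profile. A minimal union-find sketch, with hypothetical field names and sample data:

```python
# Illustrative sketch only: merge customer records sharing an email or phone
# into a single profile using union-find. Fields and data are hypothetical.

def resolve_identities(records):
    """Return groups of record indices that belong to the same identity."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    seen = {}  # identifier value -> first record index that used it
    for i, rec in enumerate(records):
        for key in ("email", "phone"):
            value = rec.get(key)
            if not value:
                continue
            if value in seen:
                union(i, seen[value])
            else:
                seen[value] = i

    groups = {}
    for i in range(len(records)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

records = [
    {"email": "a@x.com", "phone": "111"},
    {"email": "b@x.com", "phone": "111"},  # same phone as record 0
    {"email": "c@x.com", "phone": "222"},
]
print(resolve_identities(records))  # → [[0, 1], [2]]
```

Real CDPs apply the same idea with fuzzier match rules (normalized names, hashed identifiers), but the transitive-merge structure is the core of it.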
Posted 3 weeks ago
7.0 - 15.0 years
12 - 17 Lacs
Mumbai
Work from Office
Prudential's purpose is to be partners for every life and protectors for every future. Our purpose encourages everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.

Prudential (UK), in partnership with the HCL Group, plans to set up a standalone Indian health insurance company to address the growing healthcare needs of the Indian consumer. This joint venture will combine Prudential's global expertise in insurance and financial services with HCL Group's experience in technology and healthcare solutions. Prudential, with its longstanding presence in India, already operates two leading businesses in life insurance and asset management with the ICICI Group. Prudential was also the proud sponsor of the 1983 Cricket World Cup, India's first World Cup victory!

Prudential Health India (PHI) is a Zero to One team undertaking a no-legacy, greenfield health insurance deployment in India, building journeys that truly empathize with the customer and offer a differentiated experience. To partner with us in this mission, we are looking for a talented Lead / Data Modeler to join our Experience team in Mumbai.

Note: The title will depend on (1) experience, (2) expertise, and (3) performance. So the title could be (Senior Manager) Lead Data Modeler or (Manager) Data Modeler. This is a deep technology role.

Experience: 7-15 years. Location: Mumbai and/or Bangalore. Work from office only.

Job Profile Summary: PHI intends to build a cloud-native, microservices-oriented, loosely coupled open technology platform that is tightly aligned to the health insurance domain and built "expecting to be reused while anticipating change".
The PHI platform will be made up of multiple applications supporting different business functions, which are well integrated and well orchestrated. The applications could be COTS (Commercial Off-The-Shelf) vendor software, Prudential Group software capabilities, or software built by the in-house PHI engineering team. All applications need to adopt common services, platforms, architectural principles, and design patterns. The right candidate will be accountable for delivering technology artefacts to business stakeholders in the fastest time possible, with the fewest gaps and the best quality, and with clarity on how the technology and business requirements can be delivered and tested at pace. Requirement gaps, change requests, non-integrated journeys, bugs in UAT, NFR failures - all of these would signal poor-quality deliverables by this candidate.

Job Description:
- Deeply understand the long-term architectural direction, with emphasis on reusable components and the interactions between the various applications.
- Work closely with data engineers, software engineers, data architects, solution designers, R&D, analysts, product managers, and other teams and stakeholders to achieve desired outcomes for the company, considering functionality, interoperability, performance, scalability, reliability, availability, and other applicable criteria.
- Design, develop, test, and implement data models supporting mobile apps, SDKs, micro-frontends, WhatsApp, and ecosystem partners.
- Establish standards for platform development. Data model and open API capabilities need to be aligned to enable seamless integration across ecosystem partners.
- Use data-driven insights to guide the development of programs and apps that meet user needs.
- Follow, and contribute to defining, architecture and coding standards, and ensure adherence to them.
- Keep up with innovation to build world-class propositions using bleeding-edge technologies, frameworks, and best practices to bring differentiation through technology.
- Establish and enforce data modeling standards and best practices to ensure consistency, quality, and integrity of data models.
- Continuously evolve and refine data models to adapt to new business requirements, technological advancements, and data sources.
- Create and maintain comprehensive documentation for data models, including data dictionaries, entity-relationship diagrams, and metadata.
- Optimize data models for performance, scalability, and reliability to ensure efficient data processing and retrieval.
- Ensure seamless integration of data models with existing data systems, databases, and data warehouses.
- Conduct thorough testing and validation of data models to ensure accuracy and consistency of data.
- Provide training and support to data users and stakeholders on data modeling concepts, tools, and best practices.

Who we are looking for:

Technical skills and work experience:
- MH: Proven hands-on experience in data modeling, with a strong understanding of data modeling principles, methodologies, and tools.
- MH: Demonstrated ability to understand technology and architectural strategy processes and their successful translation into engineering solutions.
- MH: Deep expertise in data modeling tools such as ER/Studio, ERwin, or similar. Strong data and SQL skills and experience with database management systems (e.g., SQL and NoSQL databases).
- MH: Should have worked on large-scale data engineering/transformation with successful implementation of data warehouses, lakes, or lakehouses leveraging cloud technologies like GCP.

Personal traits:
- First and foremost, be an exceptional engineer.
- The highest standards of collaboration and teamwork are critical to this role.
- Strong communication skills and the ability to engage senior management on strategic plans, lead project steering committees, provide status updates, etc.
- Excellent problem-analysis skills; innovative and creative in developing solutions.
- Ability and willingness to be hands-on; strong attention to detail.
- Ability to work independently and handle multiple concurrent initiatives.
- Excellent organizational, vendor management, negotiation, and prioritization skills.

Education: Bachelor's in Computer Science, Computer Engineering, or equivalent; suitable certifications for key skills.
Language: Fluent written and spoken English.

Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability or part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Chandigarh, Dadra & Nagar Haveli
Hybrid
We are looking for a detail-oriented and proactive Salesforce Administrator with hands-on DevOps experience and proficiency in AutoRABIT (Autora). This role will focus on administering the Salesforce platform, managing release processes, and optimizing DevOps automation to ensure secure, efficient, and high-quality deployments.

Responsibilities:
- Perform day-to-day Salesforce administration tasks, including user management, security controls, reports, dashboards, and configuration using Flows and Process Builder.
- Manage sandbox environments, data migrations, and metadata deployments using AutoRABIT or similar tools (Gearset, Copado).
- Monitor and maintain DevOps pipelines for Salesforce, ensuring smooth CI/CD processes and version control using Git.
- Work closely with developers, QA, and release managers to coordinate releases and manage deployment schedules.
- Create and maintain documentation for administrative processes, release runbooks, and DevOps workflows.
- Ensure compliance with security policies and governance frameworks across all Salesforce environments.
- Assist in auditing, troubleshooting, and resolving issues with deployments and configuration changes.
- Keep abreast of new Salesforce features and functionality and provide recommendations for process improvements.

Required Qualifications:
- 3-5 years of experience as a Salesforce Administrator with a strong understanding of Salesforce best practices.
- Hands-on experience with DevOps tools for Salesforce, especially AutoRABIT (Autora).
- Proficiency in managing deployment processes, metadata migration, and change tracking.
- Experience working with Git repositories and version control in a Salesforce context.
- Strong knowledge of Salesforce platform capabilities, including Flows, permission sets, roles, profiles, and data models.
- Salesforce Administrator Certification required (Advanced Admin is a plus).
- Familiarity with agile methodologies and tools like Jira, Confluence, and Slack.
Preferred Skills:
- Knowledge of Apex, LWC, or SOQL (basic development understanding is a plus).
- Experience with other CI/CD tools like Copado, Gearset, Jenkins, or Azure DevOps.
- Understanding of Salesforce deployment risk mitigation strategies (backups, static code analysis, impact analysis).
- Strong communication and documentation skills.

Location: Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim
Posted 3 weeks ago
0.0 - 4.0 years
2 - 6 Lacs
Chennai
Work from Office
The primary responsibility of this role is to perform various tasks related to content for video catalog quality, under general supervision. This could involve checking and/or fixing metadata, image, subtitle, audio, and video assets to provide a seamless viewing experience to PV customers. The day-to-day job requires the individual to make judgment-based decisions by following a standard operating procedure and to perform quality checks on various devices. The associate should have a working knowledge of MS Office to capture data on a daily basis. This job requires you to be in the office 5 days per week for in-person work with your teammates. This will involve tasks such as:
- Understand and adhere to the standard operating procedure.
- Analyze and identify issues in the video content.
- Understand the issue and make best use of the available resources/tools to resolve/fix it.
- Proactively raise issues/alarms to the manager or stakeholders that may have an impact on core deliverables or operations.
- Communicate with internal and external stakeholders.
- Adhere to the service level agreement and average handle time set for the processes.
- Meet predetermined and assigned productivity targets and quality standards.

About the team: Prime Video Digi-Flex's (DF) vision is to be the most customer-centric, agile, and efficient operations team powering Prime Video (PV) growth worldwide. Our mission is to be the center of operational excellence for PV through agile and efficient operations at scale. We influence technology-based scaling through tooling and automation. DF is a variable operations workforce that offers quick-to-market, scalable solutions through manual execution for customer-facing and business-critical strategic initiatives.
DF creates repeatable and standardized processes to ingest, process, cleanse, enrich, classify, and match & merge partner assets, resolve customer-facing issues, and enhance the customer experience.
- Bachelor's degree
- Speak, write, and read fluently in English
- Experience with Microsoft Office products and applications
- Knowledge of Excel at an advanced level
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
We are looking for a Sr. Business Analyst to join our Go-To-Market Reporting and Analytics team. In this role, you will help build data models and reports to support the broader sales organization. The ideal candidate will have experience working with sales data, SQL scripting, developing production-quality dashboards, and managing large data sets. They should possess keen attention to detail, a creative problem-solving mindset, and strong communication skills. This individual will collaborate across functions to empower the sales operations community and leadership in making data-driven, strategic decisions.

What you get to do in this role:
- Build complex data models to perform analysis of sales data in support of various reporting initiatives.
- Research required data and provide insights for ad-hoc questions from leadership.
- Use BI tools to design and implement industry-standard best practices for scalable data management and processing architecture.
- Work with local and remote team members to design and build data models and perform data validation, integration testing, and model support.
- Develop functional subject matter expertise within various areas of the enterprise and maintain documentation for all areas of involvement, including metadata objects for end users.
- Manage and nurture relationships with key stakeholders, ensuring clear communication and alignment across all levels of the organization.

To be successful in this role you have:
- Experience in leveraging AI, or critically thinking about how to integrate it, in work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
- A minimum of 3-5 years of experience in analytics or a related field.
- Excellent knowledge of SQL scripting and writing stored procedures (a must).
- A good understanding of dimensional data modeling concepts.
- Hands-on experience with visualization tools, particularly Power BI.
- Proficiency in utilizing Power Query and DAX.
- Excellent communication skills and the ability to work individually and in a broader, geographically dispersed team.
- A positive, can-do attitude; highly analytical and enjoys challenges.
Posted 3 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job Title: Group Lead - Content Operations Hub
Location: Hyderabad

About the job

Strategic context: Sanofi currently has the best and most robust pipeline of R&D and consequent new launches in our history. As a new phase of the Play-To-Win strategy, funding this pipeline and these new launches is key to materializing the miracles of science that improve people's lives. Thus, as we enter the next phase, modernization of Sanofi is required, as per the recent announcements on DRIVE, and in this respect we are in the beginning stages of organizing the Go-to-Market Capabilities (GTMC) team at the global level. The GTMC organization will help us drive best-in-class capabilities across the board and bring value and excellence to our commercial operations. This move is a key part of the aimed modernization of Sanofi and will allow us to focus on our priorities across our products, markets, and pipeline through the reallocation of resources and by realizing the efficiencies of removing the silos that exist between our business units - avoiding the duplication and overlapping of resources, standardizing our processes and tools, operating with a One Sanofi approach to accelerate our key capabilities development, and fostering the entrepreneurial spirit by speeding up decision making.

As part of GTMC, the vision of the Omnichannel pillar is the definition of a Sanofi-wide, best-in-class Omnichannel engagement strategy, including development of standards & best practices across markets and brand teams, as well as executional planning and support of local Omnichannel approaches (including change management). GTMC will also collaborate closely with Digital to provide consistent tools.

Our Hubs are a crucial part of how we innovate, improving performance across every Sanofi department and providing a springboard for the amazing work we do. Build a career and you can be part of transforming our business while helping to change millions of lives. Ready?
As Content Operations Hub Lead, within our Hyderabad Hub, you'll be responsible for leading the Content Operations team, ensuring seamless business continuity and driving strategies aligned with global priorities in content operations, GenAI, and digital presence optimization. You will manage resources, budget allocation, and vendor relationships, while overseeing content tagging and metadata management and utilizing data-driven insights to optimize performance. You will also be responsible for driving synergies with other teams within Omnichannel/GTMC. You will lead the Content Operations Hub in planning and executing market-driven campaigns, making data-driven business recommendations, and creating insightful presentations.

Main responsibilities: The overall purpose and main responsibilities are listed below:
- Create synergies and provide functional and operational direction to the Content Operations Hub of the Omnichannel pillar.
- Ensure seamless business continuity amidst capability and resource changes within content operations.
- Drive Hub strategy aligned with global business priorities, focusing on content operations, GenAI, and content optimization.
- Lead Hub resources to improve individuals' skills and enhance Hub services such as content creation, modular content, and technical production.
- Manage budget allocation and vendor relationships crucial for content production and digital marketing tools.
- Report on content performance metrics and derive actionable insights for senior leadership, ensuring strategic alignment and performance optimization.
- Stay up to date with industry trends and best practices in commercial operations, standardize all tools/processes used in Omnichannel Content Operations activities deployed in the Hub, and ensure their continuous improvement through continuous iteration and an external benchmarking approach.
- Support the content transformation program supporting the Glocal co-creation teams.
- Be a strategic advisor for Omnichannel Content Operations capabilities execution; have a robust plan and implement concrete moves towards best-in-class capabilities.
- Mentor the team, ensure knowledge sharing across the team and company, and provide global and local Content Operations teams with best practices and a feedback loop on processes.

People: (1) Lead a team of writers in content creation and the corresponding content enhancement / graphic design / operations support teams; (2) Coach and develop the team on content, process, agile methodologies, thoughtful risk taking, automation & innovation (including GenAI); (3) Maintain effective relationships with stakeholders within the allocated GTMC pillar and cross-pillars, with the end objective of developing content as per requirements; (4) Interact effectively with healthcare professionals as relevant; (5) Partner with the team to strengthen capabilities and support individual development plans; (6) Collaborate with cross-functional teams in GTMC to build digital transformation and bring innovative digital solutions; (7) Provide proactive recommendations on improving the scientific content of the deliverables and play an active role in following best practices in relation to processes, communications, project management, documentation, and technical requirements.

Performance: (1) Provide strategic support across GTMC pillars; (2) Lead and support development of tools, technology, and processes to constantly improve quality and productivity; (3) Ensure the Content Operations team provides content as per agreed timelines and quality; (4) Coach the team to become subject matter, process, and technology experts; and (5) Recommend, lead, and implement tactical process improvements within the department and division-wide.

Process: (1) Support delivery of projects in terms of resourcing, tools, technology, quality, timeliness, efficiency, and high technical standards for deliveries made by the Content Operations Hub; (2) Contribute to overall quality enhancement by ensuring high scientific standards for the output produced by the Hub; (3) Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable regulatory standards; (4) Facilitate development of complex scientific content (branded/unbranded); (5) Help build the talent pool, capabilities, and Omnichannel content experts across GBUs/therapeutic area(s); (6) Conduct comprehensive content-need analysis; (7) Implement the content plan and associated activities for the year identified for the pillar; (8) Work with selected vendors within the region to deliver the required deliverables as per the defined process; (9) Leverage advanced training delivery tools and techniques, thereby enhancing the effectiveness of training delivery; and (10) Design an overall plan of action based on end-user feedback and improve course content and delivery.

Stakeholder: (1) Work closely with GTMC/Omnichannel pillars (Global, Local, and Hub) to identify content needs and assist in developing assigned deliverables; and (2) Liaise with Omnichannel/GBTs/AoR/LexMex to provide relevant and customized deliverables.

About you

Experience: 8-10 years of experience in content creation/optimization and leadership (experience building up teams is preferred, as is GCC experience), including up to 2 years of experience leading a multi-layered, diverse team of 10 members, in the medico-marketing / medical / commercial / Omnichannel domain for the pharmaceutical/healthcare industry or digital platforms; ability to influence and negotiate; familiarity with Veeva CRM tools and Veeva PromoMats (Veeva DAM, knowledge of the content approval process, promotional & non-promotional materials); GenAI experience or interest (desirable, not mandatory).

Soft & technical skills: Stakeholder management; proficiency in written & oral communication; people management and the ability to lead diverse teams; strong organizational and time management skills; ability to work independently and within a team environment; as applicable (including but not limited to therapeutic area/domain knowledge exposure - proficient in multiple TAs/domains/GBUs; scientific communications/writing; and/or project management).

Education: Advanced degree in life sciences, pharmacy, or a similar discipline, or a medical degree.

Languages: Excellent knowledge of the English language (spoken and written).

Why choose us?
- Bring the miracles of science to life alongside a supportive, future-focused team.
- Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally.
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks of gender-neutral parental leave.
- Play a key role in shaping and optimizing our content strategy, driving business growth and achieving impactful results.

At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
Posted 3 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Data Quality:
- Define and measure data quality metrics: Establish metrics for accuracy, completeness, validity, consistency, timeliness, and reliability.
- Continuous monitoring and remediation: Regularly monitor data quality, conduct audits, perform root cause analysis for recurring data issues, and implement preventive measures and remediation plans.
- Data profiling: Develop and maintain comprehensive data profiles to understand data characteristics.
- Data validation: Create and implement validation rules to ensure that incoming data conforms to expected formats and values.
- Data cleansing: Design and execute data cleansing processes to correct errors and inconsistencies, enhancing overall data quality and reliability.

Data Governance:
- Establish governance framework: Implement and enforce data governance practices to ensure compliance with regulatory requirements and corporate policies, ensuring data is managed according to best practices.
- Metadata management: Develop and maintain a comprehensive metadata repository to document data definitions, lineage, and usage, ensuring it is kept up to date and accessible to end users.
- Understand user needs: Collaborate with business users to identify data needs, pain points, and requirements, ensuring the data is fit for its intended use.
- Identify improvement areas: Continuously seek opportunities for process improvement in data governance and quality management.
- User roles and access requirements: Understand user roles and access requirements for source systems, so that similar protection can be implemented in the analytical solutions.
- Row-level security: Work with the data & analytics team to establish row-level security for analytical solutions, ensuring data is accessible only to authorised users.

Continuous Improvement:
- Establish naming conventions: Define business-friendly table and column names, along with synonyms, to ensure data is easily accessible using AI.
- Create synonyms: Implement synonyms to simplify data access and enhance data readability.
- Establish KPIs for data governance and quality efforts and create regular reports for stakeholders to track progress and demonstrate the value of data governance initiatives.
- Establish a feedback loop where users can report data quality issues and suggest improvements.
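As an illustrative aside (not part of the posting), the validation rules and quality metrics described above can be sketched in a few lines; all field names, rules, and sample records here are hypothetical:

```python
# Illustrative sketch only: simple validation rules and per-field pass rates
# (a basic data quality metric). Fields, rules, and records are hypothetical.
import re

RULES = {
    "customer_id": lambda v: bool(v),                 # completeness: non-empty
    "email": lambda v: bool(v) and re.fullmatch(      # validity: email format
        r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"IN", "UK", "US"},     # consistency: reference data
}

def validate(records):
    """Return per-field pass rates and indices of records failing any rule."""
    failures = []
    passes = {field: 0 for field in RULES}
    for i, rec in enumerate(records):
        ok = True
        for field, rule in RULES.items():
            if rule(rec.get(field)):
                passes[field] += 1
            else:
                ok = False
        if not ok:
            failures.append(i)
    rates = {field: passes[field] / len(records) for field in RULES}
    return rates, failures

records = [
    {"customer_id": "C1", "email": "a@x.com", "country": "IN"},
    {"customer_id": "", "email": "not-an-email", "country": "IN"},
]
rates, failures = validate(records)
print(rates)     # pass rate per field
print(failures)  # → [1]: the second record fails completeness and validity
```

In practice these checks would run inside a profiling or ETL tool, with the pass rates feeding the KPI reports mentioned above.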
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Coimbatore
Work from Office
Position Summary The BI Developer reports to the Business Intelligence Manager and is responsible for combining raw information from disparate IT source systems into data models, reports and dashboards to deliver business insights. The successful candidate will be able to complete the full lifecycle of development of data from the ETL process through to final deliverable of dashboards into the organization. To succeed in this BI Developer position, you should have strong analytical skills and the ability to develop and maintain data and security models using modern techniques. If you are detail-oriented, with excellent organizational skills and experience in this field, we d like to hear from you. Responsibilities for a BI Developer position include: Essential Duties and Responsibilities Participate in business requirement gathering and solution documentation Build required infrastructure for optimal pipeline management, including extraction, transformation and loading of data from various data sources using Azure Data Factory, Databricks and SQL technologies Assemble and analyze large, complex sets of data that meet non-functional and functional business requirements Implement internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes Work with users to assist them with data-related technical issues Implement and test data and security models Prepare data for prescriptive and predictive modeling Develop interactive visual reports, dashboards, charts, and measures with KPI scorecards using Microsoft Power BI desktop Analyze, design, deploy, troubleshoot, and support Power BI solutions Participate in user acceptance testing Explore and implement ways to enhance data quality and reliability Collaborate with data scientists and architects as needed Education Bachelor s degree (B.S./B.A.) 
in computer science, information systems, informatics, statistics or another quantitative field or equivalent from a college or university with IT-focused specialization. A Master's degree or data engineering certification (e.g., Azure Certified Data Engineer) is a plus Skills/Experience 5+ years' experience as a BI Developer or related experience in a global company with significant experience in hands-on technology delivery roles. Strong data analytics background with experience in developing use cases, deep understanding of managing data and generating insights through visualization Background in custom build experience using Power BI Report Builder, Power BI Desktop, Power BI Service, Tabular Editor, ALM Toolkit and DAX Studio designing Power BI data models, including writing complex DAX, SQL queries and implementing row-level security Ability to understand data modeling, data schemas (normalized, flat, star, snowflake, etc.), query optimization, query profiling and query performance monitoring tools and techniques Knowledge of programming languages (e.g. Java, AngularJS, Python) Implement data storage solutions using Azure SQL Database, Azure Data Lake Storage, and Azure Blob Storage. Monitor and optimize data workflows for performance and reliability. Experience with workflow management and pipeline tools - Azure Data Factory and DevOps; storage technologies - Azure Data Warehouse and Data Lake; stream-processing systems - Event Hub and Stream Analytics; transformation tools - Databricks; visualization tools - Power BI; and metadata management systems. Experience with big data tools like Spark is a plus as well as knowledge of PySpark.
Familiarity with Machine Learning and Deep Learning concepts are a plus Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata Experience working with unstructured datasets Hands-on experience with optimizing performance of SQL queries and applications Great numerical and analytical skills Ability to collaborate with technical resources to influence algorithms and other technology for improved customer experience
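The star-schema modeling mentioned above amounts to joining a fact table to its dimensions to roll up a KPI. A minimal, runnable sketch using SQLite (the posting's actual stack is Azure SQL/Databricks; table and column names here are invented):

```python
import sqlite3

# One dimension (dates) plus one fact table (orders) - the smallest
# possible star schema, populated with illustrative rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_orders (date_key INTEGER, revenue REAL);
INSERT INTO dim_date VALUES (1, '2024-01'), (2, '2024-02');
INSERT INTO fact_orders VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Typical dashboard query: join fact to dimension, group by a
# dimension attribute, aggregate the measure.
rows = conn.execute("""
    SELECT d.month, SUM(f.revenue)
    FROM fact_orders f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)  # [('2024-01', 150.0), ('2024-02', 75.0)]
```

The same shape carries over to DAX measures in Power BI, where the join is implied by the model relationships rather than written in the query.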
Posted 3 weeks ago
8.0 - 10.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Job Title: Analyze L7 Security Gateway ITG Environment & Setup Experience: 8-10 Years Location: Bangalore Analyze L7 Security Gateway ITG Environment & Setup Understand existing UID setup Onboard to CP OKTA and modify SAML metadata and config to integrate with CP OKTA Testing CP OKTA for external users Testing WF OKTA for internal users UAT MTP Hypercare Certifications Needed: NA Skills PRIMARY COMPETENCY: Information Security (Security Design & Integration), PRIMARY PERCENTAGE: 60; SECONDARY COMPETENCY: Information Security (Identity & Access Management), SECONDARY PERCENTAGE: 40
Posted 3 weeks ago
3.0 - 4.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Quality Engineer Experience: 3-4 Years Location: Bangalore We are seeking a detail-oriented and highly motivated Data Quality Engineer to join our growing data team. In this role, you will be responsible for designing, implementing, and maintaining data quality frameworks to ensure the accuracy, completeness, consistency, and reliability of enterprise data. You will work closely with business stakeholders, data stewards, and data engineers to enforce data governance policies and utilize tools like Ataccama to support enterprise data quality initiatives. We only need immediate joiners. Key Responsibilities: Design and implement robust data quality frameworks and rules using Ataccama ONE or similar data quality tools. Develop automated data quality checks and validation routines to proactively detect and remediate data issues. Collaborate with business and technical teams to define data quality metrics, thresholds, and standards. Support the data governance strategy by identifying critical data elements and ensuring alignment with organizational policies. Monitor, analyze, and report on data quality trends, providing insights and recommendations for continuous improvement. Work with data stewards to resolve data issues and ensure adherence to data quality best practices. Support metadata management, data lineage, and data profiling activities. Document processes, data flows, and data quality rules to facilitate transparency and reproducibility. Conduct root cause analysis on data issues and implement corrective actions to prevent recurrence. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 3+ years of experience in a Data Quality, Data Governance, or Data Engineering role. Hands-on experience with Ataccama ONE or similar data quality tools, including rule creation, data profiling, and issue management.
Strong knowledge of data governance frameworks, principles, and best practices. Proficient in SQL and data analysis with the ability to query complex datasets. Experience with data management platforms and enterprise data ecosystems. Excellent problem-solving skills and attention to detail. Strong communication and stakeholder engagement skills. Preferred Qualifications: Experience with cloud data platforms (e.g., Snowflake, AWS, Azure). Familiarity with data catalog tools (e.g., Collibra, Alation). Knowledge of industry data standards and regulatory requirements (e.g., GDPR, HIPAA).
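The "automated data quality checks" described above reduce to simple rule functions applied per field. A tool-agnostic sketch (not Ataccama-specific; the record shape and thresholds are invented for illustration):

```python
# Rule-based data quality checks: completeness (is the value present?)
# and validity (does a present value satisfy a business rule?).
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None, "age": 29},
    {"id": 3, "email": "c@example.com", "age": -5},  # invalid age
]

def completeness(rows, field):
    """Fraction of rows where the field is present and non-null."""
    return sum(r.get(field) is not None for r in rows) / len(rows)

def validity(rows, field, predicate):
    """Fraction of non-null values passing a business rule."""
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(predicate(v) for v in values) / len(values)

report = {
    "email_completeness": completeness(records, "email"),
    "age_validity": validity(records, "age", lambda a: 0 <= a <= 120),
}
print(report)  # both metrics come out to 2/3 on this toy data
```

In a real deployment the report would be compared against agreed thresholds, with breaches routed to data stewards as issues.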
Posted 3 weeks ago
3.0 - 6.0 years
9 - 14 Lacs
Mumbai
Work from Office
Role Overview: We are looking for a Talend Data Catalog Specialist to drive enterprise data governance initiatives by implementing Talend Data Catalog and integrating it with Apache Atlas for unified metadata management within a Cloudera-based data lakehouse. The role involves establishing metadata lineage, glossary harmonization, and governance policies to enhance trust, discovery, and compliance across the data ecosystem. Key Responsibilities: o Set up and configure Talend Data Catalog to ingest and manage metadata from source systems, data lake (HDFS), Iceberg tables, Hive metastore, and external data sources. o Develop and maintain business glossaries, data classifications, and metadata models. o Design and implement bi-directional integration between Talend Data Catalog and Apache Atlas to enable metadata synchronization, lineage capture, and policy alignment across the Cloudera stack. o Map technical metadata from Hive/Impala to business metadata defined in Talend. o Capture end-to-end lineage of data pipelines (e.g., from ingestion in PySpark to consumption in BI tools) using Talend and Atlas. o Provide impact analysis for schema changes, data transformations, and governance rule enforcement. o Support definition and rollout of enterprise data governance policies (e.g., ownership, stewardship, access control). o Enable role-based metadata access, tagging, and data sensitivity classification. o Work with data owners, stewards, and architects to ensure data assets are well-documented, governed, and discoverable. o Provide training to users on leveraging the catalog for search, understanding, and reuse. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 6-12 years in data governance or metadata management, with at least 2-3 years in Talend Data Catalog. Talend Data Catalog, Apache Atlas, Cloudera CDP, Hive/Impala, Spark, HDFS, SQL.
Business glossary, metadata enrichment, lineage tracking, stewardship workflows. Hands-on experience in Talend-Atlas integration, either through REST APIs, Kafka hooks, or metadata bridges. Preferred technical and professional experience.
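For the REST-API integration path mentioned above, Apache Atlas accepts entity JSON over its v2 REST API. Below is only a sketch of building a lineage "Process" entity payload; the GUIDs, qualified name, and exact attribute set are assumptions to verify against your Atlas version's documentation before actually posting it (typically to /api/atlas/v2/entity).

```python
import json

def lineage_process_entity(name, input_guids, output_guids):
    """Build an Atlas-style Process entity linking input and output
    data assets by GUID. Attribute names follow the Atlas Process
    supertype (qualifiedName, name, inputs, outputs) - treat as an
    assumption and confirm against your cluster's type definitions."""
    return {
        "entity": {
            "typeName": "Process",
            "attributes": {
                "qualifiedName": f"{name}@cluster",  # placeholder suffix
                "name": name,
                "inputs": [{"guid": g} for g in input_guids],
                "outputs": [{"guid": g} for g in output_guids],
            },
        }
    }

payload = lineage_process_entity("pyspark_ingest_job", ["guid-src"], ["guid-tgt"])
body = json.dumps(payload)  # would be POSTed with Content-Type: application/json
```

Capturing lineage this way is what enables the impact analysis listed in the responsibilities: a schema change on the input asset can be traced forward through every Process entity that consumes it.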
Posted 3 weeks ago
4.0 - 7.0 years
4 - 7 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Inviting applications for the role of EPM PBCS / EPBCS Consultant, BIDS In this role, you will be responsible for PBCS/EPBCS implementation for one of Genpact's customers. You should have 4 years of experience in PBCS/EPBCS on Planning / Essbase, data management and EPM Automate. You should have 5-6 years' experience in Hyperion Planning 11.1.2.3 / 11.1.2.4 and a total of 4-6 years of experience in the Hyperion suite. You should have a technical background with hands-on experience in Hyperion Planning, Essbase cube builds using PBCS, and data management to build metadata and data loads. Experience in handling LCM and working with HFR reports, web forms, business rules and calc scripts is a must. You should have experience in data management using PBCS. You should have experience in EPM Automate and automating day-to-day tasks. You should have experience in Unix- and Windows-based Planning and Essbase applications; handling cron jobs, performing daily maintenance activities, scheduling, and making changes to cron jobs is a must. Integration of DRM hierarchies flowing into Hyperion applications. Architectural experience of BSO / ASO cubes. Outline Load Utility experience. MDX and MaxL script writing experience. PL/SQL and SQL working experience. You should have experience in tuning Hyperion Planning and Essbase applications and have supported Hyperion environments in previous jobs. You can leverage best practices from previous architecture and implementation experience and work on constant support and continuous improvement in the Hyperion application environment. Responsibilities Be the primary point of contact for assigned pursuits to work with the GE team for developing PBCS / EPBCS applications. Qualifications Minimum qualifications BE Oracle EPM / Hyperion experience Preferred qualifications On-premise Hyperion experience
Posted 3 weeks ago
11.0 - 16.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: SRE Platform Engineer Location: Bengaluru Reporting to: Senior Engineering Manager Purpose of the role We are looking for a Site Reliability Engineer (SRE) Platform Engineer to design, build, and maintain scalable, resilient, and efficient infrastructure for our Data Platform. This role focuses on developing platform solutions, improving system reliability, automating infrastructure, and enhancing developer productivity. You will work closely with software engineers, architects, data engineers, DevOps, and security teams to create a highly available and performant platform. Key tasks & accountabilities Platform Engineering & Automation: Design and implement scalable, automated, and self-healing infrastructure solutions Infrastructure as Code (IaC): Develop and maintain infrastructure using Terraform Observability & Monitoring: Implement and maintain monitoring, logging, and alerting systems using Datadog, Grafana Unity Catalog: Implement and maintain Unity Catalog, Metadata, Delta Sharing and Identity access management Databases: Implement and maintain relational
databases, data warehouses and NoSQL databases Power BI: Manage the entire Power BI tenant within ABI CI/CD & DevOps Practices: Optimize CI/CD pipelines using ADO, GitHub Actions, or ArgoCD to enable seamless deployments Cloud: Architect and manage cloud-native platforms using Azure, AWS, or Google Cloud Platform (GCP) Networking: Manage and secure the data platform network by enforcing network security policies, integrating on-premises networks with cloud environments, configuring VNETs, subnets, and routing policies Disaster Recovery: Develop and maintain the Disaster Recovery environment and conduct periodic Disaster Recovery drills Resilience & Incident Management: Improve system reliability by implementing fault-tolerant designs and participating in L4-level resolution Security & Compliance: Ensure platform security by implementing best practices for cloud-based data platforms, access controls, and zone-specific compliance requirements Developer Enablement: Build internal tools and frameworks to enhance the developer experience and enable self-service capabilities Qualifications, Experience, Skills Level of educational attainment required: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field with a minimum of 3 years of experience Certifications (any one of them): Azure Developer Associate, Azure DevOps Engineer Expert, Azure Solutions Architect Expert, Azure Data Engineer Associate, Google Professional SRE Certification, AWS Certified DevOps Engineer Professional Technical Expertise: Programming Languages: Proficient in languages and tooling such as Bash, PowerShell, Terraform, Python, Java, etc. Cloud Platforms: Expertise in Azure, AWS or GCP cloud services Infrastructure as Code (IaC): Experience with Terraform Unity Catalog: Deep understanding of Databricks architecture, schema & table structure, Metadata, Delta Sharing and Identity access management Databases: Deep understanding of database concepts and experience with relational databases, data warehouses and NoSQL databases Kubernetes & Containers: Hands-on experience with Kubernetes, Helm, and Docker in production environments Power BI: Deep understanding of Power BI administration, workspace management, dashboard development, performance optimization and integration Monitoring & Logging: Experience with observability tools like Datadog, Grafana CI/CD & DevOps: Experience with GitHub, Azure DevOps, GitHub Actions, or ArgoCD Networking & Security: Experience with cloud networks, firewalls, VPNs, DNS, policy deployment and vulnerability remediation Disaster Recovery: Deep understanding of cloud DR concepts and high availability requirements And above all of this, an undying love for beer! We dream big to create a future with more cheers.
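A core SRE concept behind the reliability duties above is the error budget: an availability SLO implies a tolerable number of failures per window. A minimal sketch with illustrative numbers (the 99.9% target is an example, not a figure from the posting):

```python
def error_budget_remaining(slo, total_requests, failed_requests):
    """Fraction of the error budget still unspent for the window.
    Can go negative once the budget is blown, which is typically
    the trigger to freeze risky releases."""
    allowed_failures = (1.0 - slo) * total_requests
    if allowed_failures == 0:
        return 0.0  # a 100% SLO leaves no budget at all
    return 1.0 - failed_requests / allowed_failures

# A 99.9% SLO over 1,000,000 requests allows ~1,000 failures;
# 250 observed failures leaves roughly three quarters of the budget.
remaining = error_budget_remaining(0.999, 1_000_000, 250)
print(round(remaining, 2))  # 0.75
```

In practice the failure counts would come from the observability stack (e.g. Datadog metrics) rather than hard-coded values.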
Posted 3 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Job Description Job Title: SRE Platform Engineer Location: Bengaluru Reporting to: Senior Engineering Manager Purpose of the role We are looking for a Site Reliability Engineer (SRE) Platform Engineer to design, build, and maintain scalable, resilient, and efficient infrastructure for our Data Platform This role focuses on developing platform solutions, improving system reliability, automating infrastructure, and enhancing developer productivity You will work closely with Software engineers, Architects, Data Engineers, DevOps, and security teams to create a highly available and performant platform, Key tasks & accountabilities Platform Engineering & Automation: Design and implement scalable, automated, and self-healing infrastructure solutions Infrastructure as Code (IaC): Develop and maintain infrastructure using Terraform Observability & Monitoring: Implement and maintain monitoring, logging, and alerting systems using Datadog, Grafana Unity Catalog: Implement and maintain Unity Catalog, Metadata, Delta Sharing and Identity access management Databases: Implement and maintain relational databases, data warehouses and NoSQL databases Power BI: Manage the entire Power BI tenant within ABI CI/CD & DevOps Practices: Optimize CI/CD pipelines using ADO, GitHub Actions, or ArgoCD to enable seamless deployments, Cloud: Architect and manage cloud-native platforms using Azure, AWS, or Google Cloud Platform (GCP) Networking: Manage and secure the data platform network by enforcing network security policies, integrating on-premises networks with cloud environments, configuring VNETs, subnets, and routing policies Disaster Recovery: Develop and maintain the Disaster Recovery environment and conduct periodic Disaster Recovery drills Resilience & Incident Management: Improve system reliability by implementing fault-tolerant designs and participating in L4 level resolution Security & Compliance: Ensure platform security by implementing best practices for cloud-based data 
platforms, access controls, and zone-specific compliance requirements Developer Enablement: Build internal tools and frameworks to enhance the developer experience and enable self-service capabilities Qualifications, Experience, Skills Level of educational attainment required: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field with a minimum of 3 years of experience Certifications (any one of them): Azure Developer Associate, Azure DevOps Engineer Expert, Azure Solutions Architect Expert, Azure Data Engineer Associate, Google Professional SRE Certification, AWS Certified DevOps Engineer Professional Technical Expertise: Programming Languages: Proficient in languages and tooling such as Bash, PowerShell, Terraform, Python, Java, etc. Cloud Platforms: Expertise in Azure, AWS or GCP cloud services Infrastructure as Code (IaC): Experience with Terraform Unity Catalog: Deep understanding of Databricks architecture, schema & table structure, Metadata, Delta Sharing and Identity access management Databases: Deep understanding of database concepts and experience with relational databases, data warehouses and NoSQL databases Kubernetes & Containers: Hands-on experience with Kubernetes, Helm, and Docker in production environments Power BI: Deep understanding of Power BI administration, workspace management, dashboard development, performance optimization and integration Monitoring & Logging: Experience with observability tools like Datadog, Grafana CI/CD & DevOps: Experience with GitHub, Azure DevOps, GitHub Actions, or ArgoCD Networking & Security: Experience with cloud networks, firewalls, VPNs, DNS, policy deployment and vulnerability remediation Disaster Recovery: Deep understanding of cloud DR concepts and high availability requirements And above all of this, an undying love for beer! We dream big to create a future with more cheers.
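One standard building block behind the "fault-tolerant designs" this role calls for is retry with exponential backoff and jitter when calling flaky downstream services. A sketch with illustrative defaults (the base delay, cap, and attempt count are assumptions, not values from the posting):

```python
import random

def backoff_delays(base=0.5, cap=30.0, attempts=5, rng=random.Random(42)):
    """Yield a sleep duration for each retry attempt: the ceiling
    doubles each attempt up to `cap`, and full jitter picks a random
    point in [0, ceiling] to avoid thundering-herd retries."""
    for attempt in range(attempts):
        ceiling = min(cap, base * (2 ** attempt))
        yield rng.uniform(0, ceiling)

# In real use, each delay would precede the next call attempt
# (time.sleep(d) around the request); here we just inspect them.
delays = list(backoff_delays())
```

The fixed-seed rng makes the sketch deterministic for testing; production code would use the default random source.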
Posted 3 weeks ago
7.0 - 12.0 years
7 - 17 Lacs
Ahmedabad
Work from Office
Job Title: Performance Marketing Manager Location: Ahmedabad Industry: Astrology & Wellness Experience Required: 5-12 years Type: Full-time About Us: We are a fast-growing astrology company committed to empowering people with authentic and personalized astrological guidance. Our platform connects users with expert astrologers through web and mobile applications. As we scale our digital presence, we are looking for a performance marketing expert to drive customer acquisition and retention. Job Summary: As a Performance Marketing Manager, you will be responsible for planning, executing, and optimizing ROI-driven digital marketing campaigns across multiple channels including Google Ads, Meta (Facebook/Instagram), YouTube, and affiliate networks. You'll play a crucial role in increasing app installs, generating leads, and driving revenue from online astrology services. Key Responsibilities: Plan, execute, and manage paid media campaigns (Google, Meta, YouTube, affiliates, etc.) Monitor and optimize performance metrics like CPA, ROAS, CPL, and CTR Collaborate with creative teams to build ad creatives that drive engagement and conversions Conduct A/B testing across ad copies, landing pages, and creatives Analyze data and performance reports to extract actionable insights Allocate and manage budgets effectively to ensure maximum ROI Manage retargeting and remarketing strategies to improve user retention Collaborate with the product and tech teams to track user behavior and improve funnels Stay updated on the latest performance marketing trends and platform updates Optimize App Store (ASO) and Play Store presence to boost organic installs Requirements: Proven experience (5+ years) in performance marketing, preferably in a B2C or app-based environment Strong understanding of Google Ads, Facebook Ads Manager, and analytics tools like GA4, AppsFlyer, or Firebase Experience with marketing automation, retargeting, and funnel optimization Proficiency in data analysis and
reporting using Excel, Google Sheets, or Data Studio Excellent communication and project management skills Passion for astrology/wellness domain is a plus Good to Have: Experience with influencer marketing or affiliate partnerships Knowledge of SEO and organic marketing strategies Experience in managing marketplace listings or e-commerce integrations Worked with astrology or spiritual platforms
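The metrics named in the responsibilities (CPA, ROAS, CTR) are simple ratios. A sketch with invented campaign numbers, purely to pin down the definitions:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks per impression."""
    return clicks / impressions

def cpa(spend, conversions):
    """Cost per acquisition: spend per conversion."""
    return spend / conversions

def roas(revenue, spend):
    """Return on ad spend: revenue generated per unit of spend."""
    return revenue / spend

# Illustrative monthly campaign figures (not real data).
spend, impressions, clicks, conversions, revenue = (
    50_000, 2_000_000, 40_000, 800, 220_000
)
print(ctr(clicks, impressions))  # 0.02 -> a 2% click-through rate
print(cpa(spend, conversions))   # 62.5 spent per conversion
print(roas(revenue, spend))      # 4.4 revenue per unit of spend
```

Optimization then means shifting budget toward the channel mix that lowers CPA while holding ROAS above the target.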
Posted 3 weeks ago
4.0 - 7.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Role & responsibilities SharePoint Online Admin Job Description JD for SharePoint Online: 1. Should have strong knowledge of the SharePoint Online environment. 2. Should know the different types of sites used in SharePoint Online. 3. Must have knowledge of SharePoint Online and OneDrive for Business limitations and known issues. 4. Hands-on experience with metadata, crawled properties, managed properties, content types, workflows, user profiles and SharePoint Online Search. 5. Must be aware of the latest and deprecated features in SharePoint Online. 6. Should have basic knowledge of DirSync, Azure AD Sync, or Azure AD Connect. 7. Must know how point-in-time restore works in SharePoint Online. 8. Good knowledge of retention policies, DLP and eDiscovery holds. 9. Needs knowledge of OneDrive for Business sync issues. 10. Should have an understanding of Office 365 groups and permissions in SharePoint Online. 11. Office 365 licensing. 12. Should have basic knowledge of Fiddler, search query tools and SharePoint Designer.
Posted 3 weeks ago
4.0 - 9.0 years
22 - 25 Lacs
Bengaluru
Work from Office
We look primarily for people who are passionate about solving business problems through innovation and engineering practices. You will be required to apply your depth of knowledge and expertise to all aspects of the software development lifecycle, as well as partner continuously with your many stakeholders daily to stay focused on common goals. We embrace a culture of experimentation and constantly strive for improvement and learning. We welcome diverse perspectives and people who are not afraid to challenge assumptions. Our mission is to accelerate the pace of financial innovation and build new financial products for American Express. Our platform streamlines the process of launching and iterating financial products. Responsibilities: Develops and tests software, including ongoing refactoring of code & drives continuous improvement in code structure & quality. Functions as a core member of an Agile team driving user story analysis & elaboration, design and development of software applications, testing & builds automation tools. Designs, codes, tests, maintains, and documents data applications. Takes part in reviews of own work and reviews of colleagues' work. Defines test conditions based on the requirements and specifications provided.
Partner with the product teams to understand business data requirements, identify data needs and data sources to create data architecture Documents data requirements / data stories and maintains data models to ensure flawless integration into existing data architectures Leads multiple tasks effectively - progresses work in parallel Adapts to change quickly and easily Handles problems and acts on own initiative without being prompted Must have demonstrated proficiency and experience in the following tools and technologies: Python object-oriented programming Python built-in libraries: JSON, Base64, logging, os, etc. Python: Poetry and dependency management Asynchronous reactive microservices utilizing FastAPI Firm foundational understanding of Distributed Storage and Distributed Compute PySpark framework: DataFrames (aggregation, windowing techniques), Spark SQL Cornerstone Data Ingestion Process, Cornerstone Business Metadata management, Interactive Analytics using YellowBrick, Hyperdrive JSON schema development, CStreams real-time event ingestion pipeline using Kafka, Event Engine Management Test Driven Development Must have Banking Domain Knowledge: Money Movement, Zelle, ACH, Intraday Working knowledge of the following tools and technologies: Data Governance Toolset: Collibra, Manta REST API specifications using Swagger Development Tool Central, XLR, Jenkins Docker image creation, Containers, PODs Hydra Cloud deployment and troubleshooting Logging using Amex Enterprise Logging Framework Analytical and problem-solving skills Technical fluency - ability to clearly describe tradeoffs to technical and non-technical audiences alike to help support product decisions. Highly organized with strong prioritization skills and outstanding written and verbal communication - you are great at research and documenting your learnings. A bachelor's degree. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities
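The Python built-in libraries the posting names (JSON, Base64, logging, os) are typically combined exactly as below when preparing event payloads for a pipeline. The event shape here is an invented example, not the Cornerstone/CStreams schema:

```python
import base64
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("events")

# Serialize a (hypothetical) money-movement event to JSON, then
# base64-encode it as many message buses expect for binary-safe transport.
event = {"type": "ach_transfer", "amount_cents": 12500, "currency": "USD"}
encoded = base64.b64encode(json.dumps(event).encode("utf-8")).decode("ascii")
log.info("publishing %d-byte payload", len(encoded))

# The consumer reverses both steps and recovers the original event.
decoded = json.loads(base64.b64decode(encoded))
assert decoded == event
```

The round-trip assertion is the key property: encoding must be lossless so downstream consumers see exactly what was published.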
Posted 3 weeks ago
5.0 - 6.0 years
9 - 13 Lacs
Hyderabad
Work from Office
1. Strong experience in developing AI models using multiple programming languages with a strong foundation in a statistical platform such as Python, R, SAS, or MATLAB. Proven experience with developing and testing various ML models, deep learning, TensorFlow and NLP. 2. Experience in validating existing AI models to check bias & fairness appropriately based on the above-mentioned risk tiering. High-risk models might have high exposure to bias & fairness issues. 3. Analyzing the ML algorithms that could be used to solve a given problem and ranking them by their success probability 4. Exploring and visualizing data to gain an understanding of it, then identifying differences in data distribution that could affect performance when deploying the model in the real world 5. Tune the model by reviewing the parameters for coverage, accuracy, applicability and model bias. 6. Strong expertise in database management software (e.g., SQL Server, Oracle), statistics and machine learning software (e.g., Python, R) and link analysis/data visualization software, big data platforms 7. Work on functional design, process design (including scenario design, flow mapping), prototyping, testing, training, and defining support procedures, working with an advanced engineering team and executive leadership 8. 4+ years of experience in applying AI to practical and comprehensive technology solutions 9. Experience in building dashboards with Tableau, QlikView, Power BI, Spotfire, etc. 10. Strong knowledge of SQL and relational databases like MySQL, Oracle, etc. Additional skills and attributes for success: Experience in performing analytics for a range of forensic investigations to identify red flags and evidence of fraud through data analytics. Note: Passion for data analytics along with a minimum of 5-6 years of experience in programming. 5-6 years' prior experience with data analytics tools and techniques is also beneficial.
Qualifications B.Tech / M.Tech / MCA / BCA Requires a bachelor's degree or equivalent. Will need good communication skills, a calm voice in a crisis, an ability to efficiently solve new technical problems, and a broad knowledge of computer systems and security.
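One concrete bias check behind the fairness validation duties above is the demographic parity difference: the gap in positive-prediction rates between groups. A minimal sketch; the data and the ~0.1 flag threshold mentioned in the comment are illustrative assumptions, not a validation standard from the posting:

```python
def positive_rate(predictions):
    """Share of instances the model labeled positive (1)."""
    return sum(predictions) / len(predictions)

def demographic_parity_diff(preds_by_group):
    """Largest gap in positive-prediction rate across groups;
    0 means perfectly equal rates."""
    rates = [positive_rate(p) for p in preds_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical binary predictions split by a protected attribute.
preds = {"group_a": [1, 0, 1, 1], "group_b": [0, 0, 1, 0]}
gap = demographic_parity_diff(preds)
print(gap)  # 0.5 - a large gap; reviewers often flag gaps above ~0.1
```

Higher-risk models in the tiering described above would warrant several such metrics (equalized odds, predictive parity) rather than this single check.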
Posted 3 weeks ago
10.0 - 14.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals. You will be working closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles. Roles & Responsibilities: Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities Establish and enforce architectural standards, policies, and governance frameworks Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient Maintain comprehensive documentation of the architecture, including principles, standards, and models Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency Work with collaborators to gather and analyze requirements, ensuring that solutions meet both business and technical needs Evaluate and recommend technologies and tools that best fit the solution requirements Ensure seamless integration between systems and platforms, both within the organization and with external partners Design systems that can scale to meet growing business needs and performance demands Develop and maintain logical, physical, and conceptual data models to support business needs Establish and enforce data standards, governance policies, and best practices
Design and manage metadata structures to enhance information retrieval and usability Contribute to a program vision while advising and articulating program/project strategies on enabling technologies Provide guidance on application and integration development best practices, Enterprise Architecture standards, functional and technical solution architecture & design, environment management, testing, and platform education Drive the creation of application and technical design standards which leverage best practices and effectively integrate Salesforce into Amgen's infrastructure Troubleshoot key product team implementation issues and demonstrate the ability to drive to successful resolution. Lead the evaluation of business and technical requirements from a senior level Review releases and roadmaps from Salesforce and evaluate the impacts to current applications, orgs, and solutions. Identification and proactive management of risk areas and commitment to seeing an issue through to complete resolution Negotiate solutions to complex problems with both the product teams and third-party service providers Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of a single or your current project What we expect of you We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree with 8-10 years of experience in Computer Science, IT, or a related field OR Bachelor's degree with 10-14 years of experience in Computer Science, IT, or a related field OR Diploma with 14-18 years of experience in Computer Science, IT, or a related field
- Experience with SFDC Service Cloud/Health Cloud in a call center environment
- Strong architectural design and modeling skills
- Extensive knowledge of enterprise architecture frameworks and methodologies
- Experience with system integration and IT infrastructure
- Experience directing solution design, business process redesign, and aligning business requirements to technical solutions in a regulated environment
- Experience working in agile methodology, including Product Teams and Product Development models
- Extensive hands-on technical and solution implementation experience with the Salesforce Lightning Platform, Sales Cloud, and Service Cloud, demonstrating positions of increasing responsibility and management/mentoring of more junior technical resources
- Demonstrable experience and ability to develop custom configured, Visualforce, and Lightning applications on the platform
- Demonstrable knowledge of the capabilities and features of Service Cloud and Sales Cloud
- Demonstrable ability to analyze, design, and optimize business processes via technology and integration, including leadership in guiding customers and colleagues in rationalizing and deploying emerging technology for business use cases
- A thorough understanding of web services, data modeling, and enterprise application integration concepts, including experience with enterprise integration tools (ESBs and/or ETL tools) and common integration design patterns with enterprise systems (e.g. CMS, ERP, HRIS, DWH/DM)
- Demonstrably excellent, context-specific, and adaptive communication and presentation skills across a variety of audiences and situations; an established habit of proactive thinking and behavior, and the desire and ability to self-start, learn, and apply new technologies
Preferred Qualifications:
- Strong solution design and problem-solving skills
- Solid understanding of technology, function, or platform
- Experience in developing differentiated and deliverable solutions
- Ability to analyze client requirements and translate them into solutions
Professional Certifications:
- Salesforce Admin
- Advanced Admin
- Platform Builder
- Salesforce Application Architect (Mandatory)
Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills
Posted 3 weeks ago
0.0 - 4.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Digital Content Services Technicians are responsible for evaluating all digital deliveries, outputs, and derivatives, as well as preparing files for delivery to linear and non-linear distribution. They are expected to have knowledge of transcode platforms, file formats, domestic and international media standards, and media asset management. Additional responsibilities include, but are not limited to: initiating transcodes, reviewing and acting on Auto QC data, and tagging and updating associated metadata. Additional focus will be on demonstrating efficient communication and computer skills.
Your Role Accountabilities:
- Perform technical review and database entry of digital content
- Utilize the Media Asset Management system to update metadata and provide reporting as needed
- Audio/video fault and issue tracking and follow-up
- Input/validate accurate information into the Scheduling module
- Monitor internal systems for incoming media requests, including but not limited to standards conversion, file creation, content management, and distribution
- Negotiate, prioritize, and manage client expectations for delivery timelines
- Coordinate with internal and external Discovery clients to confirm specific technical requirements
- Knowledge of related standard operating procedures and company policies
- Initiate and troubleshoot media creation workflows
Qualifications & Experiences:
- Must possess excellent organizational skills, good verbal and written communication skills, and proficiency in computer usage.
- Must have a thorough understanding of professional/broadcast HD and SD signal standards
- Must have a thorough understanding of video resolution and audio formatting
- Must have a basic understanding of media files, including file formats, codecs, file size, and storage
- Must have demonstrated experience dealing with immediate deadlines that require problem solving and on-the-fly critical analysis
- Must be self-motivated, highly organized, detail-oriented, and able to handle multiple projects simultaneously under tight deadlines in a team environment
- Able to work independently and within a team
- Must maintain a professional attitude, demeanor, and relationship with DCI management, co-workers, and staff at all times
This position is considered an essential position. This means that during times of inclement weather, emergencies, or when access to the workplace may be impeded, employees in this position are expected to report to work to support business continuance, unless otherwise instructed by their direct manager.
Not required but preferred experience:
- Bachelor's degree in Communication Arts or Radio/TV/Film
- 2 to 3 years of experience with various encoding processes, editing, and signal routing in a broadcast/post-production environment is highly desirable
- Experience with Avid, Final Cut, and Adobe Creative Suite
- Experience with media file transfer and sharing workflows
- Multilingual in any of the following a plus: English, Spanish, Portuguese, Putonghua (Mandarin), Hindi, Tamil, Telugu, Bengali, Malay, Japanese, Vietnamese, Traditional Chinese, Simplified Chinese, Indonesian, Korean, Burmese
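The "reviewing and acting on Auto QC data" duty above typically means comparing a file's technical metadata against a delivery specification. A minimal illustrative sketch in Python; the spec values and field names below are assumptions for illustration, not any broadcaster's real delivery spec:

```python
# Illustrative sketch of an automated delivery-spec check, of the kind an
# Auto QC step might run before a file is accepted for distribution.
# Codec names, resolutions, and channel counts here are assumed examples.

DELIVERY_SPEC = {
    "video_codec": {"prores", "xdcam", "h264"},
    "resolution": {(1920, 1080), (1280, 720)},   # HD deliverables only
    "audio_channels": {2, 8},                     # stereo or 5.1+stereo
}

def qc_check(asset):
    """Return a list of human-readable QC failures (empty list = pass)."""
    failures = []
    if asset["video_codec"] not in DELIVERY_SPEC["video_codec"]:
        failures.append(f"unsupported codec: {asset['video_codec']}")
    if (asset["width"], asset["height"]) not in DELIVERY_SPEC["resolution"]:
        failures.append(f"non-HD resolution: {asset['width']}x{asset['height']}")
    if asset["audio_channels"] not in DELIVERY_SPEC["audio_channels"]:
        failures.append(f"unexpected audio channels: {asset['audio_channels']}")
    return failures

asset = {"video_codec": "h264", "width": 1920, "height": 1080, "audio_channels": 2}
print(qc_check(asset))  # [] → file passes
```

In practice the asset metadata would come from a probe tool and the failures would be written back to the MAM as QC flags for a technician to review.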
Posted 3 weeks ago
3.0 - 8.0 years
11 - 12 Lacs
Hyderabad
Work from Office
As a Specialist within the Ancillary Content team, you'll work across multiple functions (Onboarding, Management, and Servicing) to support the efficient end-to-end flow of promotional and ancillary assets for streaming platforms, linear channels, social, marketing, and affiliates. This is a dynamic and flexible role requiring strong communication, excellent tracking and coordination, and a collaborative mindset. You'll manage asset ingest, content readiness, delivery coordination, issue resolution, and workflow improvements with a focus on scalability, quality, and creative opportunity. Must work in alignment with EMEA time zones.
The Daily - Major Activities
Promotional Asset Coordination & Tracking
- Ensure on-time and to-brief delivery of promotional content across all platforms.
- Track asset requests, updates, progress, changes, and approvals using internal tools.
- Stay aligned on priorities and timelines across content types and functions.
- Communicate daily with stakeholders to provide clear delivery updates.
- Attend regular meetings, provide feedback, capture key updates, and help align deliverables across the team.
Main POC for Promotional Content
- Act as the key contact for sourcing, progress tracking, and updates around promotional asset delivery.
- Coordinate requests between functions (Servicing, Management, Onboarding) to ensure fast-turnaround content is delivered (e.g., for Social Media).
- Monitor operational capacity and flag when projected demand exceeds current bandwidth.
- Collate and action updates from multiple communication channels.
Ingest & Asset Onboarding
- Track content received for ingestion into MAM systems.
- Assign content to the correct placeholders and metadata fields; follow up on missing or incomplete materials.
- Troubleshoot ingest issues and ensure smooth handoff to downstream functions.
Ancillary Content Management
- Maintain visibility and access to assets such as scripts, title treatments, and toolkits from programme distributors and internal teams.
- Ensure content is sourced, stored, and made available in accordance with contractual terms and technical requirements.
- Support content readiness for creative campaign use.
Content Quality Control
- Check final assets for correct subtitle placement, dub sync, branding, and overall presentation before delivery.
- Raise and resolve any issues found during checks or flagged by stakeholders.
Troubleshooting & Support
- Manage incoming tickets and issues related to content readiness or delivery; assess, triage, and resolve or escalate as required.
- Conduct root cause analysis for recurring issues and collaborate on long-term solutions.
- Support spikes in volume and major campaign launches with focused issue management.
Workflow & Tooling Improvements
- Analyse existing content workflows to identify inefficiencies or bottlenecks.
- Recommend and support improvements to tools, processes, or documentation.
- Partner with development teams to enhance visibility, automation, and prioritisation of assets.
The Essentials
- Bachelor's degree in Engineering, IT, Communications, Broadcasting, or a related field, or equivalent working experience
- At least 3 years of experience in media companies
- Business acumen in media operations
- Ability to work on cross-functional, multi-cultural teams in a collaborative way
- Technical knowledge of media workflows and media formats
- Knowledge of post-production workflows
- Proactive, solutions-focused, and confident managing multiple priorities in a fast-paced environment
- Excellent organizational skills
- Fluent English
The Nice to Haves
- Knowledge of basic user experience principles
- High analytical skills
- Familiarity with project tracking tools (e.g., Monday.com)
Posted 3 weeks ago
6.0 - 11.0 years
18 - 20 Lacs
Hyderabad
Work from Office
As part of the CE Data Platform team, your role will involve establishing a clear vision for data engineering practices, aligning it harmoniously with the data architecture. Collaboration with product managers is essential to understand business requirements and identify opportunities to leverage data. This position also entails creating, designing, and developing complex data processing pipelines. A cooperative relationship with the data intelligence team is necessary for designing scalable implementations and productionizing data models. The role involves writing clean, iterative code and utilizing various continuous delivery practices to deploy, support, and operate data pipelines. The selection of suitable data modeling techniques, the optimization and design of physical data models, and an understanding of the trade-offs between various data modeling techniques form an integral part of this role.
Job Qualifications:
You are passionate about data, with the ability to build and operate data pipelines and maintain data storage within distributed systems. This role requires a deep understanding of data modeling and experience with modern data engineering tools and platforms, along with cloud warehousing tools. It is perfect for individuals who can go deep into coding and lead junior members to implement a solution. Experience in defining and implementing data governance and security policies is crucial. Knowledge of DevOps and the ability to navigate all phases of the data and release life cycle are also essential.
Professional Skills:
- You are familiar with AWS and Azure Cloud.
- You have extensive knowledge of Snowflake; SnowPro Core certification is a must-have.
- You have used DBT in at least one project to deploy models in production.
- You have configured and deployed Airflow and integrated various operators in Airflow (especially DBT and Snowflake).
- You can design and build release pipelines and understand the Azure DevOps ecosystem.
- You have an excellent understanding of Python (especially PySpark) and are able to write metadata-driven programs.
- You are familiar with Data Vault (Raw, Business) and with concepts like Point In Time and Semantic Layer.
- You are resilient in ambiguous situations and can clearly articulate the problem in a business-friendly way.
- You believe in documenting processes, managing the artifacts, and evolving them over time.
Good to have skills:
- You have experience with data visualization techniques and can communicate the insights as per the audience.
- Experience with Terraform and HashiCorp Vault is highly desirable.
- Knowledge of Docker and Streamlit is a big plus.
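The "metadata-driven programs" requirement above refers to pipelines whose behavior is driven by configuration records rather than hard-coded per-table logic. A minimal sketch of the pattern in plain Python (a real pipeline would use PySpark DataFrames; the table names, columns, and rules below are illustrative assumptions):

```python
# Minimal sketch of a metadata-driven pipeline: each source table is
# described by a config record, and one generic function processes all of
# them. Table names, columns, and rules here are illustrative assumptions.

TABLE_METADATA = [
    {"source": "orders_raw", "target": "orders_clean",
     "columns": ["order_id", "amount"], "drop_nulls": ["order_id"]},
    {"source": "customers_raw", "target": "customers_clean",
     "columns": ["customer_id", "email"], "drop_nulls": ["customer_id"]},
]

def process_table(rows, meta):
    """Apply the column selection and null-filtering rules from metadata."""
    selected = [{c: r.get(c) for c in meta["columns"]} for r in rows]
    return [r for r in selected
            if all(r[c] is not None for c in meta["drop_nulls"])]

if __name__ == "__main__":
    raw = {"orders_raw": [{"order_id": 1, "amount": 10.0, "extra": "x"},
                          {"order_id": None, "amount": 5.0}]}
    for meta in TABLE_METADATA:
        if meta["source"] in raw:
            print(meta["target"], process_table(raw[meta["source"]], meta))
```

Adding a new source table then means adding one metadata record, not new code, which is what makes the approach scale across many tables.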
Posted 3 weeks ago
1.0 - 2.0 years
1 - 2 Lacs
Surat
Work from Office
Min. 1 to 2 years' experience
Job Description
We are looking for a talented and detail-oriented App Store Optimization (ASO) Executive to join our team. As an ASO Executive, you will play a crucial role in enhancing the visibility, downloads, and user engagement of our mobile apps across various app stores such as Google Play and the Apple App Store.
What do we need from you
- Conduct thorough keyword research to identify high-traffic, relevant keywords for app listings and seamlessly integrate them into titles, descriptions, and metadata.
- Optimize app store listings, including titles, descriptions, screenshots, icons, etc., to boost visibility and conversion rates.
- Analyze app performance metrics, monitor rankings, and track key indicators like downloads, reviews, and ratings to pinpoint areas for enhancement.
Technical Skills You Should Have
- Perform competitor analysis to stay ahead in the market.
- Run A/B tests on app visuals (icons, screenshots, videos) and metadata to optimize conversion rates and enhance user engagement.
- Provide regular performance reports, insights, and recommendations to improve ASO efforts.
Key Expertise
- Strong analytical skills and experience with performance tracking tools like Google Analytics, Firebase, or similar platforms.
- Ability to analyze data and turn insights into actionable strategies.
- Familiarity with app marketing and mobile app growth strategies.
- Excellent written and verbal communication skills.
- Experience with A/B testing and conversion rate optimization (CRO).
- Knowledge of app localization strategies and multi-region app optimization.
- Experience with user acquisition campaigns and mobile marketing.
Qualification
- Strong knowledge of app store algorithms, ranking factors, and industry best practices.
- Proficiency in utilizing ASO tools such as AppTweak, Mobile Action, etc.
- Good understanding of SEO principles and keyword research techniques.
- We are hiring Surat local candidates only.
Experience
- Proven experience in App Store Optimization (ASO) or a related role, with 6 months to 1 year of experience.
Leveraging tech to drive a better IT experience.
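The A/B testing and CRO duties described above usually come down to deciding whether a new listing variant converts better than the control. A hedged sketch of one common approach, a two-proportion z-test in plain Python (the install counts below are made-up illustration, not real data):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic comparing conversion rates of two app-listing variants.

    conv_a/conv_b: number of installs; n_a/n_b: number of store-page views.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: variant B (new screenshots) vs. the control listing.
z = two_proportion_z(conv_a=300, n_a=10_000, conv_b=360, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 → significant at the 5% level (two-sided)
```

ASO platforms and the stores' own listing experiments run similar comparisons behind the scenes; the point of the sketch is that a visible lift still needs enough traffic to be statistically meaningful.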
Posted 3 weeks ago
0.0 - 4.0 years
2 - 6 Lacs
Chennai
Work from Office
The primary responsibility of this role is to perform various tasks related to content for video catalog quality, under general supervision. This could involve tasks such as checking and/or fixing metadata, images, subtitles, audio, and video assets to provide a seamless viewing experience to PV customers. The day-to-day job requires the individual to make judgment-based decisions by following a standard operating procedure and to perform quality checks on various devices. The associate should have a working knowledge of MS Office to capture data on a daily basis. This job requires you to be in the office 5 days per week for in-person work with your teammates. This will involve tasks such as:
- Understand and adhere to the standard operating procedure.
- Analyze and identify issues in the video content.
- Understand the issue and make the best use of the available resources/tools to resolve/fix it.
- Proactively raise issues/alarms to the manager or stakeholders that may have an impact on core deliverables or operations.
- Communicate with internal and external stakeholders.
- Adhere to the service level agreement and the average handle time set for the processes.
- Meet predetermined and assigned productivity targets and quality standards.
About the team
Prime Video Digi-Flex's (DF) vision is to be the most customer-centric, agile, and efficient operations powering Prime Video (PV) growth worldwide. Our mission is to be the center of operational excellence for PV through agile and efficient operations at scale. We influence technology-based scaling through tooling and automation. DF is a variable operations workforce that offers quick-to-market scalable solutions through manual execution for customer-facing and business-critical strategic initiatives.
DF creates repeatable and standardized processes to ingest, process, cleanse, enrich, classify, and match & merge partner assets, resolve customer-facing issues, and enhance the customer experience.
- Bachelor's degree
- Speak, write, and read fluently in English
- Experience with Microsoft Office products and applications
- Knowledge of Excel at an advanced level
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
- 5+ years of experience working in data warehousing systems
- 3+ years of strong hands-on programming expertise in the Databricks landscape, including Spark SQL and Workflows, for data processing and pipeline development
- 3+ years of strong hands-on data transformation/ETL skills using Spark SQL, PySpark, and Unity Catalog, working in the Databricks Medallion architecture
- 2+ years of work experience in one of the cloud platforms: Azure, AWS, or GCP
- Experience working with Git version control, and well versed in CI/CD best practices to automate the deployment and management of data pipelines and infrastructure
- Nice to have: hands-on experience building data ingestion pipelines from ERP systems (preferably Oracle Fusion) to a Databricks environment, using Fivetran or alternative data connectors
- Experience in a fast-paced, ever-changing, and growing environment
- Understanding of metadata management, data lineage, and data glossaries is a plus
- Must have report development experience using Power BI, SplashBI, or any enterprise reporting tool
Roles & Responsibilities:
- Be involved in the design and development of enterprise data solutions in Databricks, from ideation to deployment, ensuring robustness and scalability
- Work with the Data Architect to build and maintain robust and scalable data pipeline architectures on Databricks using PySpark and SQL
- Assemble and process large, complex ERP datasets to meet diverse functional and non-functional requirements
- Engage in continuous optimization efforts, implementing testing and tooling techniques to enhance data solution quality
- Focus on improving the performance, reliability, and maintainability of data pipelines
- Implement and maintain PySpark and Databricks SQL workflows for querying and analyzing large datasets
- Participate in release management using Git and CI/CD practices
- Develop business reports using the SplashBI reporting tool, leveraging data from the Databricks gold layer
Qualifications
- Bachelor's degree in Computer Science, Engineering, Finance, or equivalent experience
- Good communication skills
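The Medallion architecture referenced above stages data through bronze (raw as ingested), silver (cleaned and typed), and gold (aggregated, reporting-ready) layers. A minimal sketch of the layering in plain Python; a real Databricks implementation would use PySpark DataFrames and Delta tables, and the record shapes below are illustrative assumptions:

```python
# Medallion-style layering sketched with plain Python lists of dicts.
# bronze: raw records as ingested; silver: cleaned/typed; gold: aggregated.

bronze = [
    {"order_id": "1", "region": "EU", "amount": "10.5"},
    {"order_id": "2", "region": "EU", "amount": "4.5"},
    {"order_id": "3", "region": "US", "amount": "bad"},   # malformed row
]

def to_silver(rows):
    """Clean and type-cast bronze rows, dropping records that fail parsing."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "region": r["region"],
                        "amount": float(r["amount"])})
        except ValueError:
            continue  # a real pipeline would quarantine these rows instead
    return out

def to_gold(rows):
    """Aggregate silver rows into a reporting-ready summary per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EU': 15.0}
```

The gold-layer output is what a reporting tool such as SplashBI or Power BI would query, which is why the responsibilities above tie report development to the gold layer specifically.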
Posted 3 weeks ago
Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.
These cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.
The average salary range for metadata professionals in India varies based on experience level:
- Entry-level: ₹3-6 lakhs per annum
- Mid-level: ₹6-12 lakhs per annum
- Experienced: ₹12-20 lakhs per annum
Salaries may vary based on the company, location, and specific job responsibilities.
In the metadata field, a career typically progresses as follows:
- Metadata Analyst
- Metadata Specialist
- Metadata Manager
- Metadata Architect
As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.
In addition to metadata management, professionals in this field are often expected to have skills in:
- Data analysis
- Database management
- Data modeling
- Information governance
Having a combination of these skills can make job seekers more attractive to potential employers.
As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!