Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
3.0 - 6.0 years
15 - 16 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design, develop, and operate scalable and maintainable data pipelines in the Azure Databricks environment
- Develop all technical artefacts as code, implemented in professional IDEs, with full version control and CI/CD automation
- Enable data-driven decision-making in Human Resources (HR), Purchasing (PUR), and Finance (FIN) by ensuring high data availability, quality, and reliability
- Implement data products and analytical assets using software engineering principles, in close alignment with business domains and functional IT
- Apply rigorous software engineering practices such as modular design, test-driven development, and artifact reuse in all implementations
- Global delivery footprint; cross-functional data engineering support across the HR, PUR, and FIN domains
- Collaboration with business stakeholders, functional IT partners, product owners, architects, ML/AI engineers, and Power BI developers
- Agile, product-team structure embedded in an enterprise-scale Azure environment

Main Tasks:
- Design scalable batch and streaming pipelines in Azure Databricks using PySpark and/or Scala
- Implement ingestion from structured and semi-structured sources (e.g., SAP, APIs, flat files)
- Build bronze/silver/gold data layers following the defined lakehouse layering architecture and governance
- Implement use-case-driven dimensional models (star/snowflake schema) tailored to HR, PUR, and FIN needs
- Ensure compatibility with reporting tools (e.g., Power BI) via curated data marts and semantic models
- Implement enterprise-level data warehouse models (domain-driven 3NF models) for HR, PUR, and FIN data, closely aligned with data engineers for other business domains
- Develop and apply master data management strategies (e.g., Slowly Changing Dimensions)
- Develop automated data validation tests using testing frameworks
- Monitor pipeline health, identify anomalies, and implement quality thresholds
- Establish data quality transparency by defining and implementing meaningful data quality rules with source-system and business stakeholders, and implementing related reports
- Develop and structure pipelines using modular, reusable code in a professional IDE
- Apply test-driven development (TDD) principles with automated unit, integration, and validation tests
- Integrate tests into CI/CD pipelines to enable fail-fast deployment strategies
- Commit all artifacts to version control with peer review and CI/CD integration
- Work closely with Product Owners to refine user stories and define acceptance criteria
- Translate business requirements into data contracts and technical specifications
- Participate in agile events such as sprint planning, reviews, and retrospectives
- Document pipeline logic, data contracts, and technical decisions in Markdown or in docs auto-generated from code
- Align designs with governance and metadata standards (e.g., Unity Catalog)
- Track lineage and audit trails through integrated tooling
- Profile and tune data transformation performance
- Reduce job execution times and optimize cluster resource usage
- Refactor legacy pipelines or inefficient transformations to improve scalability

Qualifications:
- Degree in Computer Science, Data Engineering, Information Systems, or a related discipline
- Certifications in software development and data engineering (e.g., Databricks Data Engineer Associate, Azure Data Engineer, or relevant DevOps certifications)
- 3-6 years of hands-on experience in data engineering roles in enterprise environments
- Demonstrated experience building production-grade codebases in IDEs, with test coverage and version control
- Proven experience implementing complex data pipelines and contributing to full-lifecycle data projects (development to deployment)
- Experience in at least one business domain: HR, PUR, FIN, or a comparable field
- Leadership is not required; however, experience mentoring junior developers or leading implementation workstreams is a plus
- Experience working in international teams across multiple time zones and cultures, preferably with teams in India, Germany, and the Philippines
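The master data management task above names Slowly Changing Dimensions. A minimal, framework-free sketch of an SCD Type 2 merge (all table and field names here are hypothetical, and a production version would run in PySpark or SQL rather than plain Python) might look like:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked, today):
    """Merge incoming source rows into a Type 2 slowly changing dimension:
    close the current version when a tracked attribute changes, then append
    a new current version. Mutates the existing current rows in place."""
    out = list(dim_rows)
    current = {r[key]: r for r in out if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # Brand-new key: insert as the first, current version.
            out.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        elif any(existing[a] != row[a] for a in tracked):
            # A tracked attribute changed: expire the old version, add a new one.
            existing["valid_to"] = today
            existing["is_current"] = False
            out.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
    return out

# Hypothetical example: an employee moves from HR to FIN.
dim = [{"emp_id": 1, "dept": "HR", "valid_from": date(2020, 1, 1),
        "valid_to": None, "is_current": True}]
src = [{"emp_id": 1, "dept": "FIN"}, {"emp_id": 2, "dept": "PUR"}]
result = apply_scd2(dim, src, "emp_id", ["dept"], date(2024, 1, 1))
```

The same close-and-append pattern is what a `MERGE INTO` statement expresses declaratively in a lakehouse table.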
Posted 3 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Coimbatore
Work from Office
Position Summary
The BI Developer reports to the Business Intelligence Manager and is responsible for combining raw information from disparate IT source systems into data models, reports, and dashboards that deliver business insights. The successful candidate will be able to complete the full development lifecycle, from the ETL process through to the final delivery of dashboards into the organization. To succeed in this BI Developer position, you should have strong analytical skills and the ability to develop and maintain data and security models using modern techniques. If you are detail-oriented, with excellent organizational skills and experience in this field, we'd like to hear from you.

Essential Duties and Responsibilities
- Participate in business requirement gathering and solution documentation
- Build the infrastructure required for optimal pipeline management, including extraction, transformation, and loading of data from various data sources using Azure Data Factory, Databricks, and SQL technologies
- Assemble and analyze large, complex sets of data that meet functional and non-functional business requirements
- Implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes
- Work with users to assist them with data-related technical issues
- Implement and test data and security models
- Prepare data for prescriptive and predictive modeling
- Develop interactive visual reports, dashboards, charts, and measures with KPI scorecards using Microsoft Power BI Desktop
- Analyze, design, deploy, troubleshoot, and support Power BI solutions
- Participate in user acceptance testing
- Explore and implement ways to enhance data quality and reliability
- Collaborate with data scientists and architects as needed

Education
Bachelor's degree (B.S./B.A.) in computer science, information systems, informatics, statistics, or another quantitative field, or equivalent, from a college or university with an IT-focused specialization. A Master's degree or a data engineering certification (e.g., Azure Certified Data Engineer) is a plus.

Skills/Experience
- 5+ years of experience as a BI Developer, or related experience in a global company with significant hands-on technology delivery experience
- Strong data analytics background, with experience developing use cases and a deep understanding of managing data and generating insights through visualization
- Background in custom builds using Power BI Report Builder, Power BI Desktop, Power BI Service, Tabular Editor, ALM Toolkit, and DAX Studio; designing Power BI data models, including writing complex DAX and SQL queries and implementing row-level security
- Ability to understand data modeling, data schemas (normalized, flat, star, snowflake, etc.), query optimization, query profiling, and query performance monitoring tools and techniques
- Knowledge of programming languages (e.g., Java, AngularJS, Python)
- Experience implementing data storage solutions using Azure SQL Database, Azure Data Lake Storage, and Azure Blob Storage
- Experience monitoring and optimizing data workflows for performance and reliability
- Experience with workflow management and pipeline tools (Azure Data Factory and DevOps); storage technologies (Azure Data Warehouse and Data Lake); stream-processing systems (Event Hub and Stream Analytics); transformation tools (Databricks); visualization tools (Power BI); and metadata management systems
- Experience with big data tools like Spark is a plus, as is knowledge of PySpark
- Familiarity with Machine Learning and Deep Learning concepts is a plus
- Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata
- Experience working with unstructured datasets
- Hands-on experience optimizing the performance of SQL queries and applications
- Great numerical and analytical skills
- Ability to collaborate with technical resources to influence algorithms and other technology for improved customer experience

Physical Demands
To perform this job successfully, the physical demands listed are representative of those that must be met by an employee. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. While performing the duties of this job, the employee is regularly required to sit, stand, walk, use hands to handle and feel, reach with hands and arms, and talk and hear. The employee may occasionally be required to crouch and may occasionally lift items as heavy as 25 lbs. Specific vision abilities may include the ability to see near and far distances.

DISCLAIMER: The above information has been designed to indicate the general nature and level of work performed by employees within this classification. It is not designed to contain, or be interpreted as, a comprehensive inventory of all duties, responsibilities, and qualifications required of any employee assigned to this job. Nothing in this job description restricts management's right to assign duties and responsibilities to this job at any time.

Who we are: Mold-Masters is a global leader in the plastics industry. We design, manufacture, distribute, sell, and service highly engineered and customized plastic processing equipment and systems. Our hot runners, temperature controllers, auxiliary injection, and co-injection systems are used by customers of all sizes in every industry, from small local manufacturers to large worldwide OEM manufacturers of the most widely recognized brands. Over the course of our 50+ year history, we've built our reputation on delivering the best performance through a broad range of innovative technologies that optimize production to enhance molded part quality, increase productivity, and lower part cost. Unlock your operation's full potential with Mold-Masters. Mold-Masters is an Operating Company of Hillenbrand.

EEO: It is the policy of Hillenbrand Inc. to extend opportunities to qualified applicants and employees on an equal basis regardless of an individual's age, race, color, sex, religion, national origin, disability, sexual orientation, gender identity/expression, or veteran status. Additionally, Hillenbrand Inc. and our operating companies are committed to being an Equal Employment Opportunity (EEO) employer and offer opportunities to all job seekers, including individuals with disabilities. If you need a reasonable accommodation to assist with your job search or application, please let us know. At Hillenbrand, everyone is welcome to apply and "Shape What Matters for Tomorrow".
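The dimensional-modeling skills in the posting above (star/snowflake schemas feeding BI measures) boil down to resolving fact rows through dimensions and aggregating. A small self-contained sketch, with all table contents invented for illustration:

```python
# Hypothetical product dimension and sales fact table.
dim_product = {
    1: {"product": "Nozzle", "category": "Hot Runner"},
    2: {"product": "TC-500", "category": "Controller"},
}
fact_sales = [
    {"product_id": 1, "amount": 100.0},
    {"product_id": 1, "amount": 50.0},
    {"product_id": 2, "amount": 75.0},
]

def sales_by_category(fact, dim):
    """Resolve each fact row through the product dimension and sum amounts --
    the same shape a star-schema measure takes in a BI data model."""
    totals = {}
    for row in fact:
        category = dim[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals
```

In Power BI the dimension lookup is the model relationship and the sum is a DAX measure; the sketch just makes the join-then-aggregate structure explicit.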
Posted 3 weeks ago
3.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Implement workflows for data discovery, classification, lineage, and auditing.
- Evaluate and classify both structured and unstructured data assets.
- Develop and support information governance policies and processes.
- Support data classification for regulatory audits and efficiency improvements.
- Collaborate with the Enterprise Architect and leverage data discovery, cataloging, and governance tools.
- Ensure proper management and governance of information assets.
- Support the data classification process involving attestation and verification of data assets in collaboration with data owners (transactional state of data) and product owners (transformed-state data for analytics).
- Support the data governance process in data product lifecycle management for enterprise analytics data platforms (data onboarding, data classification, attestation, deployment, metadata management).
- Promote best practices for data protection using tagging, security policies, and data security (object-level, column-level, row-level) in data platforms.
- Promote and apply best practices for data catalogs and metadata management for systems within your scope.

Requirements:
- 3-5 years of experience in data governance, data quality, data preparation, and data classification.
- Proficiency in tools like Informatica, Snowflake, Tableau, and ServiceNow.
- Strong understanding of data privacy laws and regulations.
- Experience with Machine Learning and advanced analytics.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Good understanding of working with companies that have regulated systems and processes for data.
- Experience with metadata management for data platforms and reporting systems such as Snowflake, SAP HANA, Tableau, Cognos Analytics, and TM1 Planning Analytics.
- Experience with metadata management tools such as Informatica On-premises (EIC), Informatica Cloud (IDMC), Collibra, and Alation.
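The tagging and column-level security practices listed above can be sketched in a few lines. The tag names and roles below are hypothetical; real platforms implement this via masking policies rather than application code:

```python
def mask_columns(rows, column_tags, tag_access, role):
    """Return rows with any column masked whose governance tag the given
    role is not allowed to read. Tag and role names are illustrative."""
    visible = {col for col, tag in column_tags.items()
               if role in tag_access.get(tag, set())}
    return [{col: (val if col in visible else "***") for col, val in row.items()}
            for row in rows]

# Hypothetical classification: names are public, salaries are restricted.
column_tags = {"name": "public", "salary": "restricted"}
tag_access = {"public": {"analyst", "steward"}, "restricted": {"steward"}}
rows = [{"name": "A. Sharma", "salary": 1200000}]
```

The same idea underlies tag-based masking in Snowflake or Informatica: classification metadata drives what each role sees, without per-query rules.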
Posted 3 weeks ago
6.0 - 10.0 years
7 - 11 Lacs
Pune
Work from Office
Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 53,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Role: Hyperion Developer
Total Experience: 6-10 years
Job Location: Chennai
Mode of Hire: Permanent
Educational Qualification: Any (Full-Time Graduate)

Job Roles and Responsibilities: We are looking for a Hyperion Developer with experience delivering Hyperion (HFM/HP) technical business requests. The ideal candidate should have strong experience in financial management domains. The detailed responsibilities are listed below.
- Develop design solutions for the Hyperion suite of applications (HFM and HP) based on business requirements.
- Develop business rules, calculations, and scripts to help the business and support financial processes.
- Maintain security; define task flows, business rules, scripts, member lists, the journal module, objects (web forms, grids, task lists), consolidation and data-clearing procedures, metadata updates, etc.
- Perform coding and configuration to enhance and maintain Oracle EPM tools and Hyperion applications, including Planning, Essbase (BSO and ASO cubes), HFM, and FDMEE.
- Sound scripting knowledge, including MaxL and batch scripting.
- Build and optimize financial reports, dashboards, and data forms.
- Identify opportunities for system enhancements and process optimization.

Job Requirements:
- Develop design solutions for the Hyperion suite of applications (HFM and HP) based on business requirements.
- Develop and define security levels, task flows, business rules, scripts, member lists, the journal module, objects (web forms, grids, task lists), consolidation and data-clearing procedures, metadata updates, etc.
- Perform coding and configuration to enhance and maintain Oracle EPM tools and Hyperion applications, including Planning, Essbase (BSO and ASO cubes), HFM, FDMEE, and Smart View.
- Sound knowledge of FDMEE (Life Cycle Management, User Management, Security Management, etc.).
- Sound scripting knowledge, including MaxL and batch scripting.
- Administer Hyperion applications, including performance tuning and patch management.
- Hyperion BI+ reporting tools (e.g., Web Analysis, Financial Reports, Smart View).
- Knowledge of rapid-application-development software development life cycles, batch scheduling tools, file transfer, and related controls.
- Oracle database development: good working knowledge of Oracle SQL and PL/SQL.
- Good analytical, problem-solving, and communication skills.
- Implement and maintain Hyperion calculations, business rules, and data forms.
- Able to create new reports and optimize existing reports using Hyperion reporting tools.
- Collaborate with business analysts to understand data modeling needs and implement solutions.
- Conduct performance tuning and optimization of Hyperion applications for efficiency.
- Hands-on Hyperion Planning / Essbase application development and support experience.
- Knowledge of and experience with Hyperion Security and Task Lists.
- Engage in technical discussions on Hyperion and help improve the system.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment; wellbeing programs and work-life balance; integration and passion-sharing events; company initiative benefits; courses and conferences; attractive salary; hybrid work culture. #Eviden Let's grow together.
Posted 3 weeks ago
8.0 - 13.0 years
13 - 17 Lacs
Hyderabad
Work from Office
We are looking for an experienced and dynamic Manager - Data Engineering to lead the design, development, and optimization of scalable data pipelines and analytics platforms. This role demands a hands-on leader with deep expertise in AWS cloud services and Snowflake who can drive end-to-end data engineering initiatives while mentoring a high-performing team. You will collaborate closely with cross-functional teams, including Data Science, Product, and Engineering, to deliver robust and secure data solutions that enable business insights and decision-making at scale.

Responsibilities:
- Design and build scalable, efficient, and secure data pipelines and ETL/ELT processes using AWS services (Glue, S3, Lambda, Redshift, Athena, etc.) and Snowflake.
- Implement and optimize data warehousing solutions in Snowflake to support BI, analytics, and ML workloads.
- Ensure proper data modeling, metadata management, and version control for data assets.
- Lead and mentor a team of data engineers; set goals, conduct performance reviews, and foster professional growth.
- Drive the data engineering roadmap, aligning technical initiatives with business priorities.
- Collaborate with stakeholders to define data strategy, architecture, and scalable best practices.

Requirements:
- 8+ years of experience in Data Engineering, with at least 2 years in a leadership/managerial capacity.
- Proven expertise in the AWS data stack (Glue, S3, Redshift, Lambda, EMR, etc.).
- Hands-on experience with Snowflake, including data modeling, performance tuning, and role-based access.
- Strong programming skills in Python, SQL, and Spark.
- Familiarity with orchestration tools (Airflow, Step Functions) and data versioning tools (DVC, LakeFS).
- Experience with CI/CD practices and Infrastructure-as-Code.
- Excellent communication, stakeholder management, and cross-functional collaboration skills.
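Pipeline-quality responsibilities like those above (monitoring batch health, enforcing quality thresholds) often reduce to simple fail-fast checks before data is promoted downstream. A minimal sketch; the 5% null-rate threshold and field names are assumptions for illustration:

```python
def validate_batch(rows, required_fields, null_threshold=0.05):
    """Fail-fast quality gate: compute the null rate across required fields
    and reject the batch when it exceeds the threshold. The 5% default is
    an assumed figure, not from the posting."""
    if not rows:
        return False, 1.0  # treat an empty batch as a failure
    checks = len(rows) * len(required_fields)
    nulls = sum(1 for r in rows for f in required_fields if r.get(f) is None)
    rate = nulls / checks
    return rate <= null_threshold, rate
```

In practice a check like this would run as an Airflow task or a Glue job step, failing the DAG before bad data reaches Snowflake.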
Posted 3 weeks ago
6.0 - 8.0 years
40 - 50 Lacs
Bengaluru
Work from Office
GAQ326R174
Our Databricks Business Systems Team is looking for a Salesforce Developer to manage our Salesforce platform and be a strategic partner in using this core system across the business. Reporting to the Senior Manager of Business Systems, you will own the process for design, test, build, and implementation. Additionally, you will help guide the business in process development and enhancement and work on future implementation needs. We want someone passionate about leveraging Salesforce to achieve success and value across the business, and someone who is excited about being a crucial part of a high-growth company with an incredible product. You will be the lead resource on a team, leveraging your skills and expertise to enhance and evolve the value of our Salesforce instance across the business. You will partner with other teams to bring sustainable maturity to our use of the Salesforce platform. With the support of your team, you will be in a position to guide the business to be more efficient with Salesforce and the products that connect to and rely on Salesforce. The right person for this role will have analytical and problem-solving skills with a proven background in Salesforce development, configuration, and governance standardisation. They should be able to help translate GTM and CS business needs into system configuration.

The impact you will have:
- Design/Strategy: Build core platform functionality on Salesforce, including how Salesforce interacts with and is leveraged by other key business systems.
- Collaboration: You are the lead for helping our business leverage and use Salesforce. You will work with other teams and will have a seat on the IT Portfolio Working Group.
- Analytics: You will play an analytical role in quickly and thoroughly analysing business requirements for reporting, and subsequently translating the results into good technical data designs. In this capacity, you will configure the system and develop technical specification documentation for all processes.

What we look for:
- 8+ years of experience with Salesforce in a high-growth environment.
- Experience in Salesforce administration and other applications such as FinancialForce and learning management systems.
- Knowledge of security and governance (profiles, permission sets, data visibility, sharing settings).
- Experience with the development life cycle, including Salesforce deployment/packaging using the Metadata API, change sets, and code coverage.
- Experience implementing Salesforce Portal (Community portal), Service Console, and Lightning components is required.
- Hands-on Salesforce development skills with a good command of Triggers, Configuration, SOQL, REST APIs, etc.
- Experience with application development frameworks like ReactJS or jQuery is a plus.
Posted 3 weeks ago
5.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
- Expertise in data modelling tools (ER/Studio, Erwin; IDM/ARDM models; CPG / Manufacturing / Sales / Finance / Supplier / Customer domains)
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP.
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Pune
Work from Office
Role: Hyperion Developer
Total Experience: 6-10 years
Job Location: Chennai
Mode of Hire: Permanent
Educational Qualification: Any (Full-Time Graduate)

Job Roles and Responsibilities: We are looking for a Hyperion Developer with experience delivering Hyperion (HFM/HP) technical business requests. The ideal candidate should have strong experience in financial management domains. The detailed responsibilities are listed below.
- Develop design solutions for the Hyperion suite of applications (HFM and HP) based on business requirements.
- Develop business rules, calculations, and scripts to help the business and support financial processes.
- Maintain security; define task flows, business rules, scripts, member lists, the journal module, objects (web forms, grids, task lists), consolidation and data-clearing procedures, metadata updates, etc.
- Perform coding and configuration to enhance and maintain Oracle EPM tools and Hyperion applications, including Planning, Essbase (BSO and ASO cubes), HFM, and FDMEE.
- Sound scripting knowledge, including MaxL and batch scripting.
- Build and optimize financial reports, dashboards, and data forms.
- Identify opportunities for system enhancements and process optimization.

Job Requirements:
- Develop design solutions for the Hyperion suite of applications (HFM and HP) based on business requirements.
- Develop and define security levels, task flows, business rules, scripts, member lists, the journal module, objects (web forms, grids, task lists), consolidation and data-clearing procedures, metadata updates, etc.
- Perform coding and configuration to enhance and maintain Oracle EPM tools and Hyperion applications, including Planning, Essbase (BSO and ASO cubes), HFM, FDMEE, and Smart View.
- Sound knowledge of FDMEE (Life Cycle Management, User Management, Security Management, etc.).
- Sound scripting knowledge, including MaxL and batch scripting.
- Administer Hyperion applications, including performance tuning and patch management.
- Hyperion BI+ reporting tools (e.g., Web Analysis, Financial Reports, Smart View).
- Knowledge of rapid-application-development software development life cycles, batch scheduling tools, file transfer, and related controls.
- Oracle database development: good working knowledge of Oracle SQL and PL/SQL.
- Good analytical, problem-solving, and communication skills.
- Implement and maintain Hyperion calculations, business rules, and data forms.
- Able to create new reports and optimize existing reports using Hyperion reporting tools.
- Collaborate with business analysts to understand data modeling needs and implement solutions.
- Conduct performance tuning and optimization of Hyperion applications for efficiency.
- Hands-on Hyperion Planning / Essbase application development and support experience.
- Knowledge of and experience with Hyperion Security and Task Lists.
- Engage in technical discussions on Hyperion and help improve the system.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment; wellbeing programs and work-life balance; integration and passion-sharing events; company initiative benefits; courses and conferences; attractive salary; hybrid work culture. Let's grow together.
Posted 3 weeks ago
4.0 - 9.0 years
10 - 11 Lacs
Hyderabad, Bengaluru
Work from Office
No. of Positions: 4
Experience: 4 to 9 years
Location: Hyderabad / Bangalore

Role requirements:
- Solid understanding of Salesforce.com architecture, design, development, administration, and operational support.
- Analyze requirements and design solutions that are achievable, acceptable, and consistent with customer expectations and good architectural principles.
- Provide development and administration support on the Salesforce platform, including development and support of standard Salesforce functionality such as page layouts, field additions, and permissions.
- Advanced development and administration, including triggers, S-controls, workflow, validation rules, Lightning components, etc.
- Code development using Apex and Force.com.
- Development using existing and upcoming Salesforce frameworks such as Lightning, as well as custom frameworks.
- Assist the Support team in troubleshooting and resolving technical issues.
- Provide timely status updates, and identify and communicate risks in a timely manner.
- Work closely with business analysts and/or key business community users on enhancements and bug fixes.
- Ensure consistent delivery of IT solutions by following Informatica's development standards and architecture framework.
- Work effectively in a distributed team.
- Demonstrated deep technical knowledge, with a minimum of 3 years of experience working with the Force.com developer toolkit: Apex, Visualforce, Lightning, Force.com IDE, Force.com Migration Tool, Web Services/SOA, and Metadata APIs.
- SQL and RDBMS experience.
- Strong communication skills, both verbal and written.
- Demonstrated strong prototyping, coding, and debugging skills.
Posted 3 weeks ago
3.0 - 5.0 years
13 - 17 Lacs
Bengaluru
Work from Office
What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world-changing projects, you will do more and become more than you ever thought possible.
Key Responsibilities:
- Implement workflows for data discovery, classification, lineage, and auditing
- Evaluate and classify both structured and unstructured data assets
- Develop and support information governance policies and processes
- Support data classification for regulatory audits and efficiency improvements
- Collaborate with the Enterprise Architect and leverage data discovery, cataloging, and governance tools
- Ensure proper management and governance of information assets
- Support the data classification process, involving attestation and verification of data assets in collaboration with data owners (transactional state of data) and product owners (transformed state of data for analytics)
- Support the data governance process in data product lifecycle management for enterprise analytics data platforms (data onboarding, data classification, attestation, deployment, metadata management)
- Promote best practices for data protection using tagging, security policies, and data security (object-level, column-level, row-level) in data platforms
- Promote and apply best practices for data catalogs and metadata management for systems within your scope
Requirements:
- 3-5 years of experience in data governance, data quality, data preparation, and data classification
- Proficiency in tools like Informatica, Snowflake, Tableau, and ServiceNow
- Strong understanding of data privacy laws and regulations
- Experience with machine learning and advanced analytics
- Ability to work in a fast-paced environment and manage multiple priorities
- Good understanding of working with companies that have regulated systems and processes for data
- Experience with metadata management for data platforms and reporting systems such as Snowflake, SAP HANA, Tableau, Cognos Analytics, and TM1 Planning Analytics
- Experience with metadata management tools such as Informatica on-premises (EIC), Informatica Cloud (IDMC), Collibra, and Alation
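The pattern-driven classification workflow described above (tagging columns as PII or confidential as a first pass before data-owner attestation) can be sketched roughly as follows. The rules, labels, and column names are illustrative assumptions, not any specific governance tool's API:

```python
import re

# Hypothetical rule table: first matching pattern wins, most sensitive first.
CLASSIFICATION_RULES = [
    (re.compile(r"ssn|passport|aadhaar", re.I), "restricted"),
    (re.compile(r"email|phone|address|birth", re.I), "pii"),
    (re.compile(r"salary|revenue|cost", re.I), "confidential"),
]

def classify_column(column_name: str) -> str:
    """Return the label of the first matching rule, else 'internal'."""
    for pattern, label in CLASSIFICATION_RULES:
        if pattern.search(column_name):
            return label
    return "internal"

def classify_table(columns: list[str]) -> dict[str, str]:
    """Tag every column of a table; output feeds attestation by data owners."""
    return {col: classify_column(col) for col in columns}

tags = classify_table(["employee_email", "base_salary", "dept_code"])
```

A pass like this only proposes labels; the attestation and verification steps the posting describes remain a human review.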
Posted 3 weeks ago
2.0 - 7.0 years
50 - 100 Lacs
Mumbai
Work from Office
Position at GroupM Nexus
Overview
We are looking for a results-driven SEO Manager to lead our organic growth strategies in the Indian market. You will be responsible for driving search visibility, increasing organic traffic, and improving conversions across multiple digital platforms. This role requires a blend of technical SEO expertise, strategic thinking, and leadership skills to manage a team and execute high-impact SEO campaigns.
Key Responsibilities:
- SEO Strategy & Execution: Develop and implement data-driven SEO strategies to improve search rankings and organic traffic for the Indian market
- On-Page & Technical SEO: Optimize website structure, content, metadata, and internal linking to enhance search visibility; conduct regular site audits and implement fixes
- Off-Page SEO & Link Building: Develop and execute effective link-building strategies to improve domain authority
- Content & Keyword Optimization: Collaborate with content teams to ensure SEO-friendly content targeting high-intent keywords
- Local & E-commerce SEO: Optimize for Google My Business, local search, and e-commerce platforms (if applicable)
- Marketplace SEO: Optimize product listings, conduct keyword research, improve rankings, analyze performance, enhance visibility, manage content, track trends, and drive organic traffic and product sales
- Analytics & Reporting: Monitor key SEO KPIs (traffic, rankings, conversions) using tools like Google Analytics, Search Console, and SEMrush/Ahrefs; generate reports with actionable insights
- Team Leadership: Manage and mentor a team of SEO specialists, setting goals and guiding performance
- Collaboration: Work closely with developers, content creators, and digital marketing teams to ensure alignment with overall business objectives
- SEO Trends & Algorithm Updates: Stay updated on Google algorithm changes and emerging SEO trends to implement best practices
Requirements:
- 3-4 years of SEO experience, with 2+ years in a managerial role
- In-depth knowledge of Google's algorithms, ranking factors, and search engine best practices
- Hands-on experience with SEO tools like Google Analytics, Search Console, SEMrush, Ahrefs, Screaming Frog, etc.
- Strong understanding of technical SEO, including schema markup, site speed, and mobile optimization
- Experience with content marketing and keyword research tailored for the Indian audience
- Knowledge of local SEO strategies and optimization for multilingual content (preferred)
- Proficiency in Google My Business and e-commerce SEO
- Team management and stakeholder communication experience
- Basic knowledge of HTML, CSS, and JavaScript is a plus
About GroupM Nexus
GroupM Nexus is the industry's largest community of performance marketing experts, designed to drive performance and innovation at scale for GroupM's agencies and clients. With the most platform accreditations in the industry combined with proprietary technology, media, and solutions, a culture of continuous innovation, and scaled partnerships, GroupM Nexus consistently sets new benchmarks for effectiveness and efficiency across all forms of media to drive growth for the world's leading advertisers.
About GroupM India
At GroupM India, there's never a dull moment between juggling client requests, managing vendor partners, and having fun with your team. We believe in tackling challenges head-on and getting things done.
Posted 3 weeks ago
2.0 - 7.0 years
14 - 17 Lacs
Bengaluru
Work from Office
About Onehouse
Onehouse is a mission-driven company dedicated to freeing data from data platform lock-in. We deliver the industry's most interoperable data lakehouse through a cloud-native managed service built on Apache Hudi. Onehouse enables organizations to ingest data at scale with minute-level freshness, centrally store it, and make it available to any downstream query engine and use case (from traditional analytics to real-time AI/ML). We are a team of self-driven, inspired, and seasoned builders who have created large-scale data systems and globally distributed platforms that sit at the heart of some of the largest enterprises out there, including Uber, Snowflake, AWS, LinkedIn, Confluent, and many more. Riding off a fresh $35M Series B backed by Craft, Greylock, and Addition Ventures, we're now at $68M total funding and looking for rising talent to grow with us and become future leaders of the team. Come help us build the world's best fully managed and self-optimizing data lake platform!
The Community You Will Join
When you join Onehouse, you're joining a team of passionate professionals tackling the deeply technical challenges of building a two-sided engineering product. Our engineering team serves as the bridge between the worlds of open source and enterprise: contributing directly to and growing Apache Hudi (already used at scale by global enterprises like Uber, Amazon, ByteDance, etc.) while concurrently defining a new industry category, the transactional data lake. The Data Infrastructure team is the grounding heartbeat of all of this. We live and breathe databases, building cornerstone infrastructure by working under Hudi's hood to solve incredibly complex optimization and systems problems.
The Impact You Will Drive:
- As a foundational member of the Data Infrastructure team, you will productionize the next generation of our data tech stack by building the software and data features that actually process all of the data we ingest.
- Accelerate our open source/enterprise flywheel by working on the guts of Apache Hudi's transactional engine and optimizing it for diverse Onehouse customer workloads.
- Act as an SME to deepen our team's expertise on database internals, query engines, storage, and/or stream processing.
A Typical Day:
- Design new concurrency control and transactional capabilities that maximize throughput for competing writers.
- Design and implement new indexing schemes, specifically optimized for incremental data processing and analytical query performance.
- Design systems that help scale and streamline metadata and data access from different query/compute engines.
- Solve hard optimization problems to improve the efficiency (increase performance and lower cost) of distributed data processing algorithms over a Kubernetes cluster.
- Leverage data from existing systems to find inefficiencies, and quickly build and validate prototypes.
- Collaborate with other engineers to implement, deploy, and safely roll out the optimized solutions in production.
What You Bring to the Table:
- Strong object-oriented design and coding skills (Java and/or C/C++, preferably on a UNIX or Linux platform).
- Experience with the inner workings of distributed (multi-tiered) systems, algorithms, and relational databases.
- You embrace ambiguous/undefined problems with an ability to think abstractly and articulate technical challenges and solutions.
- An ability to prioritize across feature development and tech debt with urgency and speed.
- An ability to solve complex programming/optimization problems.
- An ability to quickly prototype optimization solutions and analyze large/complex data.
- Experience running production services at scale.
- Robust and clear communication skills.
Nice to haves (but not required):
- Experience working with database systems, query engines, or Spark codebases.
- Experience in optimization mathematics (linear programming, nonlinear optimization).
- Existing publications on optimizing large-scale data systems in top-tier distributed systems conferences.
- PhD degree with 2+ years of industry experience in solving and delivering high-impact optimization projects.
How We'll Take Care of You
- Equity Compensation: our success is your success, with eligible participation in our company equity plan
- Health & Well-being: we'll invest in your physical and mental well-being by reimbursing up to 20,000 INR for your monthly insurance premium
- Financial Future: we'll invest in your financial well-being by making this role eligible for the provident fund, to which Onehouse will contribute up to 1,800 INR/month
- Location: we are a remote-friendly company (internationally distributed across N. America + India), though some roles will be subject to in-person requirements in alignment with the needs of the business
- Generous Time Off: unlimited PTO (mandatory 1 week/year minimum), uncapped sick days, and 17 paid company holidays
- Food & Meal Allowance: weekly lunch stipend, in-office snacks/drinks
- Equipment: we'll provide you with the equipment you need to be successful and a one-time $500 (USD) stipend for your initial office/desk setup
- Child Bonding: 26 weeks off for birthing parents and 12 weeks for surrogate and adoptive parents, fully paid, so you can focus your energy on your newest addition
House Values
One Team: Optimize for the company, your team, yourself, in that order. We may fight long and hard in the trenches; take care of your co-workers with empathy. We give more than we take to build the one house that everyone dreams of being part of.
Tough & Persevering: We are building our company in a very large, fast-growing, but highly competitive space. Life will get tough sometimes. We take hardships in stride, stay positive, focus all energy on the path forward, and develop a champion's mindset to overcome the odds. Always day one!
Keep Making It Better, Always: Rome was not built in a day; if we can get 1% better each day for one year, we'll end up thirty-seven times better. This means being organized, communicating promptly, taking even small tasks seriously, tracking all small ideas, and paying it forward.
Think Big, Act Fast: We have tremendous scope for innovation, but we will still be judged by impact over time. Big, bold ideas still need to be strategized against priorities, broken down, set in rapid motion, measured, refined, repeated. Great execution is what separates promising companies from proven unicorns.
Be Customer Obsessed: Everyone has the responsibility to drive towards the best experience for the customer, be it an OSS user or a paid customer. If something is broken, own it, say something, do something; never ignore it. Be the change that you want to see in the company.
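The "competing writers" problem named under A Typical Day is classically handled with optimistic concurrency control: each writer snapshots the table version it started from, and a commit is rejected if another writer advanced the table in the meantime. A toy sketch of that idea follows; it is not Apache Hudi's actual implementation, whose transactional engine is far more involved:

```python
class Table:
    """Single-node toy table with version-based optimistic concurrency."""

    def __init__(self):
        self.version = 0
        self.data = {}

    def begin(self) -> int:
        # Snapshot the version this writer starts from.
        return self.version

    def commit(self, start_version: int, updates: dict) -> bool:
        # Reject the commit if another writer advanced the table meanwhile;
        # the caller is expected to re-read and retry.
        if self.version != start_version:
            return False
        self.data.update(updates)
        self.version += 1
        return True

table = Table()
v = table.begin()
table.commit(v, {"k": 1})   # first writer wins
table.commit(v, {"k": 2})   # stale second writer is rejected
```

A real lakehouse must make the version check and the write atomic across processes (e.g., via a lock provider or conditional writes); this sketch only shows the retry contract.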
Posted 3 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
The purpose of this role is to implement and execute performance marketing campaigns and strategies across various platforms in alignment with business objectives and marketing goals. The candidate will have a strong understanding of digital performance marketing and deep knowledge of ads platforms such as Meta, Google, DV360, native, portal, affiliate, etc. They will keep a close eye on current trends and best practices and stay updated with the latest developments in digital marketing.
JD
- Have a deep understanding of performance marketing and manage all digital campaigns, from ideation to media planning to optimization and performance analysis, focused on delivering leads for the business.
- Understand campaign performance, correlate it with the business objective, and curate the campaigns and the overall strategy accordingly so that business objectives are met.
- Drive marketing programs that are highly targeted based on the content affinity of relevant segmented audience cohorts.
- Manage all efforts and goals on organic and digital paid media: SEM, display, paid social, affiliate, direct buys, etc.
- Serve as the lead for programmatic display, especially the DoubleClick suite.
- Establish processes for monitoring, measurement, and optimization by institutionalizing performance marketing metrics that correlate to campaigns as well as business impact.
- Drive digital marketing efficiencies and reach through continuous communication testing (A/B), innovation using different platforms and new products/formats, and active co-creation and planning with Google/FB/others as partners.
- Research, monitor, and track what's trending on social media, and coordinate with departments to create content around trending topics.
Responsibilities will consist of:
- Categorize and analyse high-performance campaigns and document success stories for feedback and learning.
- Keep a keen eye on copy and visual styles that have worked on campaigns across channels and replicate/abridge these approaches as and when required.
- Organize weekly brainstorming meetings for upcoming projects and be an active participant in suggesting fresh ideas that can be replicated on the campaigns and social brand pages.
- Coordinate effectively with other teams to create and execute campaigns that will initiate sales and increase footfall on the pages, website, and project sites.
- Analyse campaign data, draw insights, and take corrective actions accordingly to improve the performance of campaigns.
- Prepare a strategy to achieve the business objective and meet the targets.
- Come up with innovative ideas that can help improve efficiency and performance in the system.
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Kochi
Work from Office
At Iron Mountain, we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That's why we need smart, committed people to join us. Whether you're looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain.
We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways.
Are you curious about being part of our growth story while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation.
Location: Cochin, Kerala
Department: Business Process Operations
About The Role
As a Supervisor, Business Process Operations (M1) at Iron Mountain, you will be responsible for managing large-scale digitization projects across customer sites and IMI facilities. This role requires strong project supervision, cross-functional coordination, and the ability to lead high-performing teams while ensuring adherence to standard operating procedures (SOPs) and delivery commitments. You will serve as a critical link between Key Account Managers and on-ground delivery teams to ensure timely, high-quality outcomes. Additionally, you will support vertical leads in achieving monthly, quarterly, and annual operational goals and budgets. An ideal candidate brings a deep understanding of digitization, workflow automation, and productivity optimization, with a passion for leveraging technology to streamline operations.
Key Responsibilities
- Manage large-scale digitization operations, both at customer sites and IMI facilities
- Supervise teams involved in scanning, digitization, metadata management, and document handling
- Plan and execute projects in line with SOPs, quality standards, and timelines
- Conduct Proof of Concept (POC) exercises and process enhancements as needed
- Coordinate between Key Account Managers and delivery teams for seamless execution
- Drive productivity improvements through automation and time & motion studies (TMS)
- Monitor team KPIs and ensure alignment with business goals
- Support budgeting, cost optimization, and AOP planning
- Prepare and maintain MIS reports and presentations for internal and external stakeholders
Qualifications & Experience
- Graduate (mandatory); MBA in Operations preferred
- 5-7 years of relevant experience managing digitization/large-scale judiciary projects
- Proven ability to lead teams of 50-100 members
- Strong understanding of document management systems (DMS), metadata creation, and workflow management
- Prior experience in handling judiciary-related digitization projects is a must
- Proficiency in Malayalam is mandatory
- Familiarity with production scanners and related market trends
- Experience in server management will be an added advantage
- Strong command of Google Suite (Sheets, Docs, Slides); knowledge of Google Data Studio preferred
- Experience with RFP evaluation, project costing, and gross profit optimization is desirable
- Customer-focused mindset with the ability to balance SOPs with industry best practices
What We're Looking For
- A self-motivated and target-driven individual with strong leadership and communication skills
- A detail-oriented professional capable of identifying process improvements and driving operational efficiency
- A team player with a solution-oriented approach and the ability to manage multiple stakeholders
Interested candidates can apply through this post or share an updated resume with runa singha@ironmountain
Thanks and regards,
TA Team
Category: Operations Group
Iron Mountain is a global leader in storage and information management services, trusted by more than 225,000 organizations in 60 countries. We safeguard billions of our customers' assets, including critical business information, highly sensitive data, and invaluable cultural and historic artifacts. Take a look at our history here. Iron Mountain helps lower cost and risk, comply with regulations, recover from disaster, and enable digital and sustainable solutions, whether in information management, digital transformation, secure storage and destruction, data center operations, cloud services, or art storage and logistics.
Posted 3 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Job Title: Laserfiche Consultant (Data Migration Support)
Location: Remote/On-site (depending on the client's preference)
Duration: 1 to 2 months (part-time)
Job Type: Contract/Freelance
Job Overview
We are seeking an experienced Laserfiche Consultant to assist in the data migration process from the Laserfiche Document Management System (DMS). This project will involve supporting the migration of data from Laserfiche to a new DMS or cloud-based solution. The consultant will be responsible for ensuring smooth, error-free data extraction, transformation, and loading (ETL) during the migration process. This role requires extensive knowledge and hands-on experience with Laserfiche data structures, the content repository, and integration with third-party systems.
Key Responsibilities
- Data Migration: Lead and support the migration of documents and associated metadata from Laserfiche to the target system
- Data Extraction & Transformation: Extract data from Laserfiche, clean it, and prepare it for migration, ensuring data integrity, accuracy, and consistency
- Mapping and Metadata Management: Work with the team to map Laserfiche metadata to the new system, ensuring all critical attributes are correctly transferred
- Quality Assurance: Conduct testing to ensure data migration is completed successfully without data loss, corruption, or errors
- Troubleshooting and Issue Resolution: Address any issues or concerns during the migration process, providing solutions for data or system-related problems
- Documentation: Maintain proper documentation of the migration process, mapping documents, and any issues encountered
- Training & Support: Provide guidance and support to internal teams regarding Laserfiche best practices, data migration strategies, and troubleshooting
Required Skills & Qualifications
- Experience: Minimum 3 years of hands-on experience with Laserfiche DMS, specifically in data migration projects
- Technical Knowledge:
  - Strong knowledge of Laserfiche architecture, including repositories, workflows, and metadata structures
  - Expertise in Laserfiche data migration tools (e.g., Laserfiche Import Agent, Laserfiche Workflow)
  - Understanding of ETL (Extract, Transform, Load) processes, data mapping, and metadata handling
  - Familiarity with the Laserfiche SDK/API for advanced integrations
- Problem-Solving: Excellent troubleshooting skills for identifying and resolving data migration issues
- Communication: Strong verbal and written communication skills to document processes and communicate effectively with stakeholders
- Flexibility: Ability to work independently, manage time effectively, and work remotely (part-time)
- Project Management: Ability to manage tasks efficiently and meet deadlines for short-term projects
Preferred Qualifications
- Certifications: Laserfiche Certified Professional (LCP) or Certified Document Imaging Architect (CDIA) is a plus
- Experience with Other DMS: Experience migrating data from or to other DMS solutions such as SharePoint, OpenText, or DocuSign
- Cloud Platforms: Familiarity with cloud storage solutions like AWS, Azure, or Google Cloud is an advantage
Work Schedule
- Hours: Part-time (approx. 20-30 hours/week)
- Flexible working hours based on availability, but must meet project deadlines
Application Process
If you are a skilled Laserfiche consultant with hands-on data migration experience and are interested in contributing to a challenging project, please submit your resume along with a brief cover letter detailing your experience with Laserfiche data migration.
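The transform step of such a migration often reduces to a field map from source metadata to the target schema, plus a catch-all so unmapped attributes are surfaced rather than silently dropped. A minimal sketch follows; the field names are hypothetical, not actual Laserfiche template fields:

```python
# Hypothetical source-to-target field map for one document class.
FIELD_MAP = {
    "DocName": "title",
    "CreationDate": "created_at",
    "Dept": "department",
}

def transform_record(source: dict) -> dict:
    """Rename mapped fields; park everything else under '_unmapped' so QA
    can decide whether a mapping is missing or the field can be retired."""
    target = {}
    unmapped = {}
    for key, value in source.items():
        if key in FIELD_MAP:
            target[FIELD_MAP[key]] = value
        else:
            unmapped[key] = value
    if unmapped:
        target["_unmapped"] = unmapped
    return target
```

Running every extracted record through a step like this, and failing the batch when `_unmapped` grows unexpectedly, is one way to make the "no silent data loss" QA requirement mechanical.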
Posted 3 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Provide timely, professional, ongoing management of Data Operations for use case/demand deliverables and clinical data warehouse maintenance with respect to cost, quality, and timelines within the Clinical Pipeline team. Help develop content and redefine training modules into engaging, interactive applications for Clinical Data Mapper onboarding. Leverage AI-based Clinical Pipeline technology to ensure process simplification and training delivery. Follow data privacy and data-handling procedures and guidelines. Participate in discussions with Data Scientists, Data Services, or other stakeholders requesting legacy data mapping effort, and translate their needs into data operations. Drive participation and input within Data Operations (DO) in the delivery of quality data and tools, processes, and documentation. Manage data load and transfer from the Novartis Clinical Data Lake and conform clinical trial data to SDTM/ADaM-compliant standards within the Clinical Data Warehouse. The position is a key contributor within the Clinical Pipeline team, ensuring that use cases/demands are executed efficiently with timely and high-quality deliverables.
Major accountabilities:
- Provide data mapping leadership across assigned use cases/demands and act as the Clinical Data Mapper Lead where needed; demonstrate a business understanding of the use case/demand profile to identify and assist in the successful application of data operations processes.
- Manage task allocation of the Clinical Data Mappers in accordance with priorities defined by the Clinical Data Operations Lead.
- Recognize and resolve conflicts in data flows or data standards decisions, and be responsible for defining new quality checks/validation processes to ensure data compliance.
- Participate in all aspects of process and training to ensure full compliance with all applicable global regulatory requirements, including data privacy, and that business objectives are achieved.
- Be responsible and accountable for ensuring consistency of assigned tasks related to data mappings and maintenance of relevant clinical data and metadata catalogs. Build or contribute to relevant data dictionaries, ontologies, and vocabularies. Perform hands-on activities to conduct data quality assessments.
- Support and assist Clinical Data Mapper staff for assigned use cases/demands; provide effective input into Data Operations initiatives and innovations for quality, efficiency, and continuous improvement in scientific and operational excellence.
- Serve as primary Data Mapper, ensuring timely and quality deliverables by establishing and maintaining strong working relationships with data mapper teams and functional lines. Act as a CDISC SDTM/ADaM expert as required.
- Discuss with Data Scientists to provide an overview of the data mapping process and the data-related complexities and issues to be resolved.
- Lead independently, or participate in, improvement initiatives related to the development of the Clinical Pipeline technology.
- Collaborate with the Data Engineering team to submit advanced data mapping requirements for complex transformations within the Clinical Pipeline.
- Be familiar with all clinical study documents from protocol to CSR, including Data Management and Biostatistics documents.
Key performance indicators:
- Achieve a high level of quality, timeliness, cost efficiency, and customer satisfaction across Data Operations activities and deliverables
- No critical audit findings due to Data Management
- Adherence to Novartis policy and guidelines
- Customer/partner/project feedback and satisfaction
Minimum Requirements:
Work Experience:
- 5+ years of experience working with clinical data
- Strong CDISC knowledge (SDTM/ADaM)
- Good knowledge of the clinical data lifecycle
- Ability to work with different standards
- Data privacy knowledge
- Cross-cultural experience; functional breadth
- Experience in an Agile way of working would be a plus
- Project management
Skills:
- Clinical Data Management
- Data Governance
- Data Integrity
- Data Operations
- Data Quality
- Data Privacy
- Databases
- Project Management
- SAS/SQL knowledge; Python would be a plus
- Artificial Intelligence (optional)
Languages: English
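A legacy-to-SDTM mapping of the kind this role oversees, shown here for the Demographics (DM) domain, might look like the sketch below. The source column names are assumptions; the target variable names (STUDYID, DOMAIN, USUBJID, SEX, AGE) follow the CDISC SDTM standard:

```python
def map_to_dm(legacy_row: dict, studyid: str) -> dict:
    """Map one legacy demographics record to SDTM DM variables.
    Source keys (subj_id, gender, age_years) are illustrative assumptions."""
    return {
        "STUDYID": studyid,
        "DOMAIN": "DM",
        # USUBJID is conventionally derived as study id + subject id.
        "USUBJID": f"{studyid}-{legacy_row['subj_id']}",
        # Controlled terminology: unrecognized values fall back to 'U'.
        "SEX": {"male": "M", "female": "F"}.get(legacy_row["gender"].lower(), "U"),
        "AGE": int(legacy_row["age_years"]),
    }
```

In practice a mapping like this would sit behind the quality checks and validation processes the accountabilities describe (e.g., conformance checks against the SDTM controlled terminology before warehouse load).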
Posted 3 weeks ago
10.0 - 15.0 years
14 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
The Data Excellence Data Architect is a demonstrated expert in technical and/or functional aspects of customer and partner engagements that lead to the successful delivery of data management projects. The Data Architect plays a critical role for setting customers up for success by prescriptively helping to shape and then execute in the Salesforce data space. This role also provides subject matter expertise related to the data management solutions and ensures successful project delivery. This includes helping identify and proactively manage risk areas, and ensuring issues are seen through to complete resolution as it relates to implementations. Will have the ability to configure and drive solutions to meet the customer s business and technical requirements. Additionally, this role will include helping align on the development of client-specific implementation proposals, SOWs, and staffing plans, engaging with SMEs across the organization to gain consensus on an acceptable proposal, developing best practices within the data excellence community, developing of shared assets. Responsibilities Serve as the Subject Matter Expert for Salesforce data excellence practice Recognized as a valuable and trusted advisor by our customers and other members of Salesforce community and continue to build a reputation for excellence in professional services Lead development of multi-year Data platform capabilities roadmaps for internal business units like Marketing, Sales, Services, and Finance. Facilitate enterprise information data strategy development, opportunity identification, business cases, technology adoption opportunities, operating model development, and innovation opportunities. Maximize value derived from data analytics by leveraging data assets through data exploitation, envisioning data-enabled strategies as well as enabling business outcomes through analytics, data analytics governance, and enterprise information policy. 
Translating business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses Defining the data architecture framework, standards and principles, including modeling, metadata, security, reference data such as product codes and client categories, and master data such as clients, vendors, materials, and employees Defining data flows, i.e., which parts of the organization generate data, which require data to function, how data flows are managed, and how data changes in transition Design and implement effective data solutions and models to store and retrieve data from different data sources Prepare accurate dataset, architecture, and identity mapping design for execution and management purposes. Examine and identify data structural necessities by evaluating client operations, applications, and programming. Research and properly evaluate new sources of information and new technologies to determine possible solutions and limitations in reliability or usability Assess data implementation procedures to ensure they comply with internal and external regulations. Lead or participate in the architecture governance, compliance, and security activities (architectural reviews, technology sourcing) to ensure technology solutions are consistent with the target state architecture. Partner with stakeholders early in the project lifecycle to identify business, information, technical, and security architecture issues and act as a strategic consultant throughout the technology lifecycle. Oversee the migration of data from legacy systems to new solutions. Preferred Qualifications and Skills: BA/BS degree or foreign equivalent Overall 10+ years of experience in Marketing data Data management space. 
- Minimum 1 year of hands-on, full-lifecycle CDP implementation experience on platforms such as Salesforce CDP (formerly 360 Audiences), Tealium AudienceStream, Adobe AEP, Segment, Arm Treasure Data, BlueShift, SessionM, RedPoint, etc.
- 5+ years of experience with data management, data transformation, and ETL, preferably using cloud-based tools/infrastructure
- Experience with data architecture (ideally with marketing data) using batch and/or real-time ingestion
- Relevant Salesforce experience in Sales & Service Cloud as well as Marketing Cloud; related certifications are a plus (Marketing Cloud Consultant, Administrator, Advanced Administrator, Service Cloud Consultant, Sales Cloud Consultant, etc.)
- Experience with technologies and processes for marketing, personalization, and data orchestration
- Experience with master data management (MDM), data governance, data security, data quality, and related tools desired
- Deep data integration and/or migration experience with Salesforce.com and other cloud-enabled tools
- Expertise in complex SQL statements and RDBMS systems such as Oracle, Microsoft SQL Server, and Postgres
- Experience with complex coding through ETL tools such as Informatica, SSIS, Pentaho, and Talend
- Knowledge of data governance and data privacy concepts and regulations a plus
Required Skills:
- Ability to work independently and be a self-starter
- Comfort with, and ability to, learn new technologies quickly and thoroughly
- Specializes in gathering and analyzing information related to data integration, subscriber management, and identity resolution
- Excellent analytical and problem-solving skills
- Demonstrated ability to influence a group audience, facilitate solutions, and lead discussions such as implementation methodology, road-mapping, enterprise transformation strategy, and executive-level requirement-gathering sessions
- Travel to client site (up to 50%)
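Identity resolution, mentioned in the skills above, is at heart the grouping of records that refer to the same person. A minimal deterministic sketch follows; the field names and the email-based matching rule are illustrative assumptions, not tied to any particular CDP:

```python
# Toy deterministic identity-resolution pass: merge customer records that
# share a normalized email address. Field names are hypothetical.
def normalize_email(email: str) -> str:
    """Canonicalize an email so trivially different spellings match."""
    return email.strip().lower()

def resolve_identities(records: list[dict]) -> dict[str, dict]:
    """Group records by normalized email and merge their attributes,
    letting later records fill in fields earlier ones lacked."""
    profiles: dict[str, dict] = {}
    for rec in records:
        key = normalize_email(rec["email"])
        profile = profiles.setdefault(key, {})
        for field, value in rec.items():
            if field != "email" and value is not None:
                profile.setdefault(field, value)  # first non-null value wins
    return profiles
```

Production CDPs layer probabilistic matching, survivorship rules, and consent handling on top of this kind of deterministic pass, but the core "match key, then merge" shape is the same.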
Posted 3 weeks ago
7.0 - 15.0 years
12 - 17 Lacs
Mumbai
Work from Office
Prudential's purpose is to be partners for every life and protectors for every future. Our purpose guides everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed. Prudential (UK), in partnership with the HCL Group, plans to set up a standalone Indian health insurance company to address the growing healthcare needs of the Indian consumer. This joint venture will combine Prudential's global expertise in insurance and financial services with HCL Group's experience in technology and healthcare solutions. Prudential, with its longstanding presence in India, already operates two leading businesses in life insurance and asset management with the ICICI Group. Prudential was also the proud sponsor of the 1983 Cricket World Cup, India's first World Cup victory! Prudential Health India is a Zero to One team undertaking a no-legacy, greenfield health insurance deployment in India, building journeys that truly empathize with the customer and offer a differentiated experience. To partner with us in this mission, we are looking for a talented Data Modeler to join our Experience team in Mumbai.
Lead / Data Modeler
Note: The title will depend on (1) experience, (2) expertise, and (3) performance. So the title could be: (Senior Manager) Lead Data Modeler or (Manager) Data Modeler. Deep technology role.
Experience: 7-15 years. Location: Mumbai and/or Bangalore. Work from office only.
Job Profile Summary: PHI intends to build a cloud-native, microservices-oriented, loosely coupled open technology platform, which is tightly aligned to the health insurance domain and built "expecting to be reused while anticipating change".
The PHI platform will be made up of multiple applications supporting different business functions, which are well integrated and well orchestrated. The applications could be COTS (Commercial Off-The-Shelf) vendor software, Prudential Group software capabilities, or software built by the in-house PHI engineering team. All applications need to adopt common services, platforms, architectural principles, and design patterns. The right candidate will be accountable for delivering technology artefacts to business stakeholders in the fastest time possible, with the fewest gaps and the best quality, and with clarity on how the technology and business requirements can be delivered and tested at pace. Requirement gaps, change requests, non-integrated journeys, bugs in UAT, and NFR failures would all signal poor-quality deliverables by this candidate.
Job Description:
- Deeply understand the long-term architectural direction, with emphasis on reusable components and the interactions between the various applications
- Work closely with data engineers, software engineers, data architects, solution designers, R&D, analysts, product managers, and other teams and stakeholders to achieve desired outcomes for the company, considering functionality, interoperability, performance, scalability, reliability, availability, and other applicable criteria
- Design, develop, test, and implement data models supporting mobile apps, SDKs, micro-frontends, WhatsApp, and ecosystem partners
- Establish standards for platform development; data model and open API capabilities need to be aligned to enable seamless integration across ecosystem partners
- Use data-driven insights to guide the development of programs and apps that meet user needs
- Follow, and contribute to defining, architecture and coding standards, and ensure adherence to them
- Stay on top of innovation to build world-class propositions using bleeding-edge technologies, frameworks, and best practices, bringing differentiation through technology
- Establish and enforce data modeling standards and best practices to ensure consistency, quality, and integrity of data models
- Continuously evolve and refine data models to adapt to new business requirements, technological advancements, and data sources
- Create and maintain comprehensive documentation for data models, including data dictionaries, entity-relationship diagrams, and metadata
- Optimize data models for performance, scalability, and reliability to ensure efficient data processing and retrieval
- Ensure seamless integration of data models with existing data systems, databases, and data warehouses
- Conduct thorough testing and validation of data models to ensure accuracy and consistency of data
- Provide training and support to data users and stakeholders on data modeling concepts, tools, and best practices
Who we are looking for:
Technical skills and work experience:
- MH: Proven hands-on experience in data modeling, with a strong understanding of data modeling principles, methodologies, and tools
- MH: Demonstrated ability to understand technology and architectural strategy and processes, and translate them successfully into engineering solutions
- MH: Deep expertise in data modeling tools such as ER/Studio, ERwin, or similar; strong data and SQL skills and experience with database management systems (e.g., SQL and NoSQL databases)
- MH: Should have worked on large-scale data engineering/transformation with successful implementation of data warehouses, lakes, or lakehouses leveraging cloud technologies like GCP
Personal traits:
- First and foremost, be an exceptional engineer
- The highest standards of collaboration and teamwork are critical to this role
- Strong communication skills and the ability to engage senior management on strategic plans, lead project steering committees, provide status updates, etc.
- Excellent problem-analysis skills; innovative and creative in developing solutions
- Ability and willingness to be hands-on; strong attention to detail
- Ability to work independently and handle multiple concurrent initiatives
- Excellent organizational, vendor management, negotiation, and prioritization skills
Education: Bachelor's in Computer Science, Computer Engineering, or equivalent; suitable certifications for key skills
Language: Fluent written and spoken English
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability, part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job, and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Chandigarh, Dadra & Nagar Haveli
Hybrid
We are looking for a detail-oriented and proactive Salesforce Administrator with hands-on DevOps experience and proficiency in AutoRABIT (Autora). This role will focus on administering the Salesforce platform, managing release processes, and optimizing DevOps automation to ensure secure, efficient, and high-quality deployments.
Responsibilities:
- Perform day-to-day Salesforce administration tasks, including user management, security controls, reports, dashboards, and configuration using Flows and Process Builder
- Manage sandbox environments, data migrations, and metadata deployments using AutoRABIT or similar tools (Gearset, Copado)
- Monitor and maintain DevOps pipelines for Salesforce, ensuring smooth CI/CD processes and version control using Git
- Work closely with developers, QA, and release managers to coordinate releases and manage deployment schedules
- Create and maintain documentation for administrative processes, release runbooks, and DevOps workflows
- Ensure compliance with security policies and governance frameworks across all Salesforce environments
- Assist in auditing, troubleshooting, and resolving issues with deployments and configuration changes
- Keep abreast of new Salesforce features and functionality, and provide recommendations for process improvements
Required Qualifications:
- 3-5 years of experience as a Salesforce Administrator with a strong understanding of Salesforce best practices
- Hands-on experience with DevOps tools for Salesforce, especially AutoRABIT (Autora)
- Proficiency in managing deployment processes, metadata migration, and change tracking
- Experience working with Git repositories and version control in a Salesforce context
- Strong knowledge of Salesforce platform capabilities, including Flows, permission sets, roles, profiles, and data models
- Salesforce Administrator Certification required (Advanced Admin is a plus)
- Familiarity with agile methodologies and tools like Jira, Confluence, and Slack
Preferred Skills:
- Knowledge of Apex, LWC, or SOQL (basic development understanding is a plus)
- Experience with other CI/CD tools like Copado, Gearset, Jenkins, or Azure DevOps
- Understanding of Salesforce deployment risk-mitigation strategies (backups, static code analysis, impact analysis)
- Strong communication and documentation skills
Location: Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim
Posted 3 weeks ago
0.0 - 4.0 years
2 - 6 Lacs
Chennai
Work from Office
The primary responsibility of this role is to perform various tasks related to content for video catalog quality, under general supervision. This could involve tasks such as checking and/or fixing metadata, images, subtitles, and audio and video assets to provide a seamless viewing experience to PV customers. The day-to-day job requires the individual to make judgment-based decisions by following a standard operating procedure and to perform quality checks on various devices. The associate should have working knowledge of MS Office to capture data on a daily basis. This job requires you to be in the office 5 days per week for in-person work with your teammates. This will involve tasks such as:
- Understand and adhere to the standard operating procedure
- Analyze and identify issues in the video content
- Understand the issue and make the best use of the available resources/tools to resolve or fix it
- Proactively raise issues/alarms to the manager or stakeholders that may have an impact on core deliverables or operations
- Communicate with internal and external stakeholders
- Adhere to the service-level agreement and the average handle time set for the processes
- Meet predetermined and assigned productivity targets and quality standards
About the team: Prime Video Digi-Flex's (DF) vision is to be the most customer-centric, agile, and efficient operations powering Prime Video (PV) growth worldwide. Our mission is to be the center of operational excellence for PV through agile and efficient operations at scale. We influence technology-based scaling through tooling and automation. DF is a variable operations workforce that offers quick-to-market, scalable solutions through manual execution for customer-facing and business-critical strategic initiatives.
DF creates repeatable and standardized processes to ingest, process, cleanse, enrich, classify, and match & merge partner assets, resolve customer-facing issues, and enhance the customer experience.
- Bachelor's degree
- Speak, write, and read fluently in English
- Experience with Microsoft Office products and applications
- Knowledge of Excel at an advanced level
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
We are looking for a Sr. Business Analyst to join our Go-To-Market Reporting and Analytics team. In this role, you will help build data models and reports to support the broader sales organization. The ideal candidate will have experience working with sales data, SQL scripting, developing production-quality dashboards, and managing large data sets. They should possess keen attention to detail, a creative problem-solving mindset, and strong communication skills. This individual will collaborate across functions to empower the sales operations community and leadership in making data-driven, strategic decisions.
What you get to do in this role:
- Build complex data models to perform analysis of sales data in support of various reporting initiatives
- Research required data and provide insights for ad-hoc questions from leadership
- Use BI tools to design and implement industry-standard best practices for scalable data management and processing architecture
- Work with local and remote team members to design and build data models and perform data validation, integration testing, and model support
- Develop functional subject-matter expertise within various areas of the enterprise and maintain documentation for all areas of involvement, including metadata objects for end users
- Manage and nurture relationships with key stakeholders, ensuring clear communication and alignment across all levels of the organization
To be successful in this role you have:
- Experience in leveraging, or critically thinking about how to integrate, AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry
- A minimum of 3-5 years of experience in analytics or a related field
- Excellent knowledge of SQL scripting and writing stored procedures is a must
- A good understanding of dimensional data modeling concepts
- Hands-on experience with visualization tools, particularly Power BI
- Proficiency in Power Query and DAX
- Excellent communication skills and the ability to work individually and in a broader, geographically dispersed team
- A positive, can-do attitude; highly analytical and enjoys challenges
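The dimensional-modeling knowledge asked for above centers on separating numeric facts from descriptive dimensions and slicing the former by the latter. A minimal star-schema sketch using Python's built-in sqlite3 module (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes about each sales rep.
cur.execute("CREATE TABLE dim_rep (rep_id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
# Fact table: one row per sale, keyed to the dimension by rep_id.
cur.execute("CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, rep_id INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_rep VALUES (?, ?, ?)",
                [(1, "Asha", "APAC"), (2, "Ben", "EMEA")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 500.0), (11, 1, 250.0), (12, 2, 400.0)])

# The typical star-schema query shape: aggregate the fact table,
# sliced by an attribute that lives on the dimension.
cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_rep d ON f.rep_id = d.rep_id
    GROUP BY d.region ORDER BY d.region
""")
totals = dict(cur.fetchall())
```

The same fact/dimension split is what Power BI's semantic models and DAX measures are built on, just at warehouse scale.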
Posted 3 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job Title: Group Lead - Content Operations Hub
Location: Hyderabad
About the job
Strategic context: Sanofi currently has the best and most robust pipeline of R&D and consequent new launches in our history. As a new phase of the Play-To-Win strategy, funding this pipeline and these new launches is key to materializing the miracles of science to improve people's lives. Thus, as we enter the next phase, modernization of Sanofi is required, as per the recent announcements on DRIVE, and in this respect we are in the beginning stages of organizing the Go-to-Market Capabilities (GTMC) team at the global level. The GTMC organization will help us drive best-in-class capabilities across the board and bring value and excellence to our commercial operations. This move is a key part of the aimed modernization of Sanofi and will allow us to focus on our priorities across our products, markets, and pipeline through the reallocation of resources and realizing the efficiencies of removing the silos that exist between our business units: avoiding the duplication and overlapping of resources, standardizing our processes and tools, operating with a One Sanofi approach to accelerate our key capabilities development, and fostering the entrepreneurial spirit by speeding up decision-making. As part of GTMC, the vision of the Omnichannel pillar is the definition of a Sanofi-wide, best-in-class omnichannel engagement strategy, including the development of standards and best practices across markets and brand teams, as well as executional planning and support of local omnichannel approaches (including change management). GTMC will also collaborate closely with Digital to provide consistent tools. Our Hubs are a crucial part of how we innovate, improving performance across every Sanofi department and providing a springboard for the amazing work we do. Build a career and you can be part of transforming our business while helping to change millions of lives. Ready?
As Content Operations Hub Lead within our Hyderabad Hub, you'll be responsible for leading the Content Operations team, ensuring seamless business continuity and driving strategies aligned with global priorities in content operations, GenAI, and digital presence optimization. You will manage resources, budget allocation, and vendor relationships, while overseeing content tagging and metadata management and utilizing data-driven insights to optimize performance. You will also be responsible for driving synergies with other teams within Omnichannel/GTMC. You will lead the Content Operations Hub in planning and executing market-driven campaigns, making data-driven business recommendations, and creating insightful presentations.
Main responsibilities:
The overall purpose and main responsibilities are listed below:
- Create synergies and provide functional and operational direction to the Content Operations Hub of the Omnichannel pillar
- Ensure seamless business continuity amidst capability and resource changes within content operations
- Drive Hub strategy aligned with global business priorities, focusing on content operations, GenAI, and content optimization
- Lead Hub resources to improve individuals' skills and enhance Hub services such as content creation, modular content, and technical production
- Manage budget allocation and vendor relationships crucial for content production and digital marketing tools
- Report on content performance metrics and derive actionable insights for senior leadership, ensuring strategic alignment and performance optimization
- Stay up to date with industry trends and best practices in commercial operations, standardize all tools/processes used in Omnichannel Content Operations activities deployed in the Hub, and ensure their continuous improvement through iteration and an external benchmarking approach
- Support the content transformation program supporting the Glocal co-creation teams
- Be a strategic advisor for Omnichannel Content Operations capabilities execution; have a robust plan and implement concrete moves towards best-in-class capabilities
- Mentor the team, ensure knowledge sharing across the team and company, and provide global and local Content Operations teams with best practices and a feedback loop on processes
People: (1) Lead the team of writers in content creation and the corresponding content enhancement, graphic design, and operations support teams; (2) Coach and develop the team on content, process, agile methodologies, thoughtful risk-taking, automation, and innovation (including GenAI); (3) Maintain effective relationships with stakeholders within the allocated GTMC pillar and cross-pillars, with the end objective of developing content as per requirements; (4) Interact effectively with healthcare professionals as relevant; (5) Partner with the team to strengthen capabilities and support individual development plans; (6) Collaborate with cross-functional teams in GTMC to build digital transformation and bring innovative digital solutions; (7) Provide proactive recommendations on improving the scientific content of deliverables and play an active role in following best practices in relation to processes, communications, project management, documentation, and technical requirements
Performance: (1) Provide strategic support across GTMC pillars; (2) Lead and support the development of tools, technology, and processes to constantly improve quality and productivity; (3) Ensure the Content Operations team provides content as per agreed timelines and quality; (4) Coach the team to become subject-matter, process, and technological experts; (5) Recommend, lead, and implement tactical process improvements within the department and division-wide
Process: (1) Support the delivery of projects in terms of resourcing, tools, technology, quality, timeliness, efficiency, and high technical standards for deliveries made by the Content Operations Hub; (2) Contribute to overall quality enhancement by ensuring high scientific standards for the output produced by the Hub; (3) Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable regulatory standards; (4) Facilitate the development of complex scientific content (branded/unbranded); (5) Help build the talent pool, capabilities, and Omnichannel content experts across GBUs and therapeutic areas; (6) Conduct comprehensive content-need analysis; (7) Implement the content plan and associated activities identified for the pillar for the year; (8) Work with selected vendors within the region to deliver the required deliverables as per the defined process; (9) Leverage advanced training delivery tools and techniques, thereby enhancing the effectiveness of training delivery; (10) Design an overall plan of action based on end-user feedback and improve course content and delivery
Stakeholder: (1) Work closely with GTMC/Omnichannel pillars (Global, Local, and Hub) to identify content needs and assist in developing assigned deliverables; (2) Liaise with Omnichannel, GBTs, AoR, and LexMex to provide relevant and customized deliverables
About you
Experience: 8-10 years of experience in content creation/optimization and leadership experience (building up teams is preferred; GCC experience; including up to 2 years of experience leading a multi-layered, diverse team of 10 members) in the medico-marketing, medical, commercial, or omnichannel domain for the pharmaceutical/healthcare industry or digital platforms; ability to influence and negotiate; familiarity with Veeva CRM tools and Veeva PromoMats (Veeva DAM, knowledge of the content approval process, promotional and non-promotional materials); GenAI experience or interest (desirable, not mandatory)
Soft and technical skills: Stakeholder management; proficiency in written and oral communication; people management and the ability to lead diverse teams; strong organizational and time management skills; ability to work independently and within a team environment; as applicable, therapeutic area / domain knowledge exposure (proficient in multiple TAs/domains/GBUs), scientific communications/writing, and/or project management
Education: Advanced degree in life sciences, pharmacy, or a similar discipline, or a medical degree
Languages: Excellent knowledge of English (spoken and written)
Why choose us?
- Bring the miracles of science to life alongside a supportive, future-focused team
- Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or a lateral move, at home or internationally
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks of gender-neutral parental leave
- Play a key role in shaping and optimizing our content strategy, driving business growth and achieving impactful results
At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
Posted 3 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Data Quality:
- Define and measure data quality metrics: establish metrics for accuracy, completeness, validity, consistency, timeliness, and reliability
- Continuous monitoring and remediation: regularly monitor data quality, conduct audits, perform root-cause analysis for recurring data issues, and implement preventive measures and remediation plans
- Data profiling: develop and maintain comprehensive data profiles to understand data characteristics
- Data validation: create and implement validation rules to ensure that incoming data conforms to expected formats and values
- Data cleansing: design and execute data cleansing processes to correct errors and inconsistencies, enhancing overall data quality and reliability
Data Governance:
- Establish a governance framework: implement and enforce data governance practices to ensure compliance with regulatory requirements and corporate policies, ensuring data is managed according to best practices
- Metadata management: develop and maintain a comprehensive metadata repository to document data definitions, lineage, and usage, ensuring it is kept up to date and accessible to end users
- Understand user needs: collaborate with business users to identify data needs, pain points, and requirements, ensuring the data is fit for its intended use
- Identify improvement areas: continuously seek opportunities for process improvement in data governance and quality management
- User roles and access requirements: understand user roles and access requirements for systems, so that similar protection can be implemented in the analytical solutions
- Row-level security: work with the data & analytics team to establish row-level security for analytical solutions, ensuring data is accessible only to authorised users
Continuous Improvement:
- Establish naming conventions: define business-friendly table and column names, along with synonyms, to ensure data is easily accessible using AI
- Create synonyms: implement synonyms to simplify data access and enhance data readability
- Establish KPIs for data governance and quality efforts, and create regular reports for stakeholders to track progress and demonstrate the value of data governance initiatives
- Establish a feedback loop where users can report data quality issues and suggest improvements
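The data-quality duties above (completeness and validity metrics, validation rules, quality scoring) can be sketched as a small rule engine. The field names, formats, and rules below are illustrative assumptions rather than part of the role definition:

```python
import re

# Hypothetical per-field validity rules; each returns True when the value
# conforms to the expected format.
RULES = {
    "employee_id": lambda v: bool(re.fullmatch(r"E\d{6}", str(v))),
    "hire_date":   lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", str(v))),
    "salary":      lambda v: isinstance(v, (int, float)) and v > 0,
}

def validate(record: dict) -> list[str]:
    """Return the names of fields that fail a completeness or validity check."""
    failures = []
    for field, rule in RULES.items():
        value = record.get(field)
        if value is None:          # completeness check
            failures.append(f"{field}: missing")
        elif not rule(value):      # validity check
            failures.append(f"{field}: invalid value {value!r}")
    return failures

def quality_score(records: list[dict]) -> float:
    """Fraction of records passing all rules: one simple quality metric
    that could be tracked over time for monitoring and KPI reporting."""
    passing = sum(1 for r in records if not validate(r))
    return passing / len(records) if records else 1.0
```

In practice the same idea is usually delivered through a dedicated framework (e.g. Great Expectations or dbt tests) rather than hand-rolled rules, but the metric-per-rule structure carries over.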
Posted 3 weeks ago
Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.
The top hiring locations are India's major tech hubs; these cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.
The average salary range for metadata professionals in India varies based on experience level: - Entry-level: ₹3-6 lakhs per annum - Mid-level: ₹6-12 lakhs per annum - Experienced: ₹12-20 lakhs per annum
Salaries may vary based on the company, location, and specific job responsibilities.
In the metadata field, a career typically progresses as follows: - Metadata Analyst - Metadata Specialist - Metadata Manager - Metadata Architect
As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.
In addition to metadata management, professionals in this field are often expected to have skills in: - Data analysis - Database management - Data modeling - Information governance
Having a combination of these skills can make job seekers more attractive to potential employers.
As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!