
946 Metadata Jobs - Page 7


5.0 - 10.0 years

15 - 20 Lacs

Pune

Work from Office

Job Description: This role is for a motivated and curious Data Architect / Data Engineer joining the Group Architecture team. It is a hands-on role focused on developing tools, prototypes, and reference solutions that support enterprise data architecture standards. The successful candidate will work with senior architects and engineers to enable the adoption of best practices across data platforms, pipelines, and domains, helping to ensure scalable, secure, and consistent data delivery across the organization.

Group Architecture sets the strategic direction for technology architecture across the enterprise. The team partners with all business divisions to define architecture principles and standards, evaluate emerging technologies, and guide implementation through hands-on support, tooling, and governance.

Responsibilities:
- Design and develop lightweight tools, scripts, and utilities that support the implementation and adoption of data architecture standards (e.g., metadata enrichment, model validation, lineage capture, standards-compliance checks).
- Contribute to reference implementations and prototypes demonstrating approved data architecture patterns.
- Support the creation and enhancement of data pipelines, APIs, and other data integration components across various platforms.
- Assist in evaluating and testing new tools, frameworks, or services for potential use in the data architecture landscape.
- Collaborate with senior architects, engineers, and business stakeholders to gather requirements and deliver technical solutions that meet enterprise standards.
- Prepare and maintain documentation, dashboards, and visual materials to communicate technical concepts and track adoption of architecture standards.
- Participate in architecture review forums and support data governance processes as needed.

Skills:
- Foundational experience in data engineering or software development, with the ability to write clean, maintainable code in Python, SQL, or other languages.
- Exposure to cloud platforms (such as GCP, AWS, or Azure) and experience with relevant data services and APIs.
- Interest in or experience developing internal tools or automation scripts to improve engineering workflows.
- Familiarity with concepts such as data lineage, metadata, data quality, or governance is a plus.
- Understanding of basic architecture principles and willingness to apply them in practical solution design.
- Ability to work collaboratively in a cross-functional team, take initiative, and communicate effectively with technical and non-technical stakeholders.
- Exposure to business intelligence tools like Looker, Tableau, or similar.
- Understanding of data modeling, even at a high level, is beneficial but not a core focus.
- Experience with Git, CI/CD, or cloud-native development practices.

Well-being & Benefits:
- Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health.
- Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness.
- A professional, passionate, and fun workplace with flexible work-from-home options.
- A modern office with fun and relaxing areas to boost creativity.
- Continuous learning culture with coaching and support from team experts.
- Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive.
- Private healthcare and life insurance with premium benefits for you and discounts for your loved ones.
- Socially connected: we strongly believe in collaboration, inclusion, and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing.
- Kids@TheOffice: support for unexpected events requiring you to care for your kids during work hours.
- Retailer discounts, cultural and CSR activities, employee sports clubs, workshops, and more.
- Financially secure: we support you in meeting personal financial goals during your active career and for the future.
- Competitive income, performance-based promotions, and a sense of purpose.
- 24 days holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).
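The standards-compliance tooling this role describes can be sketched in a few lines of Python. This is a minimal, hypothetical example: the required fields and allowed classifications are illustrative assumptions, not any organisation's actual standard.

```python
# Minimal sketch of a metadata standards-compliance check.
# REQUIRED_FIELDS and ALLOWED_CLASSIFICATIONS are illustrative
# assumptions, not a real enterprise standard.

REQUIRED_FIELDS = {"owner", "classification", "description"}
ALLOWED_CLASSIFICATIONS = {"public", "internal", "confidential"}

def check_dataset_metadata(metadata: dict) -> list[str]:
    """Return a list of compliance violations for one dataset's metadata."""
    violations = []
    for field in sorted(REQUIRED_FIELDS - metadata.keys()):
        violations.append(f"missing required field: {field}")
    cls = metadata.get("classification")
    if cls is not None and cls not in ALLOWED_CLASSIFICATIONS:
        violations.append(f"unknown classification: {cls}")
    return violations

# Example run over a small, made-up catalogue
catalogue = {
    "sales.orders": {"owner": "team-sales", "classification": "internal",
                     "description": "Order fact table"},
    "hr.salaries": {"owner": "team-hr", "classification": "top-secret"},
}
report = {name: check_dataset_metadata(md) for name, md in catalogue.items()}
```

A real implementation would read metadata from a catalogue service rather than a dict, but the shape of the check is the same.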

Posted 1 week ago

Apply

1.0 - 5.0 years

10 - 11 Lacs

Jaipur

Work from Office

Data Engineer + AI

Job Summary: We are looking for a skilled and versatile Data Engineer with expertise in PySpark, Apache Spark, and Databricks, along with experience in analytics, data modeling, and Generative AI/Agentic AI solutions. This role is ideal for someone who thrives at the intersection of data engineering, AI systems, and business insights, contributing to high-impact programs with clients.

Required Skills & Experience:
- Advanced proficiency in PySpark, Apache Spark, and Databricks for batch and streaming data pipelines.
- Strong experience with SQL for data analysis, transformation, and modeling.
- Expertise in data visualization and dashboarding tools (Power BI, Tableau, Looker).
- Solid understanding of data warehouse design, relational databases (PostgreSQL, Snowflake, SQL Server), and data lakehouse architectures.
- Exposure to Generative AI, RAG, embedding models, and vector databases (e.g., FAISS, Pinecone, ChromaDB).
- Experience with Agentic AI frameworks: LangChain, Haystack, CrewAI, or similar.
- Familiarity with cloud services for data and AI (Azure, AWS, or GCP).
- Excellent problem-solving and collaboration skills with an ability to bridge engineering and business needs.

Preferred Skills:
- Experience with MLflow, Delta Live Tables, or other Databricks-native AI tools.
- Understanding of prompt engineering, LLM deployment, and multi-agent orchestration.
- Knowledge of CI/CD, Git, Docker, and DevOps pipelines.
- Awareness of Responsible AI, data privacy regulations, and enterprise data compliance.
- Background in consulting, enterprise analytics, or AI/ML product development.

Key Responsibilities:
- Design, build, and optimize distributed data pipelines using PySpark, Apache Spark, and Databricks to support both analytics and AI workloads.
- Support RAG pipelines, embedding generation, and data pre-processing for LLM applications.
- Create and maintain interactive dashboards and BI reports using Power BI, Tableau, or Looker for business stakeholders and consultants.
- Conduct ad hoc data analysis to drive data-driven decision-making and enable rapid insight generation.
- Develop and maintain robust data warehouse schemas, star/snowflake models, and support data lake architecture.
- Integrate with and support LLM agent frameworks such as LangChain, LlamaIndex, Haystack, or CrewAI for intelligent workflow automation.
- Ensure data pipeline monitoring, cost optimization, and scalability in cloud environments (Azure/AWS/GCP).
- Collaborate with cross-functional teams including AI scientists, analysts, and business teams to drive use-case delivery.
- Maintain strong data governance, lineage, and metadata management practices using tools like Azure Purview or DataHub.
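The retrieval step of the RAG pipelines mentioned above can be illustrated with a self-contained toy: chunk documents, then rank chunks against a query. In production this uses an embedding model and a vector store (FAISS, Pinecone, ChromaDB); plain token overlap is used here only as a stand-in so the sketch runs without dependencies.

```python
# Toy sketch of RAG retrieval: chunk documents, rank chunks by a
# crude token-overlap score. Token overlap is a stand-in for real
# embedding similarity; all data below is made up.

def chunk(text: str, size: int = 8) -> list[str]:
    """Split text into fixed-size word windows."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(query: str, passage: str) -> float:
    """Fraction of query tokens that appear in the passage."""
    q, p = set(query.lower().split()), set(passage.lower().split())
    return len(q & p) / len(q) if q else 0.0

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    chunks = [c for d in docs for c in chunk(d)]
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:top_k]

docs = [
    "Databricks runs Apache Spark clusters for batch and streaming jobs",
    "Star and snowflake schemas organise warehouse fact and dimension tables",
]
hits = retrieve("how does spark handle streaming jobs", docs, top_k=1)
```

Swapping `score` for cosine similarity over embedding vectors turns this into the usual dense-retrieval pattern.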

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

As a Senior Software Engineer on the Cloud-Lake team, you will play a critical role in driving Uber's batch data infrastructure to the cloud. You'll be responsible for building scalable, reliable systems that automate dataset replication, orchestrate workload migrations, and ensure data integrity and performance across hybrid environments. You will collaborate with infra, platform, and product teams to migrate hundreds of PBs of data and thousands of pipelines, minimizing customer impact and ensuring strong observability and resilience during the transition. This role is central to delivering on Uber's long-term cost, performance, and scalability goals.

What the Candidate Will Do:
- Lead design and development of critical migration components such as dataset replication, workload redirection, and metadata reconciliation.
- Own key modules such as state tracking, observability tooling, rollback workflows, or migration planners.
- Collaborate with infra, data platform, and product teams to define migration strategies, create scalable solutions, and align on delivery timelines.
- Proactively identify gaps in current migration tooling, propose improvements, and drive execution.
- Work closely with stakeholders to ensure seamless migration of workloads, accurate lineage mapping, and minimal customer disruption.
- Take ownership of production reliability, implement alerting for silent failures, and drive initiatives for automatic anomaly detection.
- Represent the team in architecture reviews, technical deep-dives, and operational postmortems.

Basic Qualifications:
- 8+ years of software engineering experience, including backend development in Java, Go, or Python.
- Strong understanding of distributed systems, data processing frameworks (e.g., Spark, Hive, Presto), and cloud-native services (e.g., GCS, S3, BigQuery).
- Proven experience designing and operating fault-tolerant, scalable systems in production.
- Proficiency with batch job orchestration tools (e.g., Airflow, Piper) and monitoring/observability best practices.
- Experience working with large-scale data systems, including large-scale upgrades, storage optimisations, and handling consistency/availability challenges.
- Strong debugging skills, an ownership mindset, and the ability to work across team boundaries.

Preferred Qualifications:
- Bachelor's (or Master's) degree in Computer Science.
- Experience leading projects that span multiple teams and domains.
- Prior exposure to cloud migration initiatives or hybrid cloud/on-prem transitions.
- Knowledge of metadata management, data lineage, and data governance systems.
- Experience building internal platforms or tooling to improve engineering productivity and reduce operational burden.
- Strong communication skills and a history of mentoring or guiding junior engineers.

*Accommodations may be available based on religious and/or medical conditions, or as required by applicable law.
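The "state tracking" and "rollback workflows" modules this posting mentions can be sketched as a small state machine over datasets. The states and transitions below are illustrative assumptions for the sake of the example, not Uber's actual design.

```python
# Illustrative migration state machine: datasets move forward through
# fixed states, and anything else must go through rollback(). The
# state names and transitions are assumptions, not a real system.

from enum import Enum

class State(Enum):
    PENDING = "pending"
    REPLICATING = "replicating"
    VERIFIED = "verified"
    CUT_OVER = "cut_over"

# Allowed forward transitions per state.
TRANSITIONS = {
    State.PENDING: {State.REPLICATING},
    State.REPLICATING: {State.VERIFIED},
    State.VERIFIED: {State.CUT_OVER},
    State.CUT_OVER: set(),
}

class MigrationTracker:
    def __init__(self):
        self.datasets: dict[str, State] = {}

    def advance(self, name: str, target: State) -> None:
        current = self.datasets.get(name, State.PENDING)
        if target not in TRANSITIONS[current]:
            raise ValueError(f"illegal transition {current} -> {target}")
        self.datasets[name] = target

    def rollback(self, name: str) -> None:
        # e.g. after a failed verification, restart replication from scratch
        self.datasets[name] = State.PENDING

tracker = MigrationTracker()
tracker.advance("trips.daily", State.REPLICATING)
tracker.advance("trips.daily", State.VERIFIED)
tracker.rollback("trips.daily")
```

Making illegal transitions raise (rather than silently succeed) is what lets observability tooling catch silent failures early.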

Posted 1 week ago

Apply

5.0 - 10.0 years

50 - 55 Lacs

Bengaluru

Work from Office

You have the opportunity to unleash your full potential at a world-renowned company and take the lead in shaping the future of technology. As a Senior Manager of Data Engineering at JPMorgan Chase within Asset and Wealth Management, you serve in a leadership role by providing technical coaching and advisory for multiple technical teams, as well as anticipating the needs and potential dependencies of other data users within the firm. As an expert in your field, your insights influence budget and technical considerations to advance operational efficiencies and functionalities.

Job responsibilities:
- Architects the design of complex data solutions that meet diverse business needs and customer requirements, and guides the evolution of logical and physical data models to support emerging business use cases and technological advancements.
- Builds and manages end-to-end cloud-native data pipelines in AWS, leveraging hands-on expertise with AWS components and analytical systems from the ground up, providing architectural direction, translating business issues into specific requirements, and identifying appropriate data to support solutions.
- Works across the Service Delivery Lifecycle on engineering major/minor enhancements and ongoing maintenance of existing applications, and conducts feasibility studies, capacity planning, and process redesign/re-engineering of complex integration solutions.
- Helps others build code to extract raw data, coaches the team on techniques to validate its quality, and applies deep data knowledge to ensure the correct data is ingested across the pipeline.
- Guides the development of data tools used to transform, manage, and access data, and advises the team on writing and validating code to test the storage and availability of data platforms for resilience.
- Oversees the implementation of performance monitoring protocols across data pipelines and data accessibility within assigned pipelines, coaches the team on building visualizations and aggregations to monitor pipeline health, and implements solutions and self-healing processes that minimize points of failure across multiple product features.
- Prepares team members for meetings with appropriate stakeholders across teams, addresses concerns around data requirements by providing guidance on feature estimation, and leverages expertise to mentor and enhance team capabilities.
- Collects, refines, and transforms data accurately from diverse sources using advanced SQL queries and Alteryx expertise.
- Designs, develops, and manages dynamic data visualization solutions like Tableau and ThoughtSpot, providing actionable insights for informed decision-making.
- Publishes and manages dashboards and reports with optimized scheduling, addressing data discrepancies and performance issues proactively.
- Defines critical data scope within products, documenting, classifying, and enriching data with comprehensive metadata for effective use.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience, including 2+ years of experience leading technologists to manage and solve complex technical items within your domain of expertise.
- Extensive experience in managing the full lifecycle of data, from collection and storage to analysis and reporting.
- Experienced in SQL, with the ability to design and optimize complex queries and database structures.
- Deep understanding of NoSQL databases and their strategic applications within the industry.
- Proven track record in statistical data analysis and the ability to derive actionable insights from complex data sets.
- Experience in leading large-scale data engineering projects and implementing custom solutions to meet business objectives.
- Experience in analytics/business intelligence to deliver data-driven insights and strategic solutions, with mandatory hands-on expertise in Alteryx, SQL, and Tableau for advanced analytics, complex data manipulations, and crafting advanced data visualizations.
- Demonstrated ability to build and manage cloud-native data pipelines in AWS, with hands-on knowledge of AWS components.

Preferred qualifications, capabilities, and skills:
- Proficient in Python to effectively meet future and evolving data needs, while adeptly tackling complex data logic challenges and designing sophisticated workflows for problem-solving.
- Drives projects efficiently using extensive experience with tools like JIRA and Confluence, demonstrating agility and adaptability to transition swiftly between projects and meet evolving demands.
- Exceptional written and verbal communication skills to articulate complex ideas clearly and persuasively to diverse audiences, with assertive communication to set and manage stakeholder expectations under tight deadlines.
- Extensive experience with major cloud platforms such as AWS, Azure, and Snowflake.
- Proficiency in ETL tools, including PySpark, Snowflake, and other data processing frameworks.
- Strong understanding of Data Mesh, data modeling, and domain-driven design principles.
- Experience with version control systems and tools, including Git, GitHub, GitLab, and Bitbucket.

Posted 1 week ago

Apply

10.0 - 15.0 years

50 - 55 Lacs

Hyderabad

Work from Office

As a Senior Manager, you'll lead a team of talented engineers in designing and building trusted, scalable systems that capture, process, and surface rich product signals for use across analytics, AI/ML, and customer-facing features. You'll guide architectural decisions, drive cross-functional alignment, and shape strategy around semantic layers, knowledge graphs, and metrics frameworks that help teams publish and consume meaningful insights with ease. We're looking for a strategic, systems-minded leader who thrives in ambiguity, excels at cross-org collaboration, and has a strong technical foundation to drive business and product impact.

What You'll Do:
- Lead and grow a high-performing engineering team focused on batch and streaming data pipelines using technologies like Spark, Trino, Flink, and DBT.
- Define and drive the vision for intuitive, scalable metrics frameworks and a robust semantic signal layer.
- Partner closely with product, analytics, and engineering stakeholders to align schemas, models, and data usage patterns across the org.
- Set engineering direction and best practices for building reliable, observable, and testable data systems.
- Mentor and guide engineers in both technical execution and career development.
- Contribute to long-term strategy around data governance, AI-readiness, and intelligent system design.
- Serve as a thought leader and connector across domains to ensure data products deliver clear, trusted value.

What We're Looking For:
- 10+ years of experience in data engineering or backend systems, with at least 2+ years in technical leadership or management roles.
- Strong hands-on technical background, with deep experience in big data frameworks (e.g., Spark, Trino/Presto, DBT).
- Familiarity with streaming technologies such as Flink or Kafka.
- Solid understanding of semantic layers, data modeling, and metrics systems.
- Proven success leading teams that build data products or platforms at scale.
- Experience with cloud infrastructure (especially AWS S3, EMR, ECS, IAM).
- Exposure to modern metadata platforms, Snowflake, or knowledge graphs is a plus.
- Excellent communication and stakeholder management skills.
- A strategic, pragmatic thinker who is comfortable making high-impact decisions amid complexity.

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We are currently seeking a Bid Administrator to join our growing team. This is an opportunity to be involved in large global projects that have huge positive impacts on the way the world produces energy for the future. The bid teams are involved from the concept stage of many of these projects and are part of the team that delivers them all over the world.

In this role, you'll be expected to:
- Manage messages through our team inbox, including portal notifications.
- Run a weekly scan of new opportunities.
- Manage our portal registrations.
- Comply with company systems, practices, and guidelines.
- Provide reports based on data in our CRM system.
- Enter and update opportunity information in our CRM system.
- Respond to Expressions of Interest and Pre-qualification Questionnaires using standard company information.
- Raise requests including due diligence and credit checks.
- Support the preparation and production of bids, including:
  - Maintaining CRM data
  - Identifying and coordinating various internal approvals required
  - Preparing and formatting the proposal templates
  - Coordinating inputs from the nominated parties
  - Modifying and formatting CVs and case studies
  - Submitting bids through portals
- Maintain and update knowledge repositories, databases, and catalogues.
- Apply appropriate metadata, tags, and categories to information.
- Perform quality assurance and validation.
- Identify and address knowledge gaps.
- Collaborate with internal stakeholders to acquire and share knowledge.
- Arrange internal and external meetings.
- Prepare minutes of meetings.
- Support with general ad hoc administrative requests.

Candidate requirements:
- Ability to manage multiple tasks concurrently.
- Fluent written and spoken English.
- Self-directed; takes initiative and ownership.
- Deadline-oriented with good attention to detail.
- Ability to use computer-based technology and internal software.
- Ability to handle large volumes of information and communications.
- Ability to work under pressure and be flexible.
- Good communication skills with all levels of employees.
- Interested in the variety that working in a large consultancy provides.

Preferred:
- Experience of working with CRM.
- Experience of working on proposals, preferably with global technical consultants, will be an advantage.

We can offer (subject to Company's policy):
- Agile and safe working environment
- Competitive annual leave and sick leave
- Group incentive scheme
- Group term life insurance, workmen's compensation, and group medical insurance coverage
- Short- and long-term global employment opportunities
- Global collaboration and knowledge sharing
- Digital innovation and transformation

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

You will build a knowledge library aligned with the structure of the Energy WIN Knowledge Hub, adding metadata as selected by the wider team. You will be responsible for collecting, storing, maintaining, reviewing, and archiving information and data assets. You will ensure that information is relevant, up to date, and accessible for users.

Main Duties of the Bid Assistant:
Performing various tasks related to the knowledge management cycle, including:
- Creating and updating knowledge repositories, databases, and catalogues.
- Applying appropriate metadata, tags, and categories to information.
- Performing quality assurance and validation.
- Providing support and guidance on how to access and use the WIN site.
- Identifying and addressing knowledge gaps.
- Collaborating with internal stakeholders to acquire and share knowledge.

Bid Assistant Skillsets:
A bid assistant should have the following skills:
- Experience working with knowledge management systems, tools, and platforms such as SharePoint.
- Proficiency in using various data formats, standards, and protocols.
- Strong research, analytical, and problem-solving skills.
- Excellent communication, collaboration, and presentation skills.
- Attention to detail, accuracy, and consistency.
- Creativity, curiosity, and a passion for learning and sharing knowledge.

We can offer (subject to Company's policy):
- Agile and safe working environment
- Competitive annual leave and sick leave
- Group incentive scheme
- Group term life insurance, workmen's compensation, and group medical insurance coverage
- Short- and long-term global employment opportunities
- Global collaboration and knowledge sharing
- Digital innovation and transformation

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Azure Data Engineer
Job Location: India

Job Description:

Must-Have Skills:
- Microservices Architecture & Async Messaging: understanding of event-driven design; experience designing loosely coupled microservices; familiarity with event sourcing and message-driven workflows.
- Microsoft Purview APIs & Data Governance: experience integrating with Microsoft Purview REST APIs; understanding of metadata management and data cataloguing.
- Power BI Reporting: experience developing Power BI reports using Power BI Desktop; managing data source connections for the Power BI VNet data gateway; managing access to Power BI workspaces and reports.
- Containerization & Azure Container Apps: proficiency with Docker; experience deploying and maintaining Azure Container Apps; familiarity with container orchestration and scaling strategies.
- Azure Cosmos DB & NoSQL Databases: experience with the Cosmos DB SQL API and query optimizations.
- Azure Storage & Scalable Data Storage: experience designing efficient Table Storage schemas and managing blob files; understanding of PartitionKey/RowKey indexing for fast lookups.
- Monitoring & Observability with Azure Monitor: experience setting up Azure Monitor, Application Insights, and Log Analytics; ability to analyse logs, track performance metrics, and set up alerts; understanding of distributed tracing for microservices.
- DevOps & CI/CD Pipelines: experience with Azure DevOps for CI/CD; knowledge of container registry management (ACR).

An awareness of Azure Logic Apps would also be very beneficial.
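The PartitionKey/RowKey point on Table Storage is worth unpacking: entities are addressed by the pair (PartitionKey, RowKey), so a point read is a direct key lookup rather than a scan. The in-memory sketch below mimics that layout for illustration only; it is a teaching aid, not the Azure SDK, and the key values are made up.

```python
# Teaching sketch of Table Storage addressing: a dict of partitions,
# each mapping RowKey -> entity. Point reads avoid scanning other
# partitions entirely. Not the Azure SDK; all data is illustrative.

table: dict[str, dict[str, dict]] = {}

def upsert(partition_key: str, row_key: str, entity: dict) -> None:
    table.setdefault(partition_key, {})[row_key] = entity

def point_read(partition_key: str, row_key: str):
    # Average O(1): two hash lookups, no cross-partition scan.
    return table.get(partition_key, {}).get(row_key)

# Common schema pattern: partition by a natural grouping (e.g. region),
# with a row key unique within the partition (e.g. order id).
upsert("region-eu", "order-1001", {"amount": 42})
upsert("region-us", "order-2001", {"amount": 7})
```

Queries that supply only a PartitionKey still narrow to one partition; queries that supply neither key degrade to a full scan, which is why schema design leans on choosing these two keys well.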

Posted 1 week ago

Apply

3.0 - 9.0 years

12 - 16 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

66degrees is a leading Google Cloud Premier Partner. We believe that great engineering takes heart. Focusing exclusively on Google Cloud, we guide our clients on their digital transformation journey to deliver the most innovative and disruptive projects in the industry. 66degrees is dedicated to providing our employees with a challenging and exciting work environment without forgetting to have some fun along the way. Success is predictable.

Role Description:
Own end-to-end design of modern data platforms on Microsoft Azure. Provide architectural leadership with hands-on engineering skills, guiding the data engineering team to build a secure, scalable data platform consisting of a data lake, data lakehouse, or data warehouse. Deliver raw data into analytics-ready assets. Act as liaison between business and technology stakeholders (Cloud Infrastructure, App Development, Security and Compliance) to define data strategy, standards, and governance while optimising cost, performance, and compliance across the Azure ecosystem.

Responsibilities:
- Design and document data architectures (data lake, warehouse, lakehouse, MDM, streaming) on Azure Synapse Analytics, Data Lake Storage Gen2, Microsoft Fabric, and Cosmos DB.
- Lead migration of on-prem workloads to Azure with appropriate IaaS, PaaS, or SaaS solutions and right-sizing for cost and performance.
- Guide development of data pipelines using Azure Data Factory, Synapse Pipelines, and dbt, ensuring orchestration, monitoring, and CI/CD via Azure DevOps.
- Model conceptual, logical, and physical data structures; enforce naming standards, data lineage, and master-data management practices.
- Implement robust security (RBAC, managed identities, Key Vault), data privacy, and regulatory controls such as GDPR or HIPAA.
- Define data governance policies, metadata management, and catalogue strategies using Microsoft Purview or equivalent tools.
- Provide technical leadership to data engineers, analysts, and BI developers; lead code/design review meetings and mentor on Azure best practices.
- Collaborate with enterprise architects, product owners, and business SMEs to translate analytical use cases into scalable cloud data designs and a feature roadmap.
- Establish patterns to monitor platform health and automate cost optimisation and capacity planning via Azure features.

Posted 1 week ago

Apply

2.0 - 8.0 years

4 - 10 Lacs

Hyderabad

Work from Office

- Curate and manage datasets to ensure they meet quality and compliance standards.
- Develop and implement data governance policies and procedures.
- Conduct data profiling and quality assessments to identify discrepancies and improve data quality.
- Collaborate with data producers and consumers to understand data needs and requirements.
- Create and maintain metadata documentation to enhance data discoverability and usability.
- Monitor and evaluate data curation processes and recommend improvements.
- Support data integration and data migration projects as needed.
- Stay updated on best practices in data curation, management, and governance.

Skillsets:
- Experience in using, testing, and modifying APIs.
- Experience with JIRA and Salesforce.
- Proficiency in data manipulation tools and programming languages (e.g., PL/SQL, Python).
- Strong communication skills and the ability to work collaboratively with cross-functional teams.
- Experience with data visualization using Power BI.
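The data-profiling duty above reduces to computing per-column statistics and flagging discrepancies. Here is a minimal, dependency-free sketch; the column names and records are made up for illustration.

```python
# Minimal data-profiling sketch: per-column null counts and distinct
# non-null value counts over a list of records. Sample data is made up.

def profile(rows: list[dict]) -> dict[str, dict]:
    """Return {column: {"nulls": n, "distinct": m}} for a record set."""
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "a@example.com"},
]
stats = profile(rows)
```

A quality assessment would then compare these numbers against expectations (e.g. an `id` column should have zero nulls and all-distinct values) and report anything that deviates.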

Posted 1 week ago

Apply

0.0 - 4.0 years

2 - 5 Lacs

Ghaziabad, New Delhi, Pune

Work from Office

- On-Page & Off-Page SEO: optimize content, metadata, and internal linking; build quality backlinks.
- Keyword & Competitor Research: conduct keyword research and competitor analysis.
- Performance Monitoring: use tools like Google Analytics to track website rankings and traffic.
- Technical SEO: perform website audits and resolve SEO issues.
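A slice of the on-page metadata work above can be automated with a simple audit script. The length thresholds used below (title up to 60 characters, meta description 50-160) are rough industry rules of thumb, not published limits, and the sample page data is made up.

```python
# Tiny on-page metadata audit: flag titles and meta descriptions
# outside commonly cited length ranges. Thresholds are rules of
# thumb, not official limits; sample inputs are illustrative.

def audit_page(title: str, description: str) -> list[str]:
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append("title longer than 60 characters")
    if len(description) < 50:
        issues.append("meta description shorter than 50 characters")
    elif len(description) > 160:
        issues.append("meta description longer than 160 characters")
    return issues

issues = audit_page(
    title="Metadata Jobs in India",
    description="Too short",
)
```

Run over a sitemap's worth of pages, a report like this gives the audit step of a technical SEO review a concrete starting point.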

Posted 1 week ago

Apply

10.0 - 15.0 years

14 - 19 Lacs

Hyderabad

Work from Office

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category: Software Engineering

About Salesforce: We're Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And we empower you to be a Trailblazer, too, driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change, and in companies doing well and doing good, you've come to the right place.

We're building a product data platform to bring Salesforce's product signals into the agentic era, powering smarter, adaptive, and self-optimizing product experiences. As a Senior Manager, you'll lead a team of talented engineers in designing and building trusted, scalable systems that capture, process, and surface rich product signals for use across analytics, AI/ML, and customer-facing features. You'll guide architectural decisions, drive cross-functional alignment, and shape strategy around semantic layers, knowledge graphs, and metrics frameworks that help teams publish and consume meaningful insights with ease. We're looking for a strategic, systems-minded leader who thrives in ambiguity, excels at cross-org collaboration, and has a strong technical foundation to drive business and product impact.

What You'll Do:
- Lead and grow a high-performing engineering team focused on batch and streaming data pipelines using technologies like Spark, Trino, Flink, and DBT.
- Define and drive the vision for intuitive, scalable metrics frameworks and a robust semantic signal layer.
- Partner closely with product, analytics, and engineering stakeholders to align schemas, models, and data usage patterns across the org.
- Set engineering direction and best practices for building reliable, observable, and testable data systems.
- Mentor and guide engineers in both technical execution and career development.
- Contribute to long-term strategy around data governance, AI-readiness, and intelligent system design.
- Serve as a thought leader and connector across domains to ensure data products deliver clear, trusted value.

What We're Looking For:
- 10+ years of experience in data engineering or backend systems, with at least 2+ years in technical leadership or management roles.
- Strong hands-on technical background, with deep experience in big data frameworks (e.g., Spark, Trino/Presto, DBT).
- Familiarity with streaming technologies such as Flink or Kafka.
- Solid understanding of semantic layers, data modeling, and metrics systems.
- Proven success leading teams that build data products or platforms at scale.
- Experience with cloud infrastructure (especially AWS S3, EMR, ECS, IAM).
- Exposure to modern metadata platforms, Snowflake, or knowledge graphs is a plus.
- Excellent communication and stakeholder management skills.
- A strategic, pragmatic thinker who is comfortable making high-impact decisions amid complexity.

Why Join Us: This is your opportunity to shape how Salesforce understands and uses its product data. You'll be at the forefront of transforming raw product signals into intelligent, actionable insights, powering everything from internal decision-making to next-generation AI agents. If you're excited by the challenge of leading high-impact teams and building trusted systems at scale, we'd love to talk to you.

Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that's inclusive and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence, and qualifications without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey.

Posted 1 week ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Thane

Work from Office

Job Title: Content Writer - Health & Wellness (Nutraceuticals + Personal Care)

Job Summary: Are you passionate about health, wellness, and the science behind what makes people feel and look their best? We're looking for a sharp, creative, and driven Content Writer to craft powerful product listings that educate, engage, and convert. In this role, you won't just write: you'll shape the voice of some of the most innovative nutraceutical and personal care products in the market. You'll transform complex ingredients and health benefits into stories that connect with our audience, build trust, and drive action. If you're someone who thrives at the intersection of creativity, science, and SEO, we want to hear from you.

What You'll Do:
- Craft compelling and conversion-driven product listings for our wellness and skincare products.
- Translate technical product information into engaging copy that's easy to understand and full of impact.
- Own the content strategy for product listings: from titles and bullet points to rich descriptions and meta tags.
- Ensure all content meets regulatory standards (FDA, FSSAI, etc.) without losing its charm or clarity.
- Collaborate closely with marketing, product, and design teams to ensure consistency in brand voice and storytelling.
- Stay ahead of industry trends, consumer insights, and competitor strategies to inform and evolve our content approach.
- Deliver high-volume, high-impact content under tight deadlines with accuracy and flair.

What We're Looking For:
- 2+ years of experience in content writing, ideally in the health, wellness, nutraceutical, or beauty space.
- Solid understanding of nutraceuticals and cosmetic products, including knowledge of ingredients, claims, and compliance.
- Proven ability to blend science-backed information with emotionally resonant storytelling.
- Strong grasp of SEO, including keyword strategy and metadata optimization.
- Excellent writing, editing, and proofreading skills with high attention to detail.
- Self-starter with strong project management skills and the ability to juggle multiple priorities.
- Collaborative mindset: you love working across teams and building something great together.

Key Skills: Content Writing, SEO & Keyword Research, Product Storytelling, Regulatory Awareness (FDA/FSSAI), Research & Trend Analysis, Cross-Functional Collaboration, Time Management & Deadline Ownership

Why Join Us: This is your chance to make your mark in a fast-growing wellness brand. You'll be part of a team that values creativity, initiative, and purpose-driven content. We're not just writing descriptions; we're building a brand that empowers healthier lifestyles.

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Surat

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
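The data-quality monitoring duty in this listing can be illustrated with a small sketch, here in plain Python standing in for what would typically run as a dbt test or a pipeline gate. The column names, thresholds, and sample rows below are illustrative assumptions, not details from the listing.

```python
# Hypothetical data-quality gate: a null-rate and row-count check of the
# kind a pipeline might run before publishing a table. Column names and
# thresholds are invented for illustration.

def quality_report(rows, required_columns, max_null_rate=0.05, min_rows=1):
    """Return a list of human-readable failures; an empty list means 'pass'."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
        return failures
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"column '{col}' null rate {rate:.0%} exceeds {max_null_rate:.0%}"
            )
    return failures

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 80.0},
]
print(quality_report(rows, ["order_id", "amount"], max_null_rate=0.10))
# → ["column 'amount' null rate 33% exceeds 10%"]
```

In a real Snowflake/DBT stack, the same intent is usually expressed declaratively (e.g., dbt's `not_null` tests) rather than hand-rolled, but the pass/fail logic is the same.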

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Varanasi

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Visakhapatnam

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

1.0 - 9.0 years

1 - 9 Lacs

Mumbai, Maharashtra, India

On-site

Job Responsibilities:
- Evaluate the conceptual soundness of model specifications, reasonableness of assumptions, reliability of inputs, completeness of testing, correctness of implementation, and suitability and comprehensiveness of performance metrics and risk measures
- Perform independent testing of models by replicating or building benchmark models
- Design and implement experiments to measure the potential impact of model limitations, parameter estimation errors, and deviations from model assumptions; compare model outputs with empirical evidence or outputs from benchmark models
- Evaluate the risks posed by non-transparent model parameters and/or non-linear relationships, and suggest ways to mitigate such risks
- Document model review findings and communicate them to stakeholders
- Serve as the first point of contact for model governance related inquiries for the coverage area, and help identify and escalate issues to ensure that their resolutions are sound and timely
- Provide guidance on the appropriate usage of models to model developers, users, and other stakeholders in the firm
- Stay abreast of ongoing performance testing outcomes for models used in the coverage area, and communicate those outcomes to stakeholders
- Maintain the model inventory and model metadata for the coverage area
- Keep pace with the latest developments in the coverage area in terms of products, markets, models, risk management practices, and industry standards

Required Qualifications, Capabilities, and Skills:
- Master's degree in a quantitative discipline such as Math, Physics, Engineering, Computer Science, Economics, or Finance with a minimum of 3 years of relevant working experience, or a PhD
- Excellence in probability theory, stochastic processes, statistical/economic modeling, partial differential equations, and numerical analysis
- Understanding of options and derivative pricing theory and risks
- Proficiency in Python, R, Matlab, C++, or other programming languages
- Risk and control mindset: ability to ask incisive questions, assess the materiality of model issues, and escalate issues appropriately
- Strong communication skills, with the ability to interface with front office traders and other functional areas in the firm on model-related issues, and to produce documents for internal and external (regulatory) consumption
- Strong analytical and problem-solving abilities

Preferred Qualifications, Capabilities, and Skills:
- Knowledge of machine learning is not required but is a plus
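"Independent testing of models by replicating or building benchmark models" can be sketched in miniature: below, a Monte Carlo estimate of a European call price (the "candidate") is compared against the closed-form Black-Scholes benchmark. This is a hedged illustration of the technique, not the firm's methodology; all parameters are chosen purely for demonstration.

```python
import math
import random

def black_scholes_call(s, k, r, sigma, t):
    """Closed-form benchmark price of a European call."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    n = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return s * n(d1) - k * math.exp(-r * t) * n(d2)

def monte_carlo_call(s, k, r, sigma, t, n_paths=200_000, seed=7):
    """Candidate model: price the same call by simulating terminal prices."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = s * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
        payoff_sum += max(s_t - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

params = dict(s=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0)
benchmark = black_scholes_call(**params)
candidate = monte_carlo_call(**params)
print(f"benchmark={benchmark:.4f}  candidate={candidate:.4f}  "
      f"gap={abs(benchmark - candidate):.4f}")
```

A review following the responsibilities above would go further: vary the assumptions (volatility, drift), quantify the Monte Carlo standard error, and document where the candidate deviates materially from the benchmark.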

Posted 1 week ago

Apply

8.0 - 12.0 years

8 - 12 Lacs

Bengaluru, Karnataka, India

On-site

The Business Intelligence Solutions team works to provide tailored solutions for Wholesale Lending Services data needs. As a Business Intelligence Associate within our Data Team, you will collaborate closely with product, operations, and data teams to understand business problems, identify underlying challenges, and deliver actionable data insights. Hands-on expertise across Tableau, Alteryx, SQL, and dashboard performance optimization is critical and a must-have to succeed in this role.

Job Responsibilities:
- Lead effective requirements gathering sessions to align with WLS data and product priorities.
- Accurately collect, refine, and transform data from diverse sources using advanced SQL queries and Alteryx expertise.
- Design, develop, and manage dynamic data visualization solutions in Tableau, providing actionable insights for informed decision-making.
- Conduct thorough control testing of solution components, ensuring precise insights and validation with stakeholders for accuracy.
- Publish and manage dashboards and reports with optimized scheduling, addressing data discrepancies and performance issues proactively.
- Define critical data scope within products, documenting, classifying, and enriching data with comprehensive metadata for effective use.
- Actively coordinate and collaborate with the team, leveraging expertise to mentor and enhance team capabilities.

Required Qualifications, Capabilities, and Skills:
- Minimum 7 years of experience in analytics/business intelligence delivering data-driven insights and strategic solutions, with 5 years of mandatory hands-on expertise in Alteryx, SQL, and Tableau for advanced analytics, complex data manipulations, and crafting advanced data visualizations.
- Proficiency in Qlik and Python to effectively meet future and evolving data needs, while adeptly tackling complex data logic challenges and designing sophisticated workflows for problem-solving.
- Drive projects efficiently using extensive experience with tools like JIRA and Confluence, demonstrating agility and adaptability to transition swiftly between projects and meet evolving demands.
- Exceptional written and verbal communication skills to articulate complex ideas clearly and persuasively to diverse audiences, with assertive communication to set and manage stakeholder expectations under tight deadlines.

Preferred Qualifications, Capabilities, and Skills:
- Proficient knowledge of the product development life cycle.

Role: Business Intelligence & Analytics - Other
Industry Type: Financial Services
Department: Data Science & Analytics
Employment Type: Full Time, Permanent
Role Category: Business Intelligence & Analytics
Education: UG: Any Graduate; PG: Any Postgraduate

Posted 1 week ago

Apply

5.0 - 12.0 years

0 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

- Familiarity with Data Management Standards
- Ability to work with high volumes of detailed technical and business metadata
- Experience documenting Data Element metadata (Business Elements vs. Technical Data Elements)
- Experience understanding how data transformations materialize, and determining the appropriate controls required to ensure a high level of data quality
- Ability to understand and document application and/or data element level flows (i.e., lineage)
- Ability to analyze both processes and datasets to identify meaningful, actionable outcomes
- Understand and implement changes to business processes
- Develop and influence business processes necessary to support data governance related outcomes
- Manage and influence across vertical organizations to achieve common objectives
- Intermediate to expert level knowledge of MS products such as Excel, PowerPoint, Word, Skype, and Outlook
- Working knowledge of metadata tools such as Collibra or equivalent
- Familiarity with Data Analytics / BI tools such as Tableau, MicroStrategy, etc.

Communication Skills:
- Create visually and verbally engaging, informative materials for departmental leadership, business partners, executives, and stakeholders
- Ability to tailor communication of topics to various levels of the organization (e.g., technical audiences vs. business stakeholders)

Desired Skills (nice-to-have):
- General knowledge of the banking industry
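Element-level lineage of the kind this listing describes is often easiest to reason about as a directed graph of "element → direct upstream sources". The sketch below uses invented system and field names to show how the transitive upstream sources of one element could be traced; it is an illustration of the idea, not a reference to Collibra or any specific metadata tool.

```python
# Hypothetical element-level lineage, recorded as element -> direct upstream
# sources. All system and field names are invented for illustration.
LINEAGE = {
    "report.customer_balance": ["warehouse.balances"],
    "warehouse.balances": ["staging.accounts", "staging.transactions"],
    "staging.accounts": ["core_banking.accounts"],
    "staging.transactions": ["core_banking.ledger"],
}

def trace_upstream(element, lineage):
    """Return every transitive upstream source of an element (depth-first)."""
    seen = set()
    stack = [element]
    while stack:
        current = stack.pop()
        for source in lineage.get(current, []):
            if source not in seen:
                seen.add(source)
                stack.append(source)
    return sorted(seen)

print(trace_upstream("report.customer_balance", LINEAGE))
# → ['core_banking.accounts', 'core_banking.ledger', 'staging.accounts',
#    'staging.transactions', 'warehouse.balances']
```

The same traversal, run in reverse, answers the impact-analysis question ("what breaks downstream if this source changes?") that lineage documentation typically supports.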

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Lucknow

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Ludhiana

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Shri Ram Academy is a renowned educational institution located in the heart of Financial District, Hyderabad. As part of The Shri Legacy, the academy is committed to providing exceptional educational experiences through its International Continuum Curriculum. With a sprawling 9-acre campus that boasts green pastures and serene lake views, the academy offers a vibrant learning environment for its students. We are currently seeking a full-time MYP Librarian Teacher to join our team in Hyderabad. In this role, you will be responsible for managing library services, teaching library science, handling metadata, overseeing library operations, and delivering library instruction. Your daily tasks will include cataloging library materials, assisting students and faculty with research, maintaining an organized library system, and conducting instructional sessions to support academic programs. The ideal candidate for this position will possess skills in Library Services and Library Management, along with experience in Library Science and Metadata. Knowledge of Library Instruction techniques, strong organizational and communication skills, and the ability to collaborate effectively with students and staff are essential for success in this role. While a degree in Library Science or a related field is preferred, relevant experience and expertise will also be considered. If you are passionate about promoting a love for reading and learning, and if you thrive in an environment that values innovation and excellence in education, we invite you to join our team at The Shri Ram Academy.

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Agra

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Nagpur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply