
946 Metadata Jobs - Page 10

Set up a job alert
JobPe aggregates results for easy application access, but you apply directly on the original job portal.

7.0 - 10.0 years

9 - 12 Lacs

Bengaluru

Work from Office

Career Area: Finance

Your Impact Shapes the World at Caterpillar Inc.

When you join Caterpillar, you're joining a global team who cares not just about the work we do but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.

Job Summary

We are seeking a skilled Accountant for our FP&A Systems Business Support (OneStream) role in the Manufacturing Tower - Analytics COE Team, Global Financial Services Division. You will have the opportunity to champion our Global Finance Transformation by supporting our Management Reporting, Business Partnering, and Continuous Improvement efforts. The preference is for this role to be based out of the Bangalore, Whitefield office.

What you will do
- Provide business support for OneStream, handling multiple stakeholders in the business
- Perform OneStream business-support tasks in line with the COE scope of work, timelines, business requirements, and defined metrics/KPIs
- Participate in knowledge-transfer sessions and create user documentation, including user guides and training materials
- Partner with technical teams to provide support on dashboard and cube view report builds
- Deliver ad-hoc end-user training as required
- Provide additional guidance to resolve issues arising from gaps in user knowledge or understanding
- Provide support for data-import queries, e.g., establishing and explaining the data lineage of data points within the OneStream application
- Assist users in submitting change requests, including the process and the details required
- Embed change-management best practices

What you will have
- Hands-on experience in month-end close/consolidation and management reporting using OneStream
- Preferably, experience with OneStream metadata, business rules, forms, cube views, workflow, user experience management, reports, and dashboards
- Preferably, experience with finance transformation projects, closely involved during the design, development, testing, and enhancement phases

Background/Experience/Skills & Capabilities
- Proven experience in management reporting, FP&A, and related month-end processes
- Excellent customer service skills, working in a global environment with multiple stakeholders to drive outcomes
- Self-starter who works well independently and in a team, with excellent communication skills
- Exhibits initiative and intellectual curiosity
- Experience in finance transformation projects
- CA, CMA, ICWA, or MBA Finance with 7-10 years of progressive experience
- Level 1 OneStream Certified Associate preferred but not required

Additional Information
- Work timings: 1 p.m. to 10 p.m. IST
- Work from office 5 days a week
- Individual contributor (IC) role

Skills desired
- Accuracy and Attention to Detail (Extensive Experience): Understanding the necessity and value of accuracy; ability to complete tasks with high levels of precision. Evaluates and makes contributions to best practices; processes large quantities of detailed information with high levels of accuracy; productively balances speed and accuracy; employs techniques for motivating personnel to meet or exceed accuracy goals; implements a variety of cross-checking approaches and mechanisms; demonstrates expertise in quality assurance tools, techniques, and standards.
- Analytical Thinking (Working Knowledge): Knowledge of techniques and tools that promote effective analysis; ability to determine the root cause of organizational problems and create alternative solutions that resolve them. Approaches a situation or problem by defining the issue and determining its significance; makes systematic comparisons of two or more alternative solutions; uses flow charts, Pareto charts, fishbone diagrams, etc. to disclose meaningful data patterns; identifies the major forces, events, and people impacting and impacted by the situation at hand; uses logic and intuition to make inferences about the meaning of the data and arrive at conclusions.
- Effective Communications (Working Knowledge): Understanding of effective communication concepts, tools, and techniques; ability to effectively transmit, receive, and accurately interpret ideas, information, and needs through appropriate communication behaviors. Delivers helpful feedback that focuses on behaviors without offending the recipient; listens to feedback without defensiveness and uses it for own communication effectiveness; makes oral presentations and writes reports needed for own work; avoids technical jargon when inappropriate; looks for and considers non-verbal cues from individuals and groups.
- Managing Multiple Priorities (Extensive Experience): Knowledge of effective self-management practices; ability to manage multiple concurrent objectives, projects, groups, or activities, making effective judgments about prioritization and time allocation. Clarifies and handles multiple concurrent and diverse activities; shifts focus among several efforts as required by changing priorities; addresses potential conflicts that impact current delivery commitments; works with or leads others to re-prioritize work and reschedule commitments as necessary; responds to shifting priorities while maintaining progress on regularly scheduled work; expects ongoing shifts in demands and priorities.
- Problem Solving (Working Knowledge): Knowledge of approaches, tools, and techniques for recognizing, anticipating, and resolving organizational, operational, or process problems; ability to apply problem-solving knowledge appropriately to diverse situations. Identifies and documents specific problems and resolution alternatives; examines a specific problem and understands the perspective of each involved stakeholder; develops alternative techniques for assessing the accuracy and relevance of information; helps analyze the risks and benefits of alternative approaches and obtain decisions on resolution; uses fact-finding techniques and diagnostic tools to identify problems.
- Accounting (Working Knowledge): Knowledge of accounting methods, processes, and tools; ability to maintain and prepare financial statements and reports using accounting methods and processes. Utilizes cost-monitoring practices, techniques, and considerations; works with financial transactions and related documentation within the organization; participates in accounting practices of classifying and recording financial data; maintains existing charts of accounts; follows regulations for entering and reporting financial content in major accounting systems.
- Financial Analysis (Working Knowledge): Knowledge of tools and approaches of financial analysis; ability to read, interpret, and draw accurate conclusions from financial and numerical material. Applies principles used to evaluate the economics of investment decisions; interprets the major types of financial statements issued by the organization; utilizes basic qualitative and quantitative tools and techniques with proficiency; works with a specific financial analysis tool set; implements valid financial analysis aligned with key criteria.
- Financial Reporting (Working Knowledge): Knowledge of processes, methods, and tools of financial reporting; ability to create and maintain accurate and thorough financial reports. Follows organizational practices and guidelines for product profitability reporting; analyzes errors or inaccuracies in financial reports; uses basic tools to create simple financial reports; monitors compliance with organizational standards for financial report writing; implements organizational methods and procedures for financial report writing.

What you will get
- Work-life harmony: earned and medical leave; relocation assistance
- Holistic development: personal and professional development through Caterpillar's employee resource groups across the globe; career development opportunities with global prospects
- Health and wellness: medical, life, and personal accident coverage; employee mental wellness assistance program
- Financial wellness: employee investment plan; pay-for-performance annual incentive bonus plan

Additional Information: Caterpillar is not currently hiring individuals for this position who now or in the future require sponsorship for an employment visa; however, as a global company, Caterpillar offers many job opportunities outside of the U.S., which can be found through our employment website at www.caterpillar.com/careers. Caterpillar is an Equal Opportunity Employer (EEO/AA). All qualified individuals, including minorities, females, veterans, and individuals with disabilities, are encouraged to apply. Not ready to apply? Join our Talent Community.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

The Lead Data Modeler role involves developing high-performance, scalable enterprise data models on a cloud platform. You must possess strong SQL skills, excellent data modeling expertise, and a solid grounding in the Kimball methodology. Your responsibilities include participating in activities throughout the systems development lifecycle, supporting production activities, engaging in POCs, and presenting outcomes effectively. You will also analyze, architect, design, program, and debug both existing and new products, as well as mentor team members. It is crucial to take ownership and demonstrate high professional and technical ethics, with a consistent focus on emerging technologies beneficial to the organization. You should have over 10 years of work experience in data modeling or engineering.

Your duties will involve defining, designing, and implementing enterprise data models; building Kimball-compliant data models in the analytic layer of the data warehouse; and constructing third-normal-form-compliant data models in the hub layer of the data warehouse. You must translate tactical and strategic requirements into effective solutions that align with business needs. The role also requires participation in complex initiatives, seeking help when necessary, reviewing specifications, coaching team members, and researching improvements to coding standards.

Technical skills include hands-on experience with SQL, query optimization, RDBMS, data warehousing (ER and dimensional modeling), modeling data into star schemas using the Kimball methodology (see the sketch below), Agile methodology, CI/CD frameworks, DevOps practices, and working in an onsite-offshore model. Soft skills such as leadership, analytical thinking, problem-solving, communication, and presentation are essential. You should be able to work with a diverse team, make decisions, guide team members through complex problems, and communicate effectively with leadership and business teams.

A Bachelor's degree in Computer Science, Information Systems, or a related technical area is required, preferably a B.E. in Computer Science or Information Technology. Nice-to-have skills include experience with Apache Spark and Python, graph databases, data identification, ingestion, transformation, and consumption, data visualization, familiarity with SAP Enterprise S/4 HANA, programming skills (Python, NodeJS, Unix scripting), and experience with the GCP cloud ecosystem. Experience in software engineering across all deliverables, including defining, architecting, building, testing, and deploying, is preferred. The Lead Data Modeler role does not offer relocation assistance and does not specify a particular work shift.
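As an illustration of the star-schema loading step that the Kimball methodology implies, here is a minimal, hypothetical Python sketch; the table and column names are invented for the example and do not come from this posting.

```python
import pandas as pd

# Dimension table with a surrogate key (Kimball practice: facts join on
# compact integer keys, not natural business keys).
dim_customer = pd.DataFrame({
    "customer_key": [1, 2],              # surrogate key
    "customer_id": ["C100", "C200"],     # natural key from the source system
    "customer_name": ["Acme", "Globex"],
})

# Staged fact rows still carry the natural key.
fact_staging = pd.DataFrame({
    "customer_id": ["C100", "C200", "C100"],
    "order_amount": [120.0, 75.5, 60.0],
})

# Surrogate-key lookup: resolve natural keys to surrogate keys before
# loading the fact table of the star schema.
fact_orders = fact_staging.merge(
    dim_customer[["customer_key", "customer_id"]],
    on="customer_id",
    how="left",
).drop(columns=["customer_id"])

print(fact_orders)
```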

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Nagercoil

Work from Office

Experience: 4-5 years

Job Description:
- Develop, test, and deploy Salesforce components such as Apex classes/triggers, Aura, Visualforce, LWC, and Flow
- Understand and develop optimized SOQL queries (see the sketch below)
- Knowledge of CRUD/FLS handling on DML operations in Apex code
- Knowledge of VS Code with the Salesforce CLI, Scratch Org-based development, and Namespace/DevHub Org-based metadata handling
- Working knowledge of the Salesforce Platform APIs, both for integration and metadata handling
- Knowledge of browser-based debugging processes, such as understanding debug logs and console logs
- Optimize code for performance and maintainability in accordance with Salesforce governor limits and platform best practices
- Experience with code versioning tools such as Git and GitHub, along with knowledge of branch/tag handling and conflict-resolution processes
- Troubleshoot and resolve issues across development and production environments
- Knowledge of handling managed packages with the 1GP process, including handling existing/new components and creating beta and main packages
- Ability to understand business requirements and translate them into technical requirements
- Exposure to an Agile-based development cycle (Scrum)

Good to have:
- Working knowledge of NodeJS/ReactJS with third-party integration with Salesforce
- Salesforce certifications such as Platform Developer I/II or JavaScript Developer I
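Salesforce exposes SOQL through its REST API; as a minimal sketch, here it is exercised from Python via the simple-salesforce library. The credentials and query below are placeholders, not values from this posting.

```python
from simple_salesforce import Salesforce

# Placeholder credentials; in practice these come from a secrets store,
# never from source code.
sf = Salesforce(
    username="user@example.com",
    password="example-password",
    security_token="example-token",
)

# A selective SOQL query: request only needed fields, filter early, and
# cap the result set, in the spirit of Salesforce governor limits.
result = sf.query(
    "SELECT Id, Name FROM Account WHERE CreatedDate = THIS_MONTH LIMIT 50"
)

for record in result["records"]:
    print(record["Id"], record["Name"])
```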

Posted 2 weeks ago

Apply

7.0 - 10.0 years

7 - 10 Lacs

Hyderabad

Work from Office

Required Skills: Oracle EPM and Hyperion Planning and Essbase implementation; developing financial reports and data forms; advanced knowledge of FDMEE and ODI for automating data and metadata integration. Experience with banking or financial services clients preferred.

Key Responsibilities:
- Lead or support end-to-end implementation of Oracle EPM, Hyperion Planning, and Essbase solutions (on-premises)
- Design and develop financial reports and data forms based on business requirements
- Develop and manage workflow processes within the Hyperion suite
- Write and maintain business rules to support budgeting, forecasting, and reporting needs
- Build and optimize data and metadata load automation using FDMEE and Oracle Data Integrator (ODI)
- Collaborate with finance and business stakeholders to translate functional requirements into technical solutions
- Conduct system testing, UAT support, and user training sessions
- Troubleshoot issues, monitor system performance, and provide ongoing support and enhancements

Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Finance, or a related field; Oracle certifications in Hyperion or related areas.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Hyderabad

Work from Office

About the role: The Senior Salesforce Developer is responsible for the analysis, design, development, and unit testing of custom solutions, as well as production support, on the CRM B2B Salesforce.com platform. The Salesforce Developer implements solutions based on the overall technical strategy for the platform, following specified documentation, coding and design patterns, and best practices. The role requires solving unique or complex problems, individually or in group-think fashion, specific to the Salesforce.com platform.

What you'll do:
- Design, develop, test, and maintain usable, scalable, extensible solutions on the Salesforce.com platform
- Translate business requirements into logical, component-based technical implementations using the Salesforce.com toolkit: Apex, Lightning components, JavaScript, workflows, approval processes, triggers, SOQL
- Produce technical specification and design documents as required
- Collaborate with scrum team members to ensure a thorough understanding of business requirements and processes
- Produce all required artifacts, including analysis and design documentation, as well as technical implementation specifications
- Support implementation activities and troubleshooting of system environment issues in sandbox and production
- Provide ongoing code-level maintenance and support
- Drive resolution of all issues that occur during development
- Attend and provide input to the scrum ceremonies, and provide level-of-effort estimates for sprint planning
- Perform administrative configuration as needed

What you'll bring:
- 6+ years of rigorous design and development experience
- 4+ years of solid hands-on experience developing on the Salesforce platform, including Apex, Visualforce, APIs (REST/SOAP/Tooling/Metadata/Bulk/etc.), approval processes, flows, triggers, Lightning Web Components (LWC), web services, SOQL, SOSL, and Analytics
- Experience integrating Salesforce.com with enterprise-level applications through all available integration channels, as well as Salesforce-to-Salesforce integrations
- Experience working locally with Git and interacting with remote repositories
- Strong relational database design/development skills
- Strong analytical, problem-solving, and decision-making skills
- Experience working in an Agile Scrum delivery framework
- Excellent oral and written communication skills
- Ability to meet or exceed the standards of quality, security, and operability required in a DevSecOps culture
- Salesforce certifications such as Salesforce Certified Platform Developer, Salesforce Certified CPQ Specialist, or Salesforce Certified Technical Architect are highly desirable
- Experience with ETL tools is a plus
- Exposure to any of the following tools and applications is a plus: Gitlab, AutoRabit, Rally
- Financial services experience is a plus

Stay up to date on everything Blackbaud; follow us on LinkedIn, X, Instagram, Facebook, and YouTube.

Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Job Description: Ataccama Support Engineer

Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai

- 2+ years of experience supporting the Ataccama software application
- Experience in data quality, governance, and metadata management
- Extensive knowledge of Ataccama, ADF, and SQL
- Open to 24x7 support shift rotation
- Experience in business process mapping of data and analytics solutions
- Monitor and support Ataccama Data Quality rules execution and profiling jobs
- Troubleshoot data validation, anomaly detection, and scorecard generation issues
- Perform patching and software upgrades, and ensure compliance with the latest platform updates
- Work with business teams to resolve data integrity and governance-related incidents
- Maintain SLA commitments for resolving incidents and ensuring data accuracy
- Experience with the Ataccama ONE platform and knowledge of SQL for data validation

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Job Description: Key Responsibilities

Multi-Cloud Log Ingestion & Analysis
- Write Python code to fetch and normalize logs via AWS/Azure/GCP APIs (e.g., S3, Storage Queues, Pub/Sub); see the sketch below
- Parse, cleanse, and aggregate diverse log types (DNS, proxy, Orca, Uptycs, NSG flow logs, etc.)
- Identify data quality issues, annotate metadata, and document remediation steps

Interactive Visualization
- Build reusable Plotly Dash components (heatmaps, time series, geospatial maps) that allow security teams to drill into anomalies
- Annotate key events and embed insights for non-technical stakeholders

Infrastructure as Code
- Develop Terraform modules to provision logging infrastructure in AWS (S3, Kinesis), Azure (Storage Accounts, Log Analytics), and GCP (Cloud Storage, Pub/Sub)
- Configure remote state backends with locking and integrate secrets in secure stores (Key Vault, Secrets Manager)

Configuration Automation (nice to have)
- Create Ansible roles/playbooks to provision and configure Ubuntu (or container) environments, install dependencies, deploy code, and run analyses
- Securely manage service principal or IAM credentials via Ansible Vault or environment variables

Security Best Practices
- Apply least-privilege principles when assigning IAM/RBAC roles
- Understand threat models for log data streams (e.g., log injection, tampering, retention) and recommend hardening measures
- Collaborate with Ops to tune alert thresholds and response workflows

Required Qualifications
- 4+ years of professional Python development experience
- Demonstrated ability to work with AWS, Azure, and GCP SDKs/APIs for storage, messaging, and compute
- Strong Plotly (or similar) interactive visualization skills
- Proven Terraform expertise across at least two cloud providers, with remote state and secret management
- Experience parsing and making sense of security logs (e.g., DNS queries, proxy logs, NSG flows, Orca/Uptycs outputs)
- Familiarity with security concepts and best practices (RBAC, least privilege, log integrity, etc.)
- Comfortable with Git-based workflows and CI/CD pipelines

Preferred Qualifications
- Prior experience building security or SIEM dashboards
- Containerization (Docker) and orchestration (Kubernetes/EKS, etc.) skills
- Hands-on experience with monitoring/alerting tools (Prometheus, Grafana, etc.)
- Familiarity with mocking and testing frameworks (pytest, moto, etc.)
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
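As a minimal sketch of the log-ingestion work described above, the following Python fetches gzipped JSON-lines logs from S3 with boto3 and normalizes them to a common shape. The bucket, prefix, and field names are hypothetical placeholders.

```python
import gzip
import json

import boto3

# Hypothetical bucket and prefix; real values would come from configuration
# or Terraform outputs rather than being hard-coded.
BUCKET = "example-dns-logs"
PREFIX = "dns/2024/"

s3 = boto3.client("s3")

def fetch_log_records(bucket: str, prefix: str):
    """Yield normalized records from gzipped JSON-lines objects in S3."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            for line in gzip.decompress(body).splitlines():
                record = json.loads(line)
                # Normalize to a common schema across log types so
                # downstream dashboards can treat sources uniformly.
                yield {
                    "timestamp": record.get("ts") or record.get("timestamp"),
                    "source": "dns",
                    "raw": record,
                }

for rec in fetch_log_records(BUCKET, PREFIX):
    print(rec["timestamp"], rec["source"])
```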

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

Job Responsibilities:
- Design and Develop Data Pipelines: Develop and optimise scalable data pipelines using Microsoft Fabric, including Fabric Notebooks, Dataflows Gen2, Data Pipelines, and Lakehouse architecture. Work on both batch and real-time ingestion and transformation. Integrate with Azure Data Factory or Fabric-native orchestration for smooth data flow. (A minimal ingestion sketch follows this posting.)
- Fabric Data Platform Implementation: Collaborate with data architects and engineers to implement governed Lakehouse models in Microsoft Fabric (OneLake). Ensure data solutions are performant, reusable, and aligned with business needs and compliance standards.
- Data Pipeline Optimisation: Monitor and improve the performance of data pipelines and notebooks in Microsoft Fabric. Apply tuning strategies to reduce costs, improve scalability, and ensure reliable data delivery across domains.
- Collaboration with Cross-functional Teams: Work closely with BI developers, analysts, and data scientists to gather requirements and build high-quality datasets. Support self-service BI initiatives by developing well-structured datasets and semantic models in Fabric.
- Documentation and Reusability: Document pipeline logic, lakehouse architecture, and semantic layers clearly. Follow development standards and contribute to internal best practices for Microsoft Fabric-based solutions.
- Microsoft Fabric Platform Execution: Use your experience with Lakehouses, Notebooks, Data Pipelines, and Direct Lake in Microsoft Fabric to deliver reliable, secure, and efficient data solutions that integrate with Power BI, Azure Synapse, and other Microsoft services.

Required Skills and Qualifications:
- 5+ years of experience in data engineering within the Azure ecosystem, with relevant hands-on experience in Microsoft Fabric, including Lakehouse, Dataflows Gen2, and Data Pipelines
- Proficiency in building and orchestrating pipelines with Azure Data Factory and/or Microsoft Fabric Dataflows Gen2
- Solid experience with data ingestion, ELT/ETL development, and data transformation across structured and semi-structured sources
- Strong understanding of OneLake architecture and modern data lakehouse patterns
- Strong command of SQL, PySpark, and Python applied to both data integration and analytical workloads
- Ability to collaborate with cross-functional teams and translate data requirements into scalable engineering solutions
- Experience in optimising pipelines and managing compute resources for cost-effective data processing in Azure/Fabric

Preferred Skills:
- Experience working in the Microsoft Fabric ecosystem, including Direct Lake, BI integration, and Fabric-native orchestration features
- Familiarity with OneLake, Delta Lake, and Lakehouse principles in the context of Microsoft's modern data platform
- Expert knowledge of PySpark, strong SQL, and Python scripting within Microsoft Fabric or Databricks notebooks
- Understanding of Microsoft Purview, Unity Catalog, or Fabric-native tools for metadata, lineage, and access control
- Exposure to DevOps practices for Fabric and Power BI, including Git integration, deployment pipelines, and workspace governance
- Knowledge of Azure Databricks for Spark-based transformations and Delta Lake pipelines is a plus
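A minimal PySpark sketch of the kind of lakehouse ingestion this role describes; the file path, column, and table names are illustrative, and in Fabric this would run in a notebook against OneLake.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw landing files (illustrative path within a lakehouse Files area).
raw = spark.read.option("header", "true").csv("Files/landing/orders/*.csv")

# Light transformation: cast types and stamp each row with ingestion time.
cleaned = (
    raw.withColumn("order_amount", F.col("order_amount").cast("double"))
       .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table in the lakehouse; Power BI via Direct Lake can
# then read the table without a separate import step.
cleaned.write.format("delta").mode("append").saveAsTable("orders_bronze")
```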

Posted 2 weeks ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Pune

Work from Office

Manager, Solution Engineering (Java, Talend) - Pune, MH, India

Are you ready to join a global organization that helps diverse teams stay at the forefront of technology and innovation? How about offering up your skills in a global business that is committed to moving money for better? Join Western Union as Manager, Solution Engineering. Western Union powers your pursuit.

We are looking for a seasoned Manager, Solution Engineering to lead a team of developers in building secure, scalable, and cloud-native payment solutions. This role requires a strong technical foundation; hands-on experience with Java, batch processing frameworks, file processing, and Talend; and the ability to align business goals with innovative technical solutions.

Key Responsibilities
- Lead and mentor a cross-functional team of developers and testers
- Design and develop a metadata-driven batch processing framework to handle inbound data from partners and distribute outbound files to multiple partner destinations
- Develop complex Talend ETL processes for integration between systems
- Design, develop, and maintain Talend solutions across platforms, optimizing ETL workflows and ensuring data quality and integration per enterprise standards
- Collaborate with stakeholders to gather business requirements and translate them into technical solutions
- Liaise with the Product team to define and prioritize features aligned with business goals
- Provide architectural guidance and ensure timely delivery of platform-specific solutions
- Design and manage secure, scalable, and maintainable cloud-based software systems
- Develop and enhance software with a focus on performance, maintainability, and code optimization
- Troubleshoot application issues and coordinate resolution across functional and technical teams
- Recommend improvements to existing software programs and development practices
- Prepare and present technical proposals to stakeholders
- Define and maintain delivery and support guidelines
- Stay current with emerging technologies and enable the team to adopt innovative solutions
- Work independently on simple to moderately complex projects
- Collaborate with geographically distributed teams and service providers

Role Requirements
- Bachelor's degree in Computer Science, Information Systems, IT, or similar preferred
- 10+ years of professional experience in software architecture design and implementation
- 6+ years of hands-on experience with Java, Spring Batch, Talend, and AWS
- Solid hands-on experience in Java/J2EE, architectural and design patterns, Spring Boot, web technologies, SQL/NoSQL, and Kafka
- 2+ years of development experience with the Talend Data Integration module
- Strong middleware integration experience with external third-party platform APIs
- Exceptional understanding of computer science fundamentals (such as SOLID), data structures, algorithms, OOP, and best practices for microservices, domain-driven design, and data persistence/data engineering
- Ability to communicate and evangelize architectural concepts to internal stakeholders and external partners
- Experience with agile development methodologies with emphasis on Test-Driven Development (TDD) and Continuous Integration/Continuous Delivery (CI/CD)
- Comfortable with an Agile Scrum project management style
- Self-starter with the ability to multi-task, prioritize, and manage a team
- Experience in financial services

We make financial services accessible to humans everywhere. Join us for what's next.
Western Union is positioned to become the world's most accessible financial services company, transforming lives and communities. To support this, we have launched a Digital Banking Service and Wallet across several European markets to enhance our customers' experiences by offering a state-of-the-art digital ecosystem. More than moving money, we design easy-to-use products and services for our digital and physical financial ecosystem that help our customers move forward. Just as we help our global customers prosper, we support our employees in achieving their professional aspirations. You'll have plenty of opportunities to learn new skills and build a career, as well as receive a great compensation package. If you're ready to help drive the future of financial services, it's time for Western Union. Learn more about our purpose and people at https://careers.westernunion.com.

Benefits
You will also have access to short-term incentives, multiple health insurance options, accident and life insurance, and access to best-in-class development platforms, to name a few (https://careers.westernunion.com/global-benefits/). Please see the location-specific benefits below and note that your Recruiter may share additional role-specific benefits during your interview process or in an offer of employment. Your India-specific benefits include:
- Employees' Provident Fund (EPF)
- Gratuity payment
- Public holidays
- Annual leave, sick leave, compensatory leave, and maternity/paternity leave
- Annual health check-up
- Hospitalization insurance coverage (Mediclaim)
- Group life insurance, group personal accident insurance coverage, and business travel insurance
- Cab facility
- Relocation benefit

Western Union values in-person collaboration, learning, and ideation whenever possible. We believe this creates value through common ways of working and supports the execution of enterprise objectives, which will ultimately help us achieve our strategic goals. By connecting face-to-face, we are better able to learn from our peers, solve problems together, and innovate. Our Hybrid Work Model categorizes each role into one of three categories. Western Union has determined the category of this role to be Hybrid. This is defined as a flexible working arrangement that enables employees to divide their time between working from home and working from an office location. The expectation is to work from the office a minimum of three days a week.

We are passionate about diversity. Our commitment is to provide an inclusive culture that celebrates the unique backgrounds and perspectives of our global teams while reflecting the communities we serve. We do not discriminate based on race, color, national origin, religion, political affiliation, sex (including pregnancy), sexual orientation, gender identity, age, disability, marital status, or veteran status. The company will provide accommodation to applicants, including those with disabilities, during the recruitment process, following applicable laws.

#LI-HR1 #LI-Hybrid

Estimated Job Posting End Date: 07-20-2025. This application window is a good-faith estimate of the time that this posting will remain open. This posting will be promptly updated if the deadline is extended or the role is filled.

Posted 2 weeks ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Mumbai

Work from Office

This role is responsible for ensuring that content processed through The Orchard meets DSP metadata, infringement, and quality guidelines. It also involves assisting with day-to-day requests, catalog clean-ups, and longer-term departmental improvement projects.

What you'll do:
- Review and QC audio and video content in accordance with DSP asset and metadata guidelines (e.g., Apple, Spotify, YouTube)
- Maintain off-hours coverage by working on Saturdays and Sundays (Tuesday or Wednesday is the preferred day to take off during the week)
- Provide feedback and educate internal teams on handling products that may violate The Orchard and DSP content guidelines
- Contribute to internal process capture and documentation
- Communicate issues and roadblocks pertaining to department projects and processes with team members, management, and other departments
- Partner with other departments (Label Management, Legal) to identify, report on, and resolve issues, providing exceptional support for clients
- Work closely with the Product and Tech departments to provide feedback and implement new strategies for optimal operational efficiency
- Stay updated on changes to DSP guidelines and industry best practices, recommending process enhancements to improve content quality and compliance

Who you are:
- 1+ years of experience in an operations role or supply-chain environment in the entertainment industry, or equivalent education/experience
- Knowledge of music metadata in a digital distribution or digital streaming/download context
- Outstanding written and verbal communication skills; impeccable follow-up and follow-through capabilities
- Well organized and attentive to detail
- Discerning eye and ear for audio and visual content; bonus points for specialized knowledge of independent music or niche genres
- Basic knowledge of copyright and the current popular music landscape
- Comfortable with high-volume tasks

Bonus points:
- Fluency in a second language
- Experience working with a record label, distributor, and/or digital service content management systems (e.g., iTunes Connect, YouTube CMS, VEVO Backstage, Spotify Scatman)
- iTunes and Spotify style guide experience

What we give you:
- You join a vibrant global community with the opportunity to channel your passion every day
- A modern office environment designed for you, empowering you to bring your best
- Investment in your professional growth and development, enabling you to thrive in our vibrant community
- The space to accelerate progress, positively disrupt, and create what happens next
- The platform to champion positive change, with the opportunity to contribute to our social impact, diversity, equity, and inclusion initiatives

Equal Opportunities
As an active part of a culturally and socially diverse society, Sony Music's aim is that our workforce is diverse and inclusive. Sony Music is an equal opportunity employer and supports workforce diversity. We employ, retain, promote, and otherwise treat all employees and job applicants according to their merit, qualifications, competence, and talent. We apply this policy without regard to any individual's sex, race, religion, origin, age, sexual orientation, marital status, medical condition, or disability.

Privacy Policy
Please read our privacy policy before beginning the application process, as you will need to agree to its terms before submitting your information.

Posted 2 weeks ago

Apply

8.0 - 11.0 years

25 - 30 Lacs

Hyderabad

Work from Office

In this role you will develop code to feed data to machine learning and statistical modeling platforms. The role is responsible for the extraction, transformation, and loading of data points from source to destination models. The data science advisor will collaborate with other data scientists and reporting teams across Evernorth to provide a streamlined way to productionize and locally test modeling. The role will also have numerous opportunities to automate and innovate the ways in which modeling is done, and to consult with business partners on best practices.

Responsibilities
- Locate, extract, manipulate, and organize data from operational source systems in support of analytic tool development (SQL, Microsoft SSIS, Python); see the sketch below
- Create and manage Postgres and SQL Server entities for use in data science modeling and reporting
- Make proficient use of SQL data sources and database management systems such as Hadoop, Oracle, and SQL Server
- Partner with varying levels of operations and resource management leadership to understand challenges, goals, and pain points, designing analytic solutions to address them
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Help develop and maintain code standards and repositories

Qualifications
- 8-11 years building and optimizing 'big data' data pipelines, architectures, and data sets
- Strong SQL expertise
- Experience with the use of Python for deployment
- Experience with Git (or equivalent)
- Experience with Python, Postgres, and SSIS highly desired
- Solution design and troubleshooting skills
- Ability to extrapolate data into information to drive process improvements
- Ability to quickly learn how to use new software applications
- Comfortable working in environments with varying levels of ambiguity, complexity, uncertainty, and change

Location & Hours of Work
Full-time position, working 40 hours per week, with expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).
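A minimal sketch of the extract-transform-load pattern this role describes, using pandas and SQLAlchemy; the connection strings, table, and column names are illustrative placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Illustrative connection strings; real credentials belong in a vault or
# environment configuration, not in code.
source = create_engine("mssql+pyodbc://user:pass@source_dsn")
target = create_engine("postgresql+psycopg2://user:pass@warehouse/analytics")

# Extract: pull only the columns the downstream model actually needs.
claims = pd.read_sql(
    "SELECT claim_id, member_id, paid_amount, service_date FROM claims",
    source,
)

# Transform: basic cleanup before the modeling layer sees the data.
claims["service_date"] = pd.to_datetime(claims["service_date"])
claims = claims.dropna(subset=["member_id"])

# Load: append into the Postgres staging table that feeds modeling.
claims.to_sql(
    "stg_claims", target, schema="staging", if_exists="append", index=False
)
```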

Posted 2 weeks ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Gurugram

Work from Office

Total experience: 10+ years.

- Expertise in Python, LLM integration, and GenAI development workflows
- Strong knowledge of prompt engineering, vector databases, retrieval-augmented generation (RAG), and toolchains (a minimal retrieval sketch follows this posting)
- Strong working experience in agentic AI and OpenAI
- Hands-on experience with agent frameworks such as LangChain Agents, Semantic Kernel, Haystack, or custom-built agents
- Proven experience in leading LLM/GenAI projects from proof-of-concept to production
- Deep understanding of modern data workflows: ingestion, transformation, cataloguing, validation, and analytics
- Ability to work collaboratively across engineering, product, and governance teams
- Experience with enterprise data products, MLOps, or DataOps platforms
- Familiarity with governance workflows and metadata-driven automation
- Exposure to cognitive architectures or agent simulation environments
- Excellent communication and collaboration skills

RESPONSIBILITIES:
- Understand the client's business use cases and technical requirements, and convert them into a technical design that elegantly meets the requirements
- Map decisions to requirements and translate them to developers
- Identify different solutions and narrow down the best option that meets the client's requirements
- Define guidelines and benchmarks for NFR considerations during project implementation
- Write and review design documents explaining the overall architecture, framework, and high-level design of the application for the developers
- Review architecture and design on aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, ensuring that all relevant best practices are followed
- Develop and design the overall solution for defined functional and non-functional requirements, defining the technologies, patterns, and frameworks to materialize it
- Understand and relate technology integration scenarios and apply these learnings in projects
- Resolve issues raised during code review through exhaustive, systematic analysis of the root cause, and justify the decisions taken
- Carry out POCs to make sure that the suggested design and technologies meet the requirements

Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
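A minimal, self-contained sketch of the retrieval step in a RAG pipeline, ranking documents by cosine similarity before prompting an LLM. The corpus is a toy example and the embedding function is a deterministic stand-in for a real embedding model; in production these would be a vector database and an embedding API.

```python
import hashlib

import numpy as np

# Toy corpus; a real system would hold these in a vector database.
docs = [
    "Invoices are approved by the finance team within two days.",
    "Data catalog entries must include an owner and a lineage link.",
    "Support tickets are triaged by severity before assignment.",
]

def embed(text: str) -> np.ndarray:
    # Stand-in for a real embedding model; seeded from a stable hash of
    # the text so the demo is reproducible across runs.
    seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).normal(size=8)

doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query by cosine score."""
    q = embed(query)
    sims = doc_vecs @ q / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q)
    )
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]

query = "Who owns data catalog entries?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # This grounded prompt would then be sent to the LLM.
```

Note that with a hash-based stand-in embedding the ranking is arbitrary; the point of the sketch is the retrieve-then-prompt structure, not the scores.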

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Director, Clinical Operational Data Governance

At GSK, innovation is at the heart of everything we do as we strive to deliver transformative medicines that improve lives around the globe. The Director, Clinical Operational Data Governance role is at the forefront of this mission, driving the governance and management of the clinical operational data that fuels groundbreaking research and development. This role is central to ensuring the integrity and accessibility of the operational data that powers our clinical studies, enabling us to bring innovative treatments to patients faster and more effectively. If you're passionate about curating our clinical trial operational data to unlock new possibilities in medicine and want to be a key player in shaping the future of healthcare, this is your opportunity to make a meaningful impact.

Job Purpose
The Director, Clinical Operational Data Governance is accountable for the availability of high-quality operational data from our clinical studies (i.e., data in our Clinical Trial Management System (CTMS)). This role supports GSK's mission by driving data integrity, governance, and accessibility to enable informed decision-making for our current clinical pipeline, in preparation for our future pipeline, and for successful clinical operations execution.

Process and Technology Development and Maintenance
- Serve as the Global Process Owner/Lead for all processes around the intake (from internal end-users as well as third parties), management, and downstream provisioning of clinical operational data
- Serve as the Clinical Technology Lead to ensure the maintenance of an appropriate technology landscape that optimally supports the execution of business processes aimed at capturing, maintaining, governing, and provisioning clinical operational data

Data Governance and Quality Assurance
- Serve as the Domain Owner for the study-level Clinical Operational Data Domain
- Own and lead data governance activities for clinical operational data, partnering closely with relevant sponsors, data stewards, and other stakeholders across GCO and beyond
- Maintain and refine data governance strategies and their execution to ensure data integrity, reliability, and compliance with external regulations and internal standards
- Monitor the quality of clinical operational data and ensure appropriate mitigations are taken to resolve data quality issues (e.g., validity, completeness, consistency)
- Ensure effective data governance practices are defined and embedded across all functions and with upstream and downstream data domain/system owners, addressing key risks to the availability, quality, ingestion, and consumption of clinical operational data

Data Management and Integration
- Oversee the collection, storage, and maintenance of clinical operational data, ensuring data is organized and accessible for analysis and reporting
- Define and enforce Master Data Management policies, standards, and procedures to ensure high-quality master data
- Own key data artefacts, including data dictionaries, metadata repositories, data domain maps, and data lineage documentation, to support data traceability and usability
- Monitor and maintain the quality and consistency of master data through regular reviews and data cleansing activities

Stakeholder Management and Communication
- Serve as the primary point of contact for clinical operational data-related inquiries and issues, providing expert guidance and support to stakeholders
- Collaborate with cross-functional teams to understand their data needs and ensure clinical operational data aligns with business objectives
- Develop and deliver training programs on clinical operational data and on data collection, management, and governance processes as needed
- Manage key stakeholders to promote a culture of data awareness and quality

Matrix Management and Continuous Improvement
- Manage relationships with stakeholders and SMEs, fostering a culture of accountability, collaboration, and continuous improvement while staying current with industry trends, emerging technologies, and best practices in data management and stewardship
- Identify opportunities to leverage advanced data analytics, machine learning, and automation to enhance CTMS data management processes
- Lead cross-functional projects and monitoring and remediation programs as required

Why You? Basic Qualifications:
- Bachelor's degree or equivalent in Information Systems, Life Sciences, Data Science, or a related field
- 10+ years of experience in the pharmaceutical or life sciences industry within the field of data management and data governance
- Proven track record in defining and establishing an organization-wide data governance strategy, including stakeholder management
- Experience in translating complex data concepts and data challenges for non-technical stakeholders
- Proven experience in leading cross-functional teams, managing multiple projects, and driving stakeholder engagement in a data-driven environment
- Strong industry experience and understanding of clinical trial processes, regulatory requirements (e.g., FDA, EMA), and data governance principles

Preferred Qualifications:
- Master's degree or doctorate
- Experience working with CTMS (Clinical Trial Management System) platforms and metadata management tools

GSK is an Equal Opportunity Employer. This ensures that all qualified applicants will receive equal consideration for employment without regard to race, color, religion, sex (including pregnancy, gender identity, and sexual orientation), parental status, national origin, age, disability, genetic information (including family medical history), military service, or any basis prohibited under federal, state, or local law.

Important notice to employment businesses/agencies: Please note that if you are a US licensed healthcare professional or healthcare professional as defined by the laws of the state issuing your license, GSK may be required to capture and report expenses GSK incurs, on your behalf, in the event you are afforded an interview for employment. This capture of applicable transfers of value is necessary to ensure GSK's compliance with all federal and state US transparency requirements. For more information, please visit the Centers for Medicare and Medicaid Services (CMS) website at https://openpaymentsdata.cms.gov/

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Pune

Work from Office

Evolent partners with health plans and providers to achieve better outcomes for people with the most complex and costly health conditions. Working across specialties and primary care, we seek to connect the pieces of a fragmented health care system and ensure people get the same level of care and compassion we would want for our loved ones.

Evolent employees enjoy work/life balance, the flexibility to suit their work to their lives, and the autonomy they need to get things done. We believe that people do their best work when they're supported to live their best lives, and when they feel welcome to bring their whole selves to work. That's one reason why diversity and inclusion are core to our business. Join Evolent for the mission. Stay for the culture.

What You'll Be Doing: MLOps Engineer
We are seeking a highly capable MLOps Engineer to join our growing AI/ML team. You will bridge the gap between data science and operations, ensuring that machine learning models are efficiently tested, deployed, monitored, and maintained in production environments. You will work closely with data scientists, software engineers, infrastructure, and development teams to build scalable and reliable ML infrastructure. You will be instrumental in supporting clinical decision-making, operational efficiency, quality outcomes, and patient care.

Model Deployment and Infrastructure
- Design, build, and maintain scalable, secure ML pipelines for model training, validation, deployment, and monitoring
- Automate deployment workflows using CI/CD pipelines and infrastructure-as-code tools
- Partner with infrastructure teams to manage (Azure) cloud-based ML infrastructure, ensuring compliance with InfoSec and AI policies
- Ensure applications run at peak efficiency

Model Testing, Monitoring, and Validation
- Develop rigorous testing frameworks for ML models, including clinical validation, traditional model performance measures, population segmentation, and edge-case analysis
- Build monitoring systems to detect model drift, overfitting, data anomalies, and performance degradation in real time
- Continuously analyze model performance metrics and operational logs to identify improvement opportunities
- Translate monitoring insights into actionable recommendations for data scientists to improve model precision, recall, fairness, and efficiency

Model Transparency & Governance
- Maintain detailed audit trails, logs, and metadata for all model versions, training datasets, and configurations to ensure full traceability and support internal audits (see the MLflow sketch after this posting)
- Ensure models meet transparency and explainability standards using tools like SHAP, LIME, or integrated explainability APIs
- Collaborate with data scientists and clinical teams to ensure models are interpretable, actionable, and aligned with practical applications
- Support corporate Compliance and AI Governance policies
- Advocate for best practices in ML engineering, including reproducibility, version control, and ethical AI
- Develop product guides, model documentation, and model cards for internal and external stakeholders

Required Qualifications:
- Bachelor's degree in Computer Science, Machine Learning, Data Science, or a related field
- 2+ years of experience in MLOps, DevOps, or ML engineering
- Proficiency in Python and ML frameworks such as Keras, PyTorch, Scikit-Learn, TensorFlow, and XGBoost
- Experience with containerization (Docker), orchestration (Kubernetes), and CI/CD tools
- Familiarity with healthcare datasets and privacy regulations
- Strong analytical skills to interpret model performance data and identify optimization opportunities
- Proven ability to optimize application performance, including improving code efficiency, right-sizing infrastructure usage, and reducing system latency
- Experience implementing rollback strategies, including version control, rollback triggers, and safe deployment practices across lower and upper environments
- 2+ years of experience developing in a cloud environment (AWS, GCS, Azure)
- 2+ years of experience with GitHub, GitHub Actions, CI/CD, and source control
- 2+ years working within an Agile environment

Preferred Qualifications:
- Experience with MLOps platforms like MLflow, TFX, or Kubeflow
- Healthcare experience, particularly using administrative and prior authorization data
- Proven experience with developing and deploying ML systems into production environments
- Experience working with Product, Engineering, Infrastructure, and Architecture teams
- Proficiency using Azure cloud-based services and infrastructure such as Azure MLOps
- Experience with feature flagging tools and strategies

To comply with HIPAA security standards (45 C.F.R. sec. 164.308(a)(3)), identity verification may be required as part of the application process. This is collected for compliance and security purposes and only reviewed if an applicant advances to the final interview stage. Reasonable accommodations are available upon request.

Technical Requirements: We require that all employees have the following technical capability at their home: high-speed internet over 10 Mbps and, specifically for all call center employees, the ability to plug in directly to the home internet router. These at-home technical requirements are subject to change with any scheduled re-opening of our office locations.

Evolent is an equal opportunity employer and considers all qualified applicants equally without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status. If you need reasonable accommodation to access the information provided on this website, please contact recruiting@evolent.com for further assistance.

The expected base salary/wage range for this position is $. This position is also eligible for a bonus component that would be dependent on pre-defined performance factors. As part of our total compensation package, Evolent is proud to offer comprehensive benefits (including health insurance benefits) to qualifying employees. All compensation determinations are based on the skills and experience required for the position and commensurate with the experience of selected individuals, which may vary above and below the stated amounts.
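As a sketch of the model-versioning and audit-trail work above, the following logs a training run with MLflow (named in the preferred qualifications); the model, parameter, and metric names are illustrative only.

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for real (protected) healthcare data.
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

    # Params, metrics, and the serialized model are recorded together,
    # giving each version a traceable audit trail and a rollback point.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("auc", auc)
    mlflow.sklearn.log_model(model, "model")
```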

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Ataccama Support Engineer

Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai

- 2+ years of experience supporting the Ataccama software application
- Experience in data quality, governance, and metadata management
- Extensive knowledge of Ataccama, ADF, and SQL
- Open to 24x7 support shift rotation
- Experience in business process mapping of data and analytics solutions
- Monitor and support Ataccama Data Quality rules execution and profiling jobs
- Troubleshoot data validation, anomaly detection, and scorecard generation issues
- Perform patching and software upgrades, and ensure compliance with the latest platform updates
- Work with business teams to resolve data integrity and governance-related incidents
- Maintain SLA commitments for resolving incidents and ensuring data accuracy
- Experience with the Ataccama ONE platform and knowledge of SQL for data validation

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai, Hyderabad, Pune

Work from Office

Your Role
You will work on end-to-end implementation in Hyperion Planning and Essbase:
- Essbase Calc Scripts, MDX, and MaxL
- Budgeting, forecasting, and financial analysis processes
- Application creation
- Developing metadata
- Developing forms
- Creating business rules (Calc Scripts)
- Creating data load rules
- Writing and maintaining financial reports in HFR and Hyperion BI
- Configuring security and process management
- FDMEE integration
- Writing and maintaining data load rules, calc scripts, and business rules in Hyperion Essbase and Hyperion Planning

Your Profile
- Has worked extensively with ASO cubes and report scripts
- Experience writing test schedules and test scenarios for integration testing and user acceptance testing
- Able to interface with clients and senior executives to understand requirements for building and optimizing their financial planning, budgeting, and forecasting processes and applications
- Good written and verbal communication skills
- Certified Hyperion Planning & Essbase Developer

What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Location: Hyderabad, Mumbai, Pune, Bengaluru

Posted 2 weeks ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Noida

Work from Office

About the Role: Shiksha is looking for a Community Specialist/Manager to scale our creator-led video content initiatives. You will be responsible for identifying, onboarding, and nurturing college-based video creators to produce compelling content that resonates with student audiences. Your role will also include direct outreach to students, helping them understand the campaign and guiding them on how they can earn by contributing content. You'll play a key part in growing a strong, self-sustaining creator community, ensuring quality output, and supporting platform discovery through smart use of YouTube SEO and content trends.

Key Responsibilities
- Identify and onboard college students and micro-creators to create authentic, edited short-form videos (Reels/Shorts) for the Shiksha brand.
- Personally connect with student creators via calls to explain campaign objectives, content expectations, and monetization opportunities.
- Build a scalable pipeline of active content contributors across colleges in India.
- Monitor and guide creator output to ensure brand alignment, consistency, and creative quality.
- Coordinate with the content and social media teams for publishing and promotion.
- Stay updated on creator trends, student subcultures, and platform algorithms to optimize content success.
- Where applicable, use YouTube SEO and keyword optimization to improve the reach and discoverability of creator-led content (see the sketch below).

Requirements
Must-Haves:
- 1–3 years of experience managing communities or sourcing video creators (preferably from campus/student networks).
- Strong interpersonal and verbal communication skills; comfortable with high volumes of calling and creator engagement.
- Ability to assess and give feedback on short-form content, particularly Reels and YouTube Shorts.
- Proven experience in creator onboarding, content workflow management, or campus-based video projects.
- Bachelor's degree from a recognized university.

Good-to-Have:
- Hands-on exposure to YouTube SEO, metadata optimization, or creator analytics tools.
- Experience using video editing platforms or tools.
- Prior work with student ambassadors, UGC campaigns, or influencer networks.

Why Shiksha? Be part of a fast-growing initiative that's changing how Indian students prepare for global education. At Shiksha Communities, your work will help build a network of creators who share real stories and real impact, while you grow your career in a creative, mission-driven environment.
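Where the role touches YouTube SEO and metadata optimization, a small audit script like the following sketch could check a video's snippet via the YouTube Data API v3 (google-api-python-client); the API key, video ID, and the 70-character title heuristic are assumptions for illustration, not YouTube rules:

```python
from googleapiclient.discovery import build

# Placeholders: supply a real API key and video ID.
youtube = build("youtube", "v3", developerKey="API_KEY")
response = youtube.videos().list(part="snippet", id="VIDEO_ID").execute()

snippet = response["items"][0]["snippet"]
checks = {
    "title under 70 chars (assumed heuristic)": len(snippet["title"]) <= 70,
    "description present": bool(snippet.get("description")),
    "tags present": bool(snippet.get("tags")),
}
for rule, passed in checks.items():
    print(f"{'PASS' if passed else 'FAIL'}: {rule}")
```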

Posted 2 weeks ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Pune, Bengaluru

Work from Office

Your Profile
- 3-5 years of hands-on experience with BigID or similar data discovery/classification tools (e.g., Varonis, Informatica, MIP).
- Strong understanding of data governance, data privacy, and compliance regulations (GDPR, CCPA, SOX, SEBI, etc.).
- Experience working with structured data in RDBMS (Oracle, MS SQL Server, PostgreSQL) and unstructured data sources (file servers, SharePoint, cloud repositories).
- Proficiency in configuring BigID policies, classifiers, data flows, and the discovery and classification operations modules (an illustrative classifier sketch follows below).
- Experience integrating BigID with security tools like Microsoft Information Protection, DLP solutions, or SIEM platforms.
- Familiarity with metadata management, data catalogs, and data lineage concepts.

Your Role
- Design, implement, and manage data discovery and classification workflows using the BigID platform for both structured (e.g., databases, data warehouses) and unstructured data (e.g., file shares, SharePoint, email).
- Configure and maintain BigID connectors to integrate with enterprise data sources including databases (Oracle, SQL Server, MySQL), cloud storage (AWS S3, Azure Blob), collaboration platforms (O365, Google Drive), and more.
- Define and customize classification policies, sensitivity labels, and data tagging rules to align with organizational data governance and compliance frameworks (e.g., GDPR, CCPA, SEBI, DORA).
- Collaborate with data owners, security teams, and compliance stakeholders to identify sensitive data types (PII, PCI, etc.) and apply appropriate classification and protection strategies.
- Integrate BigID with Microsoft Information Protection (MIP) and other DLP platforms or IRM tools to automate labeling and enforcement policies.
- Monitor discovery scans and classification jobs, troubleshoot issues, and optimize performance.
- Generate and present reports and dashboards to stakeholders, highlighting data classification coverage, risk areas, and remediation plans.

What you'll love about working here
We recognize the significance of flexible work arrangements, be it remote work or flexible hours, and provide an environment that supports a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Unix and SQL.
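To make the classifier concept concrete, here is a generic, hedged illustration of pattern-based sensitivity labeling in Python; these regexes and labels are invented for illustration and are not BigID's own classifiers or API:

```python
import re

# Invented labels and patterns, purely illustrative.
CLASSIFIERS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PAN": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),   # Indian PAN number format
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # crude payment-card check
}

def classify(text: str) -> set[str]:
    """Return the sensitivity labels whose pattern matches the text."""
    return {label for label, rx in CLASSIFIERS.items() if rx.search(text)}

print(classify("Contact: priya@example.com, PAN ABCDE1234F"))  # {'EMAIL', 'PAN'}
```

In a real platform, rules like these are configured rather than coded, and their hits feed the coverage and risk reports described above.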

Posted 2 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

The Fusion Data Intelligence Platform is an evolution of the Oracle Fusion Analytics Warehouse product, which will deliver business data-as-a-service with automated data pipelines, 360-degree data models for key business entities, rich interactive analytics, AI/ML models, and intelligent applications. These out-of-box capabilities will run on top of the Oracle Cloud Infrastructure (OCI) Data Lakehouse services, including Oracle Autonomous Database and Oracle Analytics Cloud, which enable full extensibility at the data, analytics, AI/ML, and application layers. There is therefore an urgent need to augment the team to develop valuable new features for the FDIP platform. These projects require strong full-stack, backend, and big data technology skills. For this position, we are looking for a Sr. Data Engineer/Modeler with experience in data warehousing projects. You will work with various large-scale data systems, such as Product Usage and Application Performance, to deliver data models powering insights platforms. You will work as part of the team designing all aspects of data management practices for the cloud.

Responsibilities:
- Translate business requirements into logical/physical data models that are easy to understand.
- Design declarative data warehouse transformation rules/guidelines for extraction/sourcing, loading, and transformation.
- Use SQL queries/custom scripts for rapid prototyping of data models/transforms (a sketch follows below).
- Design and develop unit/integration test data, plans, and scripts for data model deliverables.
- Create supporting documentation, such as metadata and diagrams of entity relationships, business processes, and process flows.

Required Skills
- BS or higher degree in Computer Science, Computer Engineering, or equivalent, with 5+ years of applied experience
- 5+ years of experience working in ETL, ODI, data architecture/modeling, and data integrations
- Expertise in data modeling principles/methods, including conceptual, logical, and physical data models
- Experience in developing BI semantic models
- Experience working with dimensionally modeled data
- Experience with Business Intelligence tools (Tableau, OAC, Domo) to represent insights
- Knowledge of statistical inference, forecasting, multivariate analysis, cluster analysis, and optimization
- Experience developing/supporting data solutions over multiple releases
- Experience with databases, performance tuning SQL, and understanding ETL pipelines
- Exposure to Python or Java, and big data systems (Spark, Hive, Hadoop)
- Experience with data modeling software

As a member of the software engineering division, you will assist in defining and developing software for tasks associated with developing, debugging, or designing software applications or operating systems. Provide technical leadership to other software developers. Specify, design, and implement modest changes to existing software architecture to meet changing needs.
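As referenced in the responsibilities, rapid prototyping of a transform often starts as a plain SQL query before being promoted into declarative warehouse rules. A minimal sketch with an in-memory SQLite database follows; the table, columns, and grain are hypothetical:

```python
import sqlite3

# Stand-in source table; a real prototype would query staging data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_usage (account TEXT, feature TEXT, events INTEGER);
    INSERT INTO raw_usage VALUES
        ('acme', 'reports', 12), ('acme', 'reports', 3), ('zen', 'ai', 7);
""")

# Aggregate raw events to the grain of the target fact table.
rows = conn.execute("""
    SELECT account, feature, SUM(events) AS total_events
    FROM raw_usage
    GROUP BY account, feature
""").fetchall()
print(rows)  # [('acme', 'reports', 15), ('zen', 'ai', 7)]
```

Once a query like this validates the intended grain and measures, the logic moves into the declarative transformation layer with tests and documentation.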

Posted 2 weeks ago

Apply

1.0 - 6.0 years

1 - 2 Lacs

Mumbai

Work from Office

This role is responsible for ensuring that content processed through The Orchard meets DSP metadata, infringement, and quality guidelines. This role also involves assisting with day-to-day requests, catalog clean-ups, and longer-term departmental improvement projects.

What you'll do:
- Review and QC audio and video content in accordance with DSP asset and metadata guidelines (e.g., Apple, Spotify, YouTube); a sketch of this kind of check follows below.
- Maintain off-hours coverage by working on Saturdays and Sundays (Tuesday or Wednesday is the preferred day to take off during the week).
- Provide feedback and educate internal teams on handling products that may violate The Orchard and DSP content guidelines.
- Contribute to internal process capture and documentation.
- Communicate issues and roadblocks pertaining to department projects and processes with team members, management, and other departments.
- Partner with other departments (Label Management, Legal) to identify, report on, and resolve issues, providing exceptional support for clients.
- Work closely with the Product and Tech departments to provide feedback and implement new strategies for optimal operational efficiency.
- Stay updated on changes to DSP guidelines and industry best practices, recommending process enhancements to improve content quality and compliance.

Who you are:
- 1+ years of experience in an operations role or supply chain environment in the entertainment industry, or equivalent education/experience.
- Knowledge of music metadata in a digital distribution or digital streaming/download context.
- Outstanding written and verbal communication skills; impeccable follow-up and follow-through capabilities.
- Well-organized and attentive to detail.
- Discerning eye and ear for audio and visual content; bonus points for specialized knowledge of independent music or niche genres.
- Basic knowledge of copyright and the current popular music landscape.
- Comfortable with high-volume tasks.

Bonus Points:
- Fluency in a second language
- Experience working with a record label, distributor, and/or digital service content management systems (e.g., iTunes Connect, YouTube CMS, VEVO Backstage, Spotify Scatman)
- iTunes and Spotify style guide experience

What we give you:
- You join a vibrant global community with the opportunity to channel your passion every day
- A modern office environment designed for you, empowering you to bring your best
- Investment in your professional growth and development, enabling you to thrive in our vibrant community
- The space to accelerate progress, positively disrupt, and create what happens next
- The platform to champion positive change, with the opportunity to contribute to our social impact, diversity, equity, and inclusion initiatives

Equal Opportunities: As an active part of a culturally and socially diverse society, Sony Music's aim is that our workforce is diverse and inclusive. Sony Music is an equal opportunity employer and supports workforce diversity. We employ, retain, promote, and otherwise treat all employees and job applicants according to their merit, qualifications, competence, and talent. We apply this policy without regard to any individual's sex, race, religion, origin, age, sexual orientation, marital status, medical condition, or disability.

Privacy Policy: Please read our privacy policy before beginning the application process, as you will need to agree to its terms before submitting your information.
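As a concrete illustration of the QC work described above, here is a minimal Python sketch of rule-based track-metadata checks; the rules paraphrase the kind of requirements DSP style guides impose and are assumptions, not any DSP's actual specification:

```python
# Minimal metadata QC pass over a single track record (a dict).
def qc_track(track: dict) -> list[str]:
    """Return a list of human-readable issues found on the track."""
    issues = []
    if not track.get("isrc"):
        issues.append("missing ISRC")
    if track.get("title", "").isupper():
        issues.append("title in all caps")
    if "feat." in track.get("title", "").lower() and not track.get("featured_artists"):
        issues.append("featuring credit in title but no featured-artist field")
    return issues

track = {"title": "MIDNIGHT DRIVE (feat. Asha)", "isrc": ""}
for issue in qc_track(track):
    print("FLAG:", issue)
```

A real QC pipeline would run rules like these in bulk and route flagged products back to labels with the feedback described above.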

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Salesforce Developer will work on an Agile scrum team and will be responsible for managing assigned work requests throughout the SDLC, building customized solutions using Salesforce declarative tools, custom code, and AppExchange products. The candidate will need to gain an understanding of business requirements and develop against them accordingly. You're the type of person that is always looking to learn and embrace existing and emerging integration capabilities/approaches. You're passionate about all things technology and keep up to date with the latest industry trends and products. You're the type of person that loves the challenge of solving complex technical problems by thinking outside the box.

A normal day looks like:

Development
- Analysing requirements to ensure information is complete and determine complexity.
- Applying your knowledge of out-of-the-box functionality to determine what different design options exist and highlight any compromises.
- Developing and documenting customised solutions within the Salesforce platform, including enhancing and creating flows, functions, and configurations, and maintaining a balance between point-and-click and coded solutions.
- Developing system integrations involving Salesforce.com web services (JSON, SOAP) and the Metadata API (a sketch of a simple external integration follows below).
- Ensuring developed solutions are written to be maintainable, scalable, testable, and deployable.
- Ensuring that solutions adhere to Salesforce best practices and leverage standard CRM functionality where possible.
- Support for bug fixing.
- Maintaining system security and integrity.
- Deployment/release of changes through sandboxes to production.

Leadership and Collaboration
- Participating in peer review sessions with other Salesforce developers and admins within your team or Cigna's Salesforce community.
- Collaborating with other technology teams to facilitate interfaces between Salesforce and other systems.
- Collaborating with Cigna's Architecture team regarding the project and related Salesforce effort estimates.
- Participating in Agile team activities, such as estimation, collaboration in requirements definition, code reviews, and feedback contribution during retrospectives.

Ideally you will have experience with or have:
- Bachelor's degree in Computer Science or a related discipline
- 4+ years in Salesforce administration and development
- 2+ years as a Salesforce Developer with Apex, triggers, trigger frameworks, and Visualforce
- 1+ years of experience working with Lightning Web Components
- Hands-on Salesforce experience on either Service Cloud or Sales Cloud
- Experience with code refactoring
- Hands-on Salesforce experience with Salesforce Experience Cloud advantageous
- Knowledge of the Salesforce REST, SOAP, and Metadata APIs, and integration experience calling external REST/SOAP APIs and web services from Salesforce
- Knowledge of agile processes, the software development lifecycle, and supporting tools
- General understanding of common IT applications
- Exceptional English verbal, written, and interpersonal skills
- Salesforce Platform Developer I certified

It would be great if you have:
- 1+ years as a Salesforce Administrator with declarative development: validations, Flow, object model, security
- Certified Salesforce Administrator, Platform App Builder, or Platform Developer II
- Certified Copado Fundamentals I and II
- Experience with Salesforce DevOps tools such as Copado, Git, SFDX
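To illustrate the external side of such integrations, here is a hedged sketch using the third-party simple_salesforce Python library to call Salesforce's REST API from outside the platform (the Apex-side integrations the listing describes are not shown); credentials, objects, and field values are placeholders:

```python
from simple_salesforce import Salesforce

# Placeholders: real credentials and a security token are required.
sf = Salesforce(
    username="dev@example.com",
    password="password",
    security_token="TOKEN",
)

# SOQL query against a standard object via the REST API.
results = sf.query("SELECT Id, Name FROM Account LIMIT 5")
for record in results["records"]:
    print(record["Id"], record["Name"])

# Create a record; field values are invented.
sf.Contact.create({"LastName": "Rao", "Email": "rao@example.com"})
```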

Posted 2 weeks ago

Apply

15.0 - 20.0 years

50 - 55 Lacs

Pune

Work from Office

Role: SAP Data Architect
Work Mode: Remote
Contract: 6 months

Position Summary: We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation (a reconciliation sketch follows below).
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud, and SAC, and integrating these with an AWS Data Lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications:
- SAP Certified Technology Associate (SAP S/4HANA / Datasphere)
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)
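As referenced in the migration bullet above, a reconciliation step usually compares row counts and key figures between the legacy extract and the S/4HANA target. A minimal, tool-agnostic sketch in Python follows; the business-partner keys and amounts are invented, and in practice both sides would come from database queries rather than literals:

```python
# Stand-in extracts: key -> control amount. Real runs would pull these
# from the legacy system and the S/4HANA target via SQL.
legacy = {"BP001": 120_500.00, "BP002": 98_310.25, "BP003": 0.00}
target = {"BP001": 120_500.00, "BP002": 98_310.25}

missing = legacy.keys() - target.keys()
mismatched = {k for k in legacy.keys() & target.keys() if legacy[k] != target[k]}

print(f"rows: legacy={len(legacy)} target={len(target)}")
print(f"missing in target: {sorted(missing)}")
print(f"value mismatches: {sorted(mismatched)}")
```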

Posted 2 weeks ago

Apply

15.0 - 20.0 years

18 - 22 Lacs

Ahmedabad

Remote

Role: SAP Data Architect
Work Mode: Remote
Contract: 6 months

Position Summary: We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation.
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud, and SAC, and integrating these with an AWS Data Lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications:
- SAP Certified Technology Associate (SAP S/4HANA / Datasphere)
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office

We are seeking a highly capable MLOps Engineer to join our growing AI/ML team. You will bridge the gap between data science and operations, ensuring that machine learning models are efficiently tested, deployed, monitored, and maintained in production environments. You will work closely with data scientists, software engineers, infrastructure, and development teams to build scalable and reliable ML infrastructure. You will be instrumental in supporting clinical decision-making, operational efficiency, quality outcomes, and patient care.

What You Will Be Doing:

Model Deployment and Infrastructure
- Design, build, and maintain scalable, secure ML pipelines for model training, validation, deployment, and monitoring
- Automate deployment workflows using CI/CD pipelines and infrastructure-as-code tools
- Partner with infrastructure teams to manage (Azure) cloud-based ML infrastructure, ensuring compliance with InfoSec and AI policies
- Ensure applications run at peak efficiency

Model Testing, Monitoring, and Validation
- Develop rigorous testing frameworks for ML models, including clinical validation, traditional model performance measures, population segmentation, and edge-case analysis
- Build monitoring systems to detect model drift, overfitting, data anomalies, and performance degradation in real time (a minimal drift-check sketch follows below)
- Continuously analyze model performance metrics and operational logs to identify improvement opportunities
- Translate monitoring insights into actionable recommendations for data scientists to improve model precision, recall, fairness, and efficiency

Model Transparency & Governance
- Maintain detailed audit trails, logs, and metadata for all model versions, training datasets, and configurations to ensure full traceability and support internal audits
- Ensure models meet transparency and explainability standards using tools like SHAP, LIME, or integrated explainability APIs
- Collaborate with data scientists and clinical teams to ensure models are interpretable, actionable, and aligned with practical applications
- Support corporate Compliance and AI Governance policies
- Advocate for best practices in ML engineering, including reproducibility, version control, and ethical AI
- Develop product guides, model documentation, and model cards for internal and external stakeholders

Required Qualifications:
- Bachelor's Degree in Computer Science, Machine Learning, Data Science, or a related field
- 2+ years of experience in MLOps, DevOps, or ML engineering
- Proficiency in Python and ML frameworks such as Keras, PyTorch, Scikit-Learn, TensorFlow, and XGBoost
- Experience with containerization (Docker), orchestration (Kubernetes), and CI/CD tools
- Familiarity with healthcare datasets and privacy regulations
- Strong analytical skills to interpret model performance data and identify optimization opportunities
- Proven ability to optimize application performance, including improving code efficiency, right-sizing infrastructure usage, and reducing system latency
- Experience implementing rollback strategies, including version control, rollback triggers, and safe deployment practices across lower and upper environments
- 2+ years of experience developing in a cloud environment (AWS, GCP, Azure)
- 2+ years of experience with GitHub, GitHub Actions, CI/CD, and source control
- 2+ years working within an Agile environment

Preferred Qualifications:
- Experience with MLOps platforms like MLflow, TFX, or Kubeflow
- Healthcare experience, particularly using administrative and prior authorization data
- Proven experience with developing and deploying ML systems into production environments
- Experience working with Product, Engineering, Infrastructure, and Architecture teams
- Proficiency using Azure cloud-based services and infrastructure such as Azure MLOps
- Experience with feature flagging tools and strategies
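The monitoring responsibilities above call for detecting model drift. A minimal sketch of one common approach, a two-sample Kolmogorov-Smirnov test on a single feature, using only NumPy and SciPy; the distributions, sample sizes, and significance threshold are assumptions for illustration:

```python
import numpy as np
from scipy.stats import ks_2samp

def feature_drifted(reference: np.ndarray, live: np.ndarray, alpha: float = 0.05) -> bool:
    """Two-sample Kolmogorov-Smirnov test on one feature; True means the
    live distribution differs from the training-time reference at level alpha."""
    _statistic, p_value = ks_2samp(reference, live)
    return p_value < alpha

# Synthetic stand-ins: training-time feature values vs. recent production traffic.
rng = np.random.default_rng(42)
reference = rng.normal(loc=0.0, scale=1.0, size=5_000)
live = rng.normal(loc=0.4, scale=1.0, size=1_000)  # deliberately shifted

if feature_drifted(reference, live):
    print("Drift detected: flag the model for review or retraining")
```

A production monitor would run such a check per feature on a schedule and wire the result into the alerting and rollback triggers the listing mentions.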

Posted 2 weeks ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Pune

Work from Office

We're seeking a future team member for the role of Vice President, Data Management Engineer I to join our Data Solution Platform team. This role is located in Pune, Maharashtra (hybrid).

In this role, you'll make an impact in the following ways:
- The Data Governance Analyst collaborates with the Platform's Data Leader/DDO to drive data management within the Platform. This role oversees the implementation and enforcement of data governance policies and procedures, and works closely with stakeholders to define data standards and ensure regulatory compliance.
- Work under the direction of the data leader or manager to implement and enforce data governance policies and procedures.
- Identify, manage, and measure data risks.
- Participate in the Platform's data maturity assessment.
- Manage the identification and maintenance of authoritative data for the Platform, including metadata.
- Adhere to the requirements of the data management policies.

To be successful in this role, we're seeking the following:
- Bachelor's degree or applicable work experience
- Eagerness to learn and lead data initiatives

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies