
9 Data Sharing Jobs

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

9.0 - 13.0 years

10 - 14 Lacs

Hyderabad

Hybrid

Naukri logo

You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. You will draw on domain, technical, and business-process expertise to provide exceptional support for Amgen's data governance framework, working closely with business stakeholders and data analysts to ensure its implementation and adoption. You will collaborate with the Product Owner and other Business Analysts to ensure operational support and excellence from the team.

Roles & Responsibilities:
- Responsible for implementing the data governance and data management framework for the Commercialization domain of the biopharma lifecycle.
- Operationalize the enterprise data governance framework and align the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management.
- Work with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drive cross-functional alignment in your domain(s) of expertise to ensure adherence to data governance principles.
- Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains.
- Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric), define the specifications shaping the development and implementation of data foundations.
- Build strong relationships with key business leads and partners to ensure their needs are being met.

Functional Skills:

Must-Have:
- Technical skills and knowledge of pharma processes, with specialization in the Commercialization domain of the biopharma lifecycle.
- In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc.
- In-depth experience with the data product development life cycle, including enabling data dictionaries and a business glossary to increase data product reusability and data literacy.
- Customer-focused, with excellent written and verbal communication skills; able to work confidently with internal Amgen business stakeholders and external service partners on business-process and technology topics.
- In-depth experience working with or supporting systems used for the data governance framework (e.g., Collibra, Alation).
- Excellent problem-solving skills and committed attention to detail in finding solutions.

Good-to-Have:
- Experience working with data governance councils or forums.
- Experience with Agile software development methodologies (Scrum).
- Proficiency in data analysis and quality tools (e.g., SQL, Excel, Python, or SAS).

Soft Skills:
- Highly organized and able to work under minimal supervision.
- Excellent analytical and assessment skills.
- Ability to work effectively with global, virtual teams and to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals; ambitious to further develop skills and career.
- Ability to build business relationships and understand end-to-end data use and needs.
- Excellent interpersonal skills (team player); people management skills in either a matrix or a direct line function.
- Strong verbal and written communication skills; good presentation and public-speaking skills.
- High degree of initiative and self-motivation.
- Strong attention to detail, quality, time management, and customer focus.

Basic Qualifications: Any degree and 9-13 years of experience.

Posted 3 days ago

Apply

5.0 - 9.0 years

27 - 42 Lacs

Chennai

Work from Office


Job Summary: Duties and Responsibilities of a Mainframe DB2 Systems DBA
(i) Create and manage IBM DB2 databases.
(ii) Experience in DB2 upgrades, installation, and RSU maintenance of DB2 z/OS products using SMP/E.
(iii) Experience in performance tuning and troubleshooting.
(iv) Good knowledge of disaster recovery planning and execution.
(v) Thorough knowledge of buffer pool tuning, backup and recovery procedures, data migration, and archival.
(vi) Knowledge of DB2 function level upgrades.
(vii) Experience in upgrades and maintenance of BMC DB2 products, QMF, HPU, etc.
(viii) Thorough understanding of replication and its related upgrades.
(ix) Able to provide design and administration support to application developers.
(x) Good team player with strong behavioral and administration skills.
(xi) Provide training to junior members of the team.
(xii) Self-motivated; able to work unsupervised as well as part of a team in a pressured environment.
(xiii) Excellent communication and presentation skills.
(xiv) Excellent critical-thinking and problem-solving skills; patient and professional demeanor, with a can-do attitude.

Roles and Responsibilities:
(i) Installing DB2 z/OS upgrades, applying RSU maintenance and PTFs.
(ii) Strong knowledge of and experience with SMP/E, ISPF dialog management, JCL, and REXX.
(iii) Strong understanding of DB2 z/OS system administration concepts.
(iv) Should have worked with WLM.
(v) Experience with BMC DB2, QMF, HPU, and CA-Platinum, including upgrades of the same.
(vi) Strong knowledge of DB2 function level upgrades, MainView, and Apptune.
(vii) Sound understanding of performance tuning, buffer pool tuning, troubleshooting of performance-related issues, and problem diagnosis.
(viii) Sound knowledge of DB2 disaster recovery procedures and backup, restore, and archival practices.
(ix) Thorough understanding of DB2 data sharing environments, data migration, and DB2 security.
(x) Strong experience with stored procedures.
(xi) Thorough knowledge of DB2 utilities such as UNLOAD, LOAD, REORG, IMAGECOPY, and DSN1COPY.
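The utilities in item (xi) are driven by control statements submitted with the utility job. As a rough illustration of that syntax (the database and tablespace names DB1/TS1 are hypothetical examples, not from this posting), a small Python helper can generate the statements a DBA would submit:

```python
# Illustrative sketch only: builds DB2 for z/OS utility control statements
# as strings. Object names (DB1, TS1) are hypothetical examples.

def imagecopy_stmt(db: str, ts: str) -> str:
    """Full image copy of a tablespace via the COPY utility."""
    return f"COPY TABLESPACE {db}.{ts} COPYDDN(SYSCOPY) FULL YES"

def reorg_stmt(db: str, ts: str, online: bool = True) -> str:
    """REORG; SHRLEVEL CHANGE permits concurrent read/write access."""
    shrlevel = "CHANGE" if online else "NONE"
    return f"REORG TABLESPACE {db}.{ts} SHRLEVEL {shrlevel}"

if __name__ == "__main__":
    print(imagecopy_stmt("DB1", "TS1"))
    print(reorg_stmt("DB1", "TS1"))
```

In practice these statements would run inside a batch utility job (e.g., via DSNUPROC/DSNUTILB), with an image copy typically taken before an online REORG as part of the backup and recovery practices the role describes.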

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad

Work from Office


Senior CPQ Developer
Job Type: Full-time
Location: India (Remote)
Work Timings: 4 pm to 11 pm IST
Notice Period: Immediate to 15 days

Responsibilities:
- Configure and customize Salesforce CPQ to meet business requirements.
- Configure Salesforce CPQ to automate pricing, discounting, and quoting processes.
- Troubleshoot and resolve CPQ-related issues and provide ongoing support.
- Integrate Salesforce CPQ with other business systems and third-party applications.
- Work closely with stakeholders to gather requirements and translate them into technical specifications.
- Work independently on individual pieces of work and solve problems, including designing program flow, effective coding, and unit testing.
- Take ownership of CPQ implementations.

Qualifications:
- 4+ years of Salesforce development experience.
- 2+ years in Salesforce CPQ administration and development.

Technical Skills:
- Proficiency in Salesforce CPQ configuration and customization.
- Strong understanding of pricing strategies and discount structures.
- Experience with product catalogue setup, including bundling and configuration.
- Previous experience working with a global, enterprise Salesforce implementation.
- Familiarity with Salesforce integration tools and techniques (REST/SOAP APIs, etc.).
- Participation in refactoring existing development to improve the underlying foundation.
- Ability to map business and functional requirements to Salesforce out-of-the-box features and functionality.
- Extensive experience in Salesforce administration, security management, and data migration.
- Strong hands-on experience with Salesforce Lightning flows.
- Knowledge of Salesforce development, including triggers, Apex, SOQL, and Lightning with Aura and web components.
- Strong knowledge of data sharing rules and functionality.
- Clear knowledge of the CPQ deployment process.
- Strong knowledge of industry best practices in development.
- Good oral and written communication skills.

Certifications:
- Salesforce Certified CPQ Specialist
- Salesforce Certified Platform Developer I or II (preferred)
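The "pricing strategies and discount structures" the role asks for often boil down to tiered volume discounting, which CPQ automates via discount schedules. As a language-neutral sketch (the tier boundaries and rates below are hypothetical examples, not CPQ defaults or this employer's pricing):

```python
# Illustrative model of tiered volume discounting, the kind of pricing
# logic a CPQ discount schedule automates. Tiers are hypothetical.

DISCOUNT_TIERS = [  # (minimum quantity, discount rate), highest tier first
    (100, 0.20),
    (50, 0.10),
    (10, 0.05),
    (0, 0.00),
]

def quote_line_total(unit_price: float, quantity: int) -> float:
    """Apply the first (largest) tier whose minimum quantity is met."""
    rate = next(r for min_qty, r in DISCOUNT_TIERS if quantity >= min_qty)
    return round(unit_price * quantity * (1 - rate), 2)

if __name__ == "__main__":
    print(quote_line_total(10.0, 5))    # no tier reached: 50.0
    print(quote_line_total(10.0, 100))  # 20% tier: 800.0
```

In Salesforce CPQ itself this logic lives in configuration (discount schedules on products) rather than code; the sketch only shows the underlying calculation a developer needs to reason about when testing quotes.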

Posted 2 weeks ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office


Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
As a Mainframe DB2 System Programmer SME at Kyndryl, you are a project-based subject matter expert in all things infrastructure – good at providing analysis, documenting and diagramming work for hand-off, offering timely solutions, and generally "figuring it out." This is a hands-on role where your feel for the interaction between a system and its environment will be invaluable to every one of your clients.

There are two halves to this role: first, contributing to current projects, where you analyze problems and tech issues, offer solutions, and test, modify, automate, and integrate systems; and second, long-range strategic planning of IT infrastructure and operational execution. This role isn't specific to any one platform, so you'll need a good feel for all of them – and because of this, you'll experience variety and growth at Kyndryl that you won't find anywhere else.

You'll be involved early to offer solutions, help decide whether something can be done, and identify the technical and timeline risks up front. This means dealing with both client expectations and internal challenges – in other words, there are plenty of opportunities to make a difference, and a lot of people will witness your contributions. In fact, a frequent sign of success for our Infrastructure Specialists is when clients come back to us and ask for the same person by name. That's the kind of impact you can have! This is a project-based role where you'll enjoy deep involvement throughout the lifespan of a project, as well as the chance to work closely with Architects, Technicians, and PMs. Whatever your current level of tech savvy or where you want your career to lead, you'll find the right opportunities and a buddy to support your growth. Boredom? Trust us, that won't be an issue.

Your Future at Kyndryl
There are lots of opportunities to gain certifications and qualifications on the job, and you'll continuously grow as a Cloud Hyperscaler. Many of our Infrastructure Specialists are on a path toward becoming either an Architect or Distinguished Engineer, and there are opportunities at every skill level to grow in either of these directions.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Experience:
- 9 to 14 years of experience in Mainframe DB2 system programming.
- Experience and expertise in Monoplex and SYSPLEX architectures and data sharing architecture.
- Experience in DB2 subsystem setup, both stand-alone and data sharing.
- Experience in DB2 backup and recovery scenarios: traditional image copy, FLASH, mirroring, and related methodologies.
- REXX/CLIST and ISPF programming skills.
- Good knowledge of transactional environments such as CICS and IMS architecture and their interface with DB2.
- DB2 data sharing skills: installation, customization, DB2 structures, CFRM policy, DB2 CF recovery, data sharing locking mechanisms, etc.
- Experience and good exposure to the SMP/E process.
- Experience and expertise in software migration and maintenance procedures: install, backout, and re-migration for DB2 V9, V10, and V11.
- Good knowledge of DB2 health checks, DB2 security, and patch management.
- Good knowledge of and experience with DB2 traces, SMF, RMF, and WLM.
- Good knowledge of z/OS and DB2 system performance tuning methods.

Preferred Skills and Experience:
- Bachelor's degree in computer science or a related field.
- Experience with DB2 Connect and IBM Data Studio.
- DB2 cloning process: IBM DB2 Cloning Tool or customized in-house tools.
- Good knowledge of SYSPLEX, data sharing, and distributed problem-solving.
- Experience in problem debugging and vendor-specific PMR ticketing procedures.

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 3 weeks ago

Apply

1 - 3 years

8 - 13 Lacs

Bengaluru

Work from Office


Azure Platform Engineer (Databricks)

Platform Design
- Define best practices for the end-to-end Databricks platform.
- Work with Databricks and internal teams on evaluation of new Databricks features (private preview/public preview).
- Hold ongoing discussions with Databricks product teams on product features.

Platform Infra
- Create new Databricks workspaces (premium, standard, serverless) and clusters, including right-sizing.
- Drop unused workspaces.

Delta Sharing
- Work with enterprise teams on connected data (data sharing).

User Management
- Create new security groups and add/delete users.
- Assign Unity Catalog permissions to respective groups/teams.
- Manage the Quantum Collaboration platform – a sandbox for enterprise teams for ideation and innovation.

Troubleshooting Issues in Databricks
- Investigate and diagnose performance issues or errors within Databricks.
- Review and analyze Databricks logs and error messages.
- Identify and address problems related to cluster configuration or job failures.
- Optimize Databricks notebooks and jobs for performance.
- Coordinate with Databricks support for unresolved or complex issues.
- Document common troubleshooting steps and solutions.
- Develop and test Databricks clusters to ensure stability and scalability.

Governance
- Create dashboards to monitor job performance, cluster utilization, and cost.
- Design dashboards to cater to various user roles (e.g., data scientists, admins).
- Use Databricks APIs or integration with monitoring tools for up-to-date metrics.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention – of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro.
Realize your ambitions. Applications from people with disabilities are explicitly welcome.
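The "assign Unity Catalog permissions to respective groups/teams" duty above typically reduces to a handful of SQL GRANT statements run against a workspace. As a hedged sketch (the group, catalog, and schema names are hypothetical; statements would be executed in a Databricks SQL warehouse, not locally), a Python helper can generate the grants for a read-only group:

```python
# Sketch of Unity Catalog read-access grants. Names are hypothetical;
# the generated SQL uses standard Unity Catalog privileges
# (USE CATALOG, USE SCHEMA, SELECT).

def grant_read_access(group: str, catalog: str, schema: str) -> list:
    """Return GRANT statements giving a group read access to one schema."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{group}`",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{group}`",
        f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{group}`",
    ]

if __name__ == "__main__":
    for stmt in grant_read_access("data-science", "main", "sales"):
        print(stmt)
```

Scripting the grants this way (rather than clicking through the UI) keeps group-to-schema permissions reviewable and repeatable across the many workspaces the role manages.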

Posted 1 month ago

Apply

3 - 8 years

15 - 25 Lacs

Bhubaneshwar, Bengaluru, Hyderabad

Hybrid


Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Bhubaneswar/Hyderabad
Required Skills: Snowflake development, Snowpipe, SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp).

Posted 1 month ago

Apply

3 - 8 years

15 - 25 Lacs

Bengaluru, Hyderabad, Noida

Hybrid


Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi
Required Skills: Snowflake development, Snowpipe
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp).

Posted 1 month ago

Apply

6 - 10 years

15 - 30 Lacs

Chennai, Hyderabad, Kolkata

Work from Office


About Client: Hiring for one of our multinational corporations!

Job Title: Snowflake Developer
Qualification: Graduate
Relevant Experience: 6 to 8 years
Must-Have Skills: Snowflake, Python, SQL

Roles and Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions
- Write and maintain Python scripts for data processing and automation
- Work with cross-functional teams to implement scalable data pipelines
- Ensure data security and performance tuning in Snowflake
- Debug and troubleshoot database and data processing issues

Location: Kolkata, Hyderabad, Chennai, Mumbai
Notice Period: Up to 60 days
Mode of Work: On-site

Thanks & Regards,
Nushiba Taniya M
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India
Direct Number: 08067432408 | Nushiba@blackwhite.in | www.blackwhite.in

Posted 2 months ago

Apply

3 - 8 years

5 - 15 Lacs

Chennai

Work from Office


A Snowflake Architect is responsible for designing, implementing, and optimizing data solutions using the Snowflake Cloud Data Platform, ensuring scalability, security, and high performance in data warehousing, analytics, and cloud data solutions.

Role & responsibilities:

1. Architecture & Design
- Design end-to-end data solutions using the Snowflake Cloud Data Platform.
- Define data architecture strategy, ensuring scalability and security.
- Establish best practices for Snowflake implementation, including data modeling, schema design, and query optimization.
- Design data lakes, data marts, and enterprise data warehouses (EDW) in Snowflake.

2. Data Engineering & Development
- Oversee ETL/ELT pipelines using Snowflake Snowpipe, Streams, Tasks, and stored procedures.
- Ensure efficient data ingestion, transformation, and storage using SQL, Python, or Scala.
- Implement data partitioning, clustering, and performance tuning for optimized query execution.

3. Security & Compliance
- Implement role-based access control (RBAC) and data governance policies.
- Ensure encryption, auditing, and data masking for security compliance.
- Define multi-cloud strategies (AWS, Azure, GCP) for Snowflake deployments.

4. Performance Optimization
- Optimize query performance and warehouse compute resources to reduce costs.
- Implement materialized views, query acceleration, and caching to improve performance.
- Monitor Snowflake usage, cost management, and auto-scaling capabilities.

5. Integration & Automation
- Integrate Snowflake with BI tools (Tableau, Power BI, Looker) and data lakes (S3, Azure Blob, GCS).
- Automate data workflows and pipeline orchestration using Airflow, dbt, or Snowflake Tasks.
- Implement CI/CD pipelines for data model deployments and schema changes.

6. Stakeholder Collaboration & Leadership
- Work closely with business analysts, data scientists, and IT teams to define requirements.
- Act as a technical advisor for Snowflake-related decisions and best practices.
- Provide mentorship and training to data engineers and analysts on Snowflake architecture.

Key Skills Required:
- Snowflake Data Warehouse (warehouses, Secure Data Sharing, multi-cluster architecture)
- SQL, Python, Scala (for data processing and scripting)
- ETL/ELT & data pipelines (Informatica, Talend, dbt, Airflow)
- Cloud services (AWS, Azure, GCP integration)
- Performance tuning (query optimization, Snowflake caching)
- Security & governance (RBAC, PII data masking, compliance)
- BI tools integration (Tableau, Power BI, Looker)
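The Streams-and-Tasks pattern named in the data engineering responsibilities pairs a change-capture stream with a scheduled task that drains it. As a hedged sketch of that DDL (table, stream, task, and warehouse names here are hypothetical examples, held as Python strings since the statements only run inside Snowflake):

```python
# Sketch of the Snowflake Streams + Tasks incremental-load pattern.
# Object names (raw_orders, orders_stream, merge_orders, transform_wh)
# are hypothetical examples.

# A stream records row-level changes on the source table.
STREAM_DDL = """
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;
"""

# A task wakes on a schedule but only runs when the stream has data;
# consuming the stream in DML advances its offset automatically.
TASK_DDL = """
CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO orders_clean
  SELECT order_id, amount
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';
"""

if __name__ == "__main__":
    print(STREAM_DDL.strip())
    print(TASK_DDL.strip())
```

The WHEN clause is the cost lever: the task's warehouse is only resumed when the stream actually holds changes, which is why this pattern is preferred over unconditional polling jobs.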

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
