Jobs
Interviews

51 Data Sharing Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

1 - 2 Lacs

Thane

Work from Office

About The Role
Job Role: Periodic extraction and publishing of MIS reports, dashboards, infographics, and data dumps. Ensure data is provided to top management on time and verify its accuracy. Maintain timely submission and circulation of data to the relevant stakeholders. Ensure data is provided for compliance, audits, RCSA, and related requirements. Sales Management: provide data and eligible bases to the sales management unit to drive sales across the contact center. Execute ad hoc requests in a timely and accurate manner. Comply with all control and compliance guidelines on data sharing.
Requirements: Graduate with a minimum of 1 year of MIS experience. Good communication skills. Hands-on with infographics, MS Excel, VBA (macros), and MS Access. Proficient in advanced Excel, including HLOOKUP, VLOOKUP, and power pivots. Basic analytics skills. Expertise in PowerPoint, Prezi, and other slide-making platforms. Comfortable with flexible shift and work timings. Same posting description for internal and external candidates.
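The VLOOKUP skill the posting asks for is an exact-match lookup against a table. As a rough illustration (not part of the posting), the same pattern can be scripted in plain Python, which is the kind of automation an MIS analyst reaches for when an Excel formula no longer scales; the agent table and its fields below are made up for the example.

```python
def vlookup(key, table, col):
    """Return column `col` of the first row whose first field equals `key`,
    mimicking Excel's VLOOKUP(key, table, col, FALSE) exact-match mode."""
    for row in table:
        if row[0] == key:
            return row[col - 1]  # Excel columns are 1-indexed
    return "#N/A"  # Excel's marker for a missed lookup

# Hypothetical agent roster used as the lookup table
agents = [
    ("A101", "Asha", "Thane"),
    ("A102", "Ravi", "Mumbai"),
]

print(vlookup("A102", agents, 2))  # look up the name column
print(vlookup("A999", agents, 2))  # unknown key falls through to #N/A
```

For large tables a dict keyed on the lookup column would replace the linear scan, but the formula-to-code mapping is easiest to see in this form.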

Posted 4 days ago

Apply

1.0 - 5.0 years

1 - 2 Lacs

Thane

Work from Office

About The Role
Job Role: Periodic extraction and publishing of MIS reports, dashboards, infographics, and data dumps. Ensure data is provided to top management on time and verify its accuracy. Maintain timely submission and circulation of data to the relevant stakeholders. Ensure data is provided for compliance, audits, RCSA, and related requirements. Sales Management: provide data and eligible bases to the sales management unit to drive sales across the contact center. Execute ad hoc requests in a timely and accurate manner. Comply with all control and compliance guidelines on data sharing.
Requirements: Graduate with a minimum of 1 year of MIS experience. Good communication skills. Hands-on with infographics, MS Excel, VBA (macros), and MS Access. Proficient in advanced Excel, including HLOOKUP, VLOOKUP, and power pivots. Basic analytics skills. Expertise in PowerPoint, Prezi, and other slide-making platforms. Comfortable with flexible shift and work timings. Same posting description for internal and external candidates.

Posted 6 days ago

Apply

6.0 - 10.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Job description
Position Title: Data Engineer (Snowflake Lead)
Experience: 7+ Years
Shift Schedule: Rotational Shifts
Location: Hyderabad

Role Overview
We are seeking an experienced Data Engineer with strong expertise in Snowflake to join the Snowflake Managed Services team. This role involves data platform development, enhancements, and production support across multiple clients. You will be responsible for ensuring stability, performance, and continuous improvement of Snowflake environments.

Key Responsibilities
- Design, build, and optimize Snowflake data pipelines, data models, and transformations.
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations.
- Troubleshoot job failures, resolve incidents, and perform root cause analysis (RCA).
- Monitor warehouses, tune queries, and optimize Snowflake performance and costs.
- Manage service requests such as user provisioning, access control, and role management.
- Create and maintain documentation, runbooks, and standard operating procedures.

Required Skills & Experience
- 5+ years of hands-on experience in Snowflake development and support.
- Strong expertise in SQL, data modeling, and query performance tuning.
- Experience with ETL/ELT development and orchestration tools (e.g., Azure Data Factory).
- Familiarity with CI/CD pipelines and scripting (Python or PySpark).
- Strong troubleshooting and incident resolution skills.

Preferred Skills
- SnowPro Core Certification.
- Experience with ticketing systems (ServiceNow, Jira).
- Hands-on experience with Azure cloud services.
- Knowledge of ITIL processes.
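A recurring transformation in the pipeline work described above is "latest record wins" deduplication before merging into a target table. The sketch below is purely illustrative (the field names `id` and `loaded_at` are assumptions, not from the posting) and shows the pattern in plain Python; in practice this would run as SQL (e.g. `QUALIFY ROW_NUMBER() ... = 1`) inside the warehouse.

```python
def dedupe_latest(rows, key="id", ts="loaded_at"):
    """Keep only the most recent row per business key - the ELT
    'latest wins' pattern applied before a MERGE into the target."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    # Deterministic ordering makes downstream diffs and tests stable
    return sorted(latest.values(), key=lambda r: r[key])

# Hypothetical raw landing-zone records, including a later correction for id 1
raw = [
    {"id": 1, "amount": 10, "loaded_at": "2024-01-01"},
    {"id": 1, "amount": 12, "loaded_at": "2024-01-02"},
    {"id": 2, "amount": 7,  "loaded_at": "2024-01-01"},
]
clean = dedupe_latest(raw)
```

Because the function is idempotent (running it twice yields the same result), a failed pipeline run can simply be retried, which matters for the L2/L3 support duties the role lists.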

Posted 1 week ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Salesforce Technical Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Key Responsibilities:
- Design, develop, and maintain Salesforce applications using Apex, LWC, and Flows.
- Implement and configure Salesforce CRM and Experience Cloud components.
- Collaborate on the technical design and architecture of scalable, secure Salesforce solutions.
- Manage Salesforce security settings, roles, profiles, permission sets, and data sharing models.
- Work with Salesforce Data Cloud basics to support data integration, unification, and analytics.
- Develop and maintain integrations between Salesforce and external systems using REST/SOAP APIs.
- Write efficient, reusable JavaScript code to enhance Lightning Web Components and user experience.
- Build and manage CI/CD pipelines using Copado to automate deployments and maintain code quality.
- Participate in Agile development cycles and collaborate with product owners, admins, QA, and other stakeholders.
- Ensure compliance with Salesforce best practices and maintain technical documentation.

Required Skills & Experience:
- Experience in Salesforce development and administration.
- Strong proficiency in Apex, Lightning Web Components (LWC), and Salesforce Flows.
- Experience with Experience Cloud implementation and customization.
- Working knowledge of Salesforce Data Cloud basics (data unification, Customer 360).
- Solid understanding of Salesforce CRM functionality and customization.
- Expertise in Salesforce security models, data sharing, and access controls.
- Hands-on experience with Salesforce integrations using REST and SOAP APIs.
- Proficient in JavaScript (ES6+).
- Experience with CI/CD tools and processes, particularly Copado.
- Strong analytical, communication, and collaboration skills.

Preferred Qualifications:
- Salesforce certifications (Platform Developer I & II, Experience Cloud Consultant, Admin).
- Familiarity with middleware platforms like MuleSoft or Dell Boomi.
- Experience with Salesforce DX and scratch orgs.
- Knowledge of Agile development methodologies.

Qualification: 15 years full time education
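The REST integrations the posting mentions boil down to authenticated HTTP calls against Salesforce's query endpoint. As a hedged sketch, the snippet below only composes such a request with the Python standard library (it never sends it); the instance URL and token are placeholders, while the `/services/data/vXX.X/query` path, `q` parameter, and Bearer authorization header follow Salesforce's documented REST API conventions.

```python
import urllib.parse
import urllib.request

def build_soql_request(instance_url, token, soql):
    """Compose a SOQL query request for the Salesforce REST API.
    Returns an unsent urllib Request; the caller would pass it to urlopen."""
    qs = urllib.parse.urlencode({"q": soql})  # URL-encode the SOQL text
    return urllib.request.Request(
        f"{instance_url}/services/data/v59.0/query?{qs}",
        headers={"Authorization": f"Bearer {token}"},
    )

# Placeholder org and session token, purely for illustration
req = build_soql_request(
    "https://example.my.salesforce.com",
    "FAKE_TOKEN",
    "SELECT Id, Name FROM Account LIMIT 5",
)
```

Real integrations would obtain the token via OAuth and handle pagination (`nextRecordsUrl`), but separating request construction from transport like this keeps the integration logic unit-testable without touching an org.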

Posted 1 week ago

Apply

1.0 - 5.0 years

1 - 2 Lacs

Thane

Work from Office

About The Role
Job Role: Periodic extraction and publishing of MIS reports, dashboards, infographics, and data dumps. Ensure data is provided to top management on time and verify its accuracy. Maintain timely submission and circulation of data to the relevant stakeholders. Ensure data is provided for compliance, audits, RCSA, and related requirements. Sales Management: provide data and eligible bases to the sales management unit to drive sales across the contact center. Execute ad hoc requests in a timely and accurate manner. Comply with all control and compliance guidelines on data sharing.
Requirements: Graduate with a minimum of 1 year of MIS experience. Good communication skills. Hands-on with infographics, MS Excel, VBA (macros), and MS Access. Proficient in advanced Excel, including HLOOKUP, VLOOKUP, and power pivots. Basic analytics skills. Expertise in PowerPoint, Prezi, and other slide-making platforms. Comfortable with flexible shift and work timings. Same posting description for internal and external candidates.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future! If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match. Is Data Privacy your passion? Do you want to be a part of driving the Volvo Group Trucks Technology Data Privacy journey? Then read on: this is the assignment for you! We believe you have experience in data privacy, data sharing, vehicle data, and IT systems in a complex global product development organization. You will be comfortable working with large degrees of uncertainty, and you will help to identify and visualize pragmatic but robust solutions to problems. You should be service-minded and supportive in taking care of internal and external stakeholders and customers. The Data Governance and Compliance team is looking for an experienced GDPR/privacy assessor and analyst with at least 3 to 5 years of experience in the automotive domain. We are part of the Vehicle Data Management and Analytics team within Vehicle Technology at Volvo Group Trucks Technology. This person will work within the Data Governance and Compliance team, which is based in both Gothenburg (Sweden) and Bangalore (India). This position is in Bangalore. You have a proven record of driving successful change initiatives and are skilled in stakeholder management across organizational levels. Practical experience with Data Privacy tools is considered a merit. Fluency in English is essential. Your role will develop over time; some of your first tasks will be to:
- Work together as part of the GTT Data Privacy team, setting and achieving team goals.
- Drive investigations and initiatives as needed to facilitate, promote, and protect Volvo Group data privacy and data sharing.
- Further strengthen the data privacy framework and capability within GTT by driving activities in the GTT Data Privacy roadmap.
Our focus on Inclusion, Diversity, and Equity allows each of us the opportunity to bring our full authentic self to work and thrive by providing a safe and supportive environment, free of harassment and discrimination. We are committed to removing the barriers to entry, which is why we ask that even if you feel you may not meet every qualification on the job description, please apply and let us decide. Applying to this job offers you the opportunity to join Volvo Group. Every day, across the globe, our trucks, buses, engines, construction equipment, financial services, and solutions make modern life possible. We are almost 100,000 people empowered to shape the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents with sharp minds and passion across the group's leading brands and entities. Group Trucks Technology is seeking talents to help design sustainable transportation solutions for the future. As part of our team, you'll help us by engineering exciting next-gen technologies and contribute to projects that determine new, sustainable solutions. Bring your love of developing systems, working collaboratively, and your advanced skills to a place where you can make an impact. Join our design shift that leaves society in good shape for the next generation.

Posted 1 week ago

Apply

10.0 - 15.0 years

18 - 20 Lacs

Noida, Gurugram

Work from Office

Lead design & implementation of scalable data pipelines using Snowflake & Databricks. Drive data architecture and governance. Build ETL/ELT, optimize models, mentor the team, and ensure security and compliance. Strong Snowflake, Databricks, SQL, and Python skills required.
Required Candidate Profile: Experienced Data Analytics Lead skilled in Snowflake, Databricks, SQL, and Python. Proven leader in designing scalable pipelines, data governance, ETL/ELT, and team mentoring.

Posted 1 week ago

Apply

1.0 - 4.0 years

1 - 2 Lacs

Hyderabad

Work from Office

About The Role
Job Role Description: Fresher with coding skills in Java and SQL. Strong command of Java/JS/Python/front-end skills. Expertise in web services. Understanding of systems architecture and the ability to design scalable, performance-driven solutions. Understanding of key design patterns, large-data-volume limitations, and best practices. Understanding of data sharing and visibility considerations and how these play into platform architecture. Strong understanding of environment management, release management, code versioning best practices, and deployment methodologies.

Posted 1 week ago

Apply

1.0 - 5.0 years

1 - 2 Lacs

Thane

Work from Office

About The Role
Job Role: Periodic extraction and publishing of MIS reports, dashboards, infographics, and data dumps. Ensure data is provided to top management on time and verify its accuracy. Maintain timely submission and circulation of data to the relevant stakeholders. Ensure data is provided for compliance, audits, RCSA, and related requirements. Sales Management: provide data and eligible bases to the sales management unit to drive sales across the contact center. Execute ad hoc requests in a timely and accurate manner. Comply with all control and compliance guidelines on data sharing.
Requirements: Graduate with a minimum of 1 year of MIS experience. Good communication skills. Hands-on with infographics, MS Excel, VBA (macros), and MS Access. Proficient in advanced Excel, including HLOOKUP, VLOOKUP, and power pivots. Basic analytics skills. Expertise in PowerPoint, Prezi, and other slide-making platforms. Comfortable with flexible shift and work timings. Same posting description for internal and external candidates.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Roles and Responsibilities Design, develop, test, deploy, and maintain large-scale data pipelines using Snowflake as the primary database engine. Collaborate with cross-functional teams to gather requirements and design solutions that meet business needs. Develop complex SQL queries to extract insights from large datasets stored in Snowflake tables. Troubleshoot issues related to data quality, performance tuning, and security compliance. Participate in code reviews to ensure adherence to coding standards and best practices. Desired Candidate Profile 3-7 years of experience working with Snowflake as a Data Engineer or similar role. Strong understanding of SQL programming language with ability to write efficient queries for large datasets. Proficiency in Python scripting language with experience working with popular libraries such as Pandas, NumPy, etc.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a DBT professional, you will be responsible for designing, developing, and defining technical architecture for data pipelines and performance scaling in a big data environment. Your expertise in PL/SQL, including queries, procedures, and JOINs, will be crucial for the integration of Talend data and ensuring data quality. You will also be proficient in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts in Unix, Python, etc., to facilitate Extract, Load, and Transform operations. It would be advantageous to have hands-on experience and knowledge of Talend. Candidates with previous experience in PROD support will be given preference. Your role will involve working with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time travel, Optimizer, Metadata Manager, data sharing, and stored procedures. You will be responsible for data analysis, troubleshooting data issues, and providing technical support to end-users. In this position, you will develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Your problem-solving skills will be put to the test, and you will be expected to have a continuous improvement approach. Possessing Talend/Snowflake Certification would be considered desirable. Excellent SQL coding skills, effective communication, and documentation skills are essential. Knowledge of the Agile delivery process is preferred. You must be analytical, creative, and self-motivated to excel in this role. Collaboration within a global team environment is key, necessitating excellent communication skills. Your contribution to Virtusa will be valued, as teamwork, quality of life, and professional development are the core values the company upholds. By joining a global team of 27,000 professionals, you will have access to exciting projects and opportunities to work with cutting-edge technologies throughout your career. 
Virtusa provides an environment that nurtures new ideas, fosters excellence, and encourages personal and professional growth.
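Among the Snowflake utilities the role lists, Streams and Tasks implement change data capture: a stream records row-level changes since its last offset, and a scheduled task consumes them with a MERGE. As a simplified, purely illustrative model (real streams track INSERT/DELETE metadata columns and offsets inside Snowflake), the same consume-and-merge cycle can be sketched with plain Python structures:

```python
def apply_stream(target, stream):
    """Merge captured row changes into `target` (keyed by id),
    mirroring the MERGE a Snowflake Task would run over a Stream."""
    for change in stream:
        if change["action"] == "DELETE":
            target.pop(change["id"], None)
        else:  # INSERT or UPDATE both upsert the latest row image
            target[change["id"]] = change["data"]
    stream.clear()  # consuming the stream advances its offset
    return target

# Hypothetical target table and pending change set
table = {1: {"status": "new"}}
stream = [
    {"action": "INSERT", "id": 2, "data": {"status": "new"}},
    {"action": "UPDATE", "id": 1, "data": {"status": "done"}},
]
apply_stream(table, stream)
```

The key property the model preserves is that consuming the stream is transactional with the merge: once applied, the changes are gone from the stream, so the next task run starts from a clean offset.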

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Bhopal, Madhya Pradesh

On-site

As a Lab Admin, your primary responsibilities will include: - Managing store inventory by controlling stock, entering data, and issuing materials as required. - Overseeing repair and maintenance tasks, as well as ensuring proper housekeeping within the lab premises. - Ensuring compliance with all lab regulations and standards. - Monitoring electricity, water usage, and waste management in the laboratory. - Tracking the amount of Bio Medical Waste (BMW) generated in kilograms. - Handling sales administration tasks related to local compliance. - Safeguarding laboratory keys and controlling access to the facility. - Sharing important data such as lab opening and closing times, first and last accession times, and last sample processing details. - Monitoring local earthing and voltage levels on a daily basis. - Uploading biometric attendance records into the designated machine. - Regularly checking water Total Dissolved Solids (TDS) levels. - Managing and reporting accidents, spills, machine faults, and downtime incidents. - Coordinating logistics for the transportation of samples. This is a full-time, permanent position with benefits including Provident Fund. The work schedule is during day shifts at the designated work location.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion • As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. • You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. • You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. • You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: • Knowledge of design principles and fundamentals of architecture • Understanding of performance engineering • Knowledge of quality processes and estimation techniques • Basic understanding of project domain • Ability to translate functional/nonfunctional requirements to systems requirements • Ability to design and code complex programs • Ability to write test cases and scenarios based on the specifications • Good understanding of SDLC and agile methodologies • Awareness of latest technologies and trends • Logical thinking and problem solving skills along with an ability to collaborate
Technical and Professional Requirements: • Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Snowflake SQL Developer, you will be responsible for writing SQL queries against Snowflake and developing scripts in Unix, Python, and other languages to facilitate the Extract, Load, and Transform (ELT) process for data. Your role will involve hands-on experience with various Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time travel, Optimizer, Metadata Manager, data sharing, and stored procedures. Your primary objective will be to design and implement scalable and performant data pipelines that ingest, process, and transform data from diverse sources into Snowflake. You should have proven experience in configuring and managing Fivetran connectors for data integration, and familiarity with DBT is considered a plus. To excel in this role, you must possess excellent SQL coding skills, along with strong communication and documentation abilities. Your complex problem-solving capabilities, coupled with an ever-improving approach, will be crucial in delivering high-quality solutions. Analytical thinking, creativity, and self-motivation are key attributes that will drive your success in this position. Collaboration is essential in our global team environment, and your ability to work effectively with colleagues worldwide will be valued. While a Snowflake Certification is preferred, familiarity with Agile delivery processes and outstanding communication skills are also highly desirable traits for this role.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

2 - 4 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Title: System Administrator
Work Location: Chennai, TN / Hyderabad, TS / Pune, MH / Bangalore, KA / Mumbai, MH
Skill Required: IT IS_AMS_Mainframe_DB2 Administration
Experience Range: 6-8 Years
Job Description: Maintain documentation of issue causes, actions, and resolutions in the designated problem tracking system. Upgrade DB2 data sharing and non-data sharing subsystem environments. Proactively monitor and maintain production databases.
Essential Skills: Maintain documentation of issue causes, actions, and resolutions in the designated problem tracking system. Assist the assigned project manager in the migration of newly acquired customers such that all technical hurdles are addressed in a timely manner and do not impede the expected progress of the transition nor the targeted migration date. In joint cooperation with the Disaster Recovery Services and other technical teams, successfully execute tasks associated with any given customer's disaster recovery plans. Adhere to a schedule that efficiently manages resources and allows timely implementations. Follow ITSM change control procedures.

Posted 2 weeks ago

Apply

11.0 - 20.0 years

30 - 45 Lacs

Pune, Chennai, Bengaluru

Work from Office

Data Architecture experience in Data Warehouse, Snowflake + DBT; Snowflake advanced certification. Oversees and designs the information architecture for the data warehouse, including all information structures, i.e. staging area, data warehouse, data marts, and operational data stores; oversees standardization of data definitions and development of physical and logical modelling. Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modelling, Star and Snowflake schema design, Reference DW Architectures, ETL architecture, ETL (Extract, Transform, Load), Data Analysis, Data Conversion, Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support. Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical, and dimensional models). Maintain in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms, and ETL tools.
Please share your resume at parul@mounttalent.com

Posted 3 weeks ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Mainframe DB2 System Programmer SMEs at Kyndryl are project-based subject matter experts in all things infrastructure – good at providing analysis, documenting and diagraming work for hand-off, offering timely solutions, and generally “figuring it out.” This is a hands-on role where your feel for the interaction between a system and its environment will be invaluable to every one of your clients. There are two halves to this role: First, contributing to current projects where you analyze problems and tech issues, offer solutions, and test, modify, automate, and integrate systems. And second, long-range strategic planning of IT infrastructure and operational execution. This role isn’t specific to any one platform, so you’ll need a good feel for all of them. And because of this, you’ll experience variety and growth at Kyndryl that you won’t find anywhere else. You’ll be involved early to offer solutions, help decide whether something can be done, and identify the technical and timeline risks up front. This means dealing with both client expectations and internal challenges – in other words, there are plenty of opportunities to make a difference, and a lot of people will witness your contributions. In fact, a frequent sign of success for our Infrastructure Specialists is when clients come back to us and ask for the same person by name. That’s the kind of impact you can have! This is a project-based role where you’ll enjoy deep involvement throughout the lifespan of a project, as well as the chance to work closely with Architects, Technicians, and PMs.
Whatever your current level of tech savvy or where you want your career to lead, you’ll find the right opportunities and a buddy to support your growth. Boredom? Trust us, that won’t be an issue. Your future at Kyndryl There are lots of opportunities to gain certification and qualifications on the job, and you’ll continuously grow as a Cloud Hyperscaler. Many of our Infrastructure Specialists are on a path toward becoming either an Architect or Distinguished Engineer, and there are opportunities at every skill level to grow in either of these directions. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others. Required Technical and Professional Experience 9 to 14 Years of Experience in Mainframe DB2 System Programming. Experience and Expertise in Monoplex, SYSPLEX architecture and data sharing architecture. Experience in DB2 subsystem setup - both Stand-alone and data sharing. Experience in DB2 backup and recovery scenarios - traditional image copy, FLASH, mirroring etc., methodology. Experience in REXX/CLIST and ISPF programming skills. Good knowledge on transactional environment like CICS, IMS architecture and interface with DB2. DB2 Data sharing skills - Installation, customization, DB2 structures, CFRM policy, DB2 CF recovery, Data sharing locking mechanism etc. Experience and good exposure to SMP/E process. Experience and Expertise on software migration and maintenance procedures - Install, backout, re-migration process - DB2 V9, V10, and V11. Good Knowledge on DB2 Health check, DB2 security and Patch management. Good knowledge and experience on DB2 traces, SMF, RMF and WLM. Good Knowledge on z/OS and DB2 system performance tuning methods. 
Preferred Skills and Experience Bachelor’s degree in computer science or a related field. Experience in DB2 Connect and IBM Data Studio. DB2 Cloning process - IBM DB2 Cloning Tool or any customized in-house tools. Good knowledge of SYSPLEX, data sharing, and distributed problem-solving skills. Experience in problem debugging and vendor-specific PMR ticketing procedures. Being You Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way. What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 3 weeks ago

Apply

9.0 - 12.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office

Tech Lead - Azure/Snowflake & AWS Migration

Key Responsibilities:
- Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
- Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
- Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
  o Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
  o Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
  o Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
  o Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
  o Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe.
- Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
- Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing.
- Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
- Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
- Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
- Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
- Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications:
- 9+ years of data engineering experience, with 3+ years on Microsoft Azure stack and hands-on Snowflake expertise.
- Proficiency in:
  o Python for scripting and ETL orchestration
  o SQL for complex data transformation and performance tuning in Snowflake
  o Azure Data Factory and Synapse Analytics (SQL Pools)
- Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
- Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
- Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
- Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
- Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications:
- Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
- Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
- Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
- Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Required Skills: Azure, AWS Redshift, Athena, Azure Data Lake
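The schema-conversion step in a Redshift-to-Snowflake migration like the one described above can be sketched in a few lines. This is a minimal illustration, not an official mapping: the type table below is an incomplete assumption for demonstration, and a real migration would rely on a fuller mapping or a migration tool.

```python
# Hedged sketch: translating Redshift column types to Snowflake
# equivalents during schema conversion. The mapping is illustrative
# and deliberately incomplete.

# Redshift type -> Snowflake type (illustrative subset, an assumption)
TYPE_MAP = {
    "SMALLINT": "NUMBER(5,0)",
    "INTEGER": "NUMBER(10,0)",
    "BIGINT": "NUMBER(19,0)",
    "REAL": "FLOAT",
    "DOUBLE PRECISION": "FLOAT",
    "BOOLEAN": "BOOLEAN",
    "CHAR": "CHAR",
    "VARCHAR": "VARCHAR",
    "DATE": "DATE",
    "TIMESTAMP": "TIMESTAMP_NTZ",
    "TIMESTAMPTZ": "TIMESTAMP_TZ",
}

def translate_column(redshift_type: str) -> str:
    """Translate a Redshift column type, optionally parameterized
    (e.g. VARCHAR(256)), to its Snowflake equivalent; unknown types
    pass through unchanged so they can be reviewed manually."""
    base, _, params = redshift_type.upper().partition("(")
    base = base.strip()
    target = TYPE_MAP.get(base, base)
    if params and base in ("CHAR", "VARCHAR"):
        target = f"{target}({params.rstrip(')')})"
    return target

if __name__ == "__main__":
    for rs in ("VARCHAR(256)", "TIMESTAMPTZ", "DOUBLE PRECISION"):
        print(rs, "->", translate_column(rs))
```

Passing unknown types through unchanged, rather than failing, keeps the conversion reviewable: anything not in the map surfaces as-is in the generated DDL for manual handling.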

Posted 4 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior Architect AI Products supporting Novartis' Commercial function, you will play a crucial role in driving the architectural strategy that facilitates the seamless integration of data and AI products across various key areas such as omnichannel engagement, customer analytics, field operations, and real-world insights. Your responsibilities will involve collaborating with commercial business domains, data platforms, and AI product teams to design scalable, interoperable, and compliant solutions that enhance the impact of data and advanced analytics on healthcare professional and patient engagement. You will be tasked with defining and implementing the reference architecture for commercial data and AI products, ensuring alignment with enterprise standards and business priorities. Additionally, you will architect the integration of data products with AI products and downstream tools, promoting modular, scalable design to encourage reuse and interoperability across different markets and data domains within the commercial landscape. Stakeholder alignment will be a key aspect of your role, as you will partner with various teams to guide solution design, delivery, and lifecycle evolution. Your role will also involve supporting the full lifecycle of data and AI, including ingestion, transformation, model training, inference, and monitoring within secure and compliant environments. It will be essential to ensure that the architecture complies with governance, data privacy, and commercial requirements while constantly seeking opportunities for architectural improvements, modern technologies, and integration patterns to enhance personalization, omnichannel engagement, segmentation, targeting, and performance analytics.
To excel in this position, you are expected to demonstrate proven leadership in cross-functional architecture efforts, possess a good understanding of security, compliance, and privacy regulations in the commercial pharma sector, and have experience with pharmaceutical commercial ecosystems and data. A strong background in data platforms, pipelines, and governance, as well as knowledge of AI/ML architectures supporting commercial use cases, will be advantageous. Additionally, a bachelor's or master's degree in computer science, engineering, data science, or a related field, along with at least 10 years of experience in enterprise or solution architecture, are desirable qualifications for this role. Novartis is committed to diversity, equal opportunity, and inclusion, striving to build diverse teams that represent the patients and communities served. By joining Novartis, you will become part of a community of passionate individuals collaborating to achieve breakthroughs that positively impact patients' lives. If you are ready to contribute to creating a brighter future through innovation and collaboration, we invite you to explore career opportunities at Novartis.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

As a part of this role, you will be responsible for understanding the commercials associated with each partner and the costs related to the end-to-end operations of each partnership. This includes working on reconciliation and accounting tasks and effectively communicating these requirements to the product and technology teams. Your primary focus will be on developing partner-level profit and loss statements with the aim of achieving efficiency. You will also be expected to propose and implement improved processes to enhance operational efficiencies within the partnerships. Collaboration with relevant teams to address any unreconciled transactions is a key aspect of this role. You will need to identify and eliminate manual processes in daily operations by closely collaborating with the Product and Technology teams. Monitoring and tracking various metrics such as projected volumes versus actual volumes, internal rate of return (IRR), number of loans, underlying portfolio class target segment, and consumer profiles will be part of your responsibilities. Additionally, you will be required to write and execute business requirement documents and test cases to support operational needs. Finally, you will play a crucial role in sharing data with the finance team to facilitate invoicing based on different commercial agreements. Your attention to detail and ability to work cross-functionally will be essential in ensuring smooth operations and financial management within the partnerships.
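The reconciliation work described above amounts to matching partner settlement records against an internal ledger and surfacing what does not line up. A minimal sketch follows; the record shape (`txn_id`, `amount`) is an assumption for illustration, not an actual partner file format.

```python
# Hedged sketch: flag unreconciled transactions between a partner's
# settlement file and the internal ledger. Field names are illustrative.

def reconcile(partner_txns, ledger_txns):
    """Return (matched, mismatched, only_in_partner, only_in_ledger).

    Each transaction is a dict with a unique 'txn_id' and an 'amount'.
    A transaction is matched when both sides carry the same id AND the
    same amount; same id with a different amount is a mismatch.
    """
    partner = {t["txn_id"]: t["amount"] for t in partner_txns}
    ledger = {t["txn_id"]: t["amount"] for t in ledger_txns}

    common = partner.keys() & ledger.keys()
    matched = sorted(t for t in common if partner[t] == ledger[t])
    mismatched = sorted(t for t in common if partner[t] != ledger[t])
    only_partner = sorted(partner.keys() - ledger.keys())
    only_ledger = sorted(ledger.keys() - partner.keys())
    return matched, mismatched, only_partner, only_ledger

if __name__ == "__main__":
    p = [{"txn_id": "T1", "amount": 100}, {"txn_id": "T2", "amount": 50}]
    l = [{"txn_id": "T1", "amount": 100}, {"txn_id": "T3", "amount": 75}]
    print(reconcile(p, l))
```

The three "exception" buckets (mismatched, partner-only, ledger-only) are exactly what would be escalated to the relevant teams for resolution.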

Posted 1 month ago

Apply

8.0 - 10.0 years

5 - 9 Lacs

Gurugram

Work from Office

1. Research and Knowledge Generation
- Contribute to the design of the research on how data-sharing models can enable access to finance for smallholders.
- Conduct extensive secondary research on data sharing in agriculture, including analysis of the literature and of existing data-sharing initiatives.
- Conduct and document expert interviews on key building blocks of data-sharing initiatives and the business models of data sharing.
- Analyse findings from research and pilot projects (mainly interview reports) that can be used to develop knowledge products (e.g. thought pieces, case studies, webinars, reports), which this person will co-create.

2. Stakeholder Engagement and Convening
- Engage with financial institutions, agribusinesses, and other stakeholders to promote collaboration and arrive at data-sharing initiatives.
- Organize and facilitate stakeholder workshops, convenings, and webinars to validate findings and share insights.
- Support selected data-sharing pilots, generating learnings.
- Provide capacity-building support, convening activities, and advisory services to stakeholders involved in the pilots.

3. Knowledge Product Development and Dissemination
- Develop knowledge products (e.g. case studies, webinars, a how-to data-sharing toolkit) to support stakeholders in sector-wide learning and adoption of data-sharing models.
- Contribute to a final Program Report summarizing key insights, best practices, and recommendations regarding data sharing.

Key Challenges:
- Connect with various stakeholders involved in data-sharing initiatives, including financial institutions and agribusinesses
- Translate research findings into actionable insights for diverse stakeholders
- Manage multiple research and stakeholder engagement activities simultaneously
- Produce high-quality, impactful knowledge products that drive sector-wide learning while being practical enough to result in adoption
- Navigate complex issues around business models, data governance, ownership, and security in agricultural data-sharing initiatives
- Work in a global, remote team with cross-cultural collaboration

Job Requirements:
- Working and thinking at master's degree level
- 8-10 years of relevant working experience in international development, agri-trade, agricultural finance, or similar, preferably with content expertise in smallholder value chains
- Familiarity with financial inclusion and data-sharing models is a plus
- Experience conducting research, stakeholder mapping, and producing high-quality knowledge products
- Excellent stakeholder engagement skills, with the ability to facilitate collaboration between public and private sector actors
- Strong project management skills, including the ability to manage project and research activities simultaneously
- Excellent communication and writing skills in English
- Willingness to travel

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

About DRF: Dr. Reddy's Foundation (DRF) is a not-for-profit organization dedicated to enhancing the dignity and well-being of socially and economically vulnerable individuals. The organization focuses on empowering communities through improved education, health, livelihood, and climate action outcomes. As a member of our team, your responsibilities will include coordinating with program centers nationwide and preparing a validated and consolidated attendance sheet by the 24th of every month. Additionally, you will be tasked with consolidating and sharing data regarding monthly resignations and other necessary information as per specific requirements.
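The monthly consolidation task above (merging per-center sheets into one validated report) can be sketched as follows. The row fields are illustrative assumptions, not DRF's actual format.

```python
# Hedged sketch: merge attendance rows from many program centers into
# one sheet, separating out rows that fail validation. Field names
# ('employee_id', 'days_present') are assumptions for illustration.

def consolidate(center_sheets):
    """Return (consolidated_rows, error_rows).

    center_sheets maps a center name to a list of row dicts. Rows
    missing a required field are collected as errors instead of being
    silently dropped, so they can be sent back to the center to fix.
    """
    required = {"employee_id", "days_present"}
    consolidated, errors = [], []
    for center, rows in sorted(center_sheets.items()):
        for row in rows:
            if not required <= row.keys():
                errors.append((center, row))
                continue
            consolidated.append({"center": center, **row})
    return consolidated, errors

if __name__ == "__main__":
    sheets = {
        "Hyderabad": [{"employee_id": "E1", "days_present": 22}],
        "Pune": [{"employee_id": "E2"}],  # missing days_present
    }
    ok, bad = consolidate(sheets)
    print(len(ok), "valid rows,", len(bad), "errors")
```

Keeping invalid rows in a separate error list, rather than discarding them, is what makes the consolidated sheet "validated": every input row is accounted for.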

Posted 1 month ago

Apply

6.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As a Salesforce Technical Architect, you will collaborate with client stakeholders to define requirements, deliverables, and manage expectations effectively. Your responsibilities include translating business requirements into well-architected solutions that maximize the Salesforce platform's potential. Leading technical design sessions, you will architect technical solutions aligned with client objectives, identifying gaps between their current and desired states. You will provide oversight and governance for Salesforce projects, ensuring adherence to coding standards and conducting code reviews to maintain quality and design integrity. Managing the technical delivery of custom development, integrations, and data migration elements in Salesforce implementations is a key aspect of your role. Maintaining a target billable utilization aligned to your responsibilities, you will demonstrate the ability to understand projects and troubleshoot issues effectively. In addition, you may be involved in pre-sales activities such as discovery sessions and Proof-Of-Concept (POC) development with prospects. Collaborating with Salesforce product teams to support client implementations and traveling to client sites for projects (approximately 50-75% of the time) are also part of your responsibilities. To qualify for this role, you must hold a degree or equivalent proven experience, with a strong background in CRM, particularly with a minimum of 6 years on the Salesforce platform. Your expertise should include a deep understanding of the Salesforce product suite, covering B2B commerce, Sales, Service, Community, Marketing, and Community Clouds. Moreover, you should possess knowledge of B2B sales processes, customer journeys, and e-commerce best practices. 
Your proficiency in data integration tools and experience in integrating Salesforce with various business systems, along with an understanding of B2B Commerce Einstein Features & Capabilities, B2B Commerce Extensions framework, and B2B Commerce APIs, are essential. Additionally, you should comprehend cross-cloud use cases, Salesforce OMS, system architecture, and scalable performance-driven solutions. Demonstrating familiarity with key design patterns, data sharing considerations, platform authentication patterns, platform security capabilities, environment management, and release management best practices is crucial. Your experience in defining system architecture landscapes, identifying gaps, and delivering comprehensive solutions to achieve desired business outcomes will be highly valued. Having active Salesforce certifications or the ability to obtain relevant certifications upon hire is advantageous for this position. Your proven track record in designing and developing large web-based systems or complete software product lifecycle exposure will be beneficial in fulfilling your role as a Salesforce Technical Architect.

Posted 1 month ago

Apply

1.0 - 7.0 years

0 - 0 Lacs

haryana

On-site

You will be responsible for managing and monitoring all installed systems and infrastructure to ensure the highest level of availability. Providing L1-level support over the LAN and remotely will be a key aspect of your role. You will also be required to offer software, hardware, network, and operating system support. Configuring Microsoft Outlook for users and troubleshooting mail problems, along with tasks such as basic networking, peer-to-peer connections, mapping drives, and data sharing, will be part of your daily responsibilities. It is essential to have proven work experience as a Desktop Support Engineer, Technical Support Engineer, or in a similar role. You will be addressing user tickets regarding hardware, software, and networking, as well as installing applications and computer peripherals. Hands-on experience with Windows environments is necessary, along with knowledge of network security practices and anti-virus programs. Excellent problem-solving and multitasking skills, a customer-oriented attitude, and the ability to handle printers and Wi-Fi devices at customer locations are also required.

Qualifications:
- Experience: 1 - 7 Years
- Salary: 2 Lac to 2 Lac 50 Thousand P.A.
- Industry: IT Hardware / Technical Support / Telecom Engineering
- Qualification: Other Bachelor Degree

Key Skills:
- Hardware Engineer
- Network Engineer
- Hardware Faculty
- LAN Maintenance
- Walk in
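A routine first step in the L1 network troubleshooting described above, for example when a mapped drive or shared folder stops working, is checking whether the file server is reachable at all. A minimal sketch (the host and the SMB port 445 in the example are illustrative):

```python
# Hedged sketch: basic TCP reachability check for L1 troubleshooting,
# e.g. testing a file server's SMB port before digging deeper into a
# failed mapped drive. Host/port values below are illustrative.
import socket

def check_port(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within
    the timeout; False on refusal, timeout, or DNS failure."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Example: is this machine answering on the Windows file-sharing port?
    print("SMB reachable:", check_port("127.0.0.1", 445))
```

If the port is reachable but the share still fails, the fault is usually higher up (credentials, share permissions); if it is unreachable, the issue is network or firewall level.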

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

kolkata, west bengal

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees across 30+ countries, our team is driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, underpins our work as we serve and transform leading enterprises worldwide, including Fortune Global 500 companies. We leverage deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI to drive innovation and success.

We are currently seeking applications for the role of Vice President, Enterprise Architecture Consulting - GCP Delivery Lead at Genpact. In this critical leadership position, you will be responsible for managing the delivery of complex Google Cloud Platform (GCP) projects, ensuring client satisfaction, team efficiency, and innovation. The ideal candidate will bring deep industry expertise, technical excellence, and strong business acumen to shape our organization's data and cloud transformation roadmap. As the Delivery Lead, your key responsibilities include overseeing the successful delivery of multimillion-dollar engagements involving GCP, managing client relationships, leading global project teams, ensuring adherence to delivery governance standards, and driving innovation within the scope of GCP initiatives. You will play a vital role in shaping the data and cloud transformation journey of our organization.

Key Responsibilities:
- Own and drive end-to-end delivery of GCP & Data Engineering programs across multiple geographies and industry verticals.
- Establish a best-in-class data delivery framework ensuring scalability, security, and efficiency in GCP-based transformations.
- Act as a trusted advisor to C-level executives, driving customer success, innovation, and business value.
- Lead executive-level stakeholder engagement, aligning with business strategy and IT transformation roadmaps.
- Drive account growth, supporting pre-sales, solutioning, and go-to-market (GTM) strategies for GCP and data-driven initiatives.
- Ensure customer satisfaction and build long-term strategic partnerships with enterprise clients.
- Shape the organization's Data & AI strategy, promoting the adoption of GCP, AI/ML, real-time analytics, and automation in enterprise data solutions.
- Establish data accelerators, reusable frameworks, and cost optimization strategies to enhance efficiency and profitability.
- Build and mentor a high-performing global team of cloud data professionals, including data engineers, architects, and analytics experts.
- Foster a culture of continuous learning and innovation, driving upskilling and certifications in GCP, AI/ML, and Cloud Data technologies.
- Stay informed about emerging trends in GCP, cloud data engineering, and analytics to drive innovation.

Minimum Qualifications:
- Experience in IT Services, with a focus on data engineering, GCP, and cloud transformation leadership.
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).

Preferred Qualifications/Skills:
- Proven track record in delivering large-scale, multimillion-dollar GCP & data engineering programs.
- Deep understanding of the GCP ecosystem, including Data Sharing, Streams, Tasks, Performance Tuning, and Cost Optimization.
- Strong expertise in cloud platforms (Azure, AWS) and data engineering pipelines.
- Proficiency in modern data architectures, AI/ML, IoT, and edge analytics.
- Experience in managing global, multi-disciplinary teams across multiple geographies.
- Exceptional leadership and executive presence, with the ability to influence C-suite executives and key decision-makers.

Preferred Certifications:
- Certified Google Professional Cloud Architect or equivalent.
- Cloud Certifications (Azure Data Engineer, AWS Solutions Architect, or equivalent).
- PMP, ITIL, or SAFe Agile certifications for delivery governance.

If you are a dynamic leader with a passion for driving innovation and transformation in the cloud and data space, we encourage you to apply for the Vice President, Enterprise Architecture Consulting - GCP Delivery Lead role at Genpact. Join us in shaping the future and delivering value to clients worldwide.

Posted 1 month ago

Apply