
13 Data Virtualization Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that clients receive high levels of service in the technology domain. You will gather requirements and specifications to understand client needs in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements, providing accurate project estimates to Technology Leads and Project Managers. You will be a key contributor to building efficient programs and systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Qlik Attunity, QlikView, Qlik Sense, Qlik Replicate
Preferred Skills: Technology > Business Intelligence - Visualization > QlikView; Technology > Business Intelligence - Data Virtualization > Qlik Sense
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements

Posted 1 week ago


6.0 - 8.0 years

5 - 7 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

Skills, Knowledge & Experience:
- Strong working knowledge of IBM Db2 LUW replication technologies (Db2 SQL Replication and Q Replication, a queue-based replication), as well as third-party replication tools, is mandatory.
- Experience installing and configuring large IBM Db2 LUW systems on production and non-production UNIX (Linux/AIX) environments.
- Experience with performance tuning and optimization (PTO) using native monitoring and troubleshooting tools.
- Experience with backups, restores, recovery models, and implementation of backup strategies, including RTO and RPO, is mandatory.
- Strong knowledge of clustering, high availability (HA/HADR), and disaster recovery (DR) options for Db2 LUW.
- Strong knowledge of data encryption (at rest and in transit) for Db2 LUW.
- Strong, proven working knowledge of Db2 tools: explain plan, Db2 REORG, Db2 RUNSTATS.
- Strong knowledge of Db2 SQL and sourced stored procedures.
- Knowledge of Toad for DB2 and IBM client tools.
- Strong knowledge of Linux and Db2 user access security, groups, and roles.
- Experience with data virtualization.
- Experience with database design.
- Experience with AutoSys workload automation.
- Experience with MS PowerShell, Bash, VBScript, etc.
- Working experience in cloud environments, especially GCP, IBM Cloud, or Azure, is a big plus.
- Knowledge of the end-to-end IBM Maximo Application Suite installation process is a big plus.
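The Db2 housekeeping tasks named above (RUNSTATS, REORG, explain plans) are routinely scripted by DBAs. As a sketch only, here is a hypothetical Python helper that generates the maintenance statements for a list of tables; the function and table names are invented for illustration, and the db2 CLI itself is not invoked:

```python
# Sketch: build Db2 LUW maintenance commands (RUNSTATS/REORG) for a set
# of tables. Only the command strings are produced; a DBA would run or
# schedule them via the db2 CLI or a workload automation tool.

def maintenance_commands(tables):
    """Return RUNSTATS and REORG statements for each SCHEMA.TABLE name."""
    cmds = []
    for t in tables:
        # Collect distribution statistics plus detailed index statistics.
        cmds.append(f"RUNSTATS ON TABLE {t} WITH DISTRIBUTION AND DETAILED INDEXES ALL")
        # Reclaim space and re-cluster the table after heavy DML.
        cmds.append(f"REORG TABLE {t}")
    return cmds

if __name__ == "__main__":
    for cmd in maintenance_commands(["SALES.ORDERS", "SALES.CUSTOMERS"]):
        print(cmd)
```

In practice such a generator would read its table list from the catalog views rather than a hard-coded list.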

Posted 1 week ago


3 - 8 years

5 - 10 Lacs

Mumbai

Work from Office

Source: Naukri

We are looking for a VMware Certified Instructor (VCI) to join our training delivery team. The ideal candidate should be passionate about teaching, technically sound in VMware technologies, and possess strong communication skills to deliver instructor-led training to our enterprise and individual clients.
- Must be a VMware Certified Instructor (VCI) with valid credentials.
- Experience delivering VMware training in both physical and virtual environments.
- Strong communication, presentation, and mentoring skills.
- Ability to manage classroom dynamics and ensure effective knowledge transfer.
- Prior corporate or academic training experience is a plus.
Keywords: VMware, vSphere, NSX, vSAN, Certified Instructor, Technical Training, Virtualization, Online & Classroom Delivery

Posted 1 month ago


12 - 22 years

35 - 65 Lacs

Chennai

Hybrid

Source: Naukri

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 years. Location: Pan India. Job Description: Candidates should have a minimum of 2 years of hands-on experience as an Azure Databricks Architect. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With regards, Sankar G, Sr. Executive - IT Recruitment.

Posted 1 month ago


12 - 22 years

35 - 60 Lacs

Chennai

Hybrid

Source: Naukri

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 years. Location: Pan India.
Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
- Build dimensional data models applying best practices and providing business insights.
- Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Identify business needs and translate business requirements into conceptual, logical, physical, and semantic multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
- Create and maintain the source-to-target data mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
- Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
- Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
- Create logical and physical data models using best practices to ensure high data quality and reduced redundancy; optimize and update them to support new and existing projects.
- Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
- Data design and performance optimization for large data warehouse solutions.
- Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
- Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
- Good verbal and written communication skills.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With regards, Sankar G, Sr. Executive - IT Recruitment.
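The dimensional-modeling duties described above (star schemas, fact/dimension separation, surrogate keys, BI rollups) can be illustrated with a toy example. This is a sketch only: the table and column names are invented, and SQLite stands in for the actual warehouse platform:

```python
import sqlite3

# Toy star schema: one fact table joined to two dimension tables via
# surrogate keys, then a typical BI rollup over the dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER, amount REAL
);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 29.97);
""")
# Revenue by category and year -- the grain the dimensions make cheap to query.
row = cur.execute("""
    SELECT p.category, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.category, d.year
""").fetchone()
print(row)  # ('Hardware', 2024, 29.97)
```

A snowflake variant would further normalize `dim_product` (e.g. a separate category table); the trade-off is fewer redundant values against extra joins.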

Posted 1 month ago


12 - 22 years

35 - 60 Lacs

Kolkata

Hybrid

Source: Naukri

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 years. Location: Pan India.
Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
- Build dimensional data models applying best practices and providing business insights.
- Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Identify business needs and translate business requirements into conceptual, logical, physical, and semantic multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
- Create and maintain the source-to-target data mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
- Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
- Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
- Create logical and physical data models using best practices to ensure high data quality and reduced redundancy; optimize and update them to support new and existing projects.
- Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
- Data design and performance optimization for large data warehouse solutions.
- Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
- Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
- Good verbal and written communication skills.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With regards, Sankar G, Sr. Executive - IT Recruitment.

Posted 1 month ago


12 - 22 years

35 - 60 Lacs

Noida

Hybrid

Source: Naukri

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 years. Location: Pan India.
Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
- Build dimensional data models applying best practices and providing business insights.
- Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Identify business needs and translate business requirements into conceptual, logical, physical, and semantic multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
- Create and maintain the source-to-target data mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
- Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
- Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
- Create logical and physical data models using best practices to ensure high data quality and reduced redundancy; optimize and update them to support new and existing projects.
- Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
- Data design and performance optimization for large data warehouse solutions.
- Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
- Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
- Good verbal and written communication skills.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With regards, Sankar G, Sr. Executive - IT Recruitment.

Posted 1 month ago


8 - 10 years

14 - 19 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Summary: As a member of Solutions Integration Engineering, you work cross-functionally to define and create engineered solutions and products that accelerate field adoption, working closely with ISVs and the startup ecosystem in the virtualization, cloud, AI/ML, and Gen AI domains to build solutions that matter for customers. You will work closely with the product owner and product lead on the company's current and future strategies in these domains.
Job Requirements:
- Lead delivery of features, participating in the full software development lifecycle.
- Deliver reliable, innovative solutions and products.
- Participate in product design, development, verification, troubleshooting, and delivery of a system or major subsystems, including authoring project specifications.
- Write unit and automated integration tests and project documentation.
Technical Skills:
- Understanding of the software development lifecycle.
- Strong proficiency in full stack development: MERN stack, Python, the container ecosystem, cloud, and modern ML frameworks.
- Knowledge of data storage and virtualization, including hypervisors such as VMware ESX and Linux KVM, and of artificial intelligence concepts including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases, data pipelining tools, model training, inferencing, and RAG workflows.
- Knowledge of Unix-based operating system kernels and development environments, e.g. Linux or FreeBSD.
- A strong understanding of basic to complex concepts related to computer architecture, data structures, and new programming paradigms.
Education: A minimum of 8+ years of software development experience. A Bachelor of Science degree in Electrical Engineering or Computer Science, a Master's degree, or a PhD, or equivalent experience, is required.

Posted 2 months ago


8 - 12 years

27 - 32 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Summary: As a member of Solutions Integration Engineering, you work cross-functionally to define and create engineered solutions and products that accelerate field adoption. We work closely with ISVs and the startup ecosystem in the AI/ML, Gen AI, cloud, and virtualization domains to build solutions that matter for customers. As a Technical Marketing Engineer, you work cross-functionally to evangelize the solution, including creating technical content for NetApp products and solutions targeted at customers, field tech teams, and partners. You will work closely with Product Management as well as Product and Solutions Engineering on the company's current and future strategies related to your area of technical expertise. You may also work closely with third-party vendor organizations as required to achieve corporate goals around product integration and best practices.
Job Requirements:
- SME in the AI/ML and virtualization domains, able to drive solution definition and ISV integration and qualification projects.
- Knowledge of artificial intelligence concepts including server/storage architecture, batch/stream processing, data warehousing, data lakes, distributed filesystems, OLTP/OLAP databases, data pipelining tools, model training, inferencing, and RAG workflows, plus data storage and virtualization, including hypervisors such as VMware ESX and Linux KVM.
- Knowledge of Unix-based operating system kernels and development environments, e.g. Linux or FreeBSD.
- Experience with the cloud hyperscalers and their services (Amazon Web Services, Microsoft Azure, Google Cloud Platform).
- Prior experience with NetApp ONTAP is most preferred.
- Ability to lead and work collaboratively within a business unit team, with strong influencing skills.
- Ability to work on complex issues where analysis of situations or data requires in-depth evaluation and may require collaboration across multiple technical teams.
- Ability to communicate in a clear, concise, professional manner, tailored to the appropriate audience, both verbally and in writing.
- Authoring technical white papers, technical reports, and other marketing collateral, and representing NetApp in public forums.
Education: B.E/B.Tech or M.S in Computer Science or a related technical field. 8+ years of experience in Technical Marketing or a related field is required.

Posted 2 months ago


5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Your Job: The Data Engineer will be part of a global team that designs, develops, and delivers BI and analytics solutions leveraging the latest BI and analytics technologies for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. The Koch Technology Center (KTC) is being developed in India to extend its IT operations and act as a hub for innovation in the IT function. As KTC rapidly scales up its operations in India, its employees will get opportunities to carve out a career path within the organization. This role joins on the ground floor and will play a critical part in building out the KTC over the next several years. Working closely with global colleagues provides significant global exposure. This role is part of the Georgia-Pacific team within the KTC.
Our Team: The Data Engineer will report to the Data Engineering & BI Lead of the KGS and will be responsible for developing and implementing a future-state data analytics platform, covering both back-end data processing and the front-end data visualization component, for the Finance Data Delivery teams. This is a hands-on role building ingestion pipelines and the data warehouse.
What You Will Do:
- Be part of the data team that designs and builds a BI and analytics solution.
- Implement batch and near-real-time data movement design patterns and define best practices in data engineering.
- Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams including business analysts, project managers, architects, and developers.
- Work closely with a team of data engineers and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics while operating in a fluid, rapidly changing data environment.
- Build data pipelines from a wide variety of sources.
- Demonstrate strong conceptual, analytical, and problem-solving skills and the ability to articulate ideas and technical solutions effectively to external IT partners as well as internal data team members.
- Work with cross-functional, on-shore/off-shore development, QA, and vendor teams in a matrixed environment for data delivery.
- Backtrack and troubleshoot failures, providing fixes as needed.
- Update and maintain key data cloud solution deliverables and diagrams.
- Ensure conformance and compliance with Georgia-Pacific data architecture guidelines and the enterprise data strategic vision.
Who You Are (Basic Qualifications):
- Bachelor's degree in computer science, engineering, or a related IT area, with at least 5-8 years of experience in software development.
- Primary skill set: SQL, Python, columnar databases (Snowflake). Secondary skill set: Docker, Kubernetes, CI/CD.
- At least 5 years of hands-on experience designing, implementing, and managing large-scale ETL solutions.
- At least 3 years of hands-on experience in business intelligence, data modelling, data engineering, ETL, multi-dimensional data warehouses, and cubes, with expertise in relevant languages and frameworks like SQL and Python.
- Hands-on experience designing and fine-tuning queries in a columnar database (Redshift or Snowflake).
- Strong knowledge of data engineering, data warehousing, OLAP, and database concepts.
- Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies like GitLab and Terraform.
- Ability to analyze large, complex data sets to resolve data quality issues.
What Puts You Ahead:
- AWS certifications like Solutions Architect (SAA/SAP) or Data Engineer Associate (DEA).
- Hands-on experience with AWS data technologies and at least one full life cycle project building a data solution in AWS.
- Exposure to visualization tools such as Tableau or Power BI.
- Experience with OLAP technologies and data virtualization (using Denodo).
- Knowledge of the manufacturing and finance domains.
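The batch ingestion work this role describes can be sketched as a minimal extract-transform-load pipeline. Everything here is invented for illustration (the CSV contents, column names, and quality rule), and SQLite stands in for the columnar warehouse target:

```python
import csv
import io
import sqlite3

# Minimal batch ETL sketch: extract rows from a CSV source, apply a
# data-quality transform, and load into a warehouse table.
RAW = "order_id,amount\n1,10.5\n2,oops\n3,4.0\n"

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Keep only rows that pass a basic type/quality check."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["order_id"]), float(r["amount"])))
        except ValueError:
            continue  # a real pipeline would quarantine and log bad rows
    return clean

def load(rows):
    """Load clean rows into the target table; SQLite is a stand-in."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 14.5)
```

In a production setting the same shape recurs with an orchestrator scheduling the steps and the load targeting Snowflake or Redshift instead of SQLite.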

Posted 2 months ago


5 - 10 years

8 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Role & Responsibilities:
- Become a domain expert in the SMB space and conduct rigorous data analysis to drive meaningful financial benefit for PayPal without jeopardizing customer experience.
- Challenge the status quo and drive data-backed decision making.
- Partner closely with product leaders to understand new product offerings being built and recommend the right metrics to measure the performance of those features.
- Identify key metrics and conduct rigorous explorative data analysis.
- Create executive-friendly info-insight packets and build business cases that drive decision making and prioritization.
- Analyze business performance and health, triage issues, and provide recommendations on the best solution and optimization.
- Synthesize large volumes of data with attention to granular detail and present findings and recommendations to senior-level stakeholders.
- Collaborate with engineering and data engineering to enable feature tracking, resolve complex data and tracking issues, and build necessary data pipelines.
- Define and cultivate best practices in analytics instrumentation and experimentation.
- Support multiple projects at the same time in a fast-paced, results-oriented environment.

Posted 2 months ago


5 - 8 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Key Responsibilities:
- Develop and implement custom Power BI reports and dashboards.
- Provide complex query and reporting support; write complex SQL queries to extract and analyze data.
- Working knowledge of data virtualization tools (Denodo).
- Good knowledge of and working experience in Power BI administration and scheduling of Power BI reports.
- Work with stakeholders to gather requirements and translate business needs into technical specifications; write business requirements documents based on stakeholder input.
- Lead the development and maintenance of advanced custom Power BI reports.
- Support the configuration and maintenance of the reporting environment.
- Transform and model raw data for analysis.
- Plan and execute user acceptance testing; lead system testing to ensure functionality and compliance.
- Build and deploy Power BI reports and dashboards.
- Handle ServiceNow requests and incidents related to Power BI reports and dashboards.
- Ensure systems validation and produce evidence for audits; respond to audit and inspection requests and implement corrective actions.
- Diagnose and resolve user requests and issues.
- Identify and resolve Power BI performance issues and provide architectural recommendations.
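The "complex SQL queries" this role calls for often mean analytic window functions feeding a report dataset. A sketch with invented data, using SQLite in place of the actual source systems:

```python
import sqlite3

# Sketch: rank products by revenue within each region using a window
# function -- the kind of query that backs a Power BI report dataset.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, product TEXT, revenue REAL);
INSERT INTO sales VALUES
  ('South', 'A', 100), ('South', 'B', 250),
  ('North', 'A', 300), ('North', 'B', 50);
""")
rows = conn.execute("""
    SELECT region, product,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(rows)  # [('North', 'A', 1), ('North', 'B', 2), ('South', 'B', 1), ('South', 'A', 2)]
```

When Denodo sits between the report and the sources, the same query would typically be issued against a Denodo view rather than a physical table, leaving federation to the virtualization layer.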

Posted 3 months ago


10 - 20 years

20 - 35 Lacs

Bengaluru

Remote

Source: Naukri

Denodo SME. Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai.
- 10+ years of experience in data virtualization administration and development.
- Administration experience including installation, configuration, security, and cache management in a highly available enterprise environment.
- 5+ years of experience in requirement analysis, design, and data service layer creation in Denodo.
- Good analytical and problem-solving skills for the design, creation, and testing of programs.
- Good communication skills to interact with clients, team members, and support personnel, and to provide technical guidance and expertise to customers and management.
- Ability to work in a self-directed work environment.
Responsibilities:
- Provide efficient service management by resolving incidents and fulfilling service requests and change requests within SLA.
- Identify and work on kaizens (continuous optimization) to improve the platform through automation and silent operations.
- Manage the complete lifecycle of platform resources via Terraform-based infrastructure as code.
- Perform all configuration in accordance with the defined standards, practices (e.g. CI/CD, approved patterns), and security requirements.
- Collaborate within and outside the team as necessary, plan leave per the leave policy, and attend required meetings.
- Provide coverage across shifts or on-call; be flexible to work across shifts.
- Follow agile practices; deliver all committed work within the sprint.
- Learn new technologies and tools added to the platform and support their integration and maintenance.

Posted 3 months ago
