3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a dynamic global technology company, Schaeffler's success stems from its entrepreneurial spirit and long history of private ownership. Partnering with major automobile manufacturers, as well as key players in the aerospace and industrial sectors, we offer numerous development opportunities globally.

Your key responsibilities include developing data pipelines and utilizing methods and tools to collect, store, process, and analyze complex data sets for assigned operations or functions. You will design, govern, build, and operate solutions for large-scale data architectures and applications across businesses and functions. Additionally, you will manage and work hands-on with big data tools and frameworks, and implement ETL tools and processes, data virtualization, and federation services. Engineering data integration pipelines and reusable data services using cross-functional data models, semantic technologies, and data integration solutions is also part of your role. You will define, implement, and apply data governance policies for all data flows of data architectures, focusing on the digital platform and data lake. Furthermore, you will define and implement policies for data ingestion, retention, lineage, access, data service API management, and usage in collaboration with data management and IT functions.

To qualify for this position, you should hold a graduate degree in Computer Science, Applied Computer Science, or Software Engineering with 3 to 5 years of relevant experience.

Emphasizing respect and valuing diverse ideas and perspectives among our global workforce is essential to us. By fostering creativity through appreciating differences, we drive innovation and contribute to sustainable value creation for our stakeholders and society as a whole. Together, we are shaping the future with innovation, offering exciting assignments and outstanding development opportunities. We eagerly anticipate your application.
For technical inquiries, please contact the following email address: technical-recruiting-support-AP@schaeffler.com. For more information and to apply, visit www.schaeffler.com/careers.
Posted 2 days ago
5.0 - 10.0 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
We are hiring for various skills for our client Ascendion, Bangalore. Shortlisted candidates will receive interview invites from the HR team. (Note: if you have already received a call from another vendor or from Ascendion, kindly do not apply.) If you're ready to take the next step in your career, share your profile with us at hiring@radonglobaltech.com

Job Title: Test Data Management Engineer (Delphix Specialist)
Company: Ascendion
Location: Hyderabad / Bangalore
Job Type: Full-time / Contract
Experience: 5-10 Years
Availability: Immediate to Quick Joiners Preferred

Job Summary: Ascendion is seeking a skilled Test Data Management Engineer with strong hands-on experience in Delphix and test data provisioning for healthcare systems. The ideal candidate will have expertise in data masking, synthetic data generation, and a deep understanding of TDM tools.

Key Responsibilities: Work extensively with Delphix TDM tools for data masking and de-identification. Handle test data provisioning activities, especially in healthcare environments. Implement synthetic data generation strategies. Align test data provisioning with stakeholder requirements and delivery roadmaps. Collaborate across teams for fast and efficient data delivery.

Required Skills: 5+ years of experience with Test Data Management (TDM) tools, specifically Delphix. Proven experience in data masking, de-identification, and test data provisioning. At least 2 years of experience with synthetic data generation.

Nice to Have: Working knowledge of Python and .NET. Exposure to cloud platforms, CI/CD pipelines, and data integration workflows is an added advantage.
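Delphix itself is a commercial product, but the core TDM requirement the role describes, masking that preserves referential integrity across tables, can be sketched in plain Python. This is a generic illustration, not Delphix's API; the salt, token format, and record fields are invented for the example:

```python
import hashlib

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically mask a sensitive value: the same input always
    yields the same token, so joins across masked tables still work,
    which is the key property TDM masking must preserve."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASK_" + digest[:10]

# Two rows in different tables referencing the same patient ID
# mask to the same token, so referential integrity survives.
claims_row = {"patient_id": "P-1001", "amount": 250}
visits_row = {"patient_id": "P-1001", "clinic": "North"}

masked_claims = {**claims_row, "patient_id": mask_value(claims_row["patient_id"])}
masked_visits = {**visits_row, "patient_id": mask_value(visits_row["patient_id"])}
```

Real masking engines add format-preserving transforms and per-environment key management on top of this idea.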
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have hands-on knowledge of Guidewire ClaimCenter. You will play a crucial role in driving quality to help build and ship better products. As an experienced individual contributor with specialized knowledge within the assigned discipline, you may manage smaller projects and initiatives.

Your responsibilities include developing and executing test plans for a single application and independently generating test data. You will participate in project-level reviews, walkthroughs, and inspections, and conduct test reviews covering test plans, requirements, cases, and automation. You will raise automation opportunities to more senior employees, implement simple automation efforts, and be responsible for automation and Continuous Integration (CI). Working closely with the development team, you will identify defects early in the cycle through requirements analysis and code reviews.

Required experience: multiple types of coding and software development, including the use of automation frameworks; advanced code development, code quality review, and automation frameworks; developing high-quality test strategies and test execution, including recognizing test-environment preparation needs; agile and waterfall testing methodologies and tools, unit and integration testing, and data virtualization tools; overseeing the coding, testing, and review process for unit and integration testing; ensuring the quality of one or more application codebases and their alignment with development standards; building automation frameworks, acceptance and integration test automation scripts, and integrating with other tools; and testing across a variety of platforms, such as SOAP APIs, with tools like Selenium WebDriver and SoapUI.
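One of the responsibilities above, independently generating test data, is often done with a seeded random generator so test runs are reproducible. A minimal sketch, with illustrative field names rather than Guidewire's actual claim schema:

```python
import random

def make_claim(claim_id: int, rng: random.Random) -> dict:
    """Generate one synthetic claim record for testing.
    Field names and value ranges are illustrative only."""
    return {
        "claim_id": f"CLM-{claim_id:05d}",
        "loss_type": rng.choice(["auto", "property", "injury"]),
        "reserve": round(rng.uniform(500, 50_000), 2),
        "status": rng.choice(["open", "closed"]),
    }

# A fixed seed makes every test run produce the same data set,
# so failures are reproducible rather than flaky.
rng = random.Random(42)
claims = [make_claim(i, rng) for i in range(1, 101)]
```

The same seeded-generator pattern scales to whatever entities a test plan needs.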
Posted 2 weeks ago
0.0 - 1.0 years
3 - 4 Lacs
Pune
Work from Office
As a data analyst, you will clean, process, and analyze data to identify patterns, trends, and insights. Responsibilities include data collection, data quality assessment, data modeling, developing reports and dashboards, and communicating findings to stakeholders.
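The clean-then-analyze loop described above can be sketched with the standard library alone (the sales figures here are invented for illustration):

```python
from statistics import mean

# Raw monthly sales with typical quality issues: missing and junk values.
raw = [("Jan", "120"), ("Feb", None), ("Mar", "135"), ("Apr", "n/a"), ("May", "150")]

# Cleaning step: keep only rows that parse as numbers.
clean = []
for month, value in raw:
    try:
        clean.append((month, float(value)))
    except (TypeError, ValueError):
        continue

values = [v for _, v in clean]
avg = mean(values)
# Simple trend check: is each retained month higher than the previous one?
rising = all(b > a for a, b in zip(values, values[1:]))
print(avg, rising)  # 135.0 True
```

In practice a library like pandas replaces the hand-rolled loop, but the shape of the work, assess quality, filter, then summarize, is the same.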
Posted 4 weeks ago
7.0 - 9.0 years
7 - 9 Lacs
Hyderabad, Telangana, India
On-site
Roles & Responsibilities: Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools. Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation. Break down features into work that aligns with the architectural direction and runway. Participate hands-on in pilots and proofs-of-concept for new patterns. Create robust documentation from data analysis and profiling, including proposed designs and data logic. Develop advanced SQL queries to profile and unify data. Develop data processing code in SQL, along with semantic views to prepare data for reporting. Develop Power BI models and reporting packages. Design robust data models and processing layers that support both analytical processing and operational reporting needs. Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments. Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Collaborate with stakeholders to define data requirements, functional specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions.

Basic Qualifications and Experience: Master's degree with 1 to 3 years of experience in Data Engineering, OR Bachelor's degree with 4 to 5 years of experience in Data Engineering, OR Diploma and 7 to 9 years of experience in Data Engineering.

Functional Skills:
Must-Have Skills: Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization. Minimum of 3 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management. Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Deep understanding of Power BI, including model design, DAX, and Power Query. Proven experience designing and implementing data mastering solutions and data governance frameworks. Expertise in cloud platforms (AWS), data lakes, and data warehouses. Strong knowledge of ETL processes, data pipelines, and integration technologies. Strong communication and collaboration skills to work with cross-functional teams and senior leadership. Ability to assess business needs and design solutions that align with organizational goals. Exceptional hands-on capabilities with data profiling, data transformation, and data mastering. Success in mentoring and training team members.

Good-to-Have Skills: Experience in developing differentiated and deliverable solutions. Experience with human data, ideally human healthcare data. Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management.

Professional Certifications: ITIL Foundation or other relevant certifications (preferred). SAFe Agile Practitioner (6.0). Microsoft Certified: Data Analyst Associate (Power BI) or related certification. Databricks Certified Professional or similar certification.

Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity. Highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presentation of complex technical and business topics to varied audiences. Confident technical leader. Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Qlik Attunity, QlikView, Qlik Sense, Qlik Replicate.
Preferred Skills: Technology -> Business Intelligence - Visualization -> QlikView; Technology -> Business Intelligence - Data Virtualization -> Qlik Sense.
Educational Requirements: Bachelor of Engineering.
Service Line: Data & Analytics Unit.
* Location of posting is subject to business requirements.
Posted 1 month ago
6.0 - 8.0 years
5 - 7 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Skills, Knowledge & Experience: Strong working knowledge of IBM Db2 LUW replication technologies (Db2 SQL Replication and Q Replication, a queue-based replication), as well as experience using third-party replication tools, is mandatory. Experience installing and configuring large IBM Db2 LUW systems in production and non-production UNIX (Linux/AIX) environments. Experience with performance tuning and optimization (PTO) using native monitoring and troubleshooting tools. Experience with backups, restores, and recovery models, and with implementing backup strategies including RTO and RPO, is mandatory. Strong knowledge of clustering, high availability (HA/HADR), and disaster recovery (DR) options for Db2 LUW. Strong knowledge of data encryption (at rest and in transit) for Db2 LUW. Strong proven working knowledge of Db2 tools: explain plan, reorg, and runstats. Strong knowledge of Db2 SQL and sourced stored procedures. Knowledge of Toad for DB2 and IBM client tools. Strong knowledge of Linux and Db2 user access security, groups, and roles. Experience with data virtualization. Experience with database design. Experience with AutoSys workload automation. Experience with MS PowerShell, Bash, VBScript, etc. Working experience in cloud environments, especially GCP, IBM Cloud, or Azure, is a big plus. Knowledge of the end-to-end IBM Maximo Application Suite installation process is a big plus.
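The RPO requirement mentioned above has a simple operational meaning: no gap between consecutive backups may exceed the maximum tolerable data loss. A small sketch of that check (the schedule values are invented for illustration):

```python
from datetime import datetime, timedelta

def meets_rpo(backup_times: list, rpo: timedelta) -> bool:
    """RPO (Recovery Point Objective) is the maximum tolerable data loss.
    A backup schedule meets it only if no gap between consecutive
    backups exceeds the RPO window."""
    ordered = sorted(backup_times)
    gaps = [b - a for a, b in zip(ordered, ordered[1:])]
    return all(gap <= rpo for gap in gaps)

start = datetime(2024, 1, 1)
hourly = [start + timedelta(hours=h) for h in range(24)]

print(meets_rpo(hourly, timedelta(hours=1)))       # True: hourly backups, 1h RPO
print(meets_rpo(hourly[::4], timedelta(hours=2)))  # False: 4-hour gaps exceed 2h RPO
```

RTO, by contrast, is about restore duration, so it is validated by timing actual recovery drills rather than by inspecting the schedule.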
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Mumbai
Work from Office
We are looking for a VMware Certified Instructor (VCI) to join our training delivery team. The ideal candidate should be passionate about teaching, technically sound in VMware technologies, and possess strong communication skills to deliver instructor-led training to our enterprise and individual clients. Must be a VMware Certified Instructor (VCI) with valid credentials. Experience delivering VMware training in both physical and virtual environments. Strong communication, presentation, and mentoring skills. Ability to manage classroom dynamics and ensure effective knowledge transfer. Prior corporate or academic training experience is a plus. Keywords: VMware, vSphere, NSX, vSAN, Certified Instructor, Technical Training, Virtualization, Online & Classroom Delivery.
Posted 2 months ago
12 - 22 years
35 - 65 Lacs
Chennai
Hybrid
Warm Greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 Yrs. Location: Pan India. Job Description: Candidates should have a minimum of 2 years of hands-on experience as an Azure Databricks Architect. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With Regards, Sankar G, Sr. Executive - IT Recruitment.
Posted 2 months ago
12 - 22 years
35 - 60 Lacs
Chennai
Hybrid
Warm Greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 Yrs. Location: Pan India.

Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Build dimensional data models applying best practices and providing business insights. Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis. Identify business needs and translate business requirements into Conceptual, Logical, Physical, and semantic multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage. Create and maintain the Source-to-Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc. Develop best practices for standard naming conventions and coding practices to ensure consistency of data models. Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information. Work with the development team to implement data strategies, build data flows, and develop conceptual data models. Create logical and physical data models using best practices to ensure high data quality and reduced redundancy. Optimize and update logical and physical data models to support new and existing projects. Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design. Data design and performance optimization for large data warehouse solutions. Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage. Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems. Strong verbal and written communication skills required.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With Regards, Sankar G, Sr. Executive - IT Recruitment.
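The star-schema pattern named in the description, a central fact table of measures joined to descriptive dimension tables, can be sketched with SQLite standing in for a cloud warehouse. Table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables hold descriptive attributes.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    -- The fact table holds foreign keys to dimensions plus numeric measures.
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO fact_sales VALUES (1, 20240101, 100.0), (2, 20240101, 150.0), (1, 20240201, 200.0);
""")

# Typical star-schema query: join the fact to a dimension, aggregate a measure.
rows = conn.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 250.0), (2, 200.0)]
```

A snowflake schema differs only in that dimensions are further normalized into sub-dimension tables; the fact table is unchanged.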
Posted 2 months ago
12 - 22 years
35 - 60 Lacs
Kolkata
Hybrid
(Same Data Modeler job description as the Chennai posting above. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in.)
Posted 2 months ago
12 - 22 years
35 - 60 Lacs
Noida
Hybrid
(Same Data Modeler job description as the Chennai posting above. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in.)
Posted 2 months ago