
315 Data Profiling Jobs - Page 2

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support projects and clients effectively. You will engage in problem-solving activities, guiding your team through challenges while ensuring that project goals are met efficiently and effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic objectives.

Professional & Technical Skills:
- Must have: a minimum of 2 S/4 HANA implementation projects' experience.
- Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and ETL processes.
- Experience with data quality management and data profiling.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
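For context on the "data profiling" skill that this and several other postings on this page ask for, here is a minimal pandas sketch of per-column profiling (row counts, null rates, distinct values). It is illustrative only: it is not SAP BusinessObjects Data Services code, and the table and column names are invented for the example.

```python
# Minimal per-column profiling: counts, null rates, distinct values, and a sample value.
import pandas as pd

def profile_dataframe(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row of profile metrics per column of the input frame."""
    metrics = []
    for col in df.columns:
        series = df[col]
        metrics.append({
            "column": col,
            "dtype": str(series.dtype),
            "rows": len(series),
            "nulls": int(series.isna().sum()),
            "null_pct": round(series.isna().mean() * 100, 2),
            "distinct": int(series.nunique(dropna=True)),
            "sample": series.dropna().iloc[0] if series.notna().any() else None,
        })
    return pd.DataFrame(metrics)

if __name__ == "__main__":
    # Hypothetical customer extract; values and column names are for illustration only.
    customers = pd.DataFrame({
        "customer_id": [1, 2, 3, 3, None],
        "country": ["IN", "IN", "DE", "DE", None],
        "revenue": [120.5, 99.0, None, 47.3, 15.0],
    })
    print(profile_dataframe(customers))
```

Commercial profiling tools surface broadly similar metrics through configuration rather than hand-written code.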

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: SAP Data Services Development
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather and analyze requirements, creating design specifications, and ensuring that the applications align with business objectives. You will also engage in discussions with team members to refine designs and troubleshoot any issues that arise during development, ensuring a smooth workflow and timely delivery of projects.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must have: proficiency in SAP Data Services Development.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data quality management and data profiling.
- Familiarity with database management systems and SQL.
- Ability to design and implement data workflows and transformations.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Data Services Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 days ago

Apply

12.0 - 17.0 years

12 - 17 Lacs

Pune

Work from Office

Role Overview: The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
- Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
- Architect and optimize strategies for data quality, metadata management, and data stewardship.
- Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
- Establish best practices for data governance, security, and compliance.
- Monitor and troubleshoot MDM environments for performance and reliability.
- Provide technical leadership and guidance to data teams.
- Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills & Responsibilities:
- Overall 12+ years of experience, with 10+ years working on DG/MDM projects.
- Strong grasp of Data Governance concepts; hands-on with different DG tools/services.
- Hands-on experience with reference data and taxonomy.
- Strong understanding of data governance, data quality, data profiling, data standards, regulations, and security.
- Match-and-merge strategy; design and implement the MDM architecture and data models.
- Usage of Spark capabilities; statistics to deduce meaning from vast enterprise-level data; different data visualization means of analyzing huge data sets.
- Good to have: knowledge of Python/R/Scala.
- Experience with DG on-premise and on-cloud.
- Understanding of MDM Customer, Product, and Vendor domains and related artifacts.
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Must have good communication and presentation skills.
- Technology stack: Collibra, IBM MDM, Reltio, InfoSphere.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Management, or a related field.
- Proven experience as a Technical Architect in Data Governance and MDM.
- Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Excellent problem-solving and communication skills.
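The "match and merge strategy" mentioned above is the core of most MDM work: detect records that describe the same real-world entity, then build a golden record using survivorship rules. The sketch below illustrates the idea with a naive fuzzy match and a "latest non-empty value wins" rule; the threshold, fields, and sample records are assumptions and do not reflect how Collibra, Informatica, or Reltio actually implement it.

```python
# Minimal illustration of MDM-style match-and-merge: fuzzy-match customer records,
# then apply a simple survivorship rule (most recently updated non-empty value wins).
from difflib import SequenceMatcher

records = [
    {"id": "A1", "name": "Acme Corp", "city": "Pune", "updated": "2024-05-01"},
    {"id": "B7", "name": "ACME Corporation", "city": "", "updated": "2024-06-15"},
    {"id": "C3", "name": "Globex Ltd", "city": "Mumbai", "updated": "2024-01-20"},
]

def is_match(a: dict, b: dict, threshold: float = 0.7) -> bool:
    """Treat two records as the same entity if their names are similar enough (threshold is illustrative)."""
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= threshold

def merge(group: list[dict]) -> dict:
    """Survivorship: the latest non-empty value per attribute wins."""
    golden: dict = {}
    for rec in sorted(group, key=lambda r: r["updated"]):
        for key, value in rec.items():
            if value:  # skip empty values so older complete values survive
                golden[key] = value
    return golden

# Naive clustering: each record joins the first group it matches.
groups: list[list[dict]] = []
for rec in records:
    for group in groups:
        if is_match(rec, group[0]):
            group.append(rec)
            break
    else:
        groups.append([rec])

golden_records = [merge(g) for g in groups]
print(golden_records)  # two golden records: the merged Acme entity and Globex
```

Real platforms replace the string ratio with configurable match rules and the merge loop with survivorship policies, but the shape of the problem is the same.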

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data and Solution Architect at our company, you will play a crucial role in requirements definition, analysis, and the design of logical and physical data models, whether Dimensional, NoSQL, or Graph. You will lead data discovery discussions with the Business in Joint Application Design (JAD) sessions and translate business requirements into logical and physical data modeling solutions. It will be your responsibility to conduct data model reviews with project team members and capture technical metadata using data modeling tools. Your expertise will be essential in ensuring that database designs efficiently support Business Intelligence (BI) and end-user requirements. You will collaborate closely with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation, and work with Data Architects on data model management, documentation, and version control. Staying updated with industry trends and standards will be crucial in driving continual improvement and enhancement of existing systems.

To excel in this role, you must possess strong data analysis and data profiling skills. Your experience in conceptual, logical, and physical data modeling for Very Large Database (VLDB) data warehouses and graph databases will be highly valuable. Hands-on experience with modeling tools like ERWIN or other industry-standard tools is required, as is proficiency in both normalized and dimensional modeling disciplines and techniques. A minimum of 3 years' experience with Oracle Database, along with hands-on experience in Oracle SQL, PL/SQL, or Cypher, is expected. Exposure to tools such as Databricks Spark, Delta technologies, Informatica ETL, and other industry-leading tools will be beneficial. Good knowledge of or experience with AWS Redshift and graph database design and management is desired, and working knowledge of AWS Cloud technologies, particularly services like VPC, EC2, S3, DMS, and Glue, will be advantageous.

You should hold a Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or have equivalent experience). Excellent verbal and written communication skills are necessary, including the ability to describe complex technical concepts in relatable terms. The ability to manage and prioritize multiple workstreams confidently and make decisions about prioritization will be crucial. A data-driven mentality, self-motivation, responsibility, conscientiousness, and a detail-oriented approach are highly valued.

In terms of education and experience, a Bachelor's degree in Computer Science, Engineering, or a relevant field, along with 3+ years of experience as a Data and Solution Architect supporting enterprise data and integration applications (or a similar role for large-scale enterprise solutions), is required. You should have at least 3 years of experience in big data infrastructure and tuning in a lakehouse data ecosystem, including data lakes, data warehouses, and graph databases. AWS Solutions Architect Professional certification is advantageous, as is extensive experience in data analysis on critical enterprise systems like SAP, E1, Mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.

If you thrive in a dynamic environment and enjoy collaborating with enthusiastic individuals, this role is perfect for you. Join our team and be part of our exciting journey towards innovation and excellence!

Posted 6 days ago

Apply

12.0 - 16.0 years

0 Lacs

Pune, Maharashtra

On-site

The Data Engineering Technology Lead position is a senior role responsible for establishing and implementing new or revised data platform ecosystems and programs in coordination with the Technology team. Your primary objective is to lead the data engineering team in implementing business requirements.

Your responsibilities will include designing, building, and maintaining batch and real-time data pipelines in the data platform, as well as optimizing the data infrastructure for accurate extraction, transformation, and loading of data from various sources. You will develop ETL processes to extract and manipulate data, automate data workflows, and prepare data in data warehouses for stakeholders. Collaboration with data scientists and functional leaders to deploy machine learning models and build data products for analytics teams will be essential. Ensuring data accuracy, integrity, privacy, security, and compliance through quality control procedures, monitoring data system performance, and implementing optimization strategies are also part of your role. You will partner with management teams to integrate functions, identify necessary system enhancements, and resolve high-impact problems. In addition, you will provide expertise in applications programming, ensure application design aligns with the architecture, and develop knowledge of how business areas integrate to accomplish goals.

Qualifications for this position include 12+ years of experience in a data engineering role, problem-solving skills, leadership abilities, service orientation, and the ability to work in a fast-paced environment. Proficiency in technical tools, strong interpersonal skills, and a Bachelor's/University degree (Master's preferred) are also required.

The position encompasses two key responsibilities:
1. Data Engineering:
- Building data pipelines: creating systems for collecting, storing, and transforming data from various sources.
- Data collection and management: gathering data, ensuring quality, and making it accessible for analysis.
- Data transformation: converting raw data into usable formats using ETL processes for analysis and reporting.
2. Data Governance and Compliance:
- Documentation of data lineage: documenting data requirements for data governance within the Citi Information Security Office.
- Data models and flow diagrams: implementing data flow diagrams to understand data movement and conducting gap analysis for remediation.
- Data model understanding: translating business needs into logical and physical data models to ensure data integrity and consistency.
- Data analysis and profiling: analyzing data sources, identifying data issues, and ensuring data quality.

This role falls under the Technology job family group and Applications Support job family, and it is a full-time position at Citi. If you need accommodations due to disability, refer to the Accessibility at Citi policy. Review Citi's EEO Policy Statement and the Know Your Rights poster for further information.

Posted 6 days ago

Apply

0.0 - 1.0 years

1 - 2 Lacs

Hassan

Work from Office

What we are looking for: a candidate who can work from home, is fluent in English, and has working experience with Microsoft Word.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Change Management Specialist at Standard Chartered Bank, you will play a crucial role in ensuring that all required changes from stakeholders are effectively delivered by following the Software Development Life Cycle (SDLC) and governance processes. Your primary responsibility will be to own change delivery for the respective portfolios and liaise with geographically dispersed stakeholders from Business and Technology teams to ensure smooth delivery of strategic changes for SCB.

Your key responsibilities will include driving end-to-end data modelling based on new product functionality, delivering changes and projects related to Capital Management and Regulatory Reporting, conducting business analysis and impact analysis, and driving data sourcing, data profiling, and business transformation logic activities. You will need a strong understanding of capital-reporting-related business domains and banking products, prepare documentation including the Business Requirement Document and Test Strategy, and perform User Acceptance Testing.

Additionally, you will be responsible for managing stakeholders across business functions and domains, coordinating with all business and technology stakeholders, developing domain content in banking products, ensuring compliance with rules and regulations, and reviewing key controls to ensure operational risk policy framework compliance. It will be essential for you to uphold the Values of the Group and Company, comply with applicable laws and regulations, and embed the highest standards of ethics across Standard Chartered Bank.

In terms of qualifications, you are required to have an MBA (Finance), ICWA, CA, or MBA (Banking) from a reputable institute and be FRM certified. Proficiency in Confluence/PM tools and the MS suite of applications is also necessary for this role.

Standard Chartered Bank is an international bank dedicated to making a positive difference for clients, communities, and employees. If you are looking for a purposeful career and want to work for a bank that values diversity and inclusion, we encourage you to apply. At Standard Chartered, we celebrate unique talents and advocate for inclusion, striving to drive commerce and prosperity through our diverse workforce. We offer core bank funding for retirement savings, medical and life insurance, flexible working options, proactive wellbeing support, continuous learning opportunities, and a values-driven culture that embraces diversity and inclusion. Join us to be part of a team that challenges the status quo, seeks new opportunities for growth, and works together to make a difference. For more information on career opportunities at Standard Chartered Bank, please visit www.sc.com/careers.

Posted 1 week ago

Apply

1.0 - 6.0 years

14 - 16 Lacs

Bengaluru

Work from Office

KPMG India is looking for an Associate Consultant - Data Governance to join our dynamic team and embark on a rewarding career journey.
- Undertake short-term or long-term projects to address a variety of issues and needs.
- Meet with management or appropriate staff to understand their requirements.
- Use interviews, surveys, etc. to collect necessary data.
- Conduct situational and data analysis to identify and understand a problem or issue.
- Present and explain findings to appropriate executives.
- Provide advice or suggestions for improvement according to objectives.
- Formulate plans to implement recommendations and overcome objections.
- Arrange for or provide training to people affected by change.
- Evaluate the situation periodically and make adjustments when needed.
- Replenish knowledge of the industry, products, and field.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

ECMS Req #: 531483
Number of Openings: 1
Duration of contract: 6
No. of years' experience: 4 to 8 Yrs
Detailed job description - Skill Set:
- Strong SQL and data transformation skills
- Strong experience in Test Data Management (TDM) and SQL
- Understanding of ETL/ELT process fundamentals
- Experience in TDM data masking using Delphix
- Experience in masking VSAM files
- Experience in performing various TDM-related activities, including data profiling/data discovery, writing custom data masking algorithms, and performing data masking
Mandatory Skills: Data masking using Delphix and VSAM file masking
Vendor Billing range (share in local currency): 10500 INR/day
Work Location: HYD/BANG/PUNE/CHENNAI/MYS
Notice Period: 2 weeks
Client Interview / F2F Applicable: Yes
Background check process to be followed: Before/After onboarding based on client
Before onboarding / After onboarding:
BGV Agency:
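On the "writing custom data masking algorithms" point: the essential property of most masking routines is determinism, so the same sensitive value always masks to the same token and joins across masked tables still line up. The sketch below shows that idea in generic Python; it is not Delphix functionality, and the field names and salt are invented for illustration.

```python
# Generic sketch of a deterministic masking routine of the kind TDM work involves.
import hashlib

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically mask a sensitive string (same input -> same token)."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASK_" + digest[:10].upper()

def mask_email(email: str) -> str:
    """Keep the domain for realism, mask the local part."""
    local, _, domain = email.partition("@")
    return mask_value(local) + "@" + domain

# Hypothetical rows; field names are assumptions for illustration only.
customers = [
    {"customer_id": "C1001", "name": "Asha Rao", "email": "asha.rao@example.com"},
    {"customer_id": "C1002", "name": "Ravi Kumar", "email": "ravi.k@example.com"},
]
orders = [{"order_id": "O9001", "customer_id": "C1001", "amount": 2499.0}]

masked_customers = [
    {**row, "customer_id": mask_value(row["customer_id"]),
     "name": mask_value(row["name"]), "email": mask_email(row["email"])}
    for row in customers
]
masked_orders = [{**row, "customer_id": mask_value(row["customer_id"])} for row in orders]

# Referential integrity is preserved: the masked order still points at the masked customer.
assert masked_orders[0]["customer_id"] == masked_customers[0]["customer_id"]
print(masked_customers, masked_orders)
```

A production masking engine adds format-preserving algorithms and secure key management; this only illustrates why consistency across tables matters.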

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

BlitzenX is looking for an experienced ClaimCenter Data Migration Developer to take charge of data transformation and ingestion projects for large-scale legacy-to-cloud ClaimCenter migrations. The ideal candidate should have substantial experience in Guidewire Cloud migrations and possess a current certification on the Las Lenas release.

As a ClaimCenter Data Migration Developer at BlitzenX, your responsibilities will include architecting and implementing end-to-end data migration pipelines, translating complex legacy schemas into ClaimCenter entity models, optimizing performance and integrity through custom scripting, and developing validation tools for data reconciliation. You will also be expected to execute mock migrations, work closely with QA and business teams on migration testing, and ensure compliance with data governance standards such as GDPR and SOC 2.

To be successful in this role, you must have at least 6 years of Guidewire development experience, with a minimum of 3 years in Guidewire ClaimCenter data migration. You should also have proven expertise in ClaimCenter data structures, Guidewire Cloud integrations, and the ETL lifecycle. Strong programming skills in Gosu, Java, and SQL are essential, along with a deep understanding of bulk load strategies on cloud platforms. The Guidewire Certified Associate - ClaimCenter on Las Lenas certification is a mandatory requirement. Soft skills such as extreme attention to detail, the ability to work under pressure, strong communication skills, self-motivation, and a collaborative mindset are also highly valued.

This role offers significant career growth, with the possibility of advancing to Lead Developer and Migration Architect roles supporting global ClaimCenter cloud initiatives as part of the Vision 2030 Insurance Technology growth track. If you are ready to take on the challenge and contribute to cutting-edge data migration projects, we encourage you to apply and be a part of our high-performance Agile team.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Infosys Quality Engineering

Responsibilities: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities: The job opening is for multiple locations: BANGALORE, BHUBANESWAR, MYSORE, HYD, CHENNAI, PUNE, COIMBATORE, THIRUVANANTHAPURAM. Please apply only if you have the skills mentioned under the technical requirements.

Technical and Professional Requirements - Skills Required:
- Strong understanding of MDM concepts, data governance, and data stewardship.
- Proficiency in SQL and data validation techniques.
- Experience with one or more MDM platforms (Informatica, Reltio, SAP MDG, etc.).
- Familiarity with data modeling, data profiling, and data quality tools.
- Knowledge of integration technologies (ETL, APIs, messaging systems).

Preferred Skills: Technology - Data Services Testing - MDM Testing
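For the "SQL and data validation techniques" requirement, a common MDM-testing pattern is to reconcile a source table against a target table by comparing row counts and an order-insensitive checksum. A minimal sketch follows, using in-memory SQLite in place of the real source and target systems; table and column names are invented.

```python
# Source-vs-target reconciliation: compare row counts and an order-insensitive checksum.
import sqlite3

def table_fingerprint(conn: sqlite3.Connection, table: str) -> tuple[int, int]:
    """Return (row_count, order-insensitive checksum) for a table."""
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    checksum = 0
    for row in rows:
        checksum ^= hash(row)  # XOR keeps the checksum independent of row order
    return len(rows), checksum

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, country TEXT)")
source.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                   [(1, "Asha", "IN"), (2, "Ravi", "IN"), (3, "Meera", "DE")])
target.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                   [(3, "Meera", "DE"), (1, "Asha", "IN"), (2, "Ravi", "IN")])  # same data, different order

src = table_fingerprint(source, "customer")
tgt = table_fingerprint(target, "customer")
print("row counts match:", src[0] == tgt[0])
print("checksums match: ", src[1] == tgt[1])
```

Note that checksums computed with Python's hash() are only comparable within a single run; a production version would compute a database-side aggregate over hashed rows instead.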

Posted 1 week ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Business Adm., Master of Commerce, Master of Engineering, Master of Technology, Master of Technology (Integrated), Bachelor of Business Adm., Bachelor of Commerce, Bachelor of Engineering, Bachelor of Technology, Bachelor of Technology (Integrated)
Service Line: Enterprise Package Application Services

Responsibilities - A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains

Technical and Professional Requirements:
- At least 2 years of configuration and development experience with the implementation of OFSAA solutions (such as ERM, EPM, etc.)
- Expertise in implementing OFSAA technical areas covering OFSAAI and frameworks: Data Integrator, Metadata Management, Data Modelling
- Perform and understand data mapping from source systems to OFSAA staging; execute OFSAA batches and analyse result-area tables and derived entities
- Perform data analysis using OFSAA metadata (i.e., technical metadata, rule metadata, business metadata), identify any data mapping gaps, and report them to stakeholders
- Participate in requirements workshops, help implement the designed solution, perform testing (UT, SIT), coordinate user acceptance testing, etc.
- Knowledge of and experience with the full SDLC lifecycle
- Experience with Lean/Agile development methodologies

Preferred Skills: Technology - Oracle Industry Solutions - Oracle Financial Services Analytical Applications (OFSAA)

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

As a senior SAP Consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of SAP Accelerate delivery methodology or equivalent and the associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP Solution Focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 12 years of experience required, with technical knowledge of and experience in working with SAP data management tools such as Data Services, Cockpit, MDG, and HANA EIM.
- Hands-on MDG configuration experience, including configuration related to customer and core STE processes.
- Experience in IM architecture, data migrations, data profiling, and data quality.
- Implementation experience with MDG in key domains such as Finance, Customer, Supplier, Material, and Business Partners.
- Experience in implementation, development, or configuration of one or more of the following solutions from the Data Management Suite: SAP Data Services, SAP MDG, Migration Cockpit, HANA EIM SDI.
- Experience writing scripts and complex SQL statements.

Preferred technical and professional experience:
- Experience in data migration methodologies, specifically legacy-to-SAP migration using solutions like Data Services, is nice to have.
- Implementation experience and knowledge in at least two of these areas would be an added advantage: Master Data Management.

Posted 1 week ago

Apply

2.0 - 6.0 years

6 - 11 Lacs

Coimbatore

Work from Office

As a managing consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of SAP Accelerate delivery methodology or equivalent and the associated work products. You will lead design workshops, support business development activities, and mentor and coach team members to develop their skills and knowledge. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP Solution Leadership: leading the technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Team Delivery Leadership: leading and managing a high-performing team of SAP consultants to deliver work products on time, on budget, and with quality.
- Comprehensive Solution Delivery: involvement in strategy development and solution implementation, leveraging your functional expertise in SAP with clients and team members, and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ, with 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules.
- Experience in gathering business requirements and creating requirement specifications based on architecture, design, and detailing of processes.
- Able to prepare mapping sheets combining functional and technical expertise.
- BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems, including data migration from SAP ECC to S/4HANA using Migration Cockpit or other methods.
- In addition to data migration experience, strong knowledge of or experience with BOIS (BO Information Steward) for data profiling or data governance.

Preferred technical and professional experience:
- BODS administration experience/knowledge.
- Working or strong knowledge of SAP Data Hub.
- Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate/transform data.
- Ability to develop workflows and data flows based on specifications, using the various stages in BODS.

Posted 1 week ago

Apply

5.0 - 7.0 years

3 - 7 Lacs

Gurugram

Work from Office

About the Role: A NASDAQ-listed company that has effectively maintained its position as a front-runner in the technology sector is looking to onboard a skilled Python developer keen on helping it expand the power of AI in the mobile coding ecosystem, with the ultimate goal of creating new AI-powered assistant tools for the development domain. The company is developing the next generation of coding-assistant agents, which will have a wide range of uses. For those who are keen to learn in a fast-paced setting, this is an exciting opportunity.

Responsibilities:
- Navigate and modify complex Rust codebases using CLI tools like grep and ripgrep.
- Implement new features with a focus on memory safety, ownership rules, and type correctness.
- Write and execute tests using cargo test, including property-based testing (proptest or quickcheck).
- Refactor existing Rust code while maintaining functionality and performance.
- Debug and fix memory safety, ownership, and concurrency-related issues.
- Set up and manage Rust development environments using cargo, including handling dependencies and feature flags.
- Ensure best practices in Rust development, including proper error handling, concurrency safety, and efficient memory usage.

Requirements:
- Strong experience with Rust programming language concepts, including ownership, borrowing, and lifetimes.
- Familiarity with Rust frameworks like Tokio, Actix, and Rocket, and libraries such as Serde and Rayon.
- Experience with Rust's testing ecosystem, including unit, integration, and property-based testing.
- Knowledge of multi-threading and asynchronous programming in Rust.
- Ability to work with complex architectural patterns and refactor code without introducing regressions.
- Strong debugging skills, including fixing memory and concurrency issues.
- Experience with performance profiling and benchmarking in Rust (cargo bench).
- 4+ years of work experience.

This role provides an opportunity to work on challenging Rust engineering problems while improving AI-assisted programming workflows. If you're passionate about Rust and eager to push the boundaries of AI-driven software development, we'd love to hear from you!

Nice to Have:
- Experience contributing to open-source Rust projects.
- Familiarity with writing Rust documentation and designing APIs with doc-tests.

Search Guidance:
- Mandatory Skills: Rust - 3 yrs; Rust frameworks (Tokio, Actix, Rocket) - 3 yrs; libraries (Serde and Rayon) - 3 yrs
- Nice to Have: Rust documentation and designing APIs with doc-tests
- Total Years of Experience: 5+ years
- Overlap Required: 4 hours PST

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Punjab

On-site

The Data Quality Specialist role in Melbourne requires the following qualifications and experience:
- A tertiary qualification in I.T. or a related discipline is a must.
- A minimum of 5 years' experience in a similar data quality management role is required.
- Working knowledge and practical experience in SQL and associated tools are necessary. Experience with relational databases and big data technologies is preferred.
- Proficiency in the methods and tools used for data profiling and data analysis is essential.
- Ability to interpret business processes and relate them to application workflow and to logical and physical data structures is required.
- Understanding of business rules and their enforcement in determining data quality outcomes is crucial.
- Knowledge of and experience with industry-standard data governance and data management guidelines and practices are expected.
- Working knowledge of data visualization tools to create dashboards and scorecards is beneficial.
- Excellent communication skills, self-motivation, and the ability to work in a team as well as take individual responsibility are important qualities for this role.
- Experience in the Health Insurance and Health Services industries is an advantage.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

4CRisk is an AI start-up uniquely positioned to identify and solve the annual $300 billion risk and compliance problem for banks, non-bank financial institutions, and FinTech companies. The company's mission is to help customers protect brand value and strengthen revenues by reducing risk and the cost of compliance. At 4CRisk, technology, data, UI, and products have all been conceived with a customer-centric approach, believing that culture trumps aptitude. Our engineering center (4CRisk.ai Software Private Ltd.) in Bangalore, India is seeking bright and passionate candidates who share our vision and wish to be part of a team of talented professionals.

We are looking for a Data Quality Analyst to use regulatory data to drive product decisions. Collaborating with cross-functional teams of product managers, designers, and engineers, you will apply your expertise to deliver customer insights and help shape the products we offer. Leveraging rich user data through cutting-edge technology, you will see your insights transformed into real products.

Key Responsibilities:
- Perform statistical tests on large datasets to determine data quality and integrity.
- Evaluate system performance and design and their impact on data quality.
- Collaborate with AI and data engineers to enhance data collection and storage processes.
- Run data queries to identify quality issues and data exceptions, and clean the data.
- Gather data from primary or secondary sources to identify and interpret trends.
- Report data analysis findings to management to inform business decisions and prioritize information system needs.
- Document processes and maintain data records.
- Adhere to best practices in data analysis and collection.
- Stay updated on developments and trends in data quality analysis.

Required Experience/Skills:
- Data quality analysis experience is a must, including root-cause analysis and data slicing.
- Designing, building, and executing data quality plans for complex data management solutions on modern data processing frameworks.
- Understanding data lineage and preparing validation cases to verify data at each stage of the data processing journey.
- Planning, designing, and conducting validations of data-related implementations to achieve acceptable results.
- Developing dataset creation scripts for data verification during the extraction, transformation, and loading phases by validating data mapping and transformation rules.
- Supporting AI and Product Management teams by contributing to the development of a data validation strategy focused on building the regression suite.
- Documenting issues and collaborating with data engineers to resolve them and ensure quality standards.
- Efficiently capturing business requirements and translating them into functional, non-functional, and semantic specifications.
- Data profiling, data modeling, and data validation testing experience is a plus.
- 1 to 3+ years of proven experience.
- Excellent presentation, communication (oral and written, in English), and relationship-building skills across all management levels and customer interactions.
- Ability to collaborate with team members globally and across departments.

Location: Bangalore, India.
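As an illustration of the "statistical tests on large datasets" responsibility, the sketch below runs a few rule-based checks (uniqueness, allowed domain values) plus a z-score outlier scan over a toy batch. The columns, allowed values, and the |z| > 2 threshold are assumptions for the example, not 4CRisk's actual validation plan.

```python
# Simple statistical data-quality checks: rule validations plus a z-score outlier scan.
import pandas as pd

batch = pd.DataFrame({
    "txn_id": range(1, 9),
    "amount": [120, 95, 110, 130, 98, 105, 4000, 115],  # 4000 is a planted outlier
    "currency": ["INR"] * 7 + ["XX"],                    # "XX" violates the domain rule
})

failures = []

# Rule 1: txn_id must be unique and non-null.
if batch["txn_id"].isna().any() or batch["txn_id"].duplicated().any():
    failures.append("txn_id uniqueness/null check failed")

# Rule 2: currency must come from an approved domain.
allowed = {"INR", "USD", "EUR"}
bad_currency = batch[~batch["currency"].isin(allowed)]
if not bad_currency.empty:
    failures.append(f"{len(bad_currency)} rows with unexpected currency codes")

# Rule 3: flag amounts more than 2 standard deviations from the mean (illustrative threshold).
z = (batch["amount"] - batch["amount"].mean()) / batch["amount"].std()
outliers = batch[z.abs() > 2]
if not outliers.empty:
    failures.append(f"{len(outliers)} amount outliers (|z| > 2)")

print("PASS" if not failures else "FAIL", failures)
```

In practice, thresholds and rules would be driven by the validation plan and regression suite rather than hard-coded constants.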

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for designing, developing, and optimizing data models within the Celonis Execution Management System (EMS). Your duties will include extracting, transforming, and loading (ETL) data from flat files and UDP into Celonis. It is essential to work closely with business stakeholders and data analysts to understand data requirements and ensure an accurate representation of business processes. Additionally, you will develop and optimize PQL (Process Query Language) queries for process mining. Collaboration with group data engineers, architects, and analysts is crucial to ensure high-quality data pipelines and scalable solutions. Data validation, cleansing, and transformation will also be part of your responsibilities to enhance data quality. Monitoring and troubleshooting data integration pipelines to ensure performance and reliability are key tasks, and you will provide guidance and best practices for data modeling in Celonis.

To qualify for this role, you should have a minimum of 5 years of experience in data engineering, data modeling, or related roles. Proficiency in SQL, ETL processes, and database management (e.g., PostgreSQL, Snowflake, BigQuery, or similar) is required, along with experience working with large-scale datasets and optimizing data models for performance. Your data management experience must cover the data lifecycle and critical functions such as data profiling, data modeling, data engineering, and data consumption products and services.

Strong problem-solving skills are necessary, along with the ability to work in an agile, fast-paced environment. Excellent communication skills and demonstrated hands-on experience communicating technical topics with non-technical audiences are expected. You should be able to collaborate effectively, manage the timely completion of assigned activities in a highly virtual team environment, and work well with cross-functional teams.

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Bengaluru

Work from Office

[{"Salary":"4500000" , "Remote_Job":false , "Posting_Title":"Data Story Teller" , "Is_Locked":false , "City":"Bangalore" , "Industry":"Technology" , "Job_Description":" Were looking for a **Data Storyteller/UX Analyst**who can bridge the gap between data and decision-making. Youll work withanalysts, business leaders, and product teams to craft compelling stories fromcomplex data, enabling smarter decisions across the organization by providingactionable insights. Eligibility Criteria: Years of Experience : Minimum 14 years Job Experience: o Experience with Data Analysis/Data Profiling, Visualization tools ( Power BI) o Experience in Database and Data warehousetech (Azure Synapse/ SQL Server/SAPHANA/MS fabric) o Experience in Stakeholdermanagement / requirement gathering/delivery cycle. Educational : o BachelorDegree: Math/Statistics/Operations Research/ComputerScience o MasterDegree : BusinessAnalytics (with a background in Computer Science) Primary Responsibilities: -Translate complex data analyses into clear, engaging narratives tailored todiverse audiences. - Develop impactful data visualizations and dashboards using tools like PowerBI or Tableau. - Educateand Mentor team to develop the insightful dashboards by using multiple DataStory telling methodologies. - Collaborate with Data Analysts, Data Scientists, Business Analysts and Businessstakeholders to uncover insights. - Understand business goals and align analytics storytelling to drive strategicactions. - Create presentations, reports, and visual content to communicate insightseffectively. - Maintain consistency in data communication and ensure data-drivenstorytelling best practices. Mandatory Skills required to perform the job: Data Analysisskills, experience in extracting information from databases, Office 365 Professional and Proven Data Storytellerthrough BI Experience in Agile/SCRUMprocess and development using any tools. Knowledge of SAPsystems (SAP ECC T-Codes & Navigation) Proven abilityto tell stories with data, combining analytical rigor with creativity. Strong skills indata visualization tools (eg, Tableau, Power BI) and presentation tools(eg, PowerPoint, Google Slides). Proficiency inSQL and basic understanding of statistical methods or Python/R is a plus. Excellentcommunication and collaboration skills. Ability todistill complex information into easy-to-understand formats. Desirable Skills: Background injournalism, design, UX, or marketing alongside analytics. Experienceworking in fast-paced, cross-functional teams. Familiarity withdata storytelling frameworks or narrative design. ","Job_Type":"Full time","Job_Opening_Name":"Data Story Teller" , "State":"Karnataka" , "Country":"India" , "Zip_Code":"560048" , "id":"153957000004621904" , "Publish":true , "Date_Opened":"2025-07-22" , "Keep_on_Career_Site":false}]

Posted 1 week ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Project description: We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, and identify gaps or optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Skills - Must have:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Skills - Nice to have:
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.

Posted 1 week ago

Apply

7.0 - 10.0 years

0 - 1 Lacs

Bengaluru

Remote

Job Title: Senior Data Engineer - Contractual (Remote | 1-2 Month Project)
Company: Covalensedigital
Job Type: Contract (short-term: 1 to 2 months)
Location: Remote
Experience: 7+ years (3+ years in Databricks/Azure Data Engineering)

Job Description: We are looking for an experienced Senior Data Engineer for a short-term remote project (1-2 months) to join Covalensedigital on a contractual basis.

Key Responsibilities:
- Design and implement robust data pipelines using Azure Data Factory (ADF) and Databricks
- Work on data ingestion, transformation, cleansing, and aggregation from multiple sources
- Use Python, Spark, and SQL to develop scalable data workflows
- Integrate pipelines with external APIs and systems
- Ensure data quality, accuracy, and adherence to standards
- Collaborate with data scientists, analysts, and engineering teams
- Monitor and troubleshoot data pipelines for smooth operation

Must-Have Skills:
- Python / Spark / SQL / ADLS / Databricks / ADF / ETL
- 3+ years of hands-on experience in Azure Databricks
- Deep understanding of large-scale data architecture, data lakes, warehousing, and cloud/on-premise hybrid solutions
- Strong experience in data cleansing and Azure Data Explorer workflows
- Ability to work independently and deliver high-quality output within timelines
- Excellent communication skills

Bonus:
- Experience in Insurance Domain projects
- Familiarity with data quality frameworks, data cataloging, and data profiling tools

Contract Details:
- Duration: 1 to 2 months
- Type: Contractual (Remote)
- Start: Immediate

How to Apply: Interested candidates, please send your resume to kalaivanan.balasubamaniam@covalensedigital.com

Thanks,
Kalai - 8015302990

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Job Description: Senior/Azure Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai
- At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies like ADB, ADF, SQL (with the capability of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on in Python, PySpark, or Spark SQL
- Hands-on in Azure Analytics and DevOps
- Taking part in Proof of Concepts (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for the technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions
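A minimal PySpark sketch of the ingest-cleanse-aggregate pattern that Azure Databricks roles like this describe is shown below. It uses a local SparkSession and inline rows so it stays self-contained; on the job, the same transformations would read from ADLS/Delta tables, and the column names here are invented for illustration.

```python
# Ingest -> cleanse -> aggregate with PySpark (local session, inline sample data).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse-and-aggregate").getOrCreate()

raw = spark.createDataFrame(
    [("2024-06-01", "IN", "  widget ", 120.0),
     ("2024-06-01", "IN", None,        80.0),
     ("2024-06-02", "DE", "gadget",    None)],
    ["order_date", "country", "product", "amount"],
)

cleansed = (
    raw
    .withColumn("product", F.trim(F.col("product")))  # strip stray whitespace
    .na.fill({"product": "UNKNOWN"})                   # standardise missing categories
    .filter(F.col("amount").isNotNull())               # drop rows failing a basic quality rule
)

daily_revenue = cleansed.groupBy("order_date", "country").agg(F.sum("amount").alias("revenue"))
daily_revenue.show()
spark.stop()
```

On Databricks, the SparkSession is provided by the cluster and the same chain of transformations would typically write the result back to a Delta table.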

Posted 1 week ago

Apply

1.0 - 6.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Job Description: Senior/Azure Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai
- At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies like ADB, ADF, SQL (with the capability of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on in Python, PySpark, or Spark SQL
- Hands-on in Azure Analytics and DevOps
- Taking part in Proof of Concepts (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for the technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or unsolicited emails claiming to be from the company. These emails may ask recipients to provide personal information or to make payments as part of an illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 1 week ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Pune

Work from Office

JD for the Senior Windchill Data Migration Engineer:
- 4+ years of experience in large-scale data migration projects, preferably in the healthcare devices manufacturing domain.
- 7+ years of total experience in the IT industry.
- Experience in using the Windchill Bulk Migrator (WBM) tool for large-scale data migrations to PTC Windchill.
- Good experience with data migration ETLV (Extract, Transform, Load and Validation) concepts and tools.
- Experience in loading high-volume data from different source systems to PTC Windchill using WBM or a load-from-file approach.
- Develop and execute data extraction and transformation scripts based on the migration scope and migration procedure.
- Strong understanding of the Windchill PLM and Manufacturing data model.
- Good knowledge of the Windchill database and table structure.
- Able to write SQL (Oracle) queries as needed for the work.
- Strong analytical thinking and experience in data profiling and analysis.
- Strong and clear communicator who can communicate effectively with the project team.
- Track data migration defects, analyze root causes, determine solutions, and support the timely resolution of defects related to extraction and transformation.
- Support data cleansing and data construction activities.

Posted 1 week ago

Apply

5.0 - 12.0 years

0 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

- Familiarity with data management standards.
- Ability to work with high volumes of detailed technical and business metadata.
- Experience documenting data element metadata (business elements vs. technical data elements).
- Experience understanding how data transformations materialize and determining the appropriate controls required to ensure a high level of data quality.
- Ability to understand and document application- and/or data-element-level flows (i.e., lineage).
- Ability to analyze both processes and datasets to identify meaningful, actionable outcomes.
- Understand and implement changes to business processes; develop and influence the business processes necessary to support data-governance-related outcomes.
- Manage and influence across vertical organizations to achieve common objectives.
- Intermediate to expert-level knowledge of MS products such as Excel, PowerPoint, Word, Skype, and Outlook.
- Working knowledge of metadata tools such as Collibra or equivalent.
- Familiarity with data analytics/BI tools such as Tableau, MicroStrategy, etc.

Communication Skills:
- Create visually and verbally engaging, informative materials for departmental leadership, business partners, executives, and stakeholders.
- Ability to tailor communication of topics to various levels of the organization (e.g., technical audiences vs. business stakeholders).

Desired Skills (nice-to-have):
- General knowledge of the banking industry.
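On documenting data-element-level flows (lineage): once lineage has been captured as edges between elements, tracing every upstream source of a reporting field is a simple graph walk. The sketch below shows the idea in plain Python; the system and column names are invented, and no specific metadata tool (e.g., Collibra) is implied.

```python
# Tiny sketch of recording data-element lineage as a directed graph and querying it.
from collections import defaultdict

# Each edge maps an upstream element to a downstream element it feeds.
lineage_edges = [
    ("core_banking.acct.balance", "warehouse.fact_balance.balance_amt"),
    ("warehouse.fact_balance.balance_amt", "report.liquidity.total_balance"),
    ("fx.rates.usd_inr", "report.liquidity.total_balance"),
]

upstream = defaultdict(set)
for src, dst in lineage_edges:
    upstream[dst].add(src)

def trace_upstream(element: str) -> set[str]:
    """Walk the lineage graph to find every source feeding a given element."""
    sources: set[str] = set()
    stack = [element]
    while stack:
        for parent in upstream[stack.pop()]:
            if parent not in sources:
                sources.add(parent)
                stack.append(parent)
    return sources

print(trace_upstream("report.liquidity.total_balance"))
# Expected: the core-banking balance and the FX rate, via the warehouse fact table.
```

Metadata tools store the same information with richer attributes (owners, classifications, transformation rules), but impact analysis reduces to exactly this kind of upstream/downstream traversal.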

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies