4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Quality Engineer, your primary responsibility will be to analyze business and technical requirements to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations. You will perform data validation, reconciliation, and integrity checks across various data sources and target systems, and build and automate data quality checks using SQL and/or Python scripting. You will identify, document, and track data quality issues, anomalies, and defects. Collaboration is key in this role: you will work closely with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure that data quality standards are met. You will define data quality KPIs, implement continuous monitoring frameworks, and participate in data model reviews, providing input on data quality considerations. When data discrepancies arise, you will perform root cause analysis and work with teams to drive resolution, while ensuring alignment with data governance policies, standards, and best practices.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have 4 to 7 years of experience as a Data Quality Engineer, ETL Tester, or in a similar role. A strong understanding of ETL concepts, data warehousing principles, and relational database design is essential, as is proficiency in SQL for complex querying, data profiling, and validation tasks. Familiarity with data quality tools, testing methodologies, and modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift) will be advantageous.

Desired qualifications include advanced knowledge of SQL; experience with data pipeline tools such as Airflow, dbt, or Informatica; and experience integrating data validation processes into CI/CD pipelines using tools like GitHub Actions, Jenkins, or similar. An understanding of big data platforms, data lakes, non-relational databases, data lineage, and master data management (MDM) concepts, along with experience in Agile/Scrum development methodologies, will help you excel in this role. Excellent analytical and problem-solving skills and strong attention to detail will be valuable assets in fulfilling the responsibilities of a Data Quality Engineer.
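The automated SQL/Python data quality checks this posting describes can be illustrated with a minimal, hedged sketch. Everything here is hypothetical: the `reconcile` helper, the table names, and the in-memory SQLite database standing in for real source and target systems.

```python
import sqlite3

def reconcile(conn, source, target, key):
    """Row-count and key-level reconciliation between a source and target table."""
    cur = conn.cursor()
    src_rows = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_rows = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    # Keys present in the source but missing from the target (load gaps).
    missing = cur.execute(
        f"SELECT COUNT(*) FROM {source} s "
        f"WHERE NOT EXISTS (SELECT 1 FROM {target} t WHERE t.{key} = s.{key})"
    ).fetchone()[0]
    return {"source_rows": src_rows, "target_rows": tgt_rows, "missing_keys": missing}

# Toy source/target tables with one un-loaded row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src(id INTEGER, amount REAL);
    CREATE TABLE tgt(id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
report = reconcile(conn, "src", "tgt", "id")
print(report)  # {'source_rows': 3, 'target_rows': 2, 'missing_keys': 1}
```

In a real ETL pipeline the same queries would run against the actual warehouse, and the returned metrics would feed the KPI and continuous-monitoring framework the role defines.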
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
As a Data Catalog Developer specializing in the Alation platform at Ciena, you will play a pivotal role in designing, developing, and implementing data catalog solutions. Collaborating with cross-functional teams, you will ensure that data catalog initiatives meet business needs and enhance data quality across the organization. Your expertise in data catalog development will be critical in improving data management capabilities and supporting strategic decision-making.

Partnering With Business Teams
You will collaborate with data owners, stewards, and business leaders to gather requirements and define data catalog strategies aligned with business objectives. Acting as a key technical resource between business units and IT teams, you will ensure seamless integration of data catalog solutions with existing systems and processes. Providing guidance and best practices on data modeling, data governance, and data catalog lifecycle management, you will drive user engagement, adoption, and continuous design and configuration across the Alation data catalog program.

Project Execution
You will develop and implement data catalog solutions on the Alation platform, adhering to best practices and industry standards. You will create technical specifications, design documents, and implementation plans for data catalog projects, ensuring timely delivery and high-quality outcomes. Communicating technical concepts effectively to both technical and non-technical audiences is essential for alignment and understanding across teams. You will also be responsible for system testing, resolving defects, facilitating discussions around business issues, and engaging relevant resources related to data and integration.

Metrics For Success
Collaborating with business and IT partners, you will define key performance indicators (KPIs) for data catalog initiatives that align with organizational goals. Establishing data quality metrics to measure the accuracy, consistency, and completeness of data within the catalog will be crucial. Tracking data catalog adoption metrics and their impact on business processes and decision-making, and gathering and analyzing stakeholder feedback to continuously enhance data catalog processes and solutions, will also be part of your success metrics.

The Must Haves
Education:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field.
Experience:
- Minimum of 3-5 years of experience in data catalog development, specifically with the Alation platform, with a successful track record of delivering data catalog projects.
Functional Skills:
- Good understanding of data governance frameworks and methodologies, including data lineage, metadata management, MDM, Reference Data Management, and compliance with data privacy regulations.
- Strong understanding of data catalog and data dictionary principles, data management best practices, data quality management, and data governance practices within an Alation environment.
- Experience in data querying, profiling, data cleansing, and data transformation processes.
Alation Technical Skills:
- Subject matter expert for the Alation platform.
- Expertise in configuring the Alation data model, data lineage, and metadata management.
- Proficiency with Alation APIs.
- Experience managing Reference Data Management (RDM), user management, UI configuration, workflows, loading/exporting data, and optimizing processes.
- Design, data modeling, and management of large datasets/data models.
- Hands-on experience with onboarding metadata from various sources.
General Skills:
- Excellent verbal and written communication skills.
- Strong analytical and problem-solving skills.
- Experience with Agile project management methodologies and tools.
Assets:
- Additional experience in the MDM space working on the Reltio platform.
- Knowledge of cloud storage solutions.
- Experience with programming languages like Java, Python, and JavaScript, API protocols, and data formats.
- Experience with data warehouses and data visualization tools.

Ciena is an Equal Opportunity Employer that values diversity and respects its employees. Accommodation measures are available upon request.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We're looking for candidates with strong technology and data understanding in the data modeling space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm and a growing Data and Analytics team.

Your key responsibilities include employing tools and techniques used to understand and analyze how to collect, update, store, and exchange data. You will define and employ data modeling and design standards, tools, best practices, and related development methodologies. Additionally, you will design, review, and maintain data models; perform data analysis activities to capture data requirements and represent them in data model visualizations; manage the life cycle of the data model from requirements through design, implementation, and maintenance; work closely with data engineers to create optimal physical data models of datasets; and identify areas where data can be used to improve business activities.

Skills and attributes for success:
- Experience: 3-7 years
- Data modeling (relevant knowledge): 3 years and above
- Experience with data modeling tools including, but not limited to, Erwin Data Modeler, ER/Studio, Toad, etc.
- Strong knowledge of SQL
- Basic ETL skills to ensure the implementation meets the documented specifications for ETL processes, including data translation/mapping and transformation
- Good data warehouse knowledge
- Visualization skills (optional)
- Knowledge of DQ and data profiling techniques and tools

To qualify for the role, you must:
- Be a computer science graduate or equivalent with 3-7 years of industry experience
- Have working experience in an Agile-based delivery methodology (preferable)
- Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Possess strong analytical skills and enjoy solving complex technical problems
- Be proficient in software development best practices
- Excel at debugging and optimization
- Have experience in enterprise-grade solution implementations and in converting business problems/challenges into technical solutions that consider security, performance, scalability, etc.
- Be an excellent communicator (written and verbal, formal and informal)
- Participate in all aspects of the solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
- Possess client management skills

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
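As a rough illustration of the logical-to-physical modeling workflow this role describes, the sketch below materializes a tiny hypothetical star schema (one dimension, one fact). The table and column names are invented for illustration, and stdlib sqlite3 stands in for an enterprise RDBMS.

```python
import sqlite3

# Hypothetical physical model: a customer dimension and a sales fact
# keyed to it, the simplest possible star-schema shape.
DDL = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    amount REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['dim_customer', 'fact_sales']
```

In practice the logical model would come from a tool such as Erwin or ER/Studio, and the generated DDL would target the actual warehouse platform rather than SQLite.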
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a Data Modeler specializing in Hybrid Data Environments, you will play a crucial role in designing, developing, and optimizing data models that facilitate enterprise-level analytics, insights generation, and operational reporting. You will collaborate with business analysts and stakeholders to comprehend business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models. Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process. Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and understanding of Azure cloud services, Databricks, and big data technologies are essential. Your ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role. 
In summary, as a Data Modeler for Hybrid Data Environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs and leverage new data modeling techniques and cloud capabilities.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As part of the Infosys delivery team, your primary role will involve interfacing with clients for quality assurance issue resolution and ensuring high customer satisfaction. You will be responsible for understanding requirements, creating and reviewing designs, validating architecture, and ensuring high levels of service offerings to clients in the technology domain. Participation in project estimation, providing inputs for solution delivery, conducting technical risk planning, performing code reviews, and reviewing unit test plans are also key aspects of your responsibilities. In addition, you will lead and guide your teams toward developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. Your contribution will be instrumental in building efficient programs and systems. If you believe you possess the skills to assist clients in navigating their digital transformation journey, this role is tailored for you.

Technical Requirements:
- Healthcare data analysis
- PL/SQL
- SQL
- Data mapping
- STTM creation
- Data profiling
- Reports

Preferred Skills:
- Domain expertise in Healthcare, specifically Healthcare - ALL.
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The role of a Senior Data Governance Resource at Cittabase entails spearheading the design and implementation of a robust data governance framework aligned with industry best practices and regulatory requirements. Drawing on your extensive experience in data governance methodologies, you will collaborate with business stakeholders to address data needs and challenges effectively. Your responsibilities will include developing and maintaining data governance policies, overseeing data quality initiatives, and identifying key data governance projects. As a seasoned professional with 8-10 years of experience in data governance or related fields, you will play a crucial role in championing data governance across the organization and fostering a data-driven culture. Your expertise in data governance tools, such as Informatica Data Governance, will be instrumental in automating data governance processes and workflows. Additionally, you will lead training programs to educate stakeholders on data governance principles and practices while tracking and reporting on key data governance metrics and KPIs. The ideal candidate for this role should possess strong analytical and problem-solving skills, along with excellent communication, collaboration, and interpersonal abilities. Proficiency in data quality concepts, relevant data privacy regulations, and a demonstrated ability to mentor junior team members are essential qualifications for this position. By joining the dynamic team at Cittabase, you will have the opportunity to contribute to innovative data governance projects and stay abreast of emerging trends and technologies in the data governance sphere. If you are a data governance professional seeking a challenging and rewarding opportunity, we invite you to apply for this full-time, permanent position in Chennai, TN, India. Take the next step in your career and be part of our exciting journey at Cittabase. Apply now to make a meaningful impact with us.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a Global Data Steward at Axalta's facility in Gurugram, Haryana, you will play a crucial role in ensuring the smooth operation of business processes by managing master data objects such as creation, update, obsolescence, reactivation, and accurate data maintenance in the system. Your responsibilities will include collaborating with business teams to clarify requests, maintaining data quality, testing data creations/updates, and mentoring team members. You will be required to work on daily business requests within defined SLA timelines and engage in additional tasks/projects that may involve multiple team interactions. To excel in this role, you should have hands-on experience in master data creation and maintenance, particularly in areas such as Material, Vendor, Pricing, Customer, PIRs, Source List, and BOM data. Proficiency in SAP toolsets related to data management, data extraction programs, ETL processes, data quality maintenance, and cleansing is essential. Knowledge of Request Management tools like SNOW and Remedy, as well as understanding key database concepts and data models, will be beneficial. An ideal candidate for this position would possess professional experience of 5-6 years, with expertise in Data Management Processes, SAP modules (MM/PP or OTC), and IT tools. Strong communication skills, stakeholder alignment, and the ability to interact with international colleagues are crucial. Additionally, you should demonstrate a strong ownership focus, drive to excel, and the ability to resolve conflicts, collaborate, and work effectively as a team player. Flexibility to work in shifts is also required for this role. Axalta, a leading company in the coatings industry, operates in two segments - Performance Coatings and Mobility Coatings, serving various end markets across the globe. 
With a commitment to sustainability and carbon neutrality, Axalta aims to deliver innovative solutions that protect and enhance products while contributing to a more sustainable future. Join us in our mission to optimize businesses and achieve common goals across diverse geographies and industries.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Analytics Lead at Cummins Inc., you will be responsible for facilitating data, compliance, and environment governance processes for the assigned domain. Your role includes leading analytics projects to provide insights for the business, integrating data analysis findings into governance solutions, and ingesting key data into the data lake while ensuring the creation and maintenance of relevant metadata and data profiles. You will coach team members, business teams, and stakeholders to find necessary and relevant data, contribute to communities of practice promoting responsible analytics use, and develop the capability of peers and team members within the Analytics Ecosystem. Additionally, you will mentor and review the work of less experienced team members, integrate data from various source systems to build models for business use, and cleanse data to ensure accuracy and reduce redundancy. Your responsibilities will also involve leading the preparation of communications to leaders and stakeholders, designing and implementing data/statistical models, collaborating with stakeholders on analytics initiatives, and automating complex workflows and processes using tools like Power Automate and Power Apps. You will manage version control and collaboration using GitLab, utilize SharePoint for project management and data collaboration, and provide regular updates on work progress to stakeholders via Jira/Meets.

Qualifications:
- College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required.
- This position may require licensing for compliance with export controls or sanctions regulations.

Competencies:
- Balancing stakeholders
- Collaborating effectively
- Communicating clearly and effectively
- Customer focus
- Managing ambiguity
- Organizational savvy
- Data Analytics
- Data Mining
- Data Modeling
- Data Communication and Visualization
- Data Literacy
- Data Profiling
- Data Quality
- Project Management
- Valuing differences

Technical Skills:
- Advanced Python
- Databricks, PySpark
- Advanced SQL, ETL tools
- Power Automate
- Power Apps
- SharePoint
- GitLab
- Power BI
- Jira
- Mendix
- Statistics

Soft Skills:
- Strong problem-solving and analytical abilities
- Excellent communication and stakeholder management skills
- Proven ability to lead a team
- Strategic thinking
- Advanced project management

Experience:
- Intermediate level of relevant work experience required
- This is a hybrid role

Join Cummins Inc. and be part of a dynamic team where you can utilize your technical and soft skills to make a significant impact in the field of data analytics.
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
As a Data Quality Analyst at Indore (on-site), your primary responsibility will be to analyze data for accuracy, completeness, and consistency. You will identify and resolve data quality issues while developing and implementing data quality rules and standards. Monitoring data pipelines for errors and anomalies will also be a key part of your role.

Collaboration is essential in this position, as you will work closely with data engineers, data analysts, and business users to enhance data quality. Conducting product presentations and demonstrations for potential clients to showcase the unique value proposition of Orange DataTech's solutions will also be part of your duties. In addition, you will develop and maintain data quality reports and dashboards, utilizing data profiling and data quality tools to assess data quality effectively.

To excel in this role, you must have experience with SQL and data querying languages, along with knowledge of data profiling and data quality tools. Strong analytical and problem-solving skills are essential, along with excellent communication and interpersonal abilities. Attention to detail and accuracy will be crucial to ensure the quality of data analysis and reporting.
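The data profiling side of this role can be sketched in plain Python. The `profile` helper below is illustrative and hypothetical (it is not any specific profiling tool): it computes per-field completeness and distinct-value counts, two of the accuracy/completeness/consistency metrics such a role would report on.

```python
def profile(records, fields):
    """Per-field completeness (non-null ratio) and distinct-value counts."""
    total = len(records)
    report = {}
    for f in fields:
        values = [r.get(f) for r in records]
        # Treat None and empty strings as missing values.
        non_null = [v for v in values if v not in (None, "")]
        report[f] = {
            "completeness": round(len(non_null) / total, 2) if total else 0.0,
            "distinct": len(set(non_null)),
        }
    return report

# Toy dataset with one missing city value.
rows = [
    {"id": 1, "city": "Indore"},
    {"id": 2, "city": ""},
    {"id": 3, "city": "Indore"},
]
report = profile(rows, ["id", "city"])
print(report)
```

A real deployment would pull these records from a warehouse via SQL and surface the resulting metrics on the quality dashboards the posting mentions.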
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
West Bengal
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. We are counting on your unique voice and perspective to help EY become even better. Join us and build an exceptional experience for yourself, and a better working world for all.

We are seeking a highly skilled and motivated Data Analyst with experience in ETL services to join our dynamic team. As a Data Analyst, you will be responsible for data requirement gathering, preparing data requirement artefacts, data integration strategies, data quality, data cleansing, optimizing data pipelines, and solutions that support business intelligence, analytics, and large-scale data processing. You will collaborate closely with data engineering teams to ensure seamless data flow across our systems. The role requires hands-on experience in the Financial Services domain with solid data management, Python, SQL, and advanced SQL development skills. You should be able to interact with data stakeholders and source teams to gather data requirements; understand, analyze, and interpret large datasets; prepare data dictionaries, source-to-target mappings, and reporting requirements; and develop advanced programs for data extraction and analysis.

Key Responsibilities:
- Interact with data stakeholders and source teams to gather data requirements
- Understand, analyze, and interpret large datasets
- Prepare data dictionaries, source-to-target mappings, and reporting requirements
- Develop advanced programs for data extraction and preparation
- Discover, design, and develop analytical methods to support data processing
- Perform data profiling manually or using profiling tools
- Identify critical data elements and PII handling processes/mandates
- Collaborate with the technology team to develop analytical models and validate results
- Interface and communicate with onsite teams directly to understand requirements
- Provide technical solutions per business needs and best practices

Required Skills and Qualifications:
- BE/BTech/MTech/MCA with 3-7 years of industry experience in data analysis and management
- Experience in finance data domains
- Strong Python programming and data analysis skills
- Strong advanced SQL/PL SQL programming experience
- In-depth experience in data management, data integration, ETL, data modeling, data mapping, data profiling, data quality, reporting, and testing

Good to have:
- Experience using Agile methodologies
- Experience using cloud technologies such as AWS or Azure
- Experience with Kafka, Apache Spark (SparkSQL and Spark Streaming), or Apache Storm

Other key capabilities:
- Client-facing skills and proven ability in effective planning, execution, and problem-solving
- Excellent communication, interpersonal, and teamworking skills
- Multi-tasking attitude, flexible, with the ability to change priorities quickly
- Methodical approach, logical thinking, and ability to plan work and meet deadlines
- Accuracy and attention to detail
- Written and verbal communication skills
- Willingness to travel to meet client needs
- Ability to plan resource requirements from high-level specifications
- Ability to quickly understand and learn new technologies/features and inspire change within the team and client organization

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across assurance, consulting, law, strategy, tax, and transactions. EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies, and you will collaborate with various stakeholders to ensure efficient data pipelines and secure data operations.

Your key responsibilities will involve designing and implementing data pipelines using Snowflake and AWS technologies. You will leverage tools like SnowSQL, Snowpipe, NiFi, Matillion, and dbt to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. Additionally, you will be responsible for optimizing Snowflake queries and data models for performance and scalability.

To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. An understanding of ETL/ELT tools, data warehousing concepts, and data quality techniques is essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members. Preferred skills include experience with data virtualization, machine learning and AI concepts, data governance, and data security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will be essential for this role.

If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
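Snowflake-specific tooling such as Snowpipe cannot be demonstrated in a self-contained snippet, but the core loading pattern it feeds, an idempotent upsert from a staging table into a target, can be sketched with stdlib sqlite3. All table names here are hypothetical, and a real Snowflake pipeline would typically use a MERGE statement instead of SQLite's `INSERT OR REPLACE`.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging(id INTEGER, val TEXT);
    CREATE TABLE target(id INTEGER PRIMARY KEY, val TEXT);
    INSERT INTO target VALUES (1, 'old');
    INSERT INTO staging VALUES (1, 'new'), (2, 'fresh');
""")
# Upsert: replace matching keys, insert new ones. Re-running the load is
# idempotent, which approximates the effect of a warehouse MERGE here.
conn.execute("INSERT OR REPLACE INTO target(id, val) SELECT id, val FROM staging")
rows = conn.execute("SELECT id, val FROM target ORDER BY id").fetchall()
print(rows)  # [(1, 'new'), (2, 'fresh')]
```

The design point is idempotency: because reloading the same staging batch leaves the target unchanged, retries and replays in the ingestion pipeline stay safe.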
Posted 2 days ago
3.0 - 20.0 years
0 Lacs
Karnataka
On-site
As a Change Management and Transformation Consultant in Capital Markets at Accenture, you will have the opportunity to tackle our clients' most complex challenges by collaborating with exceptional individuals, utilizing cutting-edge technology, and partnering with leading companies across various industries. In this role, you will be a part of the Capital Markets practices within Accenture's Capability Network. Your primary responsibility will be to assist investment banks, asset and wealth managers, and exchanges in preparing for the digital future. By leveraging global strategies and data-driven insights, you will play a crucial role in enabling digital-enabled capital markets. Your key initiatives will include collaborating with clients to address intricate problems such as regulatory reforms, managing organizational changes related to processes, technology, and structure, overseeing transformation projects to transition from legacy systems to modern solutions, and recommending industry best practices to enhance operational efficiency. Additionally, you will support data governance and management efforts, optimize operations, drive business decision-making, refine methodologies, track industry trends, and develop proposals that align with Accenture's value proposition. Your role will involve incorporating best practices and methodologies into all stages of project management to ensure successful outcomes. To excel in this position, you should possess strong analytical and problem-solving skills, excellent communication and presentation abilities, and cross-cultural competence to thrive in a dynamic consulting environment. The ideal candidate for this role would have an MBA from a reputable business school with a blend of consulting and functional skills, industry-specific certifications such as FRM, CFA, or PRM, prior experience in consulting projects, and expertise in Investment Banking and Investment Management functions.
Specific domains of knowledge should include Capital Markets, Asset & Wealth Management, Front Office Advisory, OMS systems, Back Office applications, Risk Management, Regulatory Change and Compliance, Data Governance, Robotics Process Automation, Agile Methodology, and more. By joining our team, you will have the opportunity to work on transformative projects with key clients, collaborate with industry experts to shape innovative solutions, receive personalized training to enhance your skills and industry knowledge, and contribute to a culture committed to equality and collaboration. Accenture is a leading global professional services company that offers a wide range of services in strategy, consulting, digital, technology, and operations. With a focus on delivering sustainable value to clients across industries, Accenture's team of over 569,000 professionals in more than 120 countries drives innovation to improve the way the world works and lives. Join us at Accenture to be a part of a team that values ideas, ingenuity, and a commitment to making a positive impact through transformative change.
Posted 2 days ago
7.0 - 12.0 years
12 - 17 Lacs
Noida, Bengaluru
Work from Office
Position Summary This role acts subject matter expert in the Pharma Commercial Datasets-including US data sets and different Business reporting metrics KPI. An expert in Pharma Sales Commercial data who can guide and lead the team supporting pharma clients primarily Data Management and Reporting projects. Job Responsibilities Have had extensive experience in working on Pharma Sales Commercial level datasets (Customer,Sales, Claims, Digital engagements, CRM interactionsetc.), or Medical Affairs, or Managed Markets or Patient Support Services Have fair understanding of functional design architecture preparation along with logical/functional data model of solution Have had good client facing roles to cover business requirement discussions, business use case assessments, conduct requirement workshops/interviews Strong experience in SQL to analyse data sets and perform data profiling/assessment Able to Create Business requirements, define scope and objectives, functional specifications, develop business processes and recommendations related to proposed solution. Obtain sign-off from customers. Facilitate UAT execution phase and work with project manager to obtain user acceptance test signoff. Able to convert business requirements into functional design for development team Represent the practice as a Functional SME for data warehouse, reporting, and master data management projects across Pharma sales commercial data sets. 
Support Axtria's project and R&D teams with functional/domain knowledge Able to create case studies/success stories of projects with a functional/domain view Assist in whitepapers and points-of-view, and develop Go-to-market strategy for Sales Commercial domain offerings Able to coach team members in Pharma Sales Commercial domain business & functional use cases Education BE/B.Tech in IT or Computer Science; Master of Computer Application Work Experience Minimum 7 years of Pharma Industry domain experience performing a Business Analyst role in medium to large Data Warehousing and Business Intelligence projects Minimum 5 years of experience working in the Pharma domain with Pharma data sets and in client-facing roles. Behavioural Competencies Teamwork & Leadership Motivation to Learn and Grow Ownership Cultural Fit Talent Management Client Management Axtria RIGHT Values: The leader lives & breathes Axtria RIGHT values - responds with a sense of urgency, demonstrates integrity, takes initiative, is humble and collaborative. Responsibility/Ownership Technical Competencies Problem Solving Lifescience Knowledge Communication Capability Building / Thought Leadership Business Consulting Business Acumen SQL Subject Matter Expertise
Posted 2 days ago
1.0 - 5.0 years
2 - 5 Lacs
Pune
Work from Office
MDM Specialist The aspirant will have good exposure to end-to-end Master Data Management Any graduate with 6 months relevant exp CTC up to 5 LPA + Inc 5 days working, 2 off Immediate Joiners Location-Pune Call or WhatsApp:-Nikita :-7744984200 Required Candidate profile Min 6 months experience Excellent communication skills Perks and benefits 2-way cab Performance bonus + Incentives
Posted 2 days ago
14.0 - 19.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Role & responsibilities Eligibility Criteria: Years of Experience: Minimum 14 years Experience with Data Analysis/Data Profiling and visualization tools (Power BI) Experience in database and data warehouse tech (Azure Synapse/SQL Server/SAP HANA/MS Fabric) Experience in stakeholder management, requirement gathering, and the delivery cycle. Bachelor's Degree: Math/Statistics/Operations Research/Computer Science Master's Degree: Business Analytics (with a background in Computer Science) Primary Responsibilities: Translate complex data analyses into clear, engaging narratives tailored to diverse audiences. Develop impactful data visualizations and dashboards using tools like Power BI or Tableau. Educate and mentor the team to develop insightful dashboards using multiple Data Storytelling methodologies. Collaborate with Data Analysts, Data Scientists, Business Analysts, and business stakeholders to uncover insights. Understand business goals and align analytics storytelling to drive strategic actions. Create presentations, reports, and visual content to communicate insights effectively. Maintain consistency in data communication and ensure data-driven storytelling best practices. Mandatory Skills required to perform the job: Data analysis skills, experience in extracting information from databases, Office 365 proficiency, and a proven record as a data storyteller through BI Experience in Agile/SCRUM process and development using any tools. Knowledge of SAP systems (SAP ECC T-Codes & Navigation) Proven ability to tell stories with data, combining analytical rigor with creativity. Strong skills in data visualization tools (e.g., Tableau, Power BI) and presentation tools (e.g., PowerPoint, Google Slides). Proficiency in SQL and a basic understanding of statistical methods or Python/R is a plus. Excellent communication and collaboration skills. Ability to distill complex information into easy-to-understand formats.
Desirable Skills: Background in journalism, design, UX, or marketing alongside analytics. Experience working in fast-paced, cross-functional teams. Familiarity with data storytelling frameworks or narrative design. Expected Outcome 1. Provide on-the-job training for leads on actionable insights. 2. Educate business partners on data literacy and actionable insights. 3. Lead change management initiatives (related to Data Storytelling and Data Literacy) in the organization. 4. Implement processes based on data storytelling concepts and establish a governance model to ensure dashboards are released with the appropriate insights. 5. Standardize dashboards and reports to provide actionable insights. 6. Utilize the most suitable data representation techniques. Preferred candidate profile
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
delhi
On-site
You are a skilled and experienced Business Analyst with a strong knowledge and expertise in Property and Casualty (P&C) insurance. You possess a comprehensive understanding of P&C insurance principles, processes, and systems. Your main responsibilities will include collaborating with stakeholders to gather and analyze business requirements relevant to P&C insurance operations. You will conduct insightful research into P&C insurance products, markets, and competitors to identify enhancements and opportunities. Leveraging your extensive P&C insurance knowledge, you will design and develop impactful solutions for business challenges. You will partner with cross-functional teams to ensure the seamless implementation of developed business solutions. Additionally, you will provide in-depth expertise for the development and upgrading of P&C insurance systems and applications. Your role will also involve performing meticulous data analysis and validation to maintain system data accuracy, as well as creating and maintaining comprehensive documentation such as functional specifications, business process flows, and user manuals. To excel in this role, you are expected to have a deep understanding of P&C insurance principles, products, and methodologies. You should demonstrate proven expertise in PL/SQL with the ability to craft intricate queries for database data manipulation and analysis. Familiarity with dimensional data marts and their applications in data warehousing settings is essential. Proficiency in business intelligence tools for developing reports and dashboards specific to P&C insurance analytics is required. Your exceptional analytical skills will enable you to translate complex requirements into tangible functional specifications. Strong communication proficiency is necessary for efficient collaboration with both technical and non-technical parties. Your rigorous attention to detail emphasizes data integrity and precision. 
You should have the capacity to handle tasks independently within dynamic team settings while balancing multiple priorities and deadlines. A bachelor's degree in Business Administration, Computer Science, or related disciplines is preferred, and notable experience within the P&C insurance industry will be advantageous. Your skills should include data validation, business intelligence tools, insurance knowledge, proficiency in the policy and claim lifecycle, data analysis, analytical skills, data modeling, PL/SQL expertise, BRD creation, communication skills, source-to-target mapping, data profiling, SQL knowledge, attention to detail, documentation abilities, analytics, Property and Casualty (P&C) insurance understanding, data vault knowledge, and reporting and dashboard development proficiency.
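For candidates unfamiliar with the terminology, the data-profiling work this listing describes can be illustrated with a minimal, hypothetical sketch: `sqlite3` stands in for a production Oracle/PL/SQL environment, and the `claims` table and its columns are invented for the example, not taken from the role.

```python
import sqlite3

# Tiny data-profiling query of the kind used to assess a P&C claims table:
# row count, null rate of a key column, and the range of claim amounts.
# Table name and columns are illustrative assumptions; sqlite3 stands in
# for the production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, policy_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("C1", "P1", 1200.0), ("C2", "P1", 450.0), ("C3", None, 980.0)],
)

profile = conn.execute(
    """
    SELECT COUNT(*)                                              AS row_count,
           AVG(CASE WHEN policy_id IS NULL THEN 1.0 ELSE 0.0 END) AS policy_null_rate,
           MIN(amount)                                            AS min_amount,
           MAX(amount)                                            AS max_amount
    FROM claims
    """
).fetchone()

print(profile)
```

The same shape of query, pointed at a real claims table, is a typical first step in the source-to-target mapping and data-validation tasks the role mentions.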
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
You will work independently as an SAP BODS developer, familiar with standard concepts, practices, and procedures. Your responsibilities will require experience in Data Extraction, Data Integration, Data Transformation, Data Quality, and Data Profiling. You should have experience building integrations in SAP DS with REST/SOAP APIs and cloud platforms like Google Cloud Platform. It is essential to possess deep knowledge and understanding of data warehouse concepts and strong PL/SQL skills for writing complex queries. You will be responsible for data extraction and integration using different data sources like SAP HANA and various warehouse applications. Experience in utilizing all kinds of SAP BODS transforms, such as Data Integrator, Data Quality, and Basic transforms, is required. Additionally, tuning jobs for high-volume and complex transformation scenarios and experience with DS components, including the job server, repositories, and Designer, will be advantageous. In addition to the core responsibilities, you should be able to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. It is crucial to stay aware of the latest technologies and trends in the industry. Logical thinking, problem-solving skills, and the ability to collaborate effectively are essential traits for this role. The mandatory skill for this position is expertise in SAP BODS. You must also be capable of assessing current processes, identifying improvement areas, and suggesting appropriate technology solutions.
Posted 3 days ago
8.0 - 15.0 years
0 Lacs
karnataka
On-site
The position of Informatica Architect requires 8-15 years of experience in Data Profiling and Data Quality rules (preferably in the insurance domain) and strong experience in Informatica MDM. The role involves migration experience from Informatica MDM to Informatica Cloud MDM SaaS; designing, developing, and implementing MDM solutions; configuring and customizing Informatica MDM Hub on Cloud; data mapping, transformation, and cleansing rules management; data integration with enterprise systems and databases; as well as development of data quality rules and validation processes. Additional skills include knowledge of MDM concepts and insurance domain experience; Informatica MDM certification is preferred. Soft skills such as excellent written and verbal communication, experience with cross-functional teams, and strong stakeholder management are required. The MDM Consultant / Data Analyst position, on the other hand, requires 8-10 years of experience in Data Profiling and identifying Data Quality rules (preferably in insurance), along with proficiency in SQL and data analysis. The role also demands an understanding of data warehouse concepts and strong analytical and problem-solving skills for trend analysis and quality checks. Additional skills include knowledge of MDM concepts and familiarity with the insurance domain. Soft skills such as excellent written and verbal communication, experience working with cross-functional teams, and a strong ability to work with client stakeholders are essential. Both positions offer a collaborative work environment that requires interaction with cross-functional teams and client stakeholders. Key responsibilities include data quality assessment, governance, troubleshooting, and ensuring data integrity across the organization. For any specific changes or additional details, please reach out to sushma@metamorfs.com or contact +91-8971322318.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
We are looking for a Jr Data Engineer with 3 to 6 years of experience for a hybrid work environment with a salary of 18L. Your role will involve working on Azure Data Services like Azure Data Factory, Azure SQL Database, and Azure Data Lake, with 4-5 years of hands-on project experience. Additionally, you should have expertise in Power BI with the same level of project experience. You will be responsible for developing data pipelines using Azure Data Factory, writing and analyzing SQL queries, creating and optimizing Stored Procedures and Views with a self-rating of at least 8/10, and working on data profiling, sourcing, and cleansing routines. As an Azure Data Solutions Developer, you will design and deliver scalable data solutions using Azure Data services, develop and implement data ingestion and transformation pipelines, build data solutions using various tools and Azure services, and collaborate on proof-of-concept development and production implementation. Your focus will be on translating business requirements into technical solutions, ensuring data quality, and meeting project deadlines. Qualifications required for this role include 4-5 years of hands-on experience with Azure Data Factory, Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake; expertise in SQL optimization; strong BI platform experience; and experience with ETL/ELT tools like SSIS, ODI, and Talend. You should have knowledge of data management best practices, advanced RDBMS experience, coaching abilities, problem-solving skills, and excellent communication skills. Desirable qualifications include programming experience in Python, experience working with offshore teams, consultancy experience, and familiarity with Agile/Scrum methodologies. This is a full-time position with the benefit of working from home. The ideal candidate should have a total of 7 years of work experience and will be required to work in person at the specified location.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a skilled and motivated Data Engineer with at least 4 years of experience in GCP, Teradata, and Data Warehousing. The ideal candidate should have hands-on expertise in developing robust data engineering solutions on Google Cloud Platform (GCP) and working experience with Teradata. You must be proficient in designing and automating scalable data pipelines and possess excellent leadership, communication, and collaboration skills. Your responsibilities will include analyzing source systems, profiling data, and resolving data quality issues. You will be required to gather and comprehend business requirements for data transformation, design, develop, test, and deploy ETL/data pipelines using GCP services and Airflow. Additionally, writing complex SQL queries for data extraction, formatting, and analysis, creating and maintaining Source to Target Mapping, and designing documentation will be part of your role. You will also need to build metadata-driven frameworks for scalable data pipelines, perform unit testing, and document results, utilize DevOps tools for version control and deployment, provide production support, enhancements, and bug fixes, troubleshoot issues, and support ad-hoc business requests. Collaboration with stakeholders to resolve EDW incidents, manage expectations, apply ITIL concepts for incident and problem management, perform data cleaning, transformation, and validation, and stay updated on GCP advancements and industry best practices are also key responsibilities. Requirements: - Minimum 4 years of experience in ETL and Data Warehousing - Hands-on experience with GCP services such as BigQuery, Dataflow, Cloud Storage, etc. 
- Experience in Apache Airflow for workflow orchestration - Experience in automating ETL solutions - Experience in executing at least 2 GCP Cloud Data Warehousing projects - Exposure to Agile/SAFe methodologies in at least 2 projects - Mid-level proficiency in PySpark and Teradata - Strong SQL skills and experience working with semi-structured data formats like JSON, Parquet, XML - Experience with DevOps tools like GitHub, Jenkins, or similar - Deep understanding of Data Warehousing concepts, data profiling, quality, and mapping Preferred Qualifications: - B.Tech/B.E. in Computer Science or a related field - Google Cloud Professional Data Engineer Certification - Strong leadership and communication skills
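The data-profiling and quality-validation duties this listing describes can be sketched in plain Python (no GCP or Airflow dependencies); the field names and the 5% null-rate threshold below are illustrative assumptions, not requirements of the role.

```python
# Minimal data-quality check sketch: null rate and duplicate-key detection
# for a batch of records before loading a warehouse table. Column names
# and thresholds are invented for illustration.

def profile_column(rows, column):
    """Return (null_rate, distinct_count) for one column across rows."""
    values = [r.get(column) for r in rows]
    nulls = sum(1 for v in values if v is None)
    null_rate = nulls / len(values) if values else 0.0
    distinct = len({v for v in values if v is not None})
    return null_rate, distinct

def validate_batch(rows, key_column, max_null_rate=0.05):
    """Flag quality issues: excessive nulls in the key and duplicate keys."""
    null_rate, distinct = profile_column(rows, key_column)
    issues = []
    if null_rate > max_null_rate:
        issues.append(f"null rate {null_rate:.2%} exceeds {max_null_rate:.2%}")
    non_null_keys = [r[key_column] for r in rows if r.get(key_column) is not None]
    if len(non_null_keys) != distinct:
        issues.append("duplicate keys detected")
    return issues

if __name__ == "__main__":
    batch = [
        {"customer_id": 1, "region": "APAC"},
        {"customer_id": 2, "region": "EMEA"},
        {"customer_id": 2, "region": "EMEA"},    # duplicate key
        {"customer_id": None, "region": "AMER"}, # null key
    ]
    print(validate_batch(batch, "customer_id"))
```

In practice, checks like these would run as a task inside an orchestrated pipeline (e.g. an Airflow DAG) and route failing batches to a quarantine table rather than the warehouse.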
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
As a part of Comcast, you will play a vital role in maintaining uniform access and complete transparency for data assets, which are essential for making informed business decisions that drive profitable growth. Your responsibilities will include collecting, analyzing, and validating data aligned with business requirements before it is stored for final use by the organization and its clients. Furthermore, you will provide technical requirements to operationalize data assets for organizational and client use, ensuring data accuracy through quality assurance checks, daily monitoring, triage, and user communication. Your efforts will also involve enhancing transparency to upstream source owners regarding data availability and accuracy alerts to stabilize data for downstream use. While working with moderate guidance in your area of expertise, you will contribute to various aspects of the data lifecycle processing to maintain transparent communication and meet deadline expectations. Key Responsibilities: - Contribute to utilizing business unit data for reporting, analytics, and addressing user community inquiries. - Participate in data discovery, profiling, requirements gathering, testing, and documentation processes. - Collaborate with business units to document enterprise base and semantic layer requirements. - Assist in estimating effort and timelines with the management team. - Maintain detailed documentation throughout the data lifecycle processing. - Support the development of business rules and logic for data harmonization. - Assist in creating test plans and conducting User Acceptance Testing for data-dependent projects. - Contribute to standardization efforts for referential data used in data analysis and project output. - Provide regular status updates to the leadership team. 
You will be expected to adhere to Comcast's Operating Principles, prioritize customer experience, stay abreast of technology advancements, collaborate effectively within a team, and actively contribute to the Net Promoter System. Additionally, you will be required to demonstrate punctuality, exercise independent judgment, and fulfill assigned duties and responsibilities. Working nights and weekends based on project requirements may be necessary. Qualifications: - Education: Bachelor's Degree - Relevant Work Experience: 2-5 Years Please note that while holding a Bachelor's Degree is preferred, Comcast also considers applicants with a combination of coursework and experience or extensive related professional experience. This job description provides an overview of the general responsibilities and expectations for this role and is not an exhaustive list of duties, responsibilities, and qualifications.
Posted 3 days ago
4.0 - 8.0 years
3 - 6 Lacs
Hyderabad, Pune, Chennai
Work from Office
Job Category: IT Job Type: Full Time Job Location: Bangalore Chennai Hyderabad Pune Location-HYD / BANG / PUNE / CHENNAI / MYS Experience- 4+ Strong SQL and data transformation skills Strong experience in test data management and SQL Understanding of ETL/ELT process fundamentals Experience in TDM data masking using Delphix Experience in masking VSAM files Experience in performing various TDM-related activities, which include data profiling/data discovery, writing custom data masking algorithms, and performing data masking Data masking using Delphix and VSAM file masking
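Delphix is a commercial masking product, but the "custom data masking algorithm" work this listing mentions can be sketched in plain Python. The deterministic-hash approach, the salt, the field names, and the token format below are all invented for illustration, not Delphix APIs.

```python
import hashlib

# Sketch of a deterministic masking algorithm: the same input always maps
# to the same masked token (preserving referential integrity across tables),
# while the original value is not recoverable without the secret salt.
# The salt and the masked fields are illustrative assumptions.
SECRET_SALT = b"rotate-me-per-environment"  # hypothetical per-environment secret

def mask_value(value: str, length: int = 10) -> str:
    """Map a sensitive string to a stable, irreversible token."""
    digest = hashlib.sha256(SECRET_SALT + value.encode("utf-8")).hexdigest()
    return "MASK_" + digest[:length]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of the record with only the sensitive fields masked."""
    return {
        k: mask_value(v) if k in sensitive_fields and v is not None else v
        for k, v in record.items()
    }

if __name__ == "__main__":
    row = {"name": "Jane Doe", "city": "Pune", "account": "AC-1001"}
    print(mask_record(row, {"name", "account"}))
```

Determinism is what lets a masked key still join correctly across tables in a test environment, which is the core requirement of TDM-style masking.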
Posted 3 days ago
10.0 - 14.0 years
12 - 16 Lacs
Bengaluru
Work from Office
About The Role Skill required: Data Management - Microsoft Fabric Designation: Data Eng, Mgmt & Governance Assoc Mgr Qualifications: BE/BTech Years of Experience: 10 to 14 years About Accenture Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song - all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do Data & AI: an end-to-end, unified analytics platform that brings together existing offerings like Data Factory, Synapse, and Power BI into a single unified product for all your data and analytics workloads. What are we looking for Microsoft Fabric; Microsoft Azure; PySpark; strong analytical skills; ability to establish strong client relationships; ability to manage multiple stakeholders; ability to perform under pressure Roles and Responsibilities: In this role you are required to analyze and solve moderately complex problems Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures The person requires an understanding of the strategic direction set by senior management as it relates to team goals Primary upward interaction is with the direct supervisor or team leads Generally interacts with peers and/or management levels at a client and/or within Accenture The person should require minimal guidance when determining methods and procedures on new assignments Decisions often impact the team in which they reside and occasionally impact other teams The individual would manage medium-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture Qualification BE/BTech
Posted 3 days ago
10.0 - 15.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Job Purpose and Impact This role will be responsible for creating and maintaining the Data Quality Framework and solution design/architecture, and for assisting the global data team and business users/data stewards in implementing a global or enterprise-level data quality solution to help improve data quality across the organization. Key Accountabilities Accountable for Data Quality solution design, development, test, and operationalization across Cargill. Work closely with the DQ engineering team and provide technical support and mentorship wherever required Understand functional/technical design, define best practices, and develop reusable & scalable DQ solutions Partner with Global & Enterprise data teams and consult on data quality capabilities to define, implement, and socialize data definitions, standards, and policies. Qualifications Minimum requirement of 10 years of relevant work experience. Typically reflects 12 years or more of relevant experience. IDMC - Cloud Data Quality (CDQ), Data Integration (CI), Data Profiling (CDP) - Mandatory IDMC Data Governance & Catalogue (CDGC), Application Integration (CAI) - Mandatory Informatica Address Doctor / AD6 / DAAS - Mandatory IDMC - Metadata Scanning, CLAIRE - Preferred Informatica Data Quality (IDQ) - Preferred IDMC - Administration, Operational Dashboard, Connections - Preferred
Posted 3 days ago
4.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role : Software Development Lead Project Role Description : Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support projects and clients effectively. You will engage in problem-solving activities, guiding your team through challenges while ensuring that project goals are met efficiently and effectively. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure alignment with strategic objectives. Professional & Technical Skills: - Must Have Skills: The candidate should have experience on a minimum of 2 S/4HANA implementation projects. - Proficiency in SAP BusinessObjects Data Services. - Strong understanding of data integration and ETL processes. - Experience with data quality management and data profiling. - Familiarity with database management systems and SQL. - Ability to troubleshoot and resolve technical issues efficiently.
Additional Information:- The candidate should have minimum 5 years of experience in SAP BusinessObjects Data Services.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 3 days ago