Jobs
Interviews

455 Metadata Management Jobs - Page 2

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

7.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

>> About KPMG in India
KPMG in India, a professional services firm, is the Indian member firm affiliated with KPMG International and was established in September 1993. Our professionals leverage the global network of firms and provide detailed knowledge of local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG in India offers services to national and international clients across sectors, striving to provide rapid, performance-based, industry-focused and technology-enabled services that reflect a shared knowledge of global and local industries and our experience of the Indian business environment. Our professionals provide the experience to help companies stay on track and deal with risks that could threaten their survival. Our services enable clients to effectively coordinate their key growth, quality and operational challenges; working in partnership with us, clients benefit from KPMG's experienced, objective and industry-grounded viewpoints. KPMG Advisory professionals provide advice and assistance to enable companies, intermediaries and public sector bodies to mitigate risk, improve performance and create value. KPMG firms provide a wide range of Risk Consulting, Management Consulting and Transactions & Restructuring services that help clients respond to immediate needs and put in place strategies for the longer term. With increasing regulatory requirements, the need for greater transparency in operations, and disclosure norms, stakeholders require assurance beyond the traditional critique of numbers; assurance is increasingly required on industry issues, business risks and key business processes.
The Governance, Risk & Compliance Services practice assists companies and public sector bodies to mitigate risk, improve performance and create value. We help clients manage business and process risks effectively by providing a full spectrum of corporate governance, risk management and compliance services, tailored to each client's individual needs and designed to support management in meeting the challenges and opportunities of today's complex business environment.

Data Governance Consultant / AM / Manager

Role & Responsibilities:
- Lead the design and implementation of the Data Governance framework.
- Lead maturity assessments for data management/data governance capabilities, identify gaps and recommendations, and build implementation roadmaps.
- Support data management initiatives, including setting up and monitoring data governance programs and coordinating with different teams/business units.
- Demonstrate a strong understanding of DG org structure and roles & responsibilities / RACI (Data Steward, Data Custodian, Data Owner, Producers and Consumers, etc.).
- Lead the building of stewardship and interaction models and ensure clear accountability for data ownership and stewardship.
- Demonstrate a strong understanding of the data landscape (data flows, data lineage, CDEs, data domains, etc.).
- Manage data lineage tools and visualizations to track data movement and transformations across systems.
- Lead the development of data quality rules, identify data quality issues, and work with stakeholders on remediation planning.
- Strong skills in data quality management and hands-on experience with data quality tools and techniques.
- Strong understanding of metadata management and MDM/RDM concepts and processes (creation, curation, update, change management, etc.).
- Strong understanding of data catalogs, business glossaries and/or data dictionaries (understands data definitions and associated best practices).
- Assist in facilitating training sessions and workshops to improve data governance awareness and practices across the organization.
- Monitor compliance with relevant data regulations (e.g., NDMO, GDPR, CCPA) and internal data policies.
- Knowledge of industry-leading DG tools (Collibra, Alation, etc.) and DQ tools (SAP IS, Informatica/IDQ, etc.); awareness of MDM tools preferred.
- Good understanding of at least one regulatory data management framework, such as the National Data Management Office (NDMO) standards, the Abu Dhabi Government Data Management Standard, or India's DPDP Act 2023.
- Strong understanding of data risk taxonomy (types, sub-types).

QUALIFICATION
- BE/BTech/MCA/MBA/Master's degree in information management or a related field.
- Good knowledge of industry standards and regulations such as NDMO, SAMA, BCBS 239, GDPR and CCPA.
- Extensive experience with the Informatica tool stack.
- Data management certifications (CDMP, DAMA, DCAM, etc.)

>> SELECTION PROCESS
Candidates should expect 2-3 rounds of personal or telephonic interviews to assess fitment and communication skills.

>> COMPENSATION
Compensation is competitive with industry standards. Details of the compensation breakup will be shared with short-listed candidates only.

>> WORK TIMING
Monday to Friday.

>> PEOPLE BENEFITS
- Continuous learning program
- Driving a culture of recognition through ENCORE, our quarterly rewards and recognition program
- Comprehensive medical insurance coverage for staff and family
- Expansive general and accidental coverage for staff
- Executive health check-up (Manager & above, and staff above the age of 30)
- Les Concierge desks
- Internal & global mobility
- Various other people-friendly initiatives
- Strong commitment to our values, such as CSR initiatives

The opportunity is now! If you are interested, please share your resume at sonalidas4@kpmg.com.

Posted 5 days ago

Apply

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Manager, Technology specializing in Data, Analytics & AI, you will be responsible for overseeing the health, performance, and growth of a diverse team of Data/BI/ML Engineers and Analysts. Your primary objective will be to create an environment conducive to sustainable high performance by providing clear direction, fostering collaboration, and supporting career development within the team.

Your key responsibilities will include managing, coaching, and developing the team, in addition to collaborating with Tech Leadership, Product, and Business stakeholders to translate requirements into scalable data models, pipelines, semantic layers, and dashboards. You will also play a crucial role in shaping the technical vision for your team and driving initiatives related to data quality, governance, documentation, and metric definitions. Furthermore, you will be expected to enforce engineering and analytics best practices, address complex technical and data issues, monitor project status and risks, and manage project forecast and spending.

Your success in this role will be measured based on various metrics such as team health, delivery predictability, talent growth, operational maturity, stakeholder trust, and adoption & impact.

To excel in this position, you should possess at least 8 years of experience in data engineering/BI/analytics, including 3 years of direct team management. A background in ML & Data Science would be advantageous. A degree in Computer Science/Applications, Engineering, Information Technology/Systems, or an equivalent field is required, with a preference for a Master's degree. You should also have demonstrated proficiency in SQL, ELT/ETL patterns, and dimensional & semantic modeling, as well as experience with tools such as Snowflake, Databricks, Data Factory, and Oracle. Moreover, familiarity with programming languages like Python, Spark, and Scala, and with streaming technologies, is desirable.
Strong BI development skills, governance expertise, software engineering practices, stakeholder management abilities, and clear communication skills are essential for success in this role. Your track record should reflect talent development, collaborative team-building, and the ability to deliver impactful solutions that drive business decisions. Overall, as a Manager, Technology specializing in Data, Analytics & AI, you will play a pivotal role in shaping the direction and performance of your team, driving innovation, and delivering high-quality data solutions that positively impact the organization.

Posted 5 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Ops Capability Deployment Analyst at Citigroup, you will be a seasoned professional contributing to the development of new solutions, frameworks, and techniques for the Enterprise Data function. Your role will involve performing data analytics and analysis across different asset classes, as well as building data science and tooling capabilities within the team. You will work closely with the Enterprise Data team to deliver business priorities. The B & I Data Capabilities team manages the Data Quality/Metrics/Controls program and implements improved data governance and data management practices. The Data Quality program focuses on enhancing Citigroup's approach to data risk and meeting regulatory commitments.

Key Responsibilities:
- Hands-on experience with data engineering and distributed data platforms
- Understanding of data architecture and integration with enterprise applications
- Research and evaluate new data technologies and self-service data platforms
- Collaborate with the Enterprise Architecture Team on defining data strategy
- Perform complex data analytics on large datasets
- Build analytics dashboards and data science capabilities
- Communicate findings and propose solutions to stakeholders
- Convert business requirements into technical design documents
- Work with cross-functional teams for implementation and support
- Demonstrate a good understanding of the banking industry
- Perform other assigned duties

Skills & Qualifications:
- 10+ years of development experience in Financial Services or Finance IT
- Experience with Data Quality/Data Tracing/Metadata Management tools
- ETL experience using PySpark on distributed platforms
- Proficiency in Python, SQL, and BI visualization tools
- Strong knowledge of Hive, HDFS, Airflow, and job scheduling
- Experience in Data Lake/Data Warehouse implementation
- Exposure to analytical tools and AI/ML is desired

Education:
- Bachelor's/University degree or Master's degree in Information Systems, Business Analysis, or Computer Science

If you are a person with a disability and require accommodation to use search tools or apply for a career opportunity, review Accessibility at Citi.

Posted 5 days ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Noida

Work from Office

Data Modeling (Azure, SAP):
- Expertise in data modelling tools (ER/Studio, Erwin) and IDM/ARDM models, across CPG/Manufacturing/Sales/Finance/Supplier/Customer domains.
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience of metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data such as IRI and Nielsen.
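Data lineage, mentioned above as a plus, amounts to walking a dependency graph of datasets to find everything that feeds a given table. A minimal illustrative sketch in Python (all table names are hypothetical; real catalog tools such as Collibra manage this metadata for you):

```python
# Minimal illustrative sketch: tracing upstream data lineage.
# All table names are hypothetical examples.

def upstream(lineage: dict[str, list[str]], table: str) -> set[str]:
    """Return every table that feeds `table`, directly or transitively."""
    seen: set[str] = set()
    stack = list(lineage.get(table, []))
    while stack:
        src = stack.pop()
        if src not in seen:
            seen.add(src)
            stack.extend(lineage.get(src, []))
    return seen

# Each entry maps a table to the tables it is built from.
lineage = {
    "sales_report": ["fact_sales", "dim_customer"],
    "fact_sales": ["raw_orders"],
    "dim_customer": ["raw_customers"],
}

print(sorted(upstream(lineage, "sales_report")))
# ['dim_customer', 'fact_sales', 'raw_customers', 'raw_orders']
```

Impact analysis (which reports break if a raw table changes?) is the same walk over the reversed graph.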

Posted 5 days ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

Key Responsibilities:
- Build and maintain DevOps pipelines for Salesforce (Salesforce Education Cloud, Salesforce Experience Cloud, Salesforce Marketing Cloud, Salesforce Data Cloud) development and deployments.
- Automate CI/CD processes for managing code across environments.
- Manage version control using GitHub.
- Build, refresh and manage Salesforce environments (Education Cloud, Experience Cloud, Marketing Cloud, Data Cloud).
- Troubleshoot deployment and environment issues to ensure smooth workflows.
- Collaborate with developers and testers to review and deploy changes.

Required Qualifications:
- Strong experience with Salesforce DevOps tools (e.g., GitHub Actions).
- Proficiency in CI/CD practices and version control systems.
- Experience in Salesforce metadata management.
- Familiarity with performance monitoring tools and troubleshooting deployment errors.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You are a highly skilled and experienced Data Modeler joining our data engineering team. Your expertise lies in designing scalable and efficient data models for cloud platforms, with a focus on Oracle Data Warehousing and Databricks Lakehouse architecture. Your role will be crucial in our strategic transition from an on-prem Oracle data warehouse to a modern cloud-based Databricks platform.

Your responsibilities will include designing and implementing conceptual, logical, and physical data models to meet business requirements and analytics needs. You will lead the migration of data models from Oracle Data Warehouse to Databricks on AWS or Azure cloud, reverse-engineer complex Oracle schemas, and collaborate with data architects and engineers to define optimal data structures in Databricks. Furthermore, you will optimize data models for performance, scalability, and cost-efficiency in a cloud-native environment, develop and maintain dimensional models using star and snowflake schemas, and ensure data governance standards are met through metadata management, data lineage, and documentation practices. Your input will be valuable in data architecture reviews and best practices in modeling and data pipeline integration.

To be successful in this role, you should have at least 5 years of hands-on experience in data modeling, including conceptual, logical, and physical design. You must have proven experience migrating large-scale Oracle DWH environments to Databricks Lakehouse or similar platforms, expertise in Oracle database schemas, PL/SQL, and performance tuning, and proficiency in Databricks, Delta Lake, Spark SQL, and DataFrame APIs. Deep knowledge of dimensional modeling techniques, familiarity with metadata management tools, and strong analytical and communication skills are essential. You should also be able to work collaboratively in Agile teams and effectively communicate data model designs to technical and non-technical stakeholders.
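The star schema mentioned above can be sketched minimally: one fact table keyed to surrounding dimension tables. The following is an illustrative example only (all table and column names are hypothetical), using SQLite so it runs anywhere:

```python
# Illustrative star-schema sketch: a fact table joined to its dimensions.
# Table and column names are hypothetical examples.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales  VALUES (20240101, 1, 99.5), (20240101, 1, 0.5);
""")

# A typical star-schema query: join the fact to a dimension and aggregate.
row = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
""").fetchone()
print(row)  # ('Widget', 100.0)
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables.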

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

As a Data Ops Capability Deployment Analyst at Citigroup, you will be a seasoned professional applying your in-depth disciplinary knowledge to contribute to the development of new solutions, frameworks, and techniques for the Enterprise Data function. Your role will involve integrating subject matter expertise and industry knowledge within a defined area, requiring a thorough understanding of how different areas collectively integrate within the sub-function to contribute to the overall business objectives.

Your primary responsibilities will include performing data analytics and data analysis across various asset classes, as well as building data science and tooling capabilities within the team. You will collaborate closely with the wider Enterprise Data team, particularly the front-to-back leads, to deliver on business priorities. Working within the B & I Data Capabilities team in the Enterprise Data function, you will manage the Data Quality/Metrics/Controls program and implement improved data governance and data management practices across the region. The Data Quality program focuses on enhancing Citigroup's approach to data risk and meeting regulatory commitments in this area.

Key Responsibilities:
- Utilize a data engineering background to work hands-on with distributed data platforms and cloud services.
- Demonstrate a sound understanding of data architecture and data integration with enterprise applications.
- Research and evaluate new data technologies, data mesh architecture, and self-service data platforms.
- Collaborate with the Enterprise Architecture Team to define and refine the overall data strategy.
- Address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities.
- Perform complex data analytics on large datasets, including data cleansing, transformation, joins, and aggregation.
- Build analytics dashboards and data science capabilities for Enterprise Data platforms.
- Communicate findings and propose solutions to various stakeholders.
- Translate business and functional requirements into technical design documents.
- Work closely with cross-functional teams to prepare handover documents and manage testing and implementation processes.
- Demonstrate an understanding of how the development function integrates within the overall business/technology landscape.

Skills & Qualifications:
- 10+ years of active development background in Financial Services or Finance IT.
- Experience with Data Quality, Data Tracing, Data Lineage, and Metadata Management tools.
- Hands-on ETL experience using PySpark on distributed platforms, including data ingestion, Spark optimization, resource utilization, and batch orchestration.
- Proficiency in programming languages such as Python, with experience in data manipulation and analysis libraries.
- Strong SQL skills and experience with DevOps tools like Jenkins/Lightspeed, Git, and Copilot.
- Knowledge of BI visualization tools like Tableau and Power BI.
- Experience in implementing Data Lake/Data Warehouse solutions for enterprise use cases.
- Exposure to analytical tools and AI/ML is desired.

Education:
- Bachelor's/University degree or Master's degree in Information Systems, Business Analysis, or Computer Science.

In this role, you will play a crucial part in driving compliance with applicable laws, rules, and regulations while safeguarding Citigroup, its clients, and assets. Your ability to assess risks and make informed business decisions will be essential in maintaining the firm's reputation. Please refer to the full Job Description for more details on the skills, qualifications, and responsibilities associated with this position.

Posted 6 days ago

Apply

2.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an experienced Reltio MDM professional, you will be responsible for designing, developing, and implementing Master Data Management solutions using the Reltio platform. This role requires a minimum of 2+ years of experience specifically in Reltio MDM, and a total of 7-10 years of relevant experience.

Key responsibilities:
- Gather and analyze business requirements for Master Data Management.
- Design and configure Reltio MDM Cloud solutions.
- Build data models, match & merge rules, survivorship rules, hierarchies, and workflows in Reltio.
- Integrate Reltio with enterprise applications.
- Implement data quality, governance, and stewardship processes.
- Manage user roles, security, and workflows.
- Collaborate with stakeholders and data stewards.
- Troubleshoot and provide production support.
- Prepare documentation and training materials.

Required skills include strong hands-on experience with Reltio MDM Cloud; expertise in data modeling, governance, quality, and metadata management; proficiency in REST APIs, JSON, and XML; knowledge of ETL tools and integration frameworks; strong SQL skills; familiarity with cloud platforms; understanding of Agile methodology and DevOps processes; and excellent communication and stakeholder management skills. Preferred skills include Reltio certification, experience with other MDM tools, and knowledge of graph databases, Big Data platforms, or data lakes.

If you are looking to further your career as a Reltio MDM Developer/Lead and possess the required skills and experience, we encourage you to apply for this exciting opportunity in our organization.
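By way of illustration, match-and-merge with a "most recent wins" survivorship rule, of the kind MDM platforms like Reltio let you configure declaratively, can be sketched in plain Python. Field names and the rule itself are hypothetical, not Reltio's actual API:

```python
# Illustrative sketch of match-and-merge with a survivorship rule.
# Field names are hypothetical; real MDM tools configure this declaratively.

def merge_records(records: list[dict]) -> dict:
    """Records already matched (e.g., on email). For each attribute,
    the most recently updated non-empty value survives."""
    ordered = sorted(records, key=lambda r: r["updated_at"])  # oldest first
    golden: dict = {}
    for rec in ordered:
        for field, value in rec.items():
            if value not in (None, ""):
                golden[field] = value  # later non-empty values overwrite earlier ones
    return golden

sources = [
    {"email": "a@x.com", "phone": "111", "name": "",    "updated_at": "2024-01-01"},
    {"email": "a@x.com", "phone": "",    "name": "Ann", "updated_at": "2024-03-01"},
]
print(merge_records(sources))
# {'email': 'a@x.com', 'phone': '111', 'name': 'Ann', 'updated_at': '2024-03-01'}
```

Note that the golden record keeps the older phone because the newer record's phone is empty; skipping empty values is what makes this a survivorship rule rather than a plain overwrite.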

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior BI Developer, you will be responsible for designing and delivering enterprise-level business intelligence solutions. Your main focus will be on building scalable BI models, reports, and dashboards that provide accurate, timely, and actionable insights for various regions and business units. Collaboration with stakeholders, data engineering teams, and data modelling teams is a key aspect of this role to ensure that BI solutions are aligned with business requirements, maintain consistency, and comply with governance standards. In addition, you will play a crucial role in mentoring junior BI developers and contributing to best practices across BI initiatives.

With a minimum of 5 years of BI development experience, including at least 3 years in a senior/lead role, you should possess strong expertise in Power BI (mandatory) along with additional skills in tools like Tableau. Proficiency in advanced SQL and experience with cloud data platforms, preferably Snowflake, is essential. A solid understanding of data modelling principles such as dimensional, Data Vault, and semantic layers is required.

Your responsibilities will include leading enterprise-scale BI projects that align with business objectives, designing, developing, and maintaining BI datasets, reports, and dashboards (Power BI required; Tableau preferred), managing BI data models for scalability and compliance, optimizing reports and dashboards for large-scale environments, integrating BI solutions with ETL/ELT pipelines, and applying DevOps practices for version control and deployments. As part of the role, you will engage with stakeholders, facilitate workshops, mentor junior developers, and ensure the quality and standards of BI development are met. A Bachelor's degree in Computer Science, Data Management, Analytics, or a related field is expected.
In return, you will have the opportunity for growth in leadership, technical, and commercial skills, career progression in a global, high-growth environment, and a collaborative, innovative, and vibrant work culture. A competitive compensation and benefits package is also on offer.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Junior Business Analyst at SVF Entertainment, you will play a crucial role in analyzing data related to music labels and social media to provide valuable insights for campaigns and strategies. Your responsibilities will include analyzing music label performance, social media insights, and other relevant data to facilitate decision-making. You will also maintain and update dashboards for social media and streaming platforms to ensure accurate reporting.

In this role, you will create clear, data-driven presentations and effectively communicate your findings to stakeholders. Additionally, you will troubleshoot metadata issues and ensure the accuracy of music tracks across different platforms. Monitoring social media trends, engagement, and content performance will be essential for providing recommendations for optimization.

Collaboration with cross-functional teams to align on data-driven objectives is a key aspect of this role. You will also have the opportunity to leverage AI tools to automate tasks and enhance data insights, contributing to the efficiency and effectiveness of the analysis process.

The ideal candidate should have a Bachelor's degree in Business, Data Analytics, or a related field, along with at least 2 years of experience in business analysis, preferably in the music & entertainment industry or related sectors. Proficiency in Excel and AI tools is essential, as is familiarity with metadata standards for music platforms and social media analytics tools such as Facebook Insights and YouTube Analytics. Strong organizational, communication, and problem-solving skills are also important for success in this role.

If you are passionate about music and social media and are looking for a challenging opportunity to apply your analytical skills in a dynamic and creative environment, this role based in Kolkata could be the perfect fit for you.

Posted 6 days ago

Apply

1.0 - 5.0 years

0 Lacs

Maharashtra

On-site

The Analyst - Data Governance role within the Data & Analytics department in Mumbai involves implementing a data governance framework to enhance data quality, standards, metrics, and processes. The primary objective is to align data management practices with regulatory requirements and to build an understanding of data lineage within the Bank's business processes and systems.

Responsibilities:
- Design and execute data quality rules and monitoring mechanisms.
- Analyze data quality issues in collaboration with business stakeholders.
- Build recovery models across the enterprise.
- Use DG technologies for data quality and metadata management, such as OvalEdge, Talend, and Collibra.
- Support the development of centralized metadata repositories, including business glossaries and technical metadata, and capture business/data quality rules.
- Design data quality reports and dashboards.

The ideal candidate should possess 1 to 2 years of experience in data governance. A key success metric for this role is the successful implementation of the Data Quality framework across business lines.
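A data quality rule of the kind described above, here a completeness check against a threshold, can be sketched as follows. The column name and threshold are illustrative assumptions; tools like Collibra or Talend express such rules declaratively:

```python
# Minimal sketch of a data quality completeness rule with a threshold.
# Column name and threshold are hypothetical examples.

def completeness(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is populated (not None or empty)."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows)

rows = [
    {"customer_id": "C1", "pan": "ABCDE1234F"},
    {"customer_id": "C2", "pan": ""},
    {"customer_id": "C3", "pan": "XYZAB9876K"},
]

THRESHOLD = 0.95  # the rule fails if completeness drops below 95%
score = completeness(rows, "pan")
print(f"pan completeness = {score:.2%}, passed = {score >= THRESHOLD}")
# pan completeness = 66.67%, passed = False
```

In a monitoring mechanism, a failing score like this would raise an issue for stakeholders to triage and remediate, which is the workflow the responsibilities above describe.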

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

We are seeking a skilled System Analyst with expertise in Informatica for a contract-to-hire position in Pune. With a minimum of 8 to 10 years of experience in data management or data engineering, you will be responsible for using Informatica Data Management Cloud (IDMC) tools such as Data Catalog, Data Quality, Data Integration, and Reference 360 for effective data processing. The role requires a deep understanding of data modeling, data warehousing, ETL/ELT concepts, SQL, PL/SQL, and relational databases such as SQL Server and Oracle. Experience with data governance and metadata management principles, along with excellent communication skills, is crucial for this position. Informatica Cloud Certification, specifically for IDMC, will be an advantage.

This position requires immediate availability, with a notice period of up to 15 days. The interview process will be conducted virtually. This is a contractual/temporary position lasting 6 months, with benefits including health insurance. You will work a day shift, Monday to Friday, at the customer's office in Pune; you must be comfortable working in person.

To apply, please provide details of your overall experience and confirm that you are comfortable with a 6-month contract position working from the customer's office in Pune. Additionally, share your experience in data management, Informatica IDMC tools, data modeling, SQL, data governance, and metadata management principles, along with your experience with CI/CD principles, any Informatica certification, your current location, and your current and expected CTC.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

NTT DATA is seeking a Salesforce Data Cloud Developer to join their team in Bangalore, Karnataka, India. As a Salesforce Data Cloud Developer, you will be responsible for designing, building, and supporting Salesforce applications for CRM-based product development. You will work within an agile (SAFe) team, participating in SAFe ceremonies and contributing daily to high-quality implementation. Your role will involve end-to-end Data Cloud design and delivery, including data ingestion, modeling, identity resolution, Calculated Insights, segmentation, activations, and governance. You will collaborate with a multidisciplinary team in a high-demand yet highly collaborative environment. Additionally, you will troubleshoot and resolve defects, ensure data quality and performance, document solutions, and support release and operational readiness.

Key Responsibilities:
- Configure core Data Cloud components such as Data Streams, Data Lake Objects (DLOs), Data Model Objects (DMOs), Calculated Insights, and Identity Resolution rules.
- Develop and maintain data ingestion pipelines and integrations with robust monitoring and error handling.
- Perform data profiling and metadata management to assess data quality and completeness.
- Maintain the data dictionary and related metadata documentation.
- Support DevOps practices, including CI/CD and release readiness.
- Contribute to testing and quality assurance.

Requirements:
- Education: Bachelor's degree in computer science, software engineering, or an equivalent field.
- Experience: 4 to 6 years of experience, with a minimum of 2 years in Salesforce Data Cloud development.
- Technical Skills: At least 5 years of Salesforce experience and at least 2 years of relevant working experience with Salesforce Data Cloud development.
- Soft Skills: Excellent verbal and written communication skills in English, experience in customer-facing projects, strong teamwork with lead capabilities, sound decision-making, and strong presentation and reporting skills.

About NTT DATA: NTT DATA is a $30 billion global innovator of business and technology services, serving 75% of the Fortune Global 100. The company is committed to helping clients innovate, optimize, and transform for long-term success. NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Its services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure globally and is part of the NTT Group, which invests over $3.6 billion annually in R&D to support organizations and society in confidently moving into the digital future. Visit NTT DATA at us.nttdata.com.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Withum is a place where talent thrives - where who you are matters. It's a place of endless opportunities for growth. A place where entrepreneurial energy plus inclusive teamwork equals exponential results. Withum empowers clients and our professional staff with innovative tools and solutions to address their accounting, tax and overall business management and operational needs. As a US nationally ranked Top 25 firm, we recruit only the best and brightest people with a genuine passion for the business.

We are seeking an experienced Lead Consultant, Data Engineering, with a strong background in consulting services and hands-on skills in building modern, scalable data platforms and pipelines. This is a client-facing, delivery-focused role. Please note that this position is centered around external client delivery and is not part of an internal IT or product engineering team. This is a foundational hire: you will be responsible for delivering hands-on client work, supporting our proprietary data products, and building the team underneath you.

Withum's brand is a reflection of our people, our culture, and our strength. Withum has become synonymous with teamwork and client service excellence. The cornerstone of our success can truly be credited to the dedicated professionals who work here every day - people who are easy to work with, have a sense of purpose, care for their co-workers, and whose mission is to help our clients grow and thrive. But our commitment goes beyond our clients, as we continue to live the Withum Way, promoting personal and professional growth for all team members, clients, and surrounding communities.

How You'll Spend Your Time:
- Architect, implement, and optimize data transformation pipelines, data lakes, and cloud-native warehouses for mid- and upper mid-market clients.
- Deliver hands-on engineering work across client environments - building fast, scalable, and well-documented pipelines that support both analytics and AI use cases.
- Lead technical design and execution using tools such as Tableau, Microsoft Fabric, Synapse, Power BI, Snowflake, and Databricks. - Also have a good hands-on familiarity with SQL Databases. - Optimize for sub-50GB datasets and local or lightweight cloud execution where appropriate - minimizing unnecessary reliance on cluster-based compute. - Collaborate with subject-matter experts to understand business use cases prior to designing data model. - Operate as a client-facing consultant: conduct discovery, define solutions, and lead agile project delivery. - Switch context rapidly across 23 active clients or service streams in a single day. - Provide support for our proprietary data products as needed. - Provide advisory and strategic input to clients on data modernization, AI enablement, and FP&A transformation efforts. - Deliver workshops, demos, and consultative training to business and technical stakeholders. - Ability to implement coding modifications to pre-existing code/procedures in a manner that results in a validated case study. - Take full ownership of hiring, onboarding, and mentoring future data engineers and analysts within the India practice. - During bench time, contribute to building internal data products and tooling - powering our own consulting operations (e.g., utilization dashboards, delivery intelligence, practice forecasting). - Help define and scale delivery methodology, best practices, and reusable internal accelerators for future engagements. - Ability to communicate openly about conflicting deadlines to ensure prioritization aligns with client expectations, with ample time to reset client expectations as needed. - Ensure coding is properly commented to help explain logic or purpose behind more complex sections of code. Requirements: - 6+ years of hands-on experience in data engineering roles, at least 3+ years in a consulting or client delivery environment. 
- Proven ability to context-switch, self-prioritize, and communicate clearly under pressure. - Demonstrated experience owning full lifecycle delivery, from architecture through implementation and client handoff. - Strong experience designing and implementing ETL / ELT pipelines, preferably in SQL-first tools. - Experience with Microsoft SQL Server / SSIS for maintenance and development of ETL processes. - Real-world experience with SQL Databases, Databricks, Snowflake, and/or Synapse - and a healthy skepticism of when to use them. - Deep understanding of data warehousing, data lakes, data modeling, and incremental processing. - Proficient in Python for ETL scripting, automation, and integration work. - Experience with tools such as dbt core in production environments. - Strong practices around data testing, version control, documentation, and team-based dev workflows. - Working knowledge of Power BI, Tableau, Looker, or similar BI tools. - Experience building platforms for AI/ML workflows or supporting agentic architectures. - Familiarity with Microsoft Fabric's Lakehouse implementation, Delta Lake, Iceberg, and Parquet. - Background in DataOps, CI/CD for data pipelines, and metadata management. - Microsoft certifications (e.g., Azure Data Engineer, Fabric Analytics Engineer) are a plus Website: www.withum.com,

Posted 6 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

chennai

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the team in implementing innovative solutions
- Ensure adherence to project timelines and quality standards

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM
- Strong understanding of data integration and data quality management
- Experience in designing and implementing MDM solutions
- Knowledge of data modeling and metadata management
- Hands-on experience with Informatica PowerCenter
- Good-to-have skills: experience with Informatica Data Quality

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education

Posted 6 days ago

Apply

12.0 - 15.0 years

4 - 8 Lacs

bengaluru

Work from Office

About The Role
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a variety of tasks that involve analyzing, designing, coding, and testing multiple components of application code across various clients. Your typical day will include collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of new features. You will also be responsible for troubleshooting issues and implementing solutions that improve application performance and user experience, all while adhering to best practices in software development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and metadata management.
- Familiarity with ETL processes and data warehousing concepts.
- Ability to troubleshoot and resolve data-related issues efficiently.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education

Posted 6 days ago

Apply

4.0 - 9.0 years

15 - 27 Lacs

bengaluru

Work from Office

Job Posting Title: Sr. Data Engineer
Band/Level: 5-2-C--C
Education Experience: Bachelor's Degree (High School + 4 years)
Employment Experience: 5-7 years

At TE, you will unleash your potential working with people from diverse backgrounds and industries to create a safer, sustainable and more connected world.

Job Overview
Responsible for designing and establishing the solution architecture for analytical platforms and/or data-enabled solutions. Has a broad understanding of the entire data and analytics ecosystem of tools and technologies, with hands-on experience across all three core data domains of data science, data engineering, and data visualization. Responsible for fleshing out the details of the end-to-end component architecture being implemented by the team. Contributes to the team's development velocity, if necessary. Understands how an organization's data platform integrates within the overall technical architecture. Understands the business purpose the role fulfills, and the roadmap for the lifecycle of an organization's systems. Sets the direction and establishes the approach for integrating information applications and programs.

Roles & Responsibilities
- Architect, design, and build scalable data architectures, pipelines, and data products (Databricks, AWS).
- Design and develop enterprise data models; manage the full data life cycle.
- Extract, transform, and integrate data from SAP ECC/S4HANA and other non-SAP environments.
- Build solutions for real-time, streaming, and batch data workloads.
- Define and implement data security, governance, and compliance standards.
- Adapt and implement data quality and observability practices.
- Perform performance tuning and cost optimization.
- Support Operations teams in day-to-day business needs.
- Collaborate with business and technical teams to understand requirements and deliver scalable data architecture solutions.

Desired Candidate Experience: Minimum 5+ years in data architecture/data engineering roles.
Proven success in large-scale data environments:
- Databricks Lakehouse platform: Delta Lake & Medallion Architecture, DLT Pipelines, PySpark Workbooks, Spark SQL & SQL Warehouse, Unity Catalog (data governance, lineage), Genie (query performance, indexing), Security & Role-Based Access Control; bonus: MLflow knowledge
- Data Modeling: ER & dimensional modeling, metadata management
- Programming: Python, SQL, PySpark
- AWS Cloud Services: S3, Lambda, EMR, Redshift, Bedrock
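As a conceptual illustration of the medallion (bronze/silver/gold) layering named in the skills above, the sketch below uses plain Python dicts. In an actual Databricks environment these stages would be Delta tables populated by DLT or PySpark pipelines; every field name here is invented for the example.

```python
from collections import defaultdict

# Toy medallion flow: bronze (raw) -> silver (cleaned/typed) -> gold (aggregated).
# Field names and rules are illustrative only, not any specific client's schema.

def to_silver(bronze_rows):
    """Clean raw bronze records: drop rows missing keys, cast types, default region."""
    silver = []
    for r in bronze_rows:
        if r.get("order_id") is None or r.get("amount") is None:
            continue  # malformed raw records are skipped (or quarantined in practice)
        silver.append({
            "order_id": int(r["order_id"]),
            "region": (r.get("region") or "UNKNOWN").upper(),
            "amount": float(r["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned records into a business-level summary by region."""
    totals = defaultdict(float)
    for r in silver_rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)
```

The point of the layering is that each stage has one job: silver enforces types and completeness, gold serves the business question, and bronze keeps the untouched raw input for replay.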

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Governance Consultant based in Kuwait with over 8 years of experience, you will lead the design and implementation of comprehensive data governance frameworks covering aspects such as data ownership, stewardship, quality, privacy, and lifecycle management. Your responsibilities will include conducting detailed assessments of clients' current data governance capabilities, evaluating practices, quality, and maturity levels. You will facilitate workshops to gather requirements from stakeholders, ensuring alignment with client priorities.

Your role will involve creating phased plans and roadmaps to progressively implement data governance initiatives, defining milestones, timelines, and resource allocations clearly. You will also oversee the implementation of organizational data standards covering metadata, data quality, data architecture, MDM, DCM, data storage, integration, and warehousing domains. Recommending strategies for enhancing data quality, lineage, ownership, and overall management maturity will be crucial.

Furthermore, you will be responsible for developing and managing data governance policies, standards, and processes in accordance with client requirements and industry best practices. Deploying data governance tools such as data catalogs, metadata management, data quality tools, and privacy platforms will fall under your purview. Ensuring that data-related practices comply with regulations and industry standards within client organizations will be a key aspect of your role. Collaborating with cross-functional teams such as IT, Legal, Risk, and Compliance to establish governance roles and responsibilities within client organizations is essential. Your active participation in business development activities will contribute to the growth of the data governance practice. Managing project timelines, deliverables, and resources to ensure successful project delivery will be part of your responsibilities.
To excel in this role, you should possess in-depth knowledge of data governance frameworks (e.g., DAMA, DCAM) and regulatory standards (e.g., PDPL, NDMO). Strong expertise in data management tools such as OvalEdge, Informatica, Alation, and Microsoft Purview is required. A proven track record in managing data governance projects across various industries is essential. Experience in developing data strategies, catalogs, dictionaries, and stewardship programs will be beneficial, as will strong problem-solving skills with a focus on practical implementation and results. Holding a CDMP (Certified Data Management Professional) certification is highly desirable. Excellent analytical, problem-solving, and project management skills are crucial for this role. Strong communication and stakeholder management abilities will also be key to your success as a Data Governance Consultant.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

As an experienced MDM Developer, you will be responsible for developing and configuring Informatica MDM on-premises solutions to meet business requirements. Your role will involve implementing MDM workflows, match rules, business rules, and data validations using Informatica tools. You will also design and develop data integration workflows to load and synchronize data with the MDM Hub, integrating it with external systems such as ERP, CRM, and data warehouses. In addition to this, you will be required to implement data quality rules and processes using Informatica Data Quality (IDQ) to ensure clean and consistent master data. Supporting data governance initiatives will be a key part of your responsibilities, which includes maintaining metadata, data lineage, and audit trails in the MDM system. Your role will also involve testing and validation activities, including unit, integration, and system testing to validate MDM configurations and workflows. You will support user acceptance testing (UAT) and ensure timely resolution of identified issues. Monitoring and maintaining the MDM environment to ensure optimal performance and uptime will be essential, along with troubleshooting and resolving MDM-related issues. Documentation and collaboration are crucial aspects of the role, requiring you to create and maintain technical documentation, collaborate with business analysts, data stewards, and stakeholders to understand and address data management needs. To be successful in this role, you should have at least 5 years of experience in MDM development, with a minimum of 3 years working with Informatica MDM on-premise. Proficiency in Informatica MDM components such as Hub Console, IDD, Match & Merge, and Hierarchy Manager is essential. Strong knowledge of Informatica Data Quality (IDQ), data profiling, SQL, PL/SQL, relational databases (e.g., Oracle, Teradata), data modeling, data integration, and ETL processes is required. 
Hands-on experience implementing MDM solutions for domains like Customer, Product, or Employee, and familiarity with integrating MDM systems with enterprise applications, will be beneficial. In addition to technical expertise, you should possess strong analytical and problem-solving skills, the ability to work independently and collaboratively in a team environment, and good communication and documentation skills.
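To illustrate the match-rule and merge/survivorship concepts this role revolves around, here is a deliberately simplified sketch in plain Python. It is not Informatica MDM code: the match key and the "latest non-empty value wins" survivorship rule are hypothetical stand-ins for what Hub Console match/merge configuration would express.

```python
# Toy match-and-merge: group source records by a deterministic match key,
# then build one "golden record" per group via a survivorship rule.
# All field names and rules are invented for illustration.

def match_key(rec: dict) -> tuple:
    """Match rule: case-insensitive name plus space-stripped postcode."""
    return (rec["name"].strip().lower(), rec["postcode"].replace(" ", ""))

def merge(records: list) -> dict:
    """Survivorship: newer non-empty field values overwrite older ones."""
    golden = {}
    # ISO-format date strings sort correctly as plain strings.
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in (None, ""):
                golden[field] = value
    return golden

def build_golden_records(records: list) -> list:
    """Group source records by match key; emit one golden record per group."""
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return [merge(group) for group in groups.values()]
```

Real MDM hubs add fuzzy matching, trust scores per source system, and stewardship queues for borderline matches; the skeleton above only shows the deterministic core of the idea.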

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

The Informatica Axon Analyst is a key player in ensuring effective management of data governance and data management initiatives within the organization. Acting as a liaison between IT and business stakeholders, you will align metadata requirements with overall business strategies. Your responsibilities will include administering and optimizing the Informatica Axon platform to classify, manage, and govern data appropriately. Your expertise in data governance principles and technical skills will be crucial in maintaining data quality and integrity and promoting a data-driven decision-making culture. Defining and implementing data stewardship processes to ensure compliance with regulations and policies will also be part of your role. By utilizing your analytical and problem-solving abilities, you will contribute to enhancing the organization's data system and operational excellence. You will be tasked with various responsibilities, including administering and maintaining the Informatica Axon platform, collaborating with cross-functional teams to establish data stewardship roles, gathering and analyzing business requirements, creating and managing data dictionaries, glossaries, and taxonomies, monitoring data quality metrics, designing and implementing data governance frameworks, conducting training sessions, facilitating workshops, developing continuous improvement processes, ensuring compliance with industry standards and regulations, assisting in data mapping and lineage initiatives, generating reports and dashboards, supporting data integration projects, identifying automation opportunities, actively participating in data governance council meetings, and serving as the primary contact for Axon-related inquiries and troubleshooting. To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with a minimum of 3 years of experience in data governance or data management roles. 
Strong proficiency in working with the Informatica Axon platform, data governance frameworks, SQL, data modeling concepts, and data visualization tools is required. Excellent analytical, problem-solving, communication, and interpersonal skills are essential. Project management experience and certifications in data management or data governance are preferred. Proficiency in metadata management and data lineage concepts, the ability to handle multiple projects, knowledge of data privacy and compliance regulations, attention to detail, a quality assurance mindset, and a willingness to stay current with evolving data governance tools and techniques are also necessary for success in this role.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

punjab

On-site

As an EDM DQ Technical Lead at Bunge, located in Mohali, Punjab, India, you will be responsible for leading global master data quality programs and driving data cleansing projects across data domains such as Material, Customer, Supplier, FI, and supply-chain-related master data. Your role will involve improving data quality and governance by developing DQ dashboards, executing critical projects, and adhering to established Bunge data governance policies at regional and global levels. You will act as a data quality application/tools expert, particularly in SAP Information Steward, managing all data types within the company. Collaboration with stakeholders from Business, IT, and other areas will be essential to ensure the successful implementation of projects and governance initiatives related to master data. Key responsibilities include leading data quality programs, developing programs using business algorithms, monitoring data adherence, collaborating with the Master Data Coordinator to resolve issues, leading continuous improvement initiatives, and analyzing business processes to address needs innovatively. Department-specific functions and requirements include hands-on knowledge of key technologies such as Information Steward, SQL, and MDG; expertise in SAP Information Steward and SQL; and experience with data dictionaries, taxonomy, and metadata management. To qualify for this role, you should have a minimum of 5 years of professional data management experience working independently with SAP Information Steward and SQL systems. Additionally, a minimum of 3 years of experience providing and supporting business solutions around DQ, along with working experience in SAP Information Steward / SQL, is required. Experience with FMCG/CPG organizations is preferable, and the ability to work effectively in a virtual team across different distances, cultures, and time zones is essential.
Bunge, a world leader in sourcing, processing, and supplying oilseed and grain products, offers sustainable products and opportunities for farmers and consumers globally. With headquarters in St. Louis, Missouri, and a global workforce of 25,000 employees, Bunge operates port terminals, oilseed processing plants, grain facilities, and food and ingredient production facilities worldwide.

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

pune, maharashtra

On-site

If you are seeking further opportunities to advance your career, you can take the next step in realizing your potential by joining HSBC. HSBC is a global banking and financial services organization operating in 62 countries and territories. The organization's goal is to be present where growth occurs, supporting businesses to thrive, economies to prosper, and individuals to achieve their aspirations. Currently, HSBC is looking for an experienced professional to join the team in the position of Data Technology Lead, focusing on Data Privacy. The role is open in Pune or Hyderabad.

As a Data Technology Lead, you will be responsible for driving the strategy, engineering, and governance of Data Privacy technology within the CTO Data Technology function. Your role is crucial in ensuring the bank's compliance with complex global data privacy regulations and commitments to customer trust through scalable, automated, and resilient technology solutions. Your main responsibilities will include designing and executing enterprise-wide capabilities for classifying, protecting, governing, and monitoring personal and sensitive data throughout its lifecycle, from data discovery to secure deletion. This will involve integrating solutions across data platforms, operational systems, and third-party systems. You will lead cross-functional teams and collaborate closely with Group Privacy, Legal, Risk, Cybersecurity, and business-aligned CTOs to implement privacy-by-design practices across platforms and establish robust data protection standards. This leadership role is highly impactful, requiring expertise in privacy technologies, platform engineering, control automation, and global compliance.

Key Responsibilities:
- Define and lead the enterprise strategy for Data Privacy Technology across different data environments.
- Design and implement technology capabilities to support privacy compliance frameworks such as GDPR.
- Lead the development and integration of solutions for data classification, consent management, access controls, data subject rights fulfillment, data retention, and disposal.
- Govern and oversee the privacy tooling landscape, ensuring robust metadata management and control enforcement.
- Collaborate with Group CPO, Legal, and Compliance to translate regulatory mandates into implementable technology solutions.
- Embed privacy-by-design principles into data pipelines, software development lifecycles, and DevSecOps practices.
- Drive enterprise-wide adoption and monitor privacy controls' effectiveness using metrics, dashboards, and automated audits.
- Lead engineering teams to deliver scalable services with resilience, performance, and observability.
- Participate in regulatory engagements, internal audits, and risk forums related to data privacy controls.
- Cultivate a high-performing team culture and advocate for privacy as a core design principle.

Requirements:
- 15+ years of relevant experience in enterprise data or technology roles, including senior leadership in data privacy, security, or compliance engineering.
- Deep expertise in data privacy technologies, both product-based and developed in-house.
- Strong knowledge of global privacy regulations and frameworks, such as GDPR.
- Technical proficiency in data discovery, classification, masking, encryption, and access control enforcement.
- Understanding of metadata-driven architectures and control automation.
- Experience in hybrid data estates, including data lakes and multi-cloud environments.
- Proven track record of partnering with legal, compliance, and cybersecurity teams to implement privacy programs aligned with business needs.
- Ability to lead multi-regional engineering teams, make architectural decisions, and deliver at scale.
- Experience with platform monitoring, policy enforcement, and control assurance frameworks.

Join HSBC to achieve more in your career.
For more information and to explore opportunities, visit www.hsbc.com/careers. Please note that personal data provided by applicants will be handled in accordance with the Bank's Privacy Statement available on the website.
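As a toy illustration of the masking controls a privacy platform like the one described above enforces, the sketch below shows two simple field-masking helpers in Python. Real implementations are policy-driven and far more robust; these function names and rules are invented for the example.

```python
import re

# Illustrative field-masking helpers of the sort privacy tooling applies to
# personal data before it reaches lower-trust environments. Hypothetical rules.

def mask_email(email: str) -> str:
    """Keep the first character of the local part and the full domain."""
    local, _, domain = email.partition("@")
    return (local[:1] + "***") + "@" + domain

def mask_pan(text: str) -> str:
    """Redact 16-digit card numbers in free text, keeping the last four digits."""
    # 12 digits (optionally separated by spaces/dashes) followed by 4 digits.
    return re.sub(r"\b(?:\d[ -]?){12}(\d{4})\b", r"**** **** **** \1", text)
```

In practice such helpers sit behind classification metadata (which columns are PII) and policy engines, rather than being called ad hoc.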

Posted 1 week ago

Apply

3.0 - 15.0 years

7 - 25 Lacs

hyderabad, chennai, bengaluru

Work from Office

Roles and Responsibilities:
- Design, develop, and maintain metadata frameworks for data governance across various business units.
- Collaborate with stakeholders to identify and define data quality requirements, ensuring compliance with industry standards.
- Develop and implement effective data governance policies, procedures, and best practices to ensure high-quality data management.
- Provide training and support to end-users on metadata management tools and processes.

Job Requirements:
- 3-15 years of experience in Metadata Management or a related field (Data Governance).
- Strong understanding of data quality principles, including data validation, cleansing, and profiling techniques.
- Experience developing metadata frameworks using industry-standard tools such as Informatica IDQ or similar technologies.

Location: PAN INDIA
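To make the profiling techniques mentioned in the requirements concrete, here is a minimal column-profiling sketch in plain Python. A production framework (e.g. Informatica IDQ) would be configuration-driven and far richer; this helper and its output fields are purely illustrative.

```python
# Minimal column profile: the kind of per-column statistics a data-quality
# framework computes to flag completeness and cardinality issues.

def profile_column(values: list) -> dict:
    """Return basic profile stats for one column of raw values."""
    non_null = [v for v in values if v is not None and v != ""]
    total = len(values)
    return {
        "count": total,
        # Share of missing values (None or empty string).
        "null_rate": round(1 - len(non_null) / total, 3) if total else None,
        "distinct": len(set(non_null)),
        # A few sample values, sorted for deterministic output.
        "sample": sorted(set(non_null))[:3],
    }
```

Profiles like this are typically run per column across a table, then compared against thresholds (e.g. "null_rate must stay below 5%") to raise data quality alerts.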

Posted 1 week ago

Apply

4.0 - 7.0 years

5 - 9 Lacs

kochi, hyderabad, bengaluru

Work from Office

Summary: We are looking for an experienced Oracle EPM Functional Consultant with deep expertise in Enterprise Data Management (EDM) and Account Reconciliation Cloud Service (ARCS). The ideal candidate will play a key role in implementing, configuring, and supporting Oracle EPM Cloud solutions, ensuring alignment with business goals and compliance standards.

Key Responsibilities:

Implementation & Configuration:
- Strong experience in the implementation and configuration of Oracle EPM modules, especially EDMCS and ARCS.
- Design and manage metadata, hierarchies, and mappings in EDMCS, ensuring data accuracy, completeness, and consistency.
- Configure reconciliation formats, rules, and workflows in ARCS.

Integration & Support:
- Ensure seamless integration between EDMCS, ARCS, and other Oracle EPM modules (e.g., FCCS, EPBCS).
- Provide ongoing support, troubleshoot issues, and optimize system performance.
- Monitor KPIs and ensure continuous improvement.

Training & Documentation:
- Prepare training materials and deliver end-user training sessions.
- Maintain detailed documentation for configurations, processes, and user guides.

Qualifications:
- Bachelor's degree in Finance, Information Systems, or a related field.
- 4-7 years of experience in Oracle EPM Cloud, including EDMCS and ARCS.
- Proven track record of at least two end-to-end Oracle EPM Cloud implementations.
- Strong understanding of financial close, reconciliation, and data governance processes.
- Proficiency with EPM Automate and integration tools.
- Excellent communication and stakeholder management skills.

Preferred Skills:
- Experience with other Oracle EPM modules like FCCS, EPBCS, PCMCS.
- Familiarity with Agile or Waterfall methodologies.
- Oracle certifications in EPM Cloud modules.

Location: Bengaluru, Kochi, Hyderabad, Chennai

Posted 1 week ago

Apply

10.0 - 15.0 years

40 - 45 Lacs

pune

Work from Office

Job Description:
Job Title: Solution Architect
Location: Pune, India
Corporate Title: Assistant Vice President

Role Description
We are seeking a Solution Architect with expertise in solution design to join our enterprise architecture team. This role will focus on:
- Designing secure, scalable, and cost-effective solutions.
- Reviewing and improving existing architecture to ensure performance, compliance, and resilience.
- Leading and defining cloud migration strategies, including hybrid and multi-cloud adoption.
- Bridging business needs with secure, data-driven architectures that align with enterprise standards.
- Enabling strong data design principles across enterprise platforms and systems.
- Creating and reviewing vendor application architectures, presenting them to architecture forums, and obtaining approvals.
- Working on POCs and MVPs to evaluate new technologies and solution approaches.

Your key responsibilities

Architecture & Design
- Review, assess, and enhance existing architectures to identify security and performance gaps.
- Design and document new solution architectures, ensuring they are secure, scalable, and cost-effective.
- Provide end-to-end solution options with clear trade-off analysis for business and IT stakeholders.
- Lead cloud migration assessments, define migration roadmaps, and design hybrid/multi-cloud solutions.

Security Integration
- Embed security-by-design principles into all solution and data architectures.
- Conduct threat modeling and define countermeasures for identified risks.
- Define secure patterns for API, data exchange, and application integration.
- Work with DevSecOps teams to ensure continuous compliance in CI/CD pipelines.

Data Architecture Responsibilities
- Define data models, data flows, and integration strategies for enterprise systems.
- Ensure data security, governance, lineage, and quality are built into architectures.
- Design solutions to handle structured and unstructured data across platforms.
- Work with analytics teams to enable secure and scalable data platforms (DWH, data lakes, BI tools).
- Support implementation of data privacy regulations (GDPR, HIPAA, etc.) in solution designs.

Migration & Modernization
- Design cloud-native solutions leveraging AWS, Azure, or GCP services.
- Define migration patterns (rehost, refactor, replatform, etc.) for legacy applications and databases.
- Ensure secure data migration strategies, including encryption, backup, and failover planning.

Collaboration & Governance
- Act as a trusted advisor to business and IT leaders on secure and data-driven design choices.
- Participate in architecture review boards to approve designs and ensure compliance with enterprise standards.
- Provide solution recommendations and alternatives to align IT capabilities with business goals.
- Mentor junior architects and technical teams on secure solution and data design practices.
- Create and streamline the process of application onboarding, ensuring alignment with enterprise standards.

Your skills and experience
Education: Bachelor's or Master's in Computer Science, Information Security, Data Engineering, or a related field.
Experience: 10+ years in IT with at least 5 years in solution architecture, including significant security and data architecture responsibilities.
Technical Skills:
- Deep knowledge of architecture and design principles, algorithms, and data structures for both on-prem and cloud-native solutions (GCP Architecture Certification preferred).
- Strong background in cloud platforms (AWS, Azure, GCP) and cloud migration strategies.
- Expertise in IAM, PKI, encryption, network security, API security, and DevSecOps.
- Hands-on experience in data modeling, ETL, data lakes/warehouses, and BI platforms.
- Familiarity with data governance frameworks, metadata management, and master data management (MDM).
- Knowledge of compliance frameworks (GDPR, HIPAA, PCI-DSS, ISO 27001).

The following criteria are good to have:
- Knowledge of AI/ML to advance business objectives.
- Knowledge of data regulations, sustainable technology, and ESG products.
- Knowledge of data services regulatory/jurisdictional data concerns and experience in providing solutions.
- Certifications (preferred): TOGAF, AWS/GCP/Azure Solution Architect.

Soft Skills: Strong communication, the ability to influence stakeholders, and a proven track record of simplifying complex designs.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies