
426 Data Modelling Jobs - Page 6

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an Assistant Vice President Metadata Support Analyst at HSBC, you will play a crucial role in supporting the implementation of Metadata Management solutions and the Business Data Layer. You will work collaboratively with Business, Data, and IT teams to create and maintain source-aligned data products according to agreed templates, guidelines, and principles. Your responsibilities will include engaging with the Data Architecture and Governance team to discover necessary metadata, automating metadata collection, designing metadata usage with stakeholders, and ensuring the quality and integrity of metadata. You will also monitor the overall health and usage of metadata and data products to enhance the user experience.

You will define key performance indicators (KPIs) to measure the effectiveness of data products, provide ongoing support for metadata implementation, testing, and integration, and offer expertise that enables customers to meet their data requirements independently. Collaboration with cross-functional teams to ensure strategic solutions align with the Data Strategy will be essential.

To excel in this role, you should have strong analytical and data manipulation skills, particularly in SQL, to provide data solutions, reports, and support for data customers. Previous experience in a Global Data Organisation and familiarity with Data Management activities are required, as is an understanding of Data and Metadata management, including data cataloguing, and experience with Metadata management and Data Catalogue tools. Knowledge of Data Modelling/Metadata Modelling (MetaModel) and a basic understanding of data products are expected. Ideal candidates will hold a Bachelor's degree or equivalent qualification and be able to work effectively in global and local teams across different time zones.

Being a good team player, maintaining strong relationships, and working under pressure with evolving priorities are key attributes for this role. You should be able to communicate concepts and issues clearly, translate theory into practical business solutions, and have experience interacting with senior stakeholders in a global environment. Asset Management experience is preferred but not mandatory. Join HSBC to make a real impact and discover the value you can bring to a global banking and financial services organisation.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The WRB Data Technology team at Standard Chartered Bank supports Data and Business Intelligence, Finance, and Risk projects globally by delivering data through data warehouse solutions. The team is composed of data specialists, technology experts, and project managers who work closely with business stakeholders to implement end-to-end solutions. Standard Chartered Bank is looking to hire skilled data professionals with relevant experience to contribute to the team's objectives, drawing from both internal and external talent pools; successful candidates will work in a global environment.

Your responsibilities as a member of the WRB Data Technology team will include participating in data warehousing migration programs involving cross-geography and multi-functional delivery. You will align project timelines to ensure successful delivery, support data analysis, mapping, and profiling, and perform data requirement gathering, analysis, and documentation. You will map data attributes from different source systems to target data models, interpret use case requirements, design target data models/data marts, and profile data attributes to assess data quality and provide remediation recommendations. It is crucial to ensure that data use complies with data architecture principles, including golden sources and standard reference data. You will also be involved in data modeling for better data integration within the data warehouse platform, engage consultants and business analysts, escalate issues in a timely manner, and work closely with Chapter Leads and Squad Leads to lead projects and manage stakeholders across business, technology, and internal development teams.

Your role will involve transforming business requirements into data requirements, designing data models for use cases and data warehousing, creating data mapping templates, and profiling data to assess quality, suitability, and cardinality. You will support inbound and/or outbound development for data stores, perform data acceptance testing, provide direction on solutions from a standard product/architecture perspective, and participate in key decision-making discussions with business stakeholders. You will also support System Integration Testing (SIT) and User Acceptance Testing (UAT), manage change requests effectively, ensure alignment with bank processes and standards, and deliver functional specifications to the development team.

To excel in this role, you should possess domain knowledge and technical skills, along with 6-8 years of banking domain/product knowledge and IT working experience. A graduate degree in computer science or a relevant field is required, and familiarity with tools such as Clarity, ADO, Axess, and SQL is beneficial. Strong communication and stakeholder management skills are essential, as is the ability to write complex SQL scripts. Knowledge of Base SAS is an advantage, and familiarity with Retail Banking and Wealth Lending data is ideal. You should be able to work effectively in a multi-cultural, cross-border, matrix reporting environment, demonstrating knowledge management for MIS applications, business rules, mapping documents, data definitions, system functions, and processes. With a background in business or data analysis roles, you should have a good understanding of data analytics, deep-dive capabilities, and excellent attention to detail and time management.

This role offers the opportunity to become a go-to person for data across the bank globally, providing extensive exposure to all parts of the bank's business model. It serves as a solid foundation for a future career in the broader data space, preparing you for roles in analytics, business intelligence, and big data. Your work will contribute to driving commerce and prosperity through unique diversity, aligning with Standard Chartered Bank's purpose and brand promise to be here for good. If you are passionate about making a positive difference and eager to work in a collaborative and inclusive environment, we encourage you to join our team at Standard Chartered Bank.
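The profiling work described above (assessing quality, suitability, and cardinality of data attributes) typically starts with a few simple SQL checks over the source table. A minimal sketch using Python's built-in sqlite3, where the table and column names are illustrative assumptions, not part of the role:

```python
import sqlite3

# In-memory stand-in for a source-system table (names are invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, segment TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "retail", "a@x.com"), (2, "retail", None),
     (3, "wealth", "c@x.com"), (3, "wealth", "c@x.com")],
)

# Profile one attribute: row count, null count, and cardinality -- the kind
# of checks used to assess data quality before mapping to a target model.
row = conn.execute(
    """
    SELECT COUNT(*)                                        AS total_rows,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
           COUNT(DISTINCT customer_id)                     AS distinct_ids
    FROM customers
    """
).fetchone()
print(row)  # (4, 1, 3): 4 rows, 1 null email, 3 distinct customer_ids
```

The same pattern (count, null rate, distinct count per attribute) scales to real warehouse tables and feeds the remediation recommendations the posting mentions.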

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a BI Developer at Adobe, you will be part of a dedicated software design team, reporting directly to the Technical Manager. You will contribute to all aspects of software coding and design, collaborating closely with Business Analysts to understand design specifications and convert requirements into applications, dashboards, or reporting solutions. Your work will have a global impact as you develop visual reports, dashboards, and KPI scorecards using Power BI Desktop and the Power BI Service, create writeback tools using Power Apps and Power Automate, connect to data sources, import and transform data for Business Intelligence, and develop tabular and multidimensional models adhering to warehouse standards. Additionally, you will integrate Power BI reports into other applications using embedded analytics and implement row-level security on data.

To succeed in this role, you should possess a Bachelor's degree in computer science or a related field, along with 5+ years of work experience in Power BI, Power Apps, and Power Automate, 5+ years of experience in scripting languages like DAX and Python, and a strong understanding of T-SQL, stored procedures, and database performance tuning. Experience with Databricks, Big Data, and Gen AI technologies is a plus. Your ability to design excellent UIs, develop entities in Power Apps, and demonstrate expertise in data modeling, prototyping, performance tuning, and data analysis techniques will be crucial, as will a willingness to learn new software and technologies quickly and adapt to a dynamic environment.

At Adobe, you will have the opportunity to work in an exceptional environment known worldwide for its quality, as part of a team dedicated to mutual growth through continuous feedback. If you aspire to make a significant impact, Adobe is the ideal place for you. Learn more about our employees' experiences and the benefits we offer by visiting the Adobe Life blog.
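Power BI implements the row-level security mentioned above with DAX role filters; the underlying idea is that each role sees only the rows its filter permits. A plain-Python sketch of that idea, where the sales rows and role-to-region mapping are invented for illustration:

```python
# Hypothetical fact rows and a role -> region mapping (illustrative only;
# in Power BI this would be a DAX filter defined on a security role).
sales = [
    {"region": "APAC", "amount": 120},
    {"region": "EMEA", "amount": 200},
    {"region": "APAC", "amount": 80},
]
role_filters = {"apac_analyst": "APAC", "emea_analyst": "EMEA"}

def visible_rows(user_role):
    """Return only the rows the role's row-level filter permits."""
    region = role_filters[user_role]
    return [r for r in sales if r["region"] == region]

apac_view = visible_rows("apac_analyst")
print(sum(r["amount"] for r in apac_view))  # 200: only the two APAC rows (120 + 80)
```

Every measure computed over `apac_view` automatically reflects the filter, which is why row-level security is defined once at the model level rather than per report.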
If you require any accommodations to access our website or complete the application process due to a disability or special need, please contact accommodations@adobe.com or call (408) 536-3015.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an experienced professional with 3-5 years in the field, you will handle various technical tasks related to Azure Data Factory, Talend/SSIS, MS SQL, Azure, and MySQL. Your expertise in Azure Data Factory will be crucial in this role.

Your primary responsibilities will include demonstrating advanced knowledge of Azure SQL DB and Synapse Analytics, Power BI, SSIS, SSRS, T-SQL, and Logic Apps. Your ability to analyze and comprehend complex data sets will play a key role in your daily tasks. Proficiency in Azure Data Lake and other Azure services such as Analysis Services, SQL Databases, Azure DevOps, and CI/CD is essential for success, as is a solid understanding of master data management, data warehousing, and business intelligence architecture.

You are expected to have experience in data modeling and database design, with a strong grasp of SQL Server best practices. Effective communication skills, both verbal and written, are necessary for interacting with stakeholders at all levels. A clear understanding of the data warehouse lifecycle will be beneficial, as you will prepare design documents, unit test plans, and code review reports. Experience working in an Agile environment, particularly with methodologies like Scrum, Lean, or Kanban, is advantageous, and knowledge of big data technologies such as the Spark framework, NoSQL, Azure Databricks, and the Hadoop ecosystem (Hive, Impala, HDFS) would be a valuable asset.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Database Administrator at NTT DATA, you will play a crucial role in ensuring the availability, integrity, and performance of complex and critical data assets. Working closely with cross-functional teams, you will support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Your expertise will be instrumental in controlling access to database environments through permissions and privileges.

Key Responsibilities:
- Install, configure, and maintain complex database management systems (DBMS) such as Oracle, MySQL, PostgreSQL, and others.
- Collaborate with software developers/architects to design and optimize database schemas and data models.
- Write database documentation, data standards, data flow diagrams, and standard operating procedures.
- Monitor database performance, identify bottlenecks, and optimize queries for optimal performance.
- Design and implement backup and disaster recovery strategies for data availability and business continuity.
- Work with Change Control and Release Management to commission new applications and customize existing ones.
- Plan and execute database software upgrades and patches to ensure system security and up-to-date functionality.
- Implement security measures to safeguard databases from unauthorized access, breaches, and data loss.
- Conduct security audits and vulnerability assessments to maintain compliance with data protection standards.
- Collaborate with cross-functional teams to support database-related initiatives and provide technical support to end-users.

Knowledge, Skills, and Attributes:
- Proficiency in database administration tasks, SQL, database security, and backup and recovery strategies.
- Ability to monitor database performance, manage multiple projects, and communicate complex IT information effectively.
- Strong problem-solving and analytical skills to troubleshoot database-related issues.
- Familiarity with data architecture, data services, and the application development lifecycle.
- Experience working with unstructured datasets and extracting value from large datasets.

Academic Qualifications and Certifications:
- Bachelor's degree in computer science, engineering, information technology, or a related field.
- Relevant certifications such as MCSE DBA, Oracle Certified Professional, MySQL Database Administrator, or PostgreSQL Certified Professional.
- Completion of database management courses covering database administration, data modeling, SQL, and performance tuning.

Required Experience:
- Demonstrated experience as a Database Administrator within an IT organization.
- Experience with database backup and recovery practices, health assessment reports, and managing databases.

Workplace Type:
- Hybrid Working

About NTT DATA:
NTT DATA is a trusted global innovator of business and technology services, serving Fortune Global 100 clients. Committed to innovation and long-term success, NTT DATA invests in R&D to drive organizations confidently into the digital future. With a diverse global team and an extensive partner ecosystem, NTT DATA offers consulting, AI, industry solutions, and application management services. A leading provider of digital and AI infrastructure, NTT DATA is part of the NTT Group and headquartered in Tokyo. NTT DATA is an Equal Opportunity Employer.
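One building block of the backup and disaster recovery strategies listed above is an online backup: copying a live database without taking it offline, then verifying the copy is usable. A minimal sketch using Python's sqlite3 module as a stand-in for a production DBMS (the schema is invented for illustration):

```python
import sqlite3

# Source database with some live data (schema is illustrative).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
src.execute("INSERT INTO accounts VALUES (1, 500.0)")
src.commit()

# Online backup: sqlite3's backup API copies the live database page by page.
# In practice the destination would be a file on separate storage.
dst = sqlite3.connect(":memory:")
src.backup(dst)

# Verify the copy is queryable -- the "recovery" half of the strategy.
print(dst.execute("SELECT balance FROM accounts WHERE id = 1").fetchone())  # (500.0,)
```

Production engines expose the same pattern under different names (RMAN for Oracle, `pg_basebackup` for PostgreSQL), and a backup that is never restore-tested is not a strategy.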

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled and experienced Database Administrator (DBA), you will be responsible for managing and supporting our database environments to ensure optimal performance, integrity, and security. Working closely with other IT team members and stakeholders, you will play a crucial role in ensuring that our data systems operate efficiently and meet business needs.

Your qualifications include a Bachelor's degree in Computer Science, Information Technology, or a related field; a Master's degree or relevant certifications such as Oracle DBA or Microsoft SQL Server Certified would be a plus. With at least 5+ years of proven experience managing database systems, you should have hands-on experience with major DBMS platforms such as Oracle, SQL Server, MySQL, PostgreSQL, and MongoDB. Proficiency in SQL for querying and managing databases, along with knowledge of database design, data modeling, and normalization, is essential. Strong analytical and problem-solving skills, excellent communication abilities, and the capacity to manage multiple tasks and projects simultaneously are required. Experience with cloud-based database services such as AWS RDS and Google Cloud SQL, and with big data technologies such as Hadoop, would be beneficial.

Your responsibilities will include installing, configuring, and maintaining database software and related tools, monitoring database performance, and ensuring optimal resource utilization. You will perform routine maintenance tasks, implement database security measures, and analyze performance metrics to identify bottlenecks and improve query efficiency. You will also participate in database design and data modeling activities, ensure data integrity through normalization and data validation, and develop and maintain documentation, including data dictionaries and schema diagrams.

Implementing robust backup and recovery procedures, managing disaster recovery planning, enforcing database security policies, and ensuring compliance with data privacy regulations are crucial aspects of the role. Collaboration with developers, system administrators, and stakeholders to ensure seamless database integration, as well as providing technical support and troubleshooting for database-related issues, will be part of your everyday tasks. Additionally, you may need to participate in on-call rotations and respond to critical database incidents to maintain the efficiency and security of our database systems.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

You have over 8 years of experience and are located in Balewadi, Pune. Your technical skills and core competencies include a strong understanding of data architecture and models, experience leading data-driven projects, and expertise in data modelling paradigms such as Kimball, Inmon, Data Marts, Data Vault, and Medallion. You have solid experience with cloud-based data strategies and big data technologies, with a preference for AWS, and you are adept at designing ETL data pipelines, with expert knowledge of ingestion, transformation, and data quality.

Hands-on experience in SQL is a must, including a deep understanding of PostgreSQL development, query optimization, and index design. You should be able to understand and manipulate intermediate to complex SQL, with thorough knowledge of PostgreSQL's PL/pgSQL for complex warehouse workflows, and be able to apply advanced SQL and statistical concepts through SQL. Experience with PostgreSQL extensions such as PostGIS is desired. Expertise in writing ETL pipelines combining Python and SQL is required, along with an understanding of data manipulation libraries in Python such as Pandas, Polars, and DuckDB. Experience designing data visualizations with tools such as Tableau and Power BI is desirable.

Your responsibilities include participating in the design and development of features in the existing Data Warehouse, providing leadership in establishing connections between the engineering, product, and analytics/data science teams, designing, implementing, and updating batch ETL pipelines, defining and implementing data architecture, and partnering with both engineers and data analysts to build reliable datasets. You will work with data orchestration tools such as Apache Airflow, Dagster, and Prefect, and should embrace a fast-paced, international start-up environment. A background in the telecom industry is a plus but not a requirement, and you should enjoy automating and monitoring.
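The "ETL pipelines combining Python + SQL" pattern described above typically means Python for extraction and light cleanup, SQL for set-based transformation and aggregation. A tiny self-contained sketch using only the standard library (the CSV payload and schema are invented for illustration):

```python
import csv
import io
import sqlite3

# Extract: a raw CSV feed, standing in for a source-system export.
raw = "city,temp_c\npune,31\npune,29\nmumbai,33\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (city TEXT, temp_c REAL)")

# Transform in Python: parse rows and normalize city names.
rows = [(r["city"].title(), float(r["temp_c"]))
        for r in csv.DictReader(io.StringIO(raw))]
conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)

# Load/aggregate in SQL: the warehouse-side half of the pipeline.
result = conn.execute(
    "SELECT city, AVG(temp_c) FROM readings GROUP BY city ORDER BY city"
).fetchall()
print(result)  # [('Mumbai', 33.0), ('Pune', 30.0)]
```

In a real pipeline the same split holds at scale: an orchestrator such as Airflow schedules the Python extract/transform step, and the aggregation runs as SQL inside the warehouse.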

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As the Lead Power BI Developer at Stackular, you will play a crucial role in designing, developing, and maintaining business intelligence solutions on Microsoft's Power BI platform. You will work closely with stakeholders to understand business requirements, generate insightful reports and dashboards, and deliver data-driven insights that steer decision-making. Your expertise in data modeling, data visualization, and transforming intricate data sets into actionable business intelligence will be essential for success.

Your key responsibilities include designing, developing, and maintaining Power BI reports and dashboards, creating data models to support reporting needs, and applying best practices in data visualization to provide clear and actionable insights. You will connect to diverse data sources such as SQL databases, Excel, and cloud-based sources, optimize DAX calculations and queries, and ensure data accuracy and integrity through ETL processes.

Collaboration will be a significant part of the role: you will work closely with business stakeholders to gather and understand requirements, partner with data engineers and team members to ensure seamless data integration, and offer training and support to end-users on Power BI functionality. You will also monitor and optimize the performance of Power BI solutions, troubleshoot and resolve data quality and performance issues, and uphold the highest standards of quality in everything you do.

To qualify, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field; a Master's degree is a plus. A minimum of 8+ years of experience in business intelligence and data analytics is required, along with a proven track record of developing Power BI solutions. Proficiency in Power BI (including DAX and Power Query), SQL, data modeling, and ETL processes is essential. Experience with Azure Data Services such as Azure SQL and Azure Data Factory would be advantageous, and familiarity with other BI tools is a plus.

Soft skills are also valued: strong analytical and problem-solving abilities, excellent communication and presentation skills, the capacity to work independently and collaboratively, and keen attention to detail and commitment to quality will contribute to your success. In return, we offer a culture deeply rooted in our core values, a competitive salary and benefits package, opportunities for personal and professional growth, and the chance to work for a company with a really cool name. Join us at Stackular and be part of our dynamic product development community, where your skills and values align with our shared vision.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for demonstrating thorough knowledge and a proven record of success in executing the functional and technical aspects of SAP Master Data Governance (MDG) projects following industry best practices. This includes data modelling, process modelling, UI modelling, business validation rules modelling, derivations, and Data Replication Framework (DRF) and workflow creation and maintenance.

The role requires a good understanding of the SAP MDG technical framework, including BAdI, BAPI/RFC/FM, Workflows, BRF+, Enterprise Services, IDoc, Floorplan Manager, Web Dynpro, Fiori, and the MDG API framework. Knowledge of SAP data dictionary tables, views, relationships, and the corresponding data architecture for ECC and S/4HANA across SAP master and transactional data entities is essential, including excellent functional knowledge of core master data objects such as customer, vendor, and material.

Hands-on experience configuring customer, vendor, finance, and product/material master data in MDG is necessary, including data harmonization involving de-duplication and mass changes, and data replication involving key/value mapping, SOA web services, and ALE/IDoc. Effective communication with customers and partners to understand specific enterprise data needs is a key aspect of this role, so you should possess excellent written and verbal communication skills, with the ability to convey ideas in technical, business, and user-friendly language. An appetite for acquiring new knowledge, adapting quickly, and contributing to fast innovation is important for success.

The ideal candidate will have a minimum of 5 years of experience in SAP Master Data Governance (MDG) with at least 2 full-cycle implementations. Implementation experience of SAP MDG in key domains such as Customer, Supplier, Material, and Finance master data is required, as is hands-on experience with SAP Fiori, SAP MDG mass processing, consolidation, central governance, Workflow, and BRF+. Experience in RDG is considered an added advantage.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The MDG Technical role requires 4-8 years of experience and is based in Pune. You must have skills in OO ABAP, FPM, Workflow, BRF+, web services, process modelling, data modelling, UI modelling, and ALE (IDoc). Nice-to-have skills include WD ABAP, BOPF, OData, REST, UI5, ABAP on HANA DB, ABAP on S/4HANA, HANA data modelling, Fiori, and master data consolidation. As a developer with 4+ years of experience, your responsibilities will include master data consolidation, mapping, the Migration Cockpit, customization (Finance & Business Partner), and functional knowledge of the domain. A B.E/B.Tech qualification is required.

Posted 1 week ago

Apply

10.0 - 15.0 years

0 Lacs

Maharashtra

On-site

The Oracle PL/SQL Developer - TSYS Prime position in Mumbai requires 10 to 15 years of experience in the banking domain with TSYS PRIME experience. You must possess sound knowledge of TSYS PRIME, the Oracle PL/SQL language, and APIs. Your responsibilities will include participating in all phases of the SDLC, including design, coding, code reviews, testing, and project documentation, and coordinating with co-developers and related departments.

Desired skills and qualifications include a strong understanding of TSYS PRIME, Oracle PL/SQL, and APIs, with good exposure to advanced Oracle database concepts such as performance tuning, indexing, partitioning, and data modeling. You will be responsible for database-side development, implementation, and support, including handling daily service requests, incidents, and change requests. Experience in code review, team management, effort estimation, and resource planning will be beneficial.

If you are interested in this position, please apply by sending your resume to hr@techplusinfotech.com.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a PL/SQL Developer with 3+ years of experience in Oracle/Postgres, you will be responsible for designing, developing, and maintaining database applications using the PL/SQL programming language. Your key roles and responsibilities will include:

- Designing and developing database schemas, stored procedures, functions, and triggers using PL/SQL to ensure efficient data storage and retrieval.
- Optimizing database performance by tuning SQL queries and PL/SQL code to enhance overall system efficiency.
- Developing and executing test plans to validate the quality and accuracy of PL/SQL code, ensuring the reliability of database applications.
- Troubleshooting and resolving issues related to PL/SQL code to maintain the integrity and functionality of database systems.
- Implementing database security policies and procedures to safeguard the confidentiality and integrity of data.
- Collaborating with cross-functional teams to support their data needs and provide access to data for reporting and analytics.
- Deploying and supporting object shipment during database deployments and integrated system upgrades.
- Creating and maintaining database schemas, tables, indexes, and relationships based on project requirements and best practices.
- Writing and optimizing SQL queries to extract, manipulate, and transform data for various business needs, ensuring query performance.
- Integrating data from different sources into the SQL database, including APIs, flat files, and other databases.
- Developing and maintaining data models, ER diagrams, and documentation to effectively represent database structures and relationships.
- Monitoring and fine-tuning database performance to identify and resolve bottlenecks and inefficiencies.
- Ensuring data accuracy and consistency through validation and cleansing processes, identifying and rectifying data quality issues.
- Analyzing and optimizing complex SQL queries and procedures for enhanced performance and efficiency.
- Maintaining comprehensive documentation of database structures, schemas, and processes for future reference and team collaboration.

You should possess strong problem-solving and analytical skills with attention to detail, excellent project management abilities to oversee multiple projects and meet deadlines, and strong collaboration skills to work both independently and in a team. Fluency in English, with excellent written and verbal communication skills, is essential for effective interaction with stakeholders.
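A core loop in the query-tuning work listed above is: inspect the execution plan, add or adjust an index, and confirm the plan changed from a full scan to an index lookup. A minimal sketch using Python's sqlite3 (the `orders` table and index name are invented; Oracle/Postgres use `EXPLAIN PLAN`/`EXPLAIN ANALYZE` but the workflow is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 50, 10.0 * i) for i in range(500)])

def plan(sql):
    """Return SQLite's query plan as one string (EXPLAIN QUERY PLAN detail column)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT total FROM orders WHERE customer_id = 7"
before = plan(q)   # full table SCAN: no index covers customer_id yet

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(q)    # SEARCH ... USING INDEX idx_orders_customer

print(before)
print(after)
```

The before/after plan comparison is the evidence that a tuning change actually helped; on a real engine you would pair it with timing or buffer statistics.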

Posted 1 week ago

Apply

9.0 - 13.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Technical Architect at WNS-Vuram, a global hyperautomation services company specializing in low-code enterprise automation, you will play a crucial role in engaging with potential clients, understanding their business needs, and crafting technical solutions using the Appian platform. With over 9 years of experience, you will collaborate with the sales team and customers to assess client requirements and design effective solutions using Appian components, integrations, and Appian AI. Your responsibilities will include presenting demos and POCs, managing technical resources, providing technical guidance and mentorship, and leading architecture and solution discussions with customers.

Key Responsibilities:
- Collaborate with the sales team and customers to understand their needs and objectives.
- Assess client requirements and convert them into effective Appian solutions.
- Design effective solutions utilizing various Appian components, integrations, and Appian AI.
- Work with the technical team to deliver projects following agile methodologies.
- Present demos and POCs to showcase solution fit.
- Create effort estimations, manage scrum sprints, and perform code reviews for quality deliverables.
- Lead architecture and solution discussions with customers and influence technical decisions.
- Provide technical guidance and mentorship to team members on technical issues.
- Lead and manage a mid-size team of technical resources, covering work allocation, estimation, delivery, monitoring, feedback, and performance management.
- Develop and enforce Appian best practices, reusable components, and frameworks.
- Stay updated on Appian product updates and industry trends, and share them with the team and customers.
- Manage the Appian platform upgrade process, ensuring minimal disruption.
- Collaborate with COE teams on technical aspects and with the Design team on UI/UX aspects.

Key Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (B.Tech/B.E./MCA).
- 7-8 years of experience delivering complex, large-scale enterprise implementations in Appian.
- Deep understanding of Appian's architecture, components, capabilities, and limitations.
- Proficiency in integration technologies such as REST APIs, SOAP, JSON, and message queues.
- Hands-on experience with database design, data modeling, and security.
- Exposure to the Agile framework and managing agile delivery through JIRA.
- Strong presentation and communication skills, with the ability to convey technical concepts effectively.
- Appian certification (Appian Certified Lead Developer) preferred.
- Exposure to RPA, AI/ML technologies, and Generative AI preferred.
- Domain experience in BFSI would be an added advantage.
- A previous leadership role in an IT services company, experience with Low Code/No Code technologies such as ServiceNow and Power Platform, and an understanding of User Experience (UX) concepts are highly preferable.

Join WNS-Vuram and be part of a team committed to driving digital transformation for organizations worldwide through an Empathy-First, Technology-Next approach. Visit https://www.vuram.com for more information.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Millennium is a top-tier global hedge fund with a strong commitment to leveraging innovations in technology and data science to solve complex problems for the business. Millennium is assembling a strong Quant Technology team to build the next generation of in-house analytics and trader support tools; this team will sit under the Fixed Income & Commodities Technology (FICT) group. The team will be responsible for developing and maintaining the in-house pricing libraries that support trading in the Fixed Income, Commodities, Credit, and FX businesses at Millennium. FICT provides a dynamic and fast-paced environment with excellent growth opportunities. You will work closely with Quant Researchers, Portfolio Managers, and Technology teams to build a commodities fundamental analytics and modelling platform from scratch. Your responsibilities will include developing scalable libraries and APIs for commodities fundamental modelling across multiple assets and geographies. You will also build, maintain, and operate end-to-end modelling pipelines that involve diverse and large sets of statistical and machine learning models. Additionally, you will contribute to building scalable tools to aid data analysis and visualization, and collaborate with the broader team to develop robust delivery and operating models that enable rapid development and scalable deployment of new capabilities. To excel in this role, you must possess strong Python fundamentals (for data science) and SQL/database experience. You should have solid experience with all stages of the data modelling pipeline, including data access, transformation, model training and inference, and data analysis tools. The ability to work independently, with hands-on experience across the complete software development lifecycle and relevant development tools, is essential.
Previous experience as a Quant Developer/Data Scientist in the financial industry is required, with commodities and hands-on statistical/data modelling experience preferred. Familiarity with tools and libraries such as Airflow, Flask, and Dash is also preferred.
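The "data access, transformation, model training and inference" stages the listing describes can be sketched as a toy pipeline. This is an illustrative sketch only, not Millennium's stack: the price series, the index encoding of time, and the plain least-squares fit are all assumptions made for the example.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class LinearModel:
    slope: float
    intercept: float

    def predict(self, x: float) -> float:
        return self.slope * x + self.intercept

def fit(xs, ys):
    # Ordinary least squares for a single feature.
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return LinearModel(slope, my - slope * mx)

# Pipeline stages: data access -> transformation -> training -> inference.
raw = [("2024-01", "100"), ("2024-02", "102"), ("2024-03", "104")]  # access (stubbed)
xs = list(range(len(raw)))                # transformation: encode month as an index
ys = [float(price) for _, price in raw]   # transformation: parse prices
model = fit(xs, ys)                       # training
forecast = model.predict(len(raw))        # inference: one step ahead
```

In a production pipeline each stage would be a separately scheduled, monitored task (e.g. under Airflow, which the listing names), but the hand-off shape stays the same.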

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

NTT DATA is looking for a PowerBI Data Visualisation Engineer to join their team in Ban/Hyd/Chn/Gur/Noida, Karnataka (IN-KA), India. As a PowerBI Data Visualisation Engineer, you will be responsible for designing, developing, and maintaining interactive dashboards and reports using PowerBI. You will collaborate with data analysts, data engineers, designers, and business stakeholders to gather requirements and ensure accurate, timely, and effective delivery of data visualisations. One of your key responsibilities will be to transform raw data into meaningful insights, ensuring data accuracy and consistency. You will also need to optimize dashboards for performance and usability, providing users with intuitive and efficient access to key metrics. It is essential to stay updated on the latest trends and best practices in data visualisation and continuously seek opportunities to enhance the company's data analytics capabilities. In terms of technical knowledge, you must have expertise in PowerBI, Power Automate, Azure DevOps, SQL, Databricks, data modelling, data warehousing, ETL processes, and Python. Additionally, soft skills such as exceptional analytical ability, attention to detail, excellent communication, creativity in proposing innovative visualisation solutions, and a proactive attitude are crucial for this role. To qualify for this position, you should have proven experience as a Data Visualisation Engineer or in a similar role, proficiency in PowerBI, and a strong understanding of SQL, data warehousing, and ETL processes. Experience with programming languages such as Python, R, or JavaScript is considered a plus. NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Global Top Employer, they have diverse experts in more than 50 countries and a robust partner ecosystem.
Their services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA invests significantly in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

maharashtra

On-site

You will be joining as a Manager Tax Technology in Mumbai. In this role, you will play a crucial part as a solution tester and integrator, utilizing your expertise in Partnership Tax along with technical skills. Your responsibilities will involve driving and improving the overall firm-wide strategy related to tax process standardization and efficient utilization of tax technology tools to support the tax practice across different service lines. Collaboration with the firm's tax, information technology, and transformation team leadership and employees will be essential to identify future tax technology requirements, including areas for process enhancements, automation, efficiency improvements, and the application of best practices in tax processes and technology. This role is of high visibility and impact within the firm, where you will engage in various projects leveraging your tax knowledge in the financial services, corporate, or individual sector, combined with your technology acumen to innovate our work methods and contribute to cutting-edge technology development.

Your primary responsibilities will include:
- Proactively evaluating current tax technology and processes to implement transformative solutions that standardize, streamline, centralize, automate, track, and analyze business operations.
- Collaborating with the information technology department to prototype, develop, enhance, and implement technology solutions and best practices.
- Acting as a bridge between the Tax and Information Technology departments to enhance the understanding of the tax department's process improvement/information technology needs, objectives, and challenges.
- Translating conceptual user requirements into clear functional requirements for the enterprise information technology team.
- Documenting process workflows, both current and future state.
- Creating business cases, identifying key stakeholders, and leading presentations with leadership.
- Converting a goal/vision into a timeline with deliverables: managing relationships to monitor workstream progress, ensuring timely reporting of milestones and dependency status, monitoring risks and issues to escalate to leadership, and executing day-to-day project management activities throughout the transformation lifecycle of initiate, plan, and execute.
- Developing and conducting training on new technology and processes.

Basic qualifications for this role:
- Bachelor's degree in Accounting, Business Administration, Business Management, Computer Science, or a related field (MBA preferred).
- Minimum of 5 years of experience, including 4 years in the technology space and 1-2 years in compliance.
- 4+ years of tax technology/transformation experience (financial services and/or real estate) with a public accounting firm or large global corporation.
- 2+ years of tax compliance experience in the corporate, financial services, or individual/private wealth advisory industry.
- Proficiency in tax software tools such as Thomson Reuters GoSystem, CCH Axcess, and ONESOURCE.
- Experience collaborating with software developers to communicate business requirements.
- Demonstrated ownership of projects and ability to drive outcomes from inception to full business value.
- Experience gathering business requirements for technology implementations/process improvements.
- Proficiency in documenting end-to-end processes using tools like Visio and Alteryx.
- Advanced experience with Excel is a must.

Additionally, exposure to the Microsoft Power BI suite, database development, bots, and RPA, and experience developing ETL solutions will be beneficial. Knowledge of or proficiency in tools such as Power Query, Power BI/Tableau, Alteryx, Excel, data modeling, dashboarding, data pre-processing, application integration techniques, SharePoint development, VBA, SSIS, and SQL will be an advantage.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

thrissur, kerala

On-site

As a Data Engineer at WAC, you will be responsible for ensuring the availability, reliability, and scalability of the data infrastructure. Your role will involve collaborating closely with cross-functional teams to support data-driven initiatives, enabling data scientists, analysts, and business stakeholders to access high-quality data for critical decision-making.

You will be involved in designing, developing, and maintaining efficient ETL processes and data pipelines to collect, process, and store data from various sources. Additionally, you will create and manage data warehouses and data lakes, optimizing storage and query performance for both structured and unstructured data. Implementing data quality checks, validation processes, and error handling will be crucial in ensuring data accuracy and consistency. Administering and optimizing relational and NoSQL databases to ensure data integrity and high availability will also be part of your responsibilities. Identifying and addressing performance bottlenecks in data pipelines and databases to improve overall system efficiency is another key aspect of the role. Furthermore, implementing data security measures and access controls to protect sensitive data assets will be essential.

Collaboration with data scientists, analysts, and stakeholders to understand their data needs and provide support for analytics and reporting projects is an integral part of the job. Maintaining clear and comprehensive documentation for data processes, pipelines, and infrastructure will also be required. Monitoring data pipelines and databases, proactively identifying issues, and troubleshooting and resolving data-related problems in a timely manner are vital aspects of the position. To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, with at least 4 years of experience in data engineering roles.
Proficiency in programming languages such as Python, Java, or Scala is necessary. Experience with data warehousing solutions and database systems, as well as a strong knowledge of ETL processes, data integration, and data modeling, are also required. Familiarity with data orchestration and workflow management tools, an understanding of data security best practices and data governance principles, excellent problem-solving skills, and the ability to work in a fast-paced, collaborative environment are essential. Strong communication skills and the ability to explain complex technical concepts to non-technical team members are also important for this role. Thank you for your interest in joining the team at Webandcrafts. We look forward to learning more about your candidacy through this application.
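The "data quality checks, validation processes, and error handling" mentioned above can be illustrated with a small sketch. The row layout, the required columns, and the `unique_key` parameter are assumptions made for this example, not WAC's actual pipeline:

```python
def run_quality_checks(rows, required, unique_key):
    """Return human-readable violations for missing fields and duplicate keys."""
    errors, seen = [], set()
    for i, row in enumerate(rows):
        # Completeness check: every required field must be present and non-empty.
        for col in required:
            if row.get(col) in (None, ""):
                errors.append(f"row {i}: missing required field '{col}'")
        # Uniqueness check: the business key must not repeat.
        key = row.get(unique_key)
        if key in seen:
            errors.append(f"row {i}: duplicate {unique_key} {key!r}")
        seen.add(key)
    return errors

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": ""},   # duplicate id and missing email
]
violations = run_quality_checks(rows, required=["id", "email"], unique_key="id")
```

In practice such checks usually run as a gate between the extract and load steps, so that bad batches are quarantined rather than silently written to the warehouse.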

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Solution Architect at Kanerika, you will collaborate with our sales, presales, and COE teams to provide technical expertise and support throughout the new business acquisition process. Your role will involve understanding customer requirements, presenting our solutions, and demonstrating the value of our products. In this high-pressure environment, maintaining a positive outlook and making strategic choices for career growth are essential. Your excellent communication skills, both written and verbal, will enable you to convey complex technical concepts clearly and effectively. Being a team player, customer-focused, self-motivated, and responsible individual who can work under pressure with a positive attitude is crucial for success in this role. Experience in managing and handling RFPs/ RFIs, client demos and presentations, and converting opportunities into winning bids is required. Having a strong work ethic, positive attitude, and enthusiasm to embrace new challenges are key qualities. You should be able to multitask, prioritize, and demonstrate good time management skills, as well as work independently with minimal supervision. A process-oriented and methodical approach with a quality-first mindset will be beneficial. The ability to convert a client's business challenges and priorities into winning proposals through excellence in technical solutions will be the key performance indicator for this role. Your responsibilities will include developing high-level architecture designs for scalable, secure, and robust solutions, selecting appropriate technologies, frameworks, and platforms for business needs, and designing cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP. You will also ensure seamless integration between various enterprise applications, APIs, and third-party services, as well as design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new generation analytics platforms. 
To excel in this role, you should have at least 10 years of experience working in data analytics and AI technologies from consulting, implementation, and design perspectives. Certifications in data engineering, analytics, cloud, and AI will be advantageous. A Bachelor's in engineering/technology or an MCA from a reputed college is a must, along with prior experience working as a solution architect during the presales cycle. Soft skills such as communication, presentation, flexibility, and a strong work ethic are essential. Additionally, knowledge of presales processes and a basic understanding of business analytics and AI will benefit you in this role at Kanerika. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you'll get while working for Kanerika.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

haryana

On-site

You will play a crucial role as a Data Modeler at Sun Life, where you will be responsible for designing and implementing new data structures to support project teams working on ETL, data warehouse design, managing the enterprise data model, data maintenance, and enterprise data integration approaches. Your technical responsibilities will include building and maintaining data models to report disparate data sets reliably, consistently, and in an interpretable manner. You will gather, distil, and harmonize data requirements to design conceptual, logical, and physical data models, as well as develop source-to-target mappings with complex ETL transformations. In this role, you will contribute to requirement analysis and database design, both in transactional and dimensional data modeling. You will work independently on data warehouse projects, collaborate with data consumers and suppliers to understand detailed requirements, and propose standardized data models. Additionally, you will help improve Data Management data models and facilitate discussions to understand business requirements and develop dimension data models based on industry best practices. To be successful in this position, you should have extensive practical experience in Information Technology and software development projects, with a minimum of 8 years of experience in designing operational data stores and data warehouses. Proficiency in data modeling tools such as Erwin or SAP Power Designer, a strong understanding of ETL and data warehouse concepts, and the ability to write complex SQL for data transformations and profiling are essential. Furthermore, you should possess a combination of solid business knowledge, technical expertise, excellent analytical and logical thinking, and strong communication skills. 
It would be advantageous if you have an understanding of the Insurance Domain, basic knowledge of AWS cloud services, experience with Master Data Management, Data Quality, Data Governance, and data visualization tools like SAS VA and Tableau. Familiarity with implementing and architecting data solutions using tools like Informatica, SQL Server, or Oracle is also beneficial. Join Sun Life's Advanced Analytics team and embark on a rewarding journey where you can contribute to making a positive impact on individuals, families, and communities worldwide.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

Wipro Limited is a leading technology services and consulting company dedicated to developing innovative solutions that cater to the most complex digital transformation needs of clients. Our comprehensive range of consulting, design, engineering, and operational capabilities enables us to assist clients in achieving their most ambitious goals and establishing sustainable, future-ready businesses. With a global presence of over 230,000 employees and business partners spanning 65 countries, we remain committed to supporting our customers, colleagues, and communities in navigating an ever-evolving world. We are currently seeking an individual with hands-on experience in data modeling for both OLTP and OLAP systems. The ideal candidate should possess a deep understanding of conceptual, logical, and physical data modeling, coupled with a robust, practice-backed grasp of indexing, partitioning, and data sharding. Experience in identifying and mitigating factors impacting database performance for near-real-time reporting and application interaction is essential. Proficiency in at least one data modeling tool, preferably DB Schema, is required. Additionally, functional knowledge of the mutual fund industry would be beneficial. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery is preferred. The role is based at our Chennai office, with a mandatory on-site presence at the customer site five days per week. Cloud-PaaS-GCP-Google Cloud Platform is a mandatory skill set for this position. The successful candidate should have 5-8 years of relevant experience and should be prepared to contribute to the reimagining of Wipro as a modern digital transformation partner. We are looking for individuals who are inspired by reinvention - of themselves, their careers, and their skills. At Wipro, we encourage continuous evolution, reflecting our commitment to adapt to the changing world around us.
Join us in a business driven by purpose, where you have the freedom to shape your own reinvention. Realize your ambitions at Wipro. We welcome applications from individuals with disabilities. For more information, please visit www.wipro.com.
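One way to see the indexing concerns the listing raises is to compare query plans before and after adding a composite index. Below is a minimal sketch with Python's built-in sqlite3; the `nav` table is invented for illustration, and production work would involve the GCP databases the listing names, where the same plan-inspection habit applies:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nav (fund_id INTEGER, nav_date TEXT, nav REAL)")
conn.executemany(
    "INSERT INTO nav VALUES (?, ?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}", 10.0 + i) for i in range(1000)],
)

query = "SELECT nav FROM nav WHERE fund_id = ? AND nav_date = ?"

# Without an index the planner falls back to a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (7, "2024-01-05")).fetchone()[3]

# A composite index on the filter columns turns the scan into an index seek.
conn.execute("CREATE INDEX idx_nav_fund_date ON nav (fund_id, nav_date)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (7, "2024-01-05")).fetchone()[3]
```

For near-real-time reporting, checking that hot queries hit an index (rather than scanning) is usually the first performance lever, ahead of partitioning or sharding.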

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be joining our team as a Python Developer in the CDRR, where the mission is to deliver first-line defences to manage Cyber and Fraud risks to Morgan Stanley's technology, operations, and information. This involves risk identification, control management, and assurance, enabling the business to operate securely and legally. The team's vision is to deliver programs that protect and enable the business, ensure secure delivery of services to clients, adapt to evolving threat landscapes, and meet regulatory expectations. In the Technology division, we leverage innovation to build connections and capabilities that power our Firm, allowing our clients and colleagues to redefine markets and shape the future of our communities. This position is a Software Engineering II role at the Associate Level, responsible for developing and maintaining software solutions that support business needs. Morgan Stanley, a global leader in financial services since 1935, operates in over 40 countries worldwide, constantly evolving and innovating to better serve clients and communities.

Your Responsibilities:
- Application Development:
  - Design, develop, and maintain scalable and efficient Python-based applications and services.
  - Write clean, maintainable, and well-documented code following industry best practices.
- Code Reviews and Mentorship:
  - Conduct code reviews to ensure code quality, performance, and adherence to standards.
  - Mentor and guide junior developers, fostering technical growth within the team.
- Integration and Automation:
  - Develop and maintain APIs and integrations with third-party systems.
  - Automate repetitive tasks and workflows to improve efficiency.
- Testing and Debugging:
  - Write unit tests and integration tests, and perform debugging to ensure high-quality deliverables.
  - Identify and resolve performance bottlenecks and system issues.
- Collaboration and Communication:
  - Work closely with product managers, DevOps, and other teams to deliver end-to-end solutions.
  - Communicate technical concepts effectively to both technical and non-technical stakeholders.
- Continuous Improvement:
  - Stay updated with the latest Python frameworks, libraries, and tools.
  - Propose and implement improvements to existing systems and processes.
- Data Handling and Analysis:
  - Work with large datasets, ensuring efficient data processing and storage.
  - Implement data pipelines and ETL processes as needed.

Requirements for the Role:
- 5-8 years of development experience in Python.
- 2-5 years of experience in Angular or other UI development skills.
- Strong networking background: IP, firewalls, proxies, routing, load balancing, the OSI model, and packet trace and analysis.
- Good understanding of web protocols such as TCP/IP, HTTP, and SSL/TLS.
- Hands-on experience interpreting data sets from various cybersecurity products/services/SIEM tools.
- Understanding of data structures, data modeling, and software architecture.
- Experience in the architecture, design, and implementation of data-intensive applications.
- Practical knowledge of deep learning implementation in areas like Cyber, NLP, and Image Processing.
- Strong quantitative and problem-solving skills.
- Expertise in visualizing large datasets efficiently.
- Ability to work in a fast-paced and dynamic environment.
- Good written and verbal communication skills.
- Strong sense of ownership and accountability for deliverables.

At Morgan Stanley, we are committed to maintaining first-class service and excellence, guided by our values of putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back. Our inclusive environment supports individuals to maximize their potential, with a diverse and skilled workforce reflecting global communities.
Join us and work alongside the best and brightest in an empowering environment with attractive benefits and opportunities for growth and advancement.
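The "write unit tests" responsibility can be sketched against the kind of log-parsing helper a first-line cyber team might own. The parser below is an illustrative assumption, not Morgan Stanley code; the facility/severity split of the syslog priority value follows RFC 5424:

```python
import unittest

def parse_syslog_priority(line: str):
    """Split a syslog <PRI> prefix into (facility, severity) per RFC 5424."""
    if not line.startswith("<") or ">" not in line:
        raise ValueError(f"no priority field in: {line!r}")
    pri = int(line[1:line.index(">")])
    # PRI = facility * 8 + severity, so divmod recovers both parts.
    return pri // 8, pri % 8

class ParseSyslogPriorityTest(unittest.TestCase):
    def test_auth_facility_error_severity(self):
        # <35> = facility 4 (auth) * 8 + severity 3 (error)
        self.assertEqual(parse_syslog_priority("<35>Oct 11 login failed"), (4, 3))

    def test_missing_priority_raises(self):
        # Malformed input should fail loudly, not return garbage.
        with self.assertRaises(ValueError):
            parse_syslog_priority("plain message")
```

Run with `python -m unittest` against the module; pairing every happy-path test with a malformed-input test is the habit the listing's "high-quality deliverables" bullet is pointing at.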

Posted 1 week ago

Apply

2.0 - 8.0 years

0 Lacs

maharashtra

On-site

You have 3 to 8 years of IT experience in the development and implementation of Business Intelligence and Data Warehousing solutions using Oracle Data Integrator (ODI). Your responsibilities will include analysis, design, development, customization, implementation, and maintenance of ODI. Additionally, you will be required to design, implement, and maintain ODI load plans and processes. To excel in this role, you should possess a working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), star/snowflake schemas, fact and dimension tables, ELT, and OLAP, as well as experience with SQL, UNIX, complex queries, stored procedures, and data warehouse best practices. You will be responsible for ensuring the correctness and completeness of data loading (full load and incremental load). Excellent communication skills are essential for this role, as you will be required to deliver high-quality solutions using ODI. This position can be based in Mumbai, Pune, Kolkata, Chennai, Coimbatore, Delhi, or Bangalore. To apply, please send your resume to komal.sutar@ltimindtree.com.
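ODI handles incremental loads declaratively through its knowledge modules, but the underlying idea — only move rows newer than the last high-water mark — can be sketched in a few lines. The row shape and `updated_at` column here are assumptions for illustration, not an ODI artifact:

```python
def incremental_load(source_rows, target, watermark):
    """Append only rows newer than the last watermark; return the new watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target.extend(new_rows)
    return max((r["updated_at"] for r in new_rows), default=watermark)

source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-01-02"},
    {"id": 3, "updated_at": "2024-01-03"},
]
target = []

# First run acts as the full load: the empty watermark admits every row.
wm = incremental_load(source, target, watermark="")
# Second run only picks up the newly arrived row.
wm = incremental_load(source + [{"id": 4, "updated_at": "2024-01-04"}], target, wm)
```

Verifying "correctness and completeness of data loading", as the listing puts it, amounts to reconciling the target row count and watermark against the source after each run.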

Posted 1 week ago

Apply

10.0 - 15.0 years

0 Lacs

haryana

On-site

As a highly skilled and experienced Data Manager, you will lead the development, governance, and utilization of enterprise data systems. Your strategic leadership role will focus on ensuring seamless and secure flow of data across platforms and teams, enabling timely and accurate access to actionable insights. Your objectives will include optimizing data systems and infrastructure to support business intelligence and analytics, implementing best-in-class data governance, quality, and security frameworks, leading a team of data and software engineers to develop cloud-native platforms, and supporting data-driven decision-making across the enterprise. You will be responsible for developing and enforcing policies for effective data management, designing secure processes for data collection and analysis, monitoring data quality and lineage, overseeing data integration, supporting internal stakeholders with data needs, maintaining compliance with regulatory frameworks, troubleshooting data-related issues, evaluating new data tools and technologies, and automating cloud operations. In terms of leadership and strategic duties, you will manage a high-performing data engineering team, collaborate with backend engineers and product teams, partner with cloud providers, conduct architecture reviews, and align data operations with enterprise goals. The required qualifications for this role include a Bachelor's or Master's degree in Computer Science or related field, 10-15 years of experience in enterprise data architecture or governance, expertise in SQL and modern data tools, deep understanding of AWS cloud services, proficiency in scripting and CI/CD pipelines, experience with ETL/ELT orchestration, and strong knowledge of DevOps practices. Preferred experience in healthcare analytics or data environments and soft skills such as strong leadership, effective communication, and a passion for continuous learning are also valued for this role. 
In return, you will receive competitive compensation, performance-based bonuses, a hybrid and flexible work environment, career development programs, and a diverse and collaborative culture focused on innovation and impact.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

Blenheim Chalcot is a prominent venture builder with a track record of over 20 years in creating disruptive businesses across various sectors such as FinTech, EdTech, GovTech, Media, Sport, Charity, and more. The ventures developed by Blenheim Chalcot are all GenAI enabled, positioning them as some of the most innovative companies in the UK and globally. The team at Blenheim Chalcot India plays a vital role in the growth and success of the organization. Since its establishment in 2009, Blenheim Chalcot India has been a launchpad for individuals looking to drive innovation and entrepreneurship. Driven by a mission to empower visionaries, Blenheim Chalcot India focuses on enabling individuals to lead, innovate, and create disruptive solutions. The organization offers a wide range of services to support new businesses, including technology, growth (marketing and sales), talent, HR, finance, legal, and tax services. Fospha, a MarTech venture under Blenheim Chalcot, is experiencing rapid growth and is seeking energetic and motivated individuals to join their team. Fospha is a marketing measurement platform catering to eCommerce brands, having achieved product/market fit and garnered recognition as a market leader with significant growth and accolades.

**Key Responsibilities:**
- Lead and mentor a team of data engineers, fostering a collaborative culture focused on continuous improvement.
- Plan and execute data projects in alignment with business objectives and timelines.
- Provide technical guidance to the team, emphasizing best practices in data engineering.
- Implement and maintain ELT processes using scalable data pipelines and architecture.
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Ensure data integrity and quality across diverse data sources.
- Support data-driven decision-making by delivering clean, reliable, and timely data.
- Define high-quality data standards for Data Science and Analytics use-cases and contribute to shaping the data roadmap.
- Design, develop, and maintain data models for ML Engineers, Data Analysts, and Data Scientists.
- Conduct exploratory data analysis to identify patterns and trends.
- Identify opportunities for process enhancement and drive continuous improvement in data operations.
- Stay informed about industry trends, technologies, and best practices in data engineering.

**About You:**
The ideal candidate will have a proven track record of delivering results in a fast-paced environment, demonstrating comfort with change and uncertainty.

**Required:**
- Prior experience in leading a team with final tech sign-off responsibilities.
- Proficiency in PostgreSQL, SQL technologies, and Python programming.
- Understanding of data architecture, pipelines, ELT flows, and agile methodologies.

**Preferred:**
- Experience with dbt (Data Build Tool) and pipeline technologies within AWS.
- Knowledge of data modeling, statistics, and related tools.

**Education Qualifications:**
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

**What We Offer:**
- Opportunity to be part of the World's Leading Digital Venture Builder.
- Exposure to diverse talent within BC and opportunities for continuous learning.
- Work 4 days a week from office.
- Engagement with challenges in a culture that supports learning and development.
- Fun and open atmosphere, enriched with a passion for cricket.
- Generous annual leave, maternity and paternity leaves, and private medical benefits.

At Blenheim Chalcot, we champion diversity, meritocracy, and a culture of inclusion where individual capabilities and potential are highly valued.
Our commitment to recruiting, developing, and advancing individuals based on their skills and talent underscores our belief in the diversity, agility, generosity, and curiosity of our people as the driving force behind our organization's success.
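The exploratory-data-analysis responsibility above — identifying patterns and trends — often begins with a simple outlier screen. This is a hedged Python sketch; the spend figures and the 1.5-sigma threshold are invented for the example (with only five points, a population z-score can never exceed 2, so a 3-sigma rule would flag nothing):

```python
from statistics import mean, pstdev

def flag_outlier_days(daily_spend, z_threshold=1.5):
    """Return days whose spend sits more than z_threshold
    population standard deviations from the mean."""
    values = list(daily_spend.values())
    mu, sigma = mean(values), pstdev(values)
    # Guard against sigma == 0 (a perfectly flat series has no outliers).
    return {day for day, spend in daily_spend.items()
            if sigma and abs(spend - mu) / sigma > z_threshold}

# Invented marketing-spend figures; Friday's burst should be flagged.
spend = {"mon": 100, "tue": 104, "wed": 98, "thu": 101, "fri": 400}
outliers = flag_outlier_days(spend)
```

On real marketing-measurement data the same screen would run per channel over rolling windows, feeding the "patterns and trends" the listing asks the team to surface.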

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You have an exciting opportunity to join YASH Technologies as a Business Analysis Professional. With 7-10 years of experience in Data & Analytics projects, you will provide expertise in MDM data mappings, analysis, and configuration. Working closely with subject matter experts, you will understand functional requirements, lead requirements gathering, and prepare data mapping sheets. Your role will require strong analytical and troubleshooting skills, proficiency in data profiling, and an ability to understand data patterns. In this position, you will need a solid grasp of data models, entity relationships, SQL, ETL, and data warehousing; experience in Snowflake is a plus. Functional testing, publishing metrics, system testing, and UAT for data validation are key aspects of the role. Domain knowledge in Manufacturing, particularly in the BOM subject area, is preferred. Excellent communication skills, both written and verbal, are essential. Your technical expertise should include technical writing, data modeling, data sampling, and experience in Agile Scrum development environments. Creating user stories and product backlogs, attending scrum events, and scheduling calls with business users to understand requirements are also part of the responsibilities. You will provide technical assistance to the development team, work closely with business stakeholders to gather requirements, and build strong relationships. The role calls for proven analytics skills, including data mining, evaluation, and visualization. Strong SQL or Excel skills are required, with an aptitude for learning other analytics tools. Defining and implementing data acquisition and integration logic, as well as analyzing data to answer key questions for stakeholders, are crucial components of the position. At YASH Technologies, you will have the opportunity to create a fulfilling career in an inclusive team environment.
The company offers career-oriented skilling models and continuous learning opportunities. Embracing a Hyperlearning workplace culture, YASH empowers employees through flexible work arrangements, emotional positivity, agile self-determination, transparency, and open collaboration. You will receive all the support needed to achieve business goals, along with stable employment and an ethical corporate culture.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies