
8586 Data Modeling Jobs - Page 23

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 6.0 years

7 - 12 Lacs

Pune

Work from Office

Project Role: Data Strategist
Project Role Description: Develop and implement a data strategy and architecture to improve the management, governance, and security of information. Enable organizations to have a single source of truth for all data, helping them translate information into action to drive efficiency and competitive advantage.
Must-have skills: Qlik Replicate
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Strategist, you will develop and implement a comprehensive data strategy and architecture aimed at enhancing the management, governance, and security of information. Your typical day will involve collaborating with various teams to ensure that the organization has a unified source of truth for all data, enabling effective decision-making and driving operational efficiency. You will engage in strategic discussions, analyze data needs, and work towards translating complex information into actionable insights that contribute to the organization's competitive advantage.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate workshops and training sessions to enhance team capabilities.
- Monitor and evaluate the effectiveness of data strategies and make necessary adjustments.

Professional & Technical Skills:
- Must-have: proficiency in Qlik Replicate.
- Strong analytical skills to interpret complex data sets.
- Experience with data governance frameworks and best practices.
- Familiarity with data integration and transformation processes.
- Ability to communicate technical concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Qlik Replicate.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Data Analysis & Interpretation
Good-to-have skills: NA
Minimum experience: 2 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Perform independently and work towards becoming an SME.
- Participate actively in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have: proficiency in Data Analysis & Interpretation.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data visualization tools to present findings effectively.
- Knowledge of programming languages such as Python or SQL for data manipulation.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Data Analysis & Interpretation.
- This position is based at our Pune office.
- 15 years of full-time education is required.
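Since the listing asks for Python or SQL applied to ETL, the following is a minimal, illustrative sketch of an extract-transform-load step in Python with pandas and SQLite; the file name, columns, and target table are hypothetical, not taken from the posting.

```python
import sqlite3

import pandas as pd

# Extract: read raw records from a hypothetical CSV source.
raw = pd.read_csv("sales.csv")  # assumed columns: order_id, amount, region

# Transform: basic data-quality steps of the kind the posting mentions.
clean = (
    raw.dropna(subset=["order_id"])           # drop rows missing the key
       .drop_duplicates(subset=["order_id"])  # enforce key uniqueness
       .assign(amount=lambda df: df["amount"].fillna(0).round(2))
)

# Load: write the cleaned data into a target table.
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("fact_sales", conn, if_exists="replace", index=False)
```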

Posted 6 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Noida

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Perform independently and work towards becoming an SME.
- Participate actively in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must-have: proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
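As a rough yardstick for the BigQuery proficiency this listing names, here is a small sketch using the official google-cloud-bigquery Python client. The project, dataset, and table are placeholders, and credentials are assumed to be configured in the environment.

```python
from google.cloud import bigquery

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; names below are illustrative.
client = bigquery.Client(project="my-project")

query = """
    SELECT region, SUM(amount) AS total_amount
    FROM `my-project.sales_ds.orders`
    GROUP BY region
    ORDER BY total_amount DESC
"""

# client.query() submits the job; result() waits and returns an iterator of rows.
for row in client.query(query).result():
    print(row["region"], row["total_amount"])
```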

Posted 6 days ago

Apply

3.0 - 8.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: MongoDB
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the development process. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and efficient, contributing to the overall success of the application and the organization.

Roles & Responsibilities:
- Perform independently and work towards becoming an SME.
- Participate actively in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze data requirements.
- Design and implement data models that support business processes and analytics.

Professional & Technical Skills:
- Must-have: proficiency in MongoDB.
- Strong understanding of data modeling concepts and best practices.
- Experience with data integration tools and techniques.
- Familiarity with cloud-based data storage solutions.
- Knowledge of data governance and data quality principles.

Additional Information:
- The candidate should have a minimum of 3 years of experience in MongoDB.
- This position is based in Pune.
- 15 years of full-time education is required.
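To illustrate the kind of document-modeling decision a MongoDB-focused architect makes, here is a hedged sketch using pymongo: order history is embedded in the customer document, a common choice when orders are always read together with the customer. The connection string, collection, and field names are invented for the example.

```python
from pymongo import ASCENDING, MongoClient

# Connection string and names are illustrative.
client = MongoClient("mongodb://localhost:27017")
db = client["appdb"]

# Embedding orders inside the customer document trades update flexibility
# for fast single-document reads -- a classic MongoDB modeling decision.
db.customers.insert_one({
    "name": "Asha",
    "city": "Pune",
    "orders": [
        {"order_id": 1001, "amount": 2499.0},
        {"order_id": 1002, "amount": 799.0},
    ],
})

# Index the fields the application filters on.
db.customers.create_index([("city", ASCENDING)])
```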

Posted 6 days ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Apache Kafka
Good-to-have skills: Data Analytics
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across systems, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must-have: proficiency in Apache Kafka.
- Good to have: experience with Data Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud-based data storage solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Kafka.
- This position is based at our Pune office.
- 15 years of full-time education is required.
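As a small, non-authoritative illustration of the Kafka skill this role centers on, the sketch below publishes a JSON change event with the kafka-python package; the broker address, topic, and payload are hypothetical.

```python
import json

from kafka import KafkaProducer  # kafka-python package

# Broker and topic names are illustrative.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a change event; downstream consumers can integrate it elsewhere.
producer.send("customer-updates", {"customer_id": 42, "city": "Pune"})
producer.flush()  # block until buffered records are delivered
```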

Posted 6 days ago

Apply

6.0 - 9.0 years

12 - 19 Lacs

Hyderabad

Work from Office

Key Skills: SAP Analytics Cloud (SAC), SAP HANA Modeling, SAP BW/4HANA, Native HANA, SDA/SDI, SQL Script, SAC Planning, Agile, Solution Architecture, Data Modeling, Customer Engagement, Communication, Mentoring.

Roles and Responsibilities:
- Architect, build, and support the implementation of SAP Analytics Cloud (SAC) solutions.
- Lead design and blueprinting sessions, acting as a solution architect for SAP SAC implementations.
- Consult clients on the benefits of SAP SAC in comparison to alternative analytics solutions.
- Collaborate with business teams to utilize SAP Analytics Cloud effectively to meet business requirements.
- Perform hands-on SAC development and provide objective analysis and recommendations.
- Contribute to large, complex projects managed in an Agile environment, ensuring timely delivery and quality.
- Engage in mentoring and knowledge sharing across teams to promote best practices and SAC capabilities.
- Ensure adherence to change management procedures and internal development guidelines.

Experience Requirements:
- 6-9 years of experience in SAP Analytics Cloud (SAC) architecture, development, and implementation.
- Strong experience in SAP HANA modeling using HANA Studio, SAP BW/4HANA, SAP BW on HANA, and Native HANA with SDA/SDI.
- Expertise in SAP HANA 2.0 data modeling, including Attribute Views, Analytic Views, and Calculation Views.
- Proficiency in SAP HANA programming using SQL, SQL Script, variables, and input parameters.
- Knowledge of SAP SAC Planning functionality is good to have.
- Experience consulting on SAC benefits and delivering customer-focused solutions.
- Strong analytical, problem-solving, and multitasking abilities with limited supervision.
- Excellent communication and mentoring capabilities, and the ability to deliver in an Agile project setup.

Education: Any Post Graduation, Any Graduation.

Posted 6 days ago

Apply

4.0 - 6.0 years

8 - 16 Lacs

Hyderabad

Work from Office

Key Skills: MDG Technical, ABAP OO, SAP Gateway, SAP CDS, SAP Workflow, Fiori, Agile, Data Modeling, Debugging, S/4HANA, Eclipse ADT, Open SQL, SD/MM/QM knowledge.

Roles and Responsibilities:
- Deliver strong technical value through ABAP and Gateway development.
- Work in an Agile environment to deliver business value consistently and efficiently.
- Design and implement ABAP object-oriented solutions, SAP workflows, and CDS views.
- Analyze and resolve performance issues, applying strong debugging and problem-solving skills.
- Configure and execute Gateway projects and contribute to SAP Fiori architecture and development.
- Understand and apply data modeling concepts in technical solution development.
- Collaborate with cross-functional teams and contribute to functional understanding of areas such as SD, MM, and QM.
- Work with Eclipse ADT tools, drawing on foundational knowledge of HANA/S/4HANA architecture and Fiori.

Experience Requirements:
- 4-6 years of experience in MDG Technical with good exposure to Agile methodologies.
- Strong proficiency in ABAP, including object-oriented programming and performance optimization.
- Hands-on experience with ABAP CDS views and SAP Gateway configuration and execution.
- Experience working with SAP Fiori architecture and Eclipse ADT development tools.
- Strong debugging skills with the ability to independently analyze and resolve complex issues.
- Knowledge of SAP functional modules such as SD, MM, and QM is required.
- Familiarity with SAP HANA/S/4HANA architecture and basic Fiori concepts is expected.

Education: Any Post Graduation, Any Graduation.

Posted 6 days ago

Apply

3.0 - 5.0 years

15 - 20 Lacs

Pune

Work from Office

Requirements: 3+ years in BI operations, with hands-on experience in Power BI, Qlik, and Tableau. Skilled in troubleshooting, data modeling, ETL, and incident resolution. Strong in BI monitoring and cross-functional collaboration.

Responsibilities: Monitor and support BI systems (for example Qlik, Power BI, Tableau, and SAP Analytics Cloud), resolve incidents, optimize workflows, and ensure data reliability across tools.

Posted 6 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 9+ years
Location: Gurgaon (Remote)
Preferred: Immediate joiners

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 6 days ago

Apply

7.0 - 11.0 years

15 - 25 Lacs

Hyderabad

Hybrid

Role Purpose: The Senior Data Engineer will support and enable the Data Architecture and the Data Strategy, supporting solution architecture and engineering for data ingestion and modelling challenges. The role will support the deduplication of enterprise data tools, working with the Lonza Data Governance Board, Digital Council, and IT to drive towards a single Data and Information Architecture. This is a hands-on engineering role with a focus on business and digital transformation. The role will be responsible for managing and maintaining the Data Architecture and the solutions that deliver the platform, including operational support and troubleshooting. The Senior Data Engineer will also manage and coordinate the Data Engineering team members (internal and external) working on the various project implementations, from a day-to-day delivery perspective and with no reporting-line changes.

Experience:
- 7-10 years of experience with digital transformation and data projects.
- Experience in designing, delivering, and managing data infrastructures.
- Proficiency in using cloud services (Azure) for data engineering, storage, and analytics.
- Strong SQL and NoSQL experience.
- Data modelling.
- Hands-on experience developing pipelines and setting up architectures in Azure Fabric.
- Team management experience (internal and external resources).
- Good understanding of data warehousing, data virtualization, and analytics.
- Experience in working with data analysts, data scientists, and BI teams to deliver on data requirements.
- Data catalogue experience is a plus.
- ETL pipeline design is a plus.
- Python development skills are a plus.
- Real-time data ingestion (e.g. Kafka).

Licenses or Certifications: Beneficial: ITIL, PM, CSM, Six Sigma, Lean.

Knowledge:
- Good understanding of integration, ETL, API, and data sharing concepts.
- Understanding or awareness of visualization tools is a plus.
- Knowledge and understanding of relevant legal and regulatory requirements, such as CFR 21 Part 11, the EU General Data Protection Regulation, the Health Insurance Portability and Accountability Act (HIPAA), and the GxP validation process, would be a plus.

Skills:
- The position requires a pragmatic leader with sound knowledge of data, integration, and analytics.
- Excellent written and verbal communication skills, interpersonal and collaborative skills, and the ability to communicate technical concepts to non-technical audiences.
- Excellent analytical skills, the ability to manage and contribute to multiple projects under strict timelines, and the ability to work well in a demanding, dynamic environment and meet overall objectives.
- Project management skills: scheduling and resource management are a plus.
- Ability to motivate cross-functional, interdisciplinary teams to achieve tactical and strategic goals.
- Data catalogue project and team management skills are a plus.
- Strong SAP skills are a plus.
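The posting lists real-time ingestion (e.g. Kafka) alongside hands-on pipeline work. As an illustrative sketch only, this PySpark structured-streaming job reads a Kafka topic and lands it as Parquet; it assumes the spark-sql-kafka connector package is available, and the broker, topic, and paths are invented.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("realtime-ingest").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (broker/topic are placeholders;
# requires the spark-sql-kafka connector package at submit time).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "plant-events")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Land the raw stream for downstream modelling; paths are illustrative.
query = (
    events.writeStream.format("parquet")
    .option("path", "/lake/raw/plant_events")
    .option("checkpointLocation", "/lake/_chk/plant_events")
    .start()
)
query.awaitTermination()
```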

Posted 6 days ago

Apply

12.0 - 17.0 years

12 - 17 Lacs

Pune

Work from Office

Role Overview: The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
- Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
- Architect and optimize strategies for data quality, metadata management, and data stewardship.
- Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
- Establish best practices for data governance, security, and compliance.
- Monitor and troubleshoot MDM environments for performance and reliability.
- Provide technical leadership and guidance to data teams.
- Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills & Responsibilities:
- 12+ years of overall experience, including 10+ years working on DG/MDM projects.
- Strong on data governance concepts.
- Hands-on experience with different DG tools and services.
- Hands-on experience with reference data and taxonomy.
- Strong understanding of data governance, data quality, data profiling, data standards, regulations, and security.
- Match and merge strategy.
- Design and implement the MDM architecture and data models.
- Usage of Spark capabilities.
- Statistics to deduce meaning from vast enterprise-level data.
- Different data visualization means of analyzing huge data sets.
- Good to have: knowledge of Python, R, or Scala.
- Experience with DG on-premise and on-cloud.
- Understanding of MDM and the Customer, Product, and Vendor domains and related artifacts.
- Experience of working on proposals, customer workshops, assessments, etc., is preferred.
- Must have good communication and presentation skills.
- Technology stack: Collibra, IBM MDM, Reltio, InfoSphere.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Management, or a related field.
- Proven experience as a Technical Architect in Data Governance and MDM.
- Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Excellent problem-solving and communication skills.
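The "match and merge strategy" the posting calls out is the heart of MDM survivorship. The toy sketch below, using only Python's standard library, fuzzy-matches two source records and keeps the more complete value as the golden record; the records, threshold, and survivorship rule are all illustrative assumptions, not any vendor's algorithm.

```python
from difflib import SequenceMatcher

# Toy customer records from two source systems (illustrative data).
source_a = {"id": "A-1", "name": "Acme Industries Ltd", "city": "Pune"}
source_b = {"id": "B-7", "name": "ACME Industries", "city": "Pune"}

def similarity(a: str, b: str) -> float:
    """Fuzzy-match score between two strings, in the range 0..1."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

score = similarity(source_a["name"], source_b["name"])
if score > 0.8 and source_a["city"] == source_b["city"]:
    # Survivorship rule: keep the longer (more complete) name.
    golden = {
        "name": max(source_a["name"], source_b["name"], key=len),
        "city": source_a["city"],
        "source_ids": [source_a["id"], source_b["id"]],
    }
    print("merged golden record:", golden)
```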

Posted 6 days ago

Apply

4.0 - 9.0 years

18 - 20 Lacs

Hyderabad

Hybrid

Position Job Title: Business Intelligence Developer
Reports To: Business Intelligence Manager

Primary Purpose: The BI Developer applies business and advanced technical expertise to meet business data and reporting needs. The position supports business planning by compiling, visualizing, and analyzing business and statistical data from UCW's information systems. The BI Developer liaises with various stakeholders across the university to provide them with the data, reporting, and analysis required to make informed, data-driven decisions, and will work on projects that have a significant impact on the student, faculty, and staff experience.

Specific Responsibilities: The BI Developer will at various times be responsible for the following, as well as other related duties assigned to support the business objectives and purpose of the Company:
- Design relational databases to support business enterprise applications, and perform physical data modeling according to business requirements
- Gather requirements from various business departments at UCW and transform them into self-serve reports/dashboards for the various business units using Power BI
- Understand ad-hoc data requirements and convert them into reporting deliverables
- Contribute to driving reporting automation and simplification to free up time for in-depth analyses
- Collaborate with internal and external team members, including system architects, software developers, database administrators, and design analysts, to find creative and innovative approaches to enrich business data
- Provide business and technical expertise for the analytics process, tools, and applications for the University
- Identify opportunities that improve data accuracy and the efficiency of our processes
- Contribute to developing training materials, documenting processes, and delivering sessions
- Develop strategies for data modeling, design, transport, and implementation to meet requirements for metadata management, operational data stores, and ELT/ETL environments
- Create and test data models for a variety of business data, applications, database structures, and metadata tables to meet operational goals for performance and efficiency
- Research modern technologies, data modeling methods, and information management systems, and recommend changes to company data architectures
- Contribute to a team environment where all team members consistently experience a sense of belonging and inclusion

Position Requirements

Competencies:
- Demonstrated experience in creating complex data models and developing insightful reports and dashboards using Microsoft Power BI
- Advanced skills in using DAX queries for Power BI
- Connecting data sources, importing data, and cleaning and transforming data for business intelligence
- Knowledge of database management principles and experience working with SQL/MySQL databases
- Ability to implement row-level security on data, along with an understanding of application security layer models in Power BI
- Ability to translate business requirements into informative reports and visuals
- A good sense of design that helps communicate data using visually compelling reports and dashboards
- Experience in ETL (extract, transform, and load) processes an asset
- Experience in the development of a data warehouse an asset
- Data analysis and visualization skills using Python and/or R an asset
- Strong analytical, problem-solving, and data analysis skills
- Ability to ensure organizational data privacy and confidentiality
- Understanding of statistical analysis techniques such as correlation and regression
- Demonstrated ability to collect data from a variety of sources, synthesize data, produce reports, and make recommendations
- Ability to manage multiple concurrent tasks and competing demands

Education and Experience:
- Bachelor's or Master's degree in Business, Information Systems, Computer Science, or a related discipline
- Demonstrated experience in using Power BI to create reports, dashboards, and self-serve analytics
- 3+ years of experience in data-specific roles, especially in the use of Power BI, Excel, and SQL

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

maharashtra

On-site

We are looking for a Lead Data Engineer with at least 7 years of experience who is proficient in Python, PySpark, Airflow (batch jobs), HPCC, and ECL. The role involves driving complex data solutions across various teams. Practical knowledge of data modeling and test-driven development, along with familiarity with Agile/Waterfall methodologies, is essential. Responsibilities include leading projects, working collaboratively with different teams, and transforming business requirements into scalable data solutions following industry best practices in managed services or staff augmentation environments.
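Given the Airflow (batch jobs) requirement, here is a minimal, hedged sketch of a two-task batch DAG, assuming Airflow 2.4+; the DAG id, schedule, and task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pulling batch files...")  # placeholder task body

def transform() -> None:
    print("running the PySpark job...")  # placeholder task body

# DAG id, schedule, and task split are illustrative (Airflow 2.4+ syntax).
with DAG(
    dag_id="daily_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # run transform only after extract succeeds
```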

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Analytics Manager specializing in Power BI, Python, and Tableau within the Insurance Domain, you will play a crucial role in designing, developing, and implementing Power BI dashboards. Your expertise in Power BI, Python, Tableau, and SQL is essential for this role. You will be responsible for leading, mentoring, and developing a team of data analysts and data scientists.

Your key responsibilities will include providing strategic direction for analytical projects, defining and implementing the company's data analytics strategy, and conducting complex data analysis to identify trends and patterns. You will oversee the development of interactive dashboards, reports, and visualizations to make data insights easily accessible to stakeholders. Ensuring data integrity and consistency across systems, collaborating with cross-functional teams, and staying current with the latest data analytics trends and technologies are also important aspects of this role. Additionally, you will lead data-driven projects from initiation to execution, managing timelines, resources, and risks effectively.

To be successful in this role, you should have a Bachelor's degree in Data Science, Statistics, Computer Science, Engineering, or a related field, with at least 10 years of experience in data analysis and 2 years in a managerial or leadership position. Proficiency in data analysis and visualization tools such as SQL, Python, R, Tableau, and Power BI is required. Strong knowledge of data modeling, ETL processes, and database management, along with exceptional problem-solving and critical thinking skills, is essential. Effective communication of complex technical concepts to non-technical stakeholders, proven experience in managing and growing a team of data professionals, strong project management skills, and domain knowledge in insurance will be advantageous for this role.

If you are looking for a challenging opportunity to lead data analytics projects, collaborate with diverse teams, and drive business insights within the Insurance Domain, this role is ideal for you.
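For the "identify trends and patterns" responsibility, a hedged pandas sketch follows: a rolling mean to expose trend and month-over-month growth to flag acceleration. The claims figures are invented for illustration.

```python
import pandas as pd

# Illustrative monthly claim counts for an insurance line of business.
claims = pd.Series(
    [120, 135, 128, 150, 162, 158, 171, 180, 175, 190, 201, 197],
    index=pd.date_range("2024-01-01", periods=12, freq="MS"),
)

# A 3-month rolling mean smooths noise and exposes the underlying trend.
trend = claims.rolling(window=3).mean()

# Month-over-month growth highlights acceleration or slowdown.
growth = claims.pct_change().mul(100).round(1)

print(pd.DataFrame({"claims": claims, "trend_3m": trend, "mom_pct": growth}))
```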

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

You should have at least 4 years of experience in Power BI and Tableau, specializing in data modeling for reporting and data warehouses. Your expertise should include a thorough understanding of the BI process and excellent communication skills.

Your responsibilities will involve Tableau Server management, including installation, configuration, and administration to maintain consistent availability and performance. You must showcase your ability to design, develop, and optimize data models for large datasets and complex reporting requirements. Strong analytical and debugging skills are essential to identify, analyze, and resolve issues within Power BI reports, SQL code, and data for accurate and efficient performance.

Proficiency in DAX and Power Query, along with advanced knowledge of data modeling concepts, is required. Additionally, you should possess strong SQL skills for querying, troubleshooting, and data manipulation. Security implementation is crucial, as you will be responsible for managing user permissions and roles to ensure data security and compliance with organizational policies. A good understanding of ETL processes and in-depth knowledge of Power BI Service, Tableau Server, and Tableau Desktop are expected. Your familiarity with workspaces, datasets, dataflows, and security configurations will be beneficial in fulfilling your role effectively.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data and Solution Architect at our company, you will play a crucial role in requirements definition and analysis, and in designing logical and physical data models across model types such as dimensional, NoSQL, and graph data models. You will lead data discovery discussions with the business in Joint Application Design (JAD) sessions and translate business requirements into logical and physical data modeling solutions. It will be your responsibility to conduct data model reviews with project team members and capture technical metadata using data modeling tools.

Your expertise will be essential in ensuring that database designs efficiently support business intelligence (BI) and end-user requirements. You will collaborate closely with ETL/data engineering teams to create data process pipelines for data ingestion and transformation, and work with data architects on data model management, documentation, and version control. Staying updated with industry trends and standards will be crucial in driving continual improvement and enhancement of existing systems.

To excel in this role, you must possess strong data analysis and data profiling skills. Your experience in conceptual, logical, and physical data modeling for Very Large Database (VLDB) data warehouses and graph DBs will be highly valuable. Hands-on experience with modeling tools like ERWIN or other industry-standard tools is required, as is proficiency in both normalized and dimensional modeling disciplines and techniques. A minimum of 3 years' experience with Oracle Database, along with hands-on experience in Oracle SQL, PL/SQL, or Cypher, is expected. Exposure to tools such as Databricks Spark, Delta technologies, Informatica ETL, and other industry-leading tools will be beneficial. Good knowledge of or experience with AWS Redshift and graph DB design and management is desired, as is working knowledge of AWS Cloud technologies, particularly services like VPC, EC2, S3, DMS, and Glue.

You should hold a Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or equivalent experience). Excellent verbal and written communication skills are necessary, including the ability to describe complex technical concepts in relatable terms. Your ability to manage and prioritize multiple workstreams confidently and make decisions about prioritization will be crucial. A data-driven mentality, self-motivation, responsibility, conscientiousness, and a detail-oriented approach are highly valued.

In terms of education and experience, a Bachelor's degree in Computer Science, Engineering, or a relevant field, along with 3+ years of experience as a Data and Solution Architect supporting enterprise data and integration applications (or a similar role for large-scale enterprise solutions), is required. You should have at least 3 years of experience in big data infrastructure and tuning experience in the lakehouse data ecosystem, including data lakes, data warehouses, and graph DBs. AWS Solutions Architect Professional certification is advantageous. Extensive experience in data analysis on critical enterprise systems like SAP, E1, mainframe ERP, SFDC, Adobe Platform, and eCommerce systems is preferred.

If you are someone who thrives in a dynamic environment and enjoys collaborating with enthusiastic individuals, this role is perfect for you. Join our team and be a part of our exciting journey towards innovation and excellence!
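Since the posting pairs graph data modeling with Cypher, here is a small, non-authoritative sketch using the official Neo4j Python driver (v5 API); the URI, credentials, labels, and data are assumptions for illustration.

```python
from neo4j import GraphDatabase  # official Neo4j Python driver (v5 API)

# URI and credentials are illustrative.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

def load_order(tx, customer: str, product: str) -> None:
    # MERGE keeps nodes unique; the relationship models the purchase.
    tx.run(
        "MERGE (c:Customer {name: $customer}) "
        "MERGE (p:Product {name: $product}) "
        "MERGE (c)-[:PURCHASED]->(p)",
        customer=customer,
        product=product,
    )

with driver.session() as session:
    session.execute_write(load_order, "Asha", "Laptop")
driver.close()
```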

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You should have strong SQL knowledge, including query performance and tuning and an understanding of database structure, as well as knowledge of data modeling principles. Familiarity with at least one ETL tool (such as SSIS, Talend, or Ab Initio) is required. Your analytical skills should be strong, with attention to detail; you should be passionate about sound data structures and problem solving, and able to quickly grasp new data tools and concepts. Experience in data warehouse environments with knowledge of data architecture patterns is essential.

You will be responsible for preparing and/or directing the creation of system test plans, test criteria, and test data, and for determining system design and preparing work estimates for developments or changes across multiple work efforts. Creating or updating system documents such as BRDs, functional documents, and ER diagrams will also be part of your responsibilities. Experience in handling and supporting mid-scale teams is required, and excellent communication skills are necessary to manage the team effectively and to understand and resolve resource issues promptly to enhance team productivity.

Virtusa values teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team of 27,000 individuals who are committed to your growth. You will have the opportunity to work on exciting projects, utilize state-of-the-art technologies, and advance your career. At Virtusa, collaboration and a team-oriented environment are highly valued. We provide a dynamic platform for great minds to exchange ideas, innovate, and strive for excellence.
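As a hedged illustration of the query-tuning skill described above, this self-contained sketch uses Python's built-in sqlite3 module to compare the query plan before and after adding an index; the table and data are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east", 10.0), ("west", 20.0), ("east", 15.0)],
)

sql = "SELECT SUM(amount) FROM orders WHERE region = ?"

# Before indexing: the planner reports a full table scan.
print(conn.execute(f"EXPLAIN QUERY PLAN {sql}", ("east",)).fetchall())

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_region ON orders(region)")
print(conn.execute(f"EXPLAIN QUERY PLAN {sql}", ("east",)).fetchall())
```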

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

pune, maharashtra

On-site

Some careers have more impact than others. If you're looking for further opportunities to develop your career, take the next step in fulfilling your potential right here at HSBC.

HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Data Technology Lead, Data Catalogue & Data Quality. The role is based in Pune/Hyderabad.

The Opportunity: We are seeking a skilled and driven Data Technology Lead to join the CTO Data Technology team, focusing on enterprise Data Catalogue and Data Quality solutions. This role will be responsible for designing, delivering, and managing modern data control capabilities across the bank's global footprint. This leadership role is crucial in driving adoption of strategic metadata and quality platforms that enable data governance, regulatory compliance, and operational efficiency. The successful candidate will collaborate closely with Group Chief Data Offices (CDOs), Cybersecurity, Risk, Audit, and business-aligned technology teams to provide technical solutions.

What you'll do:
- Lead the design, engineering, and global rollout of strategic Data Catalogue and Data Quality solutions.
- Partner with Group Data Strategy and CDOs to define control requirements, metadata models, and quality rules frameworks.
- Own the platform roadmap and architecture for metadata ingestion, automated data discovery, lineage tracing, and data quality scoring.
- Ensure interoperability with the broader data platform architecture.
- Establish automation pipelines for data standards validation, issue detection, and remediation orchestration at scale.
- Drive adoption across thousands of Systems of Record (SoRs), Reference Data Masters, and Data Platforms.
- Ensure adherence to global and local data privacy, residency, and regulatory requirements (e.g., GDPR, BCBS 239, PIPL).
- Deliver service-level management, system resilience, and operational excellence for critical metadata and quality services.
- Drive continuous improvement and technology modernization, including migration to cloud-native or hybrid architectures.
- Support internal audits, regulator reviews, and compliance reporting related to data controls.
- Champion engineering best practices and foster a DevSecOps and agile delivery culture within the team.

Requirements:
- 15+ years of relevant experience in data technology or data governance roles, with leadership experience in large-scale metadata and data quality programs.
- Deep expertise in enterprise metadata management (EMM), data lineage tooling, data quality frameworks, and cataloging systems (e.g., Collibra, PyDQ, graph databases).
- Strong architectural understanding of distributed data platforms, cloud services (e.g., AWS, GCP, Azure), and real-time data pipelines.
- Experience in regulatory data compliance (e.g., CCAR, BCBS 239), including working with second-line risk and audit teams.
- Strong understanding of data modeling, data classification, data lifecycle management, and data integration technologies.
- Leadership experience with matrixed global teams and large-scale platform delivery.
- Excellent stakeholder engagement skills, with the ability to communicate with both senior business and technical stakeholders.

You'll achieve more when you join HSBC. [HSBC Careers Website](www.hsbc.com/careers)

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
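To make the "data quality scoring" responsibility concrete, here is a toy pandas sketch that scores a table against a few declarative rules; the records and rules are invented for illustration and are not HSBC's framework.

```python
import pandas as pd

# Illustrative extract from a system of record.
df = pd.DataFrame({
    "account_id": ["A1", "A2", "A2", None],
    "country": ["GB", "IN", "IN", "HK"],
    "balance": [100.0, -5.0, 250.0, 80.0],
})

# Each rule returns the fraction of rows passing it (a simple quality score).
rules = {
    "account_id_not_null": df["account_id"].notna().mean(),
    "account_id_unique": 1 - df["account_id"].duplicated().mean(),
    "balance_non_negative": (df["balance"] >= 0).mean(),
}

for rule, score in rules.items():
    print(f"{rule}: {score:.0%}")
```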

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

noida, uttar pradesh

On-site

At 3CLogic, we strongly believe in the value of our team members and understand that success is achieved through teamwork. As a fast-growing SaaS startup based in Rockville, Maryland, we have a diverse team working both locally and remotely. We prioritize talent and potential over geographical location, creating a hybrid remote and in-person culture that fosters personal and professional growth.

We invite you to consider joining our vibrant team at 3CLogic, where every day is filled with exciting opportunities and colorful challenges. Our dynamic environment thrives on innovation and creativity, encouraging all team members to bring their unique perspectives and ideas to the table. We are a team of entrepreneurs at heart, driven by the belief that embracing individuality leads to limitless possibilities. If you are intrigued by the prospect of working in a technicolor world rather than a monochromatic one, we would love to connect with you. Let us show you firsthand what sets us apart and help you discover where you can excel within our organization.

**What we do:** 3CLogic specializes in providing voice AI, Contact Center, and SMS solutions to enterprise and Global 2000 organizations worldwide. Our clients include renowned names such as 7-Eleven, Swiss Railways, Regeneron, Hyatt Hotels, and more. By leveraging our cutting-edge technology, organizations enhance their customer service quality, empower their agents, reduce operational costs, and streamline their communication processes. We transform the experience of seeking assistance into a positive and efficient interaction for all parties involved.

**General Job Details:**

**Position Name:** Senior Product Manager - Contact Center Analytics & CX Integrations
**Experience:** 7-10 years (with 5-6 years in CCaaS/CRM/CX Product Management)
**Job Type:** Full-Time
**Location:** Sector-142, Noida (Hybrid Working Model)

**Position Summary:** We are in search of exceptional product managers who are driven to exceed standards rather than just meet them. As a committed and customer-centric individual focused on continuous improvement, you will play a crucial role in shaping the success of one or more products within the 3CLogic Contact Center suite. Your responsibilities will include defining the product vision, creating roadmaps, prioritizing requirements, and collaborating with various teams to ensure business objectives and customer satisfaction goals are achieved.

**Primary Focus - Analytics, Dashboards, Reporting & APIs:** This position centers on productizing data, developing customer-facing dashboards, enabling KPI tracking, and delivering self-service analytics features. You will be responsible for overseeing the development lifecycle of reporting and analytics modules integrated with major CRMs and internal systems.

**Key Responsibilities:**
- Own and enhance the product roadmap for reporting, analytics, dashboards, and KPI frameworks.
- Drive the execution of self-service analytics and API-driven reporting capabilities, translating customer and business needs into actionable requirements.
- Collaborate with engineering and data teams to design, validate, and scale data models and pipelines.
- Design intuitive dashboards and visualization experiences for business users and administrators.
- Define and monitor success metrics for all reporting features, engaging with internal stakeholders and customers to refine use cases.
- Serve as a subject matter expert on reporting APIs, real-time and historical analytics, and Voice of Customer metrics.

**Qualifications:**
- 8-9 years of total experience, with 5-6 years as a Product Manager.
- Strong background in analytics and either CX or CRM; CCaaS/telephony experience is advantageous.
- Proficiency in Contact Center architectures, including IVR, ACD, omnichannel routing, and queuing strategies.
- Familiarity with analytics platforms such as Power BI, Tableau, or Looker, along with hands-on knowledge of SQL and data modeling.
- Excellent communication skills, stakeholder management abilities, and agility in working with cross-functional teams.

**Preferred Skills:**
- Exposure to enterprise platforms like ServiceNow and Salesforce.
- Experience in delivering Voice of Customer analytics or customer engagement dashboards.
- Familiarity with enterprise SaaS delivery and reporting frameworks.

**Benefits:**
- Flexible working hours
- Hybrid working style
- Personal accident insurance
- Health insurance for self, spouse, and two children
- 5-day work week

If you are a dedicated and forward-thinking product manager with a passion for analytics and customer experience, we encourage you to explore this exciting opportunity with 3CLogic. Join us in shaping the future of Contact Center solutions and driving meaningful business outcomes through innovation and collaboration.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As an Associate Data Architect at Quantiphi, you will be part of a dynamic team that thrives on innovation and growth. Your role will involve designing and delivering big data pipelines for structured and unstructured data across diverse geographies, with a particular focus on helping healthcare organizations achieve their business objectives through data ingestion technologies, cloud services, and DevOps practices.

Your responsibilities will include collaborating with cloud engineers and clients to address large-scale data challenges by creating tools for migration, storage, and processing on Google Cloud. You will be instrumental in crafting cloud migration strategies for both cloud-based and on-premise applications, as well as diagnosing and resolving complex issues within distributed systems to enhance efficiency at scale.

In this role, you will have the opportunity to design and implement cutting-edge solutions for data storage and computation for various clients. You will work closely with experts from different domains, such as cloud engineering, software engineering, and ML engineering, to develop platforms and applications that align with evolving trends in the healthcare sector, including digital diagnosis, AI marketplaces, and software as a medical product. Effective communication with cross-functional teams, including Infrastructure, Network, Engineering, DevOps, SiteOps, and cloud customers, will be essential to drive successful project outcomes.

Additionally, you will play a key role in building advanced automation tools, monitoring solutions, and data operations frameworks across multiple cloud environments to streamline processes and enhance operational efficiency. A strong understanding of data modeling and governance principles will be crucial, enabling you to contribute meaningfully to the development of scalable and sustainable data architectures.

If you thrive in a fast-paced environment that values innovation, collaboration, and continuous learning, then a career as an Associate Data Architect at Quantiphi is the perfect fit for you. Join us and be part of a team of dedicated professionals who are passionate about driving positive change through technology and teamwork.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

You will be responsible for conducting data analysis to extract actionable insights, exploring datasets to uncover patterns and anomalies, analyzing historical data for trend identification and forecasting, investigating data discrepancies, providing user training and support on data analysis tools, communicating findings through compelling visualizations, supporting data projects, and ensuring data accuracy and integrity.

Your main responsibilities will include:
1. Conducting data analysis to extract actionable insights and drive decision-making.
2. Exploring and visualizing datasets to uncover patterns, trends, and anomalies.
3. Analyzing historical data to identify trends and develop forecasts for future performance.
4. Investigating and identifying root causes of issues or discrepancies in data.
5. Providing training and support to users on data analysis tools and techniques.
6. Communicating findings and insights through compelling data visualizations and narratives.
7. Supporting data-related projects by providing analytical expertise and insights.
8. Ensuring data accuracy, completeness, and integrity through quality assurance processes.

You should have a Bachelor's degree or equivalent in Engineering, Computer Science, MIS, Mathematics, Statistics, or a similar discipline; a Master's degree or MBA is preferred. The ideal candidate will have at least five years of relevant work experience in data analysis.

Required Knowledge, Skills, and Abilities:
- Fluency in English
- Analytical skills
- Accuracy and attention to detail
- Numerical skills
- Planning and organizing skills
- Presentation skills
- Statistical knowledge
- Data modeling and visualization skills

FedEx is an equal opportunity/affirmative action employer committed to a diverse, equitable, and inclusive workforce. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy, physical or mental disability, or any other characteristic protected by applicable laws.

FedEx, one of the world's largest express transportation companies, delivers for customers worldwide with outstanding transportation and business solutions. The company's success is attributed to its outstanding team of FedEx team members, who strive to make every FedEx experience outstanding. The People-Service-Profit (P-S-P) philosophy describes the principles that guide every decision, policy, and activity at FedEx. By taking care of its people, FedEx ensures impeccable service for its customers, leading to the profitability that secures the company's future. Through the P-S-P philosophy, FedEx invests back into the business and its people, creating a work environment that encourages innovation and the highest quality of service for customers. FedEx's culture and values have been key to its success and growth since the early 1970s. While other companies may replicate systems and processes, FedEx's unique culture is a differentiating factor that contributes to its success in today's global marketplace.
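For the correlation and regression techniques this posting expects, a brief SciPy sketch follows; the shipment figures are invented for illustration.

```python
from scipy.stats import linregress, pearsonr

# Illustrative data: weekly shipment volume vs. average transit time (hours).
volume = [120, 150, 170, 200, 220, 260, 300]
transit = [30.1, 31.0, 31.8, 33.2, 33.9, 35.5, 37.0]

r, p_value = pearsonr(volume, transit)
print(f"correlation r={r:.3f} (p={p_value:.4f})")

fit = linregress(volume, transit)
print(f"transit ~= {fit.slope:.4f} * volume + {fit.intercept:.2f}")

# Use the fitted line to forecast transit time at a new volume level.
print("forecast at volume 320:", round(fit.slope * 320 + fit.intercept, 1))
```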

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You are a strategic thinker passionate about driving solutions in financial analysis, and you have found the right team. As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. Leading the development of robust data models to ensure data integrity and consistency, you will oversee the implementation of ETL processes to populate data marts with accurate and timely data. Your role will involve optimizing data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers.

Responsibilities:
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers if needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.

Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in data lakes, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office in the evening shift, from 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

The Associate Manager, Salesforce Data Cloud will play a crucial role in leveraging Salesforce Data Cloud to transform how the organization uses customer data. This position, located in Hyderabad within the Data Cloud Business Enablement Team, focuses on building, managing, and optimizing the data unification strategy to empower business intelligence, marketing automation, and customer experience initiatives.

Key Responsibilities:
- Managing data models within Salesforce Data Cloud to ensure optimal data harmonization across multiple sources.
- Maintaining data streams from various platforms into Data Cloud, such as CRM, SFMC, MCP, Snowflake, and third-party applications.
- Developing and optimizing SQL queries to transform raw data into actionable insights.
- Building and maintaining data tables, calculated insights, and segments for organization-wide use.
- Collaborating with marketing teams to translate business requirements into effective data solutions.
- Monitoring data quality and implementing processes to ensure accuracy and reliability.
- Creating documentation for data models, processes, and best practices.
- Providing training and support to business users on leveraging Data Cloud capabilities.

Essential Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Salesforce Data Cloud certification preferred.
- 3+ years of experience working with Salesforce platforms.
- Previous work with Customer Data Platforms (CDPs).
- Experience with Tableau CRM or other visualization tools.
- Background in marketing technology or customer experience initiatives.
- Salesforce Administrator or Developer certification.
- Familiarity with Agile ways of working, Jira, and Confluence.

Desired Requirements:
- Advanced knowledge of Salesforce Data Cloud architecture and capabilities.
- Strong SQL skills for data transformation and query optimization.
- Experience with ETL processes and data integration patterns.
- Understanding of data modeling principles and best practices.
- Experience with Salesforce Marketing Cloud, MCI, and MCP.
- Familiarity with APIs and data integration techniques.
- Knowledge of data privacy regulations and compliance requirements (GDPR, CCPA, etc.).
- Demonstrated experience with data analysis and business intelligence tools.
- Strong problem-solving abilities and analytical thinking.
- Excellent communication skills to translate technical concepts for business users.
- Ability to work collaboratively in cross-functional teams.
- Familiarity with and adaptability to new-generation technologies and trends (Gen AI and Agentic AI) is an added advantage.

Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities served.

Accessibility and Accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If you need a reasonable accommodation due to a medical condition or disability, please email diversityandincl.india@novartis.com with details.

Join our Novartis Network: If this role isn't right for you, consider signing up for our talent community to stay connected and learn about suitable career opportunities as soon as they arise.

Benefits and Rewards: Read our handbook to discover how Novartis supports personal and professional growth for its employees.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

navi mumbai, maharashtra

On-site

You should have 6-8 years of experience, with a deep understanding of the Spark framework and hands-on experience in Spark SQL and PySpark. Your expertise should include Python programming and familiarity with common Python libraries. Strong analytical skills are essential, especially in database management, including writing complex queries, query optimization, debugging, user-defined functions, views, and indexes. Your problem-solving abilities will be crucial in designing, implementing, and maintaining efficient data models and pipelines. Experience with big data technologies is a must, while familiarity with any ETL tool would be advantageous.

As part of your responsibilities, you will work on projects to deliver, review, and design PySpark and Spark SQL-based data engineering analytics solutions. Your tasks will involve writing clean, efficient, reusable, testable, and scalable Python logic for analytical solutions. Emphasis will be on building solutions for data cleaning, data scraping, and exploratory data analysis, ensuring compatibility with any BI tool. Collaboration with data analysts and BI developers to provide clean, processed data will be essential.

You will design data processing pipelines using ETL techniques, develop and deliver complex requirements to achieve business goals, and work with unstructured, structured, and semi-structured data and their respective databases. Effective coordination with internal engineering and development teams to understand requirements and develop solutions is critical, as is communication with stakeholders to grasp business logic and provide optimal data engineering solutions. It is important to adhere to best coding practices and standards throughout your work.
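As a rough gauge of the PySpark and Spark SQL work described above, the hedged sketch below cleans a small DataFrame and queries it through Spark SQL; the data and names are illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleaning-demo").getOrCreate()

# Illustrative raw input; in practice this would come from files or tables.
raw = spark.createDataFrame(
    [(1, " Pune ", 100.0), (2, None, 55.0), (2, "Mumbai", 55.0)],
    ["id", "city", "amount"],
)

# Data cleaning: trim strings, fill missing values, drop duplicate keys.
clean = (
    raw.withColumn("city", F.trim(F.col("city")))
       .fillna({"city": "unknown"})
       .dropDuplicates(["id"])
)

# Expose the cleaned data to Spark SQL for analysts and BI tools.
clean.createOrReplaceTempView("orders")
spark.sql("SELECT city, SUM(amount) AS total FROM orders GROUP BY city").show()
```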

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

hyderabad, telangana

On-site

As a Snowflake Administrator, you will be responsible for managing and maintaining the Snowflake environment, including warehouses, databases, and users. You will monitor and manage costs associated with Snowflake usage, provision databases and warehouses as needed, and integrate Snowflake with MSK (Kafka) as a mandatory requirement. Additionally, you will implement and manage Role-Based Access Control (RBAC) to ensure proper access controls, integrate Snowflake with AWS services such as S3, Lambda, and API Gateway, and with Cortex AI. You will configure and manage network policies to ensure secure access to Snowflake resources and develop Python scripts to automate administrative tasks.

To be successful in this role, you must have a minimum of 3-4 years of experience in Snowflake administration, experience with MSK integration, at least 1 year of DevOps experience, a strong understanding of RBAC concepts, and proficiency in cost management. Experience with Cortex AI integration, Snowflake Data Warehouse, AWS services, and Python scripting for automating Snowflake administrative tasks would be nice to have.

If you are an immediate joiner with the required experience and skills, we look forward to receiving your application for the Snowflake Administrator position in Hitech City, Hyderabad. Work timings are flexible within US EST-aligned hours (3 PM-12 AM IST), working from the office.
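Since the role combines RBAC with Python automation, here is a minimal, non-authoritative sketch using the snowflake-connector-python package; the account, warehouse, database, role, and user names are placeholders.

```python
import snowflake.connector  # snowflake-connector-python package

# Account and credentials are placeholders.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="admin",
    password="...",
    role="SECURITYADMIN",
)
cur = conn.cursor()

# Role-based access control: create a role, grant privileges, assign a user.
cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_ROLE")
cur.execute("GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE ANALYST_ROLE")
cur.execute("GRANT USAGE ON DATABASE SALES_DB TO ROLE ANALYST_ROLE")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA SALES_DB.PUBLIC TO ROLE ANALYST_ROLE")
cur.execute("GRANT ROLE ANALYST_ROLE TO USER ASHA")

conn.close()
```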

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ portals in one click. Download the mobile app to instantly access job listings, apply easily, and track applications.
