Home
Jobs

121 Data Modelling Jobs - Page 4

JobPe aggregates results for easy access; you apply directly on the original job portal.

6.0 - 10.0 years

16 - 25 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
- Design layered data models (e.g., staging, intermediate, and mart layers / medallion architecture) aligned with dbt best practices.
- Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
- Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring.
- Apply advanced dbt capabilities including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.
- Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
- Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
- Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
- Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
- Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.

Required Qualifications: 5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.

Technical Skills:
- Cloud data warehouse & transformation stack: expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management; experience in dbt development: modular model design, macros, tests, documentation, and version control using Git.
- Orchestration and integration: proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory; comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
- Data modelling and architecture: dimensional modelling (star/snowflake schemas) and slowly changing dimensions; knowledge of modern data warehousing principles; experience implementing Medallion Architecture (Bronze/Silver/Gold layers); experience working with Parquet, JSON, CSV, or other data formats.
- Programming languages: Python for data transformation, notebook development, and automation; strong grasp of SQL for querying and performance tuning; Jinja (nice to have) for advanced dbt development.
- Data engineering & analytical skills: ETL/ELT pipeline design and optimization; exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have); exposure to data quality and validation frameworks.
- Security & governance: experience implementing data quality checks using dbt tests; data encryption, secure key management, and security best practices for Snowflake and dbt.

Soft Skills & Leadership:
- Ability to thrive in client-facing roles with competing/changing priorities and fast-paced delivery cycles.
- Stakeholder communication: collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
- Project ownership: end-to-end delivery including design, implementation, and monitoring.
- Mentorship: guide junior engineers, establish best practices, and build new skills in the team.
- Agile practices: work in sprints, participate in scrum ceremonies and story estimation.

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
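As a rough illustration of the layered ELT pattern this listing describes (a minimal sketch, not this employer's actual pipeline), the snippet below runs a staging-to-mart transformation plus a dbt-style not-null test against Snowflake using the snowflake-connector-python package. All table, schema, and credential names are hypothetical.

```python
# Minimal sketch of a layered ELT step on Snowflake (staging -> mart)
# with a dbt-style not-null test. All identifiers are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical credentials
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
cur = conn.cursor()

# Staging layer: light cleanup over the raw landing table.
cur.execute("""
    CREATE OR REPLACE TABLE STAGING.STG_ORDERS AS
    SELECT ORDER_ID, CUSTOMER_ID, TRY_TO_DATE(ORDER_DATE) AS ORDER_DATE
    FROM RAW.ORDERS
""")

# Mart layer: business-ready aggregate.
cur.execute("""
    CREATE OR REPLACE TABLE MART.FCT_DAILY_ORDERS AS
    SELECT ORDER_DATE, COUNT(*) AS ORDER_COUNT
    FROM STAGING.STG_ORDERS
    GROUP BY ORDER_DATE
""")

# dbt-style "not_null" test: fail the run if any key is missing.
cur.execute("SELECT COUNT(*) FROM STAGING.STG_ORDERS WHERE ORDER_ID IS NULL")
nulls = cur.fetchone()[0]
assert nulls == 0, f"not_null test failed: {nulls} NULL ORDER_IDs"
conn.close()
```

In a real dbt project these steps would live in versioned SQL models with tests declared in `schema.yml`, rather than an ad-hoc script.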

Posted 3 weeks ago

Apply

4.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity. This position is part of the Sapiens Digital (Data Suite) division.

Designation: Senior Developer

Must-Have Skills: 4-5 years of experience in Databricks, PySpark, SQL, and data warehousing.

General Job Description: A seasoned, experienced professional with a full understanding of the area of specialization who resolves a wide range of issues in creative ways. This is the fully qualified, career-oriented, journey-level position.

Prerequisite Knowledge & Experience:
- B.E. (or equivalent).
- Extensive hands-on experience in Java development, including strong knowledge of core Java concepts, data structures, and algorithms.
- In-depth understanding of distributed data processing frameworks like Apache Spark, with specific expertise in Databricks.
- Proficiency in designing and building data pipelines for data extraction, transformation, and loading (ETL).
- Familiarity with big data technologies and concepts, including Hadoop, Hive, and HDFS.
- Proven experience in building scalable and high-performance data solutions for large datasets.
- Solid understanding of data modelling, database design, and data warehousing concepts.
- Knowledge of both SQL and NoSQL databases, and the ability to choose the right database type based on project requirements.
- Demonstrated ability to write clean, maintainable, and efficient Java code for data processing and integration tasks.
- Experience with Java libraries commonly used in data engineering, such as Apache Kafka for streaming data.
- Extensive hands-on experience with Databricks for big data processing and analytics, including setting up and configuring Databricks clusters and optimizing their performance.
- Proficiency in Spark DataFrame and Spark SQL for data manipulation and querying.
- Understanding of data architecture principles and experience in designing data solutions that meet scalability and reliability requirements.
- Familiarity with cloud-based data platforms like AWS or Azure.

Problem-Solving and Analytical Skills: strong problem-solving skills, the ability to analyse complex data-related issues, and the capacity to propose innovative and efficient solutions to data engineering challenges. Excellent communication skills, both verbal and written, with the ability to convey technical concepts to non-technical stakeholders effectively. Experience working collaboratively in cross-functional teams, including data scientists, data analysts, and business stakeholders. A strong inclination to stay updated with the latest advancements in data engineering, Java, and Databricks technologies, and adaptability to new tools to support evolving data requirements.

Required Product/Project Knowledge:
- Ability to work in an agile development environment.
- Hands-on experience in technical design document preparation.
- Proven experience in fine-tuning applications and identifying their potential bottlenecks.

Required Skills:
- Ability to work on tasks (POCs, stories, CRs, defects, etc.) without taking much help.
- Technical ability including programming, debugging, and logical skills.
- Ability to technically guide juniors in the completion of POCs, stories, CRs, defects, etc.

Common Tasks:
- Define and follow processes for technical compliance and documentation, code review, unit & functional testing, and deployment, and ensure the team is also following the processes properly.
- Write at least two technical papers or present one tech talk per year.
- 100% compliance to the sprint plan.

Required Soft Skills:
- Providing technical leadership and mentoring to junior developers.
- Collaboration and teamwork skills.
- Self-motivated, flexible team player with strong initiative and excellent communication skills.
- Ability to grow into a technical activity leader, with a proactive approach.
- Good understanding of the requirements in the area of functionality being developed.
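Although this listing leans Java, PySpark and SQL are named must-have skills, so here is a minimal PySpark sketch of the kind of extract-transform-load step such roles involve. The file paths and column names are invented for illustration.

```python
# Minimal PySpark ETL sketch: read raw JSON, clean it, write Parquet.
# Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy-etl").getOrCreate()

# Extract: semi-structured JSON landing data.
raw = spark.read.json("/mnt/landing/policies/")

# Transform: type the date column, drop records missing the key.
clean = (
    raw.withColumn("effective_date", F.to_date("effective_date", "yyyy-MM-dd"))
       .filter(F.col("policy_id").isNotNull())
       .dropDuplicates(["policy_id"])
)

# Load: columnar Parquet output, partitioned for downstream queries.
clean.write.mode("overwrite").partitionBy("effective_date").parquet(
    "/mnt/curated/policies/"
)
```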

Posted 3 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Chennai

Work from Office

Skills: GenAI exposure, data modelling, database development, data warehouse management, data strategy development, end-to-end data architecture, virtualization and consumption layer expertise.

Skill & Experience:
- Bachelor's degree in Computer Science, Data Analytics, or a similar field.
- Highly analytical mindset, with an ability to see both the big picture and the details.
- Experience in building virtualization & consumption layers, including ideas on how to divide costs based on consumption.
- Exposure to Gen-AI is an added advantage.
- Strong organizational and troubleshooting skills.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Proven track record of successful project management.

Responsibilities:
- Develop and implement data strategy.
- Identify and manage data sources.
- Coordinate with cross-functional teams.
- Manage end-to-end data architecture.
- Develop virtualization & consumption layers.
- Ensure data quality and governance.
- Ensure data security and compliance.
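On the "divide costs based on consumption" point: one simple, illustrative chargeback approach is to allocate a shared platform bill in proportion to each consumer's measured usage. The sketch below is pure Python with invented numbers, not any specific platform's billing API.

```python
# Illustrative chargeback: split a shared warehouse bill across teams
# in proportion to the compute credits each team consumed.
monthly_bill = 12000.00  # total platform cost, hypothetical
credits_by_team = {"marketing": 450.0, "risk": 900.0, "finance": 150.0}

total_credits = sum(credits_by_team.values())
allocation = {
    team: round(monthly_bill * used / total_credits, 2)
    for team, used in credits_by_team.items()
}
print(allocation)
# {'marketing': 3600.0, 'risk': 7200.0, 'finance': 1200.0}
```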

Posted 3 weeks ago

Apply

11.0 - 20.0 years

45 - 50 Lacs

Bengaluru

Work from Office

Technologies Used:
- Odoo platform v15+ based development; experience with Odoo development and customization.
- Odoo user base (logged-in users) > 1,000 users.
- Odoo on Kubernetes (microservices-based architecture) with DevOps understanding.
- Knowledge of Odoo modules, architecture, and APIs; ability to integrate Odoo with other systems and data sources; capable of creating custom modules; able to scale Odoo deployments for a large number of users and transactions.

Programming Languages: proficiency in Python is essential; experience with other programming languages (e.g., Java, Scala) is a plus.

Data Analysis and Reporting: ability to analyse and interpret complex data sets; experience with data visualization tools (e.g., Superset); experience in Cassandra (4.0+) along with a query engine like Presto; proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL); experience with ETL tools and processes; data structures & data modelling; knowledge of data warehousing concepts and technologies; familiarity with big data technologies (e.g., Hadoop, Spark) is a plus; experience in managing and processing large datasets.

DevSecOps: experience with containerization, Docker, and Kubernetes clusters; CI/CD with GitLab.

Methodologies: knowledge and experience of Scrum and Agile methodologies.

Operating Systems: Linux/Windows.

Tools Used: Jira, GitLab, Confluence.

Other Skills: strong problem-solving and analytical skills; excellent communication and collaboration abilities; attention to detail and a commitment to data quality; ability to work in a fast-paced, dynamic environment.

Skills: English, ERP systems, CSB, PostgrereSQL, Python, Hadoop ecosystem (HDFS), Java.
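For a sense of what "creating custom modules" means in Odoo, here is a minimal sketch of a custom model using the Odoo ORM. The model name and fields are invented, and a complete module would also ship a `__manifest__.py` and XML views.

```python
# Minimal sketch of a custom Odoo model (hypothetical names).
# A complete module would also ship __manifest__.py and XML views.
from odoo import models, fields, api

class DataSourceRegistry(models.Model):
    _name = "x_data.source"             # illustrative model name
    _description = "Registered external data source"

    name = fields.Char(required=True)
    endpoint_url = fields.Char(string="Endpoint URL")
    active = fields.Boolean(default=True)
    record_count = fields.Integer(compute="_compute_record_count")

    @api.depends("endpoint_url")
    def _compute_record_count(self):
        # Placeholder: a real implementation would query the source.
        for rec in self:
            rec.record_count = 0
```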

Posted 3 weeks ago

Apply

1.0 - 5.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Coeo is a professional services business delivering consulting and support services; we are experts in the high-growth area of the Microsoft data and cloud platform. We have offices in Hyderabad, India and Reading, UK. Our customers are typically UK-based, in the Retail, Finance, and Professional Services sectors, running mission-critical data applications in Azure and leading-edge business intelligence and cloud analytics solutions. We are Microsoft Gold Partners and proud to work with many UK household names, including Domino's Pizza, Next, ASOS, and Fat Face.

We're a growing company which has doubled in size in the last 5 years, with ambitious plans for further expansion. We have our HR infrastructure in place, and now we're looking for an HR Executive who'll support our entire employee lifecycle, as well as promoting wellbeing at work. We offer excellent benefits, a vibrant and fun work environment, fantastic colleagues, and a variety of social activities to keep things exciting.

We are seeking an experienced Talent Sourcer to join our recruitment team and focus on sourcing top-tier talent for technology roles, specifically within Microsoft data solutions. The ideal candidate will have a strong background in end-to-end recruitment, with the ability to identify, engage, and attract highly skilled professionals in the Microsoft ecosystem. You will play a crucial role in building a pipeline of candidates for our technology hiring needs, ensuring a seamless recruitment experience for both candidates and hiring managers.

Key Responsibilities:
- Proactively source and engage with high-quality candidates for roles related to the Microsoft data stack (e.g., Azure, .NET, C#, Power BI, SQL Server, Synapse, Fabric, Databricks, data modelling, etc.).
- Collaborate closely with hiring managers to understand hiring requirements, team needs, and key technologies.
- Develop and execute sourcing strategies to attract a diverse pool of qualified candidates through multiple channels, including job boards, social media, LinkedIn, internal databases, networking, and direct outreach.
- Screen resumes, conduct initial phone interviews, and assess candidates' technical skills, experience, and cultural fit for the role.
- Build and maintain a strong pipeline of active and passive candidates for current and future technology hiring needs.
- Partner with the recruitment team to ensure a seamless handover of candidates through the interview and hiring process.
- Track and manage candidate pipelines in the applicant tracking system (ATS) to ensure efficient recruitment workflows.
- Maintain up-to-date knowledge of trends and best practices in the Microsoft data stack and technology recruitment.
- Assist in developing and improving the recruitment process to increase efficiency, candidate experience, and quality of hire.
- Provide regular updates on sourcing progress and candidate activity to hiring managers and senior leadership.

Required Skills and Qualifications:
- Proven experience as a Talent Sourcer, Recruiter, or similar role, with a focus on technology recruitment.
- Strong knowledge of the Microsoft data stack, including but not limited to Azure, .NET, Power BI, SQL Server, Synapse, Fabric, Databricks, and other Microsoft-related technologies.
- Experience sourcing candidates using a variety of tools and methods, including LinkedIn, job boards, social media, and direct outreach.
- End-to-end recruitment experience, including creating job descriptions, candidate screening, and managing candidates through the interview process.
- Strong interpersonal and communication skills, with the ability to build relationships with both candidates and hiring managers.
- Ability to work independently and manage multiple open requisitions in a fast-paced environment.
- Familiarity with applicant tracking systems (ATS) and other recruitment software.
- Knowledge of current recruitment trends and best practices within technology hiring.
- Ability to evaluate technical skill sets and qualifications against job requirements and team needs.

Preferred Qualifications:
- Experience working with global teams or in a multinational recruitment environment.
- Familiarity with additional technologies in the Microsoft stack, data solutions, and other enterprise-level tools.
- Experience in hiring for a range of technical roles, from junior developers to senior technical architects.
- Familiarity with diversity and inclusion best practices in sourcing and recruiting.

Education and Experience:
- Bachelor's degree in Human Resources, Business Administration, or a related field, or equivalent work experience.
- Relevant experience in talent sourcing or recruiting, with a focus on technical roles.
- Experience in recruiting for Microsoft-related technologies is highly preferred.

Other: Primarily office-based, but with opportunities for flexible or part-time working. Support for continuing professional development.

Diversity and Inclusion: Coeo is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Hyderabad

Work from Office

What you'll do (high-level responsibilities, but not limited to):
- Analyze business requirements.
- Analyze the data model and perform gap analysis between business requirements and Power BI.
- Design and model the Power BI schema.
- Transform data in Power BI, SQL, or an ETL tool.
- Create DAX formulas, reports, and dashboards.
- Write SQL queries and stored procedures.
- Design effective Power BI solutions based on business requirements.
- Manage a team of Power BI developers and guide their work.
- Integrate data from various sources into Power BI for analysis.
- Optimize performance of reports and dashboards for smooth usage.
- Collaborate with stakeholders to align Power BI projects with goals.
- Knowledge of data warehousing (must); data engineering is a plus.

What you'll bring: B.Tech in Computer Science or equivalent; minimum 5+ years of relevant experience.
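Power BI can use a Python script as a data source in Power Query, where each pandas DataFrame the script defines becomes a loadable table. As a hedged illustration of the data transformation duties above, with an invented file and columns, such a script might look like:

```python
# Illustrative Power BI Python data-source script: every pandas
# DataFrame defined here appears as a table in Power Query.
# File name and columns are hypothetical.
import pandas as pd

raw = pd.read_csv("C:/data/sales_extract.csv")

# Light shaping before the data reaches the Power BI model.
sales = (
    raw.dropna(subset=["order_id"])
       .assign(order_date=lambda d: pd.to_datetime(d["order_date"]))
)

# Pre-aggregated table for a dashboard page.
daily_summary = (
    sales.groupby(sales["order_date"].dt.date)["amount"]
         .sum()
         .reset_index(name="total_amount")
)
```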

Posted 3 weeks ago

Apply

8.0 - 10.0 years

12 - 15 Lacs

Bengaluru

Work from Office

What you'll do:
- Work closely with our product management, internal stakeholders, and customers to identify, validate, and document new system requests; oversee proper implementation by providing acceptance criteria; and act as a liaison between the business users and the developers.
- Be an integral part of our dynamic agile R&D team, become an expert with our innovative product, and contribute to the product's vision.
- Perform enterprise-level and project-level data modelling, including model management, consolidation, and integration.
- Understand business requirements and translate them into Conceptual (CDM), Logical (LDM), and Physical (PDM) data models using industry standards.
- Manage data models for multiple projects and make sure the data models in all projects are synchronized and adhere to the Enterprise Architecture, with proper change management.
- Establish and manage existing standards for naming and abbreviation conventions, data definitions, ownership, documentation, procedures, and techniques.
- Adopt, support, and participate in the implementation of the Enterprise Data Management Strategy.
- Create P&C, Life & Health insurance/BFSI-specific target data models, metadata layers, and data marts; experience in Medallion (Lakehouse) Architecture.
- Collaborate with the application team to implement data flows and samples and develop conceptual-logical data models.
- Ensure reusability of the model and approach across different business requirements.
- Support data-specific system integration and data migration.
- Good to have: experience in modelling MongoDB schemas.

Must-have skills for this position:
- Minimum 5+ years as a data modeler involved in mid- to large-scale system development projects, with experience in data analysis, data modelling, and data mart design; overall experience of 8+ years.
- Data analysis/profiling and reverse-engineering of data; experience on a data migration project is a plus.
- Prior experience in the BFSI domain (insurance would be a plus).
- Experience in ER/Studio, Toad Data Modeler, or an equivalent tool.
- Strong in data warehousing concepts.
- Strong database development skills, including complex SQL queries and stored procedures.
- Strong in Medallion (Lakehouse) Architecture.
- Good verbal and written communication skills in English.
- Ability to work with minimal guidance or supervision in a time-critical environment.
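On the "good to have" MongoDB schema modelling point: one common approach is to enforce a document schema with a `$jsonSchema` validator at collection creation. Below is a minimal sketch with pymongo; the connection string and the insurance-flavoured fields are invented for illustration.

```python
# Sketch: create a MongoDB collection with a $jsonSchema validator,
# a lightweight way to model and enforce document structure.
# Connection string, collection, and fields are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["insurance"]

policy_schema = {
    "$jsonSchema": {
        "bsonType": "object",
        "required": ["policy_id", "holder_name", "premium"],
        "properties": {
            "policy_id": {"bsonType": "string"},
            "holder_name": {"bsonType": "string"},
            "premium": {"bsonType": "double", "minimum": 0},
            "riders": {  # optional embedded documents
                "bsonType": "array",
                "items": {"bsonType": "object"},
            },
        },
    }
}

db.create_collection("policies", validator=policy_schema)
db.policies.insert_one(
    {"policy_id": "P-1001", "holder_name": "A. Kumar", "premium": 1250.0}
)
```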

Posted 3 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Navi Mumbai

Work from Office

- 5+ years overall experience, with at least 4 years of relevant experience on Mendix.
- Has delivered at least one complete Mendix solution end to end (i.e., from development through deployment).
- Experienced in data modelling and data structures.
- Strong analytical and problem-solving skills; capable of communicating complex subjects in a simple fashion.
- Motivates, coaches, and evaluates the performance of team members.
- Determines best practices for developing in Mendix, sets standards, and makes sure they are executed.
- Experienced as part of a Scrum/Agile multi-disciplinary team.
- Strong ability to troubleshoot and debug applications.
- Thoroughly understands the trade-offs of technology choices, such as capacities, response time, data interfacing, client-server communication, industry-standard technologies, and new industry trends.
- Responsible for maximizing effectiveness by working with cross-functional teams, including the UI team; leads/contributes to engineering during execution and delivery to solve complex engineering problems across the development life cycle.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, 6-Month Contract)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in data engineering.
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
- Event Hubs, Logic Apps.
- Power BI.
- Strong logic building and a competitive programming background.

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
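As a hedged illustration of the Databricks-plus-Key Vault pattern this listing names (not the client's actual code), a notebook cell might pull a storage key from a Key Vault-backed secret scope and read ADLS Gen2 data like this. The scope, secret, account, and paths are invented; `dbutils`, `spark`, and `display` are Databricks notebook globals.

```python
# Sketch of a Databricks notebook cell: fetch a storage key from a
# Key Vault-backed secret scope, then read from ADLS Gen2.
# Scope/secret/account names are hypothetical.
storage_account = "mydatalake"
storage_key = dbutils.secrets.get(scope="kv-data-eng", key="adls-key")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    storage_key,
)

bronze = spark.read.format("delta").load(
    f"abfss://bronze@{storage_account}.dfs.core.windows.net/orders"
)
bronze.createOrReplaceTempView("orders_bronze")
display(spark.sql("SELECT COUNT(*) AS n FROM orders_bronze"))
```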

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 18 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Responsibilities:
• Develop, deploy, and manage OLAP cubes and tabular models.
• Collaborate with data teams to design and implement effective data solutions.
• Troubleshoot and resolve issues related to SSAS and data models.
• Monitor system performance and optimize queries for efficiency.
• Implement data security measures and backup procedures.
• Stay updated with the latest SSAS and BI technologies and best practices.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 7+ years of experience working with SSAS (SQL Server Analysis Services).
• Strong understanding of data warehousing, ETL processes, OLAP concepts, and data modelling concepts.
• Proficiency in SQL, MDX, and DAX query languages.
• Experience with data visualization tools like Power BI.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration abilities.
• Experience with Agile ways of working.

Skills:
• SSAS (SQL Server Analysis Services)
• SQL
• MDX/DAX
• Data Warehousing
• ETL Processes
• Performance Tuning
• Data Analysis
• Data Security
• Data Modelling
• Plus: knowledge of Power BI or a reporting tool
• Plus: experience working for ING

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, 6-Month Contract)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in data engineering.
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
- Event Hubs, Logic Apps.
- Power BI.
- Strong logic building and a competitive programming background.

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 3 weeks ago

Apply

7.0 - 9.0 years

9 - 14 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or ER/Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or ER/Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
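One staple of the dimensional modelling and DWH work this role describes is the Type 2 slowly changing dimension. As a minimal pandas sketch (invented data, not any bank's model): when a tracked attribute changes, the current row is expired and a new version is appended.

```python
# Illustrative SCD Type 2 update for a customer dimension using pandas.
# When a tracked attribute changes, close the current row and append
# a new "current" version. All data is invented.
import pandas as pd

dim = pd.DataFrame([
    {"customer_id": 1, "city": "Mumbai", "valid_from": "2023-01-01",
     "valid_to": None, "is_current": True},
])
incoming = {"customer_id": 1, "city": "Pune", "load_date": "2024-06-01"}

mask = (dim["customer_id"] == incoming["customer_id"]) & dim["is_current"]
changed = dim.loc[mask, "city"].iloc[0] != incoming["city"]

if changed:
    # Expire the existing current row...
    dim.loc[mask, ["valid_to", "is_current"]] = [incoming["load_date"], False]
    # ...and append the new version as the current row.
    dim = pd.concat([dim, pd.DataFrame([{
        "customer_id": incoming["customer_id"],
        "city": incoming["city"],
        "valid_from": incoming["load_date"],
        "valid_to": None,
        "is_current": True,
    }])], ignore_index=True)

print(dim)
```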

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 27 Lacs

Bengaluru

Remote

Profisee Consultant

Job Summary: We are seeking a skilled and experienced Profisee Consultant to join our data management team. In this role, you will be responsible for designing, developing, and implementing Master Data Management (MDM) solutions using the Profisee platform. You'll work closely with business and IT stakeholders to ensure data integrity, governance, and usability across the enterprise.

Key Responsibilities:
- Lead and participate in the implementation of Profisee MDM solutions.
- Work with stakeholders to gather and analyze MDM requirements.
- Design and configure Profisee entities, hierarchies, workflows, and match/merge rules.
- Integrate Profisee with other enterprise systems (ERP, CRM, data warehouses).
- Develop and maintain data quality rules and governance frameworks.
- Provide ongoing support, troubleshooting, and optimization of MDM solutions.
- Deliver documentation, training, and knowledge transfer to internal teams.
- Ensure compliance with data governance, privacy, and security policies.

Required Qualifications:
- Proven experience with the Profisee MDM platform (3+ years preferred).
- Strong understanding of Master Data Management principles and best practices.
- Experience with data modeling, SQL Server, and integration tools (e.g., SSIS).
- Familiarity with data quality, data stewardship, and data governance concepts.
- Ability to gather requirements and translate them into technical solutions.
- Excellent problem-solving, communication, and stakeholder management skills.
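Match/merge is the heart of MDM-style consolidation. As a language-agnostic illustration in plain Python (not the Profisee engine, which configures such rules declaratively), fuzzy name matching plus simple survivorship might look like:

```python
# Toy match/merge sketch: cluster near-duplicate customer records by
# fuzzy name similarity, then fill gaps in the surviving golden record.
# Concept illustration only; all records are invented.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Industries Ltd", "phone": "022-1234"},
    {"id": 2, "name": "ACME Industries Limited", "phone": None},
    {"id": 3, "name": "Globex Corp", "phone": "011-9876"},
]

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

golden = []
for rec in records:
    group = next((g for g in golden if similar(g["name"], rec["name"])), None)
    if group is None:
        golden.append(dict(rec))          # new golden record
    else:
        # Merge/survivorship: fill gaps from the duplicate record.
        for field, value in rec.items():
            if group.get(field) is None and value is not None:
                group[field] = value

print(golden)  # two golden records: Acme (with phone kept) and Globex
```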

Posted 4 weeks ago

Apply

10.0 - 14.0 years

12 - 18 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to define and develop the Enterprise Data Structure, including the data warehouse, master data, integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do:

1. Define and develop a data architecture that aids the organization and clients in new/existing deals:
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation.
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy.
c. Create data strategies and road maps for the Reference Data Architecture as required by the clients.
d. Engage all stakeholders to implement data governance models and ensure that the implementation is done based on every change request.
e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise.
f. Develop, communicate, support, and monitor compliance with data modelling standards.
g. Oversee and monitor all frameworks to manage data across the organization.
h. Provide insights on database storage and platforms for ease of use and least manual work.
i. Collaborate with vendors to ensure integrity, objectives, and system configuration.
j. Collaborate with functional & technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization.
k. Present the data repository, objects, and source systems along with data scenarios for front-end and back-end usage.
l. Define high-level data migration plans to transition the data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes.
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view.
n. Oversee all data standards, references, and papers for proper governance.
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata.
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control.
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives.
ii. Analyse the technology environment, enterprise specifics, and client requirements to set up a collaboration solution for big/small data.
iii. Provide technical leadership to the implementation of custom solutions through thoughtful use of modern technology.
iv. Define and understand current issues and problems and identify improvements.
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout.
vi. Understand the root-cause problems in integrating business and product units.
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view.
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements.
ix. Track industry and application trends and relate these to planning current and future IT needs.

2. Build an enterprise technology environment for data architecture management:
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs & lakes, and data management processes.
b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness.
c. Collect all structural and non-structural data from different places and integrate it into one database form.
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports.
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices.
f. Implement the best security practices across all databases based on accessibility and technology.
g. Strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG).
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration.

3. Enable delivery teams by providing optimal delivery solutions/frameworks:
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor.
b. Define database physical structure, functional capabilities, security, backup, and recovery specifications.
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
d. Monitor system capabilities and performance by performing tests and configurations.
e. Integrate new solutions and troubleshoot previously occurring errors.
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams.
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times.
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods.
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
l. Ensure architecture principles and standards are consistently applied to all projects.
m. Ensure optimal client engagement:
i. Support the pre-sales team while presenting the entire solution design and its principles to the client.
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met.
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

Mandatory Skills: Datacenter Architecting - Unix Stack.

Posted 4 weeks ago

Apply

3.0 - 7.0 years

30 - 32 Lacs

Mohali

Work from Office

We are seeking a highly skilled and experienced Senior Data Engineer to join our team. This role will be instrumental in designing, developing, and maintaining our data infrastructure, ensuring the effective processing and analysis of large datasets. The ideal candidate will have a strong background in data modeling, data architecture, and experience with a variety of data technologies.

Key Responsibilities:
- Design and implement robust data pipelines, ETL processes, and data warehouses to support our analytics and reporting needs.
- Develop and maintain data models, schemas, and metadata to ensure data quality and consistency.
- Collaborate with data scientists, analysts, and business stakeholders to understand their requirements and translate them into technical solutions.
- Optimize data pipelines for performance and scalability to handle large volumes of data.
- Stay up to date with the latest data technologies and trends to drive innovation and efficiency.

Responsibilities:
- Design, develop, and maintain scalable data architectures, pipelines, APIs, and integrations.
- Create and optimize data models to support efficient data processing and storage.
- Manage and maintain databases, including Postgres and SSIS, ensuring data integrity and performance.
- Develop, deploy, and manage ETL and EDI processes.
- Develop and maintain scripts and applications using Python for data processing and analysis.
- Ensure data security and compliance with relevant regulations and best practices.
- Leverage cloud services (e.g., Azure, AWS) for data storage, processing, and analytics.
- Collaborate with cross-functional teams to gather requirements and provide data-driven insights.
- Implement and manage caching solutions to improve data retrieval speeds.
- Create and maintain comprehensive documentation for all data processes and architectures.
- Utilize data visualization tools to create interactive dashboards and reports for stakeholders.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent work experience).
- Minimum of 5 years of experience in data engineering or a related field.
- Proficiency in data modeling, data architecture, and database management with Postgres or SSIS.
- Experience with electronic medical records (EMRs) and an understanding of the healthcare industry is strongly desired.
- Strong SQL skills and experience with common ETL tools.
- Proficiency in Python for data processing and automation.
- Experience with common caching solutions (e.g., Redis, Memcached).
- Expertise in data security best practices and regulatory compliance.
- Hands-on experience with cloud platforms like Azure and AWS.
- Proficiency with data visualization tools such as Power BI, Tableau, or similar.
- Excellent problem-solving skills and the ability to troubleshoot data issues effectively.
- Strong communication skills, both written and verbal, with the ability to explain complex technical concepts to non-technical stakeholders.

Desired Skills:
- Knowledge of data warehousing concepts and methodologies.
- Experience with Agile/Scrum methodologies.
- Familiarity with Power BI administration and deployment.
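Since the listing pairs Postgres with caching solutions such as Redis, here is a hedged sketch of that pattern using psycopg2 and redis-py. Connection details, the table, and the cache key are hypothetical.

```python
# Sketch: cache a Postgres aggregate in Redis to speed repeated reads.
# Connection details, table, and cache key are hypothetical.
import json
import psycopg2
import redis

pg = psycopg2.connect("dbname=analytics user=etl password=*** host=localhost")
cache = redis.Redis(host="localhost", port=6379, db=0)

CACHE_KEY = "daily_order_totals"
TTL_SECONDS = 300  # refresh at most every five minutes

def daily_order_totals():
    cached = cache.get(CACHE_KEY)
    if cached is not None:
        return json.loads(cached)          # cache hit

    with pg.cursor() as cur:               # cache miss: hit the database
        cur.execute("""
            SELECT order_date::text, SUM(amount)::float
            FROM orders GROUP BY order_date ORDER BY order_date
        """)
        rows = cur.fetchall()

    cache.setex(CACHE_KEY, TTL_SECONDS, json.dumps(rows))
    return rows

print(daily_order_totals())
```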

Posted 4 weeks ago

Apply

5.0 - 7.0 years

18 - 20 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

- Design, develop, and maintain Cognos dashboards and reports to visualize data insights and support business objectives.
- Collaborate with stakeholders to gather requirements and translate them into effective visualizations.
- Optimize Cognos performance and ensure scalability of dashboards for large datasets.
- Develop and maintain SQL queries for data transformation and analytics processes.
- Work closely with the data engineering team to ensure the availability and quality of data for reporting purposes.
- Conduct thorough testing and validation of Cognos dashboards to ensure accuracy and reliability of data visualizations.
- Provide technical support and troubleshooting assistance to end-users regarding Cognos dashboards and related data issues.
- Stay updated with the latest Cognos features, best practices, and industry trends to continuously improve reporting solutions.
- Solid RDBMS experience, data modelling, and advanced querying skills.
- Good to have: knowledge of other BI tools such as Tableau, and BI migration experience from a source BI tool to Tableau.
- Strong ability to extract information by questioning, active listening, and interviewing.
- Strong analytical and problem-solving skills.
- Excellent writing skills, with the ability to create clear requirements, specifications, and documentation.

Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
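One way to approach the dashboard validation duty above, sketched in pandas with invented files and columns, is to reconcile figures exported from the report against aggregates recomputed from the source tables:

```python
# Sketch: validate a dashboard by reconciling its exported figures
# against aggregates recomputed from the source data.
# File names and columns are invented for illustration.
import pandas as pd

source = pd.read_csv("source_orders.csv")       # raw system extract
report = pd.read_csv("dashboard_export.csv")    # figures shown on the report

# Recompute the metric the dashboard claims to show.
expected = (
    source.groupby("region")["amount"].sum().reset_index(name="expected_total")
)

merged = report.merge(expected, on="region", how="outer").fillna(0)
merged["diff"] = (merged["total"] - merged["expected_total"]).abs()

mismatches = merged[merged["diff"] > 0.01]      # tolerance for rounding
if mismatches.empty:
    print("Dashboard matches source data.")
else:
    print("Discrepancies found:")
    print(mismatches[["region", "total", "expected_total"]])
```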

Posted 4 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to design, program, simulate, and test the automation product or process to achieve the required efficiency and effectiveness.

Do:

1. Be instrumental in understanding the software requirements and design of the product:
- Analyze and understand the current technology architecture, system interdependencies, and application stacks.
- Formulate project plans by working with project management, outlining the steps required to develop the project, and submitting project plans to project management for approval.
- Understand current operating procedures by consulting with users/partners/clients and reviewing project objectives on a regular basis.
- Contribute to the automation roadmap design and testing process improvements by researching automation architectures and developing new automation solutions.
- Improve and maintain the automation framework to be used horizontally across our technology stacks, and build out reusable libraries across our business-line verticals.

2. Design and execute software development and reporting:
- Ensure the environment is ready for the execution process by designing test plans, developing test cases/scenarios/usage cases, and executing these cases.
- Develop technical specifications and plans and resolve complex technical design issues.
- Participate in and conduct design activities with the development team relating to testing of the automation processes for both functional and non-functional requirements.
- Implement, track, and report key metrics to assure full coverage of functional and non-functional requirements through automation.
- Eliminate errors by owning the testing and validation of code.
- Track problems, resolutions, and bug fixes throughout the project and create a comprehensive database of defects and successful mitigation techniques.
- Provide resolutions to problems by taking the initiative to use all available resources for research.
- Design and implement automated testing tools when possible, and update tools as needed to ensure efficiency and accuracy.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases.
- Develop programs that run efficiently and adhere to Wipro standards by using similar logic from existing applications, discussing best practices with team members, referencing textbooks and training manuals, documenting the code, and using accepted design patterns.

3. Ensure a smooth flow of communication with customers & internal stakeholders:
- Work with Agile delivery teams to understand the product vision and product backlogs; develop robust, scalable, and high-quality test automation for functional, regression, and performance testing.
- Assist in creating acceptance criteria for user stories and generate a test automation backlog.
- Collaborate with the development team to create/improve continuous deployment practices by developing strategies, formalizing processes, and providing tools.
- Work closely with business subject matter experts to understand requirements for automation, then design, build, and deploy the application using automation tools.
- Ensure long-term maintainability of the system by documenting projects according to Wipro guidelines.
- Ensure quality of communication by being clear and effective with test personnel, users, developers, and clients to facilitate quick resolution of problems and accurate documentation of successes.
- Provide assistance to testers and support personnel as needed to determine system problems.
- Perform backend/database programming for key projects.
- Stay up to date on industry standards and incorporate them appropriately.

Performance parameters:
1. Automation: quality of design/adherence to design; adherence to project plan; issue resolution and client escalation management; zero disruption/error in deployment; EWS on risks and deployment of mitigation measures.
2. Documentation: complete documentation of the automation process, test cases, debug data, and performance review as per quality standards.

Mandatory Skills: Telecom NMS Data Modelling South Bound.
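As a minimal sketch of the automated functional testing this role produces, here is a pytest example. The function under test (`normalize_msisdn`) is invented purely for illustration of telecom-flavoured test automation.

```python
# Minimal pytest sketch of an automated functional test. The function
# under test (normalize_msisdn) is invented for illustration.
import pytest

def normalize_msisdn(raw: str) -> str:
    """Normalize a phone number to a digits-only national format."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits[-10:]  # keep the 10-digit national number

@pytest.mark.parametrize(
    "raw, expected",
    [
        ("+91 98765 43210", "9876543210"),
        ("098-7654-3210", "9876543210"),
        ("9876543210", "9876543210"),
    ],
)
def test_normalize_msisdn(raw, expected):
    assert normalize_msisdn(raw) == expected

def test_empty_input_yields_empty_string():
    assert normalize_msisdn("") == ""
```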

Posted 4 weeks ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Noida, Pune, Bengaluru

Hybrid

Responsibilities:
- Lead functional and technical workshops, demonstrating leadership in designing, delivering, testing, and deploying Salesforce solutions.
- Expertise in data modeling, Apex design patterns, LWC, and other modern UI techniques.
- Design and architect scalable and secure Salesforce solutions that meet business requirements.
- Must have expertise in Salesforce Service Cloud, Einstein AI, Data Cloud & Experience Cloud.
- Serve as a trusted advisor to the client, conducting conversations with their enterprise architects and business stakeholders to shape the architectural vision and establish an architectural roadmap program.
- Manage customer expectations; negotiate solutions to complex problems with both the customer and third-party stakeholders.
- Guide customers, partners, and implementation teams on how best to execute digital transformation with the Salesforce platform using Salesforce Industries.
- Establish trust with the customer's leadership, promoting and implementing best practices with Salesforce Industries and Salesforce.
- Ensure best practices in coding standards, design patterns, and integration processes are followed.
- Develop and maintain technical documentation for designed solutions.
- Build out sophisticated business processes using native Salesforce Industries technology and the toolkit of the Force.com platform and integration tools.
- Work closely with delivery managers, solution architects, and directly with clients to architect technology solutions that meet client needs.
- Proactively highlight and manage risk areas in the solution, committing to seeing issues through to completion.

Qualifications:
- Minimum 12-15 years of total experience in IT.
- Minimum 8 years of Salesforce experience in Salesforce architecture and integration.
- Minimum 5 years of experience developing Salesforce customizations (Apex/Lightning), integrations, and executing data migrations.
- Minimum of 3-5 years of experience creating the technical architecture for complex Salesforce implementations.
- 7+ years of experience in defining, designing, delivering, and deploying Salesforce-based technical solutions as the accountable or responsible contributor.
- Design and implement Salesforce solutions aligned with business strategy and objectives.
- Lead technical requirements sessions; architect and document technical solutions aligned with client business objectives.
- Translate business requirements into well-architected solutions that best leverage the Salesforce platform.
- Provide guidance on the deployment of Salesforce CRM implementations, integrations, and upgrades.
- Mandatory: at least one Developer Track certification (Platform Developer I) along with at least one Cloud Consultant certification from Community, Field Service, Sales, Service, or CPQ.
- Mandatory: either the System Architect or Application Architect certification.
- Other relevant Salesforce certifications (Data Cloud, Experience Cloud) are a plus.
- Excellent communication (written and oral) and interpersonal skills, with the ability to present to a variety of audiences (executive to technically detailed).
- Excellent leadership and management skills.

Education / Certification: Bachelor's/University degree or equivalent experience. Salesforce certifications (e.g., Platform Developer I, System Architect, Application Architect) are preferred.
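Much of the integration work this role names runs through the Salesforce REST API. A hedged Python sketch with requests follows; the instance URL, access token, and record fields are placeholders, and a real integration would obtain the token via OAuth 2.0.

```python
# Sketch: query and create records through the Salesforce REST API.
# Instance URL, token, and fields are placeholders; a real integration
# would obtain the access token via OAuth 2.0.
import requests

INSTANCE = "https://example.my.salesforce.com"   # hypothetical org
TOKEN = "00D...access_token"                      # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}",
           "Content-Type": "application/json"}
API = f"{INSTANCE}/services/data/v59.0"

# SOQL query via the REST API.
resp = requests.get(
    f"{API}/query",
    headers=HEADERS,
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])

# Create a record on a standard object.
created = requests.post(
    f"{API}/sobjects/Account",
    headers=HEADERS,
    json={"Name": "Example Client Ltd"},
)
created.raise_for_status()
print("New Account Id:", created.json()["id"])
```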

Posted 1 month ago

Apply

2.0 - 7.0 years

7 - 17 Lacs

Mumbai

Work from Office

Greetings! We have an opening with a reputed finance industry client for the role of Data Management Business Analyst.

Experience: 2+ years

Role & responsibilities:
- Extract and analyze data from the MES system to identify trends, performance metrics, and areas for improvement.
- Elicit, analyze, specify, and verify business requirements.
- Create documents such as Functional Specification Documents (FSD) with table-column mappings.
- Engage with various stakeholders such as Business, Data Science, and the Power BI team for cross-functional data validation and support.
- Identify opportunities to optimize manufacturing processes based on data analysis and user feedback.

Preferred candidate profile: 2+ years of relevant experience in data modelling & management.

Interested candidates can share their resume at josy@topgearconsultants.com

Posted 1 month ago

Apply

4.0 - 5.0 years

8 - 12 Lacs

Tamil Nadu

Work from Office

Duration: 12 months

Position Description:
- Develop RPA and chatbot solutions using Pega and case management solutions.
- Develop integrations for RPA and chatbot solutions using web services and APIs.
- Pair with other software engineers to cooperatively deliver user stories.
- Use the test-driven development methodology to realize the technical solution.
- Perform requirements gathering, business analysis, fit-gap analysis, system testing, documentation (FDD and TDD), and end-user training.
- Analyze integration requirements and work with legacy systems.
- Coordinate with business skill teams, PDO IT teams, architects, and the product vendor (Pega) in developing and deploying the automation solutions.
- Bot maintenance.

Skills Required:
- 4+ years of experience in Pega RPA development.
- Experience in delivering global RPA or DPA projects using Agile methodology.
- Strong knowledge of C# scripting.
- Pega certification.
- Knowledge of AI or ML modeling is an added advantage.
- Hands-on experience with data modelling, stored procedures, and SQL Server.
- Strong understanding of case management, case hierarchy, flow rules, and data propagation.
- Good working knowledge of declarative rules, database integrations, connectors and services, and decisioning & ML.
- Working knowledge of UI design.
- Strong knowledge of application debugging, performance tuning, quality assurance & packaging.
- Working knowledge of the Pega security framework.
- Working knowledge of RPA integration and Robot Manager portal features.

Experience Required: 4-5 years of Pega RPA development experience.

Education Required: Bachelor's degree (B.E./MCA).

Posted 1 month ago

Apply

3.0 - 5.0 years

14 - 18 Lacs

Pune

Work from Office

- Proficient in T-SQL for complex database querying and optimization.
- Expertise in Power BI Desktop and Service for report/dashboard development.
- Hands-on experience with SQL Server database design and management.
- Strong data modeling skills, including dimensional modeling and star schemas.
- Ability to transform raw data into meaningful, actionable information.

Preferred Skills ("Good to Have"):
- Experience with Azure Data Services (e.g., Azure SQL, Azure Synapse Analytics, Azure Data Factory).
- Knowledge of data warehousing concepts and best practices.
- Familiarity with ETL processes and data integration tools.
- Understanding of Power BI governance, security, and deployment strategies.
- Exposure to agile software development methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication and stakeholder management abilities.

Key Responsibilities:
- Design and develop interactive, visually appealing Power BI dashboards and reports.
- Implement complex data models and DAX calculations to meet business requirements.
- Optimize SQL queries for high performance and scalability.
- Automate data refresh processes and implement data security protocols.
- Collaborate with business stakeholders to understand reporting needs.
- Provide technical guidance and training to end-users.
- Continuously improve dashboard design, functionality, and user experience.
- Stay up to date with the latest Power BI and MS SQL Server features and best practices.
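As a hedged illustration of the T-SQL plus star-schema skills above, the sketch below runs a parameterized query over a dimensional model with pyodbc, the kind of query that might feed a Power BI dataset. The connection string, tables, and columns are hypothetical.

```python
# Sketch: run a parameterized T-SQL query against SQL Server with
# pyodbc, joining a star schema (fact plus two dimensions).
# Connection string, tables, and columns are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Parameterized query: avoids injection and lets SQL Server cache plans.
cursor.execute(
    """
    SELECT d.CalendarYear, p.Category, SUM(f.SalesAmount) AS Total
    FROM FactSales f
    JOIN DimDate d ON f.DateKey = d.DateKey
    JOIN DimProduct p ON f.ProductKey = p.ProductKey
    WHERE d.CalendarYear = ?
    GROUP BY d.CalendarYear, p.Category
    ORDER BY Total DESC
    """,
    2024,
)
for year, category, total in cursor.fetchall():
    print(year, category, total)
conn.close()
```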

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 8 Lacs

Pune

Work from Office

Job Title: Senior Engineer for Data Management, Private Bank

Role Description: Our Data Governance and Architecture team is driving forward data management together with the Divisional Data Office for Private Bank. In close collaboration between business and IT, we assign data roles, manage the documentation of data flows, align data requirements between consumers and producers of data, report data quality, and coordinate Private Bank's data delivery through the group data hub. We support our colleagues in the group Chief Data Office to optimize Deutsche Bank's Data Policy and the associated processes and methods to manage and model data. As part of the team, you will be responsible for work streams ranging from project planning to preparing reports to senior management. You combine regulatory compliance with data-driven business benefits for Deutsche Bank.

Your key responsibilities:
- Establish and maintain the Private Bank contribution to the Deutsche Bank Enterprise Logical and Physical Data Models and ensure its usefulness for the Private Bank business.
- Understand the requirements of the group functions risk, finance, treasury, and regulatory reporting and cast them into data models in alignment with the producers of the data.
- Co-own Private Bank-relevant parts of the Deutsche Bank Enterprise Logical and Physical Data Models.
- Support the Private Bank experts and stakeholders in delivering the relevant data.
- Optimize requirements management and modelling processes together with the group Chief Data Office and Private Bank stakeholders.
- Align your tasks with the team and the Private Bank Data Council priorities.

Your skills and experience:
- In-depth understanding of how data and data quality impact processes across the bank in the retail sector.
- Hands-on experience with data modelling in the financial industry.
- Extensive experience with data architecture and the challenges of harmonized data provisioning.
- Project and stakeholder management capabilities.
- Open-minded team player: making different people work together well across the world.
- Fluent in English.
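On the data quality reporting duty: a simple per-column completeness report is one common starting point. A minimal pandas sketch with invented data, not Deutsche Bank's actual tooling:

```python
# Sketch: per-column data quality (completeness) report of the sort a
# data management team might publish. Data is invented.
import pandas as pd

accounts = pd.DataFrame({
    "account_id": ["A1", "A2", "A3", "A4"],
    "branch":     ["MUM", None, "DEL", "BLR"],
    "opened_on":  ["2021-03-01", "2022-07-15", None, "2020-11-30"],
})

report = pd.DataFrame({
    "non_null": accounts.notna().sum(),
    "total": len(accounts),
})
report["completeness_pct"] = (100 * report["non_null"] / report["total"]).round(1)
print(report)
# Illustrative output: account_id 100.0, branch 75.0, opened_on 75.0
```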

Posted 1 month ago

Apply

9.0 - 14.0 years

32 - 37 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Should be able to conduct requirement gathering sessions and estimations for the SAP MDG application. Should be aware of the phases of the SAP Activate methodology for project execution. Work with the various technical teams to come up with the future-state solution architecture to support the MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies to support the various business capabilities. Share perspectives on best practices and common technical issues & approaches. Support high-level logical data model definition and discussions to ensure feasibility with MDG. Full life-cycle implementation experience: blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-lives. Functional design for data modelling, UI modelling, rules and validations in BRF+ through configurations, and replication modelling (DRF in/out), IDocs, ALE, and SOAP services for SAP MDG.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Delhi / NCR, Bengaluru

Work from Office

Should be able to conduct requirement gathering sessions and estimations for the SAP MDG application. Should be aware of the phases of the SAP Activate methodology for project execution. Work with the various technical teams to come up with the future-state solution architecture to support the MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies to support the various business capabilities. Share perspectives on best practices and common technical issues & approaches. Support high-level logical data model definition and discussions to ensure feasibility with MDG. Full life-cycle implementation experience: blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-lives. Functional design for data modelling, UI modelling, rules and validations in BRF+ through configurations, and replication modelling (DRF in/out), IDocs, ALE, and SOAP services for SAP MDG.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

Tirunelveli

Work from Office

Should be able to conduct requirement gathering sessions and estimations for the SAP MDG application. Should be aware of the phases of the SAP Activate methodology for project execution. Work with the various technical teams to come up with the future-state solution architecture to support the MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies to support the various business capabilities. Share perspectives on best practices and common technical issues & approaches. Support high-level logical data model definition and discussions to ensure feasibility with MDG. Full life-cycle implementation experience: blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-lives. Functional design for data modelling, UI modelling, rules and validations in BRF+ through configurations, and replication modelling (DRF in/out), IDocs, ALE, and SOAP services for SAP MDG.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies