
436 Data Modelling Jobs - Page 18

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

- 1 years

1 - 1 Lacs

Gurugram

Work from Office

Key Responsibilities:
- Enter and update data in the company's database and systems accurately and efficiently.
- Verify data for accuracy and completeness.
- Process and maintain confidential information.
- Review and correct discrepancies or errors in data.
- Collect, clean, and organize data from various sources to ensure accuracy and completeness.
- Analyze and interpret complex datasets to identify trends, patterns, and insights.
- Generate reports, dashboards, and visualizations to communicate findings effectively.
- Collaborate with business units to understand their needs and translate them into data requirements.
- Develop and implement data models and forecasts to predict future trends.
- Ensure the integrity and security of company data.
- Provide information about products, services, policies, and procedures.
- Identify opportunities for improvement and suggest enhancements to the customer experience.
- Stay up to date with the latest industry trends and technologies related to data analytics.

Posted 2 months ago

Apply

3 - 4 years

5 - 6 Lacs

Noida, Gurugram, Bengaluru

Work from Office

Senior Engineer: The TCA practice has experienced significant growth in demand for engineering and architecture roles from CST, driven by client needs that extend beyond traditional data & analytics architecture skills. There is an increasing emphasis on deep technical skills, such as strong expertise in Azure, Snowflake, Azure OpenAI, and Snowflake Cortex, along with a solid understanding of their respective functionalities. The individual will work on a robust pipeline of TCA-driven projects with pharma clients. This role offers significant opportunities for progression within the practice.

What You'll Do
- Work on high-impact projects with leading clients.
- Gain exposure to complex technological initiatives.
- Receive learning support through organization-sponsored trainings and certifications.
- Benefit from a collaborative and growth-oriented team culture with a clear progression path within the practice.
- Work on the latest technologies.
- Deliver client projects successfully, maintaining a continuous learning mindset and pursuing certifications in newer areas.
- Partner with project leads and AEEC leads to deliver complex projects and grow the TCA practice.
- Develop expert tech solutions for client needs, earning positive feedback from clients and team members.

What You'll Bring
- 3-4 years of experience in RDF ontologies, RDF-based knowledge graphs (AnzoGraph DB preferred), data modelling, Azure cloud, and data engineering.
- Understanding of ETL processes, data pulls using Azure services via polling mechanisms, and API/middleware development using Azure services.
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
- Experience with pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.

Posted 2 months ago

Apply

7 - 9 years

9 - 11 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 2 months ago

Apply

10 - 14 years

37 - 40 Lacs

Pune

Work from Office

We are looking for an experienced SAP Analytics Cloud (SAC) Consultant to implement, configure, and optimize SAC solutions for data analytics, reporting, planning, and forecasting. The ideal candidate will work closely with business stakeholders.

Required Candidate Profile: Proficiency in SAP Analytics Cloud (SAC) features such as stories, data models, and planning functionalities. Experience with data modelling, reporting, and dashboarding.

Posted 2 months ago

Apply

4 - 9 years

18 - 25 Lacs

Bengaluru

Hybrid

Skill required: Data Engineers - Azure
Designation: Sr Analyst / Consultant
Job Location: Bengaluru
Qualifications: BE/BTech
Years of Experience: 4 - 11 years

Overall Purpose of Job: Understand client requirements and build ETL solutions using Azure Data Factory, Azure Databricks, and PySpark. Build solutions so that they can absorb client change requests easily. Find innovative ways to accomplish tasks and handle multiple projects simultaneously and independently. Work with data and the appropriate teams to effectively source required data. Identify data gaps and work with client teams to effectively communicate findings to stakeholders/clients.

Responsibilities:
- Develop ETL solutions to populate a centralized repository by integrating data from various data sources.
- Create data pipelines, data flows, and data models according to business requirements.
- Implement all transformations according to business needs.
- Identify data gaps in the data lake and work with relevant data/client teams to get the data required for dashboarding/reporting.
- Strong experience working on the Azure data platform, Azure Data Factory, and Azure Databricks.
- Strong experience working on ETL components and scripting languages like PySpark and Python.
- Experience in creating pipelines, alerts, email notifications, and scheduling jobs.
- Exposure to development/staging/production environments.
- Provide support in creating, monitoring, and troubleshooting scheduled jobs.
- Work effectively with clients and handle client interactions.

Skills Required:
- Bachelor's degree in Engineering or Science, or equivalent, with at least 4-11 years of overall experience in data management, including data integration, modeling, and optimization.
- Minimum 4 years of experience working on Azure cloud, Azure Data Factory, and Azure Databricks.
- Minimum 3-4 years of experience in PySpark, Python, etc. for data ETL.
- In-depth understanding of data warehouse and ETL concepts and modeling principles.
- Strong ability to design, build, and manage data.
- Strong understanding of data integration.
- Strong analytical and problem-solving skills.
- Strong communication and client interaction skills.
- Ability to design databases to store the large volumes of data necessary for reporting and dashboarding.
- Ability and willingness to learn new technologies; good analytical and interpersonal skills with the ability to interact with individuals at all levels.

Interested candidates can reach Neha on 9599788568 or neha.singh@mounttalent.com

Posted 2 months ago

Apply

5 - 10 years

5 - 10 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Hiring for a top management consulting organization for the Data Architect - AI role. Kindly go through the JD in detail.

As a key pillar of our organization, the Engineering Products team works across various fields from a data & AI perspective: Data Strategy, AI Strategy, Data Modelling, Data Architecture, Cloud Assessment, Industry & AI Value Strategy, etc. This helps our customers set up a strong data platform foundation and a target roadmap to scale and evolve towards their AI/Gen AI and advanced analytics vision, meeting the evolving future needs of technology advancement.

Location: Gurgaon/Bangalore/Pune/Hyderabad/Mumbai

Who are we looking for?
- Years of experience: candidates should typically have at least 5-10 years of experience in AI strategy, management, or a related field. This experience should include hands-on involvement in developing and implementing AI strategies for clients across various industries.
- Minimum 6 years of experience working with clients in the products industry (Life Sciences, CPG, Industry & Retail) that are heavily influenced by AI & Gen AI preferences and behaviors is highly valued. Candidates who have a deep understanding of AI & Gen AI trends and market dynamics can provide valuable insights and strategic guidance to clients.
- Minimum 5 years of proven experience and deep expertise in developing and implementing AI strategy frameworks tailored to the specific needs and aims of clients within the LS, Industry, CPG, and Retail sectors. The ability to craft innovative AI solutions that address industry-specific challenges and drive tangible business outcomes will set you apart.
- Minimum 6 years of strong consulting background with a demonstrated ability to lead client engagements from start to completion, encompassing stakeholder management and effective communication to ensure successful project delivery.

Posted 2 months ago

Apply

12 - 18 years

14 - 24 Lacs

Hyderabad

Work from Office

Overview: Deputy Director - Data Engineering

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation, unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Take responsibility for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, or other stakeholders.
- Increase awareness about available data and democratize access to it across the company.

As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build and operations, and you will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company.
As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities:
- Serve as data engineering lead for D&Ai data modernization (MDIP).
- Be flexible to work an alternative schedule: a traditional Monday-to-Friday work week, Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on product and project requirements.
- Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work, as well as empowering them to realize their full potential.
- Design, structure, and store data in unified data models and link them together to make the data reusable for downstream products.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL.
- Enable and accelerate standards-based development, prioritizing reuse of code and adopting test-driven development, unit testing, and test automation with end-to-end observability of data.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance, and cost.
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards.
- Define and manage SLAs for data products and processes running in production.
- Create documentation for learnings and knowledge transfer to internal associates.

Qualifications:
- 12+ years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools.
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL, or any other popular RDBMS.
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks.
- 4+ years of cloud data engineering experience in Azure or AWS. Fluency with Azure cloud services; Azure Data Engineering certification is a plus.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one business intelligence tool such as Power BI or Tableau.
- Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like ADO and GitHub, and with CI/CD tools for DevOps automation and deployments.
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- BA/BS in Computer Science, Math, Physics, or another technical field.
- Candidates are expected to be in the office at the assigned location at least 3 days a week, with the days at work coordinated with their immediate supervisor.

Skills, Abilities, Knowledge:
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
- Proven track record of leading and mentoring data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude, adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
- Fosters a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attains or exceeds individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
- Domain knowledge of the CPG industry, with a Supply Chain/GTM background, is preferred.

Posted 2 months ago

Apply

8 - 10 years

15 - 30 Lacs

Hyderabad, Pune, Chennai

Work from Office

Warm greetings from SP Staffing Services Pvt Ltd!

Experience: 8-10 years
Work Location: Hyderabad/Pune/Chennai

Job Description: Experience as a Power BI Developer, with data modelling, data visualization, and API skills. We are seeking a Senior Power BI Developer with over 5 years of experience in data analytics and business intelligence. The ideal candidate will have a deep understanding of Power BI data modeling and data visualization. You will be responsible for designing, developing, and maintaining business intelligence solutions to help our organization make data-driven decisions. Experience in DAX is required.

Key Responsibilities:
- Design, develop, and maintain Power BI reports and dashboards.
- Collaborate with business stakeholders to understand their data needs and translate them into technical requirements.
- Create and optimize data models to support reporting and analytics.
- Integrate Power BI with various data sources, including databases, cloud services, and APIs.
- Ensure data accuracy and integrity in all reports and dashboards.
- Provide training and support to end users on Power BI tools and functionalities.
- Stay up to date with the latest Power BI features and best practices.

Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.

Posted 2 months ago

Apply

8 - 13 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- C.CTC
- E.CTC
- Notice Period
- Current location
- Are you serving notice period / available immediately?
- Experience in Snowflake
- Experience in Matillion

Shift timings: 2:00 PM - 11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving / immediate candidates can apply.

Interview Process: 1 round (virtual) + final round (F2F)

Please note: Work From Office only (no hybrid or Work From Home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSIR, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 2 months ago

Apply

3 - 8 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- C.CTC
- E.CTC
- Notice Period
- Current location
- Are you serving notice period / available immediately?
- Experience in Snowflake
- Experience in Matillion

Shift timings: 2:00 PM - 11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving / immediate candidates can apply.

Interview Process: 2 rounds (virtual) + final round (F2F)

Please note: Work From Office only (no hybrid or Work From Home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSIR, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 2 months ago

Apply

8 - 13 years

30 - 35 Lacs

Bengaluru

Work from Office

Power BI Architect - J48917

As a Data Engineer (BI Analytics & DWH), you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower our organization to make data-driven decisions. You will leverage your expertise in Power BI, Tableau, and ETL processes to create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytical and DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Use SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.

Required Candidate Profile:
Candidate experience: 8 to 15 years
Candidate degree: BE-Comp/IT

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies