
7 DBSchema Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Modeler with 6 to 9 years of experience in GCP environments, you will play a crucial role in designing schema architecture, creating performance-efficient data models, and guiding teams on cloud-based data integration best practices. Your expertise will focus on GCP data platforms such as BigQuery, CloudSQL, and AlloyDB.

Your responsibilities will include architecting and implementing scalable data models for cloud data warehouses and databases, optimizing OLTP/OLAP systems for reporting and analytics, supporting cloud data lake and warehouse architecture, and reviewing and optimizing existing schemas for cost and performance on GCP. You will also define documentation standards, ensure model version tracking, and collaborate with DevOps and DataOps teams for deployment consistency.

Key Requirements:
- Deep knowledge of GCP data platforms including BigQuery, CloudSQL, and AlloyDB
- Expertise in data modeling, normalization, and dimensional modeling
- Understanding of distributed query engines, table partitioning, and clustering
- Familiarity with DBSchema or similar tools

Preferred Skills:
- Prior experience in BFSI or asset management industries
- Working experience with Data Catalogs, lineage, and governance tools

Soft Skills:
- Collaborative and consultative mindset
- Strong communication and requirements-gathering skills
- Organized and methodical approach to data architecture challenges

By joining our team, you will have the opportunity to contribute to modern data architecture in a cloud-first enterprise, influence critical decisions around GCP-based data infrastructure, and be part of a future-ready data strategy implementation team.
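
As a rough illustration of the table partitioning and clustering this listing calls for, the sketch below creates a BigQuery table partitioned by date and clustered on a frequent filter column. The project, dataset, table, and column names are hypothetical, and the snippet assumes GCP credentials are already configured.

```python
# Illustrative sketch only: hypothetical project/dataset/column names.
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are already configured

ddl = """
CREATE TABLE IF NOT EXISTS `my_project.analytics.trades` (
  trade_id   STRING    NOT NULL,
  account_id STRING    NOT NULL,
  trade_ts   TIMESTAMP NOT NULL,
  amount     NUMERIC
)
PARTITION BY DATE(trade_ts)  -- date partitions cut scanned bytes (and cost)
CLUSTER BY account_id        -- clustering co-locates rows filtered by account
"""
client.query(ddl).result()  # wait for the DDL job to finish
```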

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will create and optimize data models for both OLTP and OLAP systems, ensuring they are well designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. You will also support schema documentation, reverse engineering, and visualization tasks.

Must-Have Skills:
- Proficiency with the DBSchema modeling tool
- Strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB
- Knowledge of OLTP and OLAP system structures and performance tuning
- Expertise in SQL and schema evolution/versioning best practices

Preferred Skills:
- Experience integrating DBSchema with CI/CD pipelines
- Knowledge of real-time ingestion pipelines and federated schema design

Soft Skills:
- Detail-oriented, organized, and communicative
- Comfortable presenting schema designs to cross-functional teams

By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.
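
As a hedged illustration of the schema evolution/versioning practice listed above, the sketch below applies numbered SQL migration files in order and records each one so it runs exactly once. The migrations/ directory layout and version table are assumptions, and SQLite stands in for a real CloudSQL or AlloyDB connection.

```python
# Illustrative sketch: the migrations/ layout and version table are assumptions.
import pathlib
import sqlite3  # stand-in engine; the same pattern applies to CloudSQL/Postgres

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS schema_version (id TEXT PRIMARY KEY)")

# Apply files such as migrations/001_init.sql in lexical order,
# recording each one so reruns are no-ops.
for path in sorted(pathlib.Path("migrations").glob("*.sql")):
    applied = conn.execute(
        "SELECT 1 FROM schema_version WHERE id = ?", (path.stem,)
    ).fetchone()
    if applied:
        continue
    conn.executescript(path.read_text())
    conn.execute("INSERT INTO schema_version (id) VALUES (?)", (path.stem,))
    conn.commit()
```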

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will work as a Data Schema Designer in Chennai, focusing on designing clean, extensible, and high-performance schemas for GCP data platforms. The role is crucial in standardizing data design, enabling scalability, and ensuring cross-system consistency. Your responsibilities will include creating and maintaining unified data schema standards across BigQuery, CloudSQL, and AlloyDB; collaborating with engineering and analytics teams to identify modeling best practices; ensuring schema alignment with ingestion pipelines, transformations, and business rules; developing entity relationship diagrams and schema documentation templates; and assisting in the automation of schema deployments and version control.

Must-Have Skills:
- Expert knowledge of schema design principles for GCP platforms
- Proficiency with schema documentation tools such as DBSchema and dbt docs
- Deep understanding of data normalization, denormalization, and indexing strategies
- Hands-on experience with OLTP and OLAP schemas

Preferred Skills:
- Exposure to CI/CD workflows and Git-based schema management
- Experience in metadata governance and data cataloging

Soft Skills:
- Precision and clarity in technical documentation
- Collaborative mindset with attention to performance and quality

By joining this role, you will be the backbone of reliable and scalable data systems, influence architectural decisions through thoughtful schema design, and work with modern cloud data stacks and enterprise data teams.

Skills: GCP, denormalization, metadata governance, OLAP schemas, Git-based schema management, CI/CD workflows, data cataloging, schema documentation tools (e.g., DBSchema, dbt docs), indexing strategies, OLTP schemas, collaboration, analytics, technical documentation, schema design principles for GCP platforms, data normalization
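
Because this role pairs schema documentation templates with deployment automation, here is a self-contained sketch of generating a plain data dictionary straight from a live schema. SQLite and the two sample tables are stand-ins chosen only so the example runs anywhere; a real pipeline would point at CloudSQL or BigQuery metadata instead.

```python
# Illustrative sketch: SQLite and the sample tables are stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (
  order_id    INTEGER PRIMARY KEY,
  customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
  placed_at   TEXT NOT NULL
);
""")

# Emit one documentation section per table, column by column.
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
).fetchall()
for (table,) in tables:
    print(f"## {table}")
    print("| column | type | nullable |")
    print("|--------|------|----------|")
    for _, col, ctype, notnull, *_ in conn.execute(f"PRAGMA table_info({table})"):
        print(f"| {col} | {ctype} | {'no' if notnull else 'yes'} |")
```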

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a versatile Data Model Developer with 6 to 9 years of experience, proficient in designing robust data models across cloud (GCP) and traditional RDBMS environments. The role involves collaborating with cross-functional teams to develop schemas that serve both operational systems and analytical use cases. Your key responsibilities include designing and implementing scalable data models for cloud (GCP) and traditional RDBMS; supporting hybrid data architectures integrating real-time and batch workflows; collaborating with engineering teams on seamless schema implementation; documenting conceptual, logical, and physical models; aligning ETL and data pipelines with schema definitions; and monitoring and refining performance through partitioning and indexing strategies.

Must-Have Skills:
- Experience with GCP data services such as BigQuery, CloudSQL, and AlloyDB
- Proficiency in relational databases such as PostgreSQL, MySQL, or Oracle
- Solid grounding in OLTP/OLAP modeling principles
- Familiarity with schema design tools like DBSchema and ER/Studio
- SQL expertise for query performance optimization

Preferred Skills:
- Experience with hybrid cloud/on-prem data architectures
- Functional knowledge of BFSI or asset management domains
- Knowledge of metadata management and schema versioning

Soft Skills:
- Adaptability to cloud and legacy tech stacks
- Clear communication with engineers and analysts
- Strong documentation and collaboration skills

Joining this role will allow you to contribute to dual-mode data architecture (cloud + on-prem), solve real-world data design challenges in regulated industries, and influence platform migration and modernization.
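
As a hedged sketch of the partitioning and indexing refinements this listing mentions, the snippet below targets PostgreSQL (one of the RDBMS options named). The connection string, table, and partition bounds are illustrative assumptions, and it assumes a reachable Postgres 10+ instance.

```python
# Illustrative sketch: connection details and names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=appdb")  # assumes a reachable Postgres instance
with conn, conn.cursor() as cur:
    # Range-partition the hot table by timestamp so old data stays out of scans.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events (
            event_id BIGINT GENERATED ALWAYS AS IDENTITY,
            event_ts TIMESTAMPTZ NOT NULL,
            payload  JSONB
        ) PARTITION BY RANGE (event_ts);
    """)
    # One partition per month keeps scans, indexes, and vacuums bounded.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events_2024_01
        PARTITION OF events
        FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
    """)
    # Index the common filter column on the partition itself.
    cur.execute(
        "CREATE INDEX IF NOT EXISTS events_2024_01_ts_idx "
        "ON events_2024_01 (event_ts);"
    )
```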

Posted 1 day ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Pune

Hybrid

Hi, greetings! This is regarding a job opportunity for the position of Data Modeller with a US-based MNC in the healthcare domain. This opportunity is on the direct payroll of the US-based MNC.

Job Location: Pune, Mundhwa
Mode of Work: Hybrid (3 days work from office)
Shift Timings: 1pm to 10pm

About the Company: The global MNC is a mission-driven startup transforming the healthcare payer industry. With deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, it has built an innovative data integration and management platform that gives healthcare payers access to data that has historically been siloed and inaccessible. As a result, these payers can ingest and manage all the information they need to transform their business, supporting their analytical, operational, and financial needs through the platform, improving patient outcomes and reducing healthcare costs. Since its founding in 2017, the company has built a highly successful SaaS business, raising more than $80 million from leading VC firms with profound expertise in the healthcare and technology industries. It is solving massive, complex problems in an industry ready for disruption; join and help shape the future of healthcare data.

Interview Process: 5 rounds of interview
- 4 rounds of technical interview
- 1 round of HR or fitment discussion

Job Description: Data Modeller

About the Role: We're seeking a Data Modeler to join our global data modeling team. You'll play a key role in translating business requirements into conceptual and logical data models that support both operational and analytical use cases. This is a high-impact opportunity to work with cutting-edge technologies and contribute to the evolution of healthcare data platforms.

What You'll Do:
- Design and build conceptual and logical data models aligned with enterprise architecture and healthcare standards.
- Perform data profiling and apply data integrity principles using SQL.
- Collaborate with cross-functional teams to ensure models meet client and business needs.
- Use tools like Erwin, ER/Studio, DBT, or similar for enterprise data modeling.
- Maintain metadata, business glossaries, and data dictionaries.
- Support client implementation teams with data model expertise.

What We're Looking For:
- 2+ years of experience in data modeling and cloud-based data engineering.
- Proficiency in enterprise data modeling tools (Erwin, ER/Studio, DBSchema).
- Experience with Databricks, Snowflake, and data lakehouse architectures.
- Strong SQL skills and familiarity with schema evolution and data versioning.
- Deep understanding of healthcare data domains (Claims, Enrollment, Provider, FHIR, HL7, etc.).
- Excellent collaboration and communication skills.

In case you have any queries, please feel free to contact me at the email address or phone number below.

Thanks & Regards,
Priyanka Das
Email: priyanka.das@dctinc.com
Contact Number: 74399 37568
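
One of the bullets above is data profiling with SQL; the self-contained sketch below shows the shape of that task (row counts, null rates, and distinct counts per column). SQLite stands in for the real warehouse, and the claims table and its columns are hypothetical.

```python
# Illustrative sketch: sqlite3 and the claims table are stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (claim_id TEXT, member_id TEXT, paid_amount REAL);
INSERT INTO claims VALUES
  ('c1', 'm1', 120.0),
  ('c2', 'm1', NULL),
  ('c3', NULL, 40.5);
""")

# Profile each column: total rows, nulls, and distinct non-null values.
for col in ("claim_id", "member_id", "paid_amount"):
    total, nulls, distinct = conn.execute(f"""
        SELECT COUNT(*),
               SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END),
               COUNT(DISTINCT {col})
        FROM claims
    """).fetchone()
    print(f"{col}: rows={total} nulls={nulls} distinct={distinct}")
```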

Posted 2 days ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller specializing in GCP and cloud databases, you will play a crucial role in designing and optimizing data models for both OLTP and OLAP systems. Your expertise in cloud-based databases, data architecture, and modeling will be essential in collaborating with engineering and analytics teams to ensure efficient operational systems and real-time reporting pipelines.

You will design conceptual, logical, and physical data models tailored to OLTP and OLAP systems. Your focus will be on developing and refining models that support performance-optimized cloud data pipelines; implementing models in BigQuery, CloudSQL, and AlloyDB; and designing schemas with indexing, partitioning, and data sharding strategies. Translating business requirements into scalable data architecture and schemas will be a key aspect of the role, along with optimizing for near-real-time ingestion, transformation, and query performance. You will use tools like DBSchema for collaborative modeling and documentation while creating and maintaining metadata and documentation around models.

Required Skills:
- Hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB)
- Strong understanding of OLTP and OLAP systems
- Proficiency in database performance tuning
- Familiarity with modeling tools such as DBSchema or ERWin
- Proficiency in SQL, schema definition, and normalization/denormalization techniques

Preferred Skills:
- Functional knowledge of the Mutual Fund or BFSI domain
- Experience integrating with cloud-native ETL and data orchestration pipelines
- Familiarity with schema version control and CI/CD in a data context

Soft Skills:
- Strong analytical and communication abilities
- Attention to detail
- A collaborative approach across engineering, product, and analytics teams

Joining this role will provide the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.
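
For the near-real-time ingestion side of this role, here is a minimal sketch of streaming rows into a BigQuery table via the client library's insert_rows_json call. The table path and row fields are hypothetical, and the target table is assumed to already exist with a matching schema.

```python
# Illustrative sketch: hypothetical table path and row shape.
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are already configured

rows = [
    {"order_id": "o-1001", "placed_at": "2024-05-01T09:30:00Z", "total": 42.5},
]
# Streaming inserts land within seconds, feeding near-real-time reporting.
errors = client.insert_rows_json("my_project.sales.orders", rows)
if errors:
    raise RuntimeError(f"streaming insert failed: {errors}")
```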

Posted 2 days ago

Apply

12.0 - 15.0 years

12 - 15 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Modeler

Key Responsibilities:
As a Data Modeler, you will:
- Data Model Design: Perform hands-on data modeling for both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) systems, covering conceptual, logical, and physical data modeling.
- Database Performance Optimization: Apply a strong, practical understanding of indexing, partitioning, and data sharding to optimize database performance, especially for near-real-time reporting and application interaction.
- Tool Utilization: Work with at least one data modeling tool, preferably DBSchema or Erwin.
- GCP Database Understanding: Leverage a good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
- Collaboration: Collaborate with teams to understand requirements and translate them into data models.

Mandatory Skills & Experience:
- Data Modeling: Hands-on experience in data modeling for OLTP and OLAP systems.
- Modeling Concepts: In-depth knowledge of conceptual, logical, and physical data modeling.
- Database Performance: Strong understanding of indexing, partitioning, and data sharding, with practical experience.
- Performance Variables: Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
- Modeling Tools: Working experience with at least one data modeling tool, preferably DBSchema or Erwin.
- GCP Databases: Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
- Experience & Qualifications: Demonstrated experience applying database optimization techniques (indexing, partitioning, sharding).

Essential Professional Skills:
- Domain Knowledge (Plus): Functional knowledge of the mutual fund industry will be a plus.
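
To make the OLTP-versus-OLAP distinction above concrete, here is an annotated sketch contrasting a normalized transactional design with a star-schema analytical one. All table and column names are illustrative, with CloudSQL-style DDL assumed for the OLTP side and BigQuery-style DDL for the OLAP side.

```python
# Illustrative sketch: hypothetical tables; CloudSQL-style vs BigQuery-style DDL.
OLTP_DDL = """
CREATE TABLE account (                 -- normalized: one row per account
  account_id  BIGINT PRIMARY KEY,
  holder_name TEXT NOT NULL
);
CREATE TABLE txn (                     -- narrow rows, keyed for point lookups
  txn_id     BIGINT PRIMARY KEY,
  account_id BIGINT REFERENCES account(account_id),
  txn_ts     TIMESTAMP NOT NULL,
  amount     NUMERIC NOT NULL
);
"""

OLAP_DDL = """
CREATE TABLE dim_account (             -- dimension: descriptive attributes
  account_key INT64,
  holder_name STRING,
  segment     STRING
);
CREATE TABLE fact_txn (                -- fact: additive measures
  account_key INT64,
  txn_date    DATE,                    -- natural partition column
  amount      NUMERIC
);
"""

print(OLTP_DDL, OLAP_DDL)  # the two shapes a modeler moves between
```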

Posted 1 month ago

Apply