Posted: 1 day ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Hi, please find the job description below.

Job Title: GCP Data Modeler
Duration: Full Time
Location: Hybrid (Hyderabad, Chennai, Bengaluru, Pune, Nagpur)
Skills: GCP, BigQuery, Dataflow, LookML, Looker, SQL, Python

Overview

We are seeking a highly skilled and experienced Senior Data Modeler to join our data and analytics team. The ideal candidate will have deep expertise in data modeling, particularly on Google Cloud Platform (GCP), and a strong background in managing complex data projects. This role involves designing scalable data models, optimizing workflows, and ensuring seamless data integration to support strategic business decisions.

Key Responsibilities

- Data Modeling: Design, develop, and maintain conceptual, logical, and physical data models to support data warehousing and analytics needs. Ensure data models are scalable, efficient, and aligned with business requirements.
- Database Design: Create and optimize database schemas, tables, views, indexes, and other database objects in Google BigQuery. Implement best practices for database design to ensure data integrity and performance.
- ETL Processes: Design and implement ETL (Extract, Transform, Load) processes to integrate data from various source systems into BigQuery. Use tools like Google Cloud Dataflow, Apache Beam, or other ETL tools to automate data pipelines.
- Data Integration: Work closely with data engineers to ensure seamless integration and consistency of data across different platforms. Integrate data from on-premises systems, third-party applications, and other cloud services into GCP.
- Data Governance: Implement data governance practices to ensure data quality, consistency, and security. Define and enforce data standards, naming conventions, and documentation.
- Performance Optimization: Optimize data storage, processing, and retrieval to ensure high performance and scalability. Use partitioning, clustering, and other optimization techniques in BigQuery.
- Collaboration: Collaborate with business stakeholders, data scientists, and analysts to understand data requirements and translate them into effective data models. Provide technical guidance and mentorship to junior team members.
- Data Visualization: Work with data visualization tools like Looker, Looker Studio, or Tableau to create interactive dashboards and reports. Develop LookML models in Looker to enable efficient data querying and visualization.
- Documentation: Document data models, ETL processes, and data integration workflows. Maintain up-to-date documentation to facilitate knowledge sharing and onboarding of new team members.

Required Expertise

- Looker: 2-5+ years of strong proficiency in Looker, including LookML, dashboard creation, and report development.
- BigQuery: 5+ years of extensive experience with Google BigQuery, including data warehousing, SQL querying, and performance optimization.
- SQL & Python: 10+ years of advanced SQL and Python skills for data manipulation, querying, and modeling.
- ETL: 10+ years of hands-on experience with ETL processes and tools for data integration from various source systems.
- Cloud Services: Familiarity with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, and Dataflow.
- Data Modeling Techniques: Proficiency in techniques such as star schema, snowflake schema, normalized and denormalized models, and dimensional modeling. Knowledge of data modeling frameworks, including Data Mesh, Data Vault, Medallion architecture, and the Kimball and Inmon methodologies, is highly advantageous.
- Problem-Solving: Excellent problem-solving skills and the ability to work on complex, ambiguous projects.
- Communication: Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Project Delivery: Proven track record of delivering successful data projects and driving business value through data insights.

Preferred Qualifications

- Education: Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or a related field.
- Certifications: Google Cloud certification relevant to data modeling or data engineering.
- Visualization Tools: Experience with other data visualization tools such as Looker Studio and Tableau.
- Programming: Familiarity with programming languages such as Python for data manipulation and analysis.
- Data Warehousing: Knowledge of data warehousing concepts and best practices.
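For context on the partitioning and clustering techniques the role calls for, a minimal BigQuery DDL sketch is shown below. The dataset, table, and column names are hypothetical illustrations, not part of this posting.

```sql
-- Hypothetical fact table illustrating BigQuery partitioning and clustering.
-- Partitioning by order_date lets the engine prune scans to the requested
-- date range; clustering by customer_id and region co-locates related rows
-- within each partition, reducing bytes read on filtered queries.
CREATE TABLE sales_dw.fact_orders (
  order_id    STRING NOT NULL,
  customer_id STRING,
  region      STRING,
  order_date  DATE,
  amount      NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_id, region;
```

A query that filters on the partition column, such as `WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'`, then scans only the matching partitions instead of the full table.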

