
208 Datahub Jobs - Page 9

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 7.0 years

8 - 10 Lacs

Bengaluru

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

5.0 - 7.0 years

8 - 10 Lacs

Visakhapatnam

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

5.0 - 7.0 years

8 - 10 Lacs

Kolkata

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

5.0 - 7.0 years

8 - 10 Lacs

Patna

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

5.0 - 7.0 years

8 - 10 Lacs

Hyderabad

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

5.0 - 7.0 years

8 - 10 Lacs

Nagpur

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

5.0 - 7.0 years

8 - 10 Lacs

Mumbai

Work from Office

An academic degree in Statistics, Data Analytics, or Computer Science, or equivalent work experience. A minimum of five (5) years of relevant professional experience with either geospatial datasets or high-volume time-series datasets. Required candidate profile: a minimum of three (3) years of experience with an open-source analytical stack (Superset, Trino, QGIS, Kepler, Mapbox, Carto, PostgreSQL, and Dagster), and a minimum of five (5) years of experience with Python.

Posted Date not available

Apply

4.0 - 9.0 years

1 - 6 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Requirements and Preferences:

Basic Qualifications:
Role: Guidewire Datahub developer

Preferred Knowledge/Skills:
Demonstrates expert abilities and extensive experience on Application Managed Service projects, including solutioning the Smartcomm integration with the Guidewire suite of applications (on-premises and SaaS), with proven success executing and leading all aspects of complex engagements within the Smartcomm application while achieving on-time and on-budget delivery, as well as the following:
- 6+ years of experience as technical lead for Datahub and ETL tools
- Strong understanding of data warehouse concepts and dimensional data modeling
- Expertise with SQL queries, analytical services, and reporting services
- Experience with one or more SDLC methodologies
- Expertise in metadata management, data modeling, data model rationalization, and database products
- Experience with data scripting and programming
- Understands the context of their project within the larger portfolio
- Strong attention to detail
- Strong analytical skills
- Strong sense of ownership and commitment to program goals
- Strong verbal and written communication skills
- Identifies and implements operational database requirements and proposed enhancements in support of requested application development or business functionality
- Develops and translates business requirements into detailed data designs
- Maps data between systems (in conjunction with Business Analysts)
- Assists engineering and development teams as the application data expert, from physical data models to production implementations
- Identifies entities, attributes, and referential relationships for data models using a robust enterprise information engineering approach
- Creates tables, indexes, and constraints, and assists programmers with SQL, tuning, and reviews
- Participates in data analysis, archiving, database design, and development activities for migration of existing data as needed
- Develops ETL interfaces from source databases and systems to the data warehouse
- Works closely with application development teams to ensure quality interaction with the database

Job Functions:
(1) Provide technical guidance and solutions.
(2) Develop and guide team members in enhancing their technical capabilities and increasing productivity.
(3) Ensure process compliance in the assigned module and participate in technical discussions and reviews.
(4) Prepare and submit status reports to minimize exposure and risks on the project and to close escalations.

Technologies:
- Guidewire PolicyCenter, BillingCenter, ClaimCenter, and Conversion
- ETL tools
- SQL competence (query performance tuning, index management, etc.) and a grasp of database structure
- Understanding of data modeling concepts
- Knowledge of at least one ETL tool (Informatica, SSIS, Talend, etc.)
- Knowledge of different SQL/NoSQL data storage techniques

If interested, please share your CV with indumathi.j@pwc.com.

Posted Date not available

Apply