Posted: 1 hour ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Senior Data Engineer

Location:

Experience: 3–5 years

Work Mode: On-site

About the Role

We are seeking a Senior Data Engineer to lead data initiatives across our construction-focused enterprise tools. You will architect scalable systems, build automated ETL/ELT workflows, standardize data governance, and enable real-time insights, predictive analytics, and automated operational intelligence across the organization. The role calls for strong technical depth, system architecture capability, and hands-on experience.

Key Responsibilities

Data Architecture & System Design

  • Architect modular, scalable, and secure data platforms.
  • Design centralized data models to support analytics, forecasting, AI, and automation workloads.
  • Define data governance standards including lineage, quality, and retention.
  • Produce HLD/LLD architecture diagrams, integration maps, and technical documentation.
  • Build frameworks that support complex cross-system workflows.

Integrations Across Enterprise Tools (Sage 300, Paperless, WorkMax, Procore, Kojo, CRM, etc.)

  • Lead integrations across construction and operations software including:
  • Sage 300 (Financials, Job Costing, AP/AR, GL)
  • Paperless (Document workflows, invoice automation)
  • WorkMax (Timecards, units, production, field data)
  • Procore (Project management, RFIs, submittals, schedules)
  • Kojo (Procurement, materials management)
  • CRM systems (Sales pipeline, customer records)
  • Develop connectors using REST APIs, GraphQL, webhooks, secure file transfer, and database links.
  • Standardize schemas across tools to create a single source of truth.
  • Ensure secure and reliable data ingestion from on-prem and cloud systems.
  • Implement robust incremental sync, batch loads, and real-time event-based ingestion.

ETL/ELT Development & Data Pipelines

  • Build automated ETL/ELT workflows using Airflow, NiFi, Azure Data Factory, or custom frameworks.
  • Ensure pipelines are fault-tolerant, versioned, and optimized for performance.
  • Handle complex transformations, reconciliations, and multi-system data validation.
  • Develop end-to-end data ingestion frameworks supporting large structured and unstructured datasets.
  • Deploy pipelines into production with CI/CD practices.

Data Warehousing, Modeling & Real-Time Reporting

  • Build enterprise-wide data warehouse layers using Snowflake (preferred), Azure SQL, or cloud data stores.
  • Maintain real-time operational data models for job cost, purchasing, productivity, CRM, safety, compliance, and field activity.
  • Create unified analytical datasets for dashboards and business intelligence.
  • Support executive and departmental reporting by eliminating manual spreadsheets and fragmented data silos.
  • Ensure high data accuracy, reliability, and performance at scale.

Predictive Analytics & AI Enablement

  • Prepare AI-ready datasets for forecasting, anomaly detection, productivity modeling, and operational risk scoring.
  • Collaborate with data scientists/AI engineers to build and deploy ML pipelines.
  • Develop streaming/near-real-time data flows to support AI-driven decision automation.
  • Integrate predictive insights back into operational systems (alerts, workflows, dashboards).

Automation & Operational Workflows

  • Automate KPI tracking, threshold-based alerts, and data validation rules.
  • Build automated workflows for:
  • RFIs
  • CORs
  • Contracts
  • Vendor documents
  • Field reporting
  • Implement monitoring frameworks for pipeline health, error handling, retry logic, and SLA compliance.

Documentation & Knowledge Sharing

  • Create detailed documentation for data models, transformations, integrations, and pipelines.
  • Conduct internal training and handover sessions to engineering, analytics, and operations teams.
  • Maintain continuous improvement of data standards and reusable components.

Skills & Experience Required

Core Experience

  • 3–5 years of hands-on Data Engineering / Data Platform development.
  • Strong expertise in API-based integrations, especially with enterprise applications.
  • Practical experience with:
  • Python, FastAPI, Node.js
  • ETL frameworks: Airflow / NiFi / Azure Data Factory
  • Snowflake (strongly preferred) or Azure-based data warehousing
  • SQL + cloud databases
  • Azure Functions, Storage, Data Lake environments

Data Architecture & Modeling

  • Strong data modeling principles (Kimball, Data Vault, or custom patterns).
  • Building scalable data lakes, warehouse schemas, and analytical data marts.

Tools & Platforms

  • Dashboarding tools: Power BI / Tableau
  • Experience working with large structured, semi-structured, and unstructured datasets.
  • Background in financial or field operations workflows (preferred).

Preferred Knowledge

  • Working with industry tools such as:
  • Sage 300
  • Procore
  • WorkMax
  • Kojo
  • Paperless
  • Experience in building ML/AI data pipelines.
  • Exposure to Azure DevOps, CI/CD, and cloud deployment best practices.

Key Competencies

  • Strong system design and architecture thinking.
  • High ownership, accountability, and delivery focus.
  • Excellent analytical and problem-solving abilities.
  • Clear communication and documentation skills.
  • Ability to work with leadership, operations, finance, field teams, and engineering.
  • Commitment to data accuracy, reliability, and operational performance.

What We Offer

  • Opportunity to build a next-generation Enterprise AI Data Platform.
  • Career growth into Data Platform Lead / Engineering Manager roles.
  • Medical insurance and employee benefits.
  • Supportive leadership, high visibility, and impactful responsibilities.
  • Competitive compensation aligned with expertise and contribution.
