Posted: 8 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Position Details:


The Data Engineer is responsible for designing, developing, and maintaining robust, scalable, and secure data pipelines and platforms that enable efficient storage, processing, and consumption of data across the enterprise. The role ensures high-quality, timely, and governed data delivery to data scientists, analysts, and business users.

The Data Engineer plays a crucial role in building cloud-native data architectures, integrating data from diverse sources, and supporting the full lifecycle of data—from ingestion to transformation to consumption. The role ensures that the data infrastructure is reliable, compliant, and optimized to support advanced analytics and AI use cases across business functions.

  • Qualification:

    Technical graduate (Engineering degree). Certifications in cloud data platforms (Azure Data Engineer, AWS Big Data, GCP Data Engineer), Big Data tools, or database technologies are preferred.
  • Experience:

    5–10 years of experience in data engineering or related roles. Proven experience in building and optimizing data pipelines, cloud data architecture, and handling large-scale datasets. Experience working with cross-functional data science, analytics, and IT teams.


Key Responsibilities

Data Pipeline Development

  • Design, develop, and manage scalable ETL/ELT pipelines to extract, transform, and load data
  • Automate data ingestion from a variety of sources, including ERP, CRM, APIs, file systems, databases, IoT devices, and other third-party systems
  • Implement efficient batch and real-time data processing pipelines
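
Illustrative only (not part of the role description): a minimal sketch of one batch ETL step in Python, assuming a CSV file drop, a pandas transformation, and a SQLAlchemy-compatible target database. All paths, connection strings, and column names are placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder source and target; real pipelines read these from config/secrets.
SOURCE_PATH = "landing/sales_batch.csv"
TARGET_DSN = "postgresql+psycopg2://user:password@host:5432/warehouse"

def extract(path: str) -> pd.DataFrame:
    """Read a raw file drop into a DataFrame."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleanup: normalize columns, drop bad rows, add a load timestamp."""
    df = df.rename(columns=str.lower).dropna(subset=["order_id"])
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")
    return df

def load(df: pd.DataFrame, table: str = "stg_sales") -> None:
    """Append the cleaned batch into a staging table."""
    engine = create_engine(TARGET_DSN)
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract(SOURCE_PATH)))
```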

Data Integration & Middleware Interaction

  • Integrate data from source systems via middleware platforms and API-based connectors
  • Leverage OData, REST, SOAP, and other data protocols to extract data from third-party platforms (e.g., SAP, Salesforce, legacy systems)
  • Collaborate with integration teams to establish secure and reliable connections via middleware (e.g., MuleSoft, Azure Integration Services)
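
A hedged sketch of API-based extraction: paging through a REST/OData endpoint with the requests library. The endpoint URL, bearer token, and $top/$skip paging parameters are assumptions for illustration; real connectors (SAP, Salesforce, middleware platforms) each have their own conventions.

```python
import requests

# Hypothetical OData endpoint and token; in practice these come from secure config.
BASE_URL = "https://example.erp.local/odata/v4/SalesOrders"
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}
PAGE_SIZE = 500

def fetch_all() -> list[dict]:
    """Page through an OData collection using $top/$skip and collect all records."""
    records, skip = [], 0
    while True:
        resp = requests.get(
            BASE_URL,
            headers=HEADERS,
            params={"$top": PAGE_SIZE, "$skip": skip},
            timeout=60,
        )
        resp.raise_for_status()
        batch = resp.json().get("value", [])
        if not batch:
            break
        records.extend(batch)
        skip += PAGE_SIZE
    return records
```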

Data Modelling & Architecture

  • Design and implement data models for operational and analytical use cases (e.g., star/snowflake schemas)
  • Leverage knowledge of source system schemas for ETL design and development
  • Work closely with Data Architects and Analysts to ensure alignment with business needs
  • Optimize data storage using partitioning, compression, and indexing strategies
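
An illustrative sketch (assuming a PySpark environment; table, column, and storage paths are placeholders) of writing a star-schema fact table as partitioned, compressed Parquet, which is one common way to apply the partitioning and compression guidance above.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fact_sales_build").getOrCreate()

# Placeholder staging table; in a real model, dimension keys would be resolved
# via lookups against dim_customer, dim_product, dim_date, etc.
stg = spark.table("staging.sales_orders")

fact_sales = stg.select(
    F.col("order_id"),
    F.col("customer_key"),
    F.col("product_key"),
    F.to_date("order_ts").alias("order_date"),
    F.col("amount"),
)

# Partition by date and use snappy compression to keep scans and storage efficient.
(
    fact_sales.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://lake@account.dfs.core.windows.net/gold/fact_sales",
             compression="snappy")
)
```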

Cloud Data Platform Engineering

  • Build and manage cloud-native data solutions using services like Azure Data Factory, AWS Glue, or GCP Dataflow
  • Ensure optimal performance, scalability, and cost-efficiency of cloud data environments
  • Support migration from legacy systems to modern cloud-based architectures
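
As a minimal sketch of programmatic interaction with one such service (assuming Azure Data Factory with the azure-identity and azure-mgmt-datafactory Python packages; all resource names and parameters are placeholders), triggering and polling a pipeline run might look like this:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers; in practice these come from environment/config.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-enterprise"
PIPELINE_NAME = "pl_daily_sales_load"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run with runtime parameters, then check its status.
run = adf.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-01"},
)
status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
print(f"Pipeline run {run.run_id}: {status}")
```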

Data Quality & Governance

  • Implement data validation, reconciliation, and monitoring frameworks
  • Ensure data accuracy, completeness, and integrity through automated checks
  • Collaborate with data governance teams to ensure metadata management and lineage tracking
  • Document data lineage and transformations to support transparency and auditability
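
A hedged example of the kind of automated checks implied above, assuming a pandas DataFrame and an expected source row count; the column names and rules are illustrative only.

```python
import pandas as pd

def validate_batch(df: pd.DataFrame, expected_rows: int) -> list[str]:
    """Run basic completeness/accuracy checks and return a list of failures."""
    failures = []

    # Reconciliation: row count should match the count reported by the source.
    if len(df) != expected_rows:
        failures.append(f"row count {len(df)} != source count {expected_rows}")

    # Completeness: key columns must not contain nulls.
    for col in ("order_id", "customer_key", "amount"):
        nulls = int(df[col].isna().sum())
        if nulls:
            failures.append(f"{col} has {nulls} null value(s)")

    # Integrity: business keys must be unique.
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")

    return failures
```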

Collaboration and Support

  • Partner with Data Scientists, Analysts, and BI Developers to deliver data for ML models, dashboards, and reports
  • Provide support for data issues and troubleshoot pipeline failures
  • Document code, workflows, and platform configurations for maintainability

Security & Compliance

  • Ensure secure access control, encryption, and compliance with data privacy regulations
  • Work with security teams to perform vulnerability assessments on data systems
  • Implement logging and monitoring for audit and traceability
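
A minimal sketch of structured audit logging for pipeline events, using only the Python standard library; the field names and log destination are assumptions, and real deployments would ship these events to the monitoring stack listed below (e.g., ELK, Datadog).

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_logger = logging.getLogger("pipeline.audit")

def audit_event(pipeline: str, action: str, user: str, status: str) -> None:
    """Emit one structured, timestamped audit record per pipeline action."""
    audit_logger.info(json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "pipeline": pipeline,
        "action": action,
        "user": user,
        "status": status,
    }))

# Example: record a successful load run by a service account.
audit_event("pl_daily_sales_load", "load", "svc-dataeng", "success")
```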

Technology Background

  • Infrastructure & Systems:

    Build and maintain the data pipelines, databases, and other systems that the organization uses to manage its data
  • Data Processing & Transformation:

    Develop processes to transform raw data into a usable format for analysis and reporting
  • Programming Languages:

    Proficiency in Python, SQL, Scala, and shell scripting
  • ETL Tools:

    Experience with ETL tools such as Azure Data Factory, Apache NiFi, Talend, and Informatica
  • Big Data Frameworks:

    Awareness of Apache Spark, Hadoop, Kafka
  • Cloud Platforms:

    Awareness of Azure (preferred), AWS, GCP
  • Data Storage:

    Azure Data Lake, Delta Lake, Snowflake, AWS S3, BigQuery
  • Databases:

    MS SQL Server, PostgreSQL, MySQL, MongoDB
  • Orchestration:

    Airflow, Azure Data Factory, Prefect (a sample Airflow DAG appears after this list)
  • CI/CD & Version Control:

    Git, Azure DevOps, Jenkins
  • Monitoring & Logging:

    Datadog, Prometheus, ELK Stack
  • Visualization:

    Awareness of visualization tools such as Power BI
  • Data Governance Tools:

    Familiarity with data governance tools such as Collibra, Microsoft Purview, or Alation (as applicable), helping ensure that data is stored securely and that governance and security controls are maintained
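
To ground the Orchestration item above, here is a hedged sketch of a daily ETL DAG in Apache Airflow (2.x assumed); the task bodies are stubs and all names are placeholders, not a prescribed design.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Placeholder: pull data from source systems."""

def transform():
    """Placeholder: clean and reshape the extracted data."""

def load():
    """Placeholder: write the transformed data to the warehouse."""

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in sequence each day.
    extract_task >> transform_task >> load_task
```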


About Company:


Coromandel International Limited

Founded in the early 1960s (as Coromandel Fertilisers), the company is currently headquartered in Chennai with its registered office in Hyderabad.

They are one of India’s largest private-sector producers of phosphatic fertilizers and the world’s largest manufacturer of neem-based bio-pesticides. Additionally, they lead the market in organic fertilizers and operate the country’s largest agri-retail chain, with 1000+ stores serving over 2 crore farmers.
