
Data Modeling Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the employer's job portal.

Experience: 2.0 - 7.0 years

Salary: 8 - 11 Lacs

Location: Noida

Work Mode: Work from Office

Responsibilities:

- Data Pipeline Development: Design, build, and maintain robust, scalable data pipelines to support data processing, transformation, and analysis.
- Data Modeling: Develop and implement data models and schemas that optimize data storage, retrieval, and analysis for efficiency and accuracy.
- Data Integration: Integrate data from various sources and systems, including databases, APIs, and external sources, into unified datasets for analysis and reporting.
- ETL Processes: Design and implement efficient Extract, Transform, Load (ETL) processes to extract data from source systems, transform it into a usable format, and load it into data warehouses or other storage systems.
- Data Quality Assurance: Implement data quality checks and validation processes to ensure accuracy, completeness, and consistency across datasets.
- Performance Optimization: Optimize data pipelines and processing workflows for performance, scalability, and cost-effectiveness, leveraging cloud computing platforms and distributed computing frameworks.
- Data Governance: Establish and enforce data governance policies and standards, ensuring compliance with data privacy regulations and security best practices.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver solutions that meet business needs.
- Documentation and Maintenance: Document data pipelines, workflows, and processes, and provide ongoing maintenance and support to ensure the reliability and integrity of data infrastructure.

Requirements:

- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or related roles, with a strong focus on building and maintaining data pipelines and infrastructure.
- Proficiency in programming languages such as Python, SQL, and/or Scala.
- Experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery) and distributed computing frameworks (e.g., Spark).
- Solid understanding of data modeling concepts, relational and non-relational databases, and data storage solutions (e.g., HDFS, S3).
- Experience with data integration tools and technologies (e.g., Apache Kafka, Apache NiFi).
- Strong problem-solving skills and the ability to analyze complex data challenges and propose effective solutions.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- A proactive attitude toward learning and professional development, with a passion for exploring new technologies and approaches in data engineering.

Preferred:

- Master's degree in Computer Science, Engineering, or a related field.
- Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and associated services (e.g., AWS Glue, Azure Data Factory).
- Knowledge of data governance frameworks and data privacy regulations (e.g., GDPR, CCPA).
- Certifications in data engineering or related fields (e.g., AWS Certified Big Data - Specialty, Google Professional Data Engineer).
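To give a flavor of the ETL and data-quality work this role describes, here is a minimal sketch in Python (one of the languages listed in the requirements). The source records, table name, and quality rules are hypothetical illustrations, not taken from the posting; SQLite stands in for a real warehouse such as Snowflake or Redshift.

```python
import sqlite3

# Hypothetical source records, standing in for an upstream API or database.
SOURCE_ROWS = [
    {"id": 1, "city": " Noida ", "salary_lacs": "9.5"},
    {"id": 2, "city": "Delhi", "salary_lacs": "11"},
    {"id": 3, "city": None, "salary_lacs": "-1"},  # should fail quality checks
]

def extract():
    """Extract: pull raw records from the source system."""
    return list(SOURCE_ROWS)

def transform(rows):
    """Transform: normalize types and trim text fields."""
    return [
        {
            "id": int(r["id"]),
            "city": r["city"].strip() if r["city"] else None,
            "salary_lacs": float(r["salary_lacs"]),
        }
        for r in rows
    ]

def validate(rows):
    """Quality checks: drop records with missing keys or out-of-range values."""
    return [r for r in rows if r["city"] and r["salary_lacs"] > 0]

def load(rows, conn):
    """Load: write clean records into the target store (SQLite here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS jobs "
        "(id INTEGER PRIMARY KEY, city TEXT, salary_lacs REAL)"
    )
    conn.executemany("INSERT INTO jobs VALUES (:id, :city, :salary_lacs)", rows)
    conn.commit()

def run_pipeline(conn):
    load(validate(transform(extract())), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*) FROM jobs").fetchone()[0])  # 2 rows pass the checks
```

In a production pipeline the same extract/transform/validate/load stages would typically run under an orchestrator (e.g., Airflow) with the validation step emitting metrics rather than silently dropping rows.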

Posted 18 hours ago

Apply
