Senior Data Engineer

5 - 6 years


Posted: 1 day ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Job Title: Data Engineer

Website: https://www.issc.co.in

Location: Udyog Vihar, Phase-V, Gurugram

Job type: Full-Time

Employment Type: 5 days working (no Hybrid/Work from Home)

Compensation: As per Industry Standards



Company Overview:

With the world constantly and rapidly changing, the future will be full of realigned priorities. You are keen to strengthen your firm's profitability and reputation by retaining existing clients and winning more in the market.

We at ISSC have the right resources to ensure your team has access to the right skills to deliver effective assurance and IT Advisory, whilst you build and scale your team onshore to meet your clients' broader assurance needs.

By offshoring part of the routine and less complex auditing work to ISSC, you will free up capacity in your own organization, which can be utilized in areas that require more face time with your clients, including your quest to win new clients. Having the right team on your side at ISSC will be vital as you pursue your growth plans, and it is in this role that your ISSC team stands apart. We offer a compelling case for becoming your key partner for the future.


Position Summary:

We are seeking a skilled and detail-oriented Data Engineer to join our team. As a Data Engineer, you will be responsible for developing and optimizing data pipelines, managing data architecture, and ensuring the data is easily accessible, reliable, and secure. You will work closely with data scientists, analysts, and other stakeholders to gather requirements and deliver data solutions that support business intelligence and analytics initiatives. The ideal candidate should possess strong data manipulation skills, a keen eye for detail, and the ability to work with diverse datasets. This role plays a crucial part in ensuring the quality and integrity of our data, enabling informed decision-making across the organization.


Responsibilities:

Data Pipeline Development:

  • Design, develop, and maintain scalable data pipelines to process, transform, and move large datasets across multiple platforms.
  • Ensure data integrity, reliability, and quality across all pipelines.

Data Architecture and Infrastructure:

  • Architect and manage the data infrastructure, including databases, warehouses, and data lakes.
  • Implement solutions to optimize storage and retrieval of both structured and unstructured data.

Data Integration and Management:

  • Integrate data from various sources (e.g., APIs, databases, third-party providers) into a unified system.
  • Manage ETL (Extract, Transform, Load) processes to clean, enrich, and prepare data for analysis.

Data Security and Compliance:

  • Ensure data governance, privacy, and compliance with security standards (e.g., GDPR, HIPAA).
  • Implement robust access controls and encryption protocols.

Collaboration:

  • Work closely with data scientists, analysts, and business stakeholders to gather requirements and deliver high-performance data solutions.
  • Collaborate with DevOps and software engineering teams to deploy and maintain the data infrastructure in a cloud or on-premises environment.

Performance Tuning:

  • Monitor and improve the performance of databases and data pipelines to ensure low-latency data availability.
  • Troubleshoot and resolve issues in the data infrastructure.

Documentation and Best Practices:

  • Maintain detailed documentation of data pipelines, architecture, and processes.
  • Follow industry best practices for data engineering, including version control and continuous integration.

Skills/ Requirements:

Technical Skills:

  • Proficiency in programming languages such as Python and SQL.
  • Good experience with big data technologies like Apache Spark, Hadoop, Kafka, Flink, etc.
  • Experience with cloud data platforms (AWS, Azure).
  • Familiarity with databases (SQL and NoSQL), data warehousing solutions (e.g., Snowflake, Redshift), and ETL tools (e.g., Airflow, Talend).

Data Modeling and Database Design:

  • Expertise in designing data models and relational database schemas.

Problem-Solving:

  • Strong analytical and problem-solving skills, with the ability to handle complex data issues.

Version Control and Automation:

  • Experience with CI/CD pipelines and version control tools like Git.


Professional Qualifications:

• 5 - 6 years of relevant experience.

• B.Tech in Statistics, Information Technology, or a related field.


Other Benefits:

• Free meals

• 1 happy hour every week

• 3 offsites a year

• 1 spa session every week
