Senior Data Engineer (5-7 Years) | AWS, Databricks & Python

Experience: 5 years


Posted: 15 hours ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Company Description

VOPAIS is dedicated to driving technological innovation that enhances the human experience. Our expertise lies in designing transformative digital solutions, creating intuitive applications that address complex challenges while delivering joy to users. We empower our clients through integrity, transparency, and accountability, offering both comprehensive software development services and premium team deployment solutions. VOPAIS seamlessly integrates skilled technical teams into organizations, enabling them to thrive in an evolving technological landscape. Our mission is to connect advanced technology with meaningful outcomes for businesses and their people.


Role Description

We are seeking a Senior Data Engineer for a full-time, remote position to architect, implement, and optimize scalable data solutions. In this role, you will design and maintain data pipelines, infrastructure, and systems to support data workflows, analytics, and reporting. You will collaborate with cross-functional teams to ensure data quality and accessibility, implement best practices for data governance and security, and support the strategic use of data insights. As a key part of our engineering team, you will contribute to fostering innovation at the intersection of data and technology.


Qualifications

Responsibilities:

  • Design, develop & maintain scalable AWS data lakes/pipelines with Databricks
  • Integrate, transform, and centralize large-scale data from varied sources
  • Implement/manage Delta Lake architecture (Databricks Delta or Apache Hudi)
  • Build end-to-end data workflows with PySpark, Databricks Notebooks, and Python (see the brief sketch after this list)
  • Develop data warehouses/marts (Snowflake, Redshift, etc.)
  • Optimize data storage, queries, and cost across Databricks/AWS
  • Build CI/CD for Databricks/Python, maintain version control (Git)
  • Collaborate with cross-functional teams for high-performance data solutions
  • Drive technical/architectural decisions, troubleshoot clusters & ETL jobs
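
For illustration only, a minimal sketch of the kind of PySpark/Delta Lake pipeline step described above. The bucket paths, table, and column names are hypothetical, and it assumes a Databricks runtime or a Spark session with Delta Lake configured; it is not a prescribed implementation.

```python
from pyspark.sql import SparkSession, functions as F

# Assumes Delta Lake is available (built into the Databricks runtime).
spark = (
    SparkSession.builder
    .appName("orders-raw-to-silver")  # hypothetical job name
    .getOrCreate()
)

# Hypothetical S3 locations; on Databricks these might instead be Unity Catalog tables.
RAW_PATH = "s3://example-bucket/raw/orders/"
SILVER_PATH = "s3://example-bucket/silver/orders/"

# Ingest raw JSON files landed by an upstream source.
raw_orders = spark.read.json(RAW_PATH)

# Basic cleansing: de-duplicate, standardize types, derive a partition column,
# and stamp the load time for lineage.
silver_orders = (
    raw_orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("_ingested_at", F.current_timestamp())
)

# Write a partitioned Delta table for downstream analytics and reporting.
(
    silver_orders.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(SILVER_PATH)
)
```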


Core skills and requirements:

  • 5+ years building/managing AWS Data Lake Architectures
  • 3+ years with AWS services (S3, Glue, Redshift, etc.)
  • 3+ years with Databricks, Delta Lake, PySpark, ETL
  • Hands-on with Python, data automation, API integration
  • CI/CD, Git best practices (Terraform/CloudFormation a plus)
  • Bachelor's in Computer Science, IT, Data Science, or a related field
  • Experience in Agile environments
  • Strong SQL, RDBMS, data modeling, governance (Unity Catalog/DLT/MLflow a plus)
  • AWS/Databricks certifications and security/compliance knowledge are valued


Send your CV & Cover Letter to careers@vopais.com.


Kindly review the detailed Job Description before applying.


#DataEngineer #Databricks #AWS #Python #BigData #DataLake #DataJobs #CI/CD #ETL #Redshift #Snowflake #VopaisCareers #ApplyNow
