Data Engineer Solution Architect

Experience: 8–12 years

Salary: 40–48 Lacs

Posted: 3 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Required Qualifications

  • 8–12 years of experience in data engineering and architecture, including hands-on solution delivery.
  • Deep expertise with Snowflake or Databricks, with strong working knowledge of tools like dbt, Matillion, SQL, and Python or PySpark.
  • Experience designing and implementing data pipelines and orchestration using tools like Airflow, Control-M, or equivalent.
  • Familiarity with cloud-native data engineering services (such as AWS Glue, Redshift, Athena, GCP BigQuery, Dataflow, and Pub/Sub) or similar.
  • Strong understanding of data modelling, ELT/ETL design, and modern architecture frameworks (medallion, layered, or modular architectures).
  • Experience integrating and troubleshooting APIs and real-time data ingestion technologies (Kafka, Kinesis, Pub/Sub, REST APIs).
  • Familiarity with traditional ETL and data integration tools (Informatica, SSIS, Oracle Data Integrator, etc.).
  • Excellent understanding of data governance, performance tuning, and DevOps for data (CI/CD, version control, monitoring).
  • Strong communication, problem-solving, and stakeholder management skills.

Preferred Qualifications

  • Certifications such as Snowflake SnowPro, Databricks Certified Architect, AWS Data Analytics Specialty, or Google Professional Data Engineer.
  • Prior consulting or client-facing experience.
  • Exposure to AI/ML, data quality, or metadata management frameworks.
  • Experience leading solution design across multi-cloud or hybrid environments.

Key Responsibilities

  • Design and architect end-to-end data solutions using technologies like Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, and cloud-native services on AWS/Azure/GCP.
  • Define and implement data ingestion, transformation, integration, and orchestration frameworks for structured and semi-structured data.
  • Architect data lakes and data warehouses with an emphasis on scalability, cost optimization, performance, and governance.
  • Support real-time and API-based data integration scenarios; design solutions for streaming, micro-batch, and event-driven ingestion.
  • Lead design and delivery of data visualization and reporting solutions using tools such as Power BI, Tableau, and Streamlit.
  • Collaborate with business and technical stakeholders to define requirements, design architecture blueprints, and ensure alignment with business objectives.
  • Establish and enforce engineering standards, frameworks, and reusable assets to improve delivery efficiency and solution quality.
  • Mentor data engineers and help build internal capability on emerging technologies.
  • Provide thought leadership around modern data platforms, AI/ML integration, and data modernization strategies.
Skills: Snowflake, AWS, integration, Airflow, cloud, data engineering, dbt, architecture
