Associate Data Solutions Architect

5 - 8 years

19 - 25 Lacs

Posted: 7 hours ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Position Responsibilities:

  • Contribute to the design, build, and upkeep of data architectures in standard databases and data warehouses
  • Translate stakeholder requirements into data models for acquisition and implementation
  • Write Python code for ETL/ELT tasks, data validation, and automation (a minimal sketch follows this list)
  • Build and operate simple, repeatable data pipelines across multi- and hybrid-cloud environments
  • Develop and support basic data access APIs to enable reliable data consumption
  • Prepare curated datasets and features to support data scientists' analysis and modeling
  • Create collection frameworks for structured and unstructured data; address low-complexity data issues
  • Implement data quality checks, monitoring, and clear documentation for datasets and pipelines
  • Share guidance on data standards, API usage, and Python practices within the team
  • Apply automation and CI/CD to Python code, pipelines, and APIs for consistent delivery
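
The posting does not prescribe specific tooling, but as a rough illustration of the ETL and validation work listed above, here is a minimal sketch assuming a CSV source, a hypothetical orders schema, and a SQLite target standing in for a relational database:

```python
import csv
import sqlite3

REQUIRED_COLUMNS = {"order_id", "customer_id", "amount"}  # hypothetical schema


def extract(path: str) -> list[dict]:
    """Read raw rows from a CSV source file."""
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))


def validate(rows: list[dict]) -> list[dict]:
    """Keep only rows with all required columns and a numeric amount."""
    clean = []
    for row in rows:
        if not REQUIRED_COLUMNS.issubset(row):
            continue
        try:
            row["amount"] = float(row["amount"])
        except (TypeError, ValueError):
            continue  # a real pipeline would log or quarantine bad rows
        clean.append(row)
    return clean


def load(rows: list[dict], db_path: str = "orders.db") -> None:
    """Upsert validated rows into a relational table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders VALUES (:order_id, :customer_id, :amount)",
            rows,
        )


if __name__ == "__main__":
    load(validate(extract("orders.csv")))
```

In practice the load step would target the team's warehouse rather than SQLite, and rejected rows would be logged or quarantined rather than silently dropped.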

EMPLOYER WILL NOT SPONSOR APPLICANTS FOR EMPLOYMENT VISA STATUS.

Basic Qualifications (Required Skills/Experience):

  • Hands-on Python experience for data engineering (ETL/ELT scripting, data validation, and automation)
  • Working knowledge of SQL and relational databases, including basic data modeling concepts
  • Ability to integrate data from multiple sources and formats with attention to data quality
  • Experience building simple pipelines and using scripts or basic orchestration to move data
  • Familiarity with calling RESTful APIs from Python (a brief sketch follows this list)
  • Experience preparing datasets and features in collaboration with data scientists
  • Understanding of core cloud concepts and working within multi- or hybrid-cloud setups
  • Exposure to version control and basic CI/CD workflows (e.g., Git, automated testing)
  • Effective communication and collaboration with technical and non-technical stakeholders
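
As an illustration of the RESTful API point above, a minimal consumption sketch using the third-party requests library might look like the following; the endpoint, resource name, and page-based pagination scheme are all assumptions:

```python
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical endpoint


def fetch_records(resource: str, page_size: int = 100) -> list[dict]:
    """Page through a REST endpoint, failing fast on HTTP errors."""
    records: list[dict] = []
    page = 1
    with requests.Session() as session:
        while True:
            resp = session.get(
                f"{BASE_URL}/{resource}",
                params={"page": page, "per_page": page_size},
                timeout=10,
            )
            resp.raise_for_status()  # surface 4xx/5xx instead of parsing bad payloads
            batch = resp.json()
            if not batch:
                break  # an empty page signals the end of the collection
            records.extend(batch)
            page += 1
    return records


if __name__ == "__main__":
    print(len(fetch_records("customers")))
```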

Preferred Qualifications (Desired Skills/Experience):

  • Production experience creating Python-based pipelines and jobs on AWS, Azure, or GCP
  • Use of workflow/orchestration tools (e.g., Airflow, Prefect, or Dagster); an example DAG follows this list
  • Creation and maintenance of data access APIs or SDKs, including authentication and testing
  • Work with modern warehousing/lakehouse platforms (e.g., Snowflake, Redshift, BigQuery, Databricks, Delta Lake)
  • Implementation of CI/CD for data projects (tests, linting, packaging, and deployment for Python code and pipelines)
  • Experience with containerization and environment management (e.g., Docker, Conda, Poetry) and infrastructure-as-code (e.g., Terraform, CloudFormation)
  • Knowledge of data governance and security practices (e.g., access control, encryption, PII handling)
  • Familiarity with streaming or messaging systems (e.g., Kafka, Kinesis, Pub/Sub) and data observability tools
  • Ability to create technical documentation and mentor peers on Python, data standards, and API patterns
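
As a sketch of the orchestration point above, a minimal Airflow DAG using the TaskFlow API (assuming Airflow 2.4+; the DAG name and all task bodies are placeholders) might look like:

```python
from datetime import datetime

from airflow.decorators import dag, task  # Airflow 2.x TaskFlow API


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw rows from the upstream source.
        return [{"order_id": "1", "amount": 10.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder: validate and enrich rows before loading.
        return [r for r in rows if r.get("amount", 0) > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write curated rows to the warehouse.
        print(f"loaded {len(rows)} rows")

    # Chaining the calls wires up the task dependencies.
    load(transform(extract()))


daily_orders_pipeline()
```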

Typical Education & Experience:

Bachelor's or Master's degree in Computer Science/Engineering (Software/Instrumentation/Electronics/Electrical/Mechanical or an equivalent discipline) with 5 to 8 years of experience.

Boeing

Aviation and Aerospace Component Manufacturing

Arlington, VA
