Overview
The Data Solution Architect is responsible for designing scalable and secure data solutions while also supporting pre-sales activities, proposal development, and client engagement.
Key Responsibilities
1. Solution Design & Architecture
- Design end-to-end data architectures including data lakes, data warehouses, lakehouses, and real-time pipelines.
- Develop reusable architectural patterns, reference models, and solution frameworks.
- Evaluate emerging technologies and recommend optimal components for each solution.
2. Proposal Development & Pre-Sales Support
- Collaborate with sales teams to understand client needs and craft compelling technical proposals.
- Develop solution blueprints, technical write-ups, effort estimations, and pricing inputs.
- Participate in RFP/RFI responses, ensuring that architecture, scope, and approach are clearly articulated.
- Deliver technical presentations, demos, and workshops to clients.
- Translate business problems into clear, value-driven technical solutions.
3. Client Engagement
- Work closely with clients to understand data challenges, requirements, and long-term goals.
- Serve as the technical point of contact during discovery sessions and pre-sales discussions.
- Build trust with executive stakeholders by articulating the value and feasibility of proposed data solutions.
4. Data Platform Development
- Guide the implementation of modern cloud-based data platforms (AWS, Azure, or GCP).
- Oversee engineering teams in building robust ETL/ELT pipelines, data models, and integration layers (see the pipeline sketch after this list).
- Ensure solutions adhere to performance, quality, and reliability standards.
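To make the pipeline expectation concrete, here is a minimal ELT sketch in PySpark. It is a sketch under stated assumptions, not a prescribed implementation: the bucket paths, column names, and app name are all hypothetical.

```python
# Minimal ELT sketch with PySpark; paths and column names are
# hypothetical placeholders, not a production design.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_elt").getOrCreate()

# Extract: read raw events from a hypothetical landing zone.
raw = spark.read.json("s3://landing/orders/")

# Transform: deduplicate, enforce types, and drop invalid rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned, query-friendly curated layer.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated/orders/")
```

Partitioning the curated layer by date is one common way to meet the performance and reliability standards named above.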
5. Governance, Security & Compliance
- Define data governance strategies, security frameworks, and compliance controls.
- Ensure architectural compliance with regulations such as GDPR, HIPAA, and other industry-specific requirements.
- Establish best practices for data lifecycle management, access policies, and metadata management (see the retention sketch after this list).
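As one concrete illustration of data lifecycle management, the sketch below expires storage objects past a fixed retention window. The bucket, prefix, and 365-day window are assumptions for illustration only.

```python
# Hedged sketch of a retention policy: delete curated objects older than a
# fixed window. Bucket, prefix, and the window are illustrative assumptions.
from datetime import datetime, timedelta, timezone

import boto3

RETENTION_DAYS = 365
cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)

s3 = boto3.client("s3")
pages = s3.get_paginator("list_objects_v2").paginate(Bucket="curated-data", Prefix="orders/")
for page in pages:
    for obj in page.get("Contents", []):
        if obj["LastModified"] < cutoff:
            s3.delete_object(Bucket="curated-data", Key=obj["Key"])
```

In practice, native lifecycle rules on the storage service are usually preferable; the point is that retention should be an explicit, auditable policy rather than an afterthought.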
6. DevOps & DataOps Engineering
- Design CI/CD workflows for data pipelines, models, and platform components (e.g., Azure DevOps, GitHub Actions, Jenkins).
- Standardize automated deployment processes for ETL/ELT workflows, streaming applications, and analytical models.
- Define monitoring, logging, and alerting for data workloads using tools such as Prometheus, Grafana, CloudWatch, or Azure Monitor (see the instrumentation sketch after this list).
- Ensure platform reliability, scalability, and observability through automation and performance tuning.
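For the monitoring and alerting bullet, a minimal instrumentation sketch using the open-source prometheus_client library might look like the following; the metric names, port, and stand-in batch are illustrative assumptions.

```python
# Minimal observability sketch with prometheus_client; metric names,
# port, and the stand-in batch are illustrative.
import time

from prometheus_client import Counter, Gauge, start_http_server

ROWS_PROCESSED = Counter("etl_rows_processed_total", "Rows processed by the ETL job")
LAST_SUCCESS = Gauge("etl_last_success_unixtime", "Unix time of the last successful run")

def run_batch(rows):
    # Process the batch (stand-in), then record success metrics.
    ROWS_PROCESSED.inc(len(rows))
    LAST_SUCCESS.set_to_current_time()

if __name__ == "__main__":
    start_http_server(8000)        # expose /metrics for Prometheus to scrape
    run_batch(list(range(1_000)))  # stand-in for a real batch
    time.sleep(60)                 # keep the endpoint up long enough to be scraped
```

Alerting on staleness of a last-success timestamp like this is a common pattern for catching silently failing pipelines.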
7. Performance Optimization & Innovation
- Review architecture performance and guide optimization for scalability and cost-efficiency.
- Lead proofs of concept (POCs) to validate new technologies or solution approaches.
- Continuously evaluate and integrate modern data engineering and AI/ML trends.
Qualifications
Education & Experience
- Bachelor's or master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 6+ years of experience in data engineering, analytics platform design, or solution architecture.
- Prior experience supporting pre-sales, proposal creation, or client-facing solution work is highly desirable.
Technical Skills
- Strong hands-on expertise with cloud data platforms (Snowflake, Databricks, BigQuery, Redshift, Synapse).
- Deep understanding of data modeling, ETL/ELT frameworks, and distributed data processing (Spark, Kafka).
- Proficiency in SQL and Python; experience with CI/CD, IaC (Terraform), and containerization (Docker/Kubernetes).
- Knowledge of BI tools (Power BI, Tableau, Looker) and the ML ecosystem is a plus.
Soft Skills
- Excellent written communication for producing technical proposals and solution documents.
- Strong presentation and pre-sales communication skills with the ability to engage senior stakeholders.
- Ability to work cross-functionally with sales, engineering, and business teams.
- Analytical mindset with strong problem-solving skills.