Work from Office
Full Time
Job Description & Summary:
We are seeking an experienced Senior Data Architect to lead the design and development of our data architecture, leveraging cloud-based technologies, big data processing frameworks, and DevOps practices. The ideal candidate will have a strong background in data warehousing, data pipelines, performance optimization, and collaboration with DevOps teams.

Responsibilities:
1. Design and implement end-to-end data pipelines using cloud-based services (AWS/GCP/Azure) and conventional data processing frameworks.
2. Lead the development of data architecture, ensuring scalability, security, and performance.
3. Collaborate with cross-functional teams, including DevOps, to design and implement data lakes, data warehouses, and data ingestion/extraction processes.
4. Develop and optimize data processing workflows using PySpark, Kafka, and other big data processing frameworks.
5. Ensure data quality, integrity, and security across all data pipelines and architectures.
6. Provide technical leadership and guidance to junior team members.
7. Design and implement data load strategies, data partitioning, and data storage solutions.
8. Collaborate with stakeholders to understand business requirements and develop data solutions that meet those needs.
9. Work closely with the DevOps team to ensure seamless integration of data pipelines with the overall system architecture.
10. Participate in the design and implementation of CI/CD pipelines for data workflows.

DevOps Requirements:
1. Knowledge of DevOps practices and tools such as Jenkins, GitLab CI/CD, or Apache Airflow.
2. Experience with containerization using Docker.
3. Understanding of infrastructure-as-code (IaC) concepts using tools like Terraform or AWS CloudFormation.
4. Familiarity with monitoring and logging tools such as Prometheus, Grafana, or the ELK Stack.

Requirements:
1. 12-14 years of experience in data architecture, data warehousing, and big data processing.
2. Strong expertise in cloud-based technologies (AWS/GCP/Azure) and data processing frameworks (PySpark, Kafka, Flink, Beam, etc.).
3. Experience with data ingestion, data extraction, data warehousing, and data lakes.
4. Strong understanding of performance optimization, data partitioning, and data storage solutions.
5. Excellent leadership and communication skills.
6. Experience with NoSQL databases is a plus.

Mandatory skill sets:
1. Experience with agile development methodologies.
2. Certification in cloud-based technologies (AWS/GCP/Azure) or data processing frameworks.
3. Experience with data governance, data quality, and data security.

Preferred skill sets:
Knowledge of Agentic AI and GenAI is an added advantage.
PwC India