Job Description: Data Architect (Azure / GCP / Databricks)
About Us
Niveus Solutions is a dynamic and innovative organization dedicated to leveraging data to drive business growth and decision-making. We are seeking a highly skilled Senior Data Architect to join our team and play a pivotal role in designing, building, and maintaining robust data platforms on Azure and GCP.
Responsibilities
- Solutioning and Consultation
- Data Architecture: Develop and implement comprehensive data architectures, including data warehouses, data lakes, and data lakehouses, on Azure and GCP platforms.
- Data Modeling: Design and create data models that align with business requirements and support efficient data analysis and reporting.
- ETL/ELT: Develop and optimize ETL/ELT pipelines using tools like Databricks, Azure Data Factory, or GCP Data Fusion to extract, transform, and load data into various data stores.
- Data Warehousing: Design and implement scalable data warehouses on Azure Synapse Analytics or GCP BigQuery to support enterprise reporting and analytics.
- Data Lakehouses: Design and implement data lakehouses on Azure Databricks or GCP Dataproc to enable unified data management and analytics.
- Hadoop Ecosystem: Leverage Hadoop components (HDFS, MapReduce, Spark) for distributed data processing and analysis.
- Data Governance: Establish and maintain data governance policies and procedures to ensure data quality, security, and compliance.
- Scripting: Write scripts using languages like Python, SQL, or Scala to automate data tasks and integrate with other systems.
- Cloud Platform Expertise: Deep understanding of Azure and GCP cloud platforms, including their data services, storage options, and compute resources.
- Team Leadership: Mentor and coach junior data architects and data engineers.
- Collaboration: Collaborate with business stakeholders, data analysts, and software developers to understand data requirements and deliver solutions.
Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture, data warehousing, and data lakehouse implementation.
- Strong proficiency in Azure and GCP data services, including Azure Synapse Analytics, Azure Databricks, GCP BigQuery, and GCP Dataproc, as well as Snowflake.
- Expertise in ETL/ELT tools and techniques.
- Solid understanding of Hadoop components and distributed data processing.
- Proficiency in scripting languages like Python, SQL, and Scala.
- Experience with data governance and compliance frameworks.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
Bonus Points
- Microsoft Certified: Azure Data Engineer Associate or Google Cloud Professional Data Engineer certification.
- Experience with real-time data processing and streaming technologies at scale.
- Experience with data visualization tools like Power BI, Tableau, or Looker.
- Familiarity with machine learning and artificial intelligence concepts.
- Knowledge of cloud-native data platforms and technologies.
If you are a passionate data architect with a proven track record of delivering successful data solutions, we encourage you to apply. Join our team and make a significant impact on our organization's data-driven journey.