Data Architect

7 years

0.0 Lacs P.A.

Kochi, Kerala, India

Posted: 3 weeks ago | Platform: LinkedIn


Skills Required

Data Architecture, Governance, Processing, Engineering, Writing, Automation, Power, Design, Integrity, Security, Sourcing, Flow, Strategies, AWS, Hive, ML, Databricks, Spark, ETL, Airflow, Redshift, Kafka, Kubernetes, Docker, Analytics, Analysis, Relational, SQL, NoSQL, Python, Tableau, Networking, Storage, Optimization, Leadership, Development, Agile

Work Mode

Not specified

Job Type

Full Time

Job Description

The Data Architect is responsible for defining and leading Data Architecture, Data Quality, and Data Governance, covering the ingestion, processing, and storage of millions of rows of data per day. This hands-on role helps solve real big data problems. You will work with our product, business, and engineering stakeholders to understand our current ecosystems, build consensus on solution designs, write code and automation, define standards, establish best practices across the company, and build world-class data solutions and applications that power crucial business decisions throughout the organization. We are looking for an open-minded, structured thinker who is passionate about building systems at scale.

Role

- Design, implement, and lead Data Architecture, Data Quality, and Data Governance
- Define data modeling standards and foundational best practices
- Develop and evangelize data quality standards and practices
- Establish data governance processes, procedures, policies, and guidelines to maintain the integrity and security of the data
- Drive the successful adoption of organizational data utilization and self-service data platforms
- Create and maintain critical data standards and metadata that allow data to be understood and leveraged as a shared asset
- Develop standards and write template code for sourcing, collecting, and transforming data for streaming or batch processing
- Design data schemas, object models, and flow diagrams to structure, store, process, and integrate data
- Provide architectural assessments, strategies, and roadmaps for data management
- Apply hands-on subject matter expertise in the architecture and administration of big data platforms and data lake technologies (AWS S3/Hive), along with experience with ML and data science platforms
- Implement and manage industry best practice tools and processes such as Data Lake, Databricks, Delta Lake, S3, Spark ETL, Airflow, Hive Catalog, Redshift, Kafka, Kubernetes, Docker, and CI/CD
- Translate big data and analytics requirements into data models that operate at large scale and high performance, and guide the data analytics engineers on these data models
- Define templates and processes for the design and analysis of data models, data flows, and integration
- Lead and mentor Data Analytics team members in best practices, processes, and technologies in data platforms

Qualifications

- B.S. or M.S. in Computer Science, or equivalent degree
- 10+ years of hands-on experience in Data Warehouse, ETL, Data Modeling & Reporting
- 7+ years of hands-on experience in productionizing and deploying big data platforms and applications
- Hands-on experience working with relational/SQL databases, distributed columnar data stores/NoSQL databases, time-series databases, Spark Streaming, Kafka, Hive, Delta, Parquet, Avro, and more
- Extensive experience in understanding a variety of complex business use cases and modeling the data in the data warehouse
- Highly skilled in SQL, Python, Spark, AWS S3, Hive Data Catalog, Parquet, Redshift, Airflow, and Tableau or similar tools
- Proven experience in building a custom enterprise data warehouse or implementing tools like data catalogs, Spark, Tableau, Kubernetes, and Docker
- Knowledge of infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience in Amazon Web Services (AWS)
- Strong verbal and written communication skills; must work effectively across internal and external organizations and virtual teams
- Demonstrated industry leadership in the fields of Data Warehousing, Data Science, and big data related technologies
- Strong understanding of distributed systems and container-based development using the Docker and Kubernetes ecosystem
- Deep knowledge of data structures and algorithms
- Experience working in large teams using CI/CD and agile methodologies

Litmus7
