Cloud Leader

7 - 10 years

9 - 13 Lacs

Mangaluru, Udupi

Posted: 1 week ago | Platform: Naukri

Skills Required

Apache Beam, C++, Scala, Amazon Redshift, data warehousing, SQL, cloud, load testing, Java, PostgreSQL, Spark, GCP, debugging, MySQL, Hadoop, BigQuery, ETL, big data, MongoDB, Python, Oracle, Apache Airflow, Microsoft Azure, relational databases, SQL Server, NoSQL, Cassandra, Kafka, testing methodologies, AWS, object-oriented programming

Work Mode

Hybrid

Job Type

Full Time

Job Description

Cloud Leader (Jr. Data Architect)

Requirements

- 7+ years of IT experience.
- Should have worked on any two structured databases (SQL/Oracle/Postgres) and one NoSQL database.
- Should be able to work with the presales team, proposing the best solution/architecture.
- Should have design experience on BigQuery/Redshift/Synapse.
- Manage the end-to-end product life cycle, from proposal to delivery, and regularly check with delivery on architecture improvements.
- Should be aware of security protocols for in-transit data and encryption/decryption of PII data.
- Good understanding of analytics tools for effective analysis of data.
- Should have been part of a production deployment team and a production support team.
- Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
- Experience with object-oriented/object-functional scripting languages: Python, Java, C++, Scala, etc.
- Experience in ETL and data warehousing.
- Experience with, and a firm understanding of, relational and non-relational databases such as MySQL, MS SQL Server, Postgres, MongoDB, Cassandra, etc.
- Experience with cloud platforms such as AWS, GCP, and Azure.
- Experience with workflow management using tools such as Apache Airflow.

Preferred

- Awareness of design best practices for OLTP and OLAP systems.
- Should be part of the team designing the database and pipelines.
- Should be able to propose the right architecture, including data warehouse/data mesh approaches.
- Should be aware of data sharing and multi-cloud implementations.
- Should have exposure to load testing methodologies, debugging pipelines, and delta load handling.
- Worked on heterogeneous migration projects.
- Experience on multiple cloud platforms.

Roles and Responsibilities

- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data (a brief illustrative sketch follows this list).
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using GCP data stores such as BigQuery.
- Optimize data pipelines for performance and cost for large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data processing layers.
- Interact closely with data engineers to identify the right tools to deliver product features by performing POCs.
- Be a collaborative team player who interacts with business, BAs, and other Data/ML engineers.
- Research new use cases for existing data.
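The responsibilities above describe Airflow-orchestrated ETL pipelines that load and transform data in BigQuery on GCP. The sketch below is a minimal, hypothetical illustration of such a pipeline in Python; it is not part of the posting, and the project, bucket, dataset, and table names are placeholder assumptions.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery

# Hypothetical placeholders, not taken from the posting.
PROJECT = "example-project"
RAW_TABLE = "staging.raw_events"
MART_TABLE = "mart.daily_events"


def load_raw(**_):
    # Load newline-delimited JSON files from Cloud Storage into a staging table.
    client = bigquery.Client(project=PROJECT)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )
    client.load_table_from_uri(
        "gs://example-bucket/events/*.json", RAW_TABLE, job_config=job_config
    ).result()


def transform(**_):
    # Aggregate the staging data into a reporting table with a single SQL step.
    client = bigquery.Client(project=PROJECT)
    client.query(
        f"""
        CREATE OR REPLACE TABLE `{MART_TABLE}` AS
        SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
        FROM `{RAW_TABLE}`
        GROUP BY event_date
        """
    ).result()


with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_load = PythonOperator(task_id="load_raw", python_callable=load_raw)
    aggregate = PythonOperator(task_id="transform", python_callable=transform)
    extract_load >> aggregate

In practice, a data architect would typically split staging and reporting layers across datasets and parameterize the load window for delta handling, in line with the delta-load and pipeline-debugging points listed under Preferred.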
