Data Engineer and Architect (AWS/Azure/GCP/Oracle)

5 - 10 years

12 - 20 Lacs

Posted: 23 hours ago | Platform: Naukri


Work Mode

Remote

Job Type

Full Time

Job Description

Role & responsibilities

  • Build and deploy data platforms, data warehouses, and big data solutions across industries and domains (BFSI, Manufacturing, Healthcare, eCommerce, IoT, Digital Twin, etc.)
  • Integrate, transform, and consolidate data from various structured and unstructured data systems.
  • Expertise in data ingestion, transformation, storage, and analysis, often using Azure services, including migration from legacy on-premises systems
  • Essential skills include SQL, Python, and R, plus knowledge of ETL/ELT processes and big data technologies such as Apache Spark with Scala and PySpark (a minimal PySpark sketch follows this list).
  • Maintain data integrity, resolve data-related issues, and ensure the reliability and performance of data solutions
  • Work with stakeholders to provide real-time data analytics, monitor data pipelines, and optimize performance and scalability
  • Strong understanding of data management fundamentals, data warehousing, and data modeling.
  • Big data technologies: HDFS, Spark, HBase, Hive, Sqoop, Kafka, RabbitMQ, Flink
  • Implement seamless data integration between Azure/AWS/GCP storage and the Snowflake platform (see the Snowflake loading sketch after this list).
  • Identify and resolve performance bottlenecks, optimize queries, and ensure the overall efficiency of data pipelines.
  • Lead the development and management of data infrastructure, including tools, dashboards, queries, reports, and scripts, ensuring automation of recurring tasks while maintaining data quality and integrity
  • Implement and maintain data security measures, ensuring compliance with industry standards and regulations.
  • Ensure data architecture aligns with business requirements and best practices.
  • Experience in Power BI, Tableau, or Looker
  • Manage, administer, and maintain data streaming tools such as Kafka/Confluent Kafka and Flink (see the consumer sketch after this list)
  • Experience in test-driven development and building libraries, with proficiency in Pandas, NumPy, Elasticsearch, and Apache Beam (a test-first sketch follows this list)
  • Familiarity with CI/CD pipelines, monitoring, and infrastructure-as-code (e.g., Terraform, CloudFormation).
  • Proficient in query optimization, data partitioning, indexing strategies, and caching mechanisms.
  • Ensure compliance with GDPR, SOX, and other regulations across data workflows
  • Exposure to event/file/table formats such as Avro, Parquet, Iceberg, and Delta
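
A minimal, hypothetical PySpark ETL sketch of the kind of work described above: read raw data, clean and type it, then write date-partitioned Parquet so downstream engines can prune partitions. The bucket paths, column names, and partition key are illustrative assumptions, not details from this posting.

    # Hypothetical ETL sketch; paths and column names are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw CSV files (header row assumed).
    raw = spark.read.option("header", True).csv("s3://raw-bucket/orders/")

    # Transform: cast amounts, drop bad rows, derive a date partition key
    # (assumes created_at holds ISO-formatted dates).
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
           .withColumn("order_date", F.to_date("created_at"))
    )

    # Load: partitioned Parquet enables partition pruning in later queries.
    (clean.write.mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3://curated-bucket/orders/"))

Partitioning on a low-cardinality date column like this is a common first step toward the query-optimization and data-partitioning proficiency the posting asks for.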
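
For the cloud-to-Snowflake integration bullet, a sketch using the official snowflake-connector-python, assuming an external stage already points at the cloud storage prefix; the account, credentials, stage, and table names are placeholders.

    # Hypothetical S3-to-Snowflake load; all identifiers are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",        # use a secrets manager in practice
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Assumes the external stage @ORDERS_STAGE is already defined.
        cur.execute("""
            COPY INTO RAW.ORDERS
            FROM @ORDERS_STAGE
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """)
    finally:
        conn.close()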
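
For the streaming bullet, a minimal consumer loop with the confluent-kafka client; the broker address, topic, and group id are assumptions.

    # Hypothetical Kafka consumer; connection details are assumptions.
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "orders-etl",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])

    try:
        while True:
            msg = consumer.poll(1.0)   # wait up to 1s for a record
            if msg is None:
                continue
            if msg.error():
                print("consumer error:", msg.error())
                continue
            # A real pipeline would deserialize (e.g. Avro) and route here.
            print(msg.key(), msg.value())
    finally:
        consumer.close()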
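
And for the test-driven development bullet, one way it might look in practice: a small Pandas transformation alongside the pytest test that drives it; the function and column names are invented for illustration.

    # Hypothetical test-first example; names are illustrative.
    import pandas as pd
    import pandas.testing as pdt

    def add_order_total(df: pd.DataFrame) -> pd.DataFrame:
        """Return a copy with a total = quantity * unit_price column."""
        out = df.copy()
        out["total"] = out["quantity"] * out["unit_price"]
        return out

    def test_add_order_total():
        df = pd.DataFrame({"quantity": [2, 3], "unit_price": [5.0, 1.5]})
        result = add_order_total(df)
        expected = df.assign(total=[10.0, 4.5])
        pdt.assert_frame_equal(result, expected)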

Must-have Skills (at least one or two of the following)

  • Azure: Data Factory (ADF), Databricks, Synapse, Data Lake Storage, Time Series Insights, Azure SQL Database, SQL Server, Presto, SSIS
  • AWS: data services such as S3, Glue Studio, Redshift, Athena, EMR, Airflow, IAM, dbt, Lambda, RDS, DynamoDB, Neo4j, Amazon Neptune (see the Athena sketch after this list)
  • GCP: BigQuery, SQL, Composer, Dataflow, Dataform, dbt, Python, Cloud Functions, Dataproc with PySpark, Cloud Storage, Pub/Sub, Vertex AI, GKE (see the BigQuery sketch after this list)
  • Oracle: OCI Object Storage, OCI Data Integration, Oracle Database, Oracle Analytics Cloud (OAC), Autonomous Data Warehouse (ADW), NetSuite Analytics Warehouse (NSAW), PL/SQL, Exadata
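
As a sketch of the AWS stack, a boto3 call that submits an Athena query and polls for completion; the region, database, query, and results bucket are assumptions.

    # Hypothetical Athena query via boto3; identifiers are placeholders.
    import time
    import boto3

    athena = boto3.client("athena", region_name="us-east-1")
    resp = athena.start_query_execution(
        QueryString="SELECT order_date, SUM(amount) FROM orders GROUP BY 1",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://results-bucket/athena/"},
    )
    qid = resp["QueryExecutionId"]

    # Poll until the query leaves the QUEUED/RUNNING states.
    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)
    print("query finished:", state)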
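
And for the GCP stack, the equivalent with google-cloud-bigquery; the project, dataset, and table are placeholders, and credentials are assumed to come from the environment.

    # Hypothetical BigQuery query; the table path is a placeholder.
    from google.cloud import bigquery

    client = bigquery.Client()   # picks up credentials from the environment
    sql = """
        SELECT order_date, SUM(amount) AS revenue
        FROM `my-project.analytics.orders`
        GROUP BY order_date
        ORDER BY order_date
    """
    for row in client.query(sql).result():   # result() waits for the job
        print(row["order_date"], row["revenue"])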
