Data Engineer - Python/SQL

5 years

0 Lacs

Noida, Uttar Pradesh, India

Posted: 1 day ago | Platform: LinkedIn


Skills Required

Python, SQL, data design, Databricks, ETL, Apache Spark, data processing, analytics, data governance, security, compliance, AI/ML, Scala, data analysis, AWS, Azure, GCP, data architecture, DevOps, MLflow, Kubernetes, Docker, Terraform, Kafka, reporting, Power BI, Tableau, Looker

Work Mode

On-site

Job Type

Full Time

Job Description

Job Responsibilities:

- Design, implement, and optimize big data pipelines in Databricks.
- Develop scalable ETL workflows to process large datasets (a minimal illustrative sketch follows this section).
- Leverage Apache Spark for distributed data processing and real-time analytics.
- Implement data governance, security policies, and compliance standards.
- Optimize data lakehouse architectures for performance and cost-efficiency.
- Collaborate with data scientists, analysts, and engineers to enable advanced AI/ML workflows.
- Monitor and troubleshoot Databricks clusters, jobs, and performance bottlenecks.
- Automate workflows using CI/CD pipelines and infrastructure-as-code practices.
- Ensure data integrity, quality, and reliability across all pipelines.

Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of hands-on experience with Databricks and Apache Spark.
- Proficiency in SQL, Python, or Scala for data processing and analysis.
- Experience with cloud platforms (AWS, Azure, or GCP) for data engineering.
- Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture.
- Experience with CI/CD tools and DevOps best practices.
- Familiarity with data security, compliance, and governance best practices.
- Strong problem-solving and analytical skills and the ability to work in a fast-paced environment.

Preferred Qualifications:

- Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer).
- Hands-on experience with MLflow, Feature Store, or Databricks SQL.
- Exposure to Kubernetes, Docker, and Terraform.
- Experience with streaming data architectures (Kafka, Kinesis, etc.).
- Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker).
- Prior experience working with retail, e-commerce, or ad-tech data platforms.

(ref:hirist.tech)
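To give a concrete feel for the Databricks ETL work described in the responsibilities, here is a minimal PySpark sketch of a batch pipeline that reads raw data and writes it to a Delta table. It assumes a Databricks runtime where Delta Lake is available; the storage paths, column names, and table layout are hypothetical placeholders, not part of this posting.

```python
# Minimal batch ETL sketch: raw JSON -> cleansed Delta table.
# Assumes a Databricks/Spark environment with Delta Lake installed;
# paths and columns below are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

RAW_PATH = "s3://example-bucket/raw/orders/"         # hypothetical source
CURATED_PATH = "s3://example-bucket/curated/orders"  # hypothetical Delta target

# Extract: read raw JSON events.
raw = spark.read.format("json").load(RAW_PATH)

# Transform: basic typing and cleansing; drop obviously invalid rows.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
       .dropDuplicates(["order_id"])
)

# Load: append to a Delta table, partitioned by date for cheaper scans.
(
    orders.withColumn("order_date", F.to_date("order_ts"))
          .write.format("delta")
          .mode("append")
          .partitionBy("order_date")
          .save(CURATED_PATH)
)
```

In a production setting this kind of job would typically be scheduled as a Databricks job and deployed through the CI/CD and infrastructure-as-code practices listed above.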
