Posted: 22 hours ago
Hybrid | Full Time
Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Proven experience (8+ years) in a data engineering role, with expertise in designing and building data pipelines, ETL processes, and data warehouses.
• Strong proficiency in SQL, Python, and Spark.
• Strong experience with cloud platforms such as AWS, Azure, or GCP is a must.
• Hands-on experience with big data technologies such as Hadoop, Spark, Kafka, and distributed computing frameworks.
• Knowledge of data lake and data warehouse solutions, including Databricks, Snowflake, Amazon Redshift, Google BigQuery, Azure Data Factory, Airflow, etc.
• Experience implementing CI/CD pipelines to automate build, test, and deployment processes.
• Solid understanding of data modeling concepts, data warehousing architectures, and data management best practices.
• Excellent communication and leadership skills, with the ability to collaborate effectively with cross-functional teams and drive consensus on technical decisions.
• Relevant certifications (e.g., Azure, Databricks, Snowflake) are a plus.
Apex Systems
Other openings (location | salary):
Noida | 7.0 - 12.0 Lacs P.A.
Noida, Pune, Bengaluru | 16.0 - 27.5 Lacs P.A.
Hyderabad | 0.5 - 0.5 Lacs P.A.
Chennai, Bengaluru | 3.25 - 8.25 Lacs P.A.
Gurgaon | 5.0 - 8.0 Lacs P.A.
(location not listed) | Not disclosed
Bengaluru | 15.0 - 30.0 Lacs P.A.
Gurugram, Haryana, India | Not disclosed
Hyderabad | 12.0 - 16.0 Lacs P.A.
Chennai, Tamil Nadu, India | Not disclosed