Hybrid
Full Time
Looking for immediate joiners only. Please share your updated CV, CTC, ECTC, and Notice Period/LWD at monika.yadav@ness.com

Job Description: Data Engineer

As a Data Engineer, you will develop and maintain data pipelines and be involved in the design of data solutions for ESG. You will implement and manage clusters for streaming using technologies such as Postgres, Oracle, Scala, Azure Data Lake, Spark, Kafka, Databricks, ETL, and advanced SQL.

You will be responsible for:
- Converting existing manual and semi-automated data ingress/egress processes into automated data pipelines
- Creating data pipelines for AI/ML using Python/PySpark (see the illustrative PySpark sketch after the Qualifications section)
- The full operational lifecycle of the data platform, including creating a streaming platform and helping with Kafka application development
- Implementing scalable solutions to meet ever-increasing data volumes using big data/cloud technologies (Apache Spark, Kafka, any cloud platform, etc.)

Job Requirements:
- 4 years of experience in Big Data technologies
- Experience in developing data processing flows using Python/PySpark
- Hands-on experience with data ingestion, data cleansing, ETL, data mart creation, and exposing data to consumers
- Strong experience in Oracle/PostgreSQL
- Experience implementing and managing large-scale clusters for streaming (Kafka, Flink, Druid, NoSQL databases such as MongoDB, etc.)
- Experience with Elasticsearch, Splunk, Kibana, or similar technologies
- Good to have: experience with a Business Intelligence tool (Qlik Sense)
- Knowledge of microservices
- Familiarity with packages such as NumPy/pandas is desirable

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a similar field (minimum qualification)
- The experience requirements listed above
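To illustrate the kind of work described above, here is a minimal PySpark sketch of a streaming ingestion pipeline: reading a Kafka topic with Structured Streaming, applying a light cleansing step, and landing the result on a data lake. This is not part of the posting; the broker address, topic name, event schema, and output paths are hypothetical placeholders, and a real pipeline at Ness would differ.

# Minimal sketch, assuming a hypothetical "esg-metrics" Kafka topic and data-lake paths.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (
    SparkSession.builder
    .appName("esg-ingestion-sketch")  # hypothetical application name
    .getOrCreate()
)

# Hypothetical schema for incoming ESG metric events
event_schema = StructType([
    StructField("company_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("reported_at", TimestampType()),
])

# Read the raw Kafka stream (placeholder broker and topic)
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "esg-metrics")
    .load()
)

# Parse the JSON payload and drop malformed or incomplete records (basic cleansing)
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .where(F.col("company_id").isNotNull() & F.col("value").isNotNull())
)

# Write micro-batches to a data-lake path, partitioned by metric; the checkpoint
# location is required by Spark for fault-tolerant streaming sinks
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/mnt/datalake/esg/metrics")            # placeholder output path
    .option("checkpointLocation", "/mnt/datalake/_chk/esg")  # placeholder checkpoint path
    .partitionBy("metric")
    .start()
)

query.awaitTermination()

The same structure applies to batch ETL (swap readStream/writeStream for read/write against Oracle or PostgreSQL via JDBC), which is how the manual ingress/egress processes mentioned above would typically be automated.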
Ness