Posted: 3 months ago
Hybrid
Full Time
Position - Data Engineer
Skills - AWS, PySpark, Python; real-time analytics - Azure streaming and Kafka
Years of experience - 4 to 8
Location - Pune / Kolkata / Indore
Skills:
Proficiency in programming languages and frameworks such as Python and PySpark, along with their associated libraries.
Hands-on proficiency with databases (on-prem and cloud), messaging applications, API development, and related libraries.
Exposure to and experience with Microsoft Fabric, Azure ADLS, Synapse, ADF, Databricks, Azure Stream Analytics, Blob Storage, and other associated cloud services. Knowledge of real-time analytics with Kafka, Azure Streaming, etc.
Knowledge of Collibra, Atlan, OpenMetadata, Flink, Airflow would be considered a plus.
Role Description:
1. Own end-to-end Data Engineering and Analytics solutions, including data, technology infrastructure, and models.
2. Design and develop solutions aligned with the business objectives.
3. Explore and visualize data to understand it, then identify differences in data distribution that could affect performance when solutions are deployed in the real world.
4. Verify data quality and/or ensure it via data cleaning.
5. Supervise the data acquisition process when more data is needed.
6. Define data augmentation pipelines.
7. Work with the technology team to plan, execute, and deliver product-based Data Engineering projects, and contribute to best practices for handling these projects.
8. Provide technical design, implementation, and support services around the creation of APIs, Models and Integration Pipelines.
9. Apply knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools.
10. Understand the auxiliary practical concerns of production environments and systems.
11. Demonstrate strong communication, presentation, and consulting skills, including technical writing and the ability to listen to and understand client issues.
12. Must be fully hands-on.
Interested candidates may share their updated resumes.
                Calsoft
Kolkata, Indore, Pune | 15.0 - 25.0 Lacs P.A.