Posted: 1 day ago
Work Mode: Hybrid
Employment Type: Full Time
Role Description:
1. Act as a solution architect/consultant for end-to-end Data Engineering and Analytics solutions, including data, technology infrastructure, and models.
2. Design and develop solutions aligned with the business objectives.
3. Explore and visualize data to understand it, and identify differences in data distribution that could affect performance when the solutions are deployed in the real world.
4. Verify data quality and ensure it through data cleaning.
5. Supervise the data acquisition process when more data is needed.
6. Define data augmentation pipelines.
7. Work with the technology team to plan, execute, and deliver Data Engineering product-based projects, and contribute to best practices for handling such projects.
8. Provide technical design, implementation, and support services around the creation of APIs, models, and integration pipelines.
9. Apply knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools.
10. Understand the auxiliary practical concerns of production environments.
11. Demonstrate strong communication, presentation, and consulting skills, including technical writing and the ability to listen to and understand client issues.
12. Be fully hands-on.
Skills Required:
1. Experience across most of: Hadoop, Hive, Cassandra, MDM, data lakes, streaming data management and analytics, Spark, Kafka, Redis, Databricks, etc.
2. Proficiency in programming languages such as Python and PySpark, along with their associated libraries and frameworks. Hands-on proficiency with databases (on-prem and cloud), messaging applications, API development, and related libraries.
3. Familiarity with cloud-based database services (e.g., AWS RDS, Azure SQL Database, Google Cloud SQL).
4. Familiarity with cloud-specific data services (e.g., S3, ADLS, GCS; Redshift, Synapse, BigQuery; EMR, Databricks, Dataproc).
5. Real-time analytics: Kafka, Azure Stream Analytics, AWS Kinesis. Knowledge of ETL/ELT tools (e.g., Apache NiFi, AWS Glue, Azure Data Factory, Google Cloud Dataflow).
6. Very good knowledge of cloud-based architectures, APIs, and frameworks for Data Engineering and Analytics. Good to have: knowledge of Data Governance and Data Quality Management, metadata and lineage management, and exposure to enterprise-level data migration, data warehousing, and database modeling. Good understanding of microservices and test-driven development.
7. Familiarity with data visualization tools (e.g., Matplotlib, Seaborn, Power BI, Tableau) and Python libraries for API development (e.g., FastAPI, Flask, Django).
Quadrangle