Home
Jobs

4 Data Framework Jobs

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6 - 8 years

15 - 25 Lacs

Hyderabad

Remote

Naukri logo

Job Title: Data Engineer II
Experience: 6+ Years
Location: Remote (India)
Job Type: Full-time

Job Description:
We are looking for a highly skilled Data Engineer II with 6+ years of experience, including at least 4 years in data engineering or software development. The ideal candidate will be well versed in building scalable data solutions using modern data ecosystems and cloud platforms.

Key Responsibilities:
- Design, build, and optimize scalable ETL pipelines.
- Work extensively with Big Data technologies such as Snowflake and Databricks.
- Write and optimize complex SQL queries for large datasets.
- Define and manage SLAs, performance benchmarks, and monitoring systems.
- Develop data solutions using the AWS data ecosystem, including S3, Lambda, and more.
- Handle both relational (e.g., PostgreSQL) and NoSQL databases.
- Work with programming languages such as Python, Java, and/or Scala.
- Use Linux command-line tools for system and data operations.
- Implement best practices in data lineage, data quality, data observability, and data discoverability.

Preferred (Nice-to-Have):
- Experience with data mesh architecture or building distributed data products.
- Prior exposure to data governance frameworks.
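The core of the role above is building ETL pipelines with data-quality checks. A minimal sketch of a validate-and-transform step in plain Python (the record fields, currency list, and function names are illustrative assumptions, not from the posting):

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    id: str
    amount: float
    currency: str

def validate(record: dict) -> bool:
    """Basic data-quality gate: required fields present, amount positive, known currency."""
    return (
        isinstance(record.get("id"), str)
        and isinstance(record.get("amount"), (int, float))
        and record["amount"] > 0
        and record.get("currency") in {"INR", "USD", "EUR"}
    )

def transform(records: list[dict]) -> list[Transaction]:
    """Keep only rows that pass the quality gate and normalize them into typed records."""
    return [
        Transaction(r["id"], float(r["amount"]), r["currency"])
        for r in records
        if validate(r)
    ]

rows = [
    {"id": "t1", "amount": 120.5, "currency": "INR"},
    {"id": "t2", "amount": -3.0, "currency": "USD"},  # fails quality check
    {"id": "t3", "amount": 40.0, "currency": "GBP"},  # unsupported currency
]
clean = transform(rows)
print(len(clean))  # 1 valid record survives
```

In a production pipeline the same gate would typically run inside a Spark or Databricks job, with rejected rows routed to a quarantine table for observability.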

Posted 2 weeks ago

Apply

6 - 8 years

15 - 25 Lacs

Hyderabad

Remote

Naukri logo

Job Title: Data Engineer II
Experience: 6+ Years
Location: Remote (India)
Job Type: Full-time

Job Description:
We are looking for a highly skilled Data Engineer II with 6+ years of experience, including at least 4 years in data engineering or software development. The ideal candidate will be well versed in building scalable data solutions using modern data ecosystems and cloud platforms.

Key Responsibilities:
- Design, build, and optimize scalable ETL pipelines.
- Work extensively with Big Data technologies such as Snowflake and Databricks.
- Write and optimize complex SQL queries for large datasets.
- Define and manage SLAs, performance benchmarks, and monitoring systems.
- Develop data solutions using the AWS data ecosystem, including S3, Lambda, and more.
- Handle both relational (e.g., PostgreSQL) and NoSQL databases.
- Work with programming languages such as Python, Java, and/or Scala.
- Use Linux command-line tools for system and data operations.
- Implement best practices in data lineage, data quality, data observability, and data discoverability.

Preferred (Nice-to-Have):
- Experience with data mesh architecture or building distributed data products.
- Prior exposure to data governance frameworks.

Posted 3 weeks ago

Apply

6 - 8 years

12 - 16 Lacs

Hyderabad

Remote

Naukri logo

Job Title: Data Engineer

Job Summary:
Are you passionate about building scalable data pipelines, optimizing ETL processes, and designing efficient data models? We are looking for a Databricks Data Engineer to join our team and play a key role in managing and transforming data in Azure cloud environments. In this role, you will work with Azure Data Factory (ADF), Databricks, Python, and SQL to develop robust data ingestion and transformation workflows. You'll also be responsible for data integration, optimizing performance, and ensuring data quality & governance. If you have strong experience in big data processing, distributed computing (Spark), and data modeling, we'd love to hear from you!

Key Responsibilities:
1. Develop & Optimize ETL Pipelines: Build robust and scalable data pipelines using ADF, Databricks, and Python for data ingestion, transformation, and loading.
2. Data Modeling & Systematic Layer Modeling: Design logical, physical, and systematic data models for structured and unstructured data.
3. Database Management: Develop and optimize SQL queries, stored procedures, and indexing strategies to enhance performance.
4. Big Data Processing: Work with Azure Databricks for distributed computing, Spark for large-scale processing, and Delta Lake for optimized storage.
5. Data Quality & Governance: Implement data validation, lineage tracking, and security measures for high-quality, compliant data.
6. Collaboration: Work closely with business analysts, data scientists, and DevOps teams to ensure data availability and usability.
7. Testing and Debugging: Write unit tests and perform debugging to ensure the implementation is robust and error-free. Conduct performance optimization and security audits.

Required Skills and Qualifications:
- Azure Cloud Expertise: Strong experience in Azure Data Factory (ADF), Databricks, and Azure Synapse.
- Programming: Proficiency in Python for data processing, automation, and scripting.
- SQL & Database Skills: Advanced knowledge of SQL, T-SQL, or PL/SQL for data manipulation.
- Data Modeling: Hands-on experience in dimensional modeling, systematic layer modeling, and entity-relationship modeling.
- Big Data Frameworks: Strong understanding of Apache Spark, Delta Lake, and distributed computing.
- Performance Optimization: Expertise in query optimization, indexing, and performance tuning.
- Data Governance & Security: Knowledge of RBAC, encryption, and data privacy standards.

Preferred Qualifications:
- Experience with CI/CD for data pipelines using Azure DevOps.
- Knowledge of Kafka/Event Hub for real-time data processing.
- Experience with Power BI/Tableau for data visualization (not mandatory, but a plus).
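The Delta Lake workflow this posting describes typically revolves around MERGE (upsert) semantics: update rows whose key already exists in the target, insert the rest. A minimal plain-Python sketch of those semantics, without Spark (the key name and record fields are illustrative assumptions):

```python
def merge_upsert(target: dict[str, dict], updates: list[dict], key: str = "id") -> dict[str, dict]:
    """Mimic a Delta-style MERGE: matched keys are updated, unmatched keys are inserted."""
    merged = dict(target)  # leave the original target untouched
    for row in updates:
        # Overlay incoming fields on any existing record with the same key.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return merged

current = {"a1": {"id": "a1", "status": "open"}}
incoming = [
    {"id": "a1", "status": "closed"},  # matched  -> update
    {"id": "b2", "status": "open"},    # unmatched -> insert
]
result = merge_upsert(current, incoming)
print(sorted(result))  # ['a1', 'b2']
```

In Databricks itself the equivalent is a `MERGE INTO` statement or `DeltaTable.merge`, which performs the same matched/unmatched branching atomically at scale.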

Posted 1 month ago

Apply

8 - 13 years

25 - 30 Lacs

Pune

Work from Office

Naukri logo

Transaction Surveillance Strategic Architecture and Control Design Data Controls, AVP

Role Description:
As part of Corporate Bank, Transaction Surveillance sits within Cash Management. Its mandate is to protect the franchise through comprehensive transaction surveillance solutions, ensuring regulatory compliance and enabling business opportunities in line with a controlled risk appetite. The Product Manager for Transaction Surveillance Strategic Architecture and Control Design Data Controls supports the conceptual design of control solutions and drives, from an overall risk and control framework perspective, their end-to-end implementation and further enhancement. A key imminent task is to support the proper end-to-end delivery and control of required Payment and non-Payment Corporate Bank Product & Services data for the roll-out of the strategic AFC Transaction Monitoring system in all DB locations globally.

Your key responsibilities:
- Support the Corporate Bank Strategic Data Framework Program, led by the Head of Transaction Surveillance, to ensure proper end-to-end delivery and control of required Payment and non-Payment Corporate Bank Product & Services data for the roll-out of the strategic AFC Transaction Monitoring system in all DB locations globally, to leverage this implementation for other post-execution control consumers, and to deploy the respective concepts for Transaction Surveillance real-time controls.
- Closely engage with key stakeholders (Product, AFC, TMDC, TDI, and Operations) to understand data requirements and architecture concepts; drive the analysis and documentation of data requirements and flows on a country-by-country basis; support the roll-out of data completeness and data content integrity controls; identify data quality issues and guide the respective remediation efforts.
- Identify opportunities to leverage the implementation for other post-execution control consumers (e.g. Account Activity Report) and support the deployment of respective concepts for Transaction Surveillance real-time controls (in particular Sanctions & Embargoes Transaction Filtering).

Your skills and experience:
- Excellent strategic and analytical thinking skills
- Excellent interpersonal skills and a demonstrated ability to collaborate well with a range of people across divisions, including senior stakeholders
- Sound project management and organizational skills, and the ability to understand complex topics and convey them comprehensibly to others
- Experience in Cash Management / Product & Services Cash Management / Payments Processing, or in transaction monitoring and screening, is an advantage
- Fluent English skills
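A data completeness control of the kind this role oversees is, at its simplest, a reconciliation of record identifiers between the source system and the monitoring target. A minimal sketch in Python (the function name and record IDs are illustrative assumptions, not the bank's actual control):

```python
def completeness_check(source_ids: set[str], target_ids: set[str]) -> dict:
    """Reconcile source vs. target: flag records that never arrived or arrived unexpectedly."""
    missing = source_ids - target_ids      # sent by the source but never landed
    unexpected = target_ids - source_ids   # landed without a matching source record
    return {
        "complete": not missing and not unexpected,
        "missing": sorted(missing),
        "unexpected": sorted(unexpected),
    }

sent = {"p1", "p2", "p3"}      # payments emitted by the product system
received = {"p1", "p3"}        # payments visible to the monitoring system
report = completeness_check(sent, received)
print(report["complete"], report["missing"])  # False ['p2']
```

Real controls of this type usually reconcile counts and hashes per country and per feed on a schedule, feeding gaps into the remediation workflow the posting describes.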

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies