
4 Unity Catalogue Jobs

JobPe aggregates these listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Databricks and Python Engineer at Oracle FSGIU - Finergy, you will design, develop, test, and deploy high-performance, scalable data solutions using Python, PySpark, and SQL. You will collaborate with cross-functional teams to understand business requirements and translate them into technical specifications, and you are expected to write efficient, maintainable code that follows best practices and coding standards.

Key responsibilities:
- Work with the Databricks platform for big data processing and analytics.
- Develop and maintain ETL processes using Databricks notebooks.
- Implement and optimize data pipelines for data transformation and integration.
- Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
- Share knowledge with the team and contribute to a culture of continuous improvement.
- Use SQL expertise to design, optimize, and maintain relational databases.
- Write complex SQL queries for data retrieval, manipulation, and analysis.

Qualifications required for this role:
- 4 to 8 years of experience with Databricks and big data frameworks.
- Advanced proficiency in AWS, including EC2, S3, and container orchestration (Docker, Kubernetes).
- Proficiency in AWS services and data migration.
- Experience with Unity Catalogue.
- Familiarity with batch and real-time processing.
- Strong data engineering skills in Python, PySpark, and SQL.

In summary, as a Databricks and Python Engineer at Oracle FSGIU - Finergy, you will develop high-performance data solutions, collaborate with cross-functional teams, and keep current with industry trends to contribute to the success of the Finergy division within Oracle FSGIU.
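The responsibilities above revolve around PySpark ETL in Databricks notebooks over tables governed by Unity Catalogue. As a rough illustration of that kind of task (not part of the posting), here is a minimal sketch assuming a Databricks runtime where a SparkSession is available; the three-level table names and columns are hypothetical placeholders.

```python
# Minimal illustrative sketch only; table names and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Read a raw table registered in Unity Catalog (catalog.schema.table).
raw = spark.read.table("main.finergy.raw_transactions")

# Basic cleansing and a daily aggregate, the kind of transformation an ETL notebook performs.
daily_summary = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("txn_timestamp"))
       .groupBy("txn_date", "account_id")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("txn_count"),
       )
)

# Write the result back as a governed table for downstream consumers.
(daily_summary.write
    .mode("overwrite")
    .saveAsTable("main.finergy.daily_transaction_summary"))
```

On Databricks, saveAsTable produces a Delta table by default, so the output is immediately queryable from SQL by the reporting and analysis tasks the listing mentions.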

Posted 12 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Databricks and Python Engineer

About Oracle FSGIU - Finergy: The Finergy division within Oracle FSGIU focuses exclusively on the Banking, Financial Services, and Insurance (BFSI) sector, offering deep domain knowledge to address complex financial needs.
- Accelerated Implementation: proven methodologies that fast-track the deployment of multi-channel delivery platforms, minimizing IT intervention and reducing time to market. Personalization tools that tailor customer experiences have kept clients loyal for over a decade.
- End-to-End Banking Solutions: a single platform for a wide range of banking services (trade, treasury, cash management), enhancing operational efficiency with integrated dashboards and analytics.
- Expert Consulting Services: comprehensive consulting support, from strategy development to solution implementation, ensuring the alignment of technology with business goals.

Job Responsibilities
1. Software Development:
- Design, develop, test, and deploy high-performance and scalable data solutions using Python, PySpark, and SQL.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Implement efficient and maintainable code using best practices and coding standards.
2. Databricks Platform:
- Work with the Databricks platform for big data processing and analytics.
- Develop and maintain ETL processes using Databricks notebooks.
- Implement and optimize data pipelines for data transformation and integration.
3. Continuous Learning:
- Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
- Share knowledge with the team and contribute to a culture of continuous improvement.
4. SQL Database Management:
- Utilize expertise in SQL to design, optimize, and maintain relational databases.
- Write complex SQL queries for data retrieval, manipulation, and analysis.

Mandatory Skills:
- 4 to 8 years of experience with Databricks and big data frameworks
- Advanced proficiency in AWS, including EC2, S3, and container orchestration (Docker, Kubernetes)
- Proficiency in AWS services and data migration
- Experience with Unity Catalogue
- Familiarity with batch and real-time processing
- Data engineering with strong skills in Python, PySpark, and SQL

Career Level - IC2
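Since this posting also asks for familiarity with both batch and real-time processing, here is a hedged sketch of the streaming half using Spark Structured Streaming; the source table, sink table, and checkpoint path are placeholders invented for illustration, not anything specified by the employer.

```python
# Illustrative Structured Streaming sketch; tables and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Incrementally read rows appended to a Delta table (the real-time path).
events = spark.readStream.table("main.finergy.payment_events")

# Windowed aggregation with a watermark to bound streaming state.
per_minute = (
    events.withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "1 minute"), F.col("channel"))
          .agg(F.count("*").alias("events"), F.sum("amount").alias("amount"))
)

# Write the rolling aggregates to another governed table. Running the same
# transformation with spark.read / write instead of readStream / writeStream
# covers the batch path with identical logic.
query = (per_minute.writeStream
    .outputMode("complete")
    .option("checkpointLocation", "/tmp/checkpoints/payment_events_per_minute")
    .toTable("main.finergy.payment_events_per_minute"))
```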

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Data Modeler at our organization, you will be responsible for designing, developing, and maintaining conceptual, logical, and physical data models that align with our business needs. Working closely with data engineers, architects, business analysts, and stakeholders, you will play a critical role in ensuring data consistency, integrity, and performance across multiple systems.

Key responsibilities:
- Design and develop conceptual, logical, and physical data models based on business requirements.
- Collaborate with business analysts, data engineers, and architects to ensure that data models support our business goals.
- Optimize database design to improve performance, scalability, and maintainability.
- Define and enforce data governance standards, such as naming conventions, metadata management, and data lineage.
- Work with ETL and BI teams to facilitate seamless data integration and reporting capabilities.
- Analyze and document data relationships, dependencies, and transformations across various platforms.
- Maintain data dictionaries and ensure compliance with industry best practices.

To excel in this role, you should be proficient in the Azure data engineering stack and have hands-on experience with ADF, Azure Databricks, SCD, Unity Catalogue, PySpark, Power Designer, and Biz Designer. If you have a minimum of 6 years of experience in data modeling and a strong background in data engineering, we encourage you to apply. Immediate joiners are preferred for this position.

Location: Bengaluru
Mode: Hybrid

Join us and be part of our dynamic team, where you can contribute your expertise to drive our data modeling initiatives forward.
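Because the role lists SCD alongside Azure Databricks and Unity Catalogue, a Type 2 slowly changing dimension is a plausible part of the physical model. The sketch below shows one way the expire-changed-rows step of an SCD2 load could look on Databricks; the catalog, schema, tables, and columns are hypothetical, not taken from the posting.

```python
# Sketch of a physical SCD Type 2 dimension on Databricks; all names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Physical model: a customer dimension carrying SCD Type 2 validity columns.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.analytics.dim_customer (
        customer_sk   BIGINT GENERATED ALWAYS AS IDENTITY,
        customer_id   STRING NOT NULL,
        customer_name STRING,
        segment       STRING,
        valid_from    TIMESTAMP,
        valid_to      TIMESTAMP,
        is_current    BOOLEAN
    )
""")

# Step 1 of an SCD2 load: close out current rows whose attributes changed in the staging feed.
spark.sql("""
    MERGE INTO main.analytics.dim_customer AS d
    USING main.analytics.stg_customer AS s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.customer_name <> s.customer_name OR d.segment <> s.segment) THEN
      UPDATE SET valid_to = current_timestamp(), is_current = FALSE
""")

# Step 2 (not shown): insert the new current versions of changed and new customers
# with valid_from = current_timestamp() and is_current = TRUE.
```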

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Greetings from Teknikoz! You should have at least 5 years of experience in the following areas:
- Databricks skill set with PySpark and SQL
- Strong proficiency in PySpark and SQL
- Understanding of data warehousing concepts
- ETL processes and data pipeline building with ADB/ADF
- Experience with the Azure cloud platform and knowledge of data manipulation techniques

In this role, you will be responsible for:
- Working with business teams to convert requirements into technical stories for migration
- Leading technical discussions and implementing solutions

Experience with multi-tenant architecture, successful project delivery on the Databricks + Azure combination, and familiarity with Unity Catalogue are considered beneficial.

If you have the required experience and skills, we would like to hear from you!
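For the ADB/ADF pipeline-building requirement above, here is an indicative sketch of a single Azure Databricks step that an ADF pipeline could trigger; the storage account, container, file layout, and target table are invented placeholders, and cluster authentication to ADLS is assumed to be configured already.

```python
# Sketch of one Azure Databricks (ADB) pipeline step orchestrated by ADF.
# Storage account, container, path, and table names are made-up placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Land raw CSV files from ADLS Gen2 (authentication assumed to be set up on the
# cluster, e.g. via a service principal or credential passthrough).
source_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/2024/"
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv(source_path))

# Typical pipeline transformation: typing, deduplication, and a load timestamp.
cleaned = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .dropDuplicates(["order_id"])
       .withColumn("_loaded_at", F.current_timestamp())
)

# Persist into a governed table so downstream jobs and BI tools can consume it.
(cleaned.write
    .mode("append")
    .saveAsTable("main.sales.orders_bronze"))
```

In an ADF-driven setup, a pipeline activity would typically pass the source path and target table as job parameters rather than hard-coding them as above.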

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
