
7 Deltalake Jobs

JobPe aggregates results for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

20 - 35 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

Datawarehouse Database Architect - Immediate hiring. We are currently looking for a Datawarehouse Database Architect for our client, a Fintech solutions company. Please let us know your interest and availability.

Experience: 10+ years
Location: Hybrid; any Accion office in India preferred (Bangalore/Pune/Mumbai)
Notice Period: Immediate; 0-15 days joiners preferred

Required skills (tools and technologies):
Cloud Platform: Azure (Databricks, DevOps, Data Factory, Azure Synapse Analytics, Azure SQL, Blob Storage, Databricks Delta Lake)
Languages: Python, PL/SQL, SQL, C, C++, Java
Databases: Snowflake, MS SQL Server, Oracle
Design Tools: Erwin, MS Visio
Data warehouse tools: SSIS, SSRS, SSAS, Power BI, DBT, Talend Stitch, PowerApps, Informatica 9, Cognos 8, OBIEE
Any cloud experience is good to have.

Let's connect for more details. Please write to me at mary.priscilina@accionlabs.com with your CV and your best contact details for a quick discussion.

Regards,
Mary Priscilina

Posted 4 days ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

Thoucentric, the consulting arm of Xoriant, a leading digital engineering services company with 5,000 employees, is seeking a highly experienced Infrastructure Architect with deep DevOps expertise to lead cloud infrastructure planning and deployment for a global o9 supply chain platform rollout.

As the Infrastructure Architect, you will design and implement cloud-native architecture for the o9 platform rollout, ensuring robust, scalable, and future-proof infrastructure across Azure and/or AWS environments. Your responsibilities include collaborating closely with o9's DevOps team to deploy and manage Kubernetes clusters, overseeing Git-based CI/CD workflows, implementing monitoring and alerting frameworks, and acting as a strategic liaison between the client's IT organization and o9's DevOps team to align platform requirements and deployment timelines. You will also ensure high availability, low latency, and high throughput to support supply chain operations, while anticipating future growth and scalability requirements.

To be successful in this role, you should have at least 10 years of experience in infrastructure architecture/DevOps, ideally within CPG or large enterprise SaaS/supply chain platforms. You must have proven expertise in deploying and scaling platforms on Azure and/or AWS, along with hands-on experience with Kubernetes, Karpenter, Git-based CI/CD, Data Lake/Delta Lake architectures, enterprise security, identity and access management, and networking in cloud environments. Experience with infrastructure-as-code tools such as Terraform or Helm is also required, along with excellent stakeholder management and collaboration skills.

Joining Thoucentric's consulting team offers you the opportunity to define your career path independently, work with Fortune 500 companies and startups, and be part of a dynamic yet supportive working environment that encourages personal development. You will also have the chance to bond with colleagues beyond work through sports, get-togethers, and other common interests, in an enriching environment with an open culture, a flat organization, and an excellent peer group.

If you are passionate about infrastructure architecture, DevOps, and cloud technologies, and are looking for a challenging leadership role in a global consulting environment, we encourage you to apply for this position based in Bangalore, India. Don't miss the opportunity to be part of Thoucentric's exciting growth story!

Posted 4 days ago

Apply

10.0 - 20.0 years

50 - 75 Lacs

Bengaluru

Work from Office

A leading player in cloud-based enterprise solutions is expanding its analytics leadership team in Bangalore. This pivotal role calls for a seasoned professional to drive the evolution of data products and analytics capabilities across international markets. The ideal candidate will possess the strategic vision, technical expertise, and stakeholder savvy to lead in a fast-paced, innovation-driven environment.

Key Responsibilities:
- Lead and mentor a dynamic team of product managers to scale enterprise-grade data lake and analytics platforms
- Drive program execution and delivery with a focus on performance, prioritization, and business alignment
- Define and execute the roadmap for an analytical data platform, ensuring alignment with strategic and user-centric goals
- Collaborate cross-functionally with engineering, design, and commercial teams to launch impactful BI solutions
- Translate complex business needs into scalable data models and actionable product requirement documents for multi-tenant SaaS products
- Champion AI-enabled analytics experiences to deliver smart, context-aware data workflows
- Maintain high standards in performance, usability, trust, and documentation of data products
- Ensure seamless execution of global data strategies through on-the-ground leadership in India
- Promote agile methodologies, metadata governance, and product-led thinking across teams

Ideal Candidate Profile:
- 10+ years in product leadership roles focused on data products, BI, or analytics in SaaS environments
- Deep understanding of modern data architectures, including dimensional modeling and cloud-native analytics tools
- Proven expertise in building multi-tenant data platforms serving external customer use cases
- Skilled in simplifying complex inputs into clear, scalable requirements and deliverables
- Familiarity with platforms like Delta Lake, dbt, ThoughtSpot, and similar tools
- Strong communicator with demonstrated stakeholder management and team leadership capabilities
- Experience launching customer-facing analytics products is a definite plus
- A passion for intuitive, scalable, and intelligent user experiences powered by data

Posted 1 month ago

Apply

8.0 - 12.0 years

25 - 40 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Design and develop migration strategies and processes
- Collaborate with stakeholders to understand business requirements and technical challenges
- Analyze current data and scope for optimization during the migration process
- Define the architecture and roadmap for cloud-based data and analytics transformation on Databricks
- Design, implement, and optimize scalable, high-performance data architectures using Databricks
- Build and manage data pipelines and workflows within Databricks
- Ensure that best practices for security, scalability, and performance are followed
- Implement Databricks solutions that enable machine learning, business intelligence, and data science workloads
- Oversee the technical aspects of the migration process, from planning through to execution
- Work closely with engineering and data teams to ensure proper migration of ETL processes, data models, and analytics workloads
- Troubleshoot and resolve issues related to migration, data quality, and performance
- Create documentation of the architecture, migration processes, and solutions
- Provide training and support to teams post-migration to ensure they can leverage Databricks

Experience:
- 7+ years of experience in data engineering, cloud architecture, or related fields
- 3+ years of hands-on experience with Databricks, including implementing data engineering solutions, migration projects, and workload optimization
- Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their integration with Databricks
- Experience in end-to-end data migration projects involving large-scale data infrastructure
- Familiarity with ETL tools, data lakes, and data warehousing solutions

Skills:
- Expertise in Databricks architecture and best practices for data processing
- Strong knowledge of Spark, Delta Lake, DLT, Lakehouse architecture, and other recent Databricks components
- Proficiency in Databricks Asset Bundles
- Expertise in the design and development of migration frameworks using Databricks
- Proficiency in Python, Scala, SQL, or similar languages for data engineering tasks
- Familiarity with data governance, security, and compliance in cloud environments
- Solid understanding of cloud-native data solutions and services
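The Delta Lake and migration skills this posting asks for centre on incremental upserts. As a minimal sketch of the idea (plain Python standing in for a Databricks `MERGE INTO`; the `id`/`value` columns are hypothetical, and a real job would run `spark.sql("MERGE INTO ...")` on a Delta table):

```python
# Illustrative sketch of Delta Lake MERGE (upsert) semantics in plain Python.
# Column names are hypothetical; on Databricks this would be a MERGE INTO
# statement against a Delta table, not in-memory dicts.

def merge_upsert(target, updates, key="id"):
    """Upsert rows from `updates` into `target`, matching on `key`."""
    merged = {row[key]: row for row in target}   # index existing rows by key
    for row in updates:
        merged[row[key]] = row                   # matched -> update, unmatched -> insert
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
updates = [{"id": 2, "value": "B"}, {"id": 3, "value": "c"}]
print(merge_upsert(target, updates))
# [{'id': 1, 'value': 'a'}, {'id': 2, 'value': 'B'}, {'id': 3, 'value': 'c'}]
```

The same matched/not-matched split is what `WHEN MATCHED THEN UPDATE` / `WHEN NOT MATCHED THEN INSERT` expresses in Delta Lake SQL.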

Posted 1 month ago

Apply

7.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Work from Office

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain complex Cognos TM1 solutions using tools such as MDX, SQL, Excel, Visual Basic, DevOps, Ansible, Jenkins, Git, and configuration management
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time
- Troubleshoot issues related to TM1 cubes, dimensions, business rules, and security settings using the TM1 development environment
- Ensure seamless integration of TM1 applications with other systems through REST APIs

Desired Candidate Profile:
- 7-10 years of experience in Cognos TM1 development, with expertise in TM1 cubes, dimensions, and business rules
- Strong SQL skills for querying databases such as PostgreSQL, Oracle, SQL Server, or DB2
- Experience with scripting and automation tools such as Python, Shell scripting, Puppet, or Chef

Posted 1 month ago

Apply

10 - 18 years

35 - 55 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8-18 years
Location: Pan India

Job Description:
- Experience in Synapse with PySpark
- Knowledge of Big Data pipelines / data engineering
- Working knowledge of the MSBI stack on Azure
- Working knowledge of Azure Data Factory, Azure Data Lake, and Azure Data Lake Storage
- Hands-on with visualization tools like Power BI
- Implement end-to-end data pipelines using Cosmos / Azure Data Factory
- Good analytical thinking and problem-solving
- Good communication and coordination skills; able to work as an individual contributor
- Requirement analysis; create, maintain, and enhance Big Data pipelines
- Daily status reporting and interaction with leads
- Version control (ADO/Git) and CI/CD
- Marketing campaign experience; data platform product telemetry
- Data validation and data quality checks of new streams
- Monitoring of data pipelines created in Azure Data Factory
- Updating the tech spec and wiki page for each pipeline implementation; updating ADO on a daily basis

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
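The "data quality check of the new streams" duty listed in this posting can be sketched as a simple rule-driven validator. This is plain Python with hypothetical field names and rules; in a real pipeline such checks would run inside an Azure Data Factory or Databricks activity before data lands in the lake:

```python
# Minimal rule-driven data-quality check, illustrating the kind of validation
# a new stream would pass before ingestion. Record shape and rules are
# hypothetical examples, not a specific client's schema.

def quality_check(records, rules):
    """Return (passed, failures); failures is a list of (row_index, field, reason)."""
    failures = []
    for i, rec in enumerate(records):
        for field, rule in rules.items():
            if field not in rec or rec[field] is None:
                failures.append((i, field, "missing"))
            elif not rule(rec[field]):
                failures.append((i, field, "invalid"))
    return (not failures, failures)

rules = {"order_id": lambda v: isinstance(v, int) and v > 0,
         "amount": lambda v: isinstance(v, (int, float)) and v >= 0}
rows = [{"order_id": 1, "amount": 9.5},
        {"order_id": -3, "amount": None}]
ok, errs = quality_check(rows, rules)
print(ok, errs)
# False [(1, 'order_id', 'invalid'), (1, 'amount', 'missing')]
```

Failing rows would typically be quarantined and surfaced through the pipeline's monitoring and alerting, rather than silently dropped.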

Posted 2 months ago

Apply

10 - 20 years

35 - 55 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8-18 years
Location: Pan India

Job Description:
Mandatory skill: Azure Databricks (ADB) with Azure Data Lake. Lead the architecture design and implementation of advanced analytics solutions using Azure Databricks / Fabric. The ideal candidate will have a deep understanding of big data technologies, data engineering, and cloud computing, with a strong focus on Azure Databricks, along with strong SQL.

Responsibilities:
- Work closely with business stakeholders and other IT teams to understand requirements and deliver effective solutions
- Oversee the end-to-end implementation of data solutions, ensuring alignment with business requirements and best practices
- Lead the development of data pipelines and ETL processes using Azure Databricks, PySpark, and other relevant tools
- Integrate Azure Databricks with other Azure services (e.g., Azure Data Lake, Azure Synapse, Azure Data Factory) and on-premise systems
- Provide technical leadership and mentorship to the data engineering team, fostering a culture of continuous learning and improvement
- Ensure proper documentation of architecture, processes, and data flows, while ensuring compliance with security and governance standards
- Ensure best practices are followed in terms of code quality, data security, and scalability
- Stay updated with the latest developments in Databricks and associated technologies to drive innovation

Essential Skills:
- Strong experience with Azure Databricks, including cluster management, notebook development, and Delta Lake
- Proficiency in big data technologies (e.g., Hadoop, Spark) and data processing frameworks (e.g., PySpark)
- Deep understanding of Azure services like Azure Data Lake, Azure Synapse, and Azure Data Factory
- Experience with ETL/ELT processes, data warehousing, and building data lakes
- Strong SQL skills and familiarity with NoSQL databases
- Experience with CI/CD pipelines and version control systems like Git
- Knowledge of cloud security best practices

Soft Skills:
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
- Strong problem-solving skills and a proactive approach to identifying and resolving issues
- Leadership skills, with the ability to manage and mentor a team of data engineers

Experience:
- Demonstrated expertise (8+ years) in developing data ingestion and transformation pipelines using Databricks/Synapse notebooks and Azure Data Factory
- Solid understanding of, and hands-on experience with, Delta tables, Delta Lake, and Azure Data Lake Storage Gen2
- Experience in efficiently using Auto Loader and Delta Live Tables for seamless data ingestion and transformation
- Proficiency in building and optimizing query layers using Databricks SQL
- Demonstrated experience integrating Databricks with Azure Synapse, ADLS Gen2, and Power BI for end-to-end analytics solutions
- Prior experience in developing, optimizing, and deploying Power BI reports
- Familiarity with modern CI/CD practices, especially in the context of Databricks and cloud-native solutions

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
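The Auto Loader experience this posting asks for boils down to checkpointed, incremental file ingestion: only files not yet recorded in a checkpoint are processed, so reruns are idempotent. A plain-Python sketch of that pattern (file names and the in-memory checkpoint are illustrative; on Databricks this is `spark.readStream.format("cloudFiles")` with a checkpoint location):

```python
# Sketch of the incremental-ingestion pattern behind Databricks Auto Loader:
# track which files have been seen, process only the new ones, then commit
# them to the checkpoint so a rerun does no duplicate work.

def incremental_ingest(available_files, checkpoint):
    """Process files absent from `checkpoint`; return them and update the checkpoint."""
    new_files = sorted(f for f in available_files if f not in checkpoint)
    # ... process each file here (parse, validate, write to the Delta table) ...
    checkpoint.update(new_files)     # commit only after successful processing
    return new_files

checkpoint = set()
print(incremental_ingest({"2024-01-01.json", "2024-01-02.json"}, checkpoint))
# ['2024-01-01.json', '2024-01-02.json']  <- first run picks up everything
print(incremental_ingest({"2024-01-01.json", "2024-01-02.json", "2024-01-03.json"}, checkpoint))
# ['2024-01-03.json']  <- second run picks up only the new arrival
```

Auto Loader (and Delta Live Tables on top of it) manages this bookkeeping durably and at scale; the sketch only shows why checkpointing makes the ingestion exactly-once per file.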

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies