Home
Jobs

935 Databricks Jobs - Page 38

Filter Jobs
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5 - 10 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Azure Databricks (Python), Azure DevOps, Azure Data Factory

Posted 1 month ago

Apply

8 - 12 years

20 - 25 Lacs

Gandhinagar

Remote

Source: Naukri

Requirements:
- 8+ years of professional experience as a data engineer and 2+ years of professional experience as a senior data engineer
- Strong working experience in Python and its data analysis packages (Pandas, NumPy)
- Strong understanding of prevalent cloud ecosystems and experience in one of the cloud platforms: AWS, Azure, or GCP
- Strong working experience in one of the leading MPP databases: Snowflake, Amazon Redshift, Azure Synapse, or Google BigQuery
- Strong working experience in one of the leading cloud data orchestration tools: Azure Data Factory, AWS Glue, or Apache Airflow
- Experience working with Agile methodologies, Test Driven Development, and implementing CI/CD pipelines using one of the leading services: GitLab, Azure DevOps, Jenkins, AWS CodePipeline, or Google Cloud Build
- Data Governance / Data Management / Data Quality project implementation experience
- Experience in big data processing using Spark
- Strong experience with SQL databases (SQL Server, Oracle, Postgres, etc.)
- Stakeholder management experience and very good communication skills
- Working experience in end-to-end project delivery, including requirement gathering, design, development, testing, deployment, and warranty support
- Working experience with various testing levels, such as unit testing, integration testing, and system testing
- Working experience with large, heterogeneous datasets in building and optimizing data pipelines and pipeline architectures

Nice-to-have skills:
- Working experience with Databricks notebooks and managing Databricks clusters
- Experience with a data modelling tool such as Erwin or ER/Studio
- Experience with a data architecture such as Data Mesh or Data Fabric
- Has handled real-time or near real-time data
- Experience with a leading reporting and analysis tool such as Power BI, Qlik, Tableau, or Amazon QuickSight
- Working experience with API integration
- General insurance / banking / finance domain understanding
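For illustration only, the sketch below shows a minimal PySpark batch pipeline of the kind this role describes: ingest raw files, cleanse, and write partitioned output for loading into an MPP warehouse. The paths, dataset, and column names are hypothetical assumptions, not taken from the posting.

```python
# Minimal PySpark batch pipeline sketch (illustrative only).
# Paths, the "orders" dataset, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Ingest raw CSV files landed by an upstream orchestrator (e.g. ADF or Airflow).
orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/raw/orders/")
)

# Basic cleansing and typing before loading into the warehouse layer.
cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

# Persist as partitioned Parquet; a downstream COPY / external-table step would
# typically move this into Snowflake, Redshift, Synapse, or BigQuery.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/orders/")
```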

Posted 1 month ago

Apply

20 - 30 years

70 - 90 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Source: Naukri

Job Title: Global Head – Data Practice
Location: India
Reports to: CEO

About Our Client:
Our client is a trusted digital engineering and cloud transformation partner, enabling businesses across 40 countries to unlock value from technology at scale. With a legacy of delivering complex programs in the UK, US, Middle East, and APAC, they are committed to engineering excellence that drives agility, resilience, and customer-centric innovation. Their portfolio spans Cloud-Native & Application Development, Oracle Cloud Implementations, Digital Commerce, Data & AI, Application Modernization, Agile Consulting, and Digital Assurance. At the heart of their transformation capability is the ability to harness data for smarter, faster, and more informed decision-making.

Role Summary:
As the Global Head – Data Practice, you will be responsible for shaping and scaling the company's data practice into a globally recognized, high-impact capability. Based in India and working closely with global leadership, this is a strategic and visible role that combines thought leadership, delivery excellence, and business accountability. You will spearhead the end-to-end evolution of the data offerings, from data engineering and platform modernization to AI-driven insights and next-generation governance, partnering with enterprise clients on their most complex transformation journeys. This is not just a practice leadership role: it is a global change-agent mandate for someone who can influence at the C-suite, innovate with purpose, and scale with precision.

Key Responsibilities:

Strategic Leadership
- Own and evolve the global strategy for the Data Practice across key markets (UK, Europe, North America, Middle East, APAC), ensuring alignment with industry trends, client needs, and the company's growth priorities.
- Translate the practice vision into an actionable roadmap covering go-to-market, talent, partnerships, and capability building.

Practice Development
- Spearhead the growth and evolution of the company's data capabilities, with a strong emphasis on Snowflake, Databricks, and related modern data platforms.
- Drive the continuous innovation and improvement of service offerings in data management, analytics, and cloud-native solutions.

Client & Market Engagement
- Serve as a senior advisor to CXOs on data transformation initiatives, shaping outcomes that blend business insight with technical excellence.
- Develop compelling, differentiated propositions for data modernization, cloud data platforms, AI/ML, governance, and analytics-led innovation.
- Build deep client relationships with existing and new accounts, ensuring that the data practice is viewed as a value creator, not just a service line.

Practice & People Leadership
- Build and lead a high-calibre team of data engineers, architects, data scientists, and consultants across geographies.
- Invest in developing next-generation leadership within the practice, creating a bench of future-ready talent aligned to global demand.
- Create a strong culture of innovation, accountability, and continuous learning.

Operational & Financial Excellence
- Own the P&L of the global data practice, with responsibility for revenue growth, margin optimization, and delivery effectiveness.
- Work with sales, delivery, and marketing to create scalable solution accelerators, reusable assets, and market-ready offerings.

Innovation & Ecosystem
- Drive the adoption of emerging technologies with a strategic focus on Snowflake and Databricks as core platforms, while also exploring areas such as DataOps, GenAI, MLOps, metadata automation, and Data Mesh that complement and enhance the modern data stack.
- Build deep ecosystem partnerships with hyperscalers (AWS, Azure, GCP), niche ISVs, and platform providers, especially those aligned with the Snowflake and Databricks ecosystems.
- Represent the organization at global industry forums, partner events, and thought leadership platforms to showcase capabilities and reinforce the company's leadership in next-gen data transformation.

Ideal Candidate Profile
- 15–20+ years of experience in data consulting, digital transformation, or analytics leadership roles with global delivery exposure.
- Experience in scaling global data practices within IT services, consulting, or cloud-native companies.
- Proven expertise in cloud data platforms (Azure, AWS, GCP), modern data architectures, and advanced analytics (ML/AI).
- Executive presence and gravitas to engage and influence CXO-level stakeholders across industries.
- Track record of building data products, platforms, or frameworks that have driven measurable business impact.
- Familiarity with verticalized solutions (e.g., healthcare data, financial services analytics, supply chain intelligence) is a plus.
- Strong understanding of data compliance, privacy, and ethical AI considerations in global markets.
- Bachelor's in Engineering/Technology or a relevant field; Master's or MBA preferred.

What We Offer
- A strategic, board-visible role shaping the future of the company's global growth agenda.
- Competitive compensation, including performance incentives and Long-Term Incentive Plans (RSUs).
- A platform to influence enterprise-wide transformation and lead from the front on innovation.
- A collaborative, values-driven culture that respects autonomy and champions impact.
- The opportunity to represent the organization on international platforms and contribute to global thought leadership.

Posted 1 month ago

Apply

8 - 13 years

15 - 20 Lacs

Pune

Hybrid

Source: Naukri

Job Description:
- Strong experience in Python programming.
- Experience with Databricks.
- Experience with databases such as SQL; perform database performance tuning and optimization.

Databricks Platform:
- Work with the Databricks platform for big data processing and analytics.
- Develop and maintain ETL processes using Databricks notebooks.
- Implement and optimize data pipelines for data transformation and integration.
- Design, develop, test, and deploy high-performance and scalable data solutions using Python.
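To illustrate the kind of Databricks-notebook ETL step this description refers to, here is a hedged sketch that aggregates a raw table into a Delta table. The table and column names ("raw.sales", "analytics.daily_sales") are hypothetical, not from the posting.

```python
# Illustrative Databricks-notebook-style ETL step (assumptions, not the employer's code).
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` is already defined; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

raw = spark.table("raw.sales")  # hypothetical source table

# Aggregate raw sales into a daily, per-store summary.
daily = (
    raw
    .withColumn("sale_date", F.to_date("sale_ts"))
    .groupBy("sale_date", "store_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Write the aggregate as a managed Delta table for downstream reporting.
(daily.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.daily_sales"))
```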

Posted 1 month ago

Apply

2 - 6 years

12 - 16 Lacs

Pune

Work from Office

Source: Naukri

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on Azure.
- Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.

Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Certified Spark Developer.
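Since the posting asks for experience developing streaming pipelines with Spark and Kafka, below is a minimal Spark Structured Streaming sketch. The broker address, topic name, and storage paths are hypothetical, and the spark-sql-kafka connector package must be available on the cluster.

```python
# Minimal Spark Structured Streaming sketch reading from Kafka (illustrative only).
# Requires the spark-sql-kafka connector on the classpath; names/paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to string before further parsing.
parsed = events.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

# Continuously append parsed events to Parquet, with checkpointing for recovery.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/mnt/curated/events/")
    .option("checkpointLocation", "/mnt/checkpoints/events/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```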

Posted 1 month ago

Apply

2 - 4 years

12 - 17 Lacs

Chennai, Pune

Work from Office

Source: Naukri

Data Quality/Governance Analyst - Data & AI

Key Accountabilities:
- Investigate, troubleshoot, and resolve data-related production issues.
- Provide timely reporting on data quality metrics and trends.
- Document and maintain support procedures for data quality processes.
- Collaborate with IT and business teams to implement data quality improvements.
- Ensure data validation and reconciliation processes are followed.
- Engage with stakeholders to establish procedures for data validation and quality metrics.
- Track data issues using incident tickets and ensure timely resolution, escalating issues for immediate attention if not resolved.
- Maintain and update production support dashboards (Microsoft Power BI) to ensure accuracy and meet monitoring requirements.
- Develop data quality health reports for stakeholders to monitor data reliability across the platform.
- Create and maintain documentation of procedures and best practices for data governance and related processes.
- Provide training to users on tools to promote awareness and adherence.
- Collaborate with data owners and data stewards to ensure data governance is implemented and followed.
- Work with vendors on technical platform issues that require coordination and resolution.
- Deliver consistent, accurate, and high-quality work while communicating findings and insights clearly.

Experience / Qualifications:
- At least 4 years of hands-on experience with a data quality tool (Collibra preferred), Databricks, and Microsoft Power BI.
- Strong technical skills in data and database management, with proficiency in data wrangling, analytics, and transformation using Python and SQL.
- Asset management experience is beneficial for understanding and recommending the required data quality rules and remediation plans to stakeholders.

Other Attributes:
- Curious, analytical, and able to think critically to solve problems.
- Detail-oriented and comfortable dealing with complex structured and unstructured datasets.
- Customer-centric, striving to deliver value by effectively and proactively engaging stakeholders.
- Clear and effective communication skills, with an ability to communicate complex ideas and manage stakeholder expectations.
- Strong organisational and prioritisation skills; adaptable and able to work independently as required.
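As a rough illustration of the Python-based data wrangling and quality metrics this role mentions (independent of Collibra or any specific tool), here is a small pandas sketch; the sample "customers" data and columns are hypothetical.

```python
# Illustrative data quality metrics sketch (not tied to any governance tool).
import pandas as pd


def column_quality_metrics(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column completeness and distinct-value counts."""
    return pd.DataFrame({
        "column": df.columns,
        "completeness": [1 - df[c].isna().mean() for c in df.columns],
        "distinct_values": [df[c].nunique(dropna=True) for c in df.columns],
    })


if __name__ == "__main__":
    # Hypothetical sample data standing in for a governed dataset.
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@x.com", None, "c@x.com", "d@x.com"],
    })
    print(column_quality_metrics(customers))
    # A simple uniqueness check on the business key; a failure here would be
    # tracked via an incident ticket in the workflow described above.
    print("customer_id unique:", customers["customer_id"].is_unique)
```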

Posted 1 month ago

Apply

3 - 6 years

9 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

Location: Tower 02, Manyata Embassy Business Park, Racenahali & Nagawara Villages, Outer Ring Rd, Bangalore 540065
Time type: Full time
Posted on: Posted 5 Days Ago
Job requisition ID: R0000388711

About us:
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII:
At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Team Overview:
Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects, and product managers striving to make Target the most convenient, safe, and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation, and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience. To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing and are eager to contribute to data engineering.

Position Overview:
- Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multiple work streams.
- Analyze technical issues and questions, identifying data needs and delivery mechanisms.
- Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies.
- Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components.
- Provide technical guidance to, and heavily contribute to, a team of high-calibre Data Engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion.
- Manage and execute against agile plans and set deadlines based on client, business, and technical requirements.
- Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations.
- Ensure all code adheres to development and security standards.

About you:
- 4-year degree or equivalent experience.
- 5+ years of software development experience, preferably in data engineering / Hadoop development (Hive, Spark, etc.).
- Hands-on experience in object-oriented or functional programming such as Scala, Java, or Python.
- Knowledge of or experience with a variety of database technologies (Postgres, Cassandra, SQL Server).
- Knowledge of designing data integration using API and streaming technologies (Kafka) as well as ETL and other data integration patterns.
- Experience with cloud platforms like Google Cloud, AWS, or Azure; hands-on experience with BigQuery is an added advantage.
- Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR, or Databricks).
- Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) is a plus.
- Familiarity with data warehousing concepts and technologies; maintains technical knowledge within areas of expertise.
- Constant learner and team player who enjoys solving tech challenges with a global team.
- Hands-on experience in building complex data pipelines and flow optimizations.
- Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront.
- Experience with test-driven development and software test automation.
- Follows best coding practices and engineering guidelines as prescribed.
- Strong written and verbal communication skills, with the ability to present complex technical information clearly and concisely to a variety of audiences.
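Since the posting emphasizes test-driven development for data pipelines, the sketch below shows one way that might look with pytest and PySpark. The transformation, column names, and expected values are illustrative assumptions, not part of Target's codebase.

```python
# Illustrative TDD-style test of a Spark transformation (pytest + PySpark).
import pytest
from pyspark.sql import SparkSession, functions as F


def add_revenue(df):
    """Derive a revenue column from quantity and unit price."""
    return df.withColumn("revenue", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="module")
def spark():
    # A small local session is enough for unit tests.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_add_revenue(spark):
    df = spark.createDataFrame(
        [(2, 10.0), (3, 5.0)],
        ["quantity", "unit_price"],
    )
    result = {r["quantity"]: r["revenue"] for r in add_revenue(df).collect()}
    assert result == {2: 20.0, 3: 15.0}
```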

Posted 1 month ago

Apply

5 - 10 years

5 - 9 Lacs

Pune

Work from Office

Source: Naukri

Project Role: Data Governance Practitioner
Project Role Description: Establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. You will collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data governance initiatives within the organization.
- Develop and implement data governance policies and procedures.
- Ensure compliance with data governance regulations and standards.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data governance principles.
- Experience in implementing data quality and data stewardship programs.
- Knowledge of data privacy regulations and compliance requirements.
- Experience in data management and data security practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
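As a hedged illustration of enforcing access policies on Databricks (assuming Unity Catalog is enabled), the snippet below documents and restricts a governed table. The catalog, schema, table, and group names are hypothetical and not taken from the posting.

```python
# Hedged sketch of governance housekeeping on Databricks (Unity Catalog assumed).
# Catalog, schema, table, and group names are hypothetical.
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` already exists; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Document the ownership and purpose of a governed table.
spark.sql("""
  COMMENT ON TABLE main.finance.transactions IS
  'Curated transactions table. Owner: Finance Data Stewards. PII is masked upstream.'
""")

# Grant read-only access to the analyst group and revoke broader write access.
spark.sql("GRANT SELECT ON TABLE main.finance.transactions TO `finance_analysts`")
spark.sql("REVOKE MODIFY ON TABLE main.finance.transactions FROM `finance_analysts`")
```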

Posted 1 month ago

Apply

18 - 23 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Source: Naukri

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 18 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in ensuring the smooth functioning of applications.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Engage with multiple teams and be responsible for team decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Provide solutions to business area problems.
- Lead and mentor junior professionals in the team.
- Collaborate with stakeholders to gather requirements.
- Conduct code reviews and ensure best practices are followed.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data architecture principles.
- Experience in designing and implementing data solutions.
- Knowledge of cloud platforms like AWS or Azure.
- Hands-on experience with ETL processes.
- Familiarity with data modeling and database design.

Additional Information:
- The candidate should have a minimum of 18 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
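One common ETL pattern on the Databricks platform is an incremental upsert into a Delta table; the sketch below is a minimal, assumption-laden example (table names and the join key are hypothetical), not the employer's actual implementation.

```python
# Hedged sketch of an incremental upsert with Delta Lake on Databricks.
# Table names, the join key, and the staging source are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` is provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging.customer_updates")          # latest extract
target = DeltaTable.forName(spark, "curated.customers")    # governed Delta table

# Upsert: update existing customer rows and insert new ones.
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```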

Posted 1 month ago

Apply

3 - 8 years

9 - 13 Lacs

Hyderabad

Work from Office

Source: Naukri

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the project.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
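Since the role calls for hands-on experience with algorithms such as logistic regression, here is a minimal scikit-learn sketch on synthetic data; the dataset, features, and parameters are illustrative assumptions only.

```python
# Minimal logistic regression sketch with scikit-learn (synthetic data, illustrative only).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data stands in for a real curated dataset.
X, y = make_classification(n_samples=500, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```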

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies