
3345 Databricks Jobs - Page 44

JobPe aggregates listings from multiple job portals for easy access; applications are submitted directly on the original job portal.

9.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Kroll's Portfolio Valuation team is a market leader in illiquid portfolio pricing valuation. Our Portfolio Valuation professionals assist clients in valuing alternative investments, specifically securities and positions for which no "active market" quotations are available. As part of the team, you will help our client-facing professionals develop solutions that empower us to better serve our clients in a rapidly evolving market. In addition to optimizing existing projects and building better solutions for Kroll engagement teams, this role will have client exposure as we work collaboratively to solve client needs. The ideal candidate should have prior financial analysis experience from a consulting, corporate, audit, or banking background, or other suitable evidence of a passion for developing deep technical skills. If you are an experienced professional in business modelling, now is a great time to join Kroll! We are expanding our Portfolio Valuation team and are looking for a motivated Vice President to join us and support our growth.

Day-to-day Responsibilities
- Conduct thorough reviews of existing Excel models and identify areas for structural and efficiency improvements.
- Develop and execute the identified solutions to enhance model performance and reliability.
- Utilize advanced Excel features (e.g., LET and LAMBDA functions), Visual Basic, and macros to automate repetitive tasks and streamline processes.
- Integrate SQL and Python scripting within Excel to extend functionality and data analysis capabilities (a minimal sketch follows this listing).
- Leverage Microsoft Copilot, Power Automate, and/or other advanced tools to optimize modeling techniques, outputs, and workflows.
- Design and structure Excel models to seamlessly integrate data from the Azure cloud environment into workflows.
- Contribute to the development of compelling data visualizations and dashboards in Excel and Power BI, translating complex data sets into clear, actionable insights.
- Proactively stay informed of industry best practices and emerging tools through ongoing education and professional development.
- Work with Kroll engagement teams to identify and implement modeling solutions for our clients.
- Supervise and mentor junior staff.

Essential Traits
- Bachelor's or Master's degree in Finance, Accounting, Economics, Computer Science, or a related field.
- Minimum 9-12 years of experience creating and improving advanced financial models in Excel.
- Demonstrated leadership experience, including managing and developing client relationships and mentoring and developing staff.
- Proven expertise in Excel modeling with a strong foundation in VBA, SQL, and/or Python.
- Familiarity with Databricks and/or Azure and the ability to integrate data from these platforms into Excel is preferred.
- A track record of creating data visualization solutions in Excel and Power BI is preferred.
- Knowledge of accounting concepts and alternative assets, including but not limited to private equity and/or private credit, is preferred.
- Excellent analytical, problem-solving, and project management abilities.
- Strong verbal and written communication skills, with the capacity to convey technical concepts to non-technical stakeholders.
- A passion for learning and evidence of self-directed learning on technical subjects.
- Ability to work with staff at all levels of the organization.

About Kroll
Join the global leader in risk and financial advisory solutions—Kroll. With a nearly century-long legacy, we blend trusted expertise with cutting-edge technology to navigate and redefine industry complexities. As part of One Team, One Kroll, you'll contribute to a collaborative and empowering environment, propelling your career to new heights. Ready to build, protect, restore, and maximize our clients' value? Your journey begins with Kroll.

To be considered for a position, you must formally apply via careers.kroll.com. Kroll is committed to equal opportunity and diversity, and recruits people based on merit.
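A minimal sketch of the SQL-plus-Python-in-Excel integration the responsibilities describe. The database, query, column names, and workbook path are illustrative assumptions, not Kroll's actual setup:

```python
# Hedged sketch: pulling SQL data into an Excel model with Python.
# The connection, query, and workbook path are illustrative assumptions.
import sqlite3          # stand-in for any SQL source (SQL Server, Azure SQL, ...)
import pandas as pd

# Query positions from a database (a local SQLite file here, for simplicity).
conn = sqlite3.connect("portfolio.db")
positions = pd.read_sql_query(
    "SELECT position_id, asset_class, fair_value FROM positions", conn
)

# Derive a summary the Excel model can reference directly.
by_class = positions.groupby("asset_class", as_index=False)["fair_value"].sum()

# Write both sheets into a workbook; openpyxl is pandas' standard xlsx engine.
with pd.ExcelWriter("valuation_model_inputs.xlsx", engine="openpyxl") as xl:
    positions.to_excel(xl, sheet_name="positions_raw", index=False)
    by_class.to_excel(xl, sheet_name="summary_by_class", index=False)
```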

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
- Design, develop, and optimize data pipelines and ETL processes using PySpark or Scala to extract, transform, and load large volumes of structured and unstructured data from diverse sources (a minimal PySpark sketch follows this listing).
- Implement data ingestion, processing, and storage solutions on the Azure cloud platform, leveraging services such as Azure Databricks, Azure Data Lake Storage, and Azure Synapse Analytics.
- Develop and maintain data models, schemas, and metadata to support efficient data access, query performance, and analytics requirements.
- Monitor pipeline performance, troubleshoot issues, and optimize data processing workflows for scalability, reliability, and cost-effectiveness.
- Implement data security and compliance measures to protect sensitive information and ensure regulatory compliance.

Requirements
- Proven experience as a Data Engineer, with expertise in building and optimizing data pipelines using PySpark, Scala, and Apache Spark.
- Hands-on experience with cloud platforms, particularly Azure, and proficiency in Azure services such as Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
- Strong programming skills in Python and Scala, with experience in software development, version control, and CI/CD practices.
- Familiarity with data warehousing concepts, dimensional modeling, and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
- Experience with big data technologies and frameworks (e.g., Hadoop, Hive, HBase) is a plus.

Mandatory Skill Sets: Spark, PySpark, Azure
Preferred Skill Sets: Spark, PySpark, Azure
Years of Experience Required: 8-12
Education Qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Fields of Study Required: Bachelor of Engineering, Master of Engineering, Master of Business Administration
Degrees/Fields of Study Preferred:
Certifications (if blank, certifications not specified):
Required Skills: Data Science
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages (if blank, desired languages not specified):
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
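A minimal PySpark batch-ETL sketch of the kind this role describes. The storage paths, container names, and column names are illustrative assumptions:

```python
# Hedged sketch: a minimal PySpark extract-transform-load pipeline.
# Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-etl").getOrCreate()

# Extract: read raw CSV landed in Azure Data Lake Storage (abfss path is hypothetical).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/2025/06/"))

# Transform: basic cleansing and an aggregate typical of an ETL step.
clean = (raw
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_date"))
         .filter(F.col("amount") > 0))

daily = clean.groupBy("order_date", "region").agg(F.sum("amount").alias("revenue"))

# Load: write as Delta for downstream Synapse / Power BI consumption.
daily.write.format("delta").mode("overwrite").partitionBy("order_date") \
     .save("abfss://curated@examplelake.dfs.core.windows.net/sales_daily/")
```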

Posted 1 week ago

Apply

9.0 - 12.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Source: LinkedIn

Kroll's Portfolio Valuation team is a market leader in illiquid portfolio pricing valuation. Our Portfolio Valuation professionals assist clients in valuing alternative investments, specifically securities and positions for which no "active market" quotations are available. As part of the team, you will help our client-facing professionals develop solutions that empower us to better serve our clients in a rapidly evolving market. In addition to optimizing existing projects and building better solutions for Kroll engagement teams, this role will have client exposure as we work collaboratively to solve client needs. The ideal candidate should have prior financial analysis experience from a consulting, corporate, audit, or banking background, or other suitable evidence of a passion for developing deep technical skills. If you are an experienced professional in business modelling, now is a great time to join Kroll! We are expanding our Portfolio Valuation team and are looking for a motivated Vice President to join us and support our growth.

Day-to-day Responsibilities
- Conduct thorough reviews of existing Excel models and identify areas for structural and efficiency improvements.
- Develop and execute the identified solutions to enhance model performance and reliability.
- Utilize advanced Excel features (e.g., LET and LAMBDA functions), Visual Basic, and macros to automate repetitive tasks and streamline processes.
- Integrate SQL and Python scripting within Excel to extend functionality and data analysis capabilities.
- Leverage Microsoft Copilot, Power Automate, and/or other advanced tools to optimize modeling techniques, outputs, and workflows.
- Design and structure Excel models to seamlessly integrate data from the Azure cloud environment into workflows.
- Contribute to the development of compelling data visualizations and dashboards in Excel and Power BI, translating complex data sets into clear, actionable insights.
- Proactively stay informed of industry best practices and emerging tools through ongoing education and professional development.
- Work with Kroll engagement teams to identify and implement modeling solutions for our clients.
- Supervise and mentor junior staff.

Essential Traits
- Bachelor's or Master's degree in Finance, Accounting, Economics, Computer Science, or a related field.
- Minimum 9-12 years of experience creating and improving advanced financial models in Excel.
- Demonstrated leadership experience, including managing and developing client relationships and mentoring and developing staff.
- Proven expertise in Excel modeling with a strong foundation in VBA, SQL, and/or Python.
- Familiarity with Databricks and/or Azure and the ability to integrate data from these platforms into Excel is preferred.
- A track record of creating data visualization solutions in Excel and Power BI is preferred.
- Knowledge of accounting concepts and alternative assets, including but not limited to private equity and/or private credit, is preferred.
- Excellent analytical, problem-solving, and project management abilities.
- Strong verbal and written communication skills, with the capacity to convey technical concepts to non-technical stakeholders.
- A passion for learning and evidence of self-directed learning on technical subjects.
- Ability to work with staff at all levels of the organization.

About Kroll
Join the global leader in risk and financial advisory solutions—Kroll. With a nearly century-long legacy, we blend trusted expertise with cutting-edge technology to navigate and redefine industry complexities. As part of One Team, One Kroll, you'll contribute to a collaborative and empowering environment, propelling your career to new heights. Ready to build, protect, restore, and maximize our clients' value? Your journey begins with Kroll.

To be considered for a position, you must formally apply via careers.kroll.com. Kroll is committed to equal opportunity and diversity, and recruits people based on merit.

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Source: LinkedIn

Bangalore, Karnataka, India

Working as a member of the Database Administrator (DBA) team, the DBA will be responsible for the availability, performance tuning, troubleshooting, security, migration, and upgrade of different database platforms/systems across AXA XL.

What You'll Be Doing
What will your essential responsibilities include?
- Installing and configuring Microsoft SQL Server instances in stand-alone, clustered, and Always On environments, along with database replication and mirroring.
- Working in Azure SQL DB, Azure SQL DW, and other Azure DB platforms.
- Assisting with SQL Server assessment, planning, migration, upgrades, automation, performance tuning, etc.
- Planning and executing the migration of on-premises Database/Analysis Services/Reporting Services/Integration Services workloads to Azure SQL DB / Azure PostgreSQL / Azure MySQL / SQL Server 2016 / 2019.
- SQL DB automation for DB deployments using PowerShell, ARM templates, etc., with DB versioning tools such as Azure DevOps SSDT and Liquibase.
- Database troubleshooting, performance tuning, query tuning of SQL/T-SQL, partitioning, file group implementation, and designing database indexes for high performance as part of migration/upgrade activities.
- Working independently and in coordination with other application developers to investigate, analyze, and resolve issues.
- Working with internal technical resources on project implementations and to resolve business and technical issues.
- Developing, managing, testing, and implementing database backup, recovery, and maintenance plans.
- Performing database capacity management, handling database incidents, and conducting root cause analysis.
- Recovering databases during disaster recovery testing and incidents.
- Liaising with Project Managers, Developers, Application Support, Release, and Infrastructure teams to ensure database integrity, security, and performance.

You will report to the GT SDC Operations Lead.

What You'll Bring
We're looking for someone who has these abilities and skills:

Required Skills and Abilities
- Relevant years of experience managing Relational Database Management Systems (RDBMS), particularly SQL Server.
- Relevant years of experience designing, implementing, and managing high-availability database solutions.
- Hands-on experience working in the Azure platform: SQL DB, ADLS, Azure Synapse, Azure Databricks, Azure Data Factory, Cosmos DB.
- Knowledge of CI/CD pipelines: Harness.
- Exposure to migrating existing on-premises SQL Server data loads to the cloud (Azure).
- Excellent working knowledge of SQL scripting/programming.
- Good verbal and written communication skills, with the ability to articulate technical issues in simple terms.
- Effective working knowledge of IT operations and support organisations would be an advantage.
- Ability to work independently with minimal supervision.
- Knowledge of third-party database auditing and performance monitoring solutions (a minimal monitoring sketch follows this listing).
- Relevant years of experience with Erwin or similar data modeling tools is a plus.
- Relevant years of experience producing technical documentation.
- Prior work experience with performance tuning, query plan analysis, and indexing.
- Experience doing backups and restores using SQL LiteSpeed/CommVault is a plus.

Desired Skills and Abilities
- Comprehensive knowledge of DBMS.
- Able to organise self and others, including effective scheduling, prioritisation, and time management, completing tasks to tight deadlines.
- Demonstrates a 'can do' attitude.
- Proven track record of providing consistently first-class customer service internally and/or externally.
- Ability to build effective working relationships (internally/externally), establishing credibility amongst a wide and demanding client group.
- Comfortable taking ownership of own work, identifying the need for action (using initiative) whilst working effectively within a team.

What We Offer
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

Who We Are
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another — and our business — to move forward and succeed.
- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe
- Robust support for flexible working arrangements
- Enhanced family-friendly leave benefits
- Named to the Diversity Best Practices Index
- Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Want to know more?
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of Resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars
- Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We're committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
- Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
- Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
- AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
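A minimal sketch of the kind of availability monitoring this DBA role covers: querying SQL Server's standard Always On DMVs from Python via pyodbc. The server name, driver version, and authentication settings are illustrative assumptions:

```python
# Hedged sketch: checking Always On availability-group health via pyodbc.
# Server name and connection settings are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlprod01.example.com;"        # hypothetical server
    "DATABASE=master;"
    "Trusted_Connection=yes;"
    "Encrypt=yes;TrustServerCertificate=yes;"
)

# sys.dm_hadr_availability_replica_states is the standard DMV for AG replica health.
query = """
SELECT ag.name AS availability_group,
       ar.replica_server_name,
       rs.role_desc,
       rs.synchronization_health_desc
FROM sys.dm_hadr_availability_replica_states rs
JOIN sys.availability_replicas ar ON rs.replica_id = ar.replica_id
JOIN sys.availability_groups  ag ON rs.group_id   = ag.group_id;
"""

for row in conn.cursor().execute(query):
    print(row.availability_group, row.replica_server_name,
          row.role_desc, row.synchronization_health_desc)
```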

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Experience: 5 to 10 years

Must Have
- Experience with tools like Hightouch, Segment, mParticle, Treasure Data, Amperity, Simon Data, or Salesforce CDP
- Experience in CDP implementation and marketing analytics
- Experience in Python, SQL, and AI/ML frameworks
- Experience in Azure Data Factory, Databricks, Synapse, Snowflake
- Experience deploying GenAI applications

Qualifications
- Bachelor's or master's degree in Computer Science, Data Science, Marketing Analytics, or a related field.
- 5+ years of experience in CDP implementation and marketing analytics.
- Hands-on experience with tools like Hightouch, Segment, mParticle, Treasure Data, Amperity, Simon Data, or Salesforce CDP (experience in any one tool is mandatory).
- Exposure to MLOps practices and model monitoring pipelines.
- Strong proficiency in Python, SQL, and ML frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Experience with cloud platforms and tools like Azure Data Factory, Databricks, Synapse, Snowflake.
- Hands-on experience with NLP techniques and libraries (e.g., spaCy, NLTK, transformers).
- Experience deploying GenAI applications and understanding of LLM frameworks.
- Familiarity with campaign attribution models, marketing funnels, and retention strategies.
- Excellent analytical, communication, and stakeholder management skills.

About the Role
We are excited to welcome a highly skilled CDP & Marketing Analytics Specialist to join our dynamic digital transformation team. The ideal candidate will possess hands-on experience with Customer Data Platforms (CDPs) and exhibit a deep understanding of customer data ingestion, real-time data pipelines, and AI-driven marketing strategies. You will play a pivotal role in driving data-driven decision-making and enabling hyper-personalized marketing at scale.

Key Responsibilities
- Design and implement end-to-end customer data ingestion strategies from diverse sources (CRM, web, mobile, social, PoS, etc.) into a unified CDP.
- Build and maintain real-time streaming data pipelines using tools such as Azure Data Factory, Databricks, Kafka, etc.
- Design and implement composable Customer Data Platforms (CDPs) leveraging modern data platforms such as Snowflake, Databricks, and similar technologies.
- Develop and deploy AI/ML-driven customer segmentation models (behavioral, demographic, predictive — e.g., high-CLV and churn-risk clusters); a minimal sketch follows this listing.

Skills: Azure, GenAI applications, AI/ML frameworks, Hightouch, Treasure Data, Amperity, mParticle, Azure Data Factory, marketing analytics, Salesforce CDP, Synapse, Databricks, spaCy, Segment, transformers, analytics, CDP, NLP techniques, Python, platforms, Snowflake, ML, SQL, customer data, NLTK, Simon Data
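A minimal behavioral-segmentation sketch with scikit-learn, of the kind the responsibilities above describe. The feature names, toy values, and cluster count are illustrative assumptions:

```python
# Hedged sketch: a minimal customer-segmentation model with k-means clustering.
# Feature names, values, and cluster count are illustrative assumptions.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical per-customer features exported from the CDP.
customers = pd.DataFrame({
    "recency_days":  [3, 40, 7, 120, 15, 200],
    "frequency_90d": [12, 2, 9, 1, 6, 0],
    "monetary_90d":  [540.0, 80.0, 410.0, 25.0, 260.0, 0.0],
})

# Standardize so no single feature dominates the distance metric.
X = StandardScaler().fit_transform(customers)

# Three segments is an arbitrary starting point; in practice you would tune k
# (e.g., with silhouette scores) and profile each cluster before activation.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
customers["segment"] = kmeans.fit_predict(X)

print(customers.groupby("segment").mean())
```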

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines (a minimal sketch follows this listing).
- Work with Hadoop/AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on AWS; experience with AWS EMR, AWS Glue, or Databricks, plus Amazon Redshift and DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.

Preferred Technical and Professional Experience
- Certification in AWS and Databricks, or Cloudera Spark certified developer.
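A minimal Spark Structured Streaming sketch of the streaming pipelines this role mentions, reading JSON events from Kafka. The broker address, topic name, schema, and checkpoint path are illustrative assumptions:

```python
# Hedged sketch: a minimal streaming pipeline reading from Kafka.
# Broker, topic, schema, and checkpoint path are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

schema = StructType([
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
])

# Kafka source: each record's value is a JSON event (hypothetical topic name).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Running aggregate per event type, emitted in micro-batches.
counts = events.groupBy("event_type").agg(F.sum("amount").alias("total"))

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .option("checkpointLocation", "/tmp/checkpoints/orders")
         .start())
query.awaitTermination()
```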

Posted 1 week ago

Apply

10.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

Ascentt is building cutting-edge data analytics & AI/ML solutions for global automotive and manufacturing leaders. We turn enterprise data into real-time decisions using advanced machine learning and GenAI. Our team solves hard engineering problems at scale, with real-world industry impact. We're hiring passionate builders to shape the future of industrial intelligence.

Job Title: Data Engineering Lead

Job Summary
We are seeking an experienced Data Engineering Lead to oversee and guide our organization's data engineering function. The ideal candidate will combine deep technical expertise in big data and data engineering with strong leadership skills to drive innovation, best practices, and excellence across our data engineering teams.

Key Responsibilities
- Establish and evolve the strategic direction for the data engineering practice across the organization.
- Define and implement best practices, standards, and methodologies for data engineering processes and technologies.
- Lead the design and architecture of scalable, enterprise-wide data solutions and platforms.
- Oversee multiple data engineering teams and projects, ensuring alignment with organizational goals and technical standards.
- Mentor and develop data engineering talent across the organization, fostering a culture of continuous learning and innovation.
- Collaborate with other practice leads and senior leadership to align data engineering initiatives with broader technology and business strategies.
- Stay abreast of emerging trends and technologies in data and feature engineering, evaluating their potential impact and guiding adoption where appropriate.
- Drive the selection, implementation, and optimization of data engineering tools and technologies.
- Oversee the design and implementation of data pipelines, ETL processes, and data integration solutions.
- Establish KPIs and metrics to measure the effectiveness and efficiency of data engineering practices.
- Represent the data engineering practice in executive-level meetings and decision-making processes.
- Guide the adoption of cloud-based data solutions and migration strategies.
- Participate in industry events, conferences, and communities to enhance the organization's reputation in data engineering.

Required Qualifications
- 10+ years of experience in data engineering roles, with at least 3 years in leadership positions.
- Deep expertise in big data technologies, cloud platforms, data integration tools, and programming languages such as Scala and PySpark, with strong Spark knowledge.
- Strong understanding of data governance, security, and compliance requirements.
- Proven track record of successfully leading large-scale data engineering initiatives.
- Excellent communication skills, with the ability to convey complex technical concepts to both technical and non-technical audiences.
- Strong leadership and mentoring abilities, with experience managing and developing teams.
- Experience in establishing or significantly growing a data engineering practice.
- Knowledge of machine learning and AI technologies and their integration with data engineering.
- Familiarity with Agile and DevOps methodologies.
- Relevant industry certifications (e.g., AWS Certified Big Data, Google Cloud Professional Data Engineer, Databricks, Snowflake).
- Experience in a consulting or professional services environment.
- Published articles or speaking engagements related to data engineering and/or feature engineering.
- Good knowledge of graph databases, vector databases, and building RAG frameworks (a minimal retrieval sketch follows this listing).

Preferred Qualifications
- Certifications: Snowflake SnowPro Certification or Databricks Data Engineer Professional.
- Scripting languages: Experience with Python or other scripting languages for automation and data processing.
- Industry experience: Familiarity with industries such as manufacturing, oil & gas, or automotive is a plus.

Why Ascentt?
- Innovative projects: Work on cutting-edge data solutions that drive business transformation.
- Collaborative environment: Join a team of professionals who value collaboration and continuous learning.
- Career growth: Opportunities for professional development and advancement within the company.
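A minimal sketch of the retrieval core of a RAG framework, as the qualifications mention: embed a small corpus, embed a query, and return the closest chunks to feed an LLM prompt. The model name and corpus are illustrative assumptions:

```python
# Hedged sketch: the retrieval step of a RAG setup using sentence embeddings.
# Model name and corpus are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, widely used embedder

corpus = [
    "Databricks jobs can be orchestrated with workflows and scheduled triggers.",
    "Delta Lake adds ACID transactions on top of cloud object storage.",
    "Feature stores keep training and serving features consistent.",
]
doc_vecs = model.encode(corpus, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k corpus chunks most similar to the query (cosine similarity)."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q            # normalized vectors: dot product == cosine
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]

# The retrieved chunks would be concatenated into the LLM prompt as context.
print(retrieve("How does Delta Lake provide reliability?"))
```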

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Description

Job Details
Position: Project Manager - Power BI & Customer Excellence
Division & Department: Enabling Functions, Business Technology Group (BTG)
Reporting To: Product Manager

Educational Qualifications
- Bachelor's / Master's Degree in Computer Science / BE / B.Tech (Computers)
- Preferred: Certifications in Data Management; Microsoft Certified Power BI Analyst (desirable)

Experience
- Total desired experience: 5-6 years in relevant data/digital project domains
- Minimum relevant experience: 5 years in Power BI implementation and application integration

Role and Responsibilities

Key Objectives
- Deliver end-to-end Power BI requirements across SBUs, including direct hands-on support for critical needs.
- Lead digital transformation initiatives involving Power BI, Salesforce, and integration between CRM, ERP, and custom applications.
- Manage the project lifecycle, ensuring quality and timely delivery through agile methodologies.
- Coordinate with internal teams and vendors to develop scalable data-driven solutions.
- Utilize product management tools to capture and track project progress effectively.

Major Deliverables
- Drive and execute Power BI & Salesforce projects: dashboards, automated reports, and data insights.
- Maintain project documentation, timelines, and communication using agile tools.
- Implement Power BI dashboards, KPIs, and scorecards; oversee deployment and publishing processes.
- Support Sales & Service-related initiatives in SFDC.
- Enable data governance, warehousing, and reporting using SQL, DAX, and Python.

Critical Competencies

Essential Attributes
- Proven experience in digital project management and delivery.
- Strong analytical thinking, problem-solving, and attention to detail.
- Ability to communicate effectively and engage stakeholders across functions.
- Agile mindset; adaptable to changing environments and technology evolution.
- High ownership, collaborative spirit, and vendor coordination capabilities.

Technical Competencies
- Languages: Python, C, C++, Scala
- Databases: SQL Server, Oracle, MySQL, PostgreSQL
- Data Engineering: HDFS, Apache Spark, Databricks
- Cloud Frameworks: AWS (EC2, EMR, S3, RDS)
- Machine Learning: pandas, NumPy, scikit-learn
- Tools: Spyder, Orange, Jupyter Notebook, Visual Studio
- Version Control/DevOps: Git, Maven, Docker

Domain & Functional Skills
- Strong Power BI knowledge (dashboarding, DAX, report lifecycle)
- Data analytics, ETL, and reporting fundamentals
- Functional exposure to SFDC, Oracle ERP, SAP, Infor
- Familiarity with agile project management practices

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, ensuring they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas throughout the software development life cycle.
- Facilitate root cause analysis of system issues and problem statements.
- Identify ideas to improve system performance and impact availability.
- Analyze client requirements and convert requirements to feasible design.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system.
- Ensure that code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all code issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better quality work.
- Take feedback on a regular basis to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document necessary details and reports formally for proper understanding of the software, from client proposal to implementation.
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver
No. | Performance Parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation; throughput %; adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery; manage software; troubleshoot queries; customer experience; completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: DataBricks - Data Engineering.

Posted 1 week ago

Apply

6.0 - 10.0 years

25 - 30 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

Source: Naukri

- Minimum of 6 years of data engineering experience.
- Must be an expert in SQL, Data Lake, Azure Data Factory, Azure Synapse, ETL, and Databricks.
- Must be an expert in data modeling and writing complex SQL queries.
- Ability to convert SQL code to PySpark (see the sketch after this listing).

Required Candidate Profile
- Experience with SQL, Python, data modeling, data warehousing, and dimensional modeling concepts.
- Familiarity with data governance, data security, and production deployments using Azure DevOps CI/CD pipelines.
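A minimal sketch of the SQL-to-PySpark conversion skill called out above: the same aggregate expressed first as SQL, then as an equivalent DataFrame chain. Table and column names are illustrative assumptions:

```python
# Hedged sketch: converting a SQL aggregate to its PySpark equivalent.
# Table and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sql-to-pyspark").getOrCreate()
orders = spark.table("sales.orders")  # hypothetical registered table

# SQL version:
#   SELECT region, SUM(amount) AS revenue
#   FROM sales.orders
#   WHERE status = 'SHIPPED'
#   GROUP BY region
#   HAVING SUM(amount) > 10000
#   ORDER BY revenue DESC;

# Equivalent DataFrame API chain:
revenue = (orders
           .filter(F.col("status") == "SHIPPED")
           .groupBy("region")
           .agg(F.sum("amount").alias("revenue"))
           .filter(F.col("revenue") > 10000)       # HAVING clause
           .orderBy(F.col("revenue").desc()))

revenue.show()
```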

Posted 1 week ago

Apply

2.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Description

Job Details: We are seeking a highly motivated and enthusiastic Junior Data Scientist with 2-3 years of experience to join our data science team. This role offers an exciting opportunity to contribute to both traditional machine learning projects for our commercial IoT platform (EDGE Live) and cutting-edge Generative AI initiatives.

Position: Data Scientist
Division & Department: Enabling Functions, Business Technology Group (BTG)
Reporting To: Customer & Commercial Experience Products Leader

Educational Qualifications
- Bachelor's degree in Mechanical Engineering, Computer Science, Data Science, Mathematics, or a related field.

Experience
- 2-3 years of hands-on experience with machine learning.
- Exposure to Generative AI concepts and techniques, such as Large Language Models (LLMs) and RAG architecture.
- Experience in manufacturing and with an IoT platform is preferable.

Role and Responsibilities

Machine Learning (ML)
- Assist in the development and implementation of machine learning models using frameworks such as TensorFlow, PyTorch, or scikit-learn.
- Help with Python development to integrate models with the overall application.
- Monitor and evaluate model performance using appropriate metrics and techniques (a minimal sketch follows this listing).

Generative AI
- Build GenAI-based tools for various business use cases by fine-tuning and adapting pre-trained generative models.
- Support the exploration of, and experimentation with, Generative AI models.

Research & Learning
- Stay up to date with the latest advancements and help with POCs.
- Proactively research and propose new techniques and tools to improve our data science capabilities.

Collaboration and Communication
- Work closely with cross-functional teams, including product managers, engineers, and business stakeholders, to understand requirements and deliver impactful solutions.
- Communicate findings, model performance, and technical concepts to both technical and non-technical audiences.

Technical Competencies
- Programming: Proficiency in Python, with experience in libraries like NumPy, pandas, and matplotlib for data manipulation and visualization.
- ML Frameworks: Experience with TensorFlow, PyTorch, or scikit-learn.
- Cloud & Deployment: Basic understanding of platforms such as Databricks, Google Cloud Platform (GCP), or Microsoft Azure for model deployment.
- Data Processing & Evaluation: Knowledge of data preprocessing, feature engineering, and evaluation metrics such as accuracy, F1-score, and RMSE.
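A minimal sketch computing the evaluation metrics this listing names (accuracy, F1-score, RMSE) with scikit-learn. The toy labels and predictions are illustrative only:

```python
# Hedged sketch: the evaluation metrics named above, on toy data.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, mean_squared_error

# Classification example: true vs. predicted labels.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]
print("accuracy:", accuracy_score(y_true, y_pred))
print("F1:", f1_score(y_true, y_pred))

# Regression example: RMSE is the square root of the mean squared error.
r_true = [3.0, 5.5, 2.1, 7.8]
r_pred = [2.8, 5.9, 2.5, 7.1]
print("RMSE:", np.sqrt(mean_squared_error(r_true, r_pred)))
```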

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Pune

Work from Office

Source: Naukri

Role Purpose
The purpose of the role is to create exceptional architectural solution designs and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do
1. Develop architectural solutions for new deals/major change requests in existing deals:
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning of RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture.
- Provide technical leadership to the design, development, and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions, and identify improvements, options, and tradeoffs to define target-state solutions.
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps.
- Evaluate and recommend solutions to integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail.
- Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view.
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions.
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks:
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of enterprise frameworks and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team while presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding:
- Ensure completion of necessary trainings and certifications.
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through the highest analyst rankings, client testimonials, and partner credits.
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
- Mentor developers, designers, and junior architects on the project for their further career development and enhancement.
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management:
- Resourcing: Anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team.
- Talent management: Ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: Set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: DataBricks - Data Engineering.

Posted 1 week ago

Apply

3.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Company Description

About Grab and Our Workplace
Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. At Grab, purpose gives us joy and habits build excellence, while harnessing the power of technology and AI to deliver the mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.

Job Description

Get to Know the Team
The GrabFin Analytics team supports the Fintech product, business, and risk orgs. Product Analysts are part of one or more tech families - Fin Core, Fin Experience, Fin Identity, Payments, and Financial Services (Invest, Insure & Lending). This role sits in the Payments tech family. We care about understanding how users experience the product and partner with Business, Product, Design, and Tech to focus on the right outcomes and feature set. We remain integral to product development, from understanding user journeys with UX designers to hypothesis development, right through to post-rollout optimization.

Get to Know the Role
- Use data to understand user needs - be the de facto Voice of the Customer (users, driver-partners, merchants, agents, etc.) for all GFG teams.
- Leverage data for further insights to improve decision-making at Grab by developing dashboards, maintaining pipelines, holding metrics reviews, and producing insight decks/experiments.
- Champion data-driven decision-making and culture in Grab Financial Group.
- Partner with Product Managers, Business Owners, UX Designers, Risk, and Engineering to design and deliver analytical projects that support the GFG product roadmap.
- Provide thought leadership and generate data-driven hypotheses to solve key product and business problems.

You will report to an Analytics Manager II. This is a hybrid role based in Bangalore (3 days work from office every week).

The Critical Tasks You Will Perform
- Support business-critical dashboard and pipeline maintenance for day-to-day data-driven decisions.
- Design and analyze A/B tests and multivariate experiments for UI/UX, layout, contextualization, algorithms, and APIs (a minimal sketch follows this listing).
- Mine clickstream and transactional data to derive insights on user behavior and drive GFG product metrics.
- Own instrumentation for feature releases within assigned tech families in GFG.
- Generate segmented customer and merchant insights to refine product iterations and improvements.
- Deliver reliable, on-time outputs and build scalable, automated self-serve solutions for stakeholders.

Qualifications

What Essential Skills You Will Need
- Bachelor's/Master's in Statistics, Analytics, Economics, Mathematics, Engineering, or related fields.
- 3-4 years of experience in Analytics, BI, or Data Science, preferably in Internet/E-Commerce with large, high-velocity data.
- Strong SQL experience querying large relational databases.
- Ability to translate data insights into relevant recommendations for non-technical and senior team members.
- Proficiency in Python, Databricks, and Tableau/Power BI, and expertise in A/B testing, hypothesis testing, and DoE principles.

Additional Information

Life at Grab
We care about your well-being at Grab; here are some of the global benefits we offer:
- We have your back with Term Life Insurance and comprehensive Medical Insurance.
- With GrabFlex, create a benefits package that suits your needs and aspirations.
- Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave.
- We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges.
- Balancing personal commitments and life's demands is made easier with our FlexWork arrangements, such as differentiated hours.

What We Stand For at Grab
We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
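A minimal sketch of the A/B-test analysis this role describes: a two-proportion z-test on conversion rates between a control and a treatment variant. The counts and significance threshold are illustrative assumptions:

```python
# Hedged sketch: analyzing a two-variant A/B test on conversion rate.
# Counts are toy data; the alpha level is an assumed pre-registered threshold.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: conversions and exposures per variant.
conversions = [520, 580]   # control, treatment
exposures   = [10000, 10000]

stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {stat:.3f}, p = {p_value:.4f}")

# With p below alpha, the treatment's lift in conversion rate
# (5.2% -> 5.8%) would be judged statistically significant.
if p_value < 0.05:
    print("Significant difference between variants.")
else:
    print("No significant difference detected.")
```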

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 12 Lacs

Pune

Work from Office

Source: Naukri

Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process:
- Review daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop an understanding of the process/product for team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequent trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates.
- Enroll in product-specific and other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: DataBricks - Data Engineering.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Dear Associates,

Greetings from Tata Consultancy Services!! Thank you for expressing your interest in exploring a career possibility with the TCS family.

Hiring for: Python AI/ML, MLOps
Must have: Spark, Hadoop, PyTorch, TensorFlow, Matplotlib, Seaborn, Tableau, Power BI, scikit-learn, XGBoost, AWS, Azure, Databricks, PySpark, Python, SQL, Snowflake
Experience: 5+ years
Location: Mumbai / Pune

If interested, kindly fill in the details below and send your resume to nitu.sadhukhan@tcs.com. Note: only eligible candidates with relevant experience will be contacted further.

- Name:
- Contact No:
- Email id:
- Current Location:
- Preferred Location:
- Highest Qualification (part-time / correspondence is not eligible):
- Year of Passing (Highest Qualification):
- Total Experience:
- Relevant Experience:
- Current Organization:
- Notice Period:
- Current CTC:
- Expected CTC:
- PAN Number:
- Gap in years, if any (Education / Career):
- Updated CV attached (Yes / No)?
- If attended any interview with TCS in the last 6 months:
- Available for walk-in drive on 14th June (Pune):

Thanks & Regards,
Nitu Sadhukhan
Talent Acquisition Group
Tata Consultancy Services
Let's Connect: linkedin.com/in/nitu-sadhukhan-16a580179
Nitu.sadhukhan@tcs.com

Posted 1 week ago

Apply

15.0 years

0 Lacs

Kerala

On-site

Source: Glassdoor

Senior Azure Data Engineer:
a. 15+ years of experience in the IT industry
b. 5+ years of experience with the Azure Data Engineering stack (Data Factory, Databricks, Synapse, Event Hub, Cosmos DB, ADLS Gen2, Function App)
c. 3+ years of experience with Python / PySpark
d. Good understanding of other Azure services
e. Excellent knowledge of SQL
f. Good understanding of data warehouse architecture, data modelling, and design concepts
g. Experience with Power BI, SFTP, messaging, and APIs
h. Ability to lead and guide the data engineering team technically and share best practices with the team
i. Excellent analytical and organization skills
j. Effective working in a team as well as working independently
k. Experience of working in Agile delivery
l. Knowledge of software development best practices
m. Strong written and verbal communication skills
n. DP-200/DP-201 certification is an added advantage

Posted 1 week ago

Apply

8.0 - 10.0 years

2 - 8 Lacs

Hyderābād

On-site

Source: Glassdoor

India - Hyderabad JOB ID: R-216601 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 09, 2025 CATEGORY: Information Systems Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager Technology – US Commercial Data & Analytics What you will do Let’s do this. Let’s change the world. In this vital role you will lead the engagement model between Amgen's Technology organization and our global business partners in Commercial Data & Analytics. We seek a technology leader with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders. Are you interested in building a team that consistently delivers business value in an agile model using technologies such as AWS, Databricks, Airflow, and Tableau? Come join our team! Roles & Responsibilities: Establish an effective engagement model to collaborate with senior leaders on the Sales Insights product team within the Commercial Data & Analytics organization, focused on operations within the United States Serve as the technology product owner for an agile product team committed to delivering business value to Commercial stakeholders via data pipeline buildout for sales data Lead and mentor junior team members to deliver on the needs of the business Interact with business clients and technology management to create technology roadmaps, build cases, and drive DevOps to achieve the roadmaps Help to mature Agile operating principles through deployment of creative and consistent practices for user story development, robust testing and quality oversight, and focus on user experience Ability to connect and understand our vast array Commercial and other functional data sources including Sales, Activity, and Digital data, etc. into consumable and user-friendly modes (e.g., dashboards, reports, mobile, etc.) for key decision makers such as executives, brand leads, account managers, and field representatives. Become the lead subject matter expert in reporting technology capabilities by researching and implementing new tools and features, internal and external methodologies What we expect of you We are all different, yet we all use our unique contributions to serve patients. 
Basic Qualifications: Master’s degree with 8-10 years of experience in Information Systems OR Bachelor’s degree with 10-14 years of experience in Information Systems OR Diploma with 14-18 years of experience in Information Systems

Must-Have Skills:
Excellent problem-solving skills and a passion for tackling complex challenges in data and analytics with technology
Experience leading data and analytics teams in a Scaled Agile Framework (SAFe)
Excellent interpersonal skills, strong attention to detail, and ability to influence based on data and business value
Ability to build compelling business cases with accurate cost and effort estimations
Experience writing user requirements and acceptance criteria in agile project management systems such as Jira
Ability to explain sophisticated technical concepts to non-technical clients
Strong understanding of sales and incentive compensation value streams

Preferred Qualifications:
Jira Align & Confluence experience
Experience with DevOps, Continuous Integration, and Continuous Delivery methodology
Understanding of software systems strategy, governance, and infrastructure
Experience in managing product features for PI planning and developing product roadmaps and user journeys
Familiarity with low-code/no-code test automation software
Technical thought leadership

Soft Skills:
Able to work effectively across multiple geographies (primarily India, Portugal, and the United States) under minimal supervision
Demonstrated proficiency in written and verbal communication in the English language
Skilled in providing oversight and mentoring team members; demonstrated ability to delegate work effectively
Intellectual curiosity and the ability to question partners across functions
Ability to prioritize successfully based on business value
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully across virtual teams
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills

Technical Skills:
ETL tools: experience with ETL tools such as Databricks, Redshift, or an equivalent cloud-based database
Big Data, Analytics, Reporting, Data Lake, and Data Integration technologies
S3 or an equivalent storage system
AWS (or similar cloud-based platforms)
BI tools (Tableau and Power BI preferred)

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

GlassDoor logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities
Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
Demonstrate knowledge of relevant industry trends and standards.
Demonstrate strong analytical and technical problem-solving skills.

Qualifications we seek in you!

Minimum qualifications
Bachelor’s Degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline) or equivalent work experience.
Must have excellent coding skills in either Python or Scala, preferably Python.
Must have experience in the Data Engineering domain.
Must have implemented at least 2 end-to-end projects in Databricks.
Must have hands-on experience with the core Databricks components: Delta Lake, dbConnect, Databricks API 2.0, and Databricks Workflows orchestration.
Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments.
Must have a good understanding of how to create complex data pipelines.
Must have good knowledge of data structures & algorithms.
Must be strong in SQL and Spark SQL.
Must have strong performance-optimization skills to improve efficiency and reduce cost.
Must have worked on both batch and streaming data pipelines (see the sketch after this posting for a minimal example of each).
Must have extensive knowledge of the Spark and Hive data processing frameworks.
Must have worked on at least one cloud (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases.
Must be strong in writing unit and integration test cases.
Must have strong communication skills and have worked on teams of 5+.
Must have a great attitude towards learning new skills and upskilling existing ones.

Preferred Qualifications
Good to have Unity Catalog and basic governance knowledge.
Good to have an understanding of Databricks SQL Endpoints.
Good to have CI/CD experience building pipelines for Databricks jobs.
Good to have worked on a migration project to build a unified data platform.
Good to have knowledge of DBT.
Good to have knowledge of Docker and Kubernetes.
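
As a point of reference for the batch-versus-streaming requirement above, here is a minimal, illustrative PySpark sketch (not taken from the posting) showing that a Delta table can be read in both modes, with a checkpoint location so the streaming job can recover its state. All paths are hypothetical.

```python
# Hedged sketch: batch and streaming reads of the same Delta table on Databricks.
# Paths below are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

# Batch read of a Delta table
batch_df = spark.read.format("delta").load("/mnt/bronze/events")

# Streaming read of the same table; Delta supports both access modes
stream_df = spark.readStream.format("delta").load("/mnt/bronze/events")

# writeStream with a checkpoint so the query can restart from where it left off
query = (stream_df.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/chk/events_silver")
         .outputMode("append")
         .start("/mnt/silver/events"))
```
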
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job: Lead Consultant. Primary Location: India-Hyderabad. Schedule: Full-time. Education Level: Bachelor's / Graduation / Equivalent. Job Posting: Jun 9, 2025, 9:15:16 AM. Unposting Date: Ongoing. Master Skills List: Digital. Job Category: Full Time.

Posted 1 week ago

Apply

3.0 years

6 - 9 Lacs

Hyderābād

On-site

GlassDoor logo

Job Description: Manager, Senior Data Engineer

The Opportunity

Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview

Designs, builds, and maintains data pipeline architecture: ingests, processes, and publishes data for consumption.
Batch-processes collected data and formats it in an optimized way to make it analysis-ready.
Ensures best practices are shared across the organization.
Enables delivery of data-analytics projects.
Develops deep knowledge of the company's supported technology; understands the complexity of dependencies between multiple teams and platforms (people, technologies).
Communicates intensively with other platforms/competencies to stay abreast of new trends and methodologies being implemented or considered within the company ecosystem.
Understands the business needs and priorities of customers and stakeholders and helps build solutions that support our business goals.
Establishes and manages close relationships with customers/stakeholders.
Maintains an overview of the data engineering market in order to explore new ways of delivering pipelines that increase their value and contribution.
Builds a “community of practice” leveraging experience from delivering complex analytics projects.
Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance, and excellent user experience.
Contributes to innovative experiments, specifically to idea generation, idea incubation, and/or experimentation, identifying tangible and measurable criteria.

Qualifications:
Bachelor’s degree in Computer Science, Data Science, Information Technology, Engineering, or a related field.
3+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects.
3+ years of SQL experience, with the ability to write and optimize queries for large datasets.
1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development.
Experience with Databricks, including creating notebooks and utilizing Spark for big data processing. Strong experience with data warehousing solutions (such as Snowflake), including schema design and performance optimization. Experience with data governance and quality management tools, particularly Collibra DQ. Strong analytical and problem-solving skills, with attention to detail.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who we are: We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What we look for: Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. #HYDIT2025 Current Employees apply HERE Current Contingent Workers apply HERE

Secondary Language(s) Job Description: In light of the current state of the AHIT commercial data and analytics function, it is imperative to address the limitations posed by relying on multiple HCL contractors who primarily possess ETL/Informatica skills but lack essential data engineering expertise. This gap hinders the seamless flow of data through the AHIT digital backbone, which is critical for the successful implementation of the JEDI strategy. The future state envisioned by Merck IT emphasizes a strategic shift towards reducing contractor expenditures by developing in-house capabilities. By leveraging the Hyderabad tech center, we can cultivate a skilled team of data engineers who are not only proficient in ETL processes but also possess the advanced data engineering skills necessary to optimize our data infrastructure. This transition will enable us to enhance data accessibility, improve data quality, and facilitate a frictionless data flow, ultimately supporting the overarching goals of the JEDI strategy. Investing in the hiring of dedicated data engineers will yield significant long-term benefits, including reduced operational costs, increased agility in data management, and the ability to innovate and adapt to evolving business needs. By fostering a robust in-house data engineering team, we can ensure that our data capabilities align with Merck's strategic objectives, driving efficiency and effectiveness across the organization.

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities.
All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Business, Business Intelligence (BI), Business Management, Contractor Management, Cost Reduction, Database Administration, Database Optimization, Data Engineering, Data Flows, Data Infrastructure, Data Management, Data Modeling, Data Optimization, Data Quality, Data Visualization, Design Applications, Information Management, Management Process, Operating Cost Reduction, Productivity Improvements, Project Engineering, Social Collaboration, Software Development, Software Development Life Cycle (SDLC) {+ 1 more} Preferred Skills: Job Posting End Date: 08/20/2025 A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R350683

Posted 1 week ago

Apply

6.0 - 10.0 years

2 - 7 Lacs

Hyderābād

On-site

GlassDoor logo

Country/Region: IN Requisition ID: 26282 Work Model: Position Type: Salary Range: Location: INDIA - HYDERABAD - BIRLASOFT OFFICE

Title: Sr Technical Lead - Data Engg

Description: Area(s) of responsibility

About Us: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.

Job Description:
Years of experience: 6 to 10 years
Experience performing design, development & deployment using Azure services (Data Factory, Databricks, PySpark, SQL)
Develop and maintain scalable data pipelines and build new data-source integrations to support increasing data volume and complexity (a minimal sketch of a parameterized pipeline step follows below)
Experience in creating Technical Specification Design and Application Interface Design
Develop Modern Data Warehouse solutions using the Azure stack (Azure Data Lake, Azure Databricks) and PySpark
Develop batch processing and integration solutions and process structured and non-structured data
Demonstrated in-depth skills with Azure Databricks, PySpark, and SQL
Collaborate and engage with BI & analytics and the business team
Minimum 2 years of project experience in Azure Databricks
Minimum 2 years of experience in ADF
Minimum 2 years of experience in PySpark
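
To make the ADF-to-Databricks hand-off above concrete, here is a hedged sketch of a parameterized Databricks notebook that an ADF pipeline could invoke through a Notebook activity. The widget names, paths, and columns are hypothetical; `spark` and `dbutils` are provided implicitly inside a Databricks notebook.

```python
# Illustrative Databricks notebook body. ADF's Notebook activity can pass
# "baseParameters" that populate these widgets. All names are hypothetical.
dbutils.widgets.text("run_date", "2025-01-01")
dbutils.widgets.text("source_path", "/mnt/raw/orders")

run_date = dbutils.widgets.get("run_date")
source_path = dbutils.widgets.get("source_path")

# Filter the day's partition and stage it as Delta for downstream steps
df = (spark.read.format("parquet").load(source_path)
      .where(f"order_date = '{run_date}'"))

df.write.format("delta").mode("overwrite").save(f"/mnt/staged/orders/{run_date}")
```
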

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must-have skills: Databricks Unified Data Analytics Platform, knowledge of EC Payroll, Payroll Configuration Support
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create proofs of architecture to test architecture viability, security, and performance.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the team in implementing innovative solutions
- Conduct regular team meetings to ensure alignment and progress
- Mentor junior team members to enhance their skills

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of cloud architecture and deployment
- Experience with infrastructure-as-code tools like Terraform
- Knowledge of containerization technologies such as Docker and Kubernetes
- Hands-on experience with CI/CD pipelines for automated deployments

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Role Description: The role is responsible for designing, developing, and maintaining software solutions for research scientists. Additionally, it involves automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting for our Large Molecule Discovery Research team (Design, Make, Test and Analyze processes). The team also interfaces heavily with teams supporting our in vitro assay management systems and our compound inventory platforms. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).

Roles & Responsibilities:
Work closely with the product team, the business team (including scientists), and other stakeholders
Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications
Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements
Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software
Conduct code reviews to ensure code quality and adherence to best practices
Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations
Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently
Stay updated with the latest technology and security trends and advancements

Basic Qualifications and Experience:
Master’s degree with 1-3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field OR Bachelor’s degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field OR Diploma with 7-9 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field

Preferred Qualifications and Experience:
1+ years of experience in implementing and supporting biopharma scientific software platforms

Functional Skills:

Must-Have Skills:
Proficient in Java or Python
Proficient in at least one JavaScript UI framework (e.g. ExtJS, React, or Angular)
Proficient in SQL (e.g. Oracle, PostgreSQL, Databricks)

Good-to-Have Skills:
Experience with event-based architecture and serverless AWS services such as EventBridge, SQS, Lambda, or ECS (a minimal sketch follows after this posting)
Experience with Benchling
Hands-on experience with full stack software development
Strong understanding of software development methodologies, mainly Agile and Scrum
Working experience with DevOps practices and CI/CD pipelines
Experience with infrastructure-as-code (IaC) tools (Terraform, CloudFormation)
Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
Experience with automated testing tools and frameworks
Experience with big data technologies (e.g., Spark, Databricks, Kafka)
Experience with leveraging AI assistants (e.g. GitHub Copilot) to accelerate software development and improve code quality

Professional Certifications: AWS Certified Cloud Practitioner preferred

Soft Skills:
Excellent problem solving, analytical, and troubleshooting skills
Strong communication and interpersonal skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to learn quickly & work independently
Team-oriented, with a focus on achieving team goals
Ability to manage multiple priorities successfully
Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
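As a hedged illustration of the event-based pattern named in the Good-to-Have skills above (not code from the posting), here is a minimal Python sketch that publishes a domain event to an SQS queue for a downstream Lambda consumer. The queue URL, region, and event fields are hypothetical placeholders.

```python
# Publish a domain event to SQS; a Lambda function subscribed to the queue
# would process it asynchronously. All identifiers below are hypothetical.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

event = {"entity": "assay_result", "id": "AR-0001", "action": "created"}

sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/example-events",
    MessageBody=json.dumps(event),
)
```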

Posted 1 week ago

Apply

5.0 years

15 Lacs

Hyderābād

On-site

GlassDoor logo

Experience: 5-10 years

JD: Mandatory skillset: Snowflake, DBT, and data architecture design experience in Data Warehouse environments. Good to have: Informatica or other ETL knowledge or hands-on experience. Good to have: an understanding of Databricks. 5-10 years of IT experience, with 2+ years of data architecture experience in Data Warehouse and 3+ years in Snowflake.

Responsibilities
Design, implement, and manage cloud-based solutions on AWS and Snowflake (a minimal Snowflake access sketch follows after this posting).
Work with stakeholders to gather requirements and design solutions that meet their needs.
Develop and execute test plans for new solutions.
Oversee and design the information architecture for the data warehouse, including all information structures such as the staging area, data warehouse, data marts, and operational data stores.
Optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency.
Deep understanding of data warehousing, enterprise architectures, dimensional modeling, star & snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion, transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.
Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical & dimensional models).
Maintain documentation: develop and maintain detailed documentation for data solutions and processes.
Provide training: offer training and leadership to share expertise and best practices with the team.
Collaborate with the team and provide leadership to the data engineering team, ensuring that data solutions are developed according to best practices.

Job Type: Full-time
Pay: From ₹1,500,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
What is your notice period?
How many years of experience do you have in Snowflake?
How many years of Data Architecture experience do you have in Data Warehouse?
What is your current location?
Are you OK with working from the office in Hyderabad?
What is your current CTC?
What is your expected CTC?
Experience: total work: 5 years (Required)
Work Location: In person
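
For orientation, here is a minimal, illustrative sketch of querying Snowflake from Python using the official snowflake-connector-python package. The account, warehouse, database, schema, and table names are placeholders, not details from the posting.

```python
# Hedged sketch: connect to Snowflake and run an aggregate query.
# All credentials and object names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="EXAMPLE_USER",
    password="***",
    account="example_account",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="MARTS",
)

cur = conn.cursor()
try:
    cur.execute(
        "SELECT order_date, SUM(amount) FROM fact_orders GROUP BY order_date"
    )
    for row in cur.fetchmany(10):
        print(row)
finally:
    cur.close()
    conn.close()
```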

Posted 1 week ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

Remote

Linkedin logo

We are seeking a proactive and business-oriented Data Functional Consultant with strong experience in Azure Data Factory and Azure Databricks. This role bridges the gap between business stakeholders and technical teams—translating business needs into scalable data solutions, ensuring effective data management, and enabling insights-driven decision-making. The ideal candidate is not a pure developer or data engineer but someone who understands business processes, data flows, and stakeholder priorities, and who can help drive value from data platforms using cloud-native Azure services.

What You’ll Do:
Collaborate closely with business stakeholders to gather, understand, and document functional data requirements.
Translate business needs into high-level data design, data workflows, and process improvements.
Work with data engineering teams to define and validate ETL/ELT logic and data pipeline workflows using Azure Data Factory and Databricks.
Facilitate functional workshops and stakeholder meetings to align on data needs and business KPIs.
Act as a bridge between business teams and data engineers to ensure accurate implementation and delivery of data solutions.
Conduct data validation and UAT, and support users in adopting data platforms and self-service analytics.
Maintain functional documentation, data dictionaries, and mapping specifications.
Assist in defining data governance, data quality, and master data management practices from a business perspective.
Monitor data pipeline health and help triage issues from a functional/business-impact standpoint.

What You’ll Bring:
Proven exposure to Azure Data Factory (ADF) for orchestrating data workflows.
Practical experience with Azure Databricks for data processing (functional understanding, not necessarily coding).
Strong understanding of data warehousing, data modeling, and business KPIs.
Experience working in agile or hybrid project environments.
Excellent communication and stakeholder management skills.
Ability to translate complex technical details into business-friendly language.
Familiarity with tools like Power BI, Excel, or other reporting solutions is a plus.
Background in the banking or finance industries is a bonus.

What We Offer: At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well-supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed. We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities, wellness and mental health programs to ensure you feel supported.

Posted 1 week ago

Apply

3.0 years

10 Lacs

Gurgaon

Remote

GlassDoor logo

Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams.

You Will:
Design and build resilient and efficient data pipelines for batch and real-time streaming.
Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools.
Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches.
Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities.
Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations.
Execute projects with an Agile mindset.
Build software frameworks to solve data problems at scale.

Technical Requirements:
3+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI is a plus.
Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required.
Strong programming/scripting experience using SQL, Python, and Spark.
Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket.
Experience with Agile development methods in data-oriented projects.

Other Requirements:
Highly motivated self-starter and team player with demonstrated success in prior roles.
Track record of success working through technical challenges within enterprise organizations.
Ability to prioritize deals, training, and initiatives through highly effective time management.
Excellent problem solving, analytical, presentation, and whiteboarding skills.
Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems.
Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations.
Certifications in Azure Data Engineering and related technologies.

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Posted 1 week ago

Apply

Exploring Databricks Jobs in India

Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Databricks professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

In the field of Databricks, a typical career path may include:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Related Skills

In addition to Databricks expertise, other skills that are often expected or helpful alongside Databricks include:

  • Apache Spark
  • Python/Scala programming
  • Data modeling
  • SQL
  • Data visualization tools

Interview Questions

  • What is Databricks and how is it different from Apache Spark? (basic)
  • Explain the concept of lazy evaluation in Databricks. (medium)
  • How do you optimize performance in Databricks? (advanced)
  • What are the different cluster modes in Databricks? (basic)
  • How do you handle data skewness in Databricks? (medium)
  • Explain how you can schedule jobs in Databricks. (medium)
  • What is the significance of Delta Lake in Databricks? (advanced)
  • How do you handle schema evolution in Databricks? (medium; see the code sketch after this list)
  • What are the different file formats supported by Databricks for reading and writing data? (basic)
  • Explain the concept of checkpointing in Databricks. (medium)
  • How do you troubleshoot performance issues in Databricks? (advanced)
  • What are the key components of Databricks Runtime? (basic)
  • How can you secure your data in Databricks? (medium)
  • Explain the role of MLflow in Databricks. (advanced)
  • How do you handle streaming data in Databricks? (medium)
  • What is the difference between Databricks Community Edition and Databricks Workspace? (basic)
  • How do you set up monitoring and alerting in Databricks? (medium)
  • Explain the concept of Delta caching in Databricks. (advanced)
  • How do you handle schema enforcement in Databricks? (medium)
  • What are the common challenges faced in Databricks projects and how do you overcome them? (advanced)
  • How do you perform ETL operations in Databricks? (medium)
  • Explain the concept of MLflow Tracking in Databricks. (advanced)
  • How do you handle data lineage in Databricks? (medium)
  • What are the best practices for data governance in Databricks? (advanced)
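
By way of example, here is a hedged sketch of an answer to the schema-evolution and schema-enforcement questions above. It assumes a Databricks cluster (or a Spark session with Delta Lake configured); the table path and columns are made up for illustration.

```python
# Hedged sketch: Delta Lake schema enforcement vs. schema evolution.
# Assumes Delta Lake is available (e.g., on a Databricks cluster).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("schema-evolution").getOrCreate()

# Write the initial version of a table
df_v1 = spark.createDataFrame([(1, "widget")], ["id", "name"])
df_v1.write.format("delta").mode("overwrite").save("/tmp/products")

# A later batch arrives with an extra column; without mergeSchema this append
# would fail Delta's schema enforcement check.
df_v2 = spark.createDataFrame([(2, "gadget", 9.99)], ["id", "name", "price"])
(df_v2.write.format("delta")
 .mode("append")
 .option("mergeSchema", "true")
 .save("/tmp/products"))
```

Delta rejects the second write without `mergeSchema` because the incoming data adds a column; enabling the option tells Delta to evolve the table schema instead of failing the write.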

Closing Remark

As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies