
935 Databricks Jobs - Page 34

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

- 4 years

0 - 2 Lacs

Hyderabad

Work from Office


Dear Aspirant, greetings from TalentSmart! We are hiring for multiple positions in the Data Engineering & Analytics space. We are looking for both fresh graduates (B.Tech - CSE/ECE) and experienced professionals (3+ years) with strong knowledge of SQL and Python. This is a great opportunity to join a dynamic team working on real-world data problems and cloud technologies.

Open Roles:

Freshers
- B.Tech (CSE/ECE), 2023/2024 passouts
- Strong communication skills
- Trained or self-taught in SQL and Python (mandatory)
- Good understanding of programming basics and databases

Experienced Professionals (3+ years)
- Hands-on experience with SQL and Python (mandatory)
- Experience in data processing, ETL pipelines, and data analysis
- Exposure to Azure, Databricks, and PySpark is a plus
- Ability to write clean, optimized code for large-scale data sets
- Collaboration with cross-functional teams to deliver data insights

Salary: As per industry standards / based on experience
Educational Qualification: B.Tech/B.E. in Computer Science, Electronics, or related fields
How to Apply: Apply via Naukri.com or send your resume to devi@talentsmart.co.in or jahnavi@talentsmart.co.in

Posted 1 month ago

Apply

8 - 10 years

15 - 27 Lacs

New Delhi, Hyderabad, Bengaluru

Hybrid


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant - Databricks Developer and ETL. In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements. You will be part of the data integrity/analysis team in the banking and financial domain, and will independently build data analysis around complex business problems from data available in client-owned or client-accessible systems. You are expected to understand the business problem, assess the scope, quantity, and quality of the available data, prepare and build code using PySpark/Databricks and Python, and load data into the DWH and data marts for downstream consumption.

Responsibilities
- Extensive hands-on experience with Python (PySpark) and PySpark with SQL
- Experience working with RDDs, struct types, and related PySpark constructs
- Exposure to Databricks notebooks for PySpark and PySpark-with-SQL coding
- Good hands-on experience integrating with AWS services using Python
- Experience with cloud technologies such as AWS (S3, Redshift, SNS)
- Expertise in developing ETL and batch processes to support data movement
- Strong communication skills and a self-driven attitude
- Ability to work independently on own deliverables and discuss them directly with the onshore customer

Qualifications we seek in you
Minimum qualifications/skills: Degree (BE, BSc)
Preferred qualifications: Good communication and client-handling skills

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
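For illustration only, a minimal PySpark sketch of the kind of ETL step this role describes: reading a raw file drop from S3 with an explicit struct-typed schema, applying light cleansing, and writing a curated output for downstream loading. The bucket, paths, and column names are hypothetical, not part of the posting.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, DateType

# Build (or reuse) a Spark session; on Databricks a session already exists as `spark`.
spark = SparkSession.builder.appName("txn_etl_sketch").getOrCreate()

# Explicit struct-typed schema for a hypothetical banking transactions feed.
schema = StructType([
    StructField("txn_id", StringType(), False),
    StructField("account_id", StringType(), False),
    StructField("txn_date", DateType(), True),
    StructField("amount", DoubleType(), True),
    StructField("channel", StringType(), True),
])

# Extract: read the raw CSV drop from S3 (hypothetical bucket/prefix).
raw = (spark.read
       .option("header", "true")
       .schema(schema)
       .csv("s3://example-bucket/raw/transactions/"))

# Transform: basic cleansing and a derived partition column.
cleaned = (raw
           .dropDuplicates(["txn_id"])
           .filter(F.col("amount").isNotNull())
           .withColumn("txn_month", F.date_format("txn_date", "yyyy-MM")))

# Load: write a partitioned Parquet output for the downstream DWH/data-mart load.
(cleaned.write
 .mode("overwrite")
 .partitionBy("txn_month")
 .parquet("s3://example-bucket/curated/transactions/"))
```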

Posted 1 month ago

Apply

5 - 10 years

20 - 30 Lacs

Bengaluru

Remote


ABOUT OPORTUN
Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

WORKING AT OPORTUN
Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

As a Sr. Data Engineer at Oportun, you will be a key member of our team, responsible for designing, developing, and maintaining sophisticated software and data platforms in support of the engineering group's charter. Your mastery of a technical domain enables you to take on business problems and solve them with technical solutions. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. In this role you will have the opportunity to lead the technology effort, from technical requirements gathering to final delivery, for large cross-functional, multi-month initiatives.

RESPONSIBILITIES
- Data Architecture and Design: Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements. Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.
- Data Pipeline Development and Optimization: Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data. Optimize data pipelines for performance, reliability, and scalability.
- Database Management and Optimization: Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security. Implement and manage ETL processes for efficient data loading and retrieval.
- Data Quality and Governance: Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations. Drive initiatives to improve data quality and documentation of data assets.
- Mentorship and Leadership: Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth. Lead and participate in code reviews, ensuring best practices and high-quality code.
- Collaboration and Stakeholder Management: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet them. Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value.
- Performance Monitoring and Optimization: Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability.

COMMON REQUIREMENTS
- You have a strong understanding of a business or system domain, with sufficient knowledge and expertise around the appropriate metrics and trends.
- You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective solutions.
- You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
- You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability.
- You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
- You play a significant role in the ongoing evolution and refinement of the tools and applications used by the team, and drive adoption of new practices within your team.
- You take ownership of customer issues, including initial troubleshooting, identification of root cause, and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
- You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.
- You independently drive and lead multiple features, contribute to large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed.
- You keep your lead/EM informed about your work and that of the team so they can share it with stakeholders, including escalation of issues.

QUALIFICATIONS
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
- Proficiency in programming languages such as Python/PySpark and Java or Scala.
- Expertise in big data technologies such as Hadoop, Spark, and Kafka.
- In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MariaDB, NoSQL databases).
- Experience building complex end-to-end data pipelines.
- Experience with orchestration and job scheduling using CI/CD and workflow tools such as Jenkins, Airflow, or Databricks.
- Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
- Ability to mentor junior team members.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
- Strong leadership, problem-solving, and decision-making skills.
- Excellent communication and collaboration abilities.
- Familiarity with or certification in Databricks is a plus.
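As a purely illustrative sketch of the orchestration work mentioned above, here is a minimal Airflow DAG (assuming Airflow 2.x) that schedules a daily ingest-then-validate sequence; the DAG id, task callables, and schedule are hypothetical.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_daily_partition(**context):
    # Placeholder: pull the day's files from object storage into the staging area.
    print(f"ingesting partition for {context['ds']}")


def run_quality_checks(**context):
    # Placeholder: row counts, null-rate thresholds, schema-drift checks.
    print(f"validating partition for {context['ds']}")


with DAG(
    dag_id="daily_member_data_pipeline",              # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",                    # run at 02:00 every day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_daily_partition)
    validate = PythonOperator(task_id="validate", python_callable=run_quality_checks)

    ingest >> validate                                # validate only after ingest succeeds
```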
We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.

Posted 1 month ago

Apply

3 - 6 years

20 - 25 Lacs

Bengaluru

Work from Office


Job Title: Data Science Consultant, S&C GN
Management Level: Consultant
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI/ML, SQL, Python, Azure/AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Job Summary: We are seeking a highly skilled and motivated Data Science Consultant to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities
1. Data Science and Engineering: Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks. Develop predictive models, recommendation systems, and optimization solutions tailored to business needs. Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data. Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting: Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes. Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise: Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects. Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise: Lead the development of Generative AI-based applications and solutions leveraging frameworks such as LangChain and LlamaIndex. Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications. Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications
- Experience: 4+ years in data science
- Education: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
- Industry knowledge: Preferred experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Technical Skills
- Programming: Proficiency in Python, SQL, and PySpark
- GenAI expertise: Hands-on experience building GenAI-based applications and solutions; experience deploying GenAI applications in production
- Cloud platforms: Experience with Azure/AWS/GCP
- Visualization tools: Power BI/Tableau

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset
- Good communication and stakeholder management capabilities
- Strong at generating business insights and presenting them to stakeholders

Qualifications
Experience: 4-8 years in data science
Educational Qualification: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
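Purely as an illustration of the predictive-modelling work listed under "Data Science and Engineering", here is a minimal scikit-learn sketch that trains and evaluates a baseline regression model; the CSV path and feature names are hypothetical, and in practice the model would be tuned and tracked (e.g., with an experiment tracker).

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical dataset of plant sensor readings and energy output.
df = pd.read_csv("plant_readings.csv")
features = ["temperature", "pressure", "flow_rate", "ambient_humidity"]
target = "energy_output_mwh"

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, random_state=42
)

# Baseline predictive model; hyperparameters left at defaults for the sketch.
model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"MAE: {mean_absolute_error(y_test, preds):.3f}")
```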

Posted 1 month ago

Apply

2 - 5 years

17 - 20 Lacs

Pune

Work from Office


Job Title: Data Science Analyst, S&C GN
Management Level: Analyst
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI/ML, SQL, Python, Azure/AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Job Summary: We are seeking a highly skilled and motivated Data Science Analyst to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities
1. Data Science and Engineering: Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks. Develop predictive models, recommendation systems, and optimization solutions tailored to business needs. Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data. Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting: Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes. Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise: Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects. Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise: Lead the development of Generative AI-based applications and solutions leveraging frameworks such as LangChain and LlamaIndex. Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications. Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications
- Experience: 1-5 years in data science
- Education: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
- Industry knowledge: Preferred experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Technical Skills
- Programming: Proficiency in Python, SQL, and PySpark
- GenAI expertise: Hands-on experience building GenAI-based applications and solutions; experience deploying GenAI applications in production
- Cloud platforms: Experience with Azure/AWS/GCP
- Visualization tools: Power BI/Tableau

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset
- Good communication and stakeholder management capabilities
- Strong at generating business insights and presenting them to stakeholders

About Our Company | Accenture

Qualifications
Experience: 1-5 years in data science
Educational Qualification: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
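As a small, illustrative example of the model-evaluation metrics this posting mentions (BLEU/ROUGE), here is a sketch that scores a generated sentence against a reference using NLTK's BLEU implementation; the texts are made up.

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Hypothetical reference answer and model output, tokenised by whitespace.
reference = "the turbine was shut down for scheduled maintenance on friday".split()
candidate = "the turbine was stopped for planned maintenance on friday".split()

# Smoothing avoids zero scores when higher-order n-grams have no overlap.
smoother = SmoothingFunction().method1
score = sentence_bleu([reference], candidate, smoothing_function=smoother)
print(f"BLEU: {score:.3f}")
```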

Posted 1 month ago

Apply

3 - 6 years

20 - 25 Lacs

Bengaluru

Work from Office


Job Title: Data Science Consultant, S&C GN
Management Level: Consultant
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI/ML, SQL, Python, Azure/AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Job Summary: We are seeking a highly skilled and motivated Data Science Consultant to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities
1. Data Science and Engineering: Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks. Develop predictive models, recommendation systems, and optimization solutions tailored to business needs. Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data. Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting: Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes. Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise: Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects. Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise: Lead the development of Generative AI-based applications and solutions leveraging frameworks such as LangChain and LlamaIndex. Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications. Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications
- Experience: 4+ years in data science
- Education: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
- Industry knowledge: Preferred experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Technical Skills
- Programming: Proficiency in Python, SQL, and PySpark
- GenAI expertise: Hands-on experience building GenAI-based applications and solutions; experience deploying GenAI applications in production
- Cloud platforms: Experience with Azure/AWS/GCP
- Visualization tools: Power BI/Tableau

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset
- Good communication and stakeholder management capabilities
- Strong at generating business insights and presenting them to stakeholders

About Our Company | Accenture

Qualifications
Experience: 4-8 years in data science
Educational Qualification: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field

Posted 1 month ago

Apply

2 - 5 years

17 - 20 Lacs

Gurugram

Work from Office


Job Title: Data Science Consultant, S&C GN
Management Level: Consultant
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI/ML, SQL, Python, Azure/AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Job Summary: We are seeking a highly skilled and motivated Data Science Consultant to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities
1. Data Science and Engineering: Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks. Develop predictive models, recommendation systems, and optimization solutions tailored to business needs. Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data. Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting: Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes. Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise: Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects. Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise: Lead the development of Generative AI-based applications and solutions leveraging frameworks such as LangChain and LlamaIndex. Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications. Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications
- Experience: 4+ years in data science
- Education: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
- Industry knowledge: Preferred experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Technical Skills
- Programming: Proficiency in Python, SQL, and PySpark
- GenAI expertise: Hands-on experience building GenAI-based applications and solutions; experience deploying GenAI applications in production
- Cloud platforms: Experience with Azure/AWS/GCP
- Visualization tools: Power BI/Tableau

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset
- Good communication and stakeholder management capabilities
- Strong at generating business insights and presenting them to stakeholders

About Our Company | Accenture

Qualifications
Experience: 4-8 years in data science
Educational Qualification: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field

Posted 1 month ago

Apply

4 - 8 years

20 - 25 Lacs

Gurugram

Work from Office


Job Title: Data Science Consultant, S&C GN
Management Level: Consultant
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI/ML, SQL, Python, Azure/AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Job Summary: We are seeking a highly skilled and motivated Data Science Consultant to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities
1. Data Science and Engineering: Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks. Develop predictive models, recommendation systems, and optimization solutions tailored to business needs. Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data. Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting: Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes. Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise: Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects. Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise: Lead the development of Generative AI-based applications and solutions leveraging frameworks such as LangChain and LlamaIndex. Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications. Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications
- Experience: 4+ years in data science
- Education: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field
- Industry knowledge: Preferred experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Technical Skills
- Programming: Proficiency in Python, SQL, and PySpark
- GenAI expertise: Hands-on experience building GenAI-based applications and solutions; experience deploying GenAI applications in production
- Cloud platforms: Experience with Azure/AWS/GCP
- Visualization tools: Power BI/Tableau

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset
- Good communication and stakeholder management capabilities
- Strong at generating business insights and presenting them to stakeholders

Qualifications
Experience: 4-8 years in data science
Educational Qualification: Bachelor's/Master's degree in computer science, statistics, applied mathematics, or a related field

Posted 1 month ago

Apply

5 - 7 years

9 - 13 Lacs

Chennai

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the organization.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the organization.

Professional & Technical Skills:
- Must have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
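To illustrate the "data munging" skills the posting lists (cleaning, transformation, normalization), here is a small PySpark sketch on a toy DataFrame; the column names and values are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("munging_sketch").getOrCreate()

# Hypothetical customer records with messy values.
df = spark.createDataFrame(
    [(" Alice ", "42", None), ("BOB", "abc", 181.0), ("carol", "35", 165.5)],
    ["name", "age", "height_cm"],
)

cleaned = (
    df
    .withColumn("name", F.initcap(F.trim("name")))   # normalise casing and whitespace
    .withColumn("age", F.col("age").cast("int"))      # non-numeric ages become NULL
    .na.fill({"height_cm": 0.0})                      # simple imputation for the demo
)

# Min-max normalisation of a numeric column into [0, 1].
stats = cleaned.agg(F.min("height_cm").alias("lo"), F.max("height_cm").alias("hi")).first()
normalised = cleaned.withColumn(
    "height_norm",
    (F.col("height_cm") - F.lit(stats["lo"])) / F.lit(stats["hi"] - stats["lo"]),
)
normalised.show()
```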

Posted 1 month ago

Apply

7 - 11 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and effective applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, designing application architecture, coding and testing applications, and ensuring their successful deployment and maintenance.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications based on business process and application requirements.
- Analyze business requirements and translate them into technical specifications.
- Collaborate with cross-functional teams to ensure the successful implementation of applications.
- Code and test applications to ensure their functionality and performance.
- Ensure the efficient deployment and maintenance of applications.

Professional & Technical Skills:
- Must have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
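Illustrative only: a minimal Spark ML sketch of one of the algorithms this posting lists (k-means clustering) on a toy dataset; the feature columns and values are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("clustering_sketch").getOrCreate()

# Toy dataset of customers described by two hypothetical features.
df = spark.createDataFrame(
    [(1, 12.0, 3.1), (2, 11.5, 2.9), (3, 40.2, 10.5), (4, 39.8, 11.0)],
    ["customer_id", "monthly_spend", "visits_per_month"],
)

# Spark ML estimators expect a single vector column of features.
assembler = VectorAssembler(
    inputCols=["monthly_spend", "visits_per_month"], outputCol="features"
)
features = assembler.transform(df)

# Fit k-means with two clusters and attach the cluster id to each row.
model = KMeans(k=2, seed=42, featuresCol="features", predictionCol="cluster").fit(features)
model.transform(features).select("customer_id", "cluster").show()
```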

Posted 1 month ago

Apply

2 - 6 years

4 - 6 Lacs

Hyderabad

Work from Office


Job Title: S&C GN AI Managed Services Data Operations Analyst
Management Level: 11 - Analyst
Location: Hyderabad
Must-have skills: Python, PySpark, Databricks, AI/ML
Good-to-have skills: Cloud platforms such as AWS, Azure, or Google Cloud for deploying and scaling language models; exposure to Retail, Banking, or Healthcare projects

Job Summary: An opportunity to work on high-visibility projects with top clients around the globe. Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.

Roles & Responsibilities:
- As a Data Operations Analyst, ensure the business is fully supported in using business-critical, AI-enabled applications. This involves solving day-to-day application issues and business queries and addressing ad hoc data requests so that clients can extract maximum value from the AI applications.
- Debug issues related to data loads, batch pipelines, and application functionality, including special handling of data and batch streams.
- Monitor and maintain pre-processing pipelines and model execution batches, and validate model outputs. In case of deviations or model degradation, carry out detailed root cause analysis and implement permanent fixes.
- Perform initial triage of code-related defects and issues, provide root cause analysis, and implement code fixes for permanent resolution.
- Design, build, test, and deploy small to medium enhancements that deliver value to the business and improve application availability and usability.
- Perform sanity testing of use cases as part of pre-deployment and post-production activities.
- Take primary responsibility for application availability and stability by remediating application issues, bugs, and other vulnerabilities.
- Data Operations Analysts evolve into subject matter experts as they mature in servicing the applications.

Professional & Technical Skills:
- Bachelor's or Master's degree in any engineering stream, or MCA. Experience or education in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, or Information Systems is preferable.
- Proven experience (2+ years) in work matching the above description is required.
- Exposure to Retail, Banking, or Healthcare projects is an added advantage.
- Proficiency and hands-on experience in data engineering technologies such as Python, R, SQL, Spark, PySpark, Databricks, and Hadoop.
- Ability to work with large data sets and present findings and insights to key stakeholders; data management using databases such as SQL.
- Experience with any of the cloud platforms (AWS, Azure, or Google Cloud) for deploying and scaling language models.
- Experience with data visualization tools such as Tableau, QlikView, or Spotfire is good to have; knowledge of Power BI and PowerApps is an added advantage.
- Excellent analytical and problem-solving skills, with a data-driven mindset.
- Proficient in Excel, MS Word, PowerPoint, etc.
- Ability to solve complex business problems and deliver client delight.
- Strong writing skills to build points of view on current industry trends.
- Good client-handling skills; able to demonstrate thought leadership and problem-solving skills.

Qualifications
Experience: Minimum 2+ years of experience is required
Educational Qualification: B.Tech/BE or MCA
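A tiny, illustrative PySpark check of the kind a Data Operations Analyst might run when validating a batch load; the table name, column, and thresholds are hypothetical, and a real pipeline would alert or log rather than raise.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch_validation_sketch").getOrCreate()

# Hypothetical curated table produced by the nightly batch.
batch = spark.table("analytics.daily_scores")

row_count = batch.count()
null_rate = (
    batch.filter(F.col("score").isNull()).count() / row_count if row_count else 1.0
)

# Simple sanity thresholds chosen for the sketch.
if row_count < 1_000:
    raise ValueError(f"Row count too low: {row_count}")
if null_rate > 0.05:
    raise ValueError(f"Null rate too high: {null_rate:.2%}")

print(f"Batch OK: {row_count} rows, {null_rate:.2%} null scores")
```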

Posted 1 month ago

Apply

5 - 9 years

15 - 19 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office


Entity: Accenture Strategy & Consulting
Team: Global Network Data & AI
Practice: Insurance Analytics
Title: Ind & Func AI Decision Science Consultant
Job location: Bangalore/Gurgaon/Mumbai/Hyderabad/Pune/Chennai

About S&C - Global Network: The Accenture Global Network Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities, from accessing and reporting on data to predictive modelling, to outperform the competition. The S&C GN Insurance Data & AI practice works with Property & Casualty insurers, Life & Retirement insurers, reinsurers, and brokers across the value chain, from Underwriting to Claims to Servicing and Enterprise Functions, to develop analytic capabilities - from accessing and reporting on data to predictive modelling to Generative AI - that outperform the competition. We offer deep technical expertise in AI/ML tools, techniques, and methods, along with strong strategy and consulting acumen and insurance domain knowledge. Our unique assets and accelerators, coupled with diverse insurance insights and capabilities, help us bring exceptional value to our clients.

WHAT'S IN IT FOR YOU? Accenture Global Network is a unified powerhouse that combines the capabilities of Strategy & Consulting with the force multipliers of Data and Artificial Intelligence. It is central to Accenture's future growth, and Accenture is deeply invested in providing individuals with continuous opportunities for learning and growth.

What you would do in this role:
- Design, create, validate, and refine prompts for Large Language Models (LLMs) for different client problems.
- Employ techniques to guide and enhance model responses.
- Develop effective AI interactions through proficient programming and use of playgrounds.
- Utilize AI tools and cloud AI services with a proper cloud or on-prem application pipeline at production-ready quality.
- Interface with clients and account teams to understand engineering and business problems and translate them into analytics problems that deliver insights for action and operational improvements.
- Consume data from multiple sources and present relevant information in a crisp and digestible manner that delivers valuable insights to both technical and non-technical audiences.
- Mentor junior prompt engineers in both the technical and softer aspects of the role.

Qualifications - who are we looking for?
- 5+ years of experience in data-driven techniques, including exploratory data analysis, data pre-processing, and machine learning to solve business problems.
- Bachelor's/Master's degree in Mathematics, Statistics, Economics, Computer Science, or a related field.
- Solid foundation in statistical modeling, machine learning algorithms, GenAI, LLMs, RAG architecture, and LangChain frameworks.
- Proficiency in programming languages such as Python, PySpark, SQL, or Scala.
- Strong communication and presentation skills to effectively convey complex data insights and recommendations to clients and stakeholders.
- In-depth knowledge and hands-on experience with Azure, AWS, or Databricks tools; relevant Azure certifications are highly desirable.
- Prior insurance industry experience is preferred.
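For illustration only, a minimal retrieval-augmented prompting sketch of the kind of prompt-construction work described above, using TF-IDF retrieval over a toy document set. No specific LLM SDK is assumed; the final model call is left as a placeholder, and the documents and question are made up.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy knowledge base of policy snippets (hypothetical content).
documents = [
    "Comprehensive motor cover includes accidental damage and theft.",
    "Third-party liability covers damage caused to other vehicles and property.",
    "Claims must be reported within 48 hours of the incident.",
]

question = "How quickly do I need to report a claim?"

# Retrieve the most similar snippet to ground the prompt.
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)
scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
best_context = documents[scores.argmax()]

# Assemble a grounded prompt; the actual LLM call depends on the chosen provider/SDK.
prompt = (
    "Answer the question using only the context below.\n"
    f"Context: {best_context}\n"
    f"Question: {question}\n"
    "Answer:"
)
print(prompt)
```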

Posted 1 month ago

Apply

7 - 9 years

19 - 25 Lacs

Bengaluru

Work from Office


About The Role
Job Title: Industry & Function AI Decision Science Manager, S&C GN
Management Level: 07 - Manager
Location: Primary Bengaluru, secondary Gurugram
Must-have skills: Consumer Goods & Services domain expertise; AI & ML; proficiency in Python, R, PySpark, SQL; experience with cloud platforms (Azure, AWS, GCP); expertise in Revenue Growth Management, Pricing Analytics, Promotion Analytics, PPA/Portfolio Optimization, and Trade Investment Optimization
Good-to-have skills: Experience with Large Language Models (LLMs) such as ChatGPT, Llama 2, or Claude 2; familiarity with optimization methods, advanced visualization tools (Power BI, Tableau), and time series forecasting

Job Summary: As a Decision Science Manager, you will lead the design and delivery of AI solutions in the Consumer Goods & Services domain. This role involves working closely with clients to provide advanced analytics and AI-driven strategies that deliver measurable business outcomes. Your expertise in analytics, problem-solving, and team leadership will help drive innovation and value for the organization.

Roles & Responsibilities:
- Analyze extensive datasets and derive actionable insights from Consumer Goods data sources (e.g., Nielsen, IRI, EPOS, TPM).
- Evaluate AI and analytics maturity in the Consumer Goods sector and develop data-driven solutions.
- Design and implement AI-based strategies that deliver significant client benefits.
- Employ structured problem-solving methodologies to address complex business challenges.
- Lead data science initiatives, mentor team members, and contribute to thought leadership.
- Foster strong client relationships and act as a key liaison for project delivery.
- Build and deploy advanced analytics solutions using Accenture's platforms and tools.
- Apply technical proficiency in Python, PySpark, R, SQL, and cloud technologies for solution deployment.
- Develop compelling data-driven narratives for stakeholder engagement.
- Collaborate with internal teams to innovate, drive sales, and build new capabilities.
- Drive insights in critical Consumer Goods domains such as Revenue Growth Management, pricing analytics and optimization, promotion analytics and optimization, SKU rationalization/portfolio optimization, price pack architecture, decomposition models, and time series forecasting.

Professional & Technical Skills:
- Proficiency in AI and analytics solutions (descriptive, diagnostic, predictive, prescriptive, generative).
- Expertise in delivering large-scale projects/programs for Consumer Goods clients on Revenue Growth Management: pricing analytics, promotion analytics, portfolio optimization, etc.
- Deep and clear understanding of the data sources typically used in RGM programs: POS, syndicated, shipment, finance, promotion calendar, etc.
- Strong programming skills in Python, R, PySpark, and SQL; experience with cloud platforms (Azure, AWS, GCP) and proficiency with services such as Databricks and SageMaker.
- Deep knowledge of traditional and advanced machine learning techniques, including deep learning.
- Experience with optimization techniques (linear, nonlinear, evolutionary methods).
- Familiarity with visualization tools such as Power BI and Tableau.
- Experience with Large Language Models (LLMs) such as ChatGPT and Llama 2.
- Certifications in Data Science or related fields.

Additional Information: The ideal candidate has a strong educational background in data science and a proven track record of delivering impactful AI solutions in the Consumer Goods sector. This position offers opportunities to lead innovative projects and collaborate with global teams. Join Accenture to leverage cutting-edge technologies and deliver transformative business outcomes.

Qualifications
Experience: Minimum 7-9 years of experience in data science, particularly in the Consumer Goods sector
Educational Qualification: Bachelor's or Master's degree in Statistics, Economics, Mathematics, Computer Science, or an MBA (Data Science specialization preferred)
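A small, illustrative example of the pricing-analytics work mentioned above: a log-log regression of units sold on price, whose slope approximates price elasticity. All numbers are made up.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical weekly observations: shelf price and units sold for one SKU.
price = np.array([2.0, 2.2, 2.5, 2.8, 3.0, 3.2])
units = np.array([1200, 1100, 950, 820, 760, 700])

# In a log-log model, log(units) = a + b * log(price); the coefficient b approximates elasticity.
X = np.log(price).reshape(-1, 1)
y = np.log(units)

model = LinearRegression().fit(X, y)
elasticity = model.coef_[0]
print(f"Estimated price elasticity: {elasticity:.2f}")  # expect a negative value
```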

Posted 1 month ago

Apply

3 - 8 years

9 - 13 Lacs

Ahmedabad

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Perform independently and grow into an SME.
- Participate actively in and contribute to team discussions.
- Contribute to solutions for work-related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing.
- Implement data security and privacy measures to protect sensitive information.
- Optimize data storage and retrieval processes for improved performance.
- Conduct regular data platform performance monitoring and troubleshooting.

Professional & Technical Skills:
- Must have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data platforms.
- Experience with data modeling and database design.
- Hands-on experience with ETL processes and tools.
- Knowledge of data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Ahmedabad office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Entity: Accenture Strategy & Consulting
Team: Global Network Data & AI
Practice: S&C GN AI - Insurance
Title: Decision Science Specialist
Job location: Anywhere in India

About S&C - Global Network: The Accenture Global Network Insurance Data & AI practice helps our clients grow their business in entirely new ways. From strategy to execution, Accenture works with Property & Casualty insurers, Life & Retirement insurers, reinsurers, and brokers across the value chain, from Underwriting to Claims to Servicing and Enterprise Functions, to develop analytic capabilities - from accessing and reporting on data to predictive modelling to Generative AI - that outperform the competition. We offer deep technical expertise in AI/ML tools, techniques, and methods, along with strong strategy and consulting acumen and insurance domain knowledge. Our unique assets and accelerators, coupled with diverse insurance insights and capabilities, help us bring exceptional value to our clients.

WHAT'S IN IT FOR YOU? Accenture Global Network is a unified powerhouse that combines the capabilities of Strategy & Consulting with the force multipliers of Data and Artificial Intelligence. It is central to Accenture's future growth, and Accenture is deeply invested in providing individuals with continuous opportunities for learning and growth. You'll be part of a diverse, vibrant, global community, continually pushing the boundaries of business capabilities. Accenture is also ranked 10th* on the 2023 World's Best Workplaces list, making it a great place to work.

What you would do in this role:
- Help the team architect, design, build, deploy, deliver, and monitor advanced analytics models, including GenAI, for different client problems.
- Develop functional aspects of Generative AI pipelines, such as information retrieval, storage techniques, and system optimizations across services for a chosen cloud platform such as Azure or AWS.
- Interface with clients and account teams to understand engineering and business problems and translate them into analytics problems that deliver insights for action and operational improvements.
- Consume data from multiple sources and present relevant information in a crisp and digestible manner that delivers valuable insights to both technical and non-technical audiences.

Qualifications - who are we looking for?
- 5+ years of experience in data-driven techniques, including exploratory data analysis, data pre-processing, machine learning, and visualization to solve business problems.
- Bachelor's/Master's degree in Mathematics, Statistics, Economics, Computer Science, or a related field.
- Solid foundation in statistical modeling and machine learning algorithms.
- Advanced proficiency in programming languages such as Python, PySpark, SQL, and Scala.
- Experience implementing AI solutions for the insurance industry.
- Experience in production-grade integration of AI/ML pipelines, either upstream (data modeling, engineering, management) or downstream (MLOps in cloud or on-prem, UI integration with APIs, visualization in Tableau/Power BI), is required; full-stack implementation experience is preferred.
- Strong communication, collaboration, and presentation skills to effectively convey complex data insights and recommendations to clients and stakeholders.
- Hands-on experience with Azure, AWS, or Databricks tools is a plus.
- Familiarity with GenAI, LLMs, RAG architecture, and LangChain frameworks is a plus.

Posted 1 month ago

Apply

2 - 4 years

8 - 12 Lacs

Gurugram

Work from Office


Entity: Accenture Strategy & Consulting
Team: Global Network Data & AI
Practice: S&C GN AI - Insurance
Title: Decision Science Specialist
Job location: Anywhere in India

About S&C - Global Network: The Accenture Global Network Insurance Data & AI practice helps our clients grow their business in entirely new ways. From strategy to execution, Accenture works with Property & Casualty insurers, Life & Retirement insurers, reinsurers, and brokers across the value chain, from Underwriting to Claims to Servicing and Enterprise Functions, to develop analytic capabilities - from accessing and reporting on data to predictive modelling to Generative AI - that outperform the competition. We offer deep technical expertise in AI/ML tools, techniques, and methods, along with strong strategy and consulting acumen and insurance domain knowledge. Our unique assets and accelerators, coupled with diverse insurance insights and capabilities, help us bring exceptional value to our clients.

WHAT'S IN IT FOR YOU? Accenture Global Network is a unified powerhouse that combines the capabilities of Strategy & Consulting with the force multipliers of Data and Artificial Intelligence. It is central to Accenture's future growth, and Accenture is deeply invested in providing individuals with continuous opportunities for learning and growth. You'll be part of a diverse, vibrant, global community, continually pushing the boundaries of business capabilities. Accenture is also ranked 10th* on the 2023 World's Best Workplaces list, making it a great place to work.

What you would do in this role:
- Help the team architect, design, build, deploy, deliver, and monitor advanced analytics models, including GenAI, for different client problems.
- Develop functional aspects of Generative AI pipelines, such as information retrieval, storage techniques, and system optimizations across services for a chosen cloud platform such as Azure or AWS.
- Interface with clients and account teams to understand engineering and business problems and translate them into analytics problems that deliver insights for action and operational improvements.
- Consume data from multiple sources and present relevant information in a crisp and digestible manner that delivers valuable insights to both technical and non-technical audiences.

Qualifications - who are we looking for?
- 2-4 years of experience in data-driven techniques, including exploratory data analysis, data pre-processing, and machine learning to solve business problems.
- Bachelor's/Master's degree in Mathematics, Statistics, Economics, Computer Science, or a related field.
- Solid foundation in statistical modeling, machine learning algorithms, GenAI, LLMs, RAG architecture, and LangChain frameworks.
- Proficiency in programming languages such as Python, PySpark, SQL, or Scala.
- Strong communication and presentation skills to effectively convey complex data insights and recommendations to clients and stakeholders, as well as to engineer high-quality prompts.
- In-depth knowledge and hands-on experience with Azure, AWS, or Databricks tools; relevant certifications in Azure/AWS are highly desirable.
- Prior insurance industry experience is preferred.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Chennai

Work from Office


Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Cloud Data Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: BE or MCA

Summary: As a Technology Architect, you will be responsible for reviewing and integrating all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve reviewing and integrating technical architecture requirements and providing input into final decisions regarding hardware, network products, system software, and security.

Roles & Responsibilities:
- A minimum of 8 years of experience with the Databricks Unified Data Analytics Platform.
- Good experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform.
- Strong educational background in technology and information architectures, along with a proven track record of delivering impactful data-driven solutions.
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Client-facing role: running solution workshops and client visits, handling large RFP pursuits, and managing multiple stakeholders.

Technical Experience:
- 6 or more years of experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform.
- 2 or more years of experience using Python, PySpark, or Scala.
- Experience with Databricks on cloud, on any of AWS, Azure, or GCP; ETL, data engineering, data cleansing, and loading into a data warehouse.
- Must-have skills: Databricks, Cloud Data Architecture, Python programming, Data Engineering.

Professional Attributes:
- Excellent writing, communication, and presentation skills.
- Eagerness to learn and develop oneself on an ongoing basis.
- Excellent client-facing and interpersonal skills.

Qualifications: BE or MCA

Posted 1 month ago

Apply

6 - 10 years

15 - 20 Lacs

Gurugram, Delhi / NCR

Work from Office


Role & Responsibilities

1. Pipeline Development and Support
- Design, build, and optimize scalable ETL pipelines on Databricks using PySpark, SQL, and Delta Lake.
- Work with structured and semi-structured insurance data (policy, claims, actuarial, risk, customer data) from multiple sources.
- Implement data quality checks, governance, and monitoring across pipelines.
- Collaborate with data scientists, actuaries, and business stakeholders to translate analytics requirements into data models.
- Develop and deliver compelling visualizations and dashboards using Databricks SQL, Power BI, Tableau, or similar tools.
- Monitor and troubleshoot pipeline issues, ensuring data integrity and resolving bottlenecks or failures.
- Optimize Databricks clusters for performance and cost efficiency.
- Support ML model deployment pipelines in collaboration with data science teams.
- Document pipelines, workflows, and architecture following best practices.

2. SQL
- Write complex SQL queries to extract, transform, and load (ETL) data for reporting, analytics, or downstream applications.
- Optimize SQL queries for performance, especially when working with large datasets in Snowflake or other relational databases.
- Create and maintain database schemas, tables, views, and stored procedures to support business requirements.

3. Data Integration
- Integrate data from diverse sources (e.g., on-premises databases, cloud storage such as S3 or Azure Blob, or third-party APIs) into a unified system.
- Ensure data consistency, quality, and availability by implementing data validation and cleansing processes.

4. Good-to-have skills
- Insurance domain experience; candidates with P&C or L&A experience will be preferred.
- Team player with strong communication skills.
- Experience with MLflow, feature stores, and model monitoring.
- Hands-on experience with data governance tools (e.g., Unity Catalog, Collibra).
- Familiarity with regulatory and compliance requirements for insurance data.

Skills Typically Required
- 5+ years of experience in data engineering, with at least 2+ years hands-on with Databricks and Spark.
- Strong proficiency in PySpark, SQL, Delta Lake, and data modeling.
- Solid understanding of cloud platforms (Azure, AWS, or GCP) and data lake architectures.
- Experience integrating Databricks with BI tools (Power BI, Tableau, Looker) for business-facing dashboards.
- Knowledge of insurance data (L&A, P&C) and industry metrics is highly preferred.
- Familiarity with DevOps tools (Git, CI/CD pipelines) and orchestration tools (Airflow, Databricks Jobs).
- Strong communication skills to explain technical concepts to business stakeholders.
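Illustrative only: a minimal PySpark + Delta Lake sketch of the kind of incremental claims load described above, assuming it runs on Databricks (or any environment with Delta Lake installed); the paths, table layout, and the claim_id key are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

# On Databricks a configured session already exists as `spark`.
spark = SparkSession.builder.appName("claims_delta_sketch").getOrCreate()

# Read today's raw claims extract (hypothetical landing path).
updates = (
    spark.read.json("/mnt/landing/claims/2024-06-01/")
    .withColumn("ingested_at", F.current_timestamp())
)

target_path = "/mnt/curated/claims"

if DeltaTable.isDeltaTable(spark, target_path):
    # Incremental upsert: update existing claims, insert new ones.
    target = DeltaTable.forPath(spark, target_path)
    (
        target.alias("t")
        .merge(updates.alias("u"), "t.claim_id = u.claim_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    # First load: create the Delta table.
    updates.write.format("delta").mode("overwrite").save(target_path)
```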

Posted 1 month ago

Apply

7 - 12 years

9 - 14 Lacs

Kolkata

Work from Office


Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines and ETL processes using Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls for the data platform.
- Troubleshoot and resolve issues related to the data platform and data pipelines.

Professional & Technical Skills:
- Must-have skills: experience with Databricks Unified Data Analytics Platform; strong understanding of data modeling and database design principles.
- Good-to-have skills: experience with cloud-based data platforms such as AWS or Azure; experience with data security and access controls; experience with data visualization tools such as Tableau or Power BI.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai and Pune offices.
- Mandatory office attendance (RTO) for 2-3 days, working on 2 shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Qualifications: Engineering graduate, preferably Computer Science; 15 years of full-time education
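
As a rough illustration of the access-control work mentioned above, a few Databricks SQL statements issued from a notebook; the group, schema, and table names are placeholders, and available privilege names vary between Unity Catalog and legacy table ACLs.

# Illustrative access-control statements from a Databricks notebook (`spark` is the
# session the notebook provides). Group, schema, and table names are placeholders.
spark.sql("GRANT SELECT ON TABLE silver.customer_profile TO `reporting_analysts`")
spark.sql("GRANT MODIFY ON TABLE silver.customer_profile TO `data_engineers`")
spark.sql("REVOKE SELECT ON TABLE silver.customer_profile FROM `contractors`")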

Posted 1 month ago

Apply

7 - 12 years

9 - 14 Lacs

Pune

Work from Office


Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : BTech or equivalent; minimum 15 years of education

Summary: As an AI Advisor, you will be responsible for driving business outcomes for clients through analytics using Databricks Unified Data Analytics Platform. Your typical day will involve supporting delivery leads, account management, and operational excellence teams to deliver client value through analytics and industry best practices.

Roles & Responsibilities:
- Lead the development and deployment of advanced analytics solutions using Databricks Unified Data Analytics Platform.
- Conduct detailed analysis of complex data sets, employing statistical methodologies and data munging techniques for actionable insights.
- Collaborate with cross-functional teams, applying expertise in diverse analytics techniques, including implementing algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Communicate technical findings effectively to stakeholders, utilizing data visualization tools for clarity.
- Stay updated with the latest advancements in analytics and data science, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: experience with Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools.
- Experience implementing analytics techniques such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in statistics, mathematics, computer science, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualifications: BTech or equivalent; minimum 15 years of education
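
As an example of the analytics techniques listed above, a minimal Spark MLlib sketch of a logistic regression model; the table, feature columns, and label are hypothetical.

# Illustrative MLlib sketch: logistic regression on a hypothetical churn dataset.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()

df = spark.table("analytics.customer_churn")            # assumed table with a 0/1 `churned` label
assembler = VectorAssembler(inputCols=["tenure", "monthly_spend", "num_claims"],
                            outputCol="features")
train, test = assembler.transform(df).randomSplit([0.8, 0.2], seed=42)

model = LogisticRegression(featuresCol="features", labelCol="churned").fit(train)
print(model.evaluate(test).areaUnderROC)                 # quick sanity check on held-out data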

Posted 1 month ago

Apply

12 - 17 years

14 - 19 Lacs

Bengaluru

Work from Office


Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : Python (Programming Language), Microsoft Azure Databricks, Microsoft Azure Data Services
Minimum 12 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing innovative solutions to meet customer needs. You will also be involved in testing, debugging, and troubleshooting applications to ensure their smooth functioning and optimal performance.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Develop and maintain high-quality software applications.
- Collaborate with business analysts and stakeholders to gather and analyze requirements.
- Design and implement application features and enhancements.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug application issues.
- Optimize application performance and scalability.
- Conduct unit testing and integration testing (see the test sketch below).
- Document application design, functionality, and processes.
- Stay updated with emerging technologies and industry trends.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform, Python (Programming Language), Microsoft Azure Databricks, Microsoft Azure Data Services.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
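
A minimal pytest sketch of the kind of unit test referenced in the responsibilities above, built around a toy PySpark transformation; the function and columns are invented for the example.

# Illustrative unit test for a small PySpark transformation, runnable locally with pytest.
import pytest
from pyspark.sql import SparkSession, functions as F

def add_full_name(df):
    # Toy transformation under test: concatenate first and last names.
    return df.withColumn("full_name", F.concat_ws(" ", "first_name", "last_name"))

@pytest.fixture(scope="module")
def spark():
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()

def test_add_full_name(spark):
    df = spark.createDataFrame([("Ada", "Lovelace")], ["first_name", "last_name"])
    assert add_full_name(df).select("full_name").first()[0] == "Ada Lovelace"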

Posted 1 month ago

Apply

5 - 10 years

16 - 27 Lacs

Pune, Chennai, Bengaluru

Hybrid


If interested, please share the below details to PriyaM4@hexaware.com: Total Exp, CTC, ECTC, NP, Loc.

MUST-have skill: Unity Catalog

We are looking for a skilled Sr Data Engineer with expertise in Databricks and Unity Catalog to design, implement, and manage scalable data solutions.

Key Responsibilities:
- Design and implement scalable data pipelines and ETL workflows using Databricks.
- Implement Unity Catalog for data governance, access control, and metadata management across multiple workspaces.
- Develop Delta Lake architectures for optimized data storage and retrieval.
- Establish best practices for data security, compliance, and lineage tracking in Unity Catalog.
- Optimize data lakehouse architecture for performance and cost efficiency.
- Collaborate with data scientists, engineers, and business teams to support analytical workloads.
- Monitor and troubleshoot Databricks clusters, including performance tuning and cost management.
- Implement data quality frameworks and observability solutions to maintain high data integrity.
- Work with Azure/AWS/GCP cloud environments to deploy and manage data solutions.

Required Skills & Qualifications:
- 8-19 years of experience in data engineering, data architecture, or cloud data solutions.
- Strong hands-on experience with Databricks and Unity Catalog.
- Expertise in PySpark, Scala, or SQL for data processing.
- Deep understanding of Delta Lake, Lakehouse architecture, and data partitioning strategies.
- Experience with RBAC, ABAC, and access control mechanisms within Unity Catalog.
- Knowledge of data governance, compliance standards (GDPR, HIPAA, etc.), and audit logging.
- Familiarity with cloud platforms (Azure, AWS, or GCP) and their respective data services.
- Strong understanding of CI/CD pipelines, DevOps, and Infrastructure as Code (IaC).
- Experience integrating BI tools (Tableau, Power BI, Looker) and ML frameworks is a plus.
- Excellent problem-solving, communication, and collaboration skills.
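
For illustration of the Unity Catalog governance work described above, a short sketch of catalog setup and grants run from a Databricks notebook (where a `spark` session is provided); the catalog, schema, and group names are placeholders.

# Illustrative Unity Catalog setup from a Databricks notebook.
# Catalog, schema, and group names are placeholders.
spark.sql("CREATE CATALOG IF NOT EXISTS insurance")
spark.sql("CREATE SCHEMA IF NOT EXISTS insurance.curated")
# Grant an analyst group read-only access through the three-level namespace.
spark.sql("GRANT USE CATALOG ON CATALOG insurance TO `analysts`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA insurance.curated TO `analysts`")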

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office


Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : Google BigQuery
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with Integration Architects and Data Architects to design and implement data platform components.
- Ensure seamless integration between various systems and data models.
- Develop and maintain data platform blueprints.
- Implement data governance policies and procedures.
- Conduct performance tuning and optimization of data platform components.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with Google BigQuery.
- Strong understanding of data platform architecture and design principles.
- Hands-on experience in implementing data pipelines and ETL processes.
- Proficient in SQL and other query languages.
- Knowledge of cloud platforms such as AWS or Azure.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
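
As a small example of the performance-tuning responsibilities above, illustrative Delta Lake maintenance commands run from a Databricks notebook; the table and column names are assumptions.

# Illustrative Delta Lake maintenance from a Databricks notebook (`spark` is the
# session the notebook provides). Table and column names are placeholders.
# Compact small files and co-locate rows on a commonly filtered column.
spark.sql("OPTIMIZE silver.events ZORDER BY (customer_id)")
# Remove data files no longer referenced by the table (168 hours = the default 7-day retention).
spark.sql("VACUUM silver.events RETAIN 168 HOURS")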

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Pune

Work from Office


Project Role : Software Development Engineer
Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills : PySpark
Good to have skills : No Industry Specialization
Minimum 5 year(s) of experience is required
Educational Qualification : BE

Job Requirements:

Key Responsibilities:
- Overall 8 years of experience working on Data Analytics projects.
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- ETL optimization, designing, coding, and tuning big data processes using Apache Spark.
- Build data pipeline applications to stream and process datasets at low latencies.
- Show efficiency in handling data: tracking data lineage, ensuring data quality, and improving discoverability of data.

Technical Experience:
- Minimum of 2 years of experience in Databricks engineering solutions on any of the cloud platforms using PySpark.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop and data warehouse architecture delivery.
- Minimum of 3 years of experience in one or more programming languages: Python, Java, Scala.
- Experience using Airflow for data pipelines in at least 1 project (see the DAG sketch below).
- 2 years of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, Shell Scripting, Terraform.
- Must be able to understand ETL technologies and translate them into cloud-native (AWS, Azure, Google Cloud) tools or PySpark.

Professional Attributes:
1. Should have been involved in data engineering projects from the requirements phase to delivery.
2. Good communication skills to interact with the client and understand requirements.
3. Should be able to work independently and guide the team.

Qualifications: BE
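
To illustrate the Airflow-based orchestration mentioned above, a minimal Airflow 2.x DAG sketch that triggers a Databricks notebook run; the connection id, cluster settings, and notebook path are placeholders.

# Illustrative Airflow DAG (Airflow 2.x with apache-airflow-providers-databricks installed)
# that submits a one-off Databricks notebook run each day.
from datetime import datetime
from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG("daily_ingest", start_date=datetime(2024, 1, 1),
         schedule_interval="@daily", catchup=False) as dag:
    run_ingest = DatabricksSubmitRunOperator(
        task_id="run_ingest_notebook",
        databricks_conn_id="databricks_default",           # assumed Airflow connection
        new_cluster={"spark_version": "13.3.x-scala2.12",  # placeholder job cluster spec
                     "node_type_id": "i3.xlarge",
                     "num_workers": 2},
        notebook_task={"notebook_path": "/Repos/etl/ingest"},
    )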

Posted 1 month ago

Apply

2 - 7 years

4 - 9 Lacs

Bengaluru

Work from Office


Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 2 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the project.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
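
As an illustration of the data munging skills listed above, a short PySpark sketch of cleaning and normalizing a raw feed; table and column names are assumptions.

# Illustrative data-munging sketch: deduplicate, fill, standardize, and normalize a raw feed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.table("bronze.web_events")                   # hypothetical raw table
clean = (raw
         .dropDuplicates(["event_id"])
         .na.fill({"country": "unknown"})
         .withColumn("email", F.lower(F.trim("email")))
         .withColumn("event_ts", F.to_timestamp("event_ts")))

# Min-max normalize a numeric column for downstream modelling (assumes hi > lo)
stats = clean.agg(F.min("session_seconds").alias("lo"),
                  F.max("session_seconds").alias("hi")).first()
clean = clean.withColumn(
    "session_norm",
    (F.col("session_seconds") - F.lit(stats["lo"])) / F.lit(stats["hi"] - stats["lo"]))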

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies