Home
Jobs

935 Databricks Jobs - Page 4

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 12.0 years

20 - 35 Lacs

Chennai

Remote


Who We Are
For 20 years, we have been working with organizations large and small to help solve business challenges through technology. We bring a unique combination of engineering and strategy to make data work for organizations. Our clients range from the travel and leisure industry to publishing, retail, and banking. The common thread between our clients is their commitment to making data work, as seen through their investment in those efforts. In our quest to solve data challenges for our clients, we work with large enterprise, cloud-based, and marketing technology suites. We have a deep understanding of these solutions, so we can help our clients make the most of their investment and run a data-driven business efficiently. Softcrylic has now joined forces with Hexaware to make data work in bigger ways.

Why Work at Softcrylic?
Softcrylic provides an engaging, team-focused, and rewarding work environment where people are excited about the work they do and passionate about delivering creative solutions to our clients.

Work Timing: 12:30 pm to 9:30 pm (flexible).

How to approach the interview: All technical interview rounds will be conducted virtually. The final round will be a face-to-face interview with HR in Chennai and includes a 15-minute in-person technical assessment/discussion, so prepare for both the virtual and in-person components.

Job Description

Key Responsibilities:
1. Data Pipeline Development: Design, develop, and maintain large-scale data pipelines using Databricks, Apache Spark, and AWS.
2. Data Integration: Integrate data from various sources into a unified data platform using dbt and Apache Spark.
3. Graph Database Management: Design, implement, and manage graph databases to support complex data relationships and queries.
4. Data Processing: Develop and optimize data processing workflows using Python, Apache Spark, and Databricks.
5. Data Quality: Ensure data quality, integrity, and security across all data pipelines and systems.
6. Team Management: Lead and manage a team of data engineers, providing guidance, mentorship, and support.
7. Agile Scrum: Work with product owners, product managers, and stakeholders to create product roadmaps, schedule and estimate tasks in sprints, and ensure successful project delivery.

Mandatory Skills:
1. Databricks: Experience with the Databricks platform, including data processing, analytics, and machine learning.
2. AWS: Experience with AWS services, including S3, Glue, and other relevant services.
3. Python: Proficiency in Python for data processing, analysis, and automation.
4. Graph Databases: Experience with graph databases such as Neo4j, Amazon Neptune, or similar.
5. Apache Spark: Experience with Apache Spark for large-scale data processing and analytics.
6. dbt (Data Build Tool): Experience with dbt for data transformation, modeling, and analytics.
7. Agile Scrum: Experience with Agile Scrum methodologies, including sprint planning, task estimation, and backlog management.

Optional Skills:
1. dlt (Data Load Tool): Experience with data load tools for efficiently loading data into target systems.
2. Kubernetes: Experience with Kubernetes for container orchestration and management.
3. Bash Scripting: Proficiency in Bash scripting for automation and task management.
4. Linux: Experience with the Linux operating system, including the command-line interface and system administration.
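For illustration only: a minimal PySpark sketch of the kind of Databricks/AWS batch pipeline this role describes, assuming raw files land in S3 and are published as a Delta table. Bucket paths, table names, and column names are hypothetical placeholders, not part of the original posting.

```python
# Minimal illustrative sketch of a Databricks/Spark batch pipeline on AWS.
# Bucket names, paths, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw CSV files landed in S3 (path is a placeholder).
raw = (spark.read
       .option("header", True)
       .csv("s3://example-raw-bucket/orders/"))

# Basic cleansing and transformation.
clean = (raw
         .dropDuplicates(["order_id"])
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .filter(F.col("amount").isNotNull()))

# Simple data-quality gate before publishing.
if clean.count() == 0:
    raise ValueError("No valid rows after cleansing; aborting load")

# Publish as a Delta table for downstream dbt models and analytics.
(clean.write
 .format("delta")
 .mode("overwrite")
 .save("s3://example-curated-bucket/orders_delta/"))
```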

Posted 2 days ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot software problems, analyzing system performance, and ensuring that applications run smoothly to support business operations effectively. You will engage with users to understand their challenges and work diligently to implement solutions that enhance system functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the development and implementation of application support processes to improve efficiency.
- Provide training and support to junior team members to enhance their skills and knowledge.

Professional & Technical Skills:
- Must-have: Proficiency in EPIC Systems.
- Strong analytical skills to diagnose and resolve software issues.
- Experience with troubleshooting and debugging applications.
- Familiarity with system integration and data flow management.
- Ability to communicate technical information effectively to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 3 years of experience in EPIC Systems.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Must Have:
- Azure Data Factory
- SQL
- Azure Data Lake Storage
- Data Engineering Solutions
- Structured and Unstructured Data Processing
- Data Modelling, Analysis, Design, Development, and Documentation
- Cloud Computing
- CI/CD
- DevOps

Good to Have (minimum of 3):
- Azure DevOps
- PowerApps
- Azure Function App
- Databricks
- PowerBI (advanced knowledge)
- Python
- Airflow DAG
- Infra deployments (Azure, Bicep)
- Networking & security
- Scheduling for orchestration of workflows
- Agile frameworks (Scrum)

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Chennai

Work from Office


Project Role: Application Architect
Project Role Description: Provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Architect, you will provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. You will also assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the design and implementation of application solutions.
- Ensure compliance with architectural standards and guidelines.
- Identify opportunities for process improvement and innovation.
- Mentor junior team members to enhance their skills.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data analytics platforms.
- Experience in designing and implementing scalable data solutions.
- Proficient in data modeling and database design.
- Hands-on experience with data integration and ETL processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 2 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will take part in problem-solving sessions, contributing your insights and expertise to develop effective solutions that align with the organization's overall data strategy. You will be expected to stay updated with the latest trends in data engineering and analytics, keeping the data platform robust and efficient.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay abreast of industry trends and technologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and tools.
- Experience with cloud-based data storage solutions.
- Familiarity with data modeling concepts and best practices.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 2 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must-have skills: Manufacturing Engineering MES
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Lead, you will be responsible for developing and configuring software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure that the software meets the required specifications and quality standards. You will apply your knowledge of technologies and methodologies to support projects effectively, ensuring that all aspects of the software development process are executed smoothly and efficiently. Engaging with stakeholders, you will gather requirements and provide insights that drive the project forward, while also mentoring team members to enhance their skills and performance.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have: Proficiency in Manufacturing Engineering MES.
- Good to have: Experience with software development methodologies such as Agile or Scrum.
- Strong understanding of system integration and data flow within manufacturing environments.
- Experience with programming languages relevant to MES development.
- Familiarity with database management systems and data analytics tools.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Manufacturing Engineering MES.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Coimbatore

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: PySpark, Python (Programming Language), Talend ETL
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for using your expertise in the Databricks Unified Data Analytics Platform to develop efficient and effective solutions. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing applications, and ensuring the applications meet the desired functionality and performance standards.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications based on business process and application requirements.
- Analyze business requirements and translate them into technical specifications.
- Collaborate with cross-functional teams to ensure the successful implementation of applications.
- Perform code reviews and provide guidance to junior developers.
- Stay updated with the latest industry trends and technologies to continuously improve application development processes.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform, Python (Programming Language), Talend ETL, and PySpark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the project.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Coimbatore

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Python (Programming Language), PySpark, Apache Spark
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for using the Databricks Unified Data Analytics Platform to develop efficient and effective applications. Your typical day will involve collaborating with the team, analyzing business requirements, designing application solutions, and configuring applications to meet the needs of the organization. You will also troubleshoot and resolve any application issues that arise, ensuring the smooth functioning of the applications.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design and build applications using the Databricks Unified Data Analytics Platform.
- Configure applications to meet business process and application requirements.
- Analyze business requirements and translate them into application solutions.
- Troubleshoot and resolve any application issues that arise.
- Collaborate with the team to ensure the smooth functioning of applications.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: Experience with Python (Programming Language), PySpark, and Apache Spark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

2.0 - 7.0 years

17 - 20 Lacs

Hyderabad

Work from Office


Job Title: Data Engineering, Management & Governance - Analyst, S&C Global Network
Management Level: 11
Location: Hyderabad
Must-have skills: Proficiency and hands-on experience in data engineering technologies such as Python, R, SQL, Spark, PySpark, Databricks, and Hadoop.
Good-to-have skills: Exposure to Retail, Banking, or Healthcare projects; knowledge of Power BI and PowerApps is an added advantage.

Job Summary: As a Data Operations Analyst, you will be responsible for ensuring the business is fully supported in using business-critical, AI-enabled applications. This involves solving day-to-day application issues and business queries and addressing ad-hoc data requests so that clients can extract maximum value from the AI applications.

What's in it for you?
- An opportunity to work on high-visibility projects with top clients around the globe.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you serve your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities.
- An opportunity to thrive in a culture that is committed to accelerating equality for all.
- Boundaryless collaboration across the entire organization.

Roles & Responsibilities:
- Monitor and maintain pre-processing pipelines, model execution batches, and validation of model outputs. In case of deviations or model degradation, carry out detailed root cause analysis and implement permanent fixes.
- Debug issues related to data loads, batch pipelines, and application functionality, including special handling of data/batch streams.
- Perform initial triage of code-related defects/issues, provide root cause analysis, and implement code fixes for permanent resolution.
- Design, build, test, and deploy small to medium-sized enhancements that deliver value to the business and improve application availability and usability.
- Carry out sanity testing of use cases as part of pre-deployment and post-production activities.
- Take primary responsibility for application availability and stability by remediating application issues, bugs, and other vulnerabilities.
- Data Operations Analysts evolve into subject matter experts as they mature in servicing the applications.

Professional & Technical Skills:
- Proven experience (2+ years) in work matching the job description above.
- Education or experience in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, or Information Systems is preferred.
- Exposure to Retail, Banking, or Healthcare projects is an added advantage.
- Proficiency and hands-on experience in data engineering technologies such as Python, R, SQL, Spark, PySpark, Databricks, and Hadoop.
- Ability to work with large data sets and present findings/insights to key stakeholders; data management using databases such as SQL.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud for deploying and scaling language models.
- Experience with data visualization tools such as Tableau, QlikView, or Spotfire is a plus.
- Knowledge of Power BI and PowerApps is an added advantage.
- Excellent analytical and problem-solving skills, with a data-driven mindset.
- Proficient in Excel, MS Word, PowerPoint, etc.
- Ability to solve complex business problems and deliver client delight.
- Strong writing skills to build points of view on current industry trends.
- Good client-handling skills; able to demonstrate thought leadership and problem-solving skills.

Additional Information:
- The ideal candidate will possess a strong educational background in computer science or a related field.
- This position is based at our Hyderabad office.

About Our Company | Accenture

Qualification
Experience: Minimum 2 years of experience is required.
Educational Qualification: Bachelor's or master's degree in any engineering stream, or MCA.

Posted 2 days ago

Apply

4.0 - 9.0 years

17 - 20 Lacs

Pune

Work from Office


Job Title: Data Science Consultant, S&C Global Network
Management Level: Consultant
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI / ML, SQL, Python, Azure / AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products.

Job Summary: We are seeking a highly skilled and motivated Data Science Consultant to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities:
1. Data Science and Engineering
- Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks.
- Develop predictive models, recommendation systems, and optimization solutions tailored to business needs.
- Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data.
- Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting
- Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes.
- Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise
- Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects.
- Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise
- Lead the development of Generative AI applications and solutions leveraging frameworks such as LangChain and LlamaIndex.
- Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications.
- Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications:
- Experience: 4+ years in data science.
- Education: Bachelor's or master's degree in computer science, statistics, applied mathematics, or a related field.
- Industry knowledge: Experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products preferred.

Technical Skills:
- Programming: Proficiency in Python, SQL, and PySpark.
- GenAI expertise: Hands-on experience building GenAI applications and solutions and deploying GenAI applications in production.
- Cloud platforms: Experience with Azure / AWS / GCP.
- Visualization tools: Power BI / Tableau.

Preferred Skills:
- Strong analytical and problem-solving skills with a results-oriented mindset.
- Good communication and stakeholder management capabilities.
- Strong ability to generate business insights and present them to stakeholders.

About Our Company | Accenture

Qualification
Experience: 4-8 years in data science.
Educational Qualification: Bachelor's or master's degree in computer science, statistics, applied mathematics, or a related field.
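As a rough illustration of the predictive-modeling work described above, here is a minimal scikit-learn sketch (train, evaluate, persist for later deployment by an MLOps pipeline). The dataset, feature names, and target column are hypothetical.

```python
# Minimal sketch of a predictive-modeling workflow (train, evaluate, persist).
# The CSV path, feature names, and target column are hypothetical.
import joblib
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("asset_failures.csv")          # hypothetical dataset
X, y = df[["temperature", "pressure", "runtime_hours"]], df["failed"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")

# Persist the fitted pipeline so a deployment job can pick it up later.
joblib.dump(model, "failure_model.joblib")
```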

Posted 2 days ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Kolkata

Work from Office


Job Title: Data Science Consultant, S&C Global Network
Management Level: Consultant
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI / ML, SQL, Python, Azure / AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products.

Job Summary: We are seeking a highly skilled and motivated Data Science Consultant to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities:
1. Data Science and Engineering
- Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks.
- Develop predictive models, recommendation systems, and optimization solutions tailored to business needs.
- Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data.
- Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting
- Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes.
- Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise
- Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects.
- Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise
- Lead the development of Generative AI applications and solutions leveraging frameworks such as LangChain and LlamaIndex.
- Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications.
- Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications:
- Experience: 4+ years in data science.
- Education: Bachelor's or master's degree in computer science, statistics, applied mathematics, or a related field.
- Industry knowledge: Experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products preferred.

Technical Skills:
- Programming: Proficiency in Python, SQL, and PySpark.
- GenAI expertise: Hands-on experience building GenAI applications and solutions and deploying GenAI applications in production.
- Cloud platforms: Experience with Azure / AWS / GCP.
- Visualization tools: Power BI / Tableau.

Preferred Skills:
- Strong analytical and problem-solving skills with a results-oriented mindset.
- Good communication and stakeholder management capabilities.
- Strong ability to generate business insights and present them to stakeholders.

About Our Company | Accenture

Qualification
Experience: 4-8 years in data science.
Educational Qualification: Bachelor's or master's degree in computer science, statistics, applied mathematics, or a related field.

Posted 2 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Data Engineering
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Must Have:
- Azure Data Factory
- SQL
- Azure Data Lake Storage
- Data Engineering Solutions
- Structured and Unstructured Data Processing
- Data Modelling, Analysis, Design, Development, and Documentation
- Cloud Computing
- CI/CD
- DevOps

Good to Have (minimum of 3):
- Azure DevOps
- PowerApps
- Azure Function App
- Databricks
- PowerBI (advanced knowledge)
- Python
- Airflow DAG
- Infra deployments (Azure, Bicep)
- Networking & security
- Scheduling for orchestration of workflows
- Agile frameworks (Scrum)

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the project.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the project.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Chennai

Work from Office


Responsibilities:
- Design, build, and maintain scalable data pipelines in Databricks.
- Collaborate with cross-functional teams to gather and interpret data requirements.
- Develop data models and perform data analysis using SQL and Python.
- Implement and optimize ETL processes for data ingestion and transformation.
- Monitor and troubleshoot data pipeline issues to ensure data integrity and availability.
- Create data visualizations and dashboards to present findings to stakeholders.
- Stay up to date with the latest features and best practices in Databricks and big data technologies.

Requirements:
- Proficiency in Databricks and experience with Apache Spark.
- Strong knowledge of the SQL and Python programming languages.
- Experience in data modeling, ETL processes, and data warehousing.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Ability to handle large datasets and perform data analysis efficiently.
- Excellent problem-solving and analytical skills.
- Strong communication skills and a collaborative mindset.
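A minimal sketch of the kind of pipeline-monitoring task mentioned above: a Spark SQL data-integrity check on a Databricks table. The table name, column, and threshold are hypothetical placeholders.

```python
# Minimal sketch of a data-integrity check on a Databricks table using Spark SQL.
# The table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compute row count and null rate for a key column in one SQL pass.
stats = spark.sql("""
    SELECT COUNT(*)                                             AS row_count,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_ids
    FROM   analytics.daily_orders
""").first()

null_rate = stats.null_ids / stats.row_count if stats.row_count else 0.0
print(f"rows={stats.row_count}, null customer_id rate={null_rate:.2%}")

# Fail the pipeline run (and alert) if quality drops below the agreed threshold.
if null_rate > 0.01:
    raise RuntimeError("customer_id null rate exceeded 1%; investigate upstream feed")
```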

Posted 2 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: The Data Engineering Sr. Advisor demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence, and a product mindset. This individual will be a key contributor on the team, designing, building, testing, and delivering large-scale software applications, systems, platforms, services, and technologies in the data engineering space, and will work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery.

The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements, develop, and implement solutions. The individual must have strong analytical and technical skills, coupled with the ability to positively influence the delivery of data engineering products. The team demands an innovation-driven, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence. The applicant will work with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering and will need to demonstrate very strong technical and communication skills.

- Delivery: Intermediate delivery skills, including the ability to deliver work at a steady, predictable pace to achieve commitments, decompose work assignments into small-batch releases, and contribute to trade-off and negotiation discussions.
- Domain Expertise: Demonstrated track record of domain expertise, including the ability to understand the technical concepts necessary to do the job effectively, demonstrate willingness, cooperation, and concern for business issues, and possess in-depth knowledge of the immediate systems worked on.
- Problem Solving: Proven problem-solving skills, including debugging skills that allow you to determine the source of issues in unfamiliar code or systems, the ability to recognize and solve repetitive problems rather than working around them, to recognize mistakes and use them as learning opportunities, and to break large problems down into smaller, more manageable ones.

Roles & Responsibilities:
- Deliver business needs end to end, from requirements through development into production.
- Through a hands-on engineering approach in the Databricks environment, deliver data engineering toolchains, platform capabilities, and reusable patterns.
- Follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset.
- Ensure adherence to enterprise architecture direction and architectural standards.
- Collaborate in a high-performing team environment, with the ability to influence and be influenced by others.

Experience Required:
- More than 12 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation.
- More than 3 years of experience in Databricks within an AWS environment.
- Data engineering experience.

Experience Desired:
- Expertise in Agile software development principles and patterns.
- Expertise in building streaming, batch, and event-driven architectures and data pipelines.

Primary Skills:
- Cloud-based security principles and protocols such as OAuth2, JWT, data encryption, hashing, and secret management.
- Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, and Glue.
- Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation.
- Expertise in building cloud-native microservices, containers, Kubernetes, and platform-as-a-service technologies such as OpenShift and Cloud Foundry.
- Experience with multi-cloud software-as-a-service products such as Databricks and Snowflake.
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation.
- Experience with messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, and AWS SNS.
- Experience with API and microservices stacks such as Spring Boot and Quarkus.
- Expertise in cloud technologies such as AWS Glue, Lambda, S3, Elasticsearch, API Gateway, and CloudFront.
- Experience with one or more of the following programming and scripting languages: Python, Scala, JVM-based languages, or JavaScript, plus the ability to pick up new languages.
- Experience building CI/CD pipelines using Jenkins and GitHub Actions.
- Strong expertise with source code management and its best practices.
- Proficient in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD).
- Knowledge of the Behavior-Driven Development (BDD) approach.

Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments.
- Strong oral and written communication skills.
- Ability to think strategically, implement iteratively, and estimate the financial impact of design/architecture alternatives.
- Continuous focus on ongoing learning and development.

Qualification: 15 years of full-time education
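As a rough sketch of the streaming and configuration-driven patterns listed above, the following reads events from Kafka with Spark Structured Streaming and appends them to a Delta path. Broker, topic, and storage paths are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch of a configuration-driven streaming ingest (Kafka -> Delta).
# Broker, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

config = {
    "brokers": "broker-1:9092",
    "topic": "claims-events",
    "target_path": "s3://example-lake/curated/claims_events/",
    "checkpoint": "s3://example-lake/checkpoints/claims_events/",
}

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", config["brokers"])
          .option("subscribe", config["topic"])
          .option("startingOffsets", "latest")
          .load()
          .select(F.col("key").cast("string"),
                  F.col("value").cast("string").alias("payload"),
                  "timestamp"))

# Append the decoded events to a Delta location, tracking progress in a checkpoint.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", config["checkpoint"])
         .outputMode("append")
         .start(config["target_path"]))

query.awaitTermination()
```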

Posted 2 days ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Kolkata

Work from Office


We are looking for a Senior Python Developer with a passion for AI research and API development to join our growing team. In this role, you will be responsible for building scalable, high-performance APIs and contributing to AI/ML research and implementation. You will work closely with data scientists, researchers, and product teams to design and deploy intelligent systems that power our next-generation applications.

Key Responsibilities:
- Design, develop, and maintain Python-based APIs for AI/ML models and services.
- Collaborate with AI researchers to implement and optimize machine learning models.
- Conduct research into new AI/ML techniques and evaluate their applicability to business problems.
- Build RESTful and GraphQL APIs using frameworks such as FastAPI, Flask, or Django REST Framework.
- Write clean, testable, and maintainable Python code with a focus on performance and scalability.
- Participate in code reviews, mentor junior developers, and contribute to best practices.
- Integrate AI models with backend systems and frontend applications.
- Stay up to date with AI/ML trends, Python libraries (e.g., PyTorch, TensorFlow, scikit-learn), and API design patterns.
- Work in an agile environment, delivering high-quality software in iterative sprints.

Qualifications:
- Bachelor's or master's degree in Computer Science, Data Science, or a related field.
- 4+ years of professional experience in software development, with 3+ years in Python.
- Strong experience with Python web frameworks (e.g., FastAPI, Flask, Django).

What We're Looking For in a Candidate:
- A curious mind with a passion for AI and software development.
- A team player who can mentor and guide others.
- A self-starter who can take initiative and deliver results.
- A lifelong learner who stays current with emerging technologies and trends.

Why Join Us?
- Work on cutting-edge AI projects with real-world impact.
- Collaborate with top-tier researchers and engineers.
- Flexible work environment and remote-friendly options.
- Competitive salary and performance-based incentives.
- Opportunities for professional growth and leadership.
- A culture that values innovation, collaboration, and continuous learning.
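A minimal sketch of the kind of Python API for ML models this role describes, using FastAPI. The model file, feature schema, and endpoint name are hypothetical, not part of the posting.

```python
# Minimal sketch of a Python API exposing an ML model prediction endpoint.
# Model file, feature names, and schema are hypothetical placeholders.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-api")
model = joblib.load("failure_model.joblib")   # hypothetical pre-trained pipeline


class Features(BaseModel):
    temperature: float
    pressure: float
    runtime_hours: float


@app.post("/predict")
def predict(features: Features) -> dict:
    # Order of values must match the feature order used at training time.
    row = [[features.temperature, features.pressure, features.runtime_hours]]
    probability = float(model.predict_proba(row)[0][1])
    return {"failure_probability": probability}

# Run locally with:  uvicorn scoring_api:app --reload
```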

Posted 2 days ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Gurugram

Remote


Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field; a master's degree is a plus.
- Proven experience as a Data Engineer or in a similar role, with a focus on ETL processes and database management.
- Proficiency in the Microsoft Azure data management suite (MSSQL, Azure Databricks, Power BI, Data Factory, Azure cloud monitoring, etc.) and Python scripting.
- Strong knowledge of SQL and experience with database management systems.
- Strong development skills in Python and PySpark.
- Experience with data warehousing solutions and data mart creation.
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional certification is good to have.
- Understanding of data modeling and data architecture principles.
- Experience with data governance and data security best practices.

Posted 2 days ago

Apply

8.0 - 10.0 years

20 - 25 Lacs

Pune

Work from Office


Responsibilities:
- Meet with managers to understand the company's big data needs.
- Develop big data solutions on AWS using Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, and Hadoop.
- Load disparate data sets and conduct pre-processing using Athena, Glue, and Spark.

Required Candidate Profile:
- Proficient with Python and PySpark.
- Extensive experience with Delta Tables and the JSON and Parquet file formats.
- Experience with AWS data analytics services such as Athena, Glue, Redshift, and EMR.
- Knowledge of NoSQL and RDBMS databases.
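For illustration, a minimal boto3 sketch of querying Glue-catalogued data through Athena, one of the AWS analytics services listed above. The database, table, and S3 result location are hypothetical placeholders.

```python
# Minimal sketch of querying Glue-catalogued data via Athena from Python.
# Database, table, and S3 locations are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

run = athena.start_query_execution(
    QueryString=("SELECT event_date, COUNT(*) AS events "
                 "FROM analytics_db.clickstream GROUP BY event_date"),
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = run["QueryExecutionId"]

# Poll until the query finishes (simplified; production code would add timeouts).
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"fetched {len(rows) - 1} result rows")  # first row is the header
```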

Posted 2 days ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Bengaluru

Remote


As the data engineering consultant, you should have the common traits and capabilities listed under Essential Requirements and meet many of the capabilities listed under Desirable Requirements.

Essential Requirements and Skills:
- 10+ years working with customers in the Data Analytics, Big Data, and Data Warehousing field.
- 10+ years working with data modeling tools.
- 5+ years building data pipelines for large customers.
- 2+ years of experience working in the field of Artificial Intelligence that leverages Big Data, in a customer-facing services delivery role.
- 3+ years of experience in Big Data database design.
- A good understanding of LLMs, prompt engineering, fine-tuning, and training.
- Strong knowledge of SQL, NoSQL, and vector databases. Experience with popular enterprise databases such as SQL Server, MySQL, Postgres, and Redis is a must; experience with popular vector databases such as PGVector, Milvus, and Elasticsearch is a requirement.
- Experience with major data warehousing providers such as Teradata.
- Experience with data lake tools such as Databricks, Snowflake, and Starburst.
- Proven experience building data pipelines and ETLs for both data transformation and extraction from multiple data sources, including automating the deployment and execution of these pipelines.
- Experience with tools such as Apache Spark, Apache Hadoop, Informatica, and similar data processing tools.
- Proficient knowledge of Python and SQL is a must.
- Proven experience building test procedures, ensuring the quality, reliability, performance, and scalability of the data pipelines.
- Ability to develop applications that expose RESTful APIs for data querying and ingestion.
- Experience preparing training data for Large Language Model ingestion and training (e.g., through vector databases).
- Experience integrating with RAG solutions and leveraging related tools such as NVIDIA Guardrails, plus the ability to define and implement metrics for RAG solutions.
- Understanding of the typical AI tooling ecosystem, including knowledge and experience of Kubernetes, MLOps, LLMOps, and AIOps tools.
- Ability to gain customer trust and to plan, organize, and drive customer workshops.
- Good communication skills in English are a must.
- The ability to work in a highly efficient team using an Agile methodology such as Scrum or Kanban.
- Ability to hold extended pairing sessions with customers, enabling knowledge transfer in complex domains.
- Ability to influence and interact with confidence and credibility at all levels within the Dell Technologies companies and with our customers, partners, and vendors.
- Experience working on project teams within a defined methodology while adhering to margin, planning, and SOW requirements.
- Ability to be onsite during customer workshops and enablement sessions.

Desirable Requirements and Skills:
- Knowledge of widespread AI studios and AI workbenches is a plus.
- Experience building and using Information Retrieval (IR) frameworks to support LLM inferencing.
- Working knowledge of Linux is a plus.
- Knowledge of MinIO is appreciated.
- Experience using lean and iterative deployment methodologies.
- Working knowledge of cloud technologies is a plus.
- A university degree aligned to data engineering is a plus.
- Relevant industry certifications, e.g., Databricks Certified Data Engineer, Microsoft certifications, etc.
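As a rough sketch of the RAG retrieval work referenced above: embed a few documents and a query, then rank by cosine similarity. The model name and example texts are illustrative; a production system would use a vector database such as PGVector or Milvus rather than in-memory NumPy.

```python
# Minimal sketch of the retrieval step behind a RAG solution: embed documents,
# embed the query, and return the closest chunks. Texts are illustrative only.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Invoices are archived for seven years in the finance data lake.",
    "Customer churn is scored nightly by the retention model.",
    "The warranty policy covers manufacturing defects for 24 months.",
]
doc_vectors = model.encode(documents, normalize_embeddings=True)

query = "How long do we keep invoices?"
query_vector = model.encode([query], normalize_embeddings=True)[0]

# Cosine similarity reduces to a dot product on normalized vectors.
scores = doc_vectors @ query_vector
best = np.argsort(scores)[::-1][:2]
for idx in best:
    print(f"{scores[idx]:.3f}  {documents[idx]}")
```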

Posted 2 days ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Pune, Gurugram

Hybrid


Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and dbt.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of dbt (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
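A minimal sketch of an ELT step run against Snowflake from Python, of the kind an orchestrator might trigger after ADF lands raw data. Connection settings and object names are hypothetical; in practice the transformation SQL would typically live in dbt models.

```python
# Minimal sketch of an ELT transform executed against Snowflake from Python.
# Connection details and object names are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Transform raw landed data into a curated table (illustrative SQL only).
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.DAILY_SALES AS
        SELECT order_date, region, SUM(amount) AS total_amount
        FROM   STAGING.RAW_ORDERS
        GROUP  BY order_date, region
    """)
    print("CURATED.DAILY_SALES rebuilt")
finally:
    conn.close()
```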

Posted 2 days ago

Apply

6.0 - 8.0 years

12 - 16 Lacs

Kolkata

Hybrid


What will your day look like?
- Leading a dynamic team to deliver high-impact risk solutions across credit risk (underwriting, exposure controls, and line management).
- Working with stakeholders across product management, data science, and engineering to build relationships with partner teams and drive implementation of risk strategies.
- Managing challenging time constraints to ensure on-time delivery of projects.
- Working closely with partner teams to identify, evaluate, and recommend new data that helps in risk differentiation.
- Analyzing loss trends and simulating risk decisioning strategies that help optimize revenue, approval rates, etc.
- Working closely with the data science team to recommend credit risk decisioning and model deployment strategy.
- Building a risk scorecard that leverages both internal and external performance data for credit decisioning at underwriting and at account management reviews for existing customers.
- Collating analysis and building presentations that articulate the risk strategy for the leadership team.

To help us level up, you will ideally have:
- A quantitative background in engineering, statistics, math, economics, business, or related disciplines.
- 5+ years of experience analyzing data using database query languages (e.g., SQL) and programming and developer tools such as Python, R, or Databricks in a finance or analytics field.
- 2+ years of experience leading a high-performing team of analysts.
- Experience working with non-traditional data such as social media, which will be a big plus.
- Prior model-building experience (a plus, but not critical).
- An analytical mindset and strong problem-solving skills.
- Attention to detail and the ability to multitask.
- Comfort working in a fast-paced environment and dealing with ambiguity.
- Strong communication, interpersonal, and presentation skills, and the ability to engage and collaborate with multiple stakeholders across teams.
- An extremely proactive communication style, willing to raise flags when needed and keep team members informed of ongoing risk or fraud related activities.

Posted 2 days ago

Apply

3.0 - 8.0 years

6 - 16 Lacs

Pune, Gurugram, Bengaluru

Work from Office


Qualification:
- 4-5 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and Spark SQL), Hadoop, and Hive.
- Good hands-on experience with Python and Bash scripts.
- Hands-on experience with the Big Data technologies provided by cloud platforms.
- Good understanding of SQL and data warehouse tools such as Redshift.
- Strong analytical, problem-solving, data analysis, and research skills.
- Demonstrable ability to think outside the box and not be dependent on readily available tools.
- Excellent communication, presentation, and interpersonal skills are a must.

Good to have:
- Orchestration with Airflow and experience with any job scheduler.
- Experience migrating workloads from on-premises to cloud and from cloud to cloud.

Roles & Responsibilities:
- Develop efficient ETL pipelines per business requirements, following development standards and best practices.
- Perform integration testing of the created pipelines in AWS/Azure environments.
- Provide estimates for development, testing, and deployment across different environments.
- Participate in peer code reviews to ensure our applications comply with best practices.
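For illustration, a minimal sketch showing the two PySpark styles this posting asks for, the DataFrame API and Spark SQL, applied to the same data. Paths and column names are hypothetical placeholders.

```python
# Minimal sketch contrasting the PySpark DataFrame API with Spark SQL.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_summary").getOrCreate()

sales = spark.read.parquet("s3://example-bucket/sales/")   # or an ADLS/HDFS path

# DataFrame API version.
by_region_df = (sales.groupBy("region")
                .agg(F.sum("amount").alias("total_amount"))
                .orderBy(F.desc("total_amount")))

# Equivalent Spark SQL version over a temporary view.
sales.createOrReplaceTempView("sales")
by_region_sql = spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM   sales
    GROUP  BY region
    ORDER  BY total_amount DESC
""")

by_region_df.show(5)
by_region_sql.show(5)
```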

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies