Home
Jobs

935 Data Bricks Jobs - Page 3

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Setup a job Alert
Filter
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office

Naukri logo

Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which needs to be deployed at a client end and ensure its meet 100% quality assurance parameters Do 1. Instrumental in understanding the requirements and design of the product/ software Develop software solutions by studying information needs, studying systems flow, data usage and work processes Investigating problem areas followed by the software development life cycle Facilitate root cause analysis of the system issues and problem statement Identify ideas to improve system performance and impact availability Analyze client requirements and convert requirements to feasible design Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Conferring with project managers to obtain information on software capabilities 2. Perform coding and ensure optimal software/ module development Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases Modifying software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. Analyzing information to recommend and plan the installation of new systems or modifications of an existing system Ensuring that code is error free or has no bugs and test failure Preparing reports on programming project specifications, activities and status Ensure all the codes are raised as per the norm defined for project / program / account with clear description and replication patterns Compile timely, comprehensive and accurate documentation and reports as requested Coordinating with the team on daily project status and progress and documenting it Providing feedback on usability and serviceability, trace the result to quality risk and report it to concerned stakeholders 3. Status Reporting and Customer Focus on an ongoing basis with respect to project and its execution Capturing all the requirements and clarifications from the client for better quality work Taking feedback on the regular basis to ensure smooth and on time delivery Participating in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consulting with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code Documenting very necessary details and reports in a formal way for proper understanding of software from client proposal to implementation Ensure good quality of interaction with customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette etc Timely Response to customer requests and no instances of complaints either internally or externally Deliver No. Performance Parameter Measure 1. Continuous Integration, Deployment & Monitoring of Software 100% error free on boarding & implementation, throughput %, Adherence to the schedule/ release plan 2. Quality & CSAT On-Time Delivery, Manage software, Troubleshoot queries,Customer experience, completion of assigned certifications for skill upgradation 3. MIS & Reporting 100% on time MIS & report generation Mandatory Skills: DataBricks - Data Engineering. Experience3-5 Years.

Posted 1 day ago

Apply

5.0 - 8.0 years

3 - 6 Lacs

Pune

Work from Office

Naukri logo

Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which needs to be deployed at a client end and ensure its meet 100% quality assurance parameters Do 1. Instrumental in understanding the requirements and design of the product/ software Develop software solutions by studying information needs, studying systems flow, data usage and work processes Investigating problem areas followed by the software development life cycle Facilitate root cause analysis of the system issues and problem statement Identify ideas to improve system performance and impact availability Analyze client requirements and convert requirements to feasible design Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Conferring with project managers to obtain information on software capabilities 2. Perform coding and ensure optimal software/ module development Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases Modifying software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. Analyzing information to recommend and plan the installation of new systems or modifications of an existing system Ensuring that code is error free or has no bugs and test failure Preparing reports on programming project specifications, activities and status Ensure all the codes are raised as per the norm defined for project / program / account with clear description and replication patterns Compile timely, comprehensive and accurate documentation and reports as requested Coordinating with the team on daily project status and progress and documenting it Providing feedback on usability and serviceability, trace the result to quality risk and report it to concerned stakeholders 3. Status Reporting and Customer Focus on an ongoing basis with respect to project and its execution Capturing all the requirements and clarifications from the client for better quality work Taking feedback on the regular basis to ensure smooth and on time delivery Participating in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consulting with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code Documenting very necessary details and reports in a formal way for proper understanding of software from client proposal to implementation Ensure good quality of interaction with customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette etc Timely Response to customer requests and no instances of complaints either internally or externally Deliver No. Performance Parameter Measure 1. Continuous Integration, Deployment & Monitoring of Software 100% error free on boarding & implementation, throughput %, Adherence to the schedule/ release plan 2. Quality & CSAT On-Time Delivery, Manage software, Troubleshoot queries,Customer experience, completion of assigned certifications for skill upgradation 3. MIS & Reporting 100% on time MIS & report generation Mandatory Skills: DataBricks - Data Engineering. Experience5-8 Years.

Posted 1 day ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Databricks Lead Databricks Lead Should have 6 years of experience Must have skills DataBricks, Delta Lake, pyspark or scala spark, unity catalog. Good to have skills - Azure/AWS Cloud skills To ingest and transform batch and streaming data on the Databricks Lakehouse Platform. Excellent communication skill Mandatory Skills: DataBricks - Data Engineering. Experience5-8 Years.

Posted 1 day ago

Apply

1.0 - 5.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Role Purpose Mandatory Skills: Gen AI, LLM, RAG, Lang chain, Mistral,Llama, Vector DB, Azure/GCP/ Lambda, Python, Tensorflow, Pytorch Preferred Skills GPT-4, NumPy, Pandas, Keras, Databricks, Pinecone/Chroma/Weaviate, Scale/Labelbox, We are looking for a good Python Developer with Knowledge of Machine learning and deep learning framework. Take care of entire prompt life cycle like prompt design, prompt template creation, prompt tuning/optimization for various GenAI base models Design and develop prompts suiting project needs Stakeholder management across business and domains as required for the projects Evaluating base models and benchmarking performance Implement prompt guardrails to prevent attacks like prompt injection, jail braking and prompt leaking Develop, deploy and maintain auto prompt solutions Design and implement minimum design standards for every use case involving prompt engineering You will be responsible for training the machine learning and deep learning model. Writing reusable, testable, and efficient code using Python Design and implementation of low-latency, high-availability, and performant applications Implementation of security and data protection Integration of data storage solutions and API Gateways Production change deployment and related support Reinvent your world.We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 day ago

Apply

9.0 - 13.0 years

7 - 17 Lacs

Pune

Work from Office

Naukri logo

Required Skills and Qualifications: Minimum 8+ years of hands-on experience in Data Engineering Strong proficiency with: Databricks Azure Data Factory (ADF) SQL (T-SQL or similar) PySpark Experience with cloud-based data platforms, especially Azure Strong understanding of data warehousing, data lakes, and data modeling Ability to write efficient, maintainable, and reusable code Excellent analytical, problem-solving, and communication skills Willingness to travel to the customer location in Hinjawadi all 3 working days

Posted 1 day ago

Apply

1.0 - 4.0 years

0 - 0 Lacs

Noida

Work from Office

Naukri logo

Role & responsibilities Diligently observe dashboards and alerts for data pipeline health, system performance, and data integrity issues. Respond to incidents and data failures, perform initial triage, and escalate unresolved issues to L2/L3 teams as needed. Support daily data loads, scheduled jobs, and routine batch processes. Ensure timely completion and log/report any anomalies. Maintain accurate records of incidents, resolutions, and operational runbooks. Prepare daily/weekly reports on system health and issues. Work with data engineers, analysts, and IT support to ensure smooth data operations and resolve cross-functional issues. Run and monitor basic data quality scripts, flagging missing or inconsistent data for further investigation. Requirements Bachelors degree in Computer Science, Information Technology, or a related field. 1-4 years of experience in data operations, IT support, or a similar technical operations role. Basic understanding of data pipelines, ETL/ELT concepts, and database systems (SQL knowledge preferred). Data tools and technologies: Awareness of data analytics platforms and visualization tools like Power BI or Tableau Control-M (Main tool used by client) and Dbeaver Programming and database knowledge: Familiarity with SQL, Python, or similar languages Familiarity with cloud platforms (AWS, Azure, or GCP) and data tools (Databricks, Spark, or similar) is a plus. Experience with Linux/Windows OS and basic scripting (Python, Bash) is desirable. Strong problem-solving, analytical, and communication skills. Willingness to work in shifts (if required) and respond to incidents promptly.

Posted 1 day ago

Apply

6.0 - 10.0 years

1 - 1 Lacs

Bengaluru

Remote

Naukri logo

We are looking for a highly skilled Senior ETL Consultant with strong expertise in Informatica Intelligent Data Management Cloud (IDMC) components such as IICS, CDI, CDQ, IDQ, CAI, along with proven experience in Databricks.

Posted 1 day ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Pune, Bengaluru, Greater Noida

Work from Office

Naukri logo

Role & responsibilities: Looking for 5 to 8 years experience of ML Engineer with strong Azure Cloud DevOps with even stronger DABs DevOps skills with even stronger DABs Databricks Asset Bundles implementation knowledge. 1 Azure Cloud Engineer 2 Azure DevOps CICD experienced in DABs 3 ML Engineer for deployment Translate business requirement into technical solution Implementation of MLOPS Scalable solution using AIML and reduce the risk of Fraud and other fiscal crisis Creating MLOPS Architecture and implementing it for multiple models in a scalable and automated way Designing and implementing end to end ML solutions Operationalize and monitor machine learning models using high end tools and technologies Design implementation of DevOps principles in Machine Learning Data Science quality assurance and testing Collaborate with data scientists engineers and other key stakeholders Preferred candidate profile: Azure Cloud Engineering Design implement and manage scalable cloud infrastructure on Microsoft Azure Ensure high availability performance and security of cloud based applications Collaborate with cross functional teams to define and implement cloud solutions Azure DevOps CICD Develop and maintain CICD pipelines using Azure DevOps Automate deployment processes to ensure efficient and reliable software delivery Monitor and troubleshoot CICD pipelines to ensure smooth operation DABs Databricks Asset Bundles Implementation Lead the implementation and management of Databricks Asset Bundles Optimize data workflows and ensure seamless integration with existing systems Provide expertise in DABs to enhance data processing and analytics capabilities Machine Learning Deployment Deploy machine learning models into production environments Monitor and maintain ML models to ensure optimal performance Collaborate with data scientists and engineers to integrate ML solutions into applications.

Posted 1 day ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Kolkata, Pune, Delhi / NCR

Work from Office

Naukri logo

About Client Hiring for One of the Most Prestigious Multinational Corporations Job Title : Azure Databricks Qualification : Any Graduate or Above Experience : 6 to 10 Yrs Location : PAN India Key Responsibilities: Design and build scalable and robust data pipelines using Azure Databricks, PySpark, and Spark SQL. Integrate data from various structured and unstructured data sources using Azure Data Factory,ADLS, Azure Synapse, etc. Develop and maintain ETL/ELT processes for ingestion, transformation, and storage of data. Collaborate with data scientists, analysts, and other engineers to deliver data products and solutions. Monitor, troubleshoot, and optimize existing pipelines for performance and reliability. Ensure data quality,governance, and security compliance in all solutions. Participate in architectural decisions and cloud data solutioning. Required Skills: 5+ years of experience in data engineering or related fields. Strong hands-on experience with Azure Databricks and Apache Spark. Proficiency in Python (PySpark),SQL, and performance tuning techniques. Experience with Azure Data Factory, Azure Data Lake Storage (ADLS), and Azure Synapse Analytics. Solid understanding of data modeling, data warehousing, and data lakes. Familiarity with DevOps practices, CI/CD pipelines, and version control (e.g., Git). Notice period :30,60, 90 days Mode of Work :WFO(Work From Office) Thanks & Regards, SWETHA Black and White Business Solutions Pvt.Ltd. Bangalore,Karnataka,India. Contact Number:8067432433 rathy@blackwhite.in |www.blackwhite.in

Posted 1 day ago

Apply

5.0 - 10.0 years

20 - 22 Lacs

Hyderabad

Work from Office

Naukri logo

Role & responsibilities 5+ years of experience on IT industry in Data Engineering & Data Analyst role. 5 years of development experience using tool Databricks and PySpark, Python, SQL Proficient in writing SQL queries including writing of windows functions Good communication skills with analytical abilities in doing problem solving activities.

Posted 1 day ago

Apply

6.0 - 10.0 years

10 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Naukri logo

Job Description: : Cloud Data Engineer Experience Required: 5+ years Location : PAN India , Preferred Hyderabad(Hybrid) Notice Period : Immediate Detailed JD (Roles and Responsibilities) 5 + years of experience on IT industry in Data Engineering & Data Analyst role. 5 years of development experience using tool Databricks and PySpark, Python, SQL Proficient in writing SQL queries including writing of windows functions Good communication skills with analytical abilities in doing problem solving activities. Mandatory skills Databricks; SQL; PySpark; Python Desired/ Secondary skills ADF Domain Healthcare

Posted 1 day ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

Interested can also apply with Sanjeevan Natarajan - 94866 21923 sanjeevan.natarajan@careernet.in Role & responsibilities Technical Leadership Lead a team of data engineers and developers; define technical strategy, best practices, and architecture for data platforms. End-to-End Solution Ownership Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks. Data Pipeline Strategy Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets. Data Governance & Quality Enforce data validation, lineage, and quality checks across the data lifecycle. Define standards for metadata, cataloging, and governance. Orchestration & Automation Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations. Cloud Cost & Performance Optimization Implement performance tuning strategies, cost optimization best practices, and efficient cluster configurations on AWS/Databricks. Security & Compliance Define and enforce data security standards, IAM policies, and compliance with industry-specific regulatory frameworks. Collaboration & Stakeholder Engagement Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions. Migration Leadership Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime. Mentorship & Growth Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team. Preferred candidate profile Python , SQL , PySpark , Databricks , AWS (Mandatory) Leadership Experience in Data Engineering/Architecture Added Advantage: Experience in Life Sciences / Pharma

Posted 2 days ago

Apply

8.0 - 13.0 years

17 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Project description We are seeking an experienced Senior Project Manager with a strong background in delivering data engineering and Python-based development projects. In this role, you will manage cross-functional teams and lead Agile delivery for high-impact, cloud-based data initiatives. You'll work closely with data engineers, scientists, architects, and business stakeholders to ensure projects are delivered on time, within scope, and aligned with strategic objectives. The ideal candidate combines technical fluency, strong leadership, and Agile delivery expertise in data-centric environments. Responsibilities Lead and manage data engineering and Python-based development projects, ensuring timely delivery and alignment with business goals. Work closely with data engineers, data scientists, architects, and product owners to gather requirements and define project scope. Translate complex technical requirements into actionable project plans and user stories. Oversee sprint planning, backlog grooming, daily stand-ups, and retrospectives in Agile/Scrum environments. Ensure best practices in Python coding, data pipeline design, and cloud-based data architecture are followed. Identify and mitigate risks, manage dependencies, and escalate issues when needed. Own stakeholder communications, reporting, and documentation of all project artifacts. Track KPIs and delivery metrics to ensure accountability and continuous improvement. Skills Must have ExperienceMinimum 8+ years of project management experience, including 3+ years managing data and Python-based development projects. Agile ExpertiseStrong experience delivering projects in Agile/Scrum environments with distributed or hybrid teams. Technical Fluency: Solid understanding of Python, data pipelines, and ETL/ELT workflows. Familiarity with cloud platforms such as AWS, Azure, or GCP. Exposure to tools like Airflow, dbt, Spark, Databricks, or Snowflake is a plus. ToolsProficiency with JIRA, Confluence, Git, and project dashboards (e.g., Power BI, Tableau). Soft Skills: Strong communication, stakeholder management, and leadership skills. Ability to translate between technical and non-technical audiences. Skilled in risk management, prioritization, and delivery tracking. Nice to have N/A OtherLanguagesEnglishC1 Advanced SenioritySenior

Posted 2 days ago

Apply

5.0 - 6.0 years

13 - 17 Lacs

Mumbai, Hyderabad

Work from Office

Naukri logo

Project description A project is intended migrate a global application covering multiple workflows of a top Insurance company into Azure, develop a cloud native application from scratch. Application serves global and North American markets. Responsibilities Drive the development team towards the goal by integrating skills and experiences. Design, develop, test, deploy, maintain and improve the software. Work with QA, product management, and operations in an Agile environment. Develop and support data-driven product decisions in a high energy high-impact team. Develop features that will drive our business through real-time feedback loops. Skills Must have 5 to 6 years of hands-on Azure development expertise on the below AZ App Services, Az Web Jobs, Az Functions, Az Logic Apps, ADF, Key Vault, Az Connectors; Nice to have .Net experience Other Languages EnglishC1 Advanced Seniority Senior

Posted 2 days ago

Apply

7.0 - 12.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Naukri logo

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasnt happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Targets global team and has more than 4,000 team members supporting the companys global strategy and operations. Team Overview: Every time a guest enters a Target store or browses Target.com nor the app, they experience the impact of Targets investments in technology and innovation. Were the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 of engineers, data scientists, architects and product managers striving to make Target the most convenient, safe and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guestsand we do so with a focus on diversity and inclusion, experimentation and continuous learning. Position Overview: As a Lead Data Engineer , you will serve as the technical anchor for the engineering team, responsible for designing and developing scalable, high-performance data solutions . You will own and drive data architecture that supports both functional and non-functional business needs, ensuring reliability, efficiency, and scalability .Your expertise in big data technologies, distributed systems, and cloud platforms will help shape the engineering roadmap and best practices for data processing, analytics, and real-time data serving . You will play a key role in architecting and optimizing data pipelines using Hadoop, Spark, Scala/Java, and cloud technologies to support enterprise-wide data initiatives.Additionally, experience with API development for serving low-latency data and Customer Data Platforms (CDP) will be a strong plus. Key Responsibilities: Architect and build scalable, high-performance data pipelines and distributed data processing solutions using Hadoop, Spark, Scala/Java, and cloud platforms (AWS/GCP/Azure) . Design and implement real-time and batch data processing solutions , ensuring data is efficiently processed and made available for analytical and operational use. Develop APIs and data services to expose low-latency, high-throughput data for downstream applications, enabling real-time decision-making. Optimize and enhance data models, workflows, and processing frameworks to improve performance, scalability, and cost-efficiency. Drive data governance, security, and compliance best practices. Collaborate with data scientists, product teams, and business stakeholders to understand requirements and deliver data-driven solutions . Lead the design, implementation, and lifecycle management of data services and solutions. Stay up to date with emerging technologies and drive adoption of best practices in big data engineering, cloud computing, and API development . Provide technical leadership and mentorship to engineering teams, promoting best practices in data engineering and API design . About You: 7+ years of experience in data engineering, software development, or distributed systems. Expertise in big data technologies such as Hadoop, Spark, and distributed processing frameworks. Strong programming skills in Scala and/or Java (Python is a plus). Experience with cloud platforms (AWS, GCP, or Azure) and their data ecosystem (e.g., S3, BigQuery, Databricks, EMR, Snowflake, etc.). Proficiency in API development using REST, GraphQL, or gRPC to serve real-time and batch data. Experience with real-time and streaming data architectures (Kafka, Flink, Kinesis, etc.). Strong knowledge of data modeling, ETL pipeline design, and performance optimization . Understanding of data governance, security, and compliance in large-scale data environments. Experience with Customer Data Platforms (CDP) or customer-centric data processing is a strong plus. Strong problem-solving skills and ability to work in complex, unstructured environments . Excellent communication and collaboration skills, with experience working in cross-functional teams . Why Join Us Work with cutting-edge big data, API, and cloud technologies in a fast-paced, collaborative environment. Influence and shape the future of data architecture and real-time data services at Target. Solve high-impact business problems using scalable, low-latency data solutions . Be part of a culture that values innovation, learning, and growth . Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging

Posted 2 days ago

Apply

13.0 - 18.0 years

14 - 18 Lacs

Gurugram

Work from Office

Naukri logo

0px> Who are we? In one sentence We are seeking a Java Full Stack Architect & People Manager with strong technical depth and leadership capabilities to lead our Java Modernization projects. The ideal candidate will possess a robust understanding of Java Full Stack, Databases and Cloud-based solution delivery , combined with proven experience in managing high-performing technical teams. This role requires a visionary who can translate business challenges into scalable distributed solutions while nurturing talent and fostering innovation. What will your job look like? Lead the design and implementation of Java Full Stack solutions covering Frontend, Backend and Batch processes & interface Integrations across business use cases. Translate business requirements into technical architectures using Azure/AWS Cloud Platforms . Manage and mentor a multidisciplinary team of engineers, Leads and Specialists . Drive adoption of Databricks , Python In addition to Java-based frameworks within solution development. Collaborate closely with product owners, data engineering teams, and customer IT & business stakeholders. Ensure high standards in code quality, system performance, and model governance . Track industry trends and continuously improve the technology stack adopting newer trends showcasing productization, automation and innovative ideas. Oversee end-to-end lifecycle: use case identification, PoC, MVP, production deployment, and support. Define and monitor KPIs to measure team performance and project impact. All you need is... 13+ years of overall IT experience with a strong background in Telecom domain (preferred). Proven hands-on experience with Java Full Stack technologies and Cloud DBs Strong understanding of Design Principles and patterns for distributed applications OnPrem as well as OnCloud . Demonstrated experience in building and deploying on Azure or AWS via CI/CD practices . Strong expertise in Java, Databases, Python, Kafka and Linux Scripting . In-depth understanding of cloud-native architecture , microservices , and data pipelines . Solid people management experience: team building, mentoring, performance reviews. Strong analytical thinking and communication skills. Ability to be Hands-On with Coding, Reviews while Development and Production Support Good to Have Skills: Familiarity with Databricks, PySpark Familiarity of Snowflake Why you will love this job: You will be challenged with leading and mentoring a few development teams & projects You will join a strong team with lots of activities, technologies, business challenges and a progression path You will have the opportunity to work with the industry most advanced technologies

Posted 2 days ago

Apply

7.0 - 12.0 years

6 - 12 Lacs

Pune, Bengaluru

Hybrid

Naukri logo

Required Skill SetMust-Have: Strong experience with AWS Cloud Services (S3, Lambda, Glue, EMR, Redshift, etc.) Proficiency in Python and PySpark for data engineering tasks Strong SQL skills and a solid understanding of data warehousing concepts Experience in designing, developing, and maintaining ETL pipelines Good-to-Have: Working knowledge of Databricks or Apache Spark Familiarity with Terraform and/or AWS CloudFormation Excellent analytical, problem-solving, and communication skills Job ResponsibilitiesPrimary: Design, develop, and manage scalable ETL pipelines using AWS services Handle structured and semi-structured data using Python and PySpark Develop complex SQL queries for data transformation and reporting Collaborate effectively with data architects, analysts, and stakeholders Ensure best practices in terms of cost, security, and cloud resource documentation Secondary: Exposure to DevOps and CI/CD practices Experience with Databricks or other big data platforms Working knowledge of AWS monitoring and logging services (e.g., CloudWatch)

Posted 2 days ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Chennai

Remote

Naukri logo

Databricks and AWS (S3, Glue, EMR, Kinesis, Lambda, IAM, CloudWatch). • Primary language: Python; strong skills in Spark SQ

Posted 2 days ago

Apply

0.0 - 5.0 years

12 - 24 Lacs

Pune

Work from Office

Naukri logo

Apply Now: https://lnkd.in/dXm2vCAx Responsibilities: * Design, develop & maintain data pipelines using Python, PySpark & SQL. * Optimize performance through Azure cloud services & Data Bricks. Office cab/shuttle Food allowance Provident fund Health insurance Annual bonus

Posted 2 days ago

Apply

7.0 - 9.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Industry & Function AI Decision Science Manager + S&C GN Management Level:07 - Manager Location: Primary Bengaluru, Secondary Gurugram Must-Have Skills: Consumer Goods & Services domain expertise , AI & ML, Proficiency in Python, R, PySpark, SQL , Experience in cloud platforms (Azure, AWS, GCP) , Expertise in Revenue Growth Management, Pricing Analytics, Promotion Analytics, PPA/Portfolio Optimization, Trade Investment Optimization. Good-to-Have Skills: Experience with Large Language Models (LLMs) like ChatGPT, Llama 2, or Claude 2 , Familiarity with optimization methods, advanced visualization tools (Power BI, Tableau), and Time Series Forecasting Job Summary : As a Decision Science Manager , you will lead the design and delivery of AI solutions in the Consumer Goods & Services domain. This role involves working closely with clients to provide advanced analytics and AI-driven strategies that deliver measurable business outcomes. Your expertise in analytics, problem-solving, and team leadership will help drive innovation and value for the organization. Roles & Responsibilities: Analyze extensive datasets and derive actionable insights for Consumer Goods data sources (e.g., Nielsen, IRI, EPOS, TPM). Evaluate AI and analytics maturity in the Consumer Goods sector and develop data-driven solutions. Design and implement AI-based strategies to deliver significant client benefits. Employ structured problem-solving methodologies to address complex business challenges. Lead data science initiatives, mentor team members, and contribute to thought leadership. Foster strong client relationships and act as a key liaison for project delivery. Build and deploy advanced analytics solutions using Accentures platforms and tools. Apply technical proficiency in Python, Pyspark, R, SQL, and cloud technologies for solution deployment. Develop compelling data-driven narratives for stakeholder engagement. Collaborate with internal teams to innovate, drive sales, and build new capabilities. Drive insights in critical Consumer Goods domains such as Revenue Growth Management Pricing Analytics and Pricing Optimization Promotion Analytics and Promotion Optimization SKU Rationalization/ Portfolio Optimization Price Pack Architecture Decomposition Models Time Series Forecasting Professional & Technical Skills: Proficiency in AI and analytics solutions (descriptive, diagnostic, predictive, prescriptive, generative). Expertise in delivering large scale projects/programs for Consumer Goods clients on Revenue Growth Management - Pricing Analytics, Promotion Analytics, Portfolio Optimization, etc. Deep and clear understanding of typical data sources used in RGM programs POS, Syndicated, Shipment, Finance, Promotion Calendar, etc. Strong programming skills in Python, R, PySpark, SQL, and experience with cloud platforms (Azure, AWS, GCP) and proficient in using services like Databricks and Sagemaker. Deep knowledge of traditional and advanced machine learning techniques, including deep learning. Experience with optimization techniques (linear, nonlinear, evolutionary methods). Familiarity with visualization tools like Power BI, Tableau. Experience with Large Language Models (LLMs) like ChatGPT, Llama 2. Certifications in Data Science or related fields. Additional Information: The ideal candidate has a strong educational background in data science and a proven track record in delivering impactful AI solutions in the Consumer Goods sector. This position offers opportunities to lead innovative projects and collaborate with global teams. Join Accenture to leverage cutting-edge technologies and deliver transformative business outcomes. About Our Company | AccentureQualification Experience: Minimum 7-9 years of experience in data science, particularly in the Consumer Goods sector Educational Qualification: Bachelors or Masters degree in Statistics, Economics, Mathematics, Computer Science, or MBA (Data Science specialization preferred)

Posted 2 days ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Pune

Work from Office

Naukri logo

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 2 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Engage in continuous learning to stay updated with industry trends and technologies.- Assist in the documentation of data platform processes and best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Good To Have Skills: Experience with cloud data services and data warehousing solutions.- Strong understanding of data integration techniques and ETL processes.- Familiarity with data modeling concepts and practices.- Experience in working with big data technologies and frameworks. Additional Information:- The candidate should have minimum 2 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 days ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title - S&C GN AI - Insurance AI Generalist- Consultant Management Level: 9-Team Lead/Consultant Location: Bengaluru, BDC7C Must-have skills: Generative AI Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills. Job Summary : This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. WHATS IN IT FOR YOU Accenture Global Network is a unified powerhouse that combines the capabilities of Strategy & Consulting with the force multipliers of Data and Artificial Intelligence. It is central to Accenture's future growth and Accenture is deeply invested in providing individuals with continuous opportunities for learning and growth. Youll be part of a diverse, vibrant, global community, continually pushing the boundaries of business capabilities. Accenture is also ranked 10th* on the 2023 Worlds Best Workplaces list, making it a great place to work. What you would do in this role Help the team architect, design, build, deploy, deliver, and monitor advanced analytics models including GenAI, for different client problems Develop functional aspects of Generative AI pipelines, such as information retrieval, storage techniques, and system optimizations across services for a chosen cloud platform such as Azure or AWS Interface with clients/account team to understand engineering/business problems and translate into analytics problems that shall deliver insights for action and operational improvements consume data from multiple sources and present relevant information in a crisp and digestible manner that delivers valuable insights to both technical and non-technical audiences Professional & Technical Skills: - Relevant experience in the required domain. - Strong analytical, problem-solving, and communication skills. - Ability to work in a fast-paced, dynamic environment. Who we are looking for 5+ years experience in data-driven techniques including exploratory data analysis, data pre-processing, machine learning, and visualization to solve business problems Bachelor's/Masters degree in Mathematics, Statistics, Economics , Computer Science, or related field Solid foundation in Statistical Modeling, Machine Learning algorithms Advanced proficiency in programming languages such as Python, PySpark, SQL, Scala Experience implementing AI solutions for Insurance industry Experience in production-grade integration of AI/ML pipelines either upstream (data modeling, engineering, management) or downstream (ML Ops in cloud or on-prem, UI integration with APIs, Visualization in Tableau/PowerBI) required; experience in full-stack implementations preferred Strong communication, collaboration and presentation skills to effectively convey complex data insights and recommendations to clients and stakeholders Hands-on experience with Azure, AWS or Databricks tools is a plus Familiarity with GenAI, LLMs, RAG architecture and Lang chain frameworks is a plus Additional Information: - Opportunity to work on innovative projects. - Career growth and leadership exposure. About Our Company | Accenture Qualification Experience: 8-10Years Educational Qualification: Any Degree

Posted 2 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : PySpark Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : BE Project Role :Software Development Engineer Project Role Description :Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have Skills :PySparkGood to Have Skills :No Industry SpecializationJob :Key Responsibilities :Overall 8 years of experience working in Data Analytics projects, Work on client projects to deliver AWS, PySpark, Databricks based Data engineering Analytics solutions Build and operate very large data warehouses or data lakes ETL optimization, designing, coding, tuning big data processes using Apache Spark Build data pipelines applications to stream and process datasets at low latencies Show efficiency in handling data - tracking data lineage, ensuring data quality, and improving discoverability of data. Technical Experience :Minimum of 2 years of experience in Databricks engineering solutions on any of the Cloud platforms using PySpark Minimum of 5 years of experience years of experience in ETL, Big Data/Hadoop and data warehouse architecture delivery Minimum 3 year of Experience in one or more programming languages Python, Java, Scala Experience using airflow for the data pipelines in min 1 project 2 years of experience developing CICD pipelines using GIT, Jenkins, Docker, Kubernetes, Shell Scripting, Terraform. Must be able to understand ETL technologies and translate into Cloud (AWS, Azure, Google Cloud) native tools or Pyspark. Professional Attributes :1 Should have involved in data engineering project from requirements phase to delivery 2 Good communication skill to interact with client and understand the requirement 3 Should have capability to work independently and guide the team. Educational Qualification:Additional Info : Qualification BE

Posted 2 days ago

Apply

2.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Assist with the data platform blueprint and design.- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.- Develop and maintain data platform components.- Contribute to the overall success of the project. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of statistical analysis and machine learning algorithms.- Experience with data visualization tools such as Tableau or Power BI.- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information:- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 days ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Chennai

Work from Office

Naukri logo

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the organization. Roles & Responsibilities:- Expected to be an SME, collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Assist with the data platform blueprint and design.- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.- Develop and maintain data platform components.- Contribute to the overall success of the organization. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of statistical analysis and machine learning algorithms.- Experience with data visualization tools such as Tableau or Power BI.- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information:- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Chennai office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 days ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies