3786 Hadoop Jobs - Page 29

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

3.0 - 8.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Need Databricks resource with Azure cloud experience
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Collaborate with data architects and analysts to design scalable data solutions
- Implement best practices for data governance and security throughout the data lifecycle

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform
- Good To Have Skills: Experience with Business Agility
- Strong understanding of data modeling and database design principles
- Experience with data integration tools and ETL processes (see the sketch below)
- Familiarity with cloud platforms and services related to data storage and processing

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Pune office
- A 15 years full time education is required

Qualification: 15 years full time education
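This role centres on building ETL pipelines with PySpark on Databricks. As a rough illustration only, here is a minimal extract-transform-load sketch; the file paths, column names, and table layout are hypothetical assumptions, not part of the job description.

```python
# Minimal ETL sketch (hypothetical paths and columns; illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

# Extract: read a raw CSV landing zone.
raw = spark.read.option("header", True).csv("/mnt/raw/sales/")

# Transform: deduplicate, enforce types, and drop invalid rows (basic data quality).
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Load: write a curated, partitioned copy for downstream consumers.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/sales/")
```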

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Architecture Principles
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on various data-related tasks and collaborating with teams to optimize data processes.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Develop innovative data solutions to meet business requirements
- Optimize data pipelines for efficiency and scalability
- Implement data governance policies to ensure data quality and security

Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Architecture Principles
- Strong understanding of data modeling and database design
- Experience with ETL tools and processes
- Knowledge of cloud platforms and big data technologies
- Good To Have Skills: Data management and governance expertise

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education

Posted 1 week ago

Apply

3.0 - 5.0 years

15 - 20 Lacs

Gurugram

Work from Office

Qualifications for Data Engineer:
- 3+ years of experience in building and optimizing big data solutions required to fulfil business and technology requirements
- 4+ years of technical expertise in design and implementation using big data technology: Hadoop, Hive, Spark, Python/Java
- Strong analytical skills to understand and create solutions for business use cases
- Ensure best practices to implement data governance principles and data quality checks on each data layer
- A successful history of manipulating, processing and extracting value from large, disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable big data stores
- Strong project management and organizational skills
- Experience supporting and working with cross-functional teams in a dynamic environment

We are looking for a candidate with 4+ years of experience in a Data Engineer role who holds a graduate degree in B.Tech/B.E. They should also have experience with the following software/tools:
- Big data: Hadoop, MapReduce, Hive, Spark, Kafka, Airflow, etc.
- Relational SQL and NoSQL databases: MySQL, Postgres, MongoDB, HBase, Cassandra, etc.
- Cloud data platforms: AWS, Azure HDInsight, GCP, CDP
- Real-time data processing: Storm, Spark Streaming, etc. (see the sketch below)
- Object-oriented/object-function scripting languages: Java, Python, Scala, etc.

If interested, kindly fill in the Google form given below: amulyavaish@paisabazaar.com
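Since the role calls out Kafka and Spark Streaming for real-time processing, here is a minimal Spark Structured Streaming sketch of that pattern; the broker address, topic name, and output paths are hypothetical, and it assumes the spark-sql-kafka connector is available on the cluster.

```python
# Minimal Kafka -> Spark Structured Streaming sketch (hypothetical broker/topic/paths).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Read a continuous stream of events from a Kafka topic.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka values arrive as bytes; cast to string for downstream parsing.
parsed = events.select(F.col("value").cast("string").alias("payload"))

# Persist the stream with checkpointing so the job can recover after failures.
query = (
    parsed.writeStream.format("parquet")
          .option("path", "/data/events/")
          .option("checkpointLocation", "/data/checkpoints/events/")
          .start()
)
query.awaitTermination()
```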

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: PySpark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Engineering graduate, preferably Computer Science graduate; 15 years of full time education

Summary: As a Software Development Engineer, you will be responsible for analyzing, designing, coding, and testing multiple components of application code using PySpark. Your typical day will involve performing maintenance, enhancements, and/or development work for one or more clients in Chennai.

Roles & Responsibilities:
- Design, develop, and maintain PySpark applications for one or more clients
- Analyze and troubleshoot complex issues in PySpark applications and provide solutions
- Collaborate with cross-functional teams to ensure timely delivery of high-quality software solutions
- Participate in code reviews and ensure adherence to coding standards and best practices
- Stay updated with the latest advancements in PySpark and related technologies

Professional & Technical Skills:
- Must Have Skills: Strong experience in PySpark
- Good To Have Skills: Experience in Big Data technologies such as Hadoop, Hive, and HBase
- Experience in designing and developing distributed systems using PySpark (see the sketch below)
- Strong understanding of data structures, algorithms, and software design principles
- Experience in working with SQL and NoSQL databases
- Experience in working with version control systems such as Git

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality software solutions
- This position is based at our Bangalore, Hyderabad, Chennai and Pune offices
- Mandatory office attendance (RTO) for 2-3 days a week, working on 2 shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST)

Qualification: Engineering graduate, preferably Computer Science graduate; 15 years of full time education
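For context on the distributed PySpark work this role describes, the sketch below joins two datasets and aggregates them with DataFrame operations; the table paths and columns are invented for illustration only.

```python
# Minimal PySpark join + aggregation sketch (invented tables/columns; illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("order_metrics").getOrCreate()

orders = spark.read.parquet("/data/orders/")        # order_id, customer_id, amount
customers = spark.read.parquet("/data/customers/")  # customer_id, region

# Join the two datasets and compute per-region revenue and distinct order counts.
metrics = (
    orders.join(customers, "customer_id", "inner")
          .groupBy("region")
          .agg(
              F.sum("amount").alias("revenue"),
              F.countDistinct("order_id").alias("orders"),
          )
)

metrics.write.mode("overwrite").parquet("/data/reports/region_metrics/")
```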

Posted 1 week ago

Apply

1.0 - 3.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Management Level: Ind & Func AI Decision Science Analyst
Location: Bengaluru (Bangalore), Gurugram (Gurgaon), Hyderabad, Chennai
Must-have skills: Programming languages - Python/R; Generative AI; Large Language Models (LLMs); ML libraries such as Scikit-learn, TensorFlow, Torch, LangChain, or the OpenAI API; RAG applications
Good to have skills: Big data technologies such as Spark or Hadoop; AI model explainability (XAI); bias detection and AI ethics; familiarity with Edge AI and deploying models on embedded devices for industrial automation; experience with Reinforcement Learning (RL) and AI-driven optimization techniques

Job Summary
We are looking for a Data Scientist / AI Specialist with 1-3 years of experience to join our team and work on client projects in the Automotive & Industrial sectors. This role will involve leveraging traditional Machine Learning (ML), Generative AI (GenAI), Agentic AI, and Autonomous AI Systems to drive innovation, optimize processes, and enhance decision-making in complex industrial environments. Prior experience in the Auto/Industrial industry is a plus, but we welcome candidates from any domain with a strong analytical mindset and a passion for applying AI to real-world business challenges.

Roles & Responsibilities:
- Develop, deploy and monitor AI/ML models in production environments and enterprise systems, including predictive analytics, anomaly detection, and process optimization for clients.
- Work with Generative AI models (e.g., GPT, Stable Diffusion, DALL-E) for applications such as content generation, automated documentation, code synthesis, and intelligent assistants.
- Implement Agentic AI systems, including AI-powered automation, self-learning agents, and decision-support systems for industrial applications.
- Design and build Autonomous AI solutions for tasks like predictive maintenance, supply chain optimization, and robotic process automation (RPA).
- Work with structured and unstructured data from various sources, including IoT sensors, manufacturing logs, and customer interactions.
- Optimize and fine-tune LLMs (Large Language Models) for specific business applications, ensuring ethical and explainable AI use.
- Utilize MLOps and AI orchestration tools to streamline model deployment, monitoring, and retraining cycles.
- Collaborate with cross-functional teams, including engineers, business analysts, and domain experts, to align AI solutions with business objectives.
- Stay updated with cutting-edge AI research in Generative AI, Autonomous AI, and Multi-Agent Systems.

Professional & Technical Skills:
- 1-3 years of experience in Data Science, Machine Learning, or AI-related roles.
- Proficiency in Python (preferred) or R, and experience with ML libraries such as Scikit-learn, TensorFlow, Torch, LangChain, or the OpenAI API.
- Strong understanding of Generative AI, Large Language Models (LLMs), and their practical applications.
- Hands-on experience in fine-tuning and deploying foundation models (e.g., OpenAI, Llama, Claude, Gemini, etc.).
- Experience with vector databases (e.g., FAISS, Chroma, Weaviate, Pinecone) for retrieval-augmented generation (RAG) applications (see the sketch below).
- Knowledge of Autonomous AI Agents (e.g., AutoGPT, BabyAGI) and multi-agent orchestration frameworks.
- Experience working with SQL and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP) for AI/ML model deployment.
- Strong problem-solving and analytical thinking abilities.
- Ability to communicate complex AI concepts to technical and non-technical stakeholders.
- Bonus: Experience in Automotive, Industrial, or Manufacturing AI applications (e.g., predictive maintenance, quality inspection, digital twins).

Additional Information:
- Bachelor's/Master's degree in Statistics, Economics, Mathematics, Computer Science or related disciplines with an excellent academic record, or an MBA from a top-tier university.
- Excellent communication and interpersonal skills.

About Our Company | Accenture

Qualification
Experience: Minimum 1-3 years in relevant Data Science, Machine Learning or AI-related roles; exposure to Industrial & Automotive firms or Professional Services.
Educational Qualification: Bachelor's/Master's degree in Statistics, Economics, Mathematics, Computer Science or related disciplines with an excellent academic record, or an MBA from a top-tier university.
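Because the posting highlights vector databases for retrieval-augmented generation (RAG), here is a minimal retrieval sketch using FAISS. The embed() function is a placeholder (random vectors) standing in for a real embedding model, the corpus and question are invented for illustration, and the final LLM call is deliberately omitted.

```python
# Minimal RAG retrieval sketch with FAISS (placeholder embeddings; illustrative only).
import numpy as np
import faiss

def embed(texts):
    # Placeholder embedding: replace with a real model (e.g. an OpenAI or
    # sentence-transformers embedding endpoint) in practice.
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)

corpus = [
    "Warranty claims for model X rose 4% in Q3.",
    "Predictive maintenance reduced unplanned downtime by 12%.",
]

# Index the corpus embeddings for nearest-neighbour search.
vectors = embed(corpus)
index = faiss.IndexFlatL2(vectors.shape[1])
index.add(vectors)

# Retrieve the closest document to the question and build a grounded prompt.
question = "How did predictive maintenance affect downtime?"
_, ids = index.search(embed([question]), 1)
context = corpus[ids[0][0]]
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # This prompt would then be sent to an LLM of choice.
```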

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

You will be responsible for:
- Identifying operational improvements and finding solutions by applying CI tools and techniques
- Completing tasks and transactions within agreed KPIs
- Knowing and applying fundamental work theories/concepts/processes in own areas of work
- Engaging with business & functional partners to understand business priorities, asking relevant questions and scoping them into an analytical solution document calling out how the application of data science will improve decision making
- In-depth understanding of techniques to prepare the analytical data set leveraging multiple complex data sources
- Building statistical models and ML algorithms with practitioner-level competency
- Writing structured, modularised & codified algorithms using Continuous Improvement principles (development of knowledge assets and reusable modules on GitHub, Wiki, etc.) with expert competency
- Building an easy visualisation layer on top of the algorithms to empower end-users to take decisions - this could be on a visualisation platform (Tableau / Python) or through a recommendation set delivered through PPTs
- Working with the line manager to ensure application / consumption, and also thinking beyond the immediate ask to spot opportunities to address bigger business questions (if any)

You will need:
- 1-2 years of experience in data science application in Retail or CPG
- Functional experience (preferred): Marketing, Supply Chain, Customer, Merchandising, Operations, Finance or Digital
- Applied Math: Applied Statistics, Design of Experiments, Linear & Logistic Regression, Decision Trees, Forecasting, Optimization algorithms
- Tech: SQL, Hadoop, Python, Tableau, MS Excel, MS PowerPoint
- Soft Skills: Analytical Thinking & Problem Solving, Storyboarding

What's in it for you?
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on the current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
- Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
- Performance Bonus - Opportunity to earn an additional compensation bonus based on performance, paid annually.
- Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents including parents or in-laws.
- Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

About Us
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single-entity traditional shared services organisation in Bengaluru, India (from 2004) to a global, purpose-driven, solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the role
Enable data-driven decision making across the Tesco business globally by developing analytics solutions using a combination of math, tech and business knowledge.

You will be responsible for:
- Identifying operational improvements and finding solutions by applying CI tools and techniques
- Completing tasks and transactions within agreed KPIs
- Knowing and applying fundamental work theories/concepts/processes in own areas of work
- Engaging with business & functional partners to understand business priorities, asking relevant questions and scoping them into an analytical solution document calling out how the application of data science will improve decision making
- In-depth understanding of techniques to prepare the analytical data set leveraging multiple complex data sources
- Building statistical models and ML algorithms with practitioner-level competency
- Writing structured, modularised & codified algorithms using Continuous Improvement principles (development of knowledge assets and reusable modules on GitHub, Wiki, etc.) with expert competency
- Building an easy visualisation layer on top of the algorithms to empower end-users to take decisions - this could be on a visualisation platform (Tableau / Python) or through a recommendation set delivered through PPTs
- Working with the line manager to ensure application / consumption, and also thinking beyond the immediate ask to spot opportunities to address bigger business questions (if any)

You will need:
- 1-2 years of experience in data science application in Retail or CPG
- Functional experience (preferred): Marketing, Supply Chain, Customer, Merchandising, Operations, Finance or Digital
- Applied Math: Applied Statistics, Design of Experiments, Linear & Logistic Regression, Decision Trees, Forecasting, Optimization algorithms
- Tech: SQL, Hadoop, Python, Tableau, MS Excel, MS PowerPoint
- Soft Skills: Analytical Thinking & Problem Solving, Storyboarding

What's in it for you?
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on the current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
- Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
- Performance Bonus - Opportunity to earn an additional compensation bonus based on performance, paid annually.
- Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents including parents or in-laws.
- Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

About Us
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single-entity traditional shared services organisation in Bengaluru, India (from 2004) to a global, purpose-driven, solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.

Posted 1 week ago

Apply

5.0 - 8.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Job Title: Sales Excellence - COE - Data Engineering Specialist
Management Level: 9 - Team Lead/Consultant
Location: Mumbai, MDC2C
Must-have skills: Sales
Good to have skills: Data Science, SQL, Automation, Machine Learning

Job Summary: Apply deep statistical tools and techniques to find relationships between variables.

Roles & Responsibilities:
- Apply deep statistical tools and techniques to find relationships between variables.
- Develop intellectual property for analytical methodologies and optimization techniques.
- Identify data requirements and develop analytic solutions to solve business issues.

Job Title: Analytics & Modelling Specialist
Management Level: 9 - Specialist
Location: Bangalore / Gurgaon / Hyderabad / Mumbai
Must-have skills: Python, Data Analysis, Data Visualization, SQL
Good to have skills: Machine Learning

Job Summary: The Center of Excellence (COE) makes sure that the sales and pricing methods and offerings of Sales Excellence are effective. The COE supports salespeople through its business partners and Analytics and Sales Operations teams. The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization.

Roles & Responsibilities:
- Build and manage data models that bring together data from different sources.
- Help consolidate and cleanse data for use by the modeling and development teams.
- Structure data for use in analytics applications.
- Lead a team of Data Engineers effectively.

Professional & Technical Skills:
- A bachelor's degree or equivalent
- Total experience range: 5-8 years in the relevant field
- A minimum of 3 years of GCP experience, with exposure to machine learning/data science
- Experience in configuring machine learning workflows in GCP
- A minimum of 5 years of advanced SQL knowledge and experience working with relational databases
- A minimum of 3 years of familiarity and hands-on experience with different SQL objects such as stored procedures, functions, views, etc.
- A minimum of 3 years of building data flow components and processing systems to extract, transform, load and integrate data from various sources
- A minimum of 3 years of hands-on experience with advanced Excel topics such as cube functions, VBA automation, Power Pivot, etc.
- A minimum of 3 years of hands-on experience in Python

Additional Information:
- Understanding of sales processes and systems.
- Master's degree in a technical field.
- Experience with quality assurance processes.
- Experience in project management.

You May Also Need:
- Ability to work flexible hours according to business needs.
- Must have good internet connectivity and a distraction-free environment for working at home, in accordance with local guidelines.

Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Additional Information:
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.

About Our Company | Accenture

Qualification
Experience: 8 to 10 Years
Educational Qualification: B.Com

Posted 1 week ago

Apply

5.0 - 10.0 years

22 - 25 Lacs

Mumbai

Work from Office

Candidate Skills: Data Engineering | ETL | SQL | Python | AWS | Azure | Google Cloud | Hadoop | Spark | Kafka | Data Warehousing | Data Modeling | NoSQL | Data Quality

We are looking for an experienced Data Engineer to join our team in Mumbai. As a Data Engineer, you will be responsible for designing, building, and maintaining efficient data pipelines that transform raw data into actionable insights. You will work closely with data scientists and analysts to ensure that data is accessible, reliable, and optimized for analysis. Your role will also involve handling large datasets, ensuring data quality, and implementing data processing frameworks.

Key Responsibilities:
- Design and build scalable data pipelines for processing, transforming, and integrating large datasets.
- Develop and maintain ETL processes to extract, transform, and load data from multiple sources into the data warehouse.
- Collaborate with data scientists and analysts to ensure that the data is optimized for analysis and modeling.
- Ensure the quality, integrity, and security of data throughout its lifecycle (see the sketch below).
- Work with cloud-based technologies for data storage and processing (AWS, Azure, GCP).
- Implement data processing frameworks for efficient handling of structured and unstructured data.
- Troubleshoot and resolve issues related to data pipelines and workflows.
- Automate data integration processes and ensure data consistency and accuracy across systems.

Required Skills:
- 5+ years of experience in Data Engineering with hands-on experience in data pipeline development.
- Strong expertise in ETL processes, data integration, and data warehousing.
- Proficiency in SQL, Python, and other programming languages for data manipulation.
- Experience with cloud technologies such as AWS, Azure, or Google Cloud.
- Knowledge of big data technologies like Hadoop, Spark, or Kafka is a plus.
- Strong understanding of data modeling, data quality, and data governance.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) and relational databases.
- Strong analytical and problem-solving skills with the ability to work with large, complex datasets.
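The responsibilities above emphasise data quality checks within the pipeline. Below is a minimal sketch of such a gate in PySpark; the table path, columns, and thresholds are hypothetical examples, not a prescribed standard.

```python
# Minimal data-quality gate sketch (hypothetical path, columns, and thresholds).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("/mnt/curated/orders/")

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
duplicates = total - df.dropDuplicates(["order_id"]).count()

# Fail the pipeline run early if basic integrity rules are violated.
assert null_keys == 0, f"{null_keys} rows have a null order_id"
assert duplicates / max(total, 1) < 0.01, f"too many duplicates: {duplicates}/{total}"
print("data quality checks passed")
```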

Posted 1 week ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Navi Mumbai

Work from Office

Skill required: Delivery - Warranty Management
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do?
Data & AI: Define warranty offerings; run outsourced after-sales warranty support and entitlement programs; evaluate customer feedback and planned versus actual costs of warranty coverage; use warranty data analytics to reduce cost and improve product quality; increase recoveries from suppliers; and design and deploy warranty solutions.

What are we looking for?
- Warranty Analytics
- Automotive Warranty
- Scripting
- Data Analysis & Interpretation
- Business Intelligence
- Commitment to quality
- Adaptable and flexible
- Agility for quick learning
- Ability to work well in a team
- Written and verbal communication
- Data Engineering/SQL
- Databricks ML

Roles and Responsibilities:
- In this role you are required to analyse and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture.
- You are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions that are made by you impact your own work and may impact the work of others.
- In this role you would be an individual contributor and/or oversee a small work effort and/or team.

Qualification: Any Graduation

Posted 1 week ago

Apply

7.0 - 11.0 years

11 - 15 Lacs

Kolkata

Work from Office

Skill required: Tech for Operations - Tech Solution Architecture
Designation: Solution Architecture Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do?
In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. Join our dynamic Service Supply Chain (SSC) team and be at the forefront of helping world-class organizations unlock their full potential. Imagine a career where your innovative work makes a real impact, and every day brings new challenges and opportunities for growth. We're on the lookout for passionate, talented individuals ready to make a difference. If you're eager to shape the future and drive success, this is your chance. Join us now and let's build something extraordinary together!

The Technical Solution Architect I is responsible for evaluating an organization's business needs and determining how IT can support those needs leveraging software like Azure and Salesforce. Aligning IT strategies with business goals has become paramount, and a solutions architect can help determine, develop, and improve technical solutions in support of business goals. The Technical Solution Architect I also bridges communication between IT and business operations to ensure everyone is aligned in developing and implementing technical solutions for business problems. The process requires regular feedback, adjustments, and problem solving in order to properly design and implement potential solutions. To be successful as a Technical Solution Architect I, you should have excellent technical, analytical, and project management skills.

What are we looking for?
- Minimum of 5 years of IT experience
- Minimum of 1 year of experience in solution architecture
- Minimum of 1 year of enterprise-scale project delivery experience
- Microsoft Azure Cloud Services
- Microsoft Azure Data Factory
- Microsoft Azure Databricks
- Microsoft Azure DevOps
- Written and verbal communication
- Ability to establish strong client relationships
- Problem-solving skills
- Strong analytical skills
- Expert knowledge of Azure Cloud Services
- Experience with Azure data platforms (Logic Apps, Service Bus, Databricks, Data Factory, Azure Integration Services)
- CI/CD and version-control experience using Azure DevOps
- Python programming
- Knowledge of both traditional and modern data architecture and processing concepts, including relational databases, data warehousing, and business analytics (e.g., NoSQL, SQL Server, Oracle, Hadoop, Spark, KNIME)
- Good understanding of security processes, best practices, standards and issues involved in multi-tier cloud or hybrid applications
- Proficiency in both high-level and low-level design to build an architecture using customization or configuration on Salesforce Service Cloud, Field Service Lightning, Apex, Visualforce, Lightning, and Community
- Expertise in designing and building real-time/batch integrations between Salesforce and other systems
- Design of Apex and Lightning frameworks, including Lightning patterns, error-logging frameworks, etc.

Roles and Responsibilities:
- Meet with clients to understand their needs (lead architect assessment meetings) and determine gaps between those needs and technical functionality.
- Communicate with key stakeholders across different stages of the Software Development Life Cycle.
- Work on creating the high-level design and lead architectural decisions.
- Interact with clients to create end-to-end specifications for Azure & Salesforce cloud solutions.
- Provide clarification and answer any questions regarding the solution architecture.
- Lead the development of custom enterprise solutions.
- Responsible for application architecture, ensuring high performance, scalability, and availability for those applications.
- Responsible for overall data architecture, modeling, and related standards enforced throughout the enterprise ecosystem, including data, master data, and metadata, processes, governance, and change control.
- Unify the data architecture used within all applications and identify appropriate systems of record, reference, and management.
- Share engagement experience with internal audiences and enrich collective IP.
- Conduct architecture workshops and other enablement sessions.

Qualification: Any Graduation

Posted 1 week ago

Apply

16.0 - 21.0 years

4 - 8 Lacs

Kolkata

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration, Ansible on Microsoft Azure
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 16 years full time education

Cloud Database Engineer HANA

Required Skills:
- SAP HANA database administration
- Knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability
- Proficiency in monitoring and maintaining the health and performance of high availability systems
- Experience with public cloud platforms such as GCP, AWS, or Azure
- Strong troubleshooting skills and the ability to provide effective resolutions for technical issues

Desired Skills:
- Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop or Postgres
- Growth and product mindset and a strong focus on automation
- Working knowledge of Kubernetes for container orchestration and scalability

Activities:
- Collaborate closely with cross-functional teams to gather requirements and support SAP teams to execute database initiatives.
- Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments.
- Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed.
- Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions to unblock our partners.

Requirements:
- Bachelor's degree in computer science, engineering, or a related field.
- Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems.
- Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes.
- Troubleshooting skills and the ability to provide effective resolutions for technical issues.
- Familiarity with public cloud platforms such as GCP, AWS, or Azure.
- Understands Agile principles and methodologies.

Qualification: 16 years full time education

Posted 1 week ago

Apply

3.0 - 6.0 years

16 - 20 Lacs

Gurugram

Work from Office

Management Level: Ind & Func AI Decision Science Consultant
Location: Bengaluru (Bangalore), Gurugram (Gurgaon), Hyderabad, Chennai
Must-have skills: Programming languages - Python/R; Generative AI; Large Language Models (LLMs); ML libraries such as Scikit-learn, TensorFlow, Torch, LangChain, or the OpenAI API; RAG applications
Good to have skills: Big data technologies such as Spark or Hadoop; AI model explainability (XAI); bias detection and AI ethics; familiarity with Edge AI and deploying models on embedded devices for industrial automation; experience with Reinforcement Learning (RL) and AI-driven optimization techniques

Job Summary
We are looking for a Data Scientist / AI Specialist with 3-6 years of experience to join our team and work on client projects in the Automotive & Industrial sectors. This role will involve leveraging traditional Machine Learning (ML), Generative AI (GenAI), Agentic AI, and Autonomous AI Systems to drive innovation, optimize processes, and enhance decision-making in complex industrial environments. Prior experience in the Auto/Industrial industry is a plus, but we welcome candidates from any domain with a strong analytical mindset and a passion for applying AI to real-world business challenges.

Roles & Responsibilities:
- Develop, deploy and monitor AI/ML models in production environments and enterprise systems, including predictive analytics, anomaly detection, and process optimization for clients.
- Work with Generative AI models (e.g., GPT, Stable Diffusion, DALL-E) for applications such as content generation, automated documentation, code synthesis, and intelligent assistants.
- Implement Agentic AI systems, including AI-powered automation, self-learning agents, and decision-support systems for industrial applications.
- Design and build Autonomous AI solutions for tasks like predictive maintenance, supply chain optimization, and robotic process automation (RPA).
- Work with structured and unstructured data from various sources, including IoT sensors, manufacturing logs, and customer interactions.
- Optimize and fine-tune LLMs (Large Language Models) for specific business applications, ensuring ethical and explainable AI use.
- Utilize MLOps and AI orchestration tools to streamline model deployment, monitoring, and retraining cycles.
- Collaborate with cross-functional teams, including engineers, business analysts, and domain experts, to align AI solutions with business objectives.
- Stay updated with cutting-edge AI research in Generative AI, Autonomous AI, and Multi-Agent Systems.

Professional & Technical Skills:
- 3-6 years of experience in Data Science, Machine Learning, or AI-related roles.
- Proficiency in Python (preferred) or R, and experience with ML libraries such as Scikit-learn, TensorFlow, Torch, LangChain, or the OpenAI API.
- Strong understanding of Generative AI, Large Language Models (LLMs), and their practical applications.
- Hands-on experience in fine-tuning and deploying foundation models (e.g., OpenAI, Llama, Claude, Gemini, etc.).
- Experience with vector databases (e.g., FAISS, Chroma, Weaviate, Pinecone) for retrieval-augmented generation (RAG) applications.
- Knowledge of Autonomous AI Agents (e.g., AutoGPT, BabyAGI) and multi-agent orchestration frameworks.
- Experience working with SQL and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP) for AI/ML model deployment.
- Strong problem-solving and analytical thinking abilities.
- Ability to communicate complex AI concepts to technical and non-technical stakeholders.
- Bonus: Experience in Automotive, Industrial, or Manufacturing AI applications (e.g., predictive maintenance, quality inspection, digital twins).

Additional Information:
- Bachelor's/Master's degree in Statistics, Economics, Mathematics, Computer Science or related disciplines with an excellent academic record, or an MBA from a top-tier university.
- Excellent communication and interpersonal skills.

About Our Company | Accenture

Qualification
Experience: Minimum 3-6 years in relevant Data Science, Machine Learning or AI-related roles; exposure to Industrial & Automotive firms or Professional Services.
Educational Qualification: Bachelor's/Master's degree in Statistics, Economics, Mathematics, Computer Science or related disciplines with an excellent academic record, or an MBA from a top-tier university.

Posted 1 week ago

Apply

2.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Title: S&C Global Network - AI - CMT DE - Consultant
Management Level: 9 - Consultant
Location: Open
Must-have skills: Data Engineering
Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.

Job Summary:
We are looking for a passionate and results-driven Data Engineer to join our growing data team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support data-driven decision-making across the organization.

Roles & Responsibilities:
- Design, build, and maintain robust, scalable, and efficient data pipelines (ETL/ELT).
- Work with structured and unstructured data across a wide variety of data sources.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Optimize data systems and architecture for performance, scalability, and reliability.
- Monitor data quality and support initiatives to ensure clean, accurate, and consistent data.
- Develop and maintain data models and metadata.
- Implement and maintain best practices in data governance, security, and compliance.

Professional & Technical Skills:
- 2+ years in data engineering or related fields.
- Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Strong programming skills in Python, Scala, or Java.
- Experience with big data technologies such as Spark, Hadoop, or Hive.
- Familiarity with cloud platforms like AWS, Azure, or GCP, especially services like S3, Redshift, BigQuery, or Azure Data Lake.
- Experience with orchestration tools like Airflow, Luigi, or similar (see the sketch below).
- Solid understanding of data warehousing concepts and data modeling techniques.
- Good problem-solving skills and attention to detail.
- Experience with modern data stack tools like dbt, Snowflake, or Databricks.
- Knowledge of CI/CD pipelines and version control (e.g., Git).
- Exposure to containerization (Docker, Kubernetes) and infrastructure as code (Terraform, CloudFormation).

Additional Information:
- The ideal candidate will possess a strong educational background in a quantitative discipline and experience working with Hi-Tech clients.
- This position is based at our Bengaluru office (preferred) and other Accenture AI locations.

About Our Company | Accenture

Qualification
Experience: 4+ years
Educational Qualification: B.Tech/BE
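As a rough illustration of the orchestration work mentioned above, here is a minimal Airflow DAG that chains extract, transform, and load steps; it assumes Airflow 2.4+, and the DAG id, schedule, and task bodies are invented placeholders rather than any client pipeline.

```python
# Minimal Airflow DAG sketch (hypothetical DAG id, schedule, and task bodies).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source systems")

def transform():
    print("cleanse and model the data")

def load():
    print("publish curated tables to the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # 'schedule' requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```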

Posted 1 week ago

Apply

3.0 - 6.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Management Level: Ind & Func AI Decision Science Consultant
Location: Bengaluru (Bangalore), Gurugram (Gurgaon), Hyderabad, Chennai
Must-have skills: Programming languages - Python/R; Generative AI; Large Language Models (LLMs); ML libraries such as Scikit-learn, TensorFlow, Torch, LangChain, or the OpenAI API; RAG applications
Good to have skills: Big data technologies such as Spark or Hadoop; AI model explainability (XAI); bias detection and AI ethics; familiarity with Edge AI and deploying models on embedded devices for industrial automation; experience with Reinforcement Learning (RL) and AI-driven optimization techniques

Job Summary
We are looking for a Data Scientist / AI Specialist with 3-6 years of experience to join our team and work on client projects in the Automotive & Industrial sectors. This role will involve leveraging traditional Machine Learning (ML), Generative AI (GenAI), Agentic AI, and Autonomous AI Systems to drive innovation, optimize processes, and enhance decision-making in complex industrial environments. Prior experience in the Auto/Industrial industry is a plus, but we welcome candidates from any domain with a strong analytical mindset and a passion for applying AI to real-world business challenges.

Roles & Responsibilities:
- Develop, deploy and monitor AI/ML models in production environments and enterprise systems, including predictive analytics, anomaly detection, and process optimization for clients.
- Work with Generative AI models (e.g., GPT, Stable Diffusion, DALL-E) for applications such as content generation, automated documentation, code synthesis, and intelligent assistants.
- Implement Agentic AI systems, including AI-powered automation, self-learning agents, and decision-support systems for industrial applications.
- Design and build Autonomous AI solutions for tasks like predictive maintenance, supply chain optimization, and robotic process automation (RPA).
- Work with structured and unstructured data from various sources, including IoT sensors, manufacturing logs, and customer interactions.
- Optimize and fine-tune LLMs (Large Language Models) for specific business applications, ensuring ethical and explainable AI use.
- Utilize MLOps and AI orchestration tools to streamline model deployment, monitoring, and retraining cycles.
- Collaborate with cross-functional teams, including engineers, business analysts, and domain experts, to align AI solutions with business objectives.
- Stay updated with cutting-edge AI research in Generative AI, Autonomous AI, and Multi-Agent Systems.

Professional & Technical Skills:
- 3-6 years of experience in Data Science, Machine Learning, or AI-related roles.
- Proficiency in Python (preferred) or R, and experience with ML libraries such as Scikit-learn, TensorFlow, Torch, LangChain, or the OpenAI API.
- Strong understanding of Generative AI, Large Language Models (LLMs), and their practical applications.
- Hands-on experience in fine-tuning and deploying foundation models (e.g., OpenAI, Llama, Claude, Gemini, etc.).
- Experience with vector databases (e.g., FAISS, Chroma, Weaviate, Pinecone) for retrieval-augmented generation (RAG) applications.
- Knowledge of Autonomous AI Agents (e.g., AutoGPT, BabyAGI) and multi-agent orchestration frameworks.
- Experience working with SQL and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP) for AI/ML model deployment.
- Strong problem-solving and analytical thinking abilities.
- Ability to communicate complex AI concepts to technical and non-technical stakeholders.
- Bonus: Experience in Automotive, Industrial, or Manufacturing AI applications (e.g., predictive maintenance, quality inspection, digital twins).

Additional Information:
- Bachelor's/Master's degree in Statistics, Economics, Mathematics, Computer Science or related disciplines with an excellent academic record, or an MBA from a top-tier university.
- Excellent communication and interpersonal skills.

About Our Company | Accenture

Qualification
Experience: Minimum 3-6 years in relevant Data Science, Machine Learning or AI-related roles; exposure to Industrial & Automotive firms or Professional Services.
Educational Qualification: Bachelor's/Master's degree in Statistics, Economics, Mathematics, Computer Science or related disciplines with an excellent academic record, or an MBA from a top-tier university.

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: PySpark
Good to have skills: NA (No Industry Specialization)
Minimum 5 year(s) of experience is required
Educational Qualification: BE

Key Responsibilities:
- Overall 8 years of experience working on Data Analytics projects.
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- ETL optimization, designing, coding, and tuning big data processes using Apache Spark.
- Build data pipeline applications to stream and process datasets at low latencies.
- Show efficiency in handling data: tracking data lineage, ensuring data quality, and improving discoverability of data.

Technical Experience:
- Minimum of 2 years of experience in Databricks engineering solutions on any of the cloud platforms using PySpark.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop and data warehouse architecture delivery.
- Minimum of 3 years of experience in one or more programming languages: Python, Java, Scala.
- Experience using Airflow for data pipelines in at least 1 project.
- 2 years of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, Shell Scripting, and Terraform.
- Must be able to understand ETL technologies and translate them into cloud-native (AWS, Azure, Google Cloud) tools or PySpark.

Professional Attributes:
1. Should have been involved in data engineering projects from the requirements phase through to delivery.
2. Good communication skills to interact with clients and understand requirements.
3. Should be capable of working independently and guiding the team.

Qualification: BE

Posted 1 week ago

Apply

10.0 - 14.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Project Role: Business Process Architect
Project Role Description: Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. Work closely with the Application Architect to create the process blueprint and establish business process requirements to drive out application requirements and metrics. Assist in quality management reviews, ensure all business and design requirements are met. Educate stakeholders to ensure a complete understanding of the designs.
Must have skills: Broadcasting Media, Data Analytics
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: Specific undergraduate qualifications, i.e. engineering or computer science

Summary: As a Media Broadcast Architect, you will design and implement end-to-end broadcast digital media supply chains. This role involves transitioning business requirements into robust solutions, assessing vendor offerings, and ensuring architecture is reusable, scalable, reliable, and manageable. Your expertise in video/audio codecs, streaming architectures, and production workflows will drive innovation and efficiency.

Roles & Responsibilities:
- Lead the transition of business requirements into effective technical solutions.
- Assess vendor solutions and manage vendor selection processes.
- Design and implement scalable, end-to-end broadcast digital media supply chains.
- Ensure solutions are aligned with roadmaps and architectural principles.
- Collaborate with stakeholders to secure input and adapt solutions as required.
- Utilize AWS media services (Live & VOD) for efficient implementation and troubleshooting.
- Manage video/audio codecs (H.264, H.265, MPEG-4, AAC) and streaming architectures (RTP/UDP, RTSP, HTTP, RTMP).
- Facilitate video delivery through content distribution networks and optimize workflows.
- Provide technical guidance across departments and resolve challenges effectively.

Professional & Technical Skills:
Must-Have Skills:
- Experience in serverless architecture, ECS, networking, SQL/NoSQL databases, and AWS CI/CD pipelines.
- Proficiency in Azure data services, Python, and big data tools (Hadoop, Spark, Kafka).
- Strong knowledge of video/audio codecs, streaming architectures, and DRM solutions.
- Expertise in NFVi architecture and NF lifecycle management.
- Solid understanding of protocols (SIP, REST, TCP/IP, DNS) and IMS nodes (P-CSCF, S-CSCF, TAS, SBC, etc.).
Good-to-Have Skills:
- Cisco certifications (CCNA/CCNP) or cloud certifications (AWS/GCP).
- Proven experience designing solutions with 99.999% reliability.
- Hands-on exposure to leading broadcast platforms like Mediagenix, Telestream iQ, Dalet Flex, and Avid Media Composer.

Additional Information:
- Minimum 10 years of experience in media broadcast disciplines.
- Strong understanding of emerging broadcast technologies and industry trends.
- Excellent verbal, written, and presentation skills.
- Educational qualifications in engineering or computer science.

Qualification: Specific undergraduate qualifications, i.e. engineering or computer science

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Pune

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Collaborate with Integration Architects and Data Architects to design and implement data platform components.- Ensure seamless integration between various systems and data models.- Develop and maintain data platform blueprints.- Provide technical expertise in data platform design and implementation.- Troubleshoot and resolve data platform related issues. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data platform architecture and design principles.- Experience in implementing and optimizing data pipelines.- Knowledge of cloud-based data solutions.- Hands-on experience with data platform security and compliance measures. Additional Information:- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : Engineering graduate preferably Computer Science graduate 15 years of full time education Summary :As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Roles & Responsibilities:- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.- Develop and maintain data pipelines using Databricks Unified Data Analytics Platform.- Design and implement data security and access controls using Databricks Unified Data Analytics Platform.- Troubleshoot and resolve issues related to data platform components using Databricks Unified Data Analytics Platform. Professional & Technical Skills: - Must To Have Skills: Experience with Databricks Unified Data Analytics Platform.- Good To Have Skills: Experience with other big data technologies such as Hadoop, Spark, and Kafka.- Strong understanding of data modeling and database design principles.- Experience with data security and access controls.- Experience with data pipeline development and maintenance.- Experience with troubleshooting and resolving issues related to data platform components. Additional Information:- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.- This position is based at our Bangalore, Hyderabad, Chennai and Pune Offices.- Mandatory office (RTO) for 2- 3 days and have to work on 2 shifts (Shift A- 10:00am to 8:00pm IST and Shift B - 12:30pm to 10:30 pm IST) Qualification Engineering graduate preferably Computer Science graduate 15 years of full time education

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : Engineering graduate preferably Computer Science graduate 15 years of full time education Summary :As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Roles & Responsibilities:- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.- Develop and maintain data pipelines using Databricks Unified Data Analytics Platform.- Design and implement data security and access controls using Databricks Unified Data Analytics Platform.- Troubleshoot and resolve issues related to data platform components using Databricks Unified Data Analytics Platform. Professional & Technical Skills: - Must To Have Skills: Experience with Databricks Unified Data Analytics Platform.- Must To Have Skills: Strong understanding of data modeling and database design principles.- Good To Have Skills: Experience with cloud-based data platforms such as AWS or Azure.- Good To Have Skills: Experience with data security and access controls.- Good To Have Skills: Experience with data pipeline development and maintenance. Additional Information:- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.- This position is based at our Chennai office. Qualification Engineering graduate preferably Computer Science graduate 15 years of full time education

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : Engineering graduate preferably Computer Science graduate 15 years of full time education Summary :As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Roles & Responsibilities:- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.- Develop and maintain data pipelines and ETL processes using Databricks Unified Data Analytics Platform.- Design and implement data security and access controls for the data platform.- Troubleshoot and resolve issues related to the data platform and data pipelines. Professional & Technical Skills: - Must To Have Skills: Experience with Databricks Unified Data Analytics Platform.- Must To Have Skills: Strong understanding of data modeling and database design principles.- Good To Have Skills: Experience with cloud-based data platforms such as AWS or Azure.- Good To Have Skills: Experience with data security and access controls.- Good To Have Skills: Experience with data visualization tools such as Tableau or Power BI. Additional Information:- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.- This position is based at our Bangalore, Hyderabad, Chennai and Pune Offices.- Mandatory office (RTO) for 2- 3 days and have to work on 2 shifts (Shift A- 10:00am to 8:00pm IST and Shift B - 12:30pm to 10:30 pm IST) Qualification Engineering graduate preferably Computer Science graduate 15 years of full time education

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : Engineering graduate preferably Computer Science graduate 15 years of full time education Summary :As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Roles & Responsibilities:- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.- Develop and maintain data pipelines using Databricks Unified Data Analytics Platform.- Design and implement data security and access controls using Databricks Unified Data Analytics Platform.- Troubleshoot and resolve issues related to data platform components using Databricks Unified Data Analytics Platform. Professional & Technical Skills: - Must To Have Skills: Experience with Databricks Unified Data Analytics Platform.- Must To Have Skills: Strong understanding of data platform components and architecture.- Good To Have Skills: Experience with cloud-based data platforms such as AWS or Azure.- Good To Have Skills: Experience with data security and access controls.- Good To Have Skills: Experience with data pipeline development and maintenance. Additional Information:- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.-This position is based at our Bangalore, Hyderabad, Chennai and Pune Offices.- Mandatory office (RTO) for 2- 3 days and have to work on 2 shifts (Shift A- 10:00am to 8:00pm IST and Shift B - 12:30pm to 10:30 pm IST) Qualification Engineering graduate preferably Computer Science graduate 15 years of full time education

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components. Roles & Responsibilities:- Expected to be an SME, collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Lead the implementation of data platform solutions.- Conduct performance tuning and optimization of data platform components. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of cloud-based data platforms.- Experience in designing and implementing data pipelines.- Knowledge of data governance and security best practices. Additional Information:- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : React.jsMinimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to gather requirements, developing application features, and ensuring that the applications align with business objectives. You will also engage in problem-solving activities, providing innovative solutions to enhance application performance and user experience, while maintaining a focus on quality and efficiency throughout the development process. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge.- Continuously assess and improve application performance and user experience. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Good To Have Skills: Experience with React.js.- Strong understanding of data integration and ETL processes.- Experience in application development using various programming languages.- Familiarity with cloud computing concepts and services. Additional Information:- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.- This position is based in Mumbai.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 week ago

Apply

10.0 - 14.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Project Role : Business Process Architect Project Role Description : Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. Work closely with the Application Architect to create the process blueprint and establish business process requirements to drive out application requirements and metrics. Assist in quality management reviews, ensure all business and design requirements are met. Educate stakeholders to ensure a complete understanding of the designs. Must have skills : Broadcasting Media, Digital Advertising, Cloud Migration Planning, Data Analytics Good to have skills : NAMinimum 7.5 year(s) of experience is required Educational Qualification : Specific undergraduate qualifications i.e. engineering, computer science Summary :As a Media Broadcast Architect, you will design and implement end-to-end broadcast digital media supply chains. This role involves transitioning business requirements into robust solutions, assessing vendor offerings, and ensuring architecture is reusable, scalable, reliable, and manageable. Your expertise in video/audio codecs, streaming architectures, and production workflows will drive innovation and efficiency. Roles & Responsibilities:- Lead the transition of business requirements into effective technical solutions.- Assess vendor solutions and manage vendor selection processes.- Design and implement scalable, end-to-end broadcast digital media supply chains.- Ensure solutions are aligned with roadmaps and architectural principles.- Collaborate with stakeholders to secure input and adapt solutions as required.- Utilize AWS media services (Live & VOD) for efficient implementation and troubleshooting.- Manage video/audio codecs (H.264, H.265, MPEG4, AAC) and streaming architectures (RTP/UDP, RTSP, HTTP, RTMP).- Facilitate video delivery through content distribution networks and optimize workflows.- Provide technical guidance across departments and resolve challenges effectively. Professional & Technical Skills: Must-Have Skills: - Experience in serverless architecture, ECS, networking, SQL/NoSQL databases, and AWS CI/CD pipelines.- Proficiency in Azure data services, Python, and big data tools (Hadoop, Spark, Kafka).- Strong knowledge of video/audio codecs, streaming architectures, and DRM solutions.- Expertise in NFVi Architecture and NF Life Cycle Management.- Solid understanding of protocols (SIP, REST, TCP/IP, DNS) and IMS Nodes (P-CSCF, S-CSCF, TAS, SBC, etc.).Good-to-Have Skills: - Cisco certifications (CCNA/CCNP) or cloud certifications (AWS/GCP).- Proven experience designing solutions with 99.999% reliability.- Hands-on exposure to leading broadcast platforms like Mediagenix, Telestream iQ, Dalet Flex, and Avid Media Composer. Additional Information:- Minimum 10 years of experience in media broadcast disciplines.- Strong understanding of emerging broadcast technologies and industry trends.- Excellent verbal, written, and presentation skills.- Educational qualifications in engineering or computer science. Qualification Specific undergraduate qualifications i.e. engineering, computer science

Posted 1 week ago

Apply

Exploring Hadoop Jobs in India

The demand for Hadoop professionals in India has been rising steadily in recent years, as many companies leverage big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, you should understand the job market, salary expectations, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industry and have a high demand for Hadoop professionals.

Average Salary Range

The average salary range for Hadoop professionals in India varies with experience. Entry-level Hadoop developers can expect to earn INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.

Career Path

In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually progressing to roles like Data Architect or Big Data Engineer.

Related Skills

In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
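
As a small illustration of how these skills combine, the sketch below uses PySpark for the classic word-count exercise that is often used to demonstrate Spark fundamentals. The input path is a hypothetical placeholder.

  from pyspark.sql import SparkSession, functions as F

  # Hypothetical example - the input path is a placeholder.
  spark = SparkSession.builder.appName("word_count_sketch").getOrCreate()

  # Each line of the text file arrives in a single column named "value".
  lines = spark.read.text("hdfs:///data/sample.txt")

  word_counts = (
      lines
      .select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
      .where(F.col("word") != "")
      .groupBy("word")
      .count()
      .orderBy(F.col("count").desc())
  )

  word_counts.show(10)

The same exercise can also be written with the lower-level RDD API, but the DataFrame form above is what most teams use in practice.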

Interview Questions

  • What is Hadoop and how does it work? (basic)
  • Explain the difference between HDFS and MapReduce. (medium)
  • How do you handle data skew in Hadoop? (medium)
  • What is YARN in Hadoop? (basic)
  • Describe the concept of NameNode and DataNode in HDFS. (medium)
  • What are the different types of join operations in Hive? (medium)
  • Explain the role of the ResourceManager in YARN. (medium)
  • What is the significance of the shuffle phase in MapReduce? (medium)
  • How does speculative execution work in Hadoop? (advanced)
  • What is the purpose of the Secondary NameNode in HDFS? (medium)
  • How do you optimize a MapReduce job in Hadoop? (medium)
  • Explain the concept of data locality in Hadoop. (basic)
  • What are the differences between Hadoop 1 and Hadoop 2? (medium)
  • How do you troubleshoot performance issues in a Hadoop cluster? (advanced)
  • Describe the advantages of using HBase over traditional RDBMS. (medium)
  • What is the role of the JobTracker in Hadoop? (medium)
  • How do you handle unstructured data in Hadoop? (medium)
  • Explain the concept of partitioning in Hive. (medium) (see the sketch after this list)
  • What is Apache ZooKeeper and how is it used in Hadoop? (advanced)
  • Describe the process of data serialization and deserialization in Hadoop. (medium)
  • How do you secure a Hadoop cluster? (advanced)
  • What is the CAP theorem and how does it relate to distributed systems like Hadoop? (advanced)
  • How do you monitor the health of a Hadoop cluster? (medium)
  • Explain the differences between Hadoop and traditional relational databases. (medium)
  • How do you handle data ingestion in Hadoop? (medium)
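
For the Hive partitioning question above, interviewers usually expect a concrete illustration of how a partitioned table prunes data at query time. The following is a minimal sketch using Spark SQL with Hive support; the table and column names are hypothetical, and a configured Hive metastore is assumed.

  from pyspark.sql import SparkSession

  # Hypothetical example - table and column names are placeholders.
  spark = (
      SparkSession.builder
      .appName("hive_partitioning_sketch")
      .enableHiveSupport()  # assumes a Hive metastore is available
      .getOrCreate()
  )

  # Rows are stored in one directory per country value, so a filter on the
  # partition column lets the engine skip every other directory entirely.
  spark.sql("""
      CREATE TABLE IF NOT EXISTS sales (
          order_id BIGINT,
          amount   DOUBLE
      )
      PARTITIONED BY (country STRING)
      STORED AS PARQUET
  """)

  # Static partition insert: every row lands in the country='IN' partition.
  spark.sql("INSERT INTO sales PARTITION (country = 'IN') VALUES (1, 120.50), (2, 89.99)")

  # Partition pruning: only the country='IN' directory is scanned.
  spark.sql("SELECT COUNT(*) AS orders_in FROM sales WHERE country = 'IN'").show()

Hive also supports dynamic partitioning, where the partition value is taken from the data itself instead of being named in the statement.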

Closing Remark

As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!
