935 Databricks Jobs - Page 35

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects. Ensure cohesive integration between systems and data models. Implement data platform components. Troubleshoot and resolve data platform issues. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. 15 years of full-time education is required.
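
To make the data munging requirement above concrete, here is a minimal PySpark sketch of the cleaning, transformation, and normalization steps a Databricks-based Data Platform Engineer might run; the table names, column names, and output table are hypothetical placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-munging-sketch").getOrCreate()

# Hypothetical raw table; on Databricks this would typically be a Delta table.
raw = spark.table("raw_events")

cleaned = (
    raw
    .dropDuplicates()                                           # cleaning: remove exact duplicates
    .dropna(subset=["event_id"])                                # cleaning: require the business key
    .withColumn("event_ts", F.to_timestamp("event_ts"))        # transformation: typed timestamp
    .withColumn("country", F.upper(F.trim(F.col("country"))))  # transformation: standardize text
)

# Normalization: min-max scale a numeric measure into [0, 1].
bounds = cleaned.agg(F.min("amount").alias("lo"), F.max("amount").alias("hi")).first()
normalized = cleaned.withColumn(
    "amount_norm",
    (F.col("amount") - F.lit(bounds["lo"])) / (F.lit(bounds["hi"]) - F.lit(bounds["lo"])),
)

# Persist the curated result as a Delta table for downstream consumers.
normalized.write.format("delta").mode("overwrite").saveAsTable("curated_events")
```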

Posted 1 month ago

Apply

7 - 12 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project. Roles & Responsibilities: Expected to be an SME, collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Develop and maintain data platform components. Contribute to the overall success of the project. Professional & Technical Skills: Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. 15 years full time education is required. Qualifications 15 years full time education

Posted 1 month ago

Apply

5 - 10 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project. Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Develop and maintain data platform components. Contribute to the overall success of the project. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 1 month ago

Apply

3 - 8 years

9 - 13 Lacs

Chennai

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute in providing solutions to work related problems. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects. Ensure cohesive integration between systems and data models. Implement data platform components. Troubleshoot and resolve data platform issues. Professional & Technical Skills: Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. A 15 years full time education is required. Qualifications 15 years full time education

Posted 1 month ago

Apply

3 - 8 years

6 - 10 Lacs

Chennai

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects. Ensure cohesive integration between systems and data models. Implement data platform components. Troubleshoot and resolve data platform issues. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of data platform blueprint and design. Experience with data integration and data modeling. Hands-on experience with data platform components. Knowledge of data platform security and governance. Additional Information: The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Chennai office. 15 years of full-time education is required.

Posted 1 month ago

Apply

2 - 7 years

4 - 9 Lacs

Pune

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Develop and maintain data platform components. Contribute to the overall success of the project. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Pune office. 15 years of full-time education is required.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Application Architect. Project Role Description: Provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Architect, you will provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. You will also assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the design and implementation of application solutions. Ensure compliance with architectural standards and guidelines. Identify opportunities to improve application performance and scalability. Mentor junior team members to enhance their skills. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of cloud-based data analytics solutions. Experience in designing and implementing scalable data architectures. Knowledge of data governance and security best practices. Hands-on experience with data integration and ETL processes. Additional Information: The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Collaborate with cross-functional teams to design and implement data platform solutions. Develop and maintain data pipelines for efficient data processing. Optimize data storage and retrieval processes for improved performance. Implement data security measures to protect sensitive information. Conduct regular data platform performance evaluations and make recommendations for improvements. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of cloud-based data platforms. Experience with data modeling and database design. Hands-on experience with ETL processes and tools. Knowledge of data governance and compliance standards. Additional Information: The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Pune office. 15 years of full-time education is required.

Posted 1 month ago

Apply

2 - 6 years

4 - 8 Lacs

Pune

Work from Office

DESCRIPTION Key Responsibilities: Conduct descriptive and diagnostic analytics on data sources. Apply machine learning tools on data and metadata to improve data quality. Identify patterns and trends in data sets. Create reports and dashboards on analysis results using business intelligence technologies. Develop data profiles for data tables and elements in the data lake. Create data catalog entries and ensure data catalog metadata quality. Analyze diverse data sets to identify data quality, coherency, and integrability issues and reduce data redundancy. Collaborate to develop data cleansing methods and rules. Assist in the creation and maintenance of documentation of key decisions, rules, controls, and processes. RESPONSIBILITIES Qualifications: College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Competencies: Action Oriented: Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm. Communicates Effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer Focus: Building strong customer relationships and delivering customer-centric solutions. Tech Savvy: Anticipating and adopting innovations in business-building digital and technology applications. Data Analytics: Discovering, interpreting, and communicating qualitative and quantitative data; determining conclusions relying on knowledge of business or functional frameworks; simultaneously applying statistics, data validity, data visualization, and problem-solving approaches to effectively extract meaningful patterns and business insights; presenting conclusions and outcomes that enable data-driven business decisions. Data Communication and Visualization: Constructing a tale of the business problem, root cause, solution options, and opportunities through illustrating data visually, including reports and dashboards. Data Literacy: Expressing data in context, including data sources and constructs, analytical methods, and applied techniques; describing the use-case application and resulting value. Data Profiling: Assessing data issues and cleansing requirements to perform data extraction, mapping, collection, and testing; establishing good, quality data. Data Quality: Identifying, understanding, and correcting flaws in data that supports effective information governance across operational business processes and decision-making. Values Differences: Recognizing the value that different perspectives and cultures bring to an organization. QUALIFICATIONS Skills: Demonstrated experience as a Data Analyst or in a related data-centric role. Proficiency in SQL (preferably SQL Server) for querying and manipulating data. Practical experience with Databricks for tasks related to data engineering, data science, and analytics. Experience working with Databricks Lakehouse architecture and data warehousing concepts. Knowledge of Azure cloud services, including Azure Data Factory and Azure SQL Database. Expertise in Python programming. Skilled in using PowerBI to create interactive reports, dashboards, and visualizations, with familiarity in SSAS models. Strong analytical mindset with excellent problem-solving skills and keen attention to detail. Good communication skills, capable of conveying complex data insights clearly to non-technical stakeholders. 
Experience: Relevant experience preferred such as working in a temporary student employment, intern, co-op, or other extracurricular team activities. Knowledge of the latest technologies and trends in data science is highly preferred and includes: SQL query language execution on NoSQL and SQL sources. Exposure to Agile software development. Exposure to IoT technology. Data profiling tools, technologies, and coding. Data catalog tools and technologies. Business intelligence tools and technologies. Exposure to Big Data open source. Clustered compute cloud-based implementation experience.
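
Because the role centres on data profiling and catalog metadata quality in a Databricks environment, a small PySpark profiling sketch follows; the table name is a hypothetical placeholder and the metrics shown (null rate, distinct count) are just typical examples of catalog-entry metadata.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()

df = spark.table("lakehouse.customer_orders")  # hypothetical data lake table
total = df.count()

# One profile row per column: null rate and distinct count, the kind of
# metadata a data catalog entry or data quality report would carry.
rows = []
for col_name in df.columns:
    stats = df.agg(
        F.sum(F.col(col_name).isNull().cast("int")).alias("nulls"),
        F.countDistinct(col_name).alias("distinct_count"),
    ).first()
    null_rate = (stats["nulls"] or 0) / total if total else None
    rows.append((col_name, null_rate, stats["distinct_count"]))

profile = spark.createDataFrame(rows, ["column", "null_rate", "distinct_count"])
profile.show(truncate=False)
```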

Posted 1 month ago

Apply

7 - 11 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education. Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Roles & Responsibilities: Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Develop and maintain data pipelines using Databricks Unified Data Analytics Platform. Design and implement data security and access controls using Databricks Unified Data Analytics Platform. Troubleshoot and resolve issues related to data platform components using Databricks Unified Data Analytics Platform. Professional & Technical Skills: Must-have: experience with Databricks Unified Data Analytics Platform and a strong understanding of data modeling and database design principles. Good to have: experience with cloud-based data platforms such as AWS or Azure; experience with data security and access controls; experience with data pipeline development and maintenance. Additional Information: The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bangalore, Hyderabad, Chennai, and Pune offices. Mandatory return to office (RTO) for 2-3 days, working one of two shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Pune

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project. Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Develop and maintain data platform components. Contribute to the overall success of the project. Professional & Technical Skills: Must-have: proficiency in Databricks Unified Data Analytics Platform. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 1 month ago

Apply

5 - 8 years

6 - 16 Lacs

Bengaluru

Hybrid

ROLE SUMMARY Reporting to the Head of Architecture & Engineering APAC, as a BI Analyst you are responsible for leveraging business intelligence platforms to generate insights, reports, analytics, and dashboards. By collaborating with stakeholders within the Client ecosystem, you will translate business needs into data-driven solutions that support informed decision-making. You will leverage your business acumen and act as a liaison between the technology team and various business units and stakeholders to understand their requirements, key objectives, and strategic goals and translate them into technical BI solutions. Develop and maintain data-driven reports and visualizations utilising your expertise with Power BI/Tableau/Sigma. PRIMARY ROLE Collaborate with departments to understand reporting needs and recommend best practices. Analyze large data sets, develop interactive dashboards, and present insights to stakeholders. Carry out data ingestion, star schema data modelling, and visualization to create impactful reports. Develop comprehensive BI dashboards following best practices. Improve and optimize existing Power BI solutions to enhance performance and usability. Leverage SQL skills for advanced data querying and analysis. Conduct data profiling, cleansing, and validation activities to ensure data quality. KEY WORKING RELATIONSHIPS External: Development & Integration partners, Technology and Cyber Security partners, Cloud and application vendors, Client Portfolio Companies. Internal: CTO APAC, Heads of Architecture & Engineering, Workplace and Service Delivery, DevOps engineers, internal business stakeholders. WHAT YOU BRING TO THE ROLE Required: 3-5 years' experience in data engineering and report development roles. Demonstrable experience in data extraction, manipulation, and visualization. Strong data mapping skills, including source to target. Strong Power BI and report development skills and experience. Experience with Snowflake and/or Databricks. Proven experience partnering with business stakeholders, conceptualizing business objectives and processes, and documenting and translating business requirements into technical specifications. Skilled in producing data flows and models. Exceptional communication and stakeholder engagement skills. Attention to detail and the ability to analyse data from diverse application sources and write efficient, effective SQL to handle complex scenarios, providing required outputs under time pressure. Strong ETL development experience, drawing data from disparate systems. Preferred: A degree in Computer Science, Mathematics, or Statistics. Experience and competency with Sigma or Tableau would also be beneficial. Experience using cloud technologies such as Azure. Experience with developing data governance and classification frameworks. Exposure to machine learning, AI, and big data.
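
As a rough illustration of the star schema modelling and SQL skills the posting asks for, the sketch below aggregates a hypothetical fact table against two dimensions into a reporting table a Power BI dashboard could sit on; all table, column, and schema names are invented for the example.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# Hypothetical star schema: fact_sales joined to date and product dimensions.
monthly_sales = spark.sql("""
    SELECT d.year,
           d.month,
           p.category,
           SUM(f.net_amount)          AS revenue,
           COUNT(DISTINCT f.order_id) AS orders
    FROM fact_sales f
    JOIN dim_date    d ON f.date_key    = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, d.month, p.category
""")

# A BI tool such as Power BI would typically read this published table.
monthly_sales.write.format("delta").mode("overwrite").saveAsTable("reporting.monthly_sales")
```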

Posted 1 month ago

Apply

8 - 13 years

25 - 30 Lacs

Bengaluru

Work from Office

Education: A Bachelor's degree in Computer Science, Engineering (B.Tech, BE), or a related field such as MCA (Master of Computer Applications) is required for this role. Experience: 8+ years in data engineering with a focus on building scalable and reliable data infrastructure. Skills: Languages: Proficiency in Java, Python, or Scala. Prior experience in Oil & Gas, Titles & Leases, or Financial Services is a must-have. Databases: Expertise in relational and NoSQL databases like PostgreSQL, MongoDB, Redis, and Elasticsearch. Data Pipelines: Strong experience in designing and implementing ETL/ELT pipelines for large datasets. Tools: Hands-on experience with Databricks, Spark, and cloud platforms. Data Lakehouse: Expertise in data modeling, designing data lakehouses, and building data pipelines. Modern Data Stack: Familiarity with the modern data stack and data governance practices. Data Orchestration: Proficient in data orchestration and workflow tools. Data Modeling: Proficient in modeling and building data architectures for high-throughput environments. Stream Processing: Extensive experience with stream processing technologies such as Apache Kafka. Distributed Systems: Strong understanding of distributed systems, scalability, and availability. DevOps: Familiarity with DevOps practices, continuous integration, and continuous deployment (CI/CD). Problem-Solving: Strong problem-solving skills with a focus on scalable data infrastructure. Key Responsibilities: This is a role with high expectations of hands-on design and development. Design and develop systems for ingestion, persistence, consumption, ETL/ELT, and versioning for different data types (e.g., relational, document, geospatial, graph, time series) in transactional and analytical patterns. Drive the development of applications related to data extraction, especially from formats like TIFF, PDF, and others, including OCR and data classification/categorization. Analyze and improve the efficiency, scalability, and reliability of our data infrastructure. Assist in the design and implementation of robust ETL/ELT pipelines for processing large volumes of data. Collaborate with cross-functional scrum teams to respond quickly and effectively to business needs. Work closely with data scientists and analysts to define data requirements and develop comprehensive data solutions. Implement data quality checks and monitoring to ensure data integrity and reliability across all systems. Develop and maintain data models, schemas, and documentation to support data-driven decision-making. Manage and scale data infrastructure on cloud platforms, leveraging cloud-native tools and services. Benefits: Salary: competitive and aligned with local standards. Performance bonus: according to company policy. Benefits include medical insurance and group term life insurance, continuous learning and development, 10 recognized public holidays, and parental leave.
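
Since the posting combines stream processing (Apache Kafka) with Databricks/Spark and a lakehouse, here is a hedged Structured Streaming sketch; the broker address, topic, schema, and paths are placeholders, and the Kafka connector is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Hypothetical message schema for the incoming topic.
schema = StructType([
    StructField("record_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("updated_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "ingest-topic")               # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("payload"))
    .select("payload.*")
)

# Append the parsed stream to a Delta table; paths are placeholders.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/ingest-topic")
    .outputMode("append")
    .start("/tmp/delta/ingest_events")
)
```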

Posted 1 month ago

Apply

8 - 13 years

6 - 11 Lacs

Gurugram

Work from Office

AHEAD is looking for a Senior Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Senior Data Engineer will be responsible for strategic planning and hands-on engineering of Big Data and cloud environments that support our clients' advanced analytics, data science, and other data platform initiatives. This consultant will design, build, and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. They will be expected to be hands-on technically, but also present to leadership and lead projects. The Senior Data Engineer will be responsible for working on a variety of data projects, including orchestrating pipelines using modern data engineering tools and architectures as well as designing and integrating existing transactional processing systems. As a Senior Data Engineer, you will design and implement data pipelines to enable analytics and machine learning on rich datasets. Roles and Responsibilities: A Data Engineer should be able to design, build, operationalize, secure, and monitor data processing systems. Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and leveraging cloud-native toolsets. Implement custom applications using tools such as Kinesis, Lambda, and other cloud-native tools as required to address streaming use cases. Engineer and support data structures, including but not limited to SQL and NoSQL databases. Engineer and maintain ELT processes for loading the data lake (Snowflake, Cloud Storage, Hadoop). Engineer APIs for returning data from these structures to the enterprise applications. Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions. Respond to customer/team inquiries and assist in troubleshooting and resolving challenges. Work with other scrum team members to estimate and deliver work inside of a sprint. Research data questions, identify root causes, and interact closely with business users and technical resources. Qualifications: 8+ years of professional technical experience. 4+ years of hands-on data architecture and data modelling. 4+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, and Snowflake. 4+ years of experience with programming languages such as Python. 2+ years of experience working in cloud environments (AWS and/or Azure). Strong client-facing communication and facilitation skills. Key Skills: Python, Cloud, Linux, Windows, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Snowflake, SQL/RDBMS, OLAP, Data Engineering.

Posted 1 month ago

Apply

5 - 9 years

6 - 7 Lacs

Noida, Ahmedabad, Chennai

Hybrid

This role focuses on building efficient, scalable SQL-based data models and pipelines using Databricks SQL, Spark SQL, and Delta Lake. The ideal candidate will play a key role in transforming raw data into valuable analytical insights.
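
To sketch what the SQL-based Delta Lake modelling described here can look like in practice, the snippet below creates a curated Delta table and refreshes it incrementally with MERGE; the schema and table names are illustrative assumptions only.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-model-sketch").getOrCreate()

# Curated Delta table for analytics (names are illustrative).
spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.orders (
        order_id    STRING,
        customer_id STRING,
        order_ts    TIMESTAMP,
        amount      DOUBLE
    ) USING DELTA
""")

# Incremental refresh from a hypothetical staging table using Delta MERGE.
spark.sql("""
    MERGE INTO curated.orders AS t
    USING staging.orders_updates AS s
    ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```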

Posted 1 month ago

Apply

8 - 13 years

13 - 18 Lacs

Pune

Work from Office

Position Summary: We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development, and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency. Job Responsibilities: Technology Leadership: Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering / data warehousing project assignments. Solution Architecture & Review: Expertise in conceptualizing solution architecture and low-level design in a range of data engineering (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting (AWS, Azure) technologies. Manage projects in a fast-paced agile ecosystem and ensure quality deliverables within stringent timelines. Responsible for risk management, maintaining the risk documentation and mitigation plans. Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation, and deployments. Communication & Logical Thinking: Demonstrates strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment. Capable of effectively presenting and defending team viewpoints, while securing buy-in from both technical and client stakeholders. Handle Client Relationship: Manage client relationships and client expectations independently and deliver results back to the client independently. Should have excellent communication skills. Education: BE/B.Tech, Master of Computer Application. Work Experience: Should have expertise and 8+ years of working experience in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend. Should have expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, and Oracle. Should have strong data warehousing, data integration, and data modeling fundamentals such as star schema, snowflake schema, dimension tables, and fact tables. Strong experience with SQL building blocks, including creating complex SQL queries and procedures. Experience in AWS or Azure cloud and their service offerings. Aware of techniques such as data modelling, performance tuning, and regression testing. Willingness to learn and take ownership of tasks. Excellent written/verbal communication and problem-solving skills. Understanding of and working experience with pharma commercial data sets like IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc. would be an advantage. Hands-on in scrum methodology (sprint planning, execution, and retrospection). Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management. Technical Competencies: Problem Solving, Lifescience Knowledge, Communication, Agile, PySpark, Data Modelling, Matillion, Designing Technical Architecture, AWS, Data Pipeline.

Posted 1 month ago

Apply

5 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

Position Summary: Delivery Director - Big Data & Cloud Data Management. This position is part of the senior leadership in the data warehousing and Business Intelligence areas: someone who can work on multiple project streams and clients for better business decision-making, especially in the Life Sciences / Pharmaceutical domain. Functional Domain: Life Sciences, focusing on the Pharma Commercial aspect. Technology Domain: Big Data / Cloud Data Management (EDW / Data Lake / Big Data). Location: Noida/Gurgaon. Qualification: B.Tech. or equivalent degree is a minimum criterion. Job Responsibilities: Lead the delivery of major engagements on a day-to-day basis, providing hands-on technical leadership to the delivery team. Global program planning and execution for Data Management programs across Enterprise DWH, Data Lake, Big Data, Business Intelligence & Reporting, and Master Data Management solutions. This includes setting up project plans, scopes, budgets, and staffing resources, leading client workshops, creating and coordinating final deliverables, and providing high-quality and insightful advice to our clients in terms of Cloud Data Management strategy, architecture, operating model, etc. Drive data management business requirements across pharma commercial data sets in a workshop setting with client business teams and onshore counterparts. Be able to guide the team to deliver the logical and physical data models for Pharma Commercial Data Lakes and Warehouses. Maintain responsibility for risk management, quality, and profitability on engagements and liaise with the client lead and onshore delivery partners. Drive clear and timely communication across program stakeholders to ensure everyone is on the same page on weekly progress. Must maintain high standards of quality and thoroughness, and should be able to set up adequate controls within the team to perform code reviews and monitor quality. Represent the Business Information Management (BIM) practice as an SME in the cloud data management space. Coach, mentor, and develop middle and junior level staff; performance management of all team members rolling up to the role; develop the manager layer to be leaders of the future. Be known as a thought leader in a specific aspect of the Information Management technology spectrum or the Pharma domain. Direct the training and skill enhancement of the team, in line with pipeline opportunities. Understand the wider product and service offerings of Axtria to coordinate these and ensure our clients get the best level of advice and support. Should be able to work on large Cloud data management deals within the Life Sciences domain. Work with vertical/onshore teams to drive business development and growth for the practice. Education: BE/B.Tech, Master of Computer Application. Work Experience: The candidate should have 14+ years of prior experience in delivering customer-focused Cloud Data Management solutions: Enterprise Data Warehouse, Enterprise Data Lake, Master Data Management systems, Business Intelligence & Reporting solutions, IT Architecture Consulting, cloud platforms (AWS/Azure), and SaaS/PaaS-based solutions in the Life Sciences industry. Minimum of 5 years of relevant experience in the Pharma domain (must). Should have successfully program-managed/solutioned 2-3 end-to-end DW Cloud implementations on AWS/Redshift/Cloud in Pharma or Life Sciences business domains (must). The candidate must have prior hands-on working experience with big data and ETL technologies. Tech stack exposure (hands-on or project management on a few of these) from amongst AWS, ETL, data modelling, Python, Qlik/Tableau/MicroStrategy, Dataiku, Databricks, Airflow, Graph DB, and full stack. Good to have working knowledge of at least 2 of the following tools: QlikView, Qlik Sense, Tableau, MicroStrategy, Spotfire, MS PBI. Must have a very good understanding of the end-to-end pharma commercial landscape covering both enterprise and syndicated data sets. Decent exposure to business processes like alignment, market definition, segmentation, sales crediting, and activity metrics calculation, and exposure to advanced analytics such as next-best-action implementation. Ability to handle large teams of IT professionals, lead large RFP responses, handle P&L, and manage different stakeholders: customers, vendors, and internal teams. Strong project/program management capabilities. Strong written and verbal communication skills. Behavioural Competencies: Project Management, Communication, Attention to P&L Impact, Teamwork & Leadership, Motivation to Learn and Grow, Lifescience Knowledge, Ownership, Cultural Fit, Scale of Resources Managed, Scale of Revenues Managed/Delivered, Problem Solving, Talent Management, Capability Building / Thought Leadership, Account Management. Technical Competencies: Delivery Management - BIM / Cloud Info Management, AWS EMR, Amazon Redshift, Business Intelligence (BI).

Posted 1 month ago

Apply

5 - 9 years

13 - 18 Lacs

Bengaluru

Work from Office

Position Summary Looking for a Salesforce Data Cloud Engineer to design, implement, and manage data integrations and solutions using Salesforce Data Cloud (formerly Salesforce CDP). This role is essential for building a unified, 360-degree view of the customer by integrating and harmonizing data across platforms. Job Responsibilities Consolidate the Customer data to create a Unified Customer profile Design and implement data ingestion pipelines into Salesforce Data Cloud from internal and third-party systems . Work with stakeholders to define Customer 360 data model requirements, identity resolution rules, and calculated insights. Configure and manage the Data Cloud environment, including data streams, data bundles, and harmonization. Implement identity resolution, micro segmentation, and activation strategies. Collaborate with Salesforce Marketing Cloud, to enable real-time personalization and journey orchestration. Ensure data governance, and platform security. Monitor data quality, ingestion jobs, and overall platform performance. Education BE/B.Tech in Computer or IT Master of Computer Application Work Experience Overall experience of minimum 10 years in Data Management and Data Engineering role, with a minimum experience of 3 years as Salesforce Data Cloud Data Engineer Hands-on experience with Salesforce Data Cloud (CDP), including data ingestion, harmonization, and segmentation. Proficient in working with large datasets, data modeling, and ETL/ELT processes. Understanding of Salesforce core clouds (Sales, Service, Marketing) and how they integrate with Data Cloud. Experience with Salesforce tools such as Marketing Cloud. Strong knowledge of SQL, JSON, Apache Iceberg and data transformation logic. Familiarity with identity resolution and customer 360 data unification concepts. Salesforce certifications (e.g., Salesforce Data Cloud Accredited Professional, Salesforce Administrator, Platform App Builder). Experience with CDP platforms other than Salesforce (e.g., Segment, Adobe Experience Platform (Good to have)). Experience with cloud data storage and processing tools (Azure, Snowflake, etc.). Behavioural Competencies Teamwork & Leadership Motivation to Learn and Grow Ownership Cultural Fit Talent Management Technical Competencies Lifescience Knowledge Azure SQL SQL Databricks

Posted 1 month ago

Apply

4 - 9 years

14 - 18 Lacs

Bengaluru

Work from Office

Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. 7+ years of total experience in Data Engineering projects and 4+ years of relevant experience with Azure technology services and Python. Azure: Azure Data Factory, ADLS (Azure Data Lake Store), Azure Databricks. Mandatory programming languages: PySpark, PL/SQL, Spark SQL. Database: Azure SQL DB. Experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with object-oriented/object-function scripting languages: Python, SQL, Scala, Spark SQL, etc. Data warehousing experience with strong domain knowledge. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: an intuitive individual with the ability to manage change and proven time management; proven interpersonal skills, contributing to team effort by accomplishing related results as needed; up-to-date technical knowledge maintained by attending educational workshops and reviewing publications. Preferred technical and professional experience: experience with Azure ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, and ARM templates; experience with relational SQL and NoSQL databases, including Postgres and Cassandra; experience with object-oriented/object-function scripting languages such as Python, SQL, Scala, and Spark SQL.
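
For the Azure stack listed above (ADLS, Azure Databricks, PySpark, Spark SQL), the following is a minimal sketch; the storage account, container, columns, and output table are placeholders, and cluster authentication to ADLS is assumed to be configured already.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-sketch").getOrCreate()

# Placeholder ADLS Gen2 path; assumes the cluster is already authorized to read it.
path = "abfss://raw@examplestorageacct.dfs.core.windows.net/sales/"

sales = (
    spark.read.option("header", "true").csv(path)
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("sale_date", F.to_date("sale_date"))
)

sales.createOrReplaceTempView("sales_raw")

# Spark SQL aggregation over the ingested data.
daily = spark.sql("""
    SELECT sale_date, SUM(amount) AS total_amount
    FROM sales_raw
    GROUP BY sale_date
""")

daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_sales")
```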

Posted 1 month ago

Apply

3 - 7 years

7 - 11 Lacs

Hyderabad

Work from Office

As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Proficient software development with Microsoft technologies: demonstrate expertise in software development using Microsoft technologies, ensuring high-quality code and efficient application performance. Collaborative problem-solving and stakeholder engagement: collaborate effectively with stakeholders to understand product requirements and challenges, proactively addressing issues through analytical problem-solving and strategic software solutions. Agile learning and technology integration: stay updated with the latest Microsoft technologies, eagerly embracing continuous learning and integrating newfound knowledge to enhance software development processes and product features. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: SQL, ADF, Azure Databricks. Preferred technical and professional experience: PostgreSQL, MSSQL, Eureka, Hystrix, Zuul/API Gateway, in-memory storage.

Posted 1 month ago

Apply

4 - 9 years

14 - 18 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and big data technologies. Experience in developing streaming pipelines. Experience working with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: 6-7+ years of total experience in Data Management (DW, DL, data platform, lakehouse) and data engineering skills; minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on cloud data platforms on Azure; experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, and SQL Server DB; good to excellent SQL skills. Preferred technical and professional experience: certification in Azure and Databricks, or Cloudera Spark certified developers; experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB; knowledge or experience of Snowflake will be an added advantage.

Posted 1 month ago

Apply

4 - 9 years

12 - 16 Lacs

Hyderabad

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and big data technologies. Experience in developing streaming pipelines. Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: 5-7+ years of total experience in Data Management (DW, DL, data platform, lakehouse) and data engineering skills; minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on cloud data platforms on AWS; exposure to streaming solutions and message brokers like Kafka; experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB; good to excellent SQL skills. Preferred technical and professional experience: certification in AWS and Databricks, or Cloudera Spark certified developers; AWS S3, Redshift, and EMR for data storage and distributed processing; AWS Lambda, AWS Step Functions, and AWS Glue to build serverless, event-driven data workflows and orchestrate ETL processes.
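
As a counterpart on the AWS side (S3 plus EMR/Glue/Databricks with Spark), here is a small batch sketch; the bucket, prefixes, and columns are assumptions, and IAM access plus the s3a connector are assumed to be configured on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-batch-sketch").getOrCreate()

# Placeholder bucket and prefix for raw JSON events.
events = spark.read.json("s3a://example-data-lake/raw/events/")

curated = (
    events
    .dropDuplicates(["event_id"])                     # assumed business key
    .withColumn("event_date", F.to_date("event_ts"))
)

# Write a partitioned, analytics-friendly copy back to the lake.
(
    curated.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-data-lake/curated/events/")
)
```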

Posted 1 month ago

Apply

5 - 10 years

11 - 15 Lacs

Gurugram

Work from Office

Job Title: GN - SC&O - S&P - Spend Analytics & Product Development - Manager. Management Level: 07 - Manager. Location: Gurgaon. Must have skills: Spend Analytics, Product Development, AI/ML-based Procurement Automation. Good to have skills: NLP, BI tools (Power BI, Tableau), Contract Analytics, Supplier MDM, Cloud Architecture (AWS, Databricks). Roles & Responsibilities: Lead end-to-end product development of spend analytics and procurement automation solutions. Implement AI-driven sourcing and savings assessment engines across multiple spend categories, including IT, Temp Labor, and Travel. Drive the architecture of GenAI-integrated platforms for PO-contract matching and compliance monitoring. Build and deliver business cases, custom demos, and POCs for prospective clients during pre-sales cycles. Collaborate with clients to understand pain points and tailor BI dashboards and tools that drive actionable insights. Drive client success through continuous program governance, risk mitigation, and value realization. Mentor junior team members and lead multi-disciplinary teams for project execution and delivery. Professional & Technical Skills: Product development for SaaS-based spend optimization tools. Expertise in spend analytics and NLP-powered classification tools. Contract analytics, supplier clustering, and MDM frameworks. Development and deployment of BI dashboards using Power BI/Tableau. Good to have skills: ML/NLP tools for text classification and anomaly detection. Cloud platforms such as AWS and Databricks. SQL/NoSQL and data governance frameworks. Additional Information: The ideal candidate should have extensive experience in leading enterprise-scale spend analytics and procurement transformation initiatives. They should demonstrate a strong background in AI-led innovation, product development, and client engagement. Proven ability to drive business growth through solution design, pre-sales leadership, and long-term stakeholder management is essential. This position is based at our Gurgaon office. About Our Company | Accenture. Qualification: 15+ years of experience. Educational Qualification: BE, BTech, BSc, MBA.

Posted 1 month ago

Apply

2 - 6 years

17 - 20 Lacs

Hyderabad

Work from Office

Job Title: Data Eng, Mgmt. & Governance - Analyst - S&C Global Network. Management Level: 11. Location: Hyderabad. Must have skills: Proficiency and hands-on experience in data engineering technologies like Python, R, SQL, Spark, PySpark, Databricks, Hadoop, etc. Good to have skills: Exposure to Retail, Banking, and Healthcare projects; knowledge of Power BI and PowerApps is an added advantage. Job Summary: As a Data Operations Analyst, you would be responsible for ensuring our esteemed business is fully supported in using business-critical AI-enabled applications. This involves solving day-to-day application issues and business queries and addressing ad hoc data requests to ensure clients can extract maximum value from the AI applications. WHAT'S IN IT FOR YOU? An opportunity to work on high-visibility projects with top clients around the globe. Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization. Roles & Responsibilities: Monitor and maintain pre-processing pipelines, model execution batches, and validation of model outputs. In case of deviations or model degradation, take up detailed root cause analysis and implement permanent fixes. Debug issues related to data loads, batch pipelines, and application functionality, including special handling of data/batch streams. As a Data Operations Analyst, you would work on initial triaging of code-related defects/issues, provide root cause analysis, and implement code fixes for permanent resolution of the defect. Design, build, test, and deploy small to medium-size enhancements that deliver value to the business and enhance application availability and usability. Responsible for sanity testing of use cases as part of pre-deployment and post-production activities. Primarily responsible for application availability and stability by remediating application issues, bugs, or other vulnerabilities. Data Operations Analysts evolve to become Subject Matter Experts as they mature in servicing the applications. Professional & Technical Skills: Proven experience (2+ years) in working as per the above job description is required. Experience/education in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, or Information Systems is preferable. Exposure to Retail, Banking, and Healthcare projects is an added advantage. Proficiency and hands-on experience in data engineering technologies like Python, R, SQL, Spark, PySpark, Databricks, Hadoop, etc. Ability to work with large data sets and present findings/insights to key stakeholders; data management using databases like SQL. Experience with any of the cloud platforms like AWS, Azure, or Google Cloud for deploying and scaling language models. Experience with any of the data visualization tools like Tableau, QlikView, and Spotfire is good to have. Knowledge of Power BI and PowerApps is an added advantage. Excellent analytical and problem-solving skills, with a data-driven mindset. Proficient in Excel, MS Word, PowerPoint, etc. Ability to solve complex business problems and deliver client delight. Strong writing skills to build points of view on current industry trends. Good client handling skills; able to demonstrate thought leadership and problem-solving skills. Additional Information: The ideal candidate will possess a strong educational background in computer science or a related field. This position is based at our Hyderabad office. About Our Company | Accenture. Qualification: Minimum 2 years of experience is required. Educational Qualification: Bachelor's or Master's degree in any engineering stream, or MCA.

Posted 1 month ago

Apply

3 - 8 years

9 - 13 Lacs

Chennai

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute in providing solutions to work related problems. Assist with the data platform blueprint and design. Collaborate with Integration Architects and Data Architects. Ensure cohesive integration between systems and data models. Implement data platform components. Troubleshoot and resolve data platform issues. Professional & Technical Skills: Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. Strong understanding of data platform blueprint and design. Experience with data integration and data modeling. Hands-on experience with data platform components. Knowledge of data platform security and governance. Additional Information: The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Chennai office. A 15 years full time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply