935 Databricks Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: BTech or equivalent; minimum 15 years of education

Summary: As an AI Advisor, you will drive business outcomes for clients through analytics on the Databricks Unified Data Analytics Platform. Your typical day will involve supporting delivery leads, account management, and operational excellence teams to deliver client value through analytics and industry best practices.

Roles & Responsibilities:
- Lead the development and deployment of advanced analytics solutions using the Databricks Unified Data Analytics Platform.
- Conduct detailed analysis of complex data sets, employing statistical methodologies and data munging techniques to produce actionable insights.
- Collaborate with cross-functional teams, applying expertise in diverse analytics techniques, including implementing algorithms such as linear regression, logistic regression, decision trees, and clustering (a sketch follows this listing).
- Communicate technical findings effectively to stakeholders, using data visualization tools for clarity.
- Stay current with the latest advancements in analytics and data science, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must have: experience with the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools.
- Experience implementing analytics techniques such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will have a strong educational background in statistics, mathematics, computer science, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.
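For illustration, here is a minimal sketch (not this employer's code) of one skill the listing names: fitting a logistic regression with Spark MLlib on Databricks. The toy data and column names ("f1", "f2", "churned") are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

# On Databricks a session is provided automatically; locally we create one.
spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [(1.0, 0.3, 1), (0.2, 0.8, 0), (0.9, 0.1, 1), (0.1, 0.9, 0)],
    ["f1", "f2", "churned"],  # hypothetical features and label
)

# MLlib expects the features packed into a single vector column.
assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
model = LogisticRegression(featuresCol="features", labelCol="churned").fit(
    assembler.transform(df)
)
print(model.coefficients, model.intercept)
```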

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: PySpark
Good-to-Have Skills: Databricks Unified Data Analytics Platform, Apache Spark, Talend ETL
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring the applications are aligned with business needs. You will also engage with multiple teams, contribute to key decisions, and provide problem-solving support for your team and across multiple teams.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications based on business process and application requirements.
- Collaborate with the team to develop and implement solutions.
- Ensure the applications are aligned with business needs.

Professional & Technical Skills:
- Must have: proficiency in PySpark.
- Good to have: experience with Talend ETL, Apache Spark, and the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity (a sketch follows this listing).

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based in Mumbai.
- 15 years of full-time education is required.
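As a minimal sketch of the data munging steps this posting lists, here is cleaning (dropping duplicates and nulls) followed by min-max normalization in PySpark; the toy "orders" data and column names are hypothetical.

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [("a", 10.0), ("a", 10.0), ("b", None), ("c", 50.0)],
    ["order_id", "amount"],  # hypothetical columns
)

# Cleaning: remove exact duplicates and rows with missing amounts.
clean = df.dropDuplicates().dropna(subset=["amount"])

# Transformation/normalization: min-max scale the amount column.
stats = clean.agg(F.min("amount").alias("lo"), F.max("amount").alias("hi")).first()
normalized = clean.withColumn(
    "amount_norm", (F.col("amount") - stats.lo) / (stats.hi - stats.lo)
)
normalized.show()
```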

Posted 3 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Python (Programming Language), Microsoft Azure Databricks, Microsoft Azure Data Services
Minimum Experience: 12 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing innovative solutions to meet customer needs. You will also test, debug, and troubleshoot applications to ensure their smooth functioning and optimal performance.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Develop and maintain high-quality software applications.
- Collaborate with business analysts and stakeholders to gather and analyze requirements.
- Design and implement application features and enhancements.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug application issues.
- Optimize application performance and scalability.
- Conduct unit testing and integration testing.
- Document application design, functionality, and processes.
- Stay updated with emerging technologies and industry trends.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform, Python, Microsoft Azure Databricks, and Microsoft Azure Data Services.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 days ago

Apply

8.0 - 13.0 years

15 - 19 Lacs

Chennai

Work from Office

Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Cloud Data Architecture
Minimum Experience: 5 years
Educational Qualification: BE or MCA

Summary: As a Technology Architect, you will review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve reviewing and integrating technical architecture requirements and providing input into final decisions regarding hardware, network products, system software, and security.

Roles & Responsibilities:
- A minimum of 8 years of experience with the Databricks Unified Data Analytics Platform.
- Good experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform (a sketch follows this listing).
- A strong educational background in technology and information architectures, along with a proven track record of delivering impactful data-driven solutions.
- Strong requirement analysis and technical solutioning skills in data and analytics.
- Client-facing experience: running solution workshops and client visits, handling large RFP pursuits, and managing multiple stakeholders.

Technical Experience:
- 6 or more years of experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform.
- 2 or more years of experience using Python, PySpark, or Scala.
- Experience with Databricks on the cloud; experience with any of AWS, Azure, or GCP.
- ETL, data engineering, data cleansing, and insertion into a data warehouse.
- Must-have skills: Databricks, Cloud Data Architecture, Python, Data Engineering.

Professional Attributes:
- Excellent writing, communication, and presentation skills.
- Eagerness to learn and develop oneself on an ongoing basis.
- Excellent client-facing and interpersonal skills.

Qualification: BE or MCA
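For illustration, a hedged sketch of the ingestion side of such a pipeline using Databricks Auto Loader to stream files into a Delta table. The `cloudFiles` source runs only on Databricks (where `spark` is provided), and the paths and table name are hypothetical placeholders.

```python
# Incrementally ingest JSON files landing in cloud storage (Auto Loader).
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/orders/_schema")
    .load("/mnt/landing/orders/")  # hypothetical landing zone
)

# Write the stream into a Delta table forming the first pipeline layer.
(
    raw.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")
    .trigger(availableNow=True)  # process all pending files, then stop
    .toTable("bronze.orders")
)
```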

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: Graduate

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Databricks Unified Data Analytics Platform (SSI)
Good-to-Have Skills: No Technology Specialization (non-SSI)

Key Responsibilities:
1. Show strong development skill in PySpark and Databricks to build complex data pipelines.
2. Deliver assigned development tasks independently or with minimal help.
3. Participate in daily status calls, with good communication skills to manage day-to-day work.

Technical Experience:
1. More than 5 years of experience in IT.
2. More than 2 years of experience in technologies such as Databricks and PySpark.
3. Able to build end-to-end pipelines using PySpark, with good knowledge of Delta Lake (a sketch follows this listing).
4. Good knowledge of Azure services such as Azure Data Factory and Azure storage solutions (ADLS, Delta Lake, Azure AD).

Professional Attributes:
1. Involvement in data engineering projects from the requirements phase through delivery.
2. Good communication skills to interact with the client and understand requirements.
3. Ability to work independently and guide the team.

Educational Qualification: Graduate
Additional Info: Skill flex for PySpark; Bengaluru only; should be flexible to work from the client office.
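A minimal sketch of one building block of such pipelines: a Delta Lake upsert (merge) in PySpark. It assumes the delta-spark package and an existing Spark/Databricks session; the paths and join key are hypothetical.

```python
from delta.tables import DeltaTable

# Target Delta table and today's incremental batch (hypothetical paths).
target = DeltaTable.forPath(spark, "/mnt/delta/customers")
updates = spark.read.parquet("/mnt/staging/customers_today")

(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()      # refresh rows for existing customers
    .whenNotMatchedInsertAll()   # insert rows for new customers
    .execute()
)
```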

Posted 3 days ago

Apply

10.0 - 20.0 years

50 - 70 Lacs

Hyderabad

Work from Office

Role & Responsibilities: For this Engineering Director role, the most critical technical experience centers on scalable system design, modern data architecture, and team enablement. Experience with languages like Java is key for backend systems, while Python remains important for orchestration and analytics workloads. From a tooling standpoint, familiarity with Kubernetes, Terraform, and observability stacks (e.g., Datadog, Grafana) is essential for operational excellence. On the data side, platforms like Snowflake, Databricks, or lakehouses power most modern pipelines; a Director should be comfortable working with and evolving data architecture decisions around these, guided by recommendations from Architects. Additionally, privacy and security are becoming first-class concerns, so experience with basic data access controls and compliance policies (GDPR, CCPA) is a strong differentiator. Finally, the ability to mentor engineers and guide technical and ad tech business strategy across teams, including cross-functional stakeholders in data science, customer success, and measurement, is an important characteristic for driving long-term success.

Posted 3 days ago

Apply

6.0 - 11.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Primary Responsibilities:
- Develop visual reports, dashboards, and KPI scorecards using Power BI Desktop.
- Build Analysis Services reporting models.
- Connect to data sources, importing and transforming data for business intelligence.
- Implement row-level security on data and understand application security layer models in Power BI.
- Integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS) or API automation.
- Use advanced-level calculations on the data set.
- Design and develop Azure-based, data-centric applications to manage a large healthcare data application.
- Design, build, test, and deploy streaming pipelines for data processing in real time and at scale (a sketch follows this listing).
- Create ETL packages.
- Use Azure cloud services for ingestion and data processing.
- Own feature development using Microsoft Azure native services such as App Service, Azure Functions, Azure Storage, Service Bus queues, Event Hubs, Event Grid, Application Gateway, Azure SQL, and Azure Databricks.
- Identify opportunities to fine-tune and optimize applications running on Microsoft Azure: cost reduction, adoption of cloud best practices, and data and application security covering scalability and high availability.
- Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting aspects of Microsoft Azure.
- Focus on automation using Infrastructure as Code (IaC), Jenkins, Azure DevOps, Terraform, etc.
- Communicate effectively with other engineers and QA.
- Establish, refine, and integrate development and test environment tools and software as needed.
- Identify production and non-production application issues.

This is a Senior Cloud Data Engineer position requiring about 7+ years of hands-on technical experience in data processing, reporting, and cloud technologies, with working knowledge of executing projects using Agile methodologies.

Required Skills:
1. Envision the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
2. Design and develop the framework of the system and be able to explain the choices made; write and review design documents explaining the overall architecture, framework, and high-level design of the application.
3. Create, understand, and validate the design and estimated effort for a given module or task, and be able to justify them.
4. Define in-scope and out-of-scope items and the assumptions taken while creating effort estimates.
5. Identify and integrate well over all integration points in the context of a project as well as other applications in the environment.
6. Understand the business requirements and develop data models.

Technical Skills:
1. Strong proficiency as a Cloud Data Engineer utilizing Power BI and Azure Databricks to support, design, develop, and deploy requested updates to new and existing cloud-based services.
2. Experience developing, implementing, monitoring, and troubleshooting applications in the Azure public cloud.
3. Proficiency in data modeling and reporting.
4. Design and implement database schemas.
5. Design and develop well-documented source code.
6. Develop both unit-testing and system-testing scripts to be incorporated into the QA process.
7. Automate all deployment steps with Infrastructure as Code (IaC) and Jenkins Pipeline as Code (JPaC) concepts.
8. Define guidelines and benchmarks for NFR considerations during project implementation.
9. Do required POCs to make sure the suggested design and technologies meet the requirements.

Required Experience:
- 5 to 10+ years of professional experience developing with SQL, Power BI, SSIS, and Azure Databricks.
- 5 to 10+ years of professional experience utilizing SQL Server for data storage in large-scale .NET solutions.
- Strong technical writing skills.
- Strong knowledge of build, deployment, and unit-testing tools.
- Highly motivated team player and self-starter.
- Excellent verbal, phone, and written communication skills.
- Knowledge of cloud-based architecture and concepts.

Required Qualifications: Graduate or postgraduate degree in Computer Science, Engineering, Science, Mathematics, or a related field, with around 10 years of experience executing data reporting solutions. Cloud certification, preferably Azure.
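For illustration, a hedged sketch of the real-time streaming responsibility above: an Azure Databricks pipeline reading Event Hubs through its Kafka-compatible endpoint and landing the stream in a Delta table. Authentication options are omitted for brevity, and the namespace, topic, and paths are hypothetical placeholders.

```python
import pyspark.sql.functions as F

# Read from Event Hubs via its Kafka-compatible endpoint (auth omitted).
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
    .option("subscribe", "patient-events")  # hypothetical topic
    .load()
)

# Decode the raw Kafka payload for downstream processing.
parsed = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

# Land the stream in Delta with exactly-once checkpointing.
(
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/patient-events")
    .toTable("bronze.patient_events")
)
```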

Posted 3 days ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Kolkata

Work from Office

Design and implement data architecture solutions that align with business requirements. Develop and maintain data models, data dictionaries, and data flow diagrams.

Posted 3 days ago

Apply

6.0 - 11.0 years

11 - 21 Lacs

Udaipur, Jaipur, Bengaluru

Work from Office

Data Architect

Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Data Architect
Experience: 6-10 years
Location: Udaipur, Jaipur, Bangalore
Domain: Telecom

Job Description: We are seeking an experienced Telecom Data Architect to join our team. In this role, you will design comprehensive data architectures and technical solutions for telecommunications industry challenges, leveraging TM Forum frameworks and modern data platforms. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.

Key Responsibilities:
- Design and articulate enterprise-scale telecom data architectures incorporating TM Forum standards and frameworks, including SID (Shared Information/Data Model), TAM (Telecom Application Map), and eTOM (enhanced Telecom Operations Map).
- Develop comprehensive data models aligned with TM Forum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management.
- Create data architectures that support telecom-specific use cases, including customer journey analytics, network performance optimization, fraud detection, and revenue assurance (a sketch follows this listing).
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics.
- Conduct technical discovery sessions with telecom clients to understand their OSS/BSS architecture, network analytics needs, customer experience requirements, and digital transformation objectives.
- Design and deliver proofs of concept (POCs) and technical demonstrations showcasing modern data platforms solving real-world telecommunications challenges.
- Create comprehensive architectural diagrams and implementation roadmaps for telecom data ecosystems spanning cloud, on-premises, and hybrid environments.
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on telecom-specific requirements and regulatory compliance needs.
- Design data governance frameworks compliant with telecom industry standards and regulatory requirements (GDPR, data localization, etc.).
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities.
- Contribute to the development of best practices, reference architectures, and reusable solution components to accelerate proposal development.

Required Skills:
- 6-10 years of experience in data architecture, data engineering, or solution architecture roles, with at least 5 years in the telecommunications industry.
- Deep knowledge of TM Forum frameworks, including SID, eTOM, and TAM, and their practical implementation in telecom data architectures.
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines for complex telecom data initiatives.
- Hands-on experience building data models and platforms aligned with TM Forum standards and telecommunications business processes.
- Strong understanding of telecom OSS/BSS systems, network management, customer experience management, and revenue management domains.
- Hands-on experience with data platforms including Databricks and Microsoft Azure in telecommunications contexts.
- Experience with modern data processing frameworks such as Apache Kafka, Spark, and Airflow for real-time telecom data streaming.
- Proficiency in the Azure cloud platform and its data services, with an understanding of telecom-specific deployment requirements.
- Knowledge of system monitoring and observability tools for telecommunications data infrastructure.
- Experience implementing automated testing frameworks for telecom data platforms and pipelines.
- Familiarity with telecom data integration patterns, ETL/ELT processes, and data governance practices specific to telecommunications.
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for telecom use cases.
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL) with telecom domain applications.
- Understanding of telecommunications regulatory requirements and data privacy compliance (GDPR, local data protection laws).
- Excellent communication and presentation skills, with the ability to explain complex technical concepts to telecom stakeholders.
- Strong problem-solving skills and the ability to think creatively to address telecommunications industry challenges.
- TM Forum certifications or other telecommunications industry certifications are good to have.
- Relevant data platform certifications such as Databricks or Azure Data Engineer are a plus.
- Willingness to travel as required.

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
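As a loose illustration of the network-analytics use cases such an architecture serves, here is a minimal PySpark sketch computing a dropped-call rate per cell over 5-minute windows; the event schema ("cell_id", "status", "event_time") and table name are hypothetical.

```python
import pyspark.sql.functions as F

events = spark.read.table("silver.call_events")  # hypothetical curated table

kpis = (
    events.groupBy(F.window("event_time", "5 minutes"), "cell_id")
    .agg(
        F.count("*").alias("calls"),
        F.sum(F.when(F.col("status") == "DROPPED", 1).otherwise(0)).alias("dropped"),
    )
    .withColumn("drop_rate", F.col("dropped") / F.col("calls"))  # per-cell KPI
)
kpis.show()
```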

Posted 3 days ago

Apply

8.0 - 12.0 years

5 - 10 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Teradata to Snowflake and Databricks migration on Azure Cloud; experience with data migration projects, including complex migrations to Databricks; strong expertise in ETL pipeline design and optimization, particularly for cloud environments and large-scale data migration.

Posted 3 days ago

Apply

12.0 - 20.0 years

0 - 0 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

We at EMIDS are hiring for a Sr. Data Architect role. Please find the details below and share your interest at aarati.pardhi@emids.com.

Job Description: We are looking for a highly experienced Senior Data Architect with a strong background in big data technologies, cloud platforms, advanced analytics, and AI. The ideal candidate will lead the end-to-end design and architecture of scalable, high-performance data platforms using PySpark, Databricks, and major cloud platforms (Azure/AWS/GCP). A strong understanding of AI/ML pipeline integration and enterprise data strategy is essential.

Key Responsibilities:
- Experience building data and AI products.
- Lead data architecture design across modern data platforms using PySpark, Databricks, and cloud-native technologies.
- Define data models, data flows, and architecture blueprints aligned with business and analytical requirements.
- Architect and optimize big data pipelines and AI/ML workflows, ensuring performance, scalability, and reliability.
- Collaborate with business stakeholders, data scientists, and engineers to enable advanced analytics and predictive modeling capabilities.
- Design and implement data lakehouses, ingestion frameworks, and transformation layers.
- Provide technical leadership and mentoring to data engineers and developers.
- Drive adoption of data governance, security, and metadata management practices.
- Evaluate emerging technologies and recommend tools to support enterprise data strategies.

Posted 3 days ago

Apply

2.0 - 5.0 years

5 - 8 Lacs

Chennai

Hybrid

Role & Responsibilities: We are seeking a skilled SQL Developer with strong experience in Databricks and Power BI to join our data engineering and analytics team. The ideal candidate will have a solid foundation in SQL development, hands-on experience with Databricks for data processing, and proficiency in creating insightful dashboards using Power BI.

Key Responsibilities:
- Design, develop, and optimize SQL queries, stored procedures, and data pipelines (a sketch follows this listing).
- Develop and maintain scalable data workflows using Azure Databricks.
- Integrate, transform, and consolidate data from various sources into data warehouses or lakes.
- Create, manage, and publish interactive dashboards and reports using Power BI.
- Work closely with data engineers, analysts, and business stakeholders to understand requirements and translate them into data solutions.
- Ensure data quality, integrity, and security in all deliverables.
- Troubleshoot performance issues and recommend solutions to optimize data processing and reporting performance.

Required Skills:
- Strong proficiency in SQL, including query optimization and data modeling.
- Hands-on experience with Databricks (preferably Azure Databricks).
- Proficiency in Power BI: dashboard creation, DAX, Power Query, data visualization.
- Familiarity with ETL/ELT processes and tools.
- Experience with cloud platforms (preferably Azure).
- Understanding of data warehousing concepts and architecture.
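For illustration, a minimal sketch pairing the two core skills above: SQL executed on Databricks and materialized as a table that a Power BI dashboard could query. Schema and table names are hypothetical.

```python
# Build a reporting table from curated data using Databricks SQL via PySpark.
spark.sql("""
    CREATE OR REPLACE TABLE reporting.daily_sales AS
    SELECT order_date,
           region,
           SUM(amount)              AS revenue,
           COUNT(DISTINCT order_id) AS orders
    FROM   silver.orders
    GROUP  BY order_date, region
""")
```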

Posted 3 days ago

Apply

6.0 - 9.0 years

22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are looking for "Sr. Azure DevOps Engineer" with Minimum 6 years experience Contact- Atchaya (95001 64554) Required Candidate profile Exp in DevOps’s role, Data bricks, Terraform, Ansible, API Troubleshooting Azure platform issues. Snowflake provisioning and configuration skills

Posted 3 days ago

Apply

5.0 - 8.0 years

7 - 17 Lacs

Chennai

Work from Office

Key Responsibilities:
- Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB, etc.).
- Design robust and scalable data models, including relational, dimensional, and NoSQL schemas.
- Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg.
- Integrate data governance, quality, and security best practices into all architecture designs.
- Support analytics and machine learning initiatives through structured data pipelines and platforms.
- Perform data manipulation and analysis using Pandas, NumPy, and related Python libraries.
- Develop and maintain high-performance REST APIs using FastAPI or Flask (a sketch follows this listing).
- Ensure data integrity, quality, and availability across various sources.
- Integrate data workflows with application components to support real-time or scheduled processes.
- Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs.
- Drive CI/CD integration with Databricks using Azure DevOps and tools like DBT.
- Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability.
- Communicate technical concepts effectively to non-technical stakeholders.

Required Skills & Experience:
- Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse.
- Expertise in data modeling and design (relational, dimensional, NoSQL).
- Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures.
- Solid experience with SQL Server or any major RDBMS; ability to write complex queries and stored procedures.
- 3+ years of experience with Azure Data Factory, Azure Databricks, and PySpark.
- Strong programming skills in Python, with a solid understanding of Pandas and NumPy.
- Proven experience building REST APIs.
- Good knowledge of data formats (JSON, Parquet, Avro) and API communication patterns.
- Strong knowledge of data governance, security, and compliance frameworks.
- Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates).
- Familiarity with BI and analytics tools such as Power BI or Tableau.
- Strong problem-solving skills and attention to performance, scalability, and security.
- Excellent communication skills, both written and verbal.

Preferred Qualifications:
- Experience in regulated industries (finance, healthcare, etc.).
- Familiarity with data cataloging, metadata management, and machine learning integration.
- Leadership experience guiding teams and presenting architectural strategies to leadership.
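As a minimal sketch of the FastAPI-plus-Pandas combination this posting names (not this employer's code), here is a tiny data-serving endpoint; the route, data, and module name are hypothetical, and it could be run with `uvicorn app:app`.

```python
from fastapi import FastAPI
import pandas as pd

app = FastAPI()

# Hypothetical in-memory dataset standing in for a real data source.
df = pd.DataFrame({"region": ["N", "S", "N"], "amount": [10.0, 20.0, 30.0]})

@app.get("/revenue/{region}")
def revenue(region: str) -> dict:
    # Aggregate with Pandas and return a JSON-serializable payload.
    subset = df[df["region"] == region]
    return {"region": region, "revenue": float(subset["amount"].sum())}
```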

Posted 3 days ago

Apply

7.0 - 12.0 years

20 - 27 Lacs

Bengaluru

Work from Office

TECHNICAL SKILLS AND EXPERIENCE

Most important:
- 7+ years of professional experience as a data engineer, with at least 4 utilizing cloud technologies.
- Proven experience building ETL or ELT data pipelines with Databricks, either in Azure or AWS, using PySpark.
- Strong experience with the Microsoft Azure data stack (Databricks, Data Lake Gen2, ADF, etc.).
- Strong SQL skills and proficiency in Python, adhering to standards such as PEP 8.
- Proven experience with unit testing and applying appropriate testing methodologies using libraries such as Pytest, Great Expectations, or similar (a sketch follows this listing).
- Demonstrable experience with CI/CD, including release and test automation tools and processes such as Azure DevOps, Terraform, PowerShell, and Bash scripting or similar.
- Strong understanding of data modeling, data warehousing, and OLAP concepts.
- Excellent technical documentation skills.
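For illustration, a minimal sketch of the unit-testing expectation above: a small PySpark transformation verified with Pytest. The transform under test and its column names are hypothetical; run with `pytest test_transform.py`.

```python
import pytest
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

def add_total(df):
    # Transformation under test: derive a line-item total.
    return df.withColumn("total", F.col("qty") * F.col("price"))

@pytest.fixture(scope="session")
def spark():
    # Lightweight local session for fast unit tests.
    return SparkSession.builder.master("local[1]").getOrCreate()

def test_add_total(spark):
    df = spark.createDataFrame([(2, 5.0)], ["qty", "price"])
    assert add_total(df).first().total == 10.0
```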

Posted 3 days ago

Apply

8.0 - 10.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Databricks Architect

- Should have a minimum of 10 years of experience.
- Must-have skills: Databricks, Delta Lake, PySpark or Scala Spark, Unity Catalog.
- Good-to-have skills: Azure and/or AWS cloud.
- Hands-on exposure:
  - Strong experience using Databricks as a lakehouse solution.
  - Establish the Databricks Lakehouse architecture.
  - Ingest and transform batch and streaming data on the Databricks Lakehouse Platform.
  - Orchestrate diverse workloads for the full lifecycle, including Delta Live Tables, PySpark, etc. (a sketch follows this listing).

Mandatory Skills: Databricks - Data Engineering. Experience: 8-10 years.
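A hedged sketch of the Delta Live Tables orchestration named above: a two-layer bronze/silver flow. This runs only inside a Databricks DLT pipeline (the `dlt` module is provided there, not via pip), and the source path is a hypothetical placeholder.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested into the lakehouse bronze layer")
def bronze_events():
    # Auto Loader incrementally picks up new files from the landing zone.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")  # hypothetical path
    )

@dlt.table(comment="Cleaned events for downstream consumers")
def silver_events():
    # DLT wires the dependency on bronze_events automatically.
    return dlt.read_stream("bronze_events").where(F.col("event_id").isNotNull())
```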

Posted 3 days ago

Apply

10.0 - 14.0 years

30 - 37 Lacs

Noida

Hybrid

Required Qualifications:
- Undergraduate degree or equivalent experience.
- 5+ years of work experience with Big Data skills.
- 5+ years of experience managing a team.
- 5+ years of work experience in people management.
- 3+ years of work experience with Azure Cloud skills.
- Experience or knowledge of Azure Cloud, Databricks, Terraform, CI/CD, Spark, Scala, Java, HBase, Hive, Sqoop, GitHub, Jenkins, Elasticsearch, Grafana, UNIX, SQL, OpenShift, Kubernetes, Oozie, etc.
- Solid technical knowledge and work experience with Big Data and Azure Cloud skills.

Primary Responsibilities:
- Design and develop large-scale data processing systems; use expertise in big data technologies to ensure the systems are efficient, scalable, and secure.
- Ensure the developed systems run smoothly: monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize performance.
- Process, clean, and integrate large data sets from various sources to ensure the data is accurate, complete, and consistent.
- Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure the systems developed meet the organization's requirements and support its goals.
- Collaborate closely with senior stakeholders to understand business requirements and effectively translate them into technical requirements for the development team.
- Plan and document comprehensive technical specifications for features or system design, ensuring a clear roadmap for development and implementation.
- Design, build, and configure applications to meet business process and application requirements, leveraging technical expertise and problem-solving skills.
- Direct the development team in all aspects of the software development life cycle, including design, development, coding, testing, and debugging, to deliver high-quality solutions.
- Write testable, scalable, and efficient code, leading by example and setting coding standards for the team.
- Conduct code reviews and provide constructive feedback to ensure code quality and adherence to best practices.
- Mentor and guide junior team members, fostering their professional growth and encouraging adoption of industry best practices.
- Ensure software quality standards are met by enforcing code standards, conducting rigorous testing, and implementing continuous improvement processes.
- Stay updated with the latest technologies and industry trends, continuously enhancing technical skills and driving innovation within the development team.
- Set and communicate team priorities that support the broader organization's goals; align strategy, processes, and decision-making across teams.
- Set clear expectations with individuals based on their level and role, aligned to the broader organization's goals; meet regularly with individuals to discuss performance and development and provide feedback and coaching.
- Develop the long-term technical vision and roadmap within, and often beyond, the scope of your teams; evolve the roadmap to meet anticipated future requirements and infrastructure needs.
- Identify, navigate, and overcome technical and organizational barriers that may stand in the way of delivery.
- Constantly improve the processes and practices around development and delivery.
- Always think customer first, striving to outperform their expectations.
- Work effectively with Product Managers, Program Managers, and other stakeholders to ensure the customer benefits from the work.
- Foster and facilitate Agile methodologies globally and work in an agile environment using Scrum or Kanban.
- Work with Program Managers/leads to consume the product backlog and generate technical designs.
- Lead by example on the design and development of platform features.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Posted 3 days ago

Apply

0.0 - 5.0 years

0 Lacs

Pune

Remote

The candidate must be proficient in Python and its libraries and frameworks, and comfortable with data modeling, PySpark, MySQL concepts, Power BI, and AWS/Azure concepts. Experience with optimizing large transactional databases, data visualization tools, Databricks, and FastAPI is expected.

Posted 4 days ago

Apply

7.0 - 12.0 years

16 - 31 Lacs

Pune, Delhi / NCR, Mumbai (All Areas)

Hybrid

Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. The role contributes to implementation methodologies and best practices, and works on project teams to analyse, design, develop, and deploy business intelligence / data integration solutions supporting a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) to address business and environmental challenges. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices, and is responsible for repeatable, lean, and maintainable enterprise BI design across organizations, partnering effectively with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should show innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Responsibilities:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
- Create functional and technical documentation, e.g., ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
- Take a consultative approach with business users, asking questions to understand the business need, and derive the data flow and the conceptual, logical, and physical data models based on those needs.
- Perform data analysis to validate data models and to confirm the ability to meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.
- Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include but are not limited to SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
- Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
- 5-8 years of management experience required.
- 5-8 years of consulting experience preferred.
- Minimum of 5 years of data architecture, data modelling, or similar experience.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Strong data warehousing, OLTP systems, data integration, and SDLC experience.
- Strong experience in orchestration, with working experience in cloud-native / third-party ETL data load orchestration (e.g., Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar).
- Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.).
- Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP).
- Strong experience with Agile processes (Scrum cadences, roles, deliverables), working experience in Azure DevOps, JIRA, or similar, and experience with CI/CD using one or more code management platforms.
- Strong Databricks experience required, including creating notebooks in PySpark.
- Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.).
- Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.).
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data.
- 3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience:
- Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc.
- Experience providing estimates for data integration projects, including testing, documentation, and implementation.
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions.
- Ability to provide technical direction to other team members, including contractors and employees.
- Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two.
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results.
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM.
- Can create documentation and presentations such that they "stand on their own".
- Can advise sales on the evaluation of data integration efforts for new or existing client work.
- Can contribute to internal/external data integration proofs of concept.
- Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered.
- Ability to work independently on projects and to collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team-building, interpersonal, analytical, and problem identification and resolution skills.
- Experience working with multi-level business communities.
- Can effectively utilise SQL and/or an available BI tool to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems and issues.
- Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data.
- Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks.
- Deals effectively with all team members and builds strong working relationships and rapport with them.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.

Posted 5 days ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Kolkata, Pune, Bengaluru

Work from Office

About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Azure Data Engineer
Qualification: Any graduate or above
Relevant Experience: 4 to 10 years
Must-Have Skills: Azure, ADB (Azure Databricks), PySpark

Roles and Responsibilities:
- Strong experience in the implementation and management of a lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL); a sketch follows this listing.
- Strong hands-on expertise with SQL, Python, Apache Spark, and Delta Lake.
- Proficiency in data integration techniques, ETL processes, and data pipeline architectures.
- Demonstrable experience using Git and building CI/CD pipelines for code management.
- Develop and maintain technical documentation for the platform.
- Ensure the platform is developed with software engineering, data analytics, and data security practices in mind.
- Develop and optimize data processing and data storage systems, ensuring high performance, reliability, and security.
- Experience working in Agile methodology, well-versed in using ADO Boards for sprint deliveries.
- Excellent communication skills; able to communicate technical and business concepts clearly, both verbally and in writing.
- Ability to work in a team environment and collaborate effectively at all levels by sharing ideas and knowledge.

Location: Kolkata, Pune, Mumbai, Bangalore, BBSR
Notice Period: Immediate / 90 days
Shift Timing: General shift
Mode of Interview: Virtual
Mode of Work: WFO

Thanks & Regards,
Bhavana B
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India
Direct Number: 8067432454
bhavana.b@blackwhite.in | www.blackwhite.in
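For illustration, a minimal sketch of the lakehouse write path such a role covers: PySpark reading raw CSV from ADLS Gen2 and writing it to a Delta table. The storage account, container, and partition column are hypothetical, and authentication to the storage account is assumed to be configured already.

```python
# Read raw files from the ADLS Gen2 landing container (hypothetical paths).
df = spark.read.option("header", True).csv(
    "abfss://landing@mystorageacct.dfs.core.windows.net/sales/2024/"
)

# Write to the lakehouse silver layer as Delta, partitioned for pruning.
(
    df.write.format("delta")
    .mode("append")
    .partitionBy("region")  # hypothetical partition column
    .save("abfss://lake@mystorageacct.dfs.core.windows.net/silver/sales")
)
```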

Posted 5 days ago

Apply

12.0 - 16.0 years

1 - 1 Lacs

Hyderabad

Remote

We're Hiring: Azure Data Factory (ADF) Developer - Hyderabad

Location: Onsite at Canopy One Office, Hyderabad / Remote
Type: Full-time/Part-time/Contract | Offshore role | Must be available to work in the Eastern Time Zone (EST)

We're looking for an experienced ADF Developer to join our offshore team supporting a major client. This role focuses on building robust data pipelines using Azure Data Factory (ADF) and working closely with client stakeholders on transformation logic and data movement.

Key Responsibilities:
- Design, build, and manage ADF data pipelines.
- Implement transformations and aggregations based on the mappings provided.
- Work with data from the bronze (staging) area, pre-loaded via Boomi.
- Collaborate with client-side data managers (based in EST) to deliver clean, reliable datasets.

Requirements:
- Proven hands-on experience with Azure Data Factory.
- Strong understanding of ETL workflows and data transformation.
- Familiarity with data staging / bronze layer concepts.
- Willingness to work Eastern Time Zone (EST) hours.

Preferred Qualifications:
- Knowledge of Kimball data warehousing (a huge advantage!).
- Experience working in an offshore coordination model.
- Exposure to Boomi is a plus.

Posted 5 days ago

Apply

5.0 - 7.0 years

18 - 25 Lacs

Hyderabad

Work from Office

Databricks, Azure, BigQuery (must be strong in SQL), Python; familiarity with data science concepts or implementations.

Posted 5 days ago

Apply

3.0 - 8.0 years

6 - 14 Lacs

Ahmedabad

Work from Office

Role & Responsibilities:
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines in a scheduler via Airflow (see the sketch after this list).

Preferred Candidate Profile:
- Bachelor's and/or master's degree in computer science, or equivalent experience.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience with SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack.
- ETL with batch and streaming (Kinesis) is desirable.
- Experience building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.
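As referenced in the responsibilities list, here is a hedged sketch of Airflow orchestration triggering a Databricks notebook run (Airflow 2.x with the apache-airflow-providers-databricks package); the connection id, cluster spec, and notebook path are hypothetical.

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

# Daily DAG that submits a one-off Databricks job run.
with DAG(
    "daily_dw_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = DatabricksSubmitRunOperator(
        task_id="run_transform_notebook",
        databricks_conn_id="databricks_default",  # assumed Airflow connection
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Repos/etl/transform_sales"},
    )
```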

Posted 5 days ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks-based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- ETL optimization, designing, coding, and tuning big data processes using Apache Spark.
- Build data pipelines and applications to stream and process datasets at low latencies.
- Handle data efficiently: track data lineage, ensure data quality, and improve the discoverability of data.

Technical Experience:
- Minimum of 5 years of experience with Databricks engineering solutions on the AWS Cloud platform using PySpark, Databricks SQL, and data pipelines using Delta Lake.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery.

Email: maya@mounttalent.com

Posted 6 days ago

Apply

7.0 - 10.0 years

19 - 27 Lacs

Hyderabad

Hybrid

We are seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have a strong background in programming, data management, and cloud infrastructure, with a focus on designing and implementing efficient data solutions. This role requires a minimum of 5+ years of experience and a deep understanding of Azure services, infrastructure, and ETL/ELT solutions.

Key Responsibilities:
- Azure Infrastructure Management: Own and maintain all aspects of Azure infrastructure, recommending modifications to enhance reliability, availability, and scalability.
- Security Management: Manage security aspects of Azure infrastructure, including network, firewall, private endpoints, encryption, PIM, and permissions management using Azure RBAC and Databricks roles.
- Technical Troubleshooting: Diagnose and troubleshoot technical issues in a timely manner, identifying root causes and providing effective solutions.
- Infrastructure as Code: Create and maintain Azure Infrastructure as Code using Terraform and GitHub Actions.
- CI/CD Pipelines: Configure and maintain CI/CD pipelines using GitHub Actions for Azure services such as ADF, Databricks, Storage, and Key Vault.
- Programming Expertise: Use programming languages such as Python to develop and maintain data engineering solutions.
- Generative AI and Language Models: Knowledge of Large Language Models (LLMs) and generative AI is a plus, enabling integration of advanced AI capabilities into data workflows.
- Real-Time Data Streaming: Use Kafka for real-time data streaming and integration, ensuring efficient data flow and processing.
- Data Management: Proficiency in Snowflake for data wrangling and management, optimizing data structures for analysis.
- DBT: Build and maintain data marts and views using DBT, ensuring data is structured for optimal analysis.
- ETL/ELT Solutions: Design ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, leveraging methodologies to acquire data from various structured or semi-structured source systems.
- Communication: Explain technical issues and solutions clearly to the Engineering Lead and key stakeholders (as required).

Qualifications:
- Minimum of 5+ years of experience designing ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, or Snowflake.
- Expertise in programming languages such as Python.
- Experience with Kafka for real-time data streaming and integration.
- Proficiency in Snowflake for data wrangling and management.
- Proven ability to use DBT to build and maintain data marts and views.
- Experience creating and maintaining Azure Infrastructure as Code using Terraform and GitHub Actions.
- Ability to configure, set up, and maintain GitHub for various code repositories.
- Experience creating and configuring CI/CD pipelines using GitHub Actions for various Azure services.
- In-depth understanding of managing security aspects of Azure infrastructure.
- Strong problem-solving skills and the ability to diagnose and troubleshoot technical issues.
- Excellent communication skills for explaining technical issues and solutions.

Posted 6 days ago

Apply