995 Databricks Jobs - Page 21

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

8.0 - 10.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Role Purpose: Create exceptional architectural solution design and thought leadership, and enable delivery teams to provide exceptional client engagement and satisfaction.

1. Develop architectural solutions for new deals and major change requests in existing deals:
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions (systems, shared infrastructure services, applications) to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture.
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions, and identify improvements, options, and tradeoffs to define target-state solutions.
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly.
- Evaluate and recommend solutions to integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
- Produce detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail.
- Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view.
- Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant fixes.
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions and frameworks:
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of the enterprise framework and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team while presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding:
- Ensure completion of necessary trainings and certifications.
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits.
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
- Mentor developers, designers, and junior architects in the project for their further career development and enhancement.
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management:
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team.
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Databricks - Data Engineering.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Gurugram

Hybrid

Exciting opportunity for an ML Platform Specialist to join a leading technology-driven firm. You will design, deploy, and maintain scalable machine learning infrastructure with a strong focus on Databricks, model lifecycle management, and MLOps practices. Location: Gurugram (Hybrid).

Your Future Employer: Our client is a leading digital transformation partner driving innovation across industries. With a strong focus on data-driven solutions and cutting-edge technologies, they are committed to fostering a collaborative and growth-focused environment.

Responsibilities:
- Designing and implementing scalable ML infrastructure on the Databricks Lakehouse
- Building CI/CD pipelines and workflows for the machine learning lifecycle
- Managing model monitoring, versioning, and registry using MLflow and Databricks
- Collaborating with cross-functional teams to optimize machine learning workflows
- Driving continuous improvement in MLOps and automation strategies

Requirements:
- Bachelor's or Master's in Computer Science, ML, Data Engineering, or a related field
- 3-5 years of experience in MLOps, with strong expertise in Databricks and Azure ML
- Proficiency in Python, PySpark, MLflow, Delta Lake, and Databricks Feature Store
- Hands-on experience with cloud platforms (Azure/AWS/GCP), CI/CD, and Git
- Knowledge of Terraform, Kubernetes, Azure DevOps, and distributed computing is a plus

What's in it for you:
- Competitive compensation with performance-driven growth opportunities
- Work on cutting-edge MLOps infrastructure and enterprise-scale ML solutions
- Collaborative, diverse, and innovation-driven work culture
- Continuous learning, upskilling, and career development support
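
For context on the model versioning and registry workflow named above, here is a minimal MLflow sketch; the experiment path, model name, and metric values are illustrative assumptions, not taken from the posting:

```python
import mlflow
import mlflow.sklearn
from mlflow.tracking import MlflowClient
from sklearn.linear_model import LinearRegression

mlflow.set_experiment("/Shared/demand-forecast")  # hypothetical experiment path

with mlflow.start_run() as run:
    model = LinearRegression().fit([[0.0], [1.0]], [0.0, 1.0])  # toy training data
    mlflow.log_metric("rmse", 0.42)                             # illustrative metric
    mlflow.sklearn.log_model(model, "model")

# Register the logged artifact as a named, versioned registry entry.
model_uri = f"runs:/{run.info.run_id}/model"
result = mlflow.register_model(model_uri, "demand_forecast")    # hypothetical name

# Promote the new version through lifecycle stages for a controlled rollout.
MlflowClient().transition_model_version_stage(
    name="demand_forecast", version=result.version, stage="Staging"
)
```

On Databricks, the same calls work against the workspace registry; newer MLflow versions favour model aliases over stages, but the register-then-promote pattern is the same.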

Posted 2 weeks ago

Apply

7.0 years

7 - 17 Lacs

Gurugram

Hybrid

Position: Azure Data Engineer
Experience: 4-7 years
Location: Gurugram
Type: Full-time
Notice period: Immediate to 30 days
Preferred certifications: Azure Data Engineer Associate, Databricks

About the Role: We are looking for a skilled Azure Data Engineer with 4-7 years of experience in Azure Data Services, including Azure Data Factory (ADF), Synapse Analytics, and Databricks. The candidate will play a key role in developing and maintaining data solutions on Azure.

Key Responsibilities:
- Develop and implement data pipelines using Azure Data Factory and Databricks.
- Work with stakeholders to gather requirements and translate them into technical solutions.
- Migrate data from various sources to Azure Data Lake.
- Optimize data processing workflows for performance and scalability.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with data architects and other team members to design and implement data solutions.

Required Skills:
- Strong experience with Azure Data Services, including Azure Data Factory (ADF), Synapse Analytics, and Databricks.
- Proficiency in SQL, data transformation, and ETL processes.
- Hands-on experience with Azure Data Lake migrations and Python/PySpark.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.

Preferred Qualifications: Azure Data Engineer Associate certification; Databricks certification.

Mandatory skill set: PySpark, Databricks, Python, and SQL.
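
As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch that lands raw files into a Delta table on Azure Data Lake; the storage paths and column names are hypothetical placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-ingest").getOrCreate()

# Hypothetical ADLS Gen2 paths; a real job would read these from config.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders_delta/"

# Read raw CSVs, apply light transformations, and enforce a basic quality rule.
orders = (
    spark.read.option("header", True).csv(raw_path)
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull())  # simple data-quality gate
)

# Write as Delta, partitioned for downstream query performance.
orders.write.format("delta").mode("overwrite") \
    .partitionBy("order_date").save(curated_path)
```

In practice ADF would orchestrate this notebook or job on a schedule, passing the paths in as pipeline parameters.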

Posted 2 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

With 4-9 years of experience. Build ETL/ELT pipelines with Azure Data Factory, Azure Databricks (Spark), Azure Data Lake, Azure SQL Database, and Synapse. Minimum 3 years of hands-on development experience. Solid knowledge of data modelling, relational databases, BI, and data warehousing. Demonstrated expertise in SQL. Good to have: experience with CI/CD, cloud architectures, NoSQL databases, Azure Analysis Services, and Power BI. Working knowledge of or experience in Agile and DevOps. Good written and verbal communication skills (English). Ability to work with geographically diverse teams via collaborative technologies.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Happiest Minds Technologies Pvt. Ltd. is looking for a Databricks professional to join our dynamic team and embark on a rewarding career journey. Responsibilities include: assessing and analyzing client requirements related to data processing, analytics, and machine learning; designing and developing data pipelines, workflows, and applications on the Databricks platform; integrating and connecting Databricks with other data sources, databases, and tools; and developing and implementing machine learning models using libraries such as scikit-learn, TensorFlow, and PyTorch. Skills: Databricks, Spark, Python, Core ML, pipeline creation, Airflow, Snowflake.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Azure Databricks experience: 3 to 5 years. Proficiency in Databricks, Apache Spark, and Delta Lake. Strong understanding of cloud platforms such as AWS, Azure, or GCP. Experience with SQL, Python, Scala, and/or R. Familiarity with data warehousing concepts and ETL processes. Problem solving: excellent analytical and problem-solving skills with keen attention to detail. Databricks Associate certification.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

- 4-6 years of experience building resilient, highly available, and scalable cloud-native platforms and solutions.
- Extensive experience with the .NET framework and related technologies: C#, Web API.
- Experience with a broad range of Azure services, mainly: Web Apps, WebJobs, Storage, Azure Key Vault, Blueprint Assignment, Azure Policy, Azure Service Bus.
- Expertise in the creation and usage of ARM templates is required.
- Knowledge of deploying infrastructure as code using tools such as Terraform is required.
- Advanced knowledge of IaaS and PaaS services on Azure; comprehensive understanding of the Azure platform and services.
- Knowledge of monitoring tools (Application Insights) is required.
- Knowledge of IAM (Identity and Access Management) is needed.
- Additional services: App Insights, Azure SQL DB, Cosmos DB, Functions, Azure Bot Service, ExpressRoute, Azure VM, Azure VNet, Azure Active Directory, Azure AD B2C, Azure Analytics Services (Azure Analysis Services, SQL Data Warehouse, Data Factory, Databricks).
- Develop and maintain an Azure-based cloud solution, with an emphasis on best-practice cloud security.
- Automate tasks using Azure DevOps and CI/CD pipelines; strong knowledge of DevOps and tools in Azure.
- Expertise in one of PowerShell, Python, .NET, or C# is preferable.
- Infrastructure and application monitoring across production and non-production platforms.
- Experience with DevOps orchestration, configuration, and continuous-integration management technologies.
- Knowledge of hybrid public cloud design concepts.
- Good understanding of high availability and disaster recovery concepts for infrastructure.
- Problem solving: ability to analyze and resolve complex infrastructure resource and application deployment issues.
- Excellent communication skills, understanding of customer needs, negotiation skills, and vendor management skills.
- Education: Bachelor's degree in Computer Science, Business Information Systems, or relevant experience and accomplishments.

Technical skills: 1. Cloud provisioning and management: Azure. 2. Programming languages: C#, .NET Core, PowerShell. 3. Web APIs.

Posted 2 weeks ago

Apply

2.0 - 8.0 years

6 - 10 Lacs

Kolkata, Mumbai, Hyderabad

Work from Office

ROLE 1:
- Power BI and AAS expert (Strong SC or Specialist Senior)
- Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services
- Should be able to write and test DAX queries
- Should be able to generate paginated reports in Power BI
- Should have a minimum of 3 years' working experience delivering projects in Power BI

ROLE 2:
- Databricks expert (Strong SC or Specialist Senior)
- Should have a minimum of 3 years' working experience writing code in Spark and Scala

ROLE 3:
- One Azure backend expert (Strong SC or Specialist Senior)
- Should have hands-on experience working with ADLS, ADF, and Azure SQL DW
- Should have a minimum of 3 years' working experience delivering Azure projects

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Chennai

Hybrid

Job Title: Data Engineer
Location: Chennai, Bangalore, Pune, Hyderabad (Hybrid)
Job Type: Permanent Employee

Skillsets required:
- Application and API development
- SQL data modeling and automation
- Experience working with GIS map services and spatial databases
- Experience creating GIS map services
- Data and application architecture
- Handling of legacy data
- Familiarity with client work processes and data is a plus

Platforms: Databricks, Snowflake, ESRI ArcGIS/ArcSDE, and a new GenAI app under development.

Tasks:
- Combine and integrate spatial databases from different sources for use with the new GenAI application
- Build map services with associated metadata to support questions from geoscience users
- Set up the necessary update cycles for databases and map services to ensure evergreen results
- Help construct APIs for these databases and map services to structure the best possible workflows for users
- Assist with data and application architecture
- Help with handling legacy data, such as links to existing applications, databases, and services
- Ensure IT requirements are met as the project is built, including integration, data tiers, access control, and status monitoring

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 22 Lacs

Bengaluru

Hybrid

Job Summary: We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities:
- Design and implement ETL/ELT pipelines using Databricks and PySpark.
- Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets.
- Develop high-performance SQL queries and optimize Spark jobs.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Ensure data quality and compliance across all stages of the data lifecycle.
- Implement best practices for data security and lineage within the Databricks ecosystem.
- Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills:
- Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
- Strong hands-on skills with PySpark and Spark SQL.
- Solid experience writing and optimizing complex SQL queries.
- Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
- Experience with cloud platforms like Azure or AWS.
- Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with tools like Airflow, Git, Azure Data Factory, or dbt.
- Exposure to streaming data and real-time processing.
- Knowledge of DevOps practices for data engineering.
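
To make the Unity Catalog governance work concrete, here is a minimal sketch of the permission setup it implies, run from a Databricks notebook; the catalog, schema, table, and group names are hypothetical:

```python
# Runs inside a Databricks notebook, where `spark` is provided by the runtime.
# Unity Catalog uses a three-level namespace: catalog.schema.table.

spark.sql("CREATE CATALOG IF NOT EXISTS analytics")  # hypothetical catalog
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.sales")

# Grant least-privilege access to a workspace group; USE grants are needed
# on the containing catalog and schema before table-level SELECT is usable.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data-analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.sales TO `data-analysts`")
spark.sql("GRANT SELECT ON TABLE analytics.sales.orders TO `data-analysts`")

# Audit the current grants on the table.
display(spark.sql("SHOW GRANTS ON TABLE analytics.sales.orders"))
```

Lineage and audit logs are captured automatically by Unity Catalog for tables accessed this way, which is what makes the governance responsibilities above tractable.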

Posted 3 weeks ago

Apply

8.0 - 12.0 years

25 - 30 Lacs

Chennai

Work from Office

Job Title: Manager, Data Engineer - Azure
Location: Chennai (On-site)
Experience: 8-12 years
Employment Type: Full-Time

About the Role: We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions.

Key Responsibilities:
- Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB).
- Design robust and scalable data models, including relational, dimensional, and NoSQL schemas.
- Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg.
- Integrate data governance, quality, and security best practices into all architecture designs.
- Support analytics and machine learning initiatives through structured data pipelines and platforms.
- Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs.
- Drive CI/CD integration with Databricks using Azure DevOps and tools like dbt.
- Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability.
- Stay current with Azure platform advancements and recommend improvements.

Required Skills & Experience:
- Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse.
- Expertise in data modeling and design (relational, dimensional, NoSQL).
- Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures.
- Proficiency in Python, SQL, Scala, and/or Java.
- Strong knowledge of data governance, security, and compliance frameworks.
- Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates).
- Familiarity with BI and analytics tools such as Power BI or Tableau.
- Excellent communication, collaboration, and stakeholder management skills.
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.

Preferred Qualifications:
- Experience in regulated industries (finance, healthcare, etc.).
- Familiarity with data cataloging, metadata management, and machine learning integration.
- Leadership experience guiding teams and presenting architectural strategies to leadership.

Why Join Us?
- Work on cutting-edge cloud data platforms in a collaborative, innovative environment.
- Lead strategic data initiatives that impact enterprise-wide decision-making.
- Competitive compensation and opportunities for professional growth.
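
One common building block of the Delta-based ELT pipelines this role describes is an idempotent upsert via MERGE. A minimal PySpark sketch, assuming a Delta target table; the table and column names are hypothetical:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

# Incoming batch of changed rows; in practice this arrives from ADF or a stream.
updates = spark.createDataFrame(
    [(1, "alice@example.com"), (2, "bob@example.com")],
    ["customer_id", "email"],
)

# Upsert into the target Delta table so reruns of the same batch are idempotent.
target = DeltaTable.forName(spark, "analytics.sales.customers")  # hypothetical table
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()     # update existing customers
    .whenNotMatchedInsertAll()  # insert new customers
    .execute()
)
```

Because MERGE reconciles on the business key, replaying a failed batch does not create duplicates, which simplifies the CI/CD and recovery story mentioned above.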

Posted 3 weeks ago

Apply

5.0 - 8.0 years

14 - 18 Lacs

Chennai

Work from Office

Job Title: Lead Data Engineer - Azure | GeakMinds | Chennai
Location: Chennai (On-site)
Experience: 5-8 years
Employment Type: Full-Time

About the Role: We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions.

Key Responsibilities:
- Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB).
- Design robust and scalable data models, including relational, dimensional, and NoSQL schemas.
- Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg.
- Integrate data governance, quality, and security best practices into all architecture designs.
- Support analytics and machine learning initiatives through structured data pipelines and platforms.
- Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs.
- Drive CI/CD integration with Databricks using Azure DevOps and tools like dbt.
- Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability.
- Stay current with Azure platform advancements and recommend improvements.

Required Skills & Experience:
- Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse.
- Expertise in data modeling and design (relational, dimensional, NoSQL).
- Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures.
- Proficiency in Python, SQL, Scala, and/or Java.
- Strong knowledge of data governance, security, and compliance frameworks.
- Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates).
- Familiarity with BI and analytics tools such as Power BI or Tableau.
- Excellent communication, collaboration, and stakeholder management skills.
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.

Preferred Qualifications:
- Experience in regulated industries (finance, healthcare, etc.).
- Familiarity with data cataloging, metadata management, and machine learning integration.
- Leadership experience guiding teams and presenting architectural strategies to leadership.

Why Join Us?
- Work on cutting-edge cloud data platforms in a collaborative, innovative environment.
- Lead strategic data initiatives that impact enterprise-wide decision-making.
- Competitive compensation and opportunities for professional growth.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

12 - 18 Lacs

Lucknow

Work from Office

- Analyze and organize raw data from various sources
- Build and maintain data systems and pipelines
- Prepare data for prescriptive and predictive modeling
- Combine raw information to generate valuable insights
- Enhance data quality and reliability

Posted 3 weeks ago

Apply

6.0 - 11.0 years

18 - 25 Lacs

Hyderabad

Hybrid

Primary Responsibilities:
- Design, code, test, document, and maintain high-quality, scalable data pipelines/solutions in the cloud.
- Work in both dev and ops; should be open to working in ops with flexible timings.
- Ingest and transform data using a variety of technologies from a variety of sources (APIs, streaming, files, databases); see the streaming sketch after this listing.
- Develop reusable patterns and encourage innovation that will increase the team's velocity.
- Design and develop applications in an agile environment; deploy using CI/CD.
- Participate in prototyping as well as design and code reviews; own or assist with incident and problem management.
- Self-starter who can learn quickly, is enthusiastic, and actively engaged.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree in a technical domain.
- Experience with Databricks, Python, Spark, PySpark, SQL, and Azure Data Factory.
- Design and implementation of a data warehouse/data lake (Databricks/Snowflake).
- Data architecture and data modelling.
- Operations processes, reporting from operations, incident resolution.
- GitHub Actions, Jenkins, or a similar CI/CD tool; cloud CI/CD; GitHub.
- NoSQL and relational databases.

Preferred Qualifications:
- Experience or knowledge of Apache Kafka.
- Experience or knowledge of data ingestion from a variety of APIs.
- Working in an Agile/Scrum environment.
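
As a sketch of the streaming ingestion mentioned above (Kafka is listed as preferred), here is a minimal Spark Structured Streaming job writing to Delta; the broker, topic, and paths are hypothetical placeholders, not the employer's actual stack:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Read a Kafka topic as an unbounded stream; broker and topic are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
)

# Append to a Delta table; the checkpoint makes the sink fault-tolerant.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/orders_raw")                           # placeholder path
)
query.awaitTermination()
```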

Posted 3 weeks ago

Apply

5.0 - 9.0 years

11 - 12 Lacs

Bengaluru

Work from Office

5 to 9 years of experience. Nice to have: worked in the HP ecosystem (FDL architecture). The Databricks + SQL combination is a must. EXPERIENCE: 6-8 years. SKILLS: Primary skill: Data Engineering. Sub-skill(s): Data Engineering. Additional skill(s): Databricks, SQL.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

12 - 14 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE: At Amgen, we believe that innovation can and should be happening across the entire company. Part of the Artificial Intelligence & Data function of the Amgen Technology and Medical Organizations (ATMOS), the AI & Data Innovation Lab (the Lab) is a center for exploration and innovation, focused on integrating and accelerating new technologies and methods that deliver measurable value and competitive advantage. We've built algorithms that predict bone fractures in patients who haven't even been diagnosed with osteoporosis yet. We've built software to help us select clinical trial sites so we can get medicines to patients faster. We've built AI capabilities to standardize and accelerate the authoring of regulatory documents so we can shorten the drug approval cycle. And that's just a part of the beginning. Join us!

We are seeking a Senior DevOps Software Engineer to join the Lab's software engineering practice. This role is integral to developing top-tier talent, setting engineering best practices, and evangelizing full-stack development capabilities across the organization. The Senior DevOps Software Engineer will design and implement deployment strategies for AI systems using the AWS stack, ensuring high availability, performance, and scalability of applications.

Roles & Responsibilities:
- Design and implement deployment strategies using the AWS stack, including EKS, ECS, Lambda, SageMaker, and DynamoDB.
- Configure and manage CI/CD pipelines in GitLab to streamline the deployment process.
- Develop, deploy, and manage scalable applications on AWS, ensuring they meet high standards for availability and performance.
- Implement infrastructure as code (IaC) to provision and manage cloud resources consistently and reproducibly.
- Collaborate with AI product design and development teams to ensure seamless integration of AI models into the infrastructure.
- Monitor and optimize the performance of deployed AI systems, addressing any issues related to scaling, availability, and performance.
- Lead and develop standards, processes, and best practices for the team across the AI system deployment lifecycle.
- Stay updated on emerging technologies and best practices in AI infrastructure and AWS services to continuously improve deployment strategies.
- Familiarity with AI concepts such as traditional AI, generative AI, and agentic AI, with the ability to learn and adopt new skills quickly.

Functional Skills:
- Deep expertise in designing and maintaining CI/CD pipelines, enabling software engineering best practices across the software product development lifecycle.
- Ability to implement automated testing, build, deployment, and rollback strategies.
- Advanced proficiency managing and deploying infrastructure on the AWS cloud platform, including cost planning, tracking, and optimization.
- Proficiency with backend languages and frameworks (Python; FastAPI and Flask preferred).
- Experience with databases (Postgres/DynamoDB).
- Experience with microservices architecture and containerization (Docker, Kubernetes).

Good-to-Have Skills:
- Familiarity with enterprise software systems in the life sciences or healthcare domains.
- Familiarity with big data platforms and experience in data pipeline development (Databricks, Spark).
- Knowledge of data security, privacy regulations, and scalable software solutions.

Soft Skills:
- Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
- Ability to foster a collaborative and innovative work environment.
- Strong problem-solving abilities and attention to detail.
- High degree of initiative and self-motivation.

Basic Qualifications:
- Bachelor's degree in Computer Science, AI, Software Engineering, or a related field.
- 8+ years of experience in full-stack software engineering.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Role & responsibilities Job Description: We are looking for an experienced Cognos Developer with expertise in Framework Management and Admin tasks to join our team. The ideal candidate will have 6+ years of hands-on experience in Cognos development and support, with a strong focus on managing Cognos environments and optimizing reporting solutions. Key Responsibilities: Develop and maintain Cognos Framework Manager models and reports. Perform Cognos Admin tasks, including installation, configuration, and troubleshooting. Support and maintain Cognos BI environments (reporting, security, and user management). Work closely with business teams to understand reporting requirements and design solutions. Perform upgrades, patches, and system optimizations. Ensure high availability and performance of Cognos reporting applications. Provide technical support for Cognos users and resolve issues. Required Skills: 6+ years of experience in Cognos BI development and administration. Strong knowledge of Cognos Framework Manager , Cognos Report Studio , and Cognos Administration . Experience in creating and managing data models, reports, and dashboards. Proficiency in SQL and database concepts. Good understanding of security and deployment in Cognos environments. Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Overview: As a Data Modeler & Functional Data Senior Analyst, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analysing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse to satisfy project requirements. The role advocates Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the Data Modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like Master Data, Finance, Revenue Management, Supply Chain, Manufacturing, and Logistics. The primary responsibility of this role is to work with Data Product Owners, Data Management Owners, and Data Engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to existing or new applications/reporting.
- Lead and support assigned project contractors (both onshore and offshore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of enhancements or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management of business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
- Analyze/profile source data and identify issues that impact accuracy, completeness, consistency, integrity, timeliness, and validity.
- Create source-to-target mapping documents, including identifying and documenting data transformations.
- Assume accountability and responsibility for assigned product delivery; be flexible and able to work with ambiguity, changing priorities, tight timelines, and critical situations/issues.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications:
- BA or BS degree required in Data Science/Management/Engineering, Business Analytics, Information Systems, Software Engineering, or a related technology discipline.
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture/integration.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- 3+ years of functional experience with SAP Master Data Governance (MDG), including use of T-codes to create/update records and query tables; extensive knowledge of all core master data tables, reference tables, and IDoc structures.
- 3+ years of experience with customer and supplier master data.
- Strong SQL skills, with the ability to understand and write complex queries.
- Strong understanding of data lifecycle, integration, and Master Data Management principles.
- Excellent verbal and written communication and collaboration skills.
- Strong Excel skills for data analysis and manipulation.
- Strong analytical and problem-solving skills.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience integrating multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment/CI tools.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Experience mapping disparate data sources into a common canonical model.

Differentiating Competencies:
- Experience with metadata management, data lineage, and data glossaries.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning.
- Familiarity with business intelligence tools (such as Power BI).
- CPG industry experience.
- Experience with material, location, finance, supply chain, logistics, manufacturing, and revenue management data.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Overview: As an Analyst, Data Modeler, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analysing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse to satisfy project requirements. The role advocates Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the Data Modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to existing or new applications/reporting.
- Support assigned project contractors (both onshore and offshore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for the proper management of business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications:
- Bachelor's degree required in Computer Science, Data Management/Analytics/Science, Information Systems, Software Engineering, or a related technology discipline.
- 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience integrating multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment/CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
- Excellent verbal and written communication and collaboration skills.

Posted 3 weeks ago

Apply

9.0 - 11.0 years

11 - 13 Lacs

Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role will be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You will be a key contributor to unit-level and organizational initiatives, with an objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology -> Oracle Industry Solutions -> Retail Merchandise.
Technical skills:
- Proficiency in programming languages like Python and R for data manipulation and analysis
- Expertise in machine learning algorithms and statistical modeling techniques
- Familiarity with data warehousing and data pipelines
- Experience with data visualization tools like Tableau or Power BI
- Experience with cloud platforms (e.g., ADF, Databricks, Azure) and their AI services
Consulting skills:
- Hypothesis-driven problem solving
- Go-to-market pricing and revenue growth execution
- Advisory, presentation, and data storytelling
- Project leadership and execution

Preferred Skills: Technology -> Oracle Industry Solutions -> Retail Merchandise -> Oracle Retail Price Management (ORPM).

Additional Responsibilities:
- Good knowledge of software configuration management systems
- Strong business acumen, strategy, and cross-industry thought leadership
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Knowledge of two or three industry domains
- Understanding of the financial processes for various types of projects and the various pricing models available
- Client interfacing skills
- Knowledge of SDLC and agile methodologies
- Project and team management

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom.
Service Line: Data & Analytics Unit.
* Location of posting is subject to business requirements.

Posted 3 weeks ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Pune

Work from Office

Your Role: As a senior software engineer with Capgemini, you will have 6+ years of experience in Azure technology with a strong project track record. Key attributes for this role:
- Strong customer orientation, decision making, problem solving, communication, and presentation skills
- Very good judgement skills and the ability to shape compelling solutions and solve unstructured problems with assumptions
- Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies
- Strong executive presence and entrepreneurial spirit
- Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority

Your Profile:
- Experience with Azure Databricks and Data Factory
- Experience with Azure data components such as Azure SQL Database, Azure SQL Warehouse, and Synapse Analytics
- Experience in Python/PySpark/Scala/Hive programming
- Experience with Azure Databricks (ADB) is a must-have
- Experience with building CI/CD pipelines in data environments

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Remote

Role & Responsibilities:
- Develop, maintain, and enhance new data sources and tables, contributing to data engineering efforts to ensure a comprehensive and efficient data architecture.
- Serve as the liaison between the Data Engineering team and the airport operations teams, developing new data sources and overseeing enhancements to the existing database; act as one of the main contact points for data requests, metadata, and statistical analysis.
- Migrate all existing Hive Metastore tables to Unity Catalog, addressing access issues and ensuring a smooth transition of jobs and tables (see the sketch after this listing).
- Collaborate with IT teams to validate package (gold-level data) table outputs during the production deployment of developed notebooks.
- Develop and implement data quality alerting systems and Tableau alerting mechanisms for dashboards, setting up notifications for various thresholds.
- Create and maintain standard reports and dashboards to provide insights into airport performance, helping guide stations to optimize operations and improve performance.

Preferred Candidate Profile:
- Master's degree / UG
- 5-10 years of experience
- Databricks (Azure)
- Good communication
- Experience developing solutions on a big data platform using tools such as Impala and Spark
- Advanced knowledge of/experience with Azure Databricks, PySpark, and Teradata/Databricks SQL
- Advanced knowledge of/experience in Python, along with associated development environments (e.g., JupyterHub, PyCharm)
- Advanced knowledge of/experience in building Tableau, QlikView, or Power BI dashboards
- Basic knowledge of HTML and JavaScript
- Immediate joiner

Skills, Licenses & Certifications:
- Strong project management skills
- Proficient with Microsoft Office applications (Excel, Access, and PowerPoint); advanced knowledge of Excel
- Advanced aptitude in problem solving, including the ability to logically structure an appropriate analytical framework
- Proficient in SharePoint and PowerApps, with the ability to use the Graph API
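
For the Hive Metastore to Unity Catalog migration mentioned above, here is a minimal sketch of one common upgrade pattern (copying a legacy table into a managed UC table via CTAS), run from a Databricks notebook. The catalog, schema, and table names are hypothetical, and the right approach differs between managed and external tables:

```python
# Runs in a Databricks notebook where `spark` is provided by the runtime.
# Upgrade a legacy Hive Metastore table into Unity Catalog's three-level
# namespace by copying it into a managed UC table (CTAS pattern).

spark.sql("CREATE CATALOG IF NOT EXISTS ops")  # hypothetical catalog
spark.sql("CREATE SCHEMA IF NOT EXISTS ops.airport")

spark.sql("""
    CREATE TABLE IF NOT EXISTS ops.airport.flights
    AS SELECT * FROM hive_metastore.default.flights  -- legacy table
""")

# Point downstream jobs at the UC table, then verify row counts match
# before retiring the legacy copy.
legacy = spark.table("hive_metastore.default.flights").count()
migrated = spark.table("ops.airport.flights").count()
assert legacy == migrated, "row counts diverged during migration"
```

After the copy, grants and job references need to be re-pointed at the UC table, which is where the access issues the posting mentions typically surface.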

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Data Engineer

Brief Posting Description: As a Data Engineer you will work independently or with a team of data engineers on cloud technology products, projects, and initiatives. You will work with all customers, both internal and external, to design and implement features; collaborate with other technical teams across the organization as required to deliver proposed solutions; and work with tech leads, architects, and managers.

Detailed Description:
- Works with scrum masters, product owners, and others to identify new features for digital products.
- Takes responsibility for the maintenance and security of existing solutions, platforms, and frameworks, designing new ones and recommending relevant upgrades and fixes.
- Troubleshoots production issues of all levels and severities, and tracks progress from identification through resolution.
- Maintains a culture of open communication, collaboration, mutual respect, and productive behaviour.
- Identifies risks, barriers, efficiencies, and opportunities when thinking through the development approach; presents possible platform-wide architectural solutions based on facts, data, and best practices.
- Explores all technical options when considering a solution, including homegrown coding, third-party subsystems, enterprise platforms, and existing technology components.
- Actively participates in the collaborative effort through all phases of the software development life cycle (SDLC), including requirements analysis, technical design, coding, testing, release, and customer technical support.
- Develops technical documentation, such as system context diagrams, design documents, release procedures, and other pertinent artifacts.
- Collaborates accordingly while working on cross-functional projects or production issues.

Job Requirements:
EXPERIENCE: 5-8 years of preferred experience in a data engineering role; a minimum of 4 years of preferred experience in Azure data services (Data Factory, Databricks, ADLS, SQL DB, etc.).
EDUCATION: Minimum Bachelor's degree in Computer Science, Computer Engineering, or "STEM" majors (Science, Technology, Engineering, and Math).
SKILLS/REQUIREMENTS:
- Strong working knowledge of Databricks and ADF.
- Expertise working with databases and SQL.
- Strong working knowledge of code management and continuous integration systems (Azure DevOps or GitHub) preferred.
- Familiarity with Agile delivery methodologies.
- Familiarity with NoSQL databases (such as MongoDB) preferred.
- Any experience with IoT data standards like Project Haystack, Brick Schema, or Real Estate Core is an added advantage.
- Ability to multi-task and reprioritize in a dynamic environment.
- Outstanding written and verbal communication skills.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

20 - 30 Lacs

Pune

Hybrid

- SAP MDG/SAP ECC experience (T-codes, table structures, etc.); Azure Data Lake/AWS/Databricks.
- Leveraging SAP MDG/ECC experience, the candidate is able to deep-dive into root cause analysis for assigned use cases, and is able to work with Azure Data Lake (via Databricks) using SQL/Python.
- A data model (conceptual and physical) will need to be identified and built that provides an automated mechanism to monitor ongoing DQ issues. Multiple workshops may also be needed to work through various options and identify the one that is most efficient and effective.
- Works with the business (data owners/data stewards) to profile data, exposing patterns that indicate data quality issues (see the sketch after this listing). Is also able to identify the impact on specific CDEs deemed important by each individual business.
- Identifies the financial impact of a data quality issue, and can identify the business benefit (quantitative/qualitative) from a remediation standpoint, along with managing implementation timelines.
- Schedules regular working groups with businesses that have identified DQ issues and ensures progression of RCA/remediation or presentation in DGFs.
- Identifies business DQ rules on which KPIs/measures are stood up that feed into dashboarding/workflows for BAU monitoring; red flags are raised and investigated.
- Understanding of the data quality value chain is needed, starting with critical data element (CDE) concepts, data quality issues, and data quality KPIs/measures. Has experience owning and executing data quality issue assessments to aid improvements to operational processes and BAU initiatives.
- Highlights risks/hidden DQ issues to the lead/manager for further guidance/escalation.
- Supports designing, building, and deploying data quality dashboards via Power BI.
- Determines escalation paths and constructs workflows and alerts that notify process and data owners of unresolved data quality issues.
- Collaborates with IT and analytics teams to drive innovation (AI, ML, cognitive science, etc.).
- Works with business functions and projects to create data quality improvement plans.
- Sets targets for data improvements/maturity; monitors and intervenes when sufficient progress is not being made.
- Supports initiatives driving data clean-up of the existing data landscape.
- Owns and develops relevant data quality work products as part of the DAS data change methodology.
- Ensures data quality aspects are delivered as part of Gold- and Silver-tier data-related change projects.
- Supports the creation of business cases with insight into the cost of poor data.
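
The profiling step described above (exposing patterns that indicate data quality issues) can be sketched in a few lines of PySpark against a lake table; the table name and column set here are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-profile").getOrCreate()

# Hypothetical master-data table landed from SAP ECC into the lake.
df = spark.table("bronze.sap_material_master")
total = df.count()

# Per-column profile: null rate and distinct count, a first pass at spotting
# completeness and uniqueness problems for CDE candidates. A production job
# would compute these as a single aggregation rather than per-column scans.
for c in df.columns:
    nulls = df.filter(F.col(c).isNull()).count()
    distinct = df.select(c).distinct().count()
    print(f"{c}: null_rate={nulls / total:.2%}, distinct={distinct}")
```

The resulting profile feeds the DQ rules and KPI thresholds the posting describes, which in turn drive the Power BI dashboards and alerting workflows.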

Posted 3 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Experience: 6.5-13 years
Notice period: Immediate to 30 days ONLY
Work location: Bangalore, Manyata Tech Park

JD key message: hands-on coding skills.

Primary skills:
- Databricks
- Cloud (Azure or AWS)
- PySpark
- Advanced SQL
- Experience working on a data warehouse project

Secondary skills:
- Hands-on experience with data modelling (Data Vault is good to have)
- Data security on Databricks
- Hadoop
- Python

Preferred candidate profile: Interested candidates, send details to elavarasi-r@hcltech.com in the following format: Name, Contact, Email, TEX, REX, Notice Period, Current org, CTC, ECTC, Current location, Preferred location, Availability for the 12th Hackathon (F2F, Manyata Tech Park).

Posted 3 weeks ago

Apply