
112 Data Architect Jobs - Page 2

JobPe aggregates results for easy access; you apply directly on the original job portal.

9.0 - 14.0 years

20 - 32 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Naukri logo

Data Architect / Data Modeler

Scope & Responsibilities:
- Enterprise data architecture for a functional domain or a product group.
- Design and govern delivery of the domain data architecture; ensure delivery as per design.
- Ensure a consistent approach to data modelling across the different solutions of the domain.
- Design and ensure delivery of data integration across the solutions of the domain.

General Expertise:
- Critical: Methodology expertise in data architecture and modeling, from business requirements and functional specifications through to data models.
- Critical: Data warehousing and business intelligence data product modeling (Inmon, Kimball, Data Vault, and Codd modeling patterns).
- Business/functional knowledge of the domain: understanding of business terminology, knowledge of business processes related to the domain, and awareness of key principles and objectives, business trends, and evolution.
- Awareness of master data management and of data management and stewardship processes.
- Knowledge of data persistency technologies: SQL (ANSI-2003 for structured relational data querying; ANSI-2023 for XML, JSON, and property graph querying), Snowflake specifics, and database structures for performance optimization.
- Awareness of NoSQL and other data persistency technologies.
- Proficient business English and technical writing.

Nice to have:
- Project delivery experience with agile approaches and methodologies: Scrum, SAFe 5.0, product-based organization.
- Technical stack expertise: SAP PowerDesigner modeling (CDM, LDM, PDM); Snowflake general concepts, specifically DDL & DML, Snowsight, and Data Exchange/Data Sharing concepts; AWS S3 & Athena (as a query user); Confluence & Jira (as a contributing user).
- Bitbucket (as a basic user).
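The Kimball-style dimensional modeling named above can be illustrated with a toy star schema: one fact table joined to conformed dimensions. This is only a sketch (the table and column names are invented, and it uses Python's stdlib sqlite3 rather than Snowflake) to show the shape of the pattern:

```python
import sqlite3

# Toy Kimball-style star schema: one fact table and two dimension tables.
# All names here are illustrative, not taken from the job description.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fact_sales   (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        amount       REAL
    );
""")
con.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "EMEA"), (2, "Globex", "APAC")])
con.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024)])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (2, 20240101, 250.0), (1, 20240101, 50.0)])

# A typical BI query: aggregate the fact table by a dimension attribute.
rows = con.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 250.0), ('EMEA', 150.0)]
```

The same star shape carries over to Snowflake or any warehouse: facts hold measures at a declared grain, dimensions hold descriptive attributes for slicing.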

Posted 2 weeks ago

Apply

2.0 - 7.0 years

8 - 12 Lacs

Chennai

Work from Office


Position Details: Junior AI Engineer (Data Engineer)
Location: -, Tamil Nadu
Openings: 1
Salary Range:

Description:
Job Title: Junior AI Engineer / Data Engineer
Location: Chennai
Reports To: Senior AI Engineer / Data Architect

Job Summary: This role is ideal for an early-career engineer eager to build robust data pipelines and support AI/ML model development. The Junior AI Engineer will focus primarily on data preparation, transformation, and infrastructure to support scalable AI systems.

Key Responsibilities:
- Build and maintain ETL pipelines for AI applications.
- Assist in data wrangling, cleaning, and feature engineering.
- Support data scientists and AI engineers with curated, high-quality datasets.
- Contribute to data governance and documentation.
- Collaborate on proof-of-concepts and prototypes of AI solutions.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of experience in data engineering.
- Proficiency in Python and SQL; exposure to the Azure platform is a plus.
- Basic understanding of AI/ML concepts.
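The data wrangling and feature engineering duties listed for this role might look like the minimal sketch below. The field names, cleaning rules, and engineered feature are all invented for illustration; real pipelines would typically use pandas or Spark rather than plain dicts:

```python
from statistics import mean

# Hypothetical raw records as they might arrive from an upstream system;
# missing values and inconsistent casing are typical cleaning targets.
raw = [
    {"city": "Chennai ", "age": "34", "income": "52000"},
    {"city": "chennai",  "age": None, "income": "61000"},
    {"city": "Madurai",  "age": "41", "income": None},
]

def clean(records):
    """Normalize text fields, coerce numerics, and impute missing ages."""
    known_ages = [int(r["age"]) for r in records if r["age"] is not None]
    fallback_age = round(mean(known_ages))  # simple mean imputation
    out = []
    for r in records:
        out.append({
            "city": r["city"].strip().title(),
            "age": int(r["age"]) if r["age"] is not None else fallback_age,
            # Toy engineered feature: income per year of age (None if unknown).
            "income_per_age": (int(r["income"]) / int(r["age"])
                               if r["income"] and r["age"] else None),
        })
    return out

cleaned = clean(raw)
print(cleaned[0]["city"], cleaned[1]["age"])  # Chennai 38
```

The point of the sketch is the shape of the work: normalize, type-coerce, impute, and derive features before handing data to model training.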

Posted 2 weeks ago

Apply

5.0 - 10.0 years

13 - 17 Lacs

Gurugram

Work from Office


Job Title: Data Architect
Experience: 5-10 Years

Job Summary: We are looking for an experienced and highly motivated Data Architect to join our team. The ideal candidate will have a strong background in designing and implementing enterprise data solutions. You will play a critical role in shaping our data infrastructure, ensuring scalability, performance, and security across data platforms.

Key Responsibilities:
- Design and implement scalable data architectures for enterprise applications.
- Develop and maintain conceptual, logical, and physical data models.
- Define data governance policies and ensure data integrity and security.
- Collaborate with stakeholders to identify data requirements and translate them into architectural solutions.
- Lead the evaluation and selection of database technologies and tools.
- Oversee data integration, data warehousing, and ETL/ELT processes.
- Optimize database performance and manage data storage solutions.
- Ensure alignment of data architecture with business and technology strategies.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5-10 years of experience in data architecture and database design.
- Strong knowledge of relational databases (e.g., SQL Server).
- Expertise in data warehousing, ETL tools (e.g., Informatica, Talend), and big data platforms (e.g., Hadoop, Spark).
- Strong understanding of data governance, security, and compliance standards.
- Experience with cloud data platforms (e.g., AWS Redshift, Azure Synapse, Google BigQuery) is a plus.
- Excellent communication and stakeholder management skills.

Preferred Certifications (optional):
- AWS Certified Data Analytics - Specialty
- Google Professional Data Engineer
- Microsoft Certified: Azure Data Engineer Associate

Posted 2 weeks ago

Apply

15.0 - 20.0 years

12 - 16 Lacs

Hyderabad

Work from Office


Position Summary
We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation.
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud & SAC, and integrating these with an AWS data lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications
- SAP Certified Technology Associate - SAP S/4HANA / Datasphere
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)

Technical Skills
- SAP S/4HANA Data Architecture
- Datasphere, HANA Cloud, SAC, BODS
- Data Services, LTMC/LSMW, SLT, CPI-DS
- SQL, HANA Native Modeling
- Integration with non-SAP systems
- Cloud Data Architecture (Azure/AWS/GCP)
- ETL/ELT and Data Quality tools
- Data Modeling & Metadata Management

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Noida, Mumbai

Work from Office


Responsibilities:
- Act as data domain expert for Snowflake in a collaborative environment, bringing a demonstrated understanding of data management best practices and patterns.
- Design and implement robust data architectures that support business requirements, leveraging Snowflake platform capabilities.
- Develop and enforce data modeling standards and best practices for Snowflake environments.
- Develop, optimize, and maintain Snowflake data warehouses.
- Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
- Ensure data architecture solutions meet performance, security, and scalability requirements.
- Stay current with the latest trends, developments, and features in data architecture, Snowflake, and related technologies, continually enhancing our data capabilities.
- Collaborate with cross-functional teams to gather business requirements, translate them into effective Snowflake data solutions, and provide data-driven insights.
- Provide mentorship and guidance to junior data engineers and architects.
- Troubleshoot and resolve data architecture-related issues effectively.

Skills Required:
- 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
- Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
- Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
- Experience designing and developing automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
- SnowSQL experience: developing stored procedures and writing queries to analyze and transform data.
- Working experience with ETL tools such as Fivetran, dbt Labs, and MuleSoft.
- Expertise in Snowflake concepts such as resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and time travel, and in automating them.
- Must have expertise in AWS, Azure, and the Salesforce Platform as a Service (PaaS) model, and its integration with Snowflake to load/unload data.
- Relevant certifications (e.g., SnowPro Core / Advanced) are a must-have.
- Excellent problem-solving skills and attention to detail; effective communication and collaboration abilities; exceptional team player.

Educational Qualification Required:
- Master's degree in Business Management (MBA / PGDM) or Bachelor's degree in Computer Science, Information Technology, or a related field.
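Automating Snowflake administration with Python plus SnowSQL, as this listing describes, often amounts to generating and executing DDL. Below is a hedged sketch: the monitor name, warehouse name, and credit quota are invented, and the statements follow standard Snowflake resource-monitor syntax (running them requires appropriate account privileges):

```python
def resource_monitor_ddl(name: str, credit_quota: int, warehouse: str) -> list[str]:
    """Build the SQL statements that cap monthly credits on one warehouse.

    All identifiers here (rm_analytics, wh_reporting) are hypothetical.
    """
    return [
        f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {credit_quota} "
        "FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY "
        "TRIGGERS ON 90 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND;",
        f"ALTER WAREHOUSE {warehouse} SET RESOURCE_MONITOR = {name};",
    ]

stmts = resource_monitor_ddl("rm_analytics", 100, "wh_reporting")
for s in stmts:
    print(s)
```

In practice each generated statement would be executed via `snowsql -q "<statement>"` or the Snowflake Python connector; generating the DDL in code keeps monitor setup reviewable and repeatable.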

Posted 2 weeks ago

Apply


5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office


About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work.

About D&A Re
The Digital & Tech Re organization is at the forefront of driving the digital transformation of our reinsurance business units across P&C Re, L&H Re and Solutions. We strive to build an inspiring environment for our people to use data and technology to create a sustainable and strategic competitive advantage. Data & Analytics Re is the data arm of Digital & Tech Re. We create innovative analytics, data science and robust data foundation capabilities to generate data-driven insights that serve the heart of Swiss Re's business. Together with our business counterparts in Property & Casualty and Life & Health, we work daily to deliver differentiating insights, elevate underwriting excellence and effectively select and manage risk pools. Our team is an international workforce based in different locations and serving a global customer base, with a large part of our leadership located in Zurich.

About the Role
Are you an energetic Data Architect who enjoys working with teams to design and deliver exceptional data-driven solutions? As a Data Architect you will be responsible for supporting the company's ambition to become a data-driven commercial insurer. Working with both internal and external customers, you will need strong analytical and conceptual thinking, with the ability to recognize competing requirements and constraints and to balance and negotiate trade-offs. The role combines strong solution architecture awareness with a strong background in data modelling and canonical modelling, and is focused on constantly improving the data landscape with the aim of democratizing our data assets. You thrive working in an agile, collaborative environment providing knowledge and guidance on the data architecture.

Your responsibilities include:
- High-level architecture and designs for databases and data integrations, leveraging conceptual, logical and physical data models.
- End-to-end data analysis, from business analysis and data modelling through to data quality assurance, on a major global program in the Commercial Insurance sector in a fast-moving Agile environment.
- Complex SQL-based data analysis.
- Maintenance of the logical and physical data models within a CASE tool (SAP Sybase PowerDesigner), supporting both SDLC and Agile application development approaches.
- Collaborating with your customers, understanding their data needs and finding innovative ways to bring data into their daily process.
- Working with other members of the data team, including data architects, data analysts, data engineers, and data scientists.
- Creating full transparency of data so customers know what is available and how to use it.

About you
Are you eager to disrupt the insurance industry with us and make an impact? Do you wish to have your talent recognized and rewarded? Then join our growing team and become part of the next wave of insurance innovation.

Your experience:
- Proven track record of delivering results and impact.
- Minimum of 5-7 years as a Data Architect.
- Strong business acumen and a deep strategic mindset.
- Requirements gathering, analysis and prioritization for business processes and data flows, persistence and integration.
- Conceptual, logical and physical data modeling (Sybase PowerDesigner).
- Data retrieval and manipulation using SQL.
- Data architecture: design for integration of complex data flows from multiple applications.
- Data warehousing design, both dimensional and 3NF.
- SDLC and Agile/Scrum software development processes.
- Experience working with and understanding the needs of customers or clients (internal or external).
- Ability to understand and internalize an organization's strategy and culture, and to contribute to their implementation and reinforcement through technology architecture and data designs.
- A desire and openness to learning and continuous improvement, both of yourself and your team members.
- Proven analytical skills and experience making decisions based on hard and soft data.
- Commitment to the organization's new way of working through greater collaboration and breaking down of silos.
- English fluency, with excellent verbal and written communication skills.
- Excellence in team and organizational development by providing the highest quality of services.
- Experience in information & data governance would be a plus.
- Self-motivated, with good communication skills; dedication and flexibility; aptitude for fast learning.
- Ideally, experience in the insurance/reinsurance or financial industry.

Behavioural Competences:
- Excellent organizational skills.
- Excellent communication and presentation skills; ability to communicate at different levels of seniority.
- Team player; enjoys being part of a cross-functional setup.
- Ability to perform well on time-critical endeavours and on multiple fronts at a time.
- Strong dedication to quality and a client mindset.
- A passion for learning and continuous improvement, both of yourself and your team members.

We are an equal opportunity employer, and we value diversity at our company. Our aim is to live visible and invisible diversity - diversity of age, race, ethnicity, nationality, gender, gender identity, sexual orientation, religious beliefs, physical abilities, personalities and experiences - at all levels and in all functions and regions. We also collaborate in a flexible working environment, providing you with a compelling degree of autonomy to decide how, when and where to carry out your tasks. We provide feedback to all candidates via email; if you have not heard back from us, please check your spam folder. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

Keywords:
Reference Code: 133976

Posted 2 weeks ago

Apply

12.0 - 15.0 years

14 - 18 Lacs

Hyderabad

Work from Office


Lead the technical vision and strategy for the Data Engineering Center of Excellence (CoE) across cloud platforms (GCP, Azure, AWS), cloud-agnostic platforms (Databricks, Snowflake), and legacy systems. This leadership role will establish architectural standards and best practices while providing pre-sales leadership for strategic opportunities.

Key Responsibilities
- Define and drive the technical vision and roadmap for the Data Engineering CoE.
- Establish cross-cloud architecture standards and best practices, with emphasis on Azure, GCP and AWS.
- Lead pre-sales activities for strategic opportunities, particularly AWS-, Azure- and GCP-focused clients.
- Build the CoE's accelerator development framework.
- Mentor and guide pillar architects across all platforms.
- Drive platform selection decisions and integration strategies.
- Establish partnerships with key technology providers, especially cloud providers.
- Define governance models for the CoE implementation.
- Represent the organization as a technical thought leader in client engagements.

Requirements
- 12+ years of data engineering experience, with 6+ years in leadership roles.
- Deep expertise in Google Cloud Platform data services (BigQuery, Dataflow, Dataproc).
- Strong knowledge of other cloud platforms (Azure Fabric/Synapse, Data Factory,

Posted 3 weeks ago

Apply

8.0 - 12.0 years

13 - 17 Lacs

Noida

Work from Office


Job Title: Data Architect
Location: Jewar Airport, Noida
Experience: 8+ years

We are looking for a Data Architect to oversee our organization's data architecture, governance, and product lifecycle. The role focuses on managing data layers, maintaining data governance frameworks, and creating data products aligned with business objectives.

Key Responsibilities:
- Design and maintain the Lakehouse architecture, including data lake setup and management.
- Create and maintain data products, ensuring their alignment with business needs.
- Develop and enforce data governance policies, including the maintenance of a data catalog.
- Design data models and define database development standards.
- Automate workflows using Python, CI/CD pipelines, and unit tests.

Required Skills and Experience:
- Extensive experience in data architecture and data platform management.
- Expertise in data governance, data modeling, and database development.
- Proficiency in Python for automation and pipeline development.
- Familiarity with Azure data services and data processing pipelines.
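The "automate workflows using Python, CI/CD pipelines, and unit tests" requirement usually means writing small, testable pipeline steps. A minimal sketch, with the transform and all field names invented for illustration:

```python
def dedupe_latest(rows, key="id", version="updated_at"):
    """Keep only the most recent record per key - a common cleanup step
    before publishing a data product. Field names are illustrative."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[version] > latest[k][version]:
            latest[k] = row
    return sorted(latest.values(), key=lambda r: r[key])

# A unit test like this would run in the CI/CD pipeline on every commit,
# guarding the transform before it touches production data.
rows = [
    {"id": 1, "updated_at": "2024-01-01", "value": "old"},
    {"id": 1, "updated_at": "2024-02-01", "value": "new"},
    {"id": 2, "updated_at": "2024-01-15", "value": "only"},
]
result = dedupe_latest(rows)
assert [r["value"] for r in result] == ["new", "only"]
print("ok")
```

Keeping each transform as a pure function over plain records is what makes it unit-testable outside the data platform.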

Posted 3 weeks ago

Apply

8.0 - 13.0 years

16 - 22 Lacs

Noida, Chennai, Bengaluru

Work from Office


Location: Bangalore, Chennai, Delhi, Pune.

Primary Roles And Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfil them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate data pipelines in a scheduler via Airflow.

Skills And Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python and Spark (PySpark).
- Must have experience in the AWS/Azure stack.
- Desirable to have ETL with batch and streaming (Kinesis).
- Experience in building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products: Hadoop (incl. Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging & geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office


Job Title: Technical Architect / Solution Architect / Data Architect (Data Analytics)
Notice Period: Immediate to 15 days
Experience: 9+ years

Job Summary: We are looking for a highly technical and experienced Data Architect / Solution Architect / Technical Architect with expertise in data analytics. The candidate should have strong hands-on experience in solutioning, architecture, and cloud technologies to drive data-driven decisions.

Key Responsibilities:
- Design, develop, and implement end-to-end data architecture solutions.
- Provide technical leadership in Azure, Databricks, Snowflake, and Microsoft Fabric.
- Architect scalable, secure, and high-performing data solutions.
- Work on data strategy, governance, and optimization.
- Implement and optimize Power BI dashboards and SQL-based analytics.
- Collaborate with cross-functional teams to deliver robust data solutions.

Primary Skills Required:
- Data Architecture & Solutioning
- Azure Cloud (Data Services, Storage, Synapse, etc.)
- Databricks & Snowflake (Data Engineering & Warehousing)
- Power BI (Visualization & Reporting)
- Microsoft Fabric (Data & AI Integration)
- SQL (Advanced Querying & Optimization)

Looking for immediate to 15-day joiners!

Posted 3 weeks ago

Apply

10.0 - 18.0 years

20 - 35 Lacs

Kolkata, Pune, Chennai

Work from Office


Job Summary: We are seeking a Snowflake Solution Architect to drive technical sales, solution design, and client engagement for Snowflake-based data solutions. This role requires a deep understanding of Snowflake's architecture, data engineering, cloud ecosystems (AWS/Azure/GCP), and analytics workflows. The ideal candidate will work closely with sales teams, customers, and partners to showcase Snowflake's capabilities, provide technical guidance, and ensure successful adoption.

Key Responsibilities:
1. Solution Architecture:
- Act as a trusted advisor to clients, understanding business challenges and recommending Snowflake-based solutions.
- Design and present Snowflake architecture, data pipelines, and integration strategies tailored to customer needs.
- Develop proof-of-concepts (POCs) and demos to showcase Snowflake's capabilities.
2. Technical Consultation & Client Engagement:
- Conduct technical discovery sessions with clients to assess their data architecture, workflows, and pain points.
- Provide best practices for performance optimization, security, data governance, and cost efficiency in Snowflake.
- Assist with RFPs, RFIs, and technical documentation for customer proposals.
3. Collaboration & Enablement:
- Work closely with sales, product, engineering, and customer success teams to drive Snowflake adoption.
- Conduct workshops, webinars, and training sessions for clients and partners.
- Stay updated on Snowflake's latest features, roadmap, and industry trends.
4. Integration & Ecosystem Expertise:
- Provide guidance on integrating Snowflake with ETL tools (dbt, Matillion, Informatica, Fivetran), BI tools (Tableau, Power BI), and AI/ML frameworks (Databricks, Python, TensorFlow).
- Understand multi-cloud strategies and data migration best practices for moving from legacy systems to Snowflake.

Required Skills & Qualifications:
Experience: 10+ years in data architecture, pre-sales, or solution consulting, with 3+ years of hands-on Snowflake expertise.
Technical Expertise:
- Deep knowledge of Snowflake's architecture: SnowSQL, Snowpipe, Streams, Tasks, Stored Procedures.
- Strong understanding of cloud platforms (AWS, Azure, GCP).
- Proficiency in SQL, Python, or scripting languages for data operations.
- Experience with ETL/ELT tools, data integration, and performance tuning.
- Familiarity with data security, governance, and compliance standards (GDPR, HIPAA, SOC 2).
Soft Skills:
- Excellent communication, presentation, and client engagement skills.
- Ability to translate complex technical concepts into business value propositions.
- Strong problem-solving and consultative approach.

Preferred Locations: Offshore [Kochi, Trivandrum, Chennai, Pune, Kolkata]

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Requirements:
- 5+ years of experience working in data warehousing systems.
- 3+ years of strong hands-on programming expertise in the Databricks landscape, including Spark SQL and Workflows, for data processing and pipeline development.
- 3+ years of strong hands-on data transformation/ETL skills using Spark SQL, PySpark, and Unity Catalog, working in the Databricks medallion architecture.
- 2+ years of work experience in one of the cloud platforms: Azure, AWS or GCP.
- Experience using Git version control; well versed in CI/CD best practices to automate the deployment and management of data pipelines and infrastructure.
- Nice to have: hands-on experience building data ingestion pipelines from ERP systems (preferably Oracle Fusion) to a Databricks environment, using Fivetran or alternative data connectors.
- Experience in a fast-paced, ever-changing and growing environment.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- Must have report development experience using Power BI, SplashBI or any enterprise reporting tool.

Roles & Responsibilities:
- Contribute to the design and development of enterprise data solutions in Databricks, from ideation to deployment, ensuring robustness and scalability.
- Work with the Data Architect to build and maintain robust and scalable data pipeline architectures on Databricks using PySpark and SQL.
- Assemble and process large, complex ERP datasets to meet diverse functional and non-functional requirements.
- Contribute to continuous optimization efforts, implementing testing and tooling techniques to enhance data solution quality.
- Focus on improving performance, reliability, and maintainability of data pipelines.
- Implement and maintain PySpark and Databricks SQL workflows for querying and analyzing large datasets.
- Participate in release management using Git and CI/CD practices.
- Develop business reports using the SplashBI reporting tool, leveraging data from the Databricks gold layer.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Finance, or equivalent experience.
- Good communication skills.
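The bronze → silver → gold flow of the Databricks medallion architecture mentioned in this listing can be sketched in plain Python. In practice these layers would be Delta tables transformed with PySpark; the record shapes and values below are invented purely to show the layering idea:

```python
# Bronze layer: raw records exactly as ingested, including bad rows.
bronze = [
    {"order_id": "1", "amount": "100.5", "country": "IN"},
    {"order_id": "2", "amount": "bad-data", "country": "IN"},
    {"order_id": "3", "amount": "75.0", "country": "US"},
]

# Silver layer: validated and typed - rows that fail parsing are dropped.
silver = []
for row in bronze:
    try:
        silver.append({"order_id": int(row["order_id"]),
                       "amount": float(row["amount"]),
                       "country": row["country"]})
    except ValueError:
        pass  # in a real pipeline this row would land in a quarantine table

# Gold layer: business-level aggregate, ready for a reporting tool.
gold = {}
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]

print(gold)  # {'IN': 100.5, 'US': 75.0}
```

Each layer only reads from the one before it, which is what makes medallion pipelines easy to reprocess and audit.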

Posted 3 weeks ago

Apply

10.0 - 15.0 years

16 - 25 Lacs

Bengaluru

Hybrid


Experience: 10+ Years
Location: Bangalore, Hybrid
Notice Period: Immediate joiners to 30 days
Mode of Interview: 2-3 rounds
Key Skills: Snowflake or Databricks, Python or Java, Cloud, Data Modeling

Primary Skills:
- Erwin tool
- Data Modeling (Logical, Physical) - 2+ years
- Snowflake - 2+ years
- Warehousing concepts
- Any cloud

Secondary Skills:
- dbt Cloud
- Airflow
- AWS

Job Description: This is a hands-on technology position for a data technology leader with specialized business knowledge in the middle/front office areas. The candidate has a proven record of technology project execution for data on the cloud, is able to get hands-on with analysis, design, and development, and has the creativity and self-motivation to deliver on mission-critical projects.

These skills will help you succeed in this role:
- 10+ years of experience in an application development team, with hands-on architecting, designing, developing, and deployment skills.
- Demonstrated ability to translate business requirements into a technical design and through to implementation.
- Experienced subject matter expert in designing and architecting big data platforms, services, and systems using Java/Python, SQL, Databricks, Snowflake, and cloud-native tools on Azure and AWS.
- Experience in event-driven architectures, message hubs, MQ, Kafka.
- Experience in Kubernetes, ETL tools, Data as a Service, star schema, dimension modeling, OLTP, ACID, and data structures is desired.
- Proven experience with cloud and big data platforms, building data processing applications utilizing Spark, Airflow, object storage, etc.
- Ability to work in an on-shore/off-shore model with development teams across continents.
- Use of coding standards, secure application development, documentation, release and configuration management, and expertise in CI/CD.
- Well-versed in SDLC using Agile Scrum.
- Plan and execute the deployment of releases.
- Ability to work with application development, SQA, and infrastructure teams.
- Strong leadership and analytical problem-solving skills, along with the ability to learn and adapt quickly.
- Self-motivated, quick learner, and creative problem solver; organized and able to manage a team of development engineers.

Education & Preferred Qualifications:
- Bachelor's degree and 6 or more years of experience in Information Technology.
- Strong team ethics and a team player.
- Cloud, Databricks, or Snowflake certification is a plus.
- Experience in evaluating software, estimating cost and delivery timelines, and managing financials.
- Experience leading Agile delivery and adhering to SDLC processes is required.
- Work closely with business and IT stakeholders to manage delivery.

Additional Requirements:
- Ability to lead delivery, manage team members if required, and provide feedback.
- Ability to make effective decisions and manage change.
- Communicates effectively and professionally, both in writing and orally.
- Team player with a positive attitude, enthusiasm, initiative, and self-motivation.

Posted 3 weeks ago


12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office


Role & Responsibilities

Requirements:
- Data Modeling (Conceptual, Logical, Physical) - minimum 5 years
- Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - minimum 5 years
- Cloud Platforms (AWS, Azure, GCP) - minimum 3 years
- ETL Tools (Informatica, Talend, Apache NiFi) - minimum 3 years
- Big Data Technologies (Hadoop, Spark, Kafka) - minimum 5 years
- Data Governance & Compliance (GDPR, HIPAA) - minimum 3 years
- Master Data Management (MDM) - minimum 3 years
- Data Warehousing (Snowflake, Redshift, BigQuery) - minimum 3 years
- API Integration & Data Pipelines - good to have
- Performance Tuning & Optimization - minimum 3 years
- Business Intelligence (Power BI, Tableau) - minimum 3 years

Job Description:
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/. Experience and deep knowledge of at least one of these three platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.

Key Responsibilities:

1. Data Governance & Management
- Establish and maintain a data usage hierarchy to ensure structured data access.
- Define data policies, standards, and governance frameworks to ensure consistency and compliance.
- Implement data quality management practices to improve accuracy, completeness, and reliability.
- Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms.

2. Data Architecture & Migration
- Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
- Design scalable, high-performance data architectures that support business intelligence and analytics.
- Collaborate with IT and engineering teams to ensure efficient data pipeline development.

3. Advanced Analytics & Machine Learning
- Identify and define use cases for advanced analytics that align with business objectives.
- Design and develop machine learning models to drive data-driven decision-making.
- Work with data scientists to operationalize ML models and ensure real-world applicability.

Required Qualifications:
- Proven experience as a Data Architect or in a similar data management and analytics role.
- Strong knowledge of data governance frameworks, data quality management, and metadata management.
- Hands-on experience with Microsoft Fabric and data migration from legacy systems.
- Expertise in advanced analytics, machine learning models, and AI-driven insights.
- Familiarity with data modeling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
- Strong communication skills with the ability to translate complex data concepts into business insights.

Preferred candidate profile: Immediate joiner

Posted 3 weeks ago


10 - 15 years

15 - 20 Lacs

Bengaluru

Work from Office


We're looking for a seasoned Data Architect to lead the design and build of a scalable, future-ready data foundation for Uber's Scaled Solution platform. This role will own the end-to-end data solution design - defining the data model and guiding engineers and analysts to implement it as specified. It's a mission-critical role that will shape how we manage both structured and unstructured data, model complex user journeys, and enable data-driven insights to track and accelerate the success and growth of Scaled Solution.

What you'll do:
- Architect the end-to-end data platform for Scaled Solutions tech, including pipelines, storage, and instrumentation
- Define scalable data models across complex domains: onboarding, assessments, gig management, tasks, demand, and user events
- Ensure proper version control, lineage, and governance for key datasets (e.g., hierarchical gig data, user tasks, assessment flows)
- Guide engineering teams on accurate data instrumentation, especially in complex scenarios (e.g., email changes during onboarding)
- Partner with cross-functional stakeholders to translate business needs into robust data architecture
- Design AI-ready data foundations to support automation, predictive analytics, and text/voice-based agentic interfaces
- Monitor and evolve the architecture to support ongoing feature launches and increasing data volume

What you'll need:
- 10+ years of overall experience in the data and analytics domain
- 8+ years of experience in data architecture, solution architecture, or data modeling
- Proven expertise designing and building successful data architectures with high-volume, structured and unstructured datasets on Hadoop- and Hive-based big data platforms
- Hands-on experience with logical and physical data modeling for user journeys, event tracking, and multi-entity relationships on the Hive metastore
- Deep understanding of Hive, Hudi, Airflow, Pinot, and Flink, with practical experience using them to design and implement modern, scalable data solutions
- Strong expertise in advanced SQL for complex data transformations, modeling, and performance optimization
- Proven ability to lead cross-functional teams, guiding data engineers and analysts to design, build, and maintain data pipelines and models aligned with architectural standards

Bonus points if you have:
- Experience with modern data stacks and tools (e.g., Snowflake, dbt, Airflow, BigQuery, Kafka)
- Experience designing for AI/ML applications or agentic systems (a strong plus)
- A clear understanding of how to design data platforms and architectures optimized for agent analytics, including data modeling, real-time data flow, context awareness, and integration with LLMs or agentic AI systems
- Exposure to tools like Looker, Tableau, or Excel
- Experience working in a matrixed organization, aligning with Product and Engineering teams on reporting, data instrumentation, and schema design
- Strong leadership, communication, and interpersonal skills to collaborate effectively, influence, and drive alignment across teams

Posted 1 month ago


5 - 10 years

20 - 25 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


As a Lead Data Architect - Data Scientist at JPMorgan Chase within the International Consumer Bank, you will be part of a flat-structured organization. Your responsibility is to deliver end-to-end, cutting-edge solutions in the form of cloud-native microservices architecture applications, leveraging the latest technologies and the best industry practices. You are expected to be involved in the design and architecture of the solutions while also focusing on all stages of the SDLC lifecycle. Our Business Analytics team is at the heart of this venture, focused on getting smart ideas into the hands of our customers. We're looking for people who have a curious mindset, thrive in collaborative squads, and are passionate about new technology. By their nature, our people are also solution-oriented, commercially savvy, and have a head for fintech. We work in tribes and squads that focus on specific products and projects - and depending on your strengths and interests, you'll have the opportunity to move between them.

Job responsibilities:
- Collaborating with business partners, research teams, and domain experts to understand business problems.
- Providing stakeholders with timely and accurate reporting.
- Performing ad hoc analysis of diverse data sources to give decision-makers actionable insights about product performance, customer behavior, and market trends.
- Presenting your findings in a clear, logical, and persuasive manner, illustrated with effective visualizations.
- Collaborating with data engineers, machine learning engineers, and dashboard developers to automate and optimize business processes.
- Identifying unexplored opportunities to change how we do business using data.

Required qualifications, capabilities, and skills:
- Formal training or certification in data analysis using Python and 5+ years of applied experience.
- Advanced SQL querying skills.
- Experience taking open-ended business questions and using big data and statistics to build analyses that answer them.
- Experience with customer analytics, such as user behavioral analysis and campaign analysis.
- Demonstrated ability to think beyond the raw data, understand the underlying business context, and sense business opportunities hidden in the data.
- Ability to work in a dynamic, agile environment within a geographically distributed team.
- Excellent written and verbal communication skills in English.

Preferred qualifications, capabilities, and skills:
- Distinctive problem-solving skills and impeccable business judgment.
- Familiarity with machine learning.

Posted 1 month ago


9 - 14 years

19 - 32 Lacs

Gurugram

Remote


ONLY Immediate Joiners

Requirement: Data Architect & Business Intelligence
Experience: 9+ years
Location: Remote

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 month ago


8 - 13 years

40 - 45 Lacs

Noida, Gurugram

Work from Office


Responsibilities:
- Design and articulate enterprise-scale data architectures incorporating multiple platforms, including open-source and proprietary data platform solutions - Databricks, Snowflake, and Microsoft Fabric - to address customer requirements in data engineering, data science, and machine learning use cases.
- Conduct technical discovery sessions with clients to understand their data architecture, analytics needs, and business objectives.
- Design and deliver proofs of concept (POCs) and technical demonstrations that showcase modern data platforms solving real-world problems.
- Create comprehensive architectural diagrams and implementation roadmaps for complex data ecosystems spanning cloud and on-premises environments.
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on specific customer requirements.
- Lead technical responses to RFPs (Requests for Proposal), crafting detailed solution architectures, technical approaches, and implementation methodologies.
- Create and review techno-commercial proposals, including solution scoping, effort estimation, and technology selection justifications.
- Collaborate with sales and delivery teams to develop competitive, technically sound proposals with appropriate pricing models for data solutions.
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field.
- 8+ years of experience in data architecture, data engineering, or solution architecture roles.
- Proven experience responding to RFPs and developing techno-commercial proposals for data solutions.
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines.
- Hands-on experience with multiple data platforms, including Databricks, Snowflake, and Microsoft Fabric.
- Strong understanding of big data technologies, including the Hadoop ecosystem, Apache Spark, and Delta Lake.
- Experience with modern data processing frameworks such as Apache Kafka and Airflow.
- Proficiency in cloud platforms (AWS, Azure, GCP) and their respective data services.
- Knowledge of system monitoring and observability tools.
- Experience implementing automated testing frameworks for data platforms and pipelines.
- Expertise in both relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB).
- Understanding of AI/ML technologies and their integration with data platforms.
- Familiarity with data integration patterns, ETL/ELT processes, and data governance practices.
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines.
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL).
- Strong problem-solving skills and the ability to think creatively to address customer challenges.
- Relevant certifications such as Databricks, Snowflake, Azure Data Engineer, or AWS Data Analytics are a plus.
- Willingness to travel as required to meet with customers and attend industry events.

If interested, please contact Ramya: 9513487487, 9342164917

Posted 1 month ago


8 - 12 years

10 - 14 Lacs

Mumbai, Delhi / NCR

Work from Office


Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote) - Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune
Type: Contract (8-12 months)

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully among bright people.

Posted 1 month ago


8 - 13 years

12 - 22 Lacs

Gurugram

Work from Office


Data & Information Architecture Lead
8 to 15 years - Gurgaon

Summary:
An excellent opportunity for Data Architect professionals with expertise in data engineering, analytics, AWS, and databases.

Location: Gurgaon

Your Future Employer:
A leading financial services provider specializing in delivering innovative and tailored solutions to meet the diverse needs of its clients, offering a wide range of services including investment management, risk analysis, and financial consulting.

Responsibilities:
- Design and optimize the architecture of the end-to-end data fabric, inclusive of the data lake, data stores, and EDW, in alignment with EA guidelines and standards for cataloging and maintaining data repositories.
- Undertake detailed analysis of the information management requirements across all systems, platforms, and applications to guide the development of information management standards.
- Lead the design of the information architecture across multiple data types, working closely with various business partners/consumers, the MIS team, the AI/ML team, and other departments to design, deliver, and govern future-proof data assets and solutions.
- Design and ensure delivery excellence for a) large and complex data transformation programs, b) small and nimble data initiatives to realize quick gains, and c) work with OEMs and partners to bring the best tools and delivery methods.
- Drive data domain modeling, data engineering, and data resiliency design standards across the microservices and analytics application fabric for autonomy, agility, and scale.

Requirements:
- Deep understanding of the data and information architecture discipline: its processes, concepts, and best practices.
- Hands-on expertise in building and implementing data architecture for large enterprises.
- Proven architecture modeling skills; strong analytics and reporting experience.
- Strong data design, management, and maintenance experience.
- Strong experience with data modeling tools.
- Extensive experience with cloud-native lake technologies, e.g., AWS-native lake solutions.

Posted 1 month ago


12 - 16 years

25 - 35 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Hybrid


We're looking for an experienced Data Engineer Architect with expertise in AWS technologies to join our team in India. If you have a passion for analytics and a proven track record of designing and implementing complex data solutions, we want to hear from you.

Location: Noida / Gurgaon / Bangalore / Mumbai / Pune

Your Future Employer:
Join a dynamic and inclusive organization at the forefront of technology, where your expertise will be valued and your career development will be a top priority.

Responsibilities:
- Designing and implementing robust, scalable data pipelines and architectures using AWS technologies
- Collaborating with cross-functional teams to understand data requirements and develop solutions to meet business needs
- Optimizing data infrastructure and processes for improved performance and efficiency
- Providing technical leadership and mentorship to junior team members, and driving best practices in data engineering

Requirements:
- 12+ years of experience in data engineering, with a focus on AWS technologies
- Strong proficiency in analytics and data processing tools such as SQL, Spark, and Hadoop
- Proven track record of designing and implementing large-scale data solutions
- Experience leading and mentoring teams, and driving technical best practices
- Excellent communication skills and the ability to collaborate effectively with stakeholders at all levels

What's in it for you:
- Competitive compensation and benefits package
- Opportunity to work with cutting-edge technologies and make a real impact on business outcomes
- Career growth and development in a supportive and inclusive work environment

Reach us: If you feel this opportunity is well aligned with your career progression plans, please feel free to reach out with your updated profile at isha.joshi@crescendogroup.in

Disclaimer: Crescendo Global specializes in senior- to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Note: We receive a lot of applications on a daily basis, so it becomes difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted if you don't hear back from us within 1 week. Your patience is highly appreciated.

Profile keywords: Data Engineer, Architect, AWS, Analytics, SQL, Spark, Hadoop, Kafka, Crescendo Global

Posted 1 month ago


7 - 12 years

10 - 20 Lacs

Chennai

Remote


Urgent opening for "SAP PS - Data Architect / Analyst / Expert" with a Big4 firm

Location: Pan India (Remote); Chennai candidates preferred
Looking for candidates who can join within 10 days

Job Description:
Looking for Data Analyst / Architect / Expert candidates with strong, relevant experience in the SAP PS module (5+ years in PS, 10+ overall), including Project Systems (PS) components such as Project Definitions and WBS elements in SAP ECC as well as S/4HANA on-premises setups.

Job Requirements:
- 10+ years of SAP experience, with 5+ years in SAP PS
- Deep knowledge of SAP Project Systems (WBS, Project Definitions)
- Experience with ECC and S/4HANA on-premises setups
- Proven ability to work on data migration from ECC to S/4 systems
- Understanding of impacted table structures and solution design
- Strong grasp of new S/4HANA features related to Project Systems
- Ability to work independently and drive PS data-related tasks

Interested candidates can share their updated resume at anitha.mudaliyar@quantaleap.com

Posted 1 month ago


12 - 22 years

35 - 50 Lacs

Hyderabad, Pune, Chennai

Work from Office


Looking for experts in any one of the following:
- DATA ARCHITECT - ADF, HDInsight, Azure SQL, PySpark, Python
- BI ARCHITECT - Tableau, Power BI, and Azure SQL
- MDM ARCHITECT - Reltio, Profisee, MDM
- INFORMATICA ARCHITECT - Informatica, MDM, SQL, Python

Posted 1 month ago


12 - 22 years

35 - 65 Lacs

Chennai

Hybrid


Warm greetings from SP Staffing Services Private Limited!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 years
Location: Pan India

Job Description:
Candidates should have a minimum of 2 years of hands-on experience as an Azure Databricks Architect.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 1 month ago
