
112 Data Architect Jobs - Page 4

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7 - 9 years

9 - 11 Lacs

Bengaluru

Work from Office


The Data & Analytics team at Pure Storage is seeking a Senior Data Engineer: a hands-on, motivated data specialist who is passionate about using data to inform decision making, achieve our strategic objectives, and hire and retain world-class talent. As an integral part of the team, the Senior Data Engineer will apply expert-level technical skills to turn data into knowledge and drive business success. The ideal candidate will play a key role in designing and developing the critical data infrastructure that lands data in the data warehouse. He/she must be able to develop ETL/ELT pipelines, migrate data across systems, and use advanced SQL skills to transform and troubleshoot data. In addition, the candidate should be able to use Python to quickly build automations and perform analyses. Responsibilities: Design, develop, and support our data infrastructure, using various technologies to process large volumes of data. Actively participate in building and implementing our end-to-end Data Warehouse (DWH). Serve as the team expert on data infrastructure and standardized reporting, helping to evaluate alternatives, make design decisions, and implement end-to-end solutions that deliver data to stakeholders. Build ETL (or ELT) data pipelines to transform, aggregate, and structure data from the data lake and prepare it for analysis. Design data models that define how tables, columns, and data elements from different sources are connected and stored, based on our reporting and analytics requirements. Develop Snowflake views that can be used as a data source in Tableau. Partner with analysts and stakeholders to understand business requirements and develop efficient, highly performant, scalable solutions. Conduct data wrangling/cleaning to transform raw data into a more usable format. Collaborate closely with other team members to understand both business and technical requirements, translating them into efficient data integration designs and deliverables. Troubleshoot data issues and conduct root cause analysis when reporting data is in question. Requirements: 7+ years of relevant experience with data migrations/ETL development, data warehousing, data transformations, and backend data troubleshooting, preferably with a globally recognized organization. Expert-level SQL scripting skills. Strong understanding of data modelling fundamentals. In-depth knowledge of data warehousing concepts. Bachelor's or advanced degree in Computer Science, Engineering, Information Technology, Information Systems, or a similar field of study.
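To make the data wrangling/cleaning requirement concrete, here is a minimal Python sketch (pandas) that turns messy raw records into an analysis-ready table; the column names and cleaning rules are hypothetical illustrations, not Pure Storage's actual pipeline.

```python
import pandas as pd

# Hypothetical raw extract: duplicate keys, inconsistent casing, bad values.
raw = pd.DataFrame({
    "employee_id": [101, 102, 102, 103],
    "region": ["emea", "AMER", "amer", None],
    "hire_date": ["2021-03-01", "2020-07-15", "2020-07-15", "not_available"],
})

clean = (
    raw
    .drop_duplicates(subset="employee_id", keep="first")  # dedupe on the business key
    .assign(
        region=lambda df: df["region"].str.upper().fillna("UNKNOWN"),
        hire_date=lambda df: pd.to_datetime(df["hire_date"], errors="coerce"),  # bad dates become NaT
    )
)

print(clean)
```

In a real pipeline the cleaned frame would then be loaded into the warehouse (for example, a Snowflake staging table) and exposed to Tableau through a view.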

Posted 2 months ago

Apply

7 - 10 years

9 - 12 Lacs

Bengaluru

Work from Office


The Data & Analytics team at Pure Storage is seeking a Senior Data Engineer: a hands-on, motivated data specialist who is passionate about using data to inform decision making, achieve our strategic objectives, and hire and retain world-class talent. As an integral part of the team, the Senior Data Engineer will apply expert-level technical skills to turn data into knowledge and drive business success. The ideal candidate will play a key role in designing and developing the critical data infrastructure that lands data in the data warehouse. He/she must be able to develop ETL/ELT pipelines, migrate data across systems, and use advanced SQL skills to transform and troubleshoot data. In addition, the candidate should be able to use Python to quickly build automations and perform analyses. Responsibilities: Design, develop, and support our data infrastructure, using various technologies to process large volumes of data. Actively participate in building and implementing our end-to-end Data Warehouse (DWH). Serve as the team expert on data infrastructure and standardized reporting, helping to evaluate alternatives, make design decisions, and implement end-to-end solutions that deliver data to stakeholders. Build ETL (or ELT) data pipelines to transform, aggregate, and structure data from the data lake and prepare it for analysis. Design data models that define how tables, columns, and data elements from different sources are connected and stored, based on our reporting and analytics requirements. Develop Snowflake views that can be used as a data source in Tableau. Partner with analysts and stakeholders to understand business requirements and develop efficient, highly performant, scalable solutions. Conduct data wrangling/cleaning to transform raw data into a more usable format. Collaborate closely with other team members to understand both business and technical requirements, translating them into efficient data integration designs and deliverables. Troubleshoot data issues and conduct root cause analysis when reporting data is in question. Requirements: 7+ years of relevant experience with data migrations/ETL development, data warehousing, data transformations, and backend data troubleshooting, preferably with a globally recognized organization. Expert-level SQL scripting skills. Strong understanding of data modelling fundamentals. In-depth knowledge of data warehousing concepts. Bachelor's or advanced degree in Computer Science, Engineering, Information Technology, Information Systems, or a similar field of study.

Posted 2 months ago

Apply

8 - 12 years

25 - 30 Lacs

Bengaluru

Work from Office


ThoughtFocus is looking for a Cloud Data Architect to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional responsible for designing, building, and maintaining an organization's data architecture. 1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security. 2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance. 3. The Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills. 4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.

Posted 2 months ago

Apply

12 - 16 years

15 - 19 Lacs

Bengaluru

Work from Office


Job Title: GCP Data Architect. Location: Chennai, Hyderabad, Bangalore. Experience: 12-16 years. Required skills: GCP, BigQuery, Cloud Storage, Dataflow, Python, Cloud Functions, Pub/Sub. Job Description: 10-15 years of experience, primarily on data warehousing, BI/analytics, and data management projects in tech architect, delivery, client relationship, and practice roles, involving ETL, reporting, big data, analytics, etc. 10+ years of experience as a data engineer, including 5+ years of hands-on experience with the GCP cloud data platform. Experience leading, designing, and developing data engineering solutions using Google Cloud Platform (GCP): BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, Cloud Composer (Airflow), Cloud Spanner, Bigtable, etc. Experience building CI/CD pipelines to automate deployment and testing of data pipelines. Experience managing and deploying containerized applications. Proficient in Python for data processing and automation and in SQL for querying and data manipulation. Experience with Cloud Monitoring, Datadog, or other monitoring solutions to track pipeline performance and ensure operational efficiency. Familiarity with Terraform or Deployment Manager for Infrastructure as Code (IaC) to manage GCP resources is a plus. Expertise in ETL/ELT pipelines, data modeling, and data integration across large datasets. Strong understanding of data warehousing and real-time data processing workflows. Strong communication skills to work effectively with cross-functional teams and mentor junior developers. Proven ability to lead in an Agile environment. Understanding of business intelligence and reporting projects. Strong knowledge of data management solutions such as data quality, metadata, master data, and governance. Experience with cloud data platforms and large migration programs. Focused on value, innovation, and automation-led delivery processes. Ability to drive end-to-end implementations through effective collaboration with different stakeholders. Understanding of cloud data architecture and data modeling concepts and principles, including cloud data lakes, warehouses and marts, dimensional modeling, star schemas, and real-time and batch ETL/ELT. Experience in AI/ML and GenAI projects would be good to have, as would knowledge of or experience with multiple cloud-based data analytics platforms such as GCP, Snowflake, and Azure. Good understanding of the SDLC and Agile methodologies. A telecom background would be good to have.
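As a small illustration of the Dataflow skill listed above, here is a minimal Apache Beam pipeline in Python that aggregates events per key. It runs locally on the DirectRunner; on GCP the same code would be submitted with the DataflowRunner. The event data and step names are hypothetical.

```python
import apache_beam as beam

# Hypothetical clickstream events: (user_id, bytes_transferred).
events = [("u1", 120), ("u2", 300), ("u1", 80), ("u3", 50)]

with beam.Pipeline() as pipeline:  # DirectRunner by default; DataflowRunner in production
    (
        pipeline
        | "CreateEvents" >> beam.Create(events)
        | "SumPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, total: f"{user}: {total}")
        | "Print" >> beam.Map(print)
    )
```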

Posted 2 months ago

Apply

3 - 5 years

8 - 9 Lacs

Pune, Mumbai, Bengaluru

Work from Office


Role Summary: Lead and drive development in the BI domain using the Qlik Sense ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Qlik Sense ecosystem. Key functions and responsibilities: QlikView/Qlik Sense Data Architect (possibly certified) with extensive knowledge of QlikView and Qlik Sense, including best practices for data modeling, application design, and development. Familiarity with the use of GeoAnalytics, NPrinting, extensions, widgets, mashups, ODAG, and various other advanced features used in Qlik Sense development. Good knowledge of working with Set Analysis. Experience working with Qlik Sense sites and the Qlik Management Console, creating rules, and managing streams, as well as user and application security. Knowledge of Active Directory, proxies, load balancers, etc. Experience in troubleshooting connectivity, configuration, performance, etc. Strong communication and presentation skills. Candidate's Profile: Academics: Bachelor's degree, preferably in Computer Science; a Master's degree would be an added advantage. Experience: 3-5 years of experience in Qlik Sense design and development.

Posted 2 months ago

Apply

10 - 19 years

25 - 40 Lacs

Pune

Hybrid


Data Solution Architect to design solutions and lead implementation on Google Cloud. Requires expertise in data architecture, solution design, and data management practices. Architect and design end-to-end data solutions on the cloud platform, focusing on the data warehouse.

Posted 2 months ago

Apply

10 - 19 years

25 - 40 Lacs

Bengaluru

Hybrid


Data Solution Architect to design solutions and lead implementation on Google Cloud. Requires expertise in data architecture, solution design, and data management practices. Architect and design end-to-end data solutions on the cloud platform, focusing on the data warehouse.

Posted 2 months ago

Apply

10 - 20 years

25 - 40 Lacs

Bangalore Rural

Hybrid


Skills: AWS Redshift, AWS EMR, AWS S3, AWS Glue, AWS DMS, AWS Lambda, SNS, SQS, AWS Kinesis, IAM, VPC, etc. Experience migrating an on-premises data warehouse to the AWS cloud. Knowledge of metadata management, governance, data quality, MDM, lineage, data catalogs, etc.
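As a hedged sketch of automating one step of such a migration, the Python snippet below uses boto3 to start an AWS Glue ETL job and an AWS DMS replication task. The job name and task ARN are placeholders, not values from this posting.

```python
import boto3

glue = boto3.client("glue")
dms = boto3.client("dms")

# Placeholder identifiers -- substitute real resources before running.
GLUE_JOB_NAME = "warehouse-load-job"
DMS_TASK_ARN = "arn:aws:dms:ap-south-1:123456789012:task:EXAMPLE"

# Kick off a Glue ETL job (e.g., loading staged S3 data into Redshift).
run = glue.start_job_run(JobName=GLUE_JOB_NAME)
print("Glue job run id:", run["JobRunId"])

# Start the DMS task replicating the on-premises warehouse to AWS.
dms.start_replication_task(
    ReplicationTaskArn=DMS_TASK_ARN,
    StartReplicationTaskType="start-replication",
)
```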

Posted 2 months ago

Apply

10 - 20 years

25 - 40 Lacs

Pune

Hybrid


Skills: AWS Redshift, AWS EMR, AWS S3, AWS Glue, AWS DMS, AWS Lambda, SNS, SQS, AWS Kinesis, IAM, VPC, etc. Experience migrating an on-premises data warehouse to the AWS cloud. Knowledge of metadata management, governance, data quality, MDM, lineage, data catalogs, etc.

Posted 2 months ago

Apply

10 - 15 years

17 - 22 Lacs

Mumbai

Work from Office


Job Description: At Go Digital Technology Consulting LLP (GDTC), we are redefining the future of data consulting and services. Our expertise spans data engineering, analytics, and data science, enabling us to craft cutting-edge cloud data solutions tailored to our clients' unique needs. Specializing in AWS, Azure, and Snowflake, we empower organizations to harness the full potential of their data, transforming it into actionable insights that drive impact. Our team is a dynamic mix of technologists, product managers, thinkers, and architects, unified by the vision to deliver exceptional value. By blending technological expertise with a deep understanding of business needs, we create solutions that not only meet expectations but set new industry benchmarks. Join us on a transformative journey where your skills and vision will directly shape the future of cloud data architecture. Role Summary: As a Cloud Data Architect, you will play a pivotal role in designing and implementing scalable, high-performance cloud-native data architectures. You will lead the architecture of complex, large-scale data environments while fostering innovation and delivering robust solutions that drive business value. Key Technologies / Skills: Mastery of SQL, Python, PySpark, and shell scripting. Expertise in data modeling and big data ecosystems (e.g., Hadoop, Hive) and ETL pipelines. Deep understanding of cloud-native data solutions across AWS, Azure, or GCP, with experience in Snowflake. Strong knowledge of modern data architectures, including serverless computing, data lakes, and analytics solutions. Responsibilities: Design and implement innovative cloud-native data architectures, translating business needs into scalable and sustainable solutions. Lead technology selection, ensuring alignment with client needs and organizational standards. Develop comprehensive high-level design documents and frameworks to guide project execution. Architect and optimize ETL pipelines, ensuring efficient data ingestion, transformation, and storage in the cloud. Champion best practices for data governance, security, and compliance in cloud environments. Conduct performance tuning and optimization of cloud-based data platforms. Collaborate with stakeholders to align architectural solutions with business objectives. Support pre-sales efforts by providing technical insights and creating compelling proposals. Mentor and coach technical leads, enabling their growth into future data architects. Stay ahead of emerging technologies and trends, driving continuous innovation in data engineering and architecture. Required Qualifications: 10+ years of experience in data engineering and architecture, with a focus on cloud hyperscalers (AWS, Azure, GCP). Proven expertise in cloud-native data solutions, including Snowflake and modern data lake architectures. Advanced proficiency in Python, PySpark, and SQL, with experience in NoSQL databases. Extensive experience with data warehousing, ETL pipelines, and big data ecosystems. Strong knowledge of ITIL service management practices and design processes. Demonstrated leadership and collaboration skills, with the ability to engage diverse stakeholders. A bachelor's or master's degree in Computer Science, Engineering, or a related field. Why Join Us: Be part of a forward-thinking organization that values innovation, collaboration, and excellence. Work on pioneering projects using state-of-the-art technologies in cloud data engineering. Enjoy competitive compensation, comprehensive benefits, and flexible work arrangements. Thrive in an environment that encourages personal growth and career advancement. As a Cloud Data Architect at GDTC, your expertise will shape the future of data solutions, empowering clients to unlock the full potential of their data. Together, let's build a smarter, data-driven world.

Posted 3 months ago

Apply

6 - 11 years

13 - 18 Lacs

Bengaluru

Work from Office


About the Role: We are looking for an Azure Data Architect with 12 years of experience to lead the design, implementation, and optimization of cloud-based data solutions. Requirements: Proven expertise in Azure Synapse, Azure Data Factory, and Azure Data Lake. Deep knowledge of the Medallion architecture (bronze, silver, and gold layers). Proficient in SQL and Python for data processing and troubleshooting. Experience with data performance optimization techniques, including partitioning and indexing. Hands-on experience with monitoring tools such as Azure Monitor or Log Analytics for pipeline health tracking. Strong understanding of data governance and security best practices in cloud environments. Hands-on expertise in Azure Synapse, Azure Data Factory, and Azure Data Lake to manage complex data pipelines and workflows. Deep understanding of the Medallion architecture, plus advanced SQL and Python skills for troubleshooting. Proficient in data performance optimization techniques. Experience with other cloud-native data tools and platforms. Familiarity with big data processing frameworks, for example Spark. Relevant Azure certifications such as Azure Solutions Architect or Azure Data Engineer.
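Since the posting emphasizes the Medallion (bronze/silver/gold) layering, here is a minimal PySpark sketch of a bronze-to-silver promotion. Table contents and column names are hypothetical; in an Azure setup the silver output would typically be written as Delta files to Data Lake storage.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingested records, kept as-is (built in memory for this sketch).
bronze = spark.createDataFrame(
    [("ord-1", "2024-05-01", " 100.5 "),
     ("ord-1", "2024-05-01", " 100.5 "),
     ("ord-2", "2024-05-02", "bad")],
    ["order_id", "order_date", "amount"],
)

# Silver: deduplicated, typed, conformed.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("amount", F.trim("amount").cast("double"))  # unparseable values become null
    .withColumn("order_date", F.to_date("order_date"))
)

silver.show()
# In Azure: silver.write.format("delta").save("abfss://silver@<account>.dfs.core.windows.net/orders")
```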

Posted 3 months ago

Apply

13 - 23 years

35 - 60 Lacs

Hyderabad, Noida, Jaipur

Hybrid


Programmers.io India Pvt. Ltd. is looking for a Data Architect to join our awesome team and deliver a streamlined user experience. Please share your resume at kanika.agrawal@programmers.io, with "Data Architect" in the subject line. Location: Jaipur, Hyderabad. Hybrid mode: Yes. Notice period: candidates who can join within 15 days. Experience: 15+ years. Role Overview: We are seeking a seasoned Data Engineering & Analytics Architect with over 15 years of experience to lead and design large-scale data modernization projects on the Azure platform. This strategic role requires expertise in architecting end-to-end data solutions, integrating AI-driven insights, and driving scalable, secure, and high-performance data architectures. The ideal candidate is a visionary leader with deep technical proficiency in Azure services, AI frameworks, and data governance, paired with strong stakeholder management and team mentorship capabilities. Key Focus Areas: Leadership & Strategy: Lead cross-functional teams, align solutions with business goals, and mentor on emerging trends. Solution Design: Architect scalable data lakes, warehouses, and real-time analytics with Azure technologies. AI Integration: Implement advanced analytics using Azure Machine Learning, Databricks, and Cognitive Services. Optimization & Governance: Build and optimize robust data pipelines while ensuring compliance and data quality. This role is an opportunity to deliver transformative data solutions, influence strategic decision-making, and advance organizational data capabilities.

Posted 3 months ago

Apply

4 - 6 years

8 - 14 Lacs

Noida

Work from Office


Responsibilities: - Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc. - Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI / GenAI solutions. - Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions. - Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions. - Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel. - Stay up to date on the latest GCP offerings, trends, and best practices. Experience: - Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premises databases to a scalable and cost-effective solution on Google Cloud Platform (GCP). - Design and architect solutions for DWH modernization, with experience building data pipelines in GCP. - Strong experience in BI reporting tools (Looker, Power BI, and Tableau). - In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI). - Strong knowledge and experience in providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques. - Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data. - Strong knowledge and experience in best practices for data governance, security, and compliance. - Excellent communication and presentation skills, with the ability to tailor technical information to customer needs. - Strong analytical and problem-solving skills. - Ability to work independently and as part of a team.

Posted 3 months ago

Apply

12 - 14 years

14 - 18 Lacs

Bengaluru

Work from Office


Architect, design, and implement scalable Snowflake-based data solutions. Develop data models, schema designs, and ETL pipelines to support business requirements. Lead large-scale data migration projects while ensuring performance optimization. Implement DataOps methodologies for efficient data management and automation. Work with AWS, Azure, and GCP to deploy cloud-based data architectures. Ensure data security and compliance with industry standards. Optimize CDC (Change Data Capture) processes for real-time data updates. Utilize DBT, Python, Java, or Scala to enhance data transformation and integration. Design and implement data visualisation solutions using tools like Tableau and Power BI. Required Skills & Expertise: Strong expertise in Snowflake and cloud data warehousing concepts. Hands-on experience with ETL processes, data modelling, and schema design. Proficiency in SQL, DBT, Python, Java, or Scala for data transformation and automation. Experience in data integration and large-scale data migration solutions. Knowledge of CDC (Change Data Capture) methodologies. Familiarity with DataOps practices and modern data engineering workflows. Exposure to AWS, Azure, and GCP cloud platforms. Strong understanding of data security, governance, and compliance.
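One common way to meet the Snowflake CDC requirement is with streams plus a MERGE, which the sketch below drives from Python via the Snowflake connector. Credentials, table, and stream names are placeholders for illustration only.

```python
import snowflake.connector

# Placeholder credentials -- supply real values before running.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

statements = [
    # A stream records inserts/updates/deletes on the source table.
    "CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders",
    # Consume the captured delta and upsert it into the curated table.
    """
    MERGE INTO curated_orders t
    USING orders_stream s ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
    """,
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```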

Posted 3 months ago

Apply

5 - 10 years

0 - 2 Lacs

Bangalore Rural

Work from Office


Role & responsibilities: Support requirements-gathering workshops with the business and technical users. Translate business user requirements into decision process flows and rules design. Identify the specific data, API integration, and analytics/credit scoring model input requirements of the decision strategy. Collaborate with the ETL/Data Architect in designing the data model, and with the API Specialist in configuring the interoperability of SAS Intelligent Decisioning with the rest of the banking applications. Analyze the data requirements of the models to be utilized as part of the decision strategies. Prepare the data required as input to the decision strategies. Hands-on development of rulesets and decision flows. Execute testing and/or collaborate with the client's testing team, and implement fixes. Facilitate and guide the customer through production deployment activities, or implement the actual migration of the decision strategy artifacts to the production environment and deploy them for production use. Deliver documentation and an enablement workshop for the customer. Experience and skill sets: At least 5 years of relevant working experience in the business analytics domain (which may include data warehousing, BI environment implementation, analytics model development, rules management, etc.). At least 2-3 years of cumulative work experience implementing any of the following: decision management engines/platforms, business rules management systems (BRMS), credit rating engines, or credit risk assessment processes and rules. At least 1-2 years of cumulative hands-on work experience in at least one of the following: Pega, SAS Real Time Decision Manager, SAS Intelligent Decisioning, IBM Cloud Pak for Business Automation, Actico, FICO Decision Management Platform, InRule, Decisions, FlexRule, 1000minds, XpertRule, Progress Corticon Business Rules Engine, Tibco BPM, jBPM, FICO Blaze Advisor, Experian PowerCurve Strategy Management, Ab Initio Business Rules Environment (BRE) and Express>It, Red Hat Decision Manager, Red Hat Business Process Automation, Hyperon, SAS Risk Modeling, or SAS Viya. Experience with the following foundational technical competencies, hands-on: SQL programming, Postman or SoapUI; working knowledge: REST APIs, credit scoring model development. Preferably with experience in the banking industry: credit risk decision management, credit approval processes, underwriting, credit scoring, and business intelligence and analytics projects. BS degree in Statistics or Computer Science/Information Technology; Computer Engineering is a plus. Detail- and process-oriented. Good communication skills. Stakeholder management. Assertive and clear verbal and written communication with on-site and remote project team members and customer counterparts.
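Rulesets for these platforms are authored inside the products themselves; purely as a language-neutral illustration of the decision-flow logic the role describes, here is a tiny Python sketch of a credit ruleset. The thresholds and outcomes are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    credit_score: int
    monthly_income: float
    existing_debt: float

def decide(app: Applicant) -> str:
    """Hypothetical ruleset: each rule mirrors one node of a decision flow."""
    if app.credit_score < 550:
        return "DECLINE"  # hard knock-out rule
    if app.existing_debt > 0.5 * app.monthly_income:
        return "REFER"    # route to manual underwriting
    return "APPROVE"

print(decide(Applicant(credit_score=680, monthly_income=90000.0, existing_debt=20000.0)))
```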

Posted 3 months ago

Apply

6 - 10 years

15 - 30 Lacs

Pune, Bengaluru, Gurgaon

Hybrid


Role: Data Modeler. Location: Bangalore, Pune, Gurugram. Work Mode: Hybrid. Please find the JD below. Job Description: Data Modeler. 6+ years of experience, with at least 3 years as a data modeler, Data Vault specialist, data architect, or in similar roles. Analyze and translate business needs into long-term, optimized data models. Design, develop, and maintain comprehensive conceptual, logical, and physical data models. Develop best practices for data standards and data architecture. Proficiency in data modelling tools such as ERwin, ER/Studio, or similar. Strong understanding of data warehousing and Data Vault concepts. Proficient in data analysis and profiling on database systems (e.g., SQL, Databricks). Good knowledge of data governance and data quality best practices. Experience with cloud platforms (such as AWS, Azure, Google Cloud) and NoSQL databases (like MongoDB) will be an added advantage. Excellent communication and interpersonal skills. No constraint on working from the ODC if there is a client ask. Flexible on work timings based on engagement needs. Mandatory: OLTP/OLAP concepts, relational and dimensional modelling, STAR/Snowflake schema knowledge, Data Vault, strong knowledge of writing SQL queries, and entity-relationship concepts. Must have experience with RDBMSs like Oracle and MS SQL, and with data modeling (conceptual, logical, physical). Good at analyzing data.
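To make the STAR-schema requirement concrete, here is a self-contained Python/SQLite sketch of a minimal dimensional model: one fact table joined to one dimension. All table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension: descriptive attributes keyed by a surrogate key.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact: additive measures plus foreign keys to the dimensions.
cur.execute("CREATE TABLE fact_sales (product_key INTEGER REFERENCES dim_product, qty INTEGER, revenue REAL)")

cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", [(1, 3, 30.0), (1, 1, 10.0)])

# Typical star-schema query: join fact to dimension, aggregate a measure.
cur.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""")
print(cur.fetchall())  # [('Hardware', 40.0)]
```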

Posted 3 months ago

Apply

17 - 18 years

27 - 32 Lacs

Noida

Work from Office


Sr. Data Architect / Data Architect. Experience: 10+ years. Summary: The Data Architect will design and implement data solutions that support the organization's strategic business goals. Responsibilities: Focuses on enterprise-wide data modelling and database design. Tests, monitors, manages, and validates data warehouse activity, including data extraction, transformation, movement, loading, cleansing, and updating processes. Sets the strategic technology direction for the business intelligence environment, which will use some combination of data lakes, data warehouses, and traditional relational databases. Works with business consumers, managers, analysts, and subject matter experts in a professional and collaborative manner to design strategies and implementation approaches. Develops processes and procedures for data warehouse and business intelligence infrastructure. Skills Needed: Solid experience with data modelling, data warehousing, data lakes, MDM, SSOT, analytic processes, and methodologies. Minimum of 4+ years of experience in data architecture, configuring and implementing data warehouse solutions. Proficient experience with SQL query functionality, Power BI, Tableau, business intelligence software, and ETL tools. Strong knowledge of data integration, data quality, and multi-dimensional design. Experience in data migration from on-premises databases to the cloud. Knowledge of AWS/Azure/GCP data technologies is a plus. Knowledge of big data technologies (e.g., HDFS, Apache Spark, Scala, Impala, Hive, Cassandra) is an added plus. Knowledge of web services (e.g., SOAP and REST). Nice to have: Experience in ETL/ELT tools; certifications in Snowflake, AWS/Azure/GCP.

Posted 3 months ago

Apply

12 - 18 years

20 - 27 Lacs

Bengaluru

Work from Office


Develop and deliver detailed technology solutions through consulting project activities. Evaluate and recommend emerging cloud data technologies and tools to drive innovation and competitive advantage, with a focus on Azure services such as Azure Data Lake, Azure Synapse Analytics, and Azure Databricks. Lead the design and implementation of cloud-based data architectures using Azure and Databricks to support the company's strategic initiatives and exploratory projects. Collaborate with cross-functional teams to understand business requirements, architect data solutions, and drive the development of innovative data platforms and analytics solutions. Define cloud data architecture standards, best practices, and guidelines to ensure scalability, reliability, and security across the organization. Design and implement data pipelines, ETL processes, and data integration solutions to ingest, transform, and load structured and unstructured data from multiple sources into Azure data platforms. Provide technical leadership and mentorship to junior team members, fostering a culture of collaboration, continuous learning, and innovation in cloud data technologies. Collaborate with Azure and Databricks experts within the organization and the broader community to stay abreast of the latest developments and best practices in cloud data architecture and analytics.

Posted 3 months ago

Apply

8 - 14 years

50 - 75 Lacs

Bengaluru

Work from Office


JD - Data Architect. Myntra is a one-stop shop for all your fashion and lifestyle needs. Being India's largest online store for fashion and lifestyle products, Myntra aims at providing a hassle-free and enjoyable shopping experience to shoppers across the country, with the widest range of brands and products on offer. The brand is making a conscious effort to bring the power of fashion to shoppers with an array of the latest and trendiest products available in the country. Myntra's cloud-based big data platform is highly scalable and processes over 7 billion events per day. We are on a journey to modernize our data platform and offer multiple self-serve offerings for Myntra's data consumers. We use best-of-breed open source components and SaaS solutions as starting points to build out these capabilities, along with maintaining critical core data assets of the organization. If you are interested in the fast-growing field of big data and analytics, and want to work on big data engineering at scale, building data products and building analytical models (insights) to power business decisions, then this is the team for you. About the Role: We are seeking a highly skilled Data Architect with deep expertise in data modeling to lead the overhaul of our organization's data model and core datasets. The primary focus will be on revamping our current set of 400 data assets and evolving the data architecture with an outward-in mindset. This transformation must be strategic, phased, and aligned with the organization's growth at scale, ensuring minimal disruption to operations and adherence to a cost-efficient model. The ideal candidate will have extensive experience in handling large, complex data environments, bringing innovative solutions to reimagine the data strategy and architecture while upholding data governance principles. Key Responsibilities: Data Modeling Strategy: Redesign and optimize the current set of 400+ tables by applying best practices in data modeling. Focus on creating scalable, high-performance models that address business needs and future growth. Outward-In Transformation: Lead a strategic, phased revamp of the data architecture, starting from understanding external business requirements and user needs, and working inward to shape a future-proof data model. Phased Revamp Approach: Develop and execute a well-structured, phased roadmap for overhauling the data model. Ensure each phase delivers value incrementally without disrupting ongoing operations or significantly increasing costs. ETL Design Frameworks: Lead the creation and review of reusable, efficient ETL pipelines. Ensure that ETL processes are highly optimized for performance, cost-efficiency, and scalability, minimizing processing times and resource usage. Scalability and Cost Efficiency: Analyze the current architecture, identifying bottlenecks, inefficiencies, and areas of improvement. Propose solutions that scale with the business, optimize costs, and adhere to the zero-data-copy principle. Data Governance: Collaborate with data teams to ensure all changes align with compliance, data security, privacy regulations, and governance policies. Maintain a strong focus on data quality, lineage, and access controls throughout the transition. Data Lakehouse Optimization: Leverage the strengths of the data lakehouse architecture to optimize data processing, storage, and retrieval strategies. Ensure seamless integration between core datasets and analytics tools. Collaboration: Work closely with data engineers, analysts, and business stakeholders to understand their data needs and ensure seamless delivery of new models. Engage with executive teams to communicate the progress and value of the transformation. Documentation Standards: Ensure robust documentation of all new data models, architectural changes, and governance protocols. Establish and enforce data modeling standards across the organization. Continuous Improvement: Stay updated on emerging trends and technologies in data modeling, architecture, and governance. Apply innovative solutions to drive continuous improvement and long-term efficiency. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: Minimum 12 years of experience in data modeling and data architecture roles, with a proven track record of handling large datasets and complex environments. Technical Skills: Expertise in data modeling tools (e.g., Erwin, IBM InfoSphere Data Architect). Experience with data lakehouse architectures (e.g., Databricks). Strong understanding of ETL processes, data warehousing, and big data technologies. In-depth knowledge of data governance frameworks, including data quality, metadata management, and compliance (e.g., GDPR, HIPAA). Experience with visualization tools like Power BI. Familiarity with SQL, NoSQL, and cloud data platforms (Azure, Google Cloud). Exposure to transactional database design is good to have. Soft Skills: Strong analytical and problem-solving skills. Excellent communication and collaboration abilities to work with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment.

Posted 3 months ago

Apply

17 - 19 years

27 - 32 Lacs

Bengaluru

Work from Office


Have 17+ years of IT experience. Should have excellent work experience as a Data Architect, Data Scientist, or in other similar roles. Should have an in-depth understanding of database structure principles. Should have solid experience gathering and analyzing system requirements. Have excellent knowledge of data mining and segmentation techniques. Should have expertise in MS SQL Server and other tools. Should have excellent working experience with NoSQL databases. Should have working experience with data visualization tools (e.g., Tableau, D3.js, Power BI, and R). Should have excellent analytical skills and demonstrate a problem-solving attitude. Should have experience developing database solutions for enterprise-level applications. Should have excellent experience choosing the right tools for the tasks at hand. Should have proven experience installing and configuring systems in various environments. Should be capable of analyzing requirements from functional, non-functional, and operational standpoints and coming up with solutions. Should have proven work experience migrating data from legacy applications or from cross-platform models. Should be adept at coming up with conceptual, logical, and physical data models and flowcharts. Should have excellent experience in system performance tuning, troubleshooting, and solutioning. Should have good experience in optimization of existing and future data systems. Should have excellent experience defining data security, backup, and retrieval processes and procedures. Should have excellent communication skills and be in a position to work with peers and industry experts to build solutions.

Posted 3 months ago

Apply

10 - 15 years

12 - 17 Lacs

Bengaluru

Work from Office


Required domain: Capital Markets, Financial Markets, Fixed Income, Equity. Job Summary: We're seeking an experienced Data Analytics Leader to spearhead data analytics initiatives for alternative investments. The ideal candidate will lead a cross-functional team, develop data strategies, and drive delivery excellence. Key Responsibilities: - Lead a team of data architects, analysts, and technologists to deliver high-impact data analytics solutions. - Develop and execute data strategies aligned with business objectives. - Architect and oversee data platforms, including SQL Server, Snowflake, and Power BI. - Lead initiatives in Data Architecture, Data Modeling, and Data Warehouse (DWH) development. - Evaluate emerging technologies and recommend their integration into client solutions. Requirements: - 15+ years of experience in program or project management within capital markets and financial services. - Demonstrated expertise in SQL Server, Snowflake, Power BI, ETL processes, and Azure Cloud Data Platforms. - Hands-on experience with big data technologies and modern data architecture. - Proven track record in delivering projects emphasizing data quality, integrity, and accuracy. Preferred Qualifications: - Relevant certifications (e.g., CFA, Snowflake Certified Architect, or Microsoft Power BI Certified). - Advanced degrees (MBA, MS in Computer Science, or related fields) preferred.

Posted 3 months ago

Apply

2 - 4 years

40 - 45 Lacs

Bengaluru

Work from Office


2 to 4 years of hands-on experience in SQL (advanced queries), Python for data engineering, Azure data engineering services (preferably), Snowflake analytical layer development, and DBT (nice to have); able to develop and maintain data pipelines, data cubes, and data warehouses. Key Responsibilities: Core duties: o Understand the high-level design (HLD) document and be able to comprehend the technical solution provided. o Develop the ADF pipeline as per the HLD. o Develop the analytical layer in the Snowflake analytics schema using the data mapping document. o Unit-test both the ADF pipeline and the analytics schema. o Demo the solution built to the data architect, project manager, product manager, etc. Technical skills or knowledge areas: o SQL o Python o Azure for data engineering o Snowflake for data warehousing / data cube development. Collaboration and teamwork: o Familiarity with Slack communication, email, JIRA/Kanban, and demoing technical deliverables. o Familiarity with the offshore-onshore collaboration model. Process improvement: o Should be able to answer questions during the QA and review phases, and be able to handle change management. Compliance and reporting: o Should be able to create and improve the low-level design document. o Should be able to create stakeholder-understanding and training documents for the work done by the developer. Required Qualifications: Education and certifications: BE/B.Tech. Experience: 2 to 4 years in data engineering; must have SQL, Python for data engineering, and any cloud-based data pipeline / data warehouse tools, preferably Azure and Snowflake. Performance Metrics: Timely delivery on user stories. Able to adapt to a fast-changing and evolving technical environment. Able to grasp the proposed solution and actively participate in solutioning. Able to communicate blockers and troubleshoot problems. Able to work independently on solving low-to-medium-complexity technical problems. Able to communicate clearly every day. A team player, able to create a positive, collaborative environment.

Posted 3 months ago

Apply

4 - 8 years

27 - 31 Lacs

Mumbai, Bengaluru

Work from Office


While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people, and we take pride in offering them a culture built on transparency, diversity, integrity, learning, and growth. If working in an environment that encourages you to innovate and excel, not just in your professional but also your personal life, interests you, you will enjoy your career with Quantiphi! You will be working as an Associate Data Architect within the domain, designing and delivering big data pipelines for structured and unstructured data running across multiple geographies, helping healthcare organizations achieve their business goals with the use of data ingestion technologies, cloud services, and DevOps. You will be working with architects from other specialties, such as cloud engineering, software engineering, and ML engineering, to create platforms, solutions, and applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical product, and AI marketplaces, amongst others. Role & Responsibilities: Work with cloud engineers and customers to solve big data problems by developing utilities for migration, storage, and processing on Google Cloud. Design and build a cloud migration strategy for cloud and on-premises applications. Diagnose and troubleshoot complex distributed systems problems and develop solutions with significant impact at massive scale. Build tools to ingest, and jobs to process, several terabytes or petabytes per day. Design and develop next-gen storage and compute solutions for several large customers. Communicate with a wide set of teams, including infrastructure, network, engineering, DevOps, SiteOps teams, and cloud customers. Build advanced tooling for automation, testing, monitoring, administration, and data operations across multiple cloud clusters. A good understanding of data modeling and governance. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Posted 3 months ago

Apply

10 - 14 years

12 - 16 Lacs

Bengaluru

Work from Office


Our client helps forward-looking organizations across various sectors, including financial services, manufacturing, higher education, the public sector, and emerging industries, to innovate and achieve a better future faster. Their innovative technology solutions enable customers to deploy new capabilities faster, deliver better user experiences, and drive operating efficiencies. This is achieved through executional excellence and mitigating the risk of change. With global headquarters and over 2,100 employees across five countries, our client is well positioned to drive transformation and growth for its customers. Job Title: Data Analytics Leader - Alternative Investments. Required domain: Capital Markets, Financial Markets, Fixed Income, Equity. Job Summary: We're seeking an experienced Data Analytics Leader to spearhead data analytics initiatives for alternative investments. The ideal candidate will lead a cross-functional team, develop data strategies, and drive delivery excellence. Key Responsibilities: - Lead a team of data architects, analysts, and technologists to deliver high-impact data analytics solutions. - Develop and execute data strategies aligned with business objectives. - Architect and oversee data platforms, including SQL Server, Snowflake, and Power BI. - Lead initiatives in Data Architecture, Data Modeling, and Data Warehouse (DWH) development. - Evaluate emerging technologies and recommend their integration into client solutions. Requirements: - 15+ years of experience in program or project management within capital markets and financial services. - Demonstrated expertise in SQL Server, Snowflake, Power BI, ETL processes, and Azure Cloud Data Platforms. - Hands-on experience with big data technologies and modern data architecture. - Proven track record in delivering projects emphasizing data quality, integrity, and accuracy. Preferred Qualifications: - Relevant certifications (e.g., CFA, Snowflake Certified Architect, or Microsoft Power BI Certified). - Advanced degrees (MBA, MS in Computer Science, or related fields) preferred.

Posted 3 months ago

Apply

10 - 15 years

40 - 45 Lacs

Bengaluru

Work from Office


Experienced data modelers with SQL, ETL, and some development background, to define new data schemas and data ingestion for Adobe Experience Platform customers. Interface directly with enterprise customers and collaborate with internal teams. What you'll do: Interface with Adobe customers to gather requirements, design solutions, and make recommendations. Lead customer project conference calls or interface with a Project Manager. Deliver technical specification documents for customer review. Collaborate closely with team software engineering consultants, onshore and offshore. Leverage an understanding of data relationships and schemas to structure data that allows clients to perform dynamic customer-level analysis. Construct processes to build customer ID mapping files for use in building a 360-degree view of the customer across data sources. Leverage scripting languages to automate key processes governing data movement, cleansing, and processing activities. Bill and forecast time toward customer projects. Innovate on new ideas to solve customer needs. Requirements: 10+ years of strong experience with data transformation and ETL on large data sets. Experience with designing customer-centric datasets (e.g., CRM, call center, marketing, offline, point of sale, etc.). 5+ years of data modeling experience (e.g., relational, dimensional, columnar, big data). 5+ years of complex SQL or NoSQL experience. Experience in advanced data warehouse concepts. Experience in industry ETL tools (e.g., Informatica, Unifi). Experience with business requirements definition and management, structured analysis, process design, and use case documentation. Experience with reporting technologies (e.g., Tableau, Power BI). Experience in professional software development. Exceptional organizational skills and the ability to multi-task across simultaneous customer projects. Strong verbal and written communication skills to interface with the sales team and lead customers to a successful outcome. Must be self-managed, proactive, and customer focused. Degree in Computer Science, Information Systems, Data Science, or a related field. Special consideration given for: Experience and knowledge with Adobe Experience Cloud solutions. Experience and knowledge with digital analytics or digital marketing. Experience in programming languages (Python, Java, or Bash scripting). Experience with big data technologies (e.g., Hadoop, Spark, Redshift, Snowflake, Hive, Pig, etc.). Experience as an enterprise technical or engineering consultant.
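As an illustration of the customer ID mapping work described above, this minimal pandas sketch joins hypothetical CRM and point-of-sale extracts on a shared identifier to build a cross-source ID map for a 360-degree customer view.

```python
import pandas as pd

# Hypothetical extracts from two source systems.
crm = pd.DataFrame({"crm_id": ["C1", "C2"], "email": ["a@x.com", "b@x.com"]})
pos = pd.DataFrame({"pos_id": ["P9", "P7"], "email": ["b@x.com", "a@x.com"]})

# Deterministic match on email; real pipelines layer in fuzzy-matching rules.
id_map = crm.merge(pos, on="email", how="outer")

id_map.to_csv("customer_id_map.csv", index=False)  # mapping file consumed downstream
print(id_map)
```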

Posted 3 months ago

Apply