8.0 - 10.0 years
13 - 17 Lacs
Thane
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Collaborate with both onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce comprehensive, clear technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified, holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose solutions to complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big-data paradigms).
- Deep expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
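The "customer ID mapping" responsibility above is, at its core, identity resolution: linking the IDs a customer carries in CRM, web analytics, and call-center systems into one unified profile. A minimal, hypothetical sketch of the underlying technique (union-find over known ID pairs; the IDs are invented and no AEP API is involved):

```python
# Identity resolution sketch: merge customer IDs known to belong to the
# same person (e.g. the same email seen in CRM and web analytics).
# Union-find keeps every transitively linked ID in one cluster.

def resolve_identities(id_pairs):
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for a, b in id_pairs:
        parent[find(a)] = find(b)  # union the two clusters

    clusters = {}
    for x in parent:
        clusters.setdefault(find(x), set()).add(x)
    return list(clusters.values())

# Hypothetical links observed across source systems:
links = [
    ("crm:123", "email:ana@example.com"),
    ("web:cookie-9f", "email:ana@example.com"),
    ("crm:456", "email:raj@example.com"),
]
profiles = resolve_identities(links)
# Two unified profiles: Ana's three IDs and Raj's two IDs.
```

In production this clustering step is usually followed by survivorship rules that decide which attribute values win inside each merged profile.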
Posted 3 weeks ago
8.0 - 10.0 years
13 - 17 Lacs
Mumbai
Remote
Posted 3 weeks ago
8.0 - 10.0 years
13 - 17 Lacs
Ahmedabad
Remote
Posted 3 weeks ago
1.0 - 3.0 years
13 - 17 Lacs
Lucknow
Remote
Posted 3 weeks ago
1.0 - 3.0 years
13 - 17 Lacs
Visakhapatnam
Remote
Posted 3 weeks ago
2.0 - 5.0 years
13 - 17 Lacs
Hyderabad
Remote
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Department: Development
Location: Pune, India

Description
Our bright team fast-track their careers with international exposure and ways of working based on agile development best practices from globally renowned technology consultancies.

Key Responsibilities: Data Architect
- Creating data models that specify how data is formatted, stored, and retrieved inside an organisation, including conceptual, logical, and physical data models.
- Creating and optimising databases, including the selection of appropriate database management systems (DBMS) and the standardisation and indexing of data.
- Creating and maintaining data integration processes, ETL (Extract, Transform, Load) workflows, and data pipelines to seamlessly transport data between systems.
- Collaborating with business analysts, data scientists, and other stakeholders to understand data requirements and align architecture with business objectives.
- Staying current with industry trends, best practices, and advancements in data management through continuous learning and professional development.
- Establishing processes for monitoring and improving the quality of data within the organisation; implementing data quality tools and practices to detect and resolve data issues.

Requirements and Skills: Data Architect
- Prior experience in designing data warehouses, data modelling, database design, and data administration is required.
- Database expertise: knowledge of data warehousing concepts and proficiency in various database systems (e.g., SQL).
- Knowledge of data modelling tools such as Visual Paradigm is required.
- Knowledge of ETL methods and technologies (for example, Azure ADF, Events).
- Expertise in writing complex stored procedures.
- Good understanding of data modelling concepts such as star schema, snowflake schema, etc.
- Strong problem-solving and analytical skills are required to build effective data solutions.
- Excellent communication skills are required to work with cross-functional teams and convert business objectives into technical solutions.
- Knowledge of data governance: understanding of data governance principles, data security, and regulatory compliance.
- Knowledge of programming languages such as .NET can be advantageous.
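The star schema mentioned above can be shown concretely: one central fact table referencing dimension tables by surrogate keys, queried with joins and aggregation. An illustrative sketch (all table and column names are invented) using SQLite so it is self-contained:

```python
import sqlite3

# Star schema sketch: a fact table (fact_sales) pointing at two
# dimension tables (dim_date, dim_product) via surrogate keys.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INT, month INT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (date_key INT, product_key INT, qty INT, amount REAL);

    INSERT INTO dim_date    VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales  VALUES (20240101, 1, 3, 30.0),
                                   (20240101, 2, 1, 15.0),
                                   (20240201, 1, 2, 20.0);
""")

# A typical analytic query over the star: revenue per month per category.
rows = con.execute("""
    SELECT d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month
""").fetchall()
# rows -> [(1, 'Hardware', 45.0), (2, 'Hardware', 20.0)]
```

A snowflake schema differs only in that the dimensions themselves are normalised into further sub-dimension tables (e.g. dim_product referencing a dim_category).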
Posted 3 weeks ago
1.0 - 5.0 years
0 Lacs
Vadodara, Gujarat
On-site
Job Title: Data Architect
Experience: 3 to 4 years
Location: Vadodara, Gujarat
Contact: 9845135287

Job Summary
We are seeking a highly skilled and experienced Data Architect to join our team. As a Data Architect, you will play a crucial role in assessing the current state of our data landscape and working closely with the Head of Data to develop a comprehensive data strategy that aligns with our organisational goals. Your primary responsibility will be to understand and map our current data environments, and then help develop a detailed roadmap that will deliver a data estate enabling our business to deliver on its core objectives.

Main Duties & Responsibilities
The role's core duties include, but are not limited to:
- Assess the current state of our data infrastructure, including data sources, storage systems, and data processing pipelines.
- Collaborate with the Data Ops Director to define and refine the data strategy, taking into account business requirements, scalability, and performance.
- Design and develop a cloud-based data architecture, leveraging Azure technologies such as Azure Data Lake Storage, Azure Synapse Analytics, and Azure Data Factory.
- Define data integration and ingestion strategies to ensure smooth and efficient data flow from various sources into the data lake and warehouse.
- Develop data models and schema designs to support efficient data storage, retrieval, and analysis.
- Implement data governance processes and policies to ensure data quality, security, and compliance.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and provide architectural guidance.
- Conduct performance tuning and optimisation of the data infrastructure to meet business and analytical needs.
- Stay updated with the latest trends and advancements in data management, cloud technologies, and industry best practices.
- Provide technical leadership and mentorship to junior team members.
Key Skills
- Proven work experience as a Data Architect or in a similar role, with a focus on designing and implementing cloud-based data solutions using Azure technology.
- Strong knowledge of data architecture principles, data modelling techniques, and database design concepts.
- Experience with cloud platforms, particularly Azure, and a solid understanding of their data-related services and tools.
- Proficiency in SQL and one or more programming languages commonly used for data processing and analysis (e.g., Python, R, Scala).
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
- Knowledge of data governance, data security, and compliance practices.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into scalable and efficient data solutions.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously.
Working Relationships
- Liaison with stakeholders at all levels of the organisation.

Communication
- Communicate with leadership and colleagues in relation to all business activities.
- Highly articulate and able to explain complex concepts in bite-size chunks.
- Strong ability to provide clear written reporting and analysis.

Personal Qualities
- Ability to work to deadlines, with good time management skills.
- Commercially mindful and able to deliver solutions that maximise value.
- Strong analytical skills.
- Accurate, with excellent attention to detail.
- Personal strength and resilience.
- Adaptable and embraces change.
- Reliable, conscientious, and hardworking.
- Approachable and professional.
- Willing to learn, while recognising the limits of their ability and when to seek advice.

Knowledge / Key Skills
Essential:
- Experience of Azure development and design principles.
- Enterprise-level data warehousing design and implementation.
- Architecture principles.
- Proficiency in SQL development.
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
- Knowledge of data governance, data security, and compliance practices.
- Strong experience mapping an existing data landscape and developing a roadmap to deliver business requirements.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously.
Desirable:
- Knowledge of Enterprise Architecture frameworks (e.g., TOGAF).
- Programming languages such as R, Python, Scala, etc.

Job Type: Full-time
Experience: total work: 1 year (Preferred)
Work Location: In person
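The ETL/ELT familiarity asked for above boils down to a repeatable extract-transform-load loop with data-quality handling for records that fail validation. A toy, framework-free sketch of the pattern (source data and field names are made up; a real pipeline would run in a tool such as Azure Data Factory):

```python
# Minimal ETL pattern: extract raw records, transform/cleanse them,
# load the survivors, and collect rejects for data-quality follow-up.

def extract():
    # Stand-in for reading from a source system or landing zone.
    return [
        {"customer_id": " 001 ", "email": "A@Example.com", "spend": "120.5"},
        {"customer_id": "002",   "email": "bad-email",     "spend": "80"},
    ]

def transform(record):
    # Cleansing rules: trim whitespace, normalise case, enforce types.
    email = record["email"].strip().lower()
    if "@" not in email:
        raise ValueError(f"invalid email: {email}")
    return {
        "customer_id": record["customer_id"].strip(),
        "email": email,
        "spend": float(record["spend"]),
    }

def run_pipeline():
    loaded, rejected = [], []
    for rec in extract():
        try:
            loaded.append(transform(rec))  # "load" step: here, an in-memory list
        except ValueError as err:
            rejected.append((rec, str(err)))
    return loaded, rejected

loaded, rejected = run_pipeline()
# One clean row loaded; one row rejected on the email rule.
```

In an ELT variant the raw records would be loaded first and the transform step expressed in SQL inside the warehouse instead.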
Posted 3 weeks ago
10.0 - 17.0 years
0 Lacs
Hyderabad, Telangana
On-site
We have an exciting opportunity for an ETL Data Architect position with an AI-ML driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access needs to the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems. In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving. The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, Document DB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders. 
Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Staying updated on emerging trends in data technology, particularly in AI/ML applications for finance, is expected. Industry: IT Services and IT Consulting
Posted 3 weeks ago
8.0 - 10.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Company Background
- GET provides skills and expertise to the Oil & Gas industry.
- It provides operational and supervisory field support, remote engineering, technical consulting, and training services.

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data-gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements
- Graduate BSc/BTech in applied sciences with year-2 statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles, as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply with this policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies
- Never-give-up attitude
- Flexible
- Eye for detail
Posted 3 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Remote
Role: Data Modeler Lead
Location: Remote
Experience: 10+ years (healthcare experience is mandatory)

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
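The dimensional models described above are commonly expressed as a star schema: a claims fact table keyed to member and provider dimensions. The sketch below is a minimal, hypothetical illustration using SQLite; all table and column names are invented for illustration, not a prescribed health plan model.

```python
import sqlite3

# Minimal star schema for health plan claims analytics (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_member (
    member_key INTEGER PRIMARY KEY,
    member_id  TEXT,   -- source-system identifier
    plan_type  TEXT    -- e.g. Medicare Advantage, Medicaid, Commercial
);
CREATE TABLE dim_provider (
    provider_key INTEGER PRIMARY KEY,
    npi          TEXT, -- National Provider Identifier
    specialty    TEXT
);
CREATE TABLE fact_claim (
    claim_key    INTEGER PRIMARY KEY,
    member_key   INTEGER REFERENCES dim_member(member_key),
    provider_key INTEGER REFERENCES dim_provider(provider_key),
    icd10_code   TEXT,  -- primary diagnosis code
    paid_amount  REAL
);
""")
conn.execute("INSERT INTO dim_member VALUES (1, 'M001', 'Medicare Advantage')")
conn.execute("INSERT INTO dim_provider VALUES (1, '1234567890', 'Cardiology')")
conn.execute("INSERT INTO fact_claim VALUES (1, 1, 1, 'I10', 125.50)")
conn.execute("INSERT INTO fact_claim VALUES (2, 1, 1, 'E11.9', 80.00)")

# Typical analytical query: total paid amount per plan type.
row = conn.execute("""
    SELECT m.plan_type, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_member m USING (member_key)
    GROUP BY m.plan_type
""").fetchone()
print(row)  # ('Medicare Advantage', 205.5)
```

In a real warehouse the fact table would carry many more measures and conformed dimensions (date, diagnosis, pharmacy), but the join-and-aggregate pattern is the same.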
Posted 3 weeks ago
8.0 - 10.0 years
16 - 20 Lacs
Chennai
Work from Office
IT & Technology Senior Manager, Data Analytics

Company Background:
- GET provides skills and expertise to the Oil & Gas industry.
- It provides operational and supervisory field support, remote engineering, technical consulting, and training services.

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data-gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (for an operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of the operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply at all times with the policy, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills:
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies:
- Never-give-up attitude
- Flexible
- Eye for detail

Experience: Minimum 8 years of experience
Posted 3 weeks ago
3.0 - 8.0 years
9 - 14 Lacs
Mumbai
Remote
Role: Data Modeler Lead
Location: Remote
Experience: 10+ years (healthcare experience is mandatory)

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
Posted 3 weeks ago
10.0 - 12.0 years
20 - 25 Lacs
Mumbai
Work from Office
Key Responsibilities:

As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging the Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including relational, dimensional, columnar, and big data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of data modeling experience (i.e., relational, dimensional, columnar, big data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced data warehouse concepts.
- Proven experience with industry ETL tools (i.e., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (i.e., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
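Customer-centric dataset design of the kind described here usually hinges on stitching identifiers from CRM, call center, and point-of-sale records into a single customer view. Below is a minimal identity-resolution sketch using union-find over shared identifiers; all record fields and values are hypothetical:

```python
# Identity resolution sketch: records sharing any identifier (email or
# phone) are merged into one unified customer cluster.
records = [
    {"src": "CRM",  "email": "a@x.com", "phone": None},
    {"src": "POS",  "email": None,      "phone": "555-0100"},
    {"src": "Call", "email": "a@x.com", "phone": "555-0100"},
    {"src": "CRM",  "email": "b@y.com", "phone": None},
]

parent = list(range(len(records)))  # union-find forest over record indices

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path halving keeps trees shallow
        i = parent[i]
    return i

def union(i, j):
    parent[find(i)] = find(j)

seen = {}  # identifier value -> first record index that carried it
for idx, rec in enumerate(records):
    for key in ("email", "phone"):
        if rec[key] is not None:
            if rec[key] in seen:
                union(idx, seen[rec[key]])
            else:
                seen[rec[key]] = idx

# Group records by their cluster root: one cluster per unified customer.
clusters = {}
for idx in range(len(records)):
    clusters.setdefault(find(idx), []).append(records[idx]["src"])

print(sorted(clusters.values()))  # [['CRM'], ['CRM', 'POS', 'Call']]
```

Production identity graphs add probabilistic matching and identifier priorities, but the transitive-merge core is the same idea.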
Posted 3 weeks ago
8.0 - 10.0 years
16 - 20 Lacs
Kolkata
Work from Office
Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data-gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (for an operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of the operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply at all times with the policy, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills:
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies:
- Never-give-up attitude
- Flexible
- Eye for detail
Posted 3 weeks ago
3.0 - 8.0 years
9 - 14 Lacs
Kolkata
Work from Office
Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
Posted 3 weeks ago
8.0 - 10.0 years
16 - 20 Lacs
Mumbai
Work from Office
IT & Technology Senior Manager, Data Analytics

Company Background:
- GET provides skills and expertise to the Oil & Gas industry.
- It provides operational and supervisory field support, remote engineering, technical consulting, and training services.

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data-gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (for an operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of the operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply at all times with the policy, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills:
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies:
- Never-give-up attitude
- Flexible
- Eye for detail
Posted 3 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Tech Mahindra Ltd. is looking for a Data Architect - Data Governance - Collibra & Purview to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture.

1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
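Data dictionaries and lineage documents of the kind mentioned here are typically maintained as structured metadata. A toy sketch of the idea (this does not use Collibra or Purview APIs; all dataset names are invented):

```python
# Toy metadata registry: a data dictionary whose entries also carry
# upstream lineage links, so provenance can be traced programmatically.
dictionary = {}

def register(dataset, columns, upstream=()):
    """Add a dataset's column definitions and upstream dependencies."""
    dictionary[dataset] = {"columns": columns, "upstream": list(upstream)}

def lineage(dataset):
    """Walk upstream links to list every ancestor of a dataset."""
    out = []
    for parent in dictionary[dataset]["upstream"]:
        out.append(parent)
        out.extend(lineage(parent))
    return out

register("raw_orders", {"order_id": "string", "amount": "decimal"})
register("clean_orders", {"order_id": "string", "amount": "decimal"},
         upstream=["raw_orders"])
register("revenue_report", {"month": "date", "revenue": "decimal"},
         upstream=["clean_orders"])

print(lineage("revenue_report"))  # ['clean_orders', 'raw_orders']
```

Governance platforms add ownership, classification, and policy enforcement on top, but the dataset-plus-lineage graph is the core data structure.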
Posted 3 weeks ago
1.0 - 4.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Req ID: 325274

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Adobe Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

- Experience in developing digital marketing / digital analytics solutions using Adobe products.
- Experience in Adobe Experience Cloud products and recent experience with Adobe Experience Platform (AEP) or a similar CDP.
- Good knowledge of Data Science Workspace and building intelligent services on AEP.
- Strong knowledge of datasets in Adobe Experience Platform; load data into Platform through data source connectors, APIs, and streaming ingestion connectors.
- Experience in creating all required Adobe XDM (Experience Data Model) schemas in JSON, based on the approved data model, for all data files to be loaded.
- Knowledge of using the Adobe Experience Platform (AEP) UI and Postman to automate all customer schema, data lake, and profile design setups within each sandbox environment.
- Experience configuring, within Adobe Experience Platform, all necessary identities and privacy settings, and creating new segments within AEP to meet customer use cases; test/validate the segments with the required destinations.
- Manage customer data using Real-Time Customer Data Platform (RTCDP), and analyze customer data using Customer Journey Analytics (CJA).
- Experience with creating connections, data views, and dashboards in CJA.
- Hands-on experience in configuration and integration of Adobe Marketing Cloud modules like Audience Manager, Analytics, Campaign, and Target.
- Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable.
- Proven ability to communicate both verbally and in writing in a high-performance, collaborative environment.
- Experience with data analysis, modeling, and mapping to coordinate closely with Data Architect(s).
- Build the necessary schemas and workflows to ingest customer data, transform the data, and load the data into AEP successfully.
- Build audiences (segmentations) and create the necessary pipeline for destination activation.
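XDM schemas like those described above are JSON structures. The snippet below sketches the general shape of a minimal, hypothetical profile-style schema fragment; the tenant namespace "_acmecorp" and all field names are invented for illustration and are not actual AEP output:

```python
import json

# Hypothetical minimal XDM-style schema fragment for a customer profile.
# Tenant namespace and field names are invented for illustration only.
schema = {
    "title": "Customer Profile Schema",
    "type": "object",
    "properties": {
        "_acmecorp": {            # tenant-specific namespace (assumed)
            "type": "object",
            "properties": {
                "loyaltyId": {"type": "string"},
                "lifetimeValue": {"type": "number"},
            },
        }
    },
}

# Serialize as it would travel over an API call, then decode and inspect.
payload = json.dumps(schema, indent=2)
decoded = json.loads(payload)
print(decoded["properties"]["_acmecorp"]["properties"]["loyaltyId"]["type"])
```

Real XDM schemas are composed from Adobe-defined classes and field groups via the Schema Registry; this sketch only shows why JSON fluency matters for the role.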
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Diacto Technologies is looking for a Data Architect (Snowflake) to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture.

1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
Posted 3 weeks ago
5.0 - 9.0 years
7 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Diacto Technologies is looking for a Data Architect (Databricks) to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture.

1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
Posted 3 weeks ago
8.0 - 13.0 years
25 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Experience: 8-15 years
Location: Bangalore, Chennai, Delhi, Pune

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate the data pipelines via the Airflow scheduler.

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience.
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects.
- Deep understanding of star and snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Must have experience with the AWS/Azure stack.
- Desirable to have ETL with batch and streaming (Kinesis).
- Experience in building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, such as Hadoop (incl. Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Should have experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.
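The warehouse transformation work this role describes typically follows a bronze → silver → gold (medallion) flow: raw events land as-is, are cleaned and deduplicated, then aggregated for reporting. A plain-Python sketch of that flow, standing in for PySpark DataFrame transformations (the order data is invented):

```python
# Medallion-style pipeline sketch: raw rows (bronze) are type-cast,
# validated, and deduplicated (silver), then aggregated (gold).
bronze = [
    {"order_id": "1", "amount": "100.0", "country": "IN"},
    {"order_id": "1", "amount": "100.0", "country": "IN"},  # duplicate
    {"order_id": "2", "amount": "bad",   "country": "IN"},  # malformed
    {"order_id": "3", "amount": "250.5", "country": "US"},
]

def to_silver(rows):
    """Cast types, drop malformed rows, deduplicate on order_id."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # quarantine malformed records in a real pipeline
        if r["order_id"] not in seen:
            seen.add(r["order_id"])
            out.append({**r, "amount": amount})
    return out

def to_gold(rows):
    """Aggregate for reporting: total revenue per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 100.0, 'US': 250.5}
```

In Databricks each stage would be a Delta table and the steps would be Spark jobs scheduled by Airflow, but the layered contract (raw, validated, aggregated) is the same.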
Posted 3 weeks ago
19.0 - 23.0 years
60 - 75 Lacs
Bengaluru, Delhi / NCR
Hybrid
Preferred Candidate Profile:
We are looking for a Solution Architect - Data to lead the design and implementation of scalable, secure, and high-performance data solutions. You will play a key role in defining the data architecture and strategy across enterprise platforms, ensuring alignment with business goals and IT standards.
- 18+ years of IT experience, with at least 5 years working as an architect.
- Experience working as a Data Architect.
- Experience architecting reporting and analytics solutions.
- Experience architecting AI & ML solutions.
- Experience with Databricks.
Posted 4 weeks ago
12.0 - 15.0 years
40 - 50 Lacs
Bengaluru
Work from Office
JD: A Snowflake resource with good knowledge of architecting environments and hands-on experience in Snowflake itself. We are looking for candidates with 12 to 15 years of experience and knowledge of Snowflake and all its features, including dbt Jinja. The candidate should also have strong verbal communication skills and be able to think at the organization level with a forward-looking perspective.
Posted 1 month ago
13.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Posting Title: Data Architect
City: Bangalore
Industry: Technology
Remote: No
Key Responsibilities:
- Design and architect end-to-end data solutions using Microsoft Fabric, Azure Data Factory, Azure Synapse Analytics, and other Azure data services.
- Develop comprehensive data architecture blueprints, including logical and physical data models.
- Create data integration patterns and establish best practices for data ingestion, transformation, and consumption.
- Design data lake and lakehouse architectures optimized for performance, cost, and governance.
- Lead implementation of Microsoft Fabric solutions including Data Factory, Data Activator, Power BI, and Real-Time Analytics.
- Design and implement medallion architecture (Bronze, Silver, Gold layers) within Fabric.
- Optimize OneLake storage and data organization strategies.
- Configure and manage Fabric workspaces, capacity, and security models.
- Architect complex ETL/ELT pipelines using Azure Data Factory and Fabric Data Factory.
- Design real-time and batch data processing solutions.
- Implement data quality frameworks and monitoring solutions.
Required Qualifications:
- Overall 13-15 years of experience; 5+ years of experience in data architecture and analytics solutions.
- Hands-on experience with Microsoft Fabric.
- Expert-level proficiency in Azure data services (Azure Data Factory, Synapse Analytics, Azure SQL Database, Cosmos DB).
- Strong experience with Power BI development and administration.
- Proficiency in SQL, Python, and/or Scala for data processing.
- Experience with Delta Lake and Apache Spark.
- Proficiency in data cataloging tools and techniques.
- Experience in data governance using tools such as Purview or Unity Catalog.
- Expertise in Azure Databricks in conjunction with Azure Data Factory and Synapse.
- Implementation and optimization using medallion architecture.
- Experience with Event Hub and IoT (streaming) data.
- Strong understanding of Azure cloud architecture and services.
- Knowledge of Git, Azure DevOps, and CI/CD pipelines for data solutions.
- Understanding of containerization and orchestration technologies.
- Hands-on experience with Fabric Data Factory pipelines.
- Experience with Fabric Data Activator for real-time monitoring.
- Knowledge of Fabric Real-Time Analytics (KQL databases).
- Understanding of Fabric capacity management and optimization.
- Experience with OneLake and Fabric.
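The medallion architecture named in this listing can be illustrated with a deliberately simplified sketch: Bronze holds raw ingested records as-is, Silver cleanses and deduplicates them, and Gold aggregates for reporting. Real Fabric or Databricks implementations use Delta tables and Spark; this stdlib-only Python version, with invented sample data, only shows the responsibility of each layer.

```python
from collections import defaultdict

# Bronze layer: raw ingested events, kept untouched (here: a duplicate and a bad row).
bronze = [
    {"order_id": "1", "customer": "asha ", "amount": "250.0"},
    {"order_id": "1", "customer": "asha ", "amount": "250.0"},  # duplicate
    {"order_id": "2", "customer": "Ravi", "amount": "bad"},     # unparseable amount
    {"order_id": "3", "customer": "Ravi", "amount": "75.0"},
]

def to_silver(rows):
    """Silver layer: trim strings, cast types, drop invalid rows, deduplicate by key."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # in a real pipeline this row would go to a quarantine table
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"],
                    "customer": r["customer"].strip().title(),
                    "amount": amount})
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (revenue per customer) for reporting."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["customer"]] += r["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'Asha': 250.0, 'Ravi': 75.0}
```

The key design point the sketch preserves is that each layer is derived from the previous one, so Silver and Gold can always be rebuilt from the raw Bronze data.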
Posted 1 month ago