
1769 Data Architecture Jobs - Page 9

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Career Category: Engineering

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Modeler is responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Modeler drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data.

Roles & Responsibilities:
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Contribute to and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience:
- Doctorate, Master's, or Bachelor's degree with 8-12 years of experience in Computer Science, IT or a related field

Functional Skills:
Must-Have Skills:
- Data Modeling: Proficiency in creating conceptual, logical, and physical data models to represent information structures; ability to interview and communicate with business subject-matter experts to develop data models that are useful for their analysis needs
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including performance tuning of big data processing
- Experience implementing data testing and data quality strategies (a minimal sketch follows this posting)

Good-to-Have Skills:
- Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic

Professional Certifications (preferred or mandatory for the role):
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts, as required by business needs.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
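Given the hands-on Spark and data-quality skills this posting asks for, here is a minimal illustrative sketch (not part of the posting) of a PySpark data-quality gate; the table and column names are invented:

```python
# Minimal sketch of a data-quality gate in PySpark; table/column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

df = spark.read.table("enterprise_fabric.patient_events")  # hypothetical table

# Collect basic quality metrics in a single pass over the data.
metrics = df.agg(
    F.count("*").alias("row_count"),
    F.sum(F.col("patient_id").isNull().cast("int")).alias("null_patient_ids"),
    F.countDistinct("event_id").alias("distinct_event_ids"),
).first()

# Fail fast if key expectations are violated.
assert metrics.null_patient_ids == 0, "patient_id must never be null"
assert metrics.distinct_event_ids == metrics.row_count, "event_id must be unique"
```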

Posted 1 week ago

Apply

8.0 - 17.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Career Category: Engineering
Role Name: IS Architecture
Job Posting Title: Data Architect
Workday Job Profile: Principal IS Architect
Department Name: Digital, Technology Innovation
Role GCF: 06A

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of Data Modelers.

Roles & Responsibilities:
- Provide oversight to data modeling team members
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Establish and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Evaluate emerging technologies and assess their potential impact
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience (GCF Level 6A):
- Doctorate degree and 8 years of experience in Computer Science, IT or a related field, OR
- Master's degree with 12-15 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 14-17 years of experience in Computer Science, IT or a related field

Functional Skills:
Must-Have Skills:
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures; ability to interview and communicate with business subject-matter experts to develop data models that are useful for their analysis needs (a physical-design sketch follows this posting)
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including performance tuning of big data processing

Good-to-Have Skills:
- Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic

Professional Certifications:
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts, as required by business needs.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
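Since this role turns logical models into physical designs on Databricks, here is a minimal illustrative sketch (not from the posting) of realizing a logical entity as a physical Delta table; the schema, table, and column names are hypothetical:

```python
# Sketch: realizing a logical entity as a physical Delta table on Databricks.
# Schema, table, and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("physical-design").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS fabric.study_subject (
        subject_id   STRING NOT NULL,
        study_id     STRING NOT NULL,
        enrolled_on  DATE,
        site_code    STRING
    )
    USING DELTA
    PARTITIONED BY (study_id)  -- partition on the dominant filter column
    COMMENT 'Physical model for the Study Subject logical entity'
""")
```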

Posted 1 week ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Career Category: Engineering

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Modeler is responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Modeler drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data.

Roles & Responsibilities:
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Contribute to and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience:
- Doctorate, Master's, or Bachelor's degree with 8-12 years of experience in Computer Science, IT or a related field

Functional Skills:
Must-Have Skills:
- Data Modeling: Proficiency in creating conceptual, logical, and physical data models to represent information structures; ability to interview and communicate with business subject-matter experts to develop data models that are useful for their analysis needs
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including performance tuning of big data processing
- Experience implementing data testing and data quality strategies

Good-to-Have Skills:
- Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic

Professional Certifications (preferred or mandatory for the role):
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts, as required by business needs.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status.

Posted 1 week ago

Apply

7.0 - 12.0 years

40 - 45 Lacs

Chennai

Hybrid

Role: Data Engineer/Architect
Experience: 7 to 16 years
Location: Chennai (3 days in office per week)
Mandatory Skills: Data Warehousing, Data Modelling, Snowflake, Data Build Tool (DBT), SQL, any cloud (AWS/Azure/GCP), Python/PySpark (good to have)

Overview of the requirement:
We are looking for a skilled Data Architect / Sr. Data Engineer to design and implement data solutions supporting the Marketing, Sales, and Customer Service areas. The ideal candidate will have experience with DBT, Snowflake, Python (good to have) and Azure/AWS/GCP, along with a strong foundation in cloud platforms. You will be responsible for developing scalable, efficient data architectures that enable personalized customer experiences and advanced analytics.

Roles and Responsibilities:
- Implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs (a small Python sketch follows this posting)
- Optimize workflows using DBT to streamline data transformation and modeling processes
- Strong expertise in SQL with hands-on experience in querying, transforming, and analysing large datasets
- Solid understanding of data profiling, validation, and cleansing techniques
- Strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks
- Expertise with cloud data platforms (Azure/AWS/GCP) for large-scale data processing
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain large-scale data warehouses on Snowflake
- Optimize database performance and ensure data quality
- Troubleshoot and resolve technical issues related to data processing and analysis
- Participate in code reviews and contribute to improving overall code quality

Job Requirements:
- Strong understanding of data modeling and ETL concepts
- Experience with Snowflake and DBT is highly desirable
- Strong expertise in SQL with hands-on experience in querying, transforming, and analysing large datasets
- Expertise with cloud data platforms (Azure/AWS/GCP) for large-scale data processing
- Excellent problem-solving skills and attention to detail
- Ability to work collaboratively in a team environment
- Strong communication and interpersonal skills
- Familiarity with agile development methodologies
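For the Snowflake-plus-Python stack named above, here is a minimal illustrative sketch (not from the posting) of running an ELT-style transformation from Python with the Snowflake connector; the connection parameters and table names are placeholders:

```python
# Sketch: an ELT-style transformation in Snowflake via snowflake-connector-python.
# Account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="***",         # placeholder
    warehouse="ANALYTICS_WH",
    database="MARKETING",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Dedupe staged customer rows into a reporting table,
    # keeping the latest record per customer.
    cur.execute("""
        CREATE OR REPLACE TABLE REPORTING.DIM_CUSTOMER AS
        SELECT *
        FROM STAGING.CUSTOMERS
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY customer_id ORDER BY updated_at DESC
        ) = 1
    """)
finally:
    conn.close()
```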

Posted 1 week ago

Apply

4.0 - 7.0 years

12 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description
Location: Kumbalgodu, Kengeri, Bangalore (onsite)
Type: Full-time | Monday to Saturday
Experience: 3+ years in ERP implementation

We at Girish Exports are transitioning from a legacy ERP (Visual Gems) to a custom-built system on Zoho Creator. We're looking for a practical, hands-on ERP Implementation Lead who understands real-world operations and knows how to bring tech and people together.

What You'll Do:
- Lead the planning and rollout of our ERP system across departments
- Work closely with developers and business users to map operations into usable system workflows
- Design modular data flows that connect upstream and downstream processes
- Collaborate with department heads to drive adoption and coordinate training plans
- Ensure the ERP system supports teams like merchandising, production, stores, finance, HR, and maintenance
- Identify bottlenecks, simplify processes, and make sure solutions work in the real world, not just on paper
- Occasional travel to factory units will be required; expenses will be fully covered by the company

You Should Have:
- 3+ years of ERP implementation experience in complex, real-world setups
- Mandatory hands-on experience with Zoho Creator
- Strong understanding of operational workflows, data architecture, and process mapping
- Ability to work with non-tech users (shop floor, stores, admin) and ensure smooth adoption
- Excellent communication and cross-functional collaboration skills
- A mindset focused on outcomes, not just systems

Why Join Us?
If you're excited by the idea of driving real change and making a tangible impact on day-to-day operations, this is the role for you. You'll help shape a custom-built ERP system from the ground up, and if using data-driven insights to improve how things actually work on the ground excites you, you'll thrive here.

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 24 Lacs

Kochi

Work from Office

Responsibilities:
- Ensure data accuracy & compliance with regulatory standards
- Develop data strategy, governance & quality plans
- Manage metadata, stewardship & lineage
- Collaborate on enterprise-wide data initiatives

Benefits: Remote work & Saudi; annual bonus; health insurance

Posted 1 week ago

Apply

12.0 - 15.0 years

20 - 25 Lacs

Hyderabad

Work from Office

We are seeking a seasoned Principal Architect - Solutions to drive the architecture, development and implementation of data solutions for Amgen functional groups. The ideal candidate is able to work on large-scale data analytics initiatives, engage and work along with Business, Program Management, Data Engineering and Analytic Engineering teams, and champion the enterprise data analytics strategy, data architecture blueprints and architectural guidelines. As a Principal Architect, you will play a crucial role in designing, building, and optimizing data solutions for Amgen functional groups such as R&D, Operations and GCO.

Roles & Responsibilities:
- Implement and manage large-scale data analytics solutions for Amgen functional groups that align with the Amgen data strategy
- Collaborate with Business, Program Management, Data Engineering and Analytic Engineering teams to deliver data solutions
- Responsible for the design, development, optimization, delivery and support of data solutions on AWS and Databricks architecture
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Provide expert guidance and mentorship to team members, fostering a culture of innovation and best practices
- Be passionate and hands-on, quickly experimenting with new data-related technologies
- Define guidelines, standards, strategies, security policies and change management policies to support the Enterprise Data Platform
- Collaborate and align with EARB, Cloud Infrastructure, Security and other technology leaders on Enterprise Data Architecture changes
- Work with different project and application groups to drive growth of the Enterprise Data Platform using effective written/verbal communication skills, and lead demos at different roadmap sessions
- Overall management of the Enterprise Data Platform on the AWS environment to ensure that service delivery is cost-effective and business SLAs around uptime, performance and capacity are met
- Ensure scalability, reliability, and performance of data platforms by implementing best practices for architecture, cloud resource optimization, and system tuning
- Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible
- Maintain knowledge of market trends and developments in data integration, data management and analytics software/tools
- Work as part of a team in a SAFe Agile/Scrum model

Basic Qualifications and Experience:
- Master's degree with 12-15 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 14-17 years of experience in Computer Science, IT or a related field

Functional Skills:
Must-Have Skills:
- 8+ years of hands-on experience in data integrations, data management and the BI technology stack
- Strong experience with one or more data management tools such as AWS data lake, Snowflake or Azure Data Fabric
- Expert-level proficiency with Databricks and experience in optimizing data pipelines and workflows in Databricks environments (a Delta tuning sketch follows this posting)
- Strong experience with Python, PySpark, and SQL for building scalable data workflows and pipelines
- Experience with Apache Spark, Delta Lake, and other relevant technologies for large-scale data processing
- Familiarity with BI tools including Tableau and Power BI
- Demonstrated ability to enhance cost-efficiency, scalability, and performance of data solutions
- Strong analytical and problem-solving skills to address complex data solutions

Good-to-Have Skills:
- Experience in life science, tech, or consultative solution architecture roles
- Experience working with agile development methodologies such as Scaled Agile

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certificate preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
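For the Databricks pipeline-optimization skills listed above, here is a minimal illustrative sketch (not from the posting) of routine Delta Lake maintenance; the table and column names are hypothetical:

```python
# Sketch: routine Delta Lake maintenance on Databricks for query performance.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-tuning").getOrCreate()

# Compact small files and co-locate rows that are frequently filtered together.
spark.sql("OPTIMIZE fabric.sales_events ZORDER BY (region, event_date)")

# Remove data files no longer referenced by the table (default retention applies).
spark.sql("VACUUM fabric.sales_events")

# Refresh table-level statistics used by the query optimizer.
spark.sql("ANALYZE TABLE fabric.sales_events COMPUTE STATISTICS")
```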

Posted 1 week ago

Apply

4.0 - 6.0 years

10 - 14 Lacs

Hyderabad

Work from Office

In this role, you will design, build and maintain data lake solutions for scientific data that drive business decisions for Research. You will build scalable and high-performance data engineering solutions for large scientific datasets and collaborate with Research stakeholders. The ideal candidate possesses experience in the pharmaceutical or biotech industry, demonstrates strong technical skills, is proficient with big data technologies, and has a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and implement data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Develop and maintain data models for biopharma scientific data, data dictionaries, and other documentation to ensure data accuracy and consistency
- Optimize large datasets for query performance
- Collaborate with global cross-functional teams, including research scientists, to understand data requirements and design solutions that meet business needs
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, Software Engineers and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Maintain comprehensive documentation of processes, systems, and solutions

Basic Qualifications and Experience:
- Doctorate degree, OR
- Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR
- Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR
- Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field

Preferred Qualifications and Experience:
- 3+ years of experience in implementing and supporting biopharma scientific research data analytics (software platforms)

Functional Skills:
Must-Have Skills:
- Proficiency in SQL and Python for data engineering, test automation frameworks (pytest), and scripting tasks (a pytest sketch follows this posting)
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning of big data processing
- Excellent problem-solving skills and the ability to work with large, complex datasets

Good-to-Have Skills:
- A passion for tackling complex challenges in drug discovery with technology and data
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Strong experience using an RDBMS (e.g. Oracle, MySQL, SQL Server, PostgreSQL)
- Knowledge of cloud data platforms (AWS preferred)
- Experience with data visualization tools (e.g. Dash, Plotly, Spotfire)
- Experience with diagramming and collaboration tools such as Miro, Lucidchart or similar tools for process mapping and brainstorming
- Experience writing and maintaining technical documentation in Confluence
- Understanding of data governance frameworks, tools, and best practices

Professional Certifications:
- Databricks Certified Data Engineer Professional preferred

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

EQUAL OPPORTUNITY STATEMENT
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
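Since the must-have skills above call for pytest-based test automation around data pipelines, here is a minimal illustrative sketch (not from the posting) of unit-testing a PySpark transformation; the transformation and names are hypothetical:

```python
# Sketch: unit-testing a PySpark transformation with pytest.
# The transformation and all names are hypothetical.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Small local session so tests run without a cluster.
    return SparkSession.builder.master("local[2]").appName("tests").getOrCreate()

def dedupe_assays(df):
    """Transformation under test: drop duplicate assay readings."""
    return df.dropDuplicates(["assay_id"])

def test_dedupe_assays(spark):
    df = spark.createDataFrame(
        [("a1", 0.42), ("a1", 0.42), ("a2", 1.10)],
        ["assay_id", "value"],
    )
    result = dedupe_assays(df)
    assert result.count() == 2
```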

Posted 1 week ago

Apply

7.0 - 11.0 years

22 - 27 Lacs

Hyderabad

Work from Office

We are seeking a seasoned Solution Architect to drive the architecture, development and implementation of data solutions for Amgen functional groups. The ideal candidate is able to work on large-scale data analytics initiatives, engage and work along with Business, Program Management, Data Engineering and Analytic Engineering teams, and champion the enterprise data analytics strategy, data architecture blueprints and architectural guidelines. As a Solution Architect, you will play a crucial role in designing, building, and optimizing data solutions for Amgen functional groups such as R&D, Operations and GCO.

Roles & Responsibilities:
- Implement and manage large-scale data analytics solutions for Amgen functional groups that align with the Amgen data strategy
- Collaborate with Business, Program Management, Data Engineering and Analytic Engineering teams to deliver data solutions
- Responsible for the design, development, optimization, delivery and support of data solutions on AWS and Databricks architecture
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Provide expert guidance and mentorship to team members, fostering a culture of innovation and best practices
- Be passionate and hands-on, quickly experimenting with new data-related technologies
- Define guidelines, standards, strategies, security policies and change management policies to support the Enterprise Data Platform
- Collaborate and align with EARB, Cloud Infrastructure, Security and other technology leaders on Enterprise Data Architecture changes
- Work with different project and application groups to drive growth of the Enterprise Data Platform using effective written/verbal communication skills, and lead demos at different roadmap sessions
- Overall management of the Enterprise Data Platform on the AWS environment to ensure that service delivery is cost-effective and business SLAs around uptime, performance and capacity are met
- Ensure scalability, reliability, and performance of data platforms by implementing best practices for architecture, cloud resource optimization, and system tuning
- Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible
- Maintain knowledge of market trends and developments in data integration, data management and analytics software/tools
- Work as part of a team in a SAFe Agile/Scrum model

Basic Qualifications and Experience:
- Master's degree with 7-11 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 8-13 years of experience in Computer Science, IT or a related field

Functional Skills:
Must-Have Skills:
- 7+ years of hands-on experience in data integrations, data management and the BI technology stack
- Strong experience with one or more data management tools such as AWS data lake, Snowflake or Azure Data Fabric
- Expert-level proficiency with Databricks and experience in optimizing data pipelines and workflows in Databricks environments
- Strong experience with Python, PySpark, and SQL for building scalable data workflows and pipelines
- Experience with Apache Spark, Delta Lake, and other relevant technologies for large-scale data processing
- Familiarity with BI tools including Tableau and Power BI
- Demonstrated ability to enhance cost-efficiency, scalability, and performance of data solutions
- Strong analytical and problem-solving skills to address complex data solutions

Good-to-Have Skills:
- Experience in life science, tech, or consultative solution architecture roles
- Experience working with agile development methodologies such as Scaled Agile

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certificate preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Posted 1 week ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Hyderabad

Work from Office

We are seeking a seasoned and passionate Principal Architect (Enterprise Architect - Data Platform Engineering) in our Data Architecture & Engineering group to drive the architecture, development and implementation of our strategy spanning the Data Fabric, Data Management, and Data Analytics Platform stack. The ideal candidate possesses deep technical expertise and understanding of the data and analytics landscape, current tools and technology trends, and data engineering principles, coupled with strong leadership and data-driven problem-solving skills. As a Principal Architect, you will play a crucial role in building the strategy and driving the implementation of best practices across data and analytics platforms.

Roles & Responsibilities:
- Must be passionate about data, content and AI technologies, with the ability to evaluate and assess new technology and market trends quickly, with enterprise architecture in mind
- Drive the strategy and implementation of the enterprise data platform and technical roadmaps that align with the Amgen data strategy
- Maintain the pulse of current market trends in the data & AI space and quickly perform hands-on experimentation and evaluations
- Provide expert guidance and influence management and peers from functional groups with an enterprise mindset and goals
- Responsible for the design, development, optimization, delivery and support of the Enterprise Data Platform on AWS and Databricks architecture
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Advise and support application teams (product managers, architects, business analysts, and developers) on tools, technology, and methodology related to the design and development of applications that have large data volumes and a variety of data types
- Collaborate and align with EARB, Cloud Infrastructure, Security and other technology leaders on Enterprise Data Architecture changes
- Ensure scalability, reliability, and performance of data platforms by implementing best practices for architecture, cloud resource optimization, and system tuning
- Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible

Basic Qualifications and Experience:
- Master's degree with 8-10 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 10-14 years of experience in Computer Science, IT or a related field

Functional Skills:
Must-Have Skills:
- 8+ years of experience in data architecture and engineering or related roles, with hands-on experience building enterprise data platforms in a cloud environment (AWS, Azure, GCP)
- 5+ years of experience leading enterprise-scale data platforms and solutions
- Expert-level proficiency with Databricks and experience in optimizing data pipelines and workflows in Databricks environments
- Deep understanding of distributed computing, data architecture, and performance optimization in cloud-based environments
- Enterprise mindset; certifications like TOGAF are a plus
- Big Tech or Big Consulting experience is highly preferred
- Solid knowledge of data security, governance, and compliance practices in cloud environments
- Exceptional communication skills to engage and influence architects and leaders in the organization

Good-to-Have Skills:
- Experience with Gen AI tools in Databricks
- Experience with unstructured data architecture and pipelines
- Experience working with agile development methodologies such as Scaled Agile

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certificate preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Posted 1 week ago

Apply

3.0 - 8.0 years

3 - 6 Lacs

Hyderabad

Work from Office

The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that give the Amgen scientific community visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications.

The Sr. Data Engineer will be responsible for the end-to-end development of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and the advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions, and be exceptionally skilled with data analysis and profiling. You will collaborate closely with stakeholders, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Roles & Responsibilities:
- Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools
- Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation
- Break down features into work that aligns with the architectural direction runway
- Participate hands-on in pilots and proofs-of-concept for new patterns
- Create robust documentation from data analysis and profiling, and of proposed designs and data logic
- Develop advanced SQL queries to profile and unify data
- Develop data processing code in SQL, along with semantic views to prepare data for reporting
- Develop Power BI models and reporting packages
- Design robust data models and processing layers that support both analytical processing and operational reporting needs
- Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments
- Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
- Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
- Collaborate with stakeholders to define data requirements, functional specifications, and project goals
- Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions

Basic Qualifications and Experience:
- Master's degree with 4 to 6 years of experience as a Product Owner / Platform Owner / Service Owner, OR
- Bachelor's degree with 8 to 10 years of experience as a Product Owner / Platform Owner / Service Owner

Functional Skills:
Must-Have Skills:
- Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
- Minimum of 6 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management (a CDC sketch follows this posting)
- Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
- Deep understanding of Power BI, including model design, DAX, and Power Query
- Proven experience designing and implementing data mastering solutions and data governance frameworks
- Expertise in cloud platforms (AWS), data lakes, and data warehouses
- Strong knowledge of ETL processes, data pipelines, and integration technologies
- Strong communication and collaboration skills to work with cross-functional teams and senior leadership
- Ability to assess business needs and design solutions that align with organizational goals
- Exceptional hands-on capabilities with data profiling, data transformation, and data mastering
- Success in mentoring and training team members

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions
- Experience with human data, ideally human healthcare data
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred or mandatory for the role):
- ITIL Foundation or other relevant certifications (preferred)
- SAFe Agile Practitioner (6.0)
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Databricks Certified Professional or similar certification

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences
- Confident technical leader
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources
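Since the role asks for hands-on change-data-capture (CDC) pipeline experience on Databricks, here is a minimal illustrative sketch (not from the posting) of applying a CDC batch with a Delta Lake MERGE; the table and column names are hypothetical:

```python
# Sketch: applying a CDC batch to a Delta table with MERGE.
# Table and column names are hypothetical; 'op' marks insert/update/delete.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc-merge").getOrCreate()

updates = spark.read.table("staging.subject_changes")   # incoming CDC rows
target = DeltaTable.forName(spark, "unified.subjects")  # mastered target

(target.alias("t")
    .merge(updates.alias("s"), "t.subject_id = s.subject_id")
    .whenMatchedDelete(condition="s.op = 'D'")        # source deletes
    .whenMatchedUpdateAll(condition="s.op = 'U'")     # source updates
    .whenNotMatchedInsertAll(condition="s.op = 'I'")  # new rows
    .execute())
```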

Posted 1 week ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that give the Amgen scientific community visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications.

The Data Engineer will be responsible for the end-to-end development of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and the advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions, and be exceptionally skilled with data analysis and profiling. You will collaborate closely with stakeholders, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Roles & Responsibilities:
- Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools
- Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation
- Break down features into work that aligns with the architectural direction runway
- Participate hands-on in pilots and proofs-of-concept for new patterns
- Create robust documentation from data analysis and profiling, and of proposed designs and data logic
- Develop advanced SQL queries to profile and unify data
- Develop data processing code in SQL, along with semantic views to prepare data for reporting
- Develop Power BI models and reporting packages
- Design robust data models and processing layers that support both analytical processing and operational reporting needs
- Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments
- Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
- Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
- Collaborate with stakeholders to define data requirements, functional specifications, and project goals
- Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions

Basic Qualifications and Experience:
- Master's degree with 1 to 3 years of experience in Data Engineering, OR
- Bachelor's degree with 4 to 5 years of experience in Data Engineering, OR
- Diploma and 7 to 9 years of experience in Data Engineering

Functional Skills:
Must-Have Skills:
- Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
- Minimum of 3 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management
- Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
- Deep understanding of Power BI, including model design, DAX, and Power Query
- Proven experience designing and implementing data mastering solutions and data governance frameworks
- Expertise in cloud platforms (AWS), data lakes, and data warehouses
- Strong knowledge of ETL processes, data pipelines, and integration technologies
- Strong communication and collaboration skills to work with cross-functional teams and senior leadership
- Ability to assess business needs and design solutions that align with organizational goals
- Exceptional hands-on capabilities with data profiling, data transformation, and data mastering
- Success in mentoring and training team members

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions
- Experience with human data, ideally human healthcare data
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred or mandatory for the role):
- ITIL Foundation or other relevant certifications (preferred)
- SAFe Agile Practitioner (6.0)
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Databricks Certified Professional or similar certification

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences
- Confident technical leader
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources

Posted 1 week ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Let's do this. Let's change the world. In this vital role you will be part of Research's Semantic Graph Team, which is seeking a dedicated and skilled Semantic Data Engineer to build and optimize knowledge graph-based software and data resources. This role primarily focuses on working with technologies such as RDF, SPARQL, and Python; the position also involves semantic data integration and cloud-based data engineering. The ideal candidate should possess experience in the pharmaceutical or biotech industry, demonstrate deep technical skills, be proficient with big data technologies, and have experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential for this role. You will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively.

Roles & Responsibilities:
- Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies
- Develop and maintain semantic data models for biopharma scientific data
- Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks
- Ensure interoperability across federated data sources, linking relational and graph-based data
- Implement and optimize CI/CD pipelines using GitLab and AWS
- Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions
- Collaborate with global multi-functional teams, including research scientists, Data Architects, Business SMEs, Software Engineers, and Data Scientists, to understand data requirements, design solutions, and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Collaborate with data scientists, engineers, and domain experts to improve research data accessibility
- Adhere to standard processes for coding, testing, and designing reusable code/components
- Explore new tools and technologies to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Maintain comprehensive documentation of processes, systems, and solutions
- Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications and Experience:
- Doctorate degree, OR
- Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR
- Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR
- Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field

Preferred Qualifications and Experience:
- 6+ years of experience in designing and supporting biopharma scientific research data analytics (software platforms)

Functional Skills:
Must-Have Skills:
- Advanced semantic and relational data skills: proficiency in Python, RDF, SPARQL, graph databases (e.g. AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g. Databricks), semantic data standards (OWL, W3C, FAIR principles), and ontology development and semantic modeling practices (a small sketch follows this posting)
- Cloud and automation expertise: good experience using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions
- Technical problem-solving: excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets

Good-to-Have Skills:
- Experience in biotech/drug discovery data engineering
- Experience applying knowledge graph, taxonomy and ontology concepts in the life sciences and chemistry domains
- Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune)
- Familiarity with Cypher, GraphQL, or other graph query languages
- Experience with big data tools (e.g. Databricks)
- Experience in biomedical or life sciences research data management

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
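For the RDF/SPARQL/Python skills this role centers on, here is a minimal illustrative sketch (not from the posting) using rdflib; the namespace and triples are invented for illustration:

```python
# Sketch: loading triples and running SPARQL with rdflib.
# The namespace and triples are invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/research#")

g = Graph()
g.bind("ex", EX)
g.add((EX.gene_BRCA1, RDF.type, EX.Gene))
g.add((EX.gene_BRCA1, EX.associatedWith, EX.disease_BreastCancer))
g.add((EX.gene_BRCA1, EX.symbol, Literal("BRCA1")))

# Find every gene and the disease it is associated with.
results = g.query("""
    PREFIX ex: <http://example.org/research#>
    SELECT ?symbol ?disease WHERE {
        ?gene a ex:Gene ;
              ex:symbol ?symbol ;
              ex:associatedWith ?disease .
    }
""")
for symbol, disease in results:
    print(symbol, disease)
```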

Posted 1 week ago

Apply

8.0 - 13.0 years

17 - 18 Lacs

Hyderabad

Work from Office

Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. You will design and implement information system architectures to support business needs: analyzing requirements, developing architectural designs, evaluating technology solutions, and ensuring alignment with industry best practices and standards. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.

Roles & Responsibilities:
- Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives for Corporate Functions data architecture
- Collaborate closely with business clients and key collaborators to align solutions with strategic objectives
- Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities for Corporate Functions data architecture
- Establish and enforce architectural standards, policies, and governance frameworks
- Evaluate emerging technologies and assess their potential impact on the solution architecture
- Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency
- Work with partners to gather and analyze requirements, ensuring that solutions meet both business and technical needs
- Ensure seamless integration between systems and platforms, both within the organization and with external partners
- Design systems that can scale to meet growing business needs and performance demands
- Deliver high-quality Salesforce solutions using LWC, Apex, Flows and other Salesforce technologies
- Ensure alignment to established standard methodologies and definitions of done, maintaining high-quality standards in work
- Create architectural designs and data models per business requirements and Salesforce standard methodologies
- Proactively identify technical debt and collaborate with the Principal Architect and Product Owner to prioritize and address it effectively
- Negotiate solutions to complex problems with both the product teams and third-party service providers
- Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree and 8 to 13 years of experience in Computer Science, IT or a related field

Preferred Qualifications:
- Strong architectural design and modeling skills
- Proficiency in Salesforce Health Cloud / Service Cloud implementation for a call center solution
- Solid hands-on experience implementing Salesforce configurations, Apex, LWC and integrations
- Solid understanding of declarative tools like Flows and Process Builder
- Proficiency in using Salesforce tools such as Data Loader and Salesforce Inspector to query, manipulate and export data
- Experience in developing differentiated and deliverable solutions
- Ability to analyze client requirements and translate them into solutions
- Ability to train and guide junior developers in standard methodologies
- Familiarity with Agile practices such as user story creation and sprint planning
- Experience creating proofs of concept (PoCs) to validate new ideas or backlog items

Professional Certifications:
- Salesforce Admin
- Salesforce Advanced Administrator
- Salesforce Platform Developer 1 (mandatory)
- Salesforce Platform Developer 2
- Platform Builder
- Salesforce Application Architect
- Salesforce Health Cloud Accredited Professional (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Posted 1 week ago

Apply

1.0 - 3.0 years

14 - 16 Lacs

Hyderabad

Work from Office

Let s do this. Let s change the world. In this vital role you will be responsible for designing, building, maintaining , analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives and, visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Take ownership of data pipeline projects from inception to deployment, manage scope, timelines, and risks Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast paced business needs across geographic regions Identify and resolve complex data-related challenges Adhere to best practices for coding, testing, and designing reusable code/component Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation Basic Qualifications : Master s degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor s degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience Must have Skills : Hands on experience with big data technologies and platforms, such as Databricks, Apache Spark ( PySpark , SparkSQL ), workflow orchestration, performance tuning on big data processing Proficiency in data analysis tools ( eg. SQL) Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores Experience with ETL tools such as Apache Spark, and various Python packages related to data processing, machine learning model development Strong understanding of data modeling, data warehousing, and data integration concepts Proven ability to optimize query performance on big data platforms Preferred Qualifications: Experience with Software engineering best-practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Knowledge of Python/R, Databricks, cloud data platforms Strong understanding of data governance frameworks, tools, and best practices. 
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA) Professional Certifications: AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
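A minimal PySpark sketch of the ETL-with-data-quality step this listing describes. The S3 paths, the event_id business key, and the 1% null threshold are illustrative assumptions, not the employer's pipeline:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_dq_example").getOrCreate()

# Ingest raw records from a hypothetical landing zone.
raw = spark.read.parquet("s3://example-bucket/raw/events/")

# Normalize the timestamp and derive a partition column.
curated = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Simple data-quality gate: fail the run if too many rows lack the key.
total = curated.count()
missing = curated.filter(F.col("event_id").isNull()).count()
if total > 0 and missing / total > 0.01:  # 1% threshold is an assumption
    raise ValueError(f"DQ check failed: {missing}/{total} rows missing event_id")

# Write curated output, partitioned for downstream query performance.
curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```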

Posted 1 week ago

Apply

10.0 - 12.0 years

14 - 18 Lacs

Hyderabad

Work from Office

Career Category Information Systems Job Description Amgen's Precision Medicine technology team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with visibility to Amgen's wealth of human datasets, projects and study histories, and knowledge over various scientific findings. These data include multiomics data (genomics, transcriptomics, proteomics, etc.), clinical study subject measurement and outcome data, images, and specimen inventory data. Our PMED data management, standardization, surfacing, and processing capabilities are pivotal tools in Amgen's goal to accelerate the speed of discovery, and speed to market of advanced precision medications. The Solution and Data Architect will be responsible for the end-to-end architecture of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and the advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions. You will collaborate closely with stakeholders across departments, including data engineering, business intelligence, and IT teams, to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering. Roles & Responsibilities: Architect scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools. Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation Support development planning by breaking down features into work that aligns with the architectural direction runway Participate hands-on in pilots and proofs-of-concept for new patterns Create robust documentation of architectural direction, patterns, and standards Present and train engineers and cross-team collaborators on architecture strategy and patterns Collaborate with data engineers to build and optimize ETL pipelines, ensuring efficient data ingestion and processing from multiple sources. Design robust data models and processing layers that support both analytical processing and operational reporting needs. Develop and implement best practices for data governance, security, and compliance within Databricks and Power BI environments. Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Provide thought leadership and strategic guidance on data architecture, advanced analytics, and data mastering best practices. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Serve as a subject matter expert on Power BI and Databricks, providing technical leadership and mentoring to other teams. Collaborate with stakeholders to define data requirements, architecture specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions.
Basic Qualifications and Experience: Master's degree with 6 to 8 years of experience in data management and data solution architecture OR Bachelor's degree with 8 to 10 years of experience in data management and data solution architecture OR Diploma and 10 to 12 years of experience in data management and data solution architecture Functional Skills: Must-Have Skills: Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization. Minimum of 7 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management (an illustrative CDC sketch follows this listing). Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Deep understanding of Power BI, including model design, DAX, and Power Query. Proven experience designing and implementing data mastering solutions and data governance frameworks. Expertise in cloud platforms (AWS), data lakes, and data warehouses. Strong knowledge of ETL processes, data pipelines, and integration technologies. Strong communication and collaboration skills to work with cross-functional teams and senior leadership. Ability to assess business needs and design solutions that align with organizational goals. Exceptional hands-on capabilities with data profiling, data transformation, and data mastering Success in mentoring and training team members Good-to-Have Skills: Experience in developing differentiated and deliverable solutions Experience with human data, ideally human healthcare data Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management Professional Certifications (please mention if the certification is preferred or mandatory for the role): ITIL Foundation or other relevant certifications (preferred) SAFe Agile Practitioner (6.0) Microsoft Certified: Data Analyst Associate (Power BI) or related certification. Databricks Certified Professional or similar certification. Soft Skills: Excellent analytical and troubleshooting skills Deep intellectual curiosity Highest degree of initiative and self-motivation Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences Confident technical leader Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones Ability to manage multiple priorities successfully Team-oriented, with a focus on achieving team goals Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.
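A minimal sketch of the change-data-capture (CDC) upsert pattern the listing names, using Delta Lake's Python MERGE API on Databricks. The table names, subject_id key, and op change-flag column are illustrative assumptions:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc_merge_example").getOrCreate()

# Hypothetical batch of change records: one row per changed business key,
# with an 'op' column marking inserts (I), updates (U), and deletes (D).
changes = spark.read.format("delta").load("/mnt/landing/subject_changes")

target = DeltaTable.forName(spark, "pmed.subject_measurements")  # assumed name

(
    target.alias("t")
    .merge(changes.alias("c"), "t.subject_id = c.subject_id")
    .whenMatchedDelete(condition="c.op = 'D'")
    .whenMatchedUpdateAll(condition="c.op = 'U'")
    .whenNotMatchedInsertAll(condition="c.op = 'I'")
    .execute()
)
```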

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Gurugram

Work from Office

Overview We are seeking a Senior Engagement Manager with proven experience in leading data projects using Agile methodologies. This role requires expertise in managing Databricks platform implementations and migrations, with a focus on healthcare industry data projects. The ideal candidate will have a modern approach to tracking and managing dependencies across complex data initiatives. About the Role As a Senior Engagement Manager, you will be responsible for ensuring successful delivery of Databricks-based Modern Data Platform implementations. You will lead client engagements from inception to completion, working closely with cross-functional teams to deliver high-value data solutions that meet business objectives and technical requirements. Key Responsibilities Lead client engagements for Databricks platform implementations and migrations from legacy systems Apply Agile methodologies to manage project delivery, ensuring iterative value creation and continuous improvement Develop and maintain project plans, timelines, and resource allocations using modern tracking tools and approaches Manage project dependencies, risks, and issues with sophisticated tracking mechanisms Serve as the primary point of contact between client stakeholders and delivery teams Collaborate with technical teams to ensure solutions meet client requirements and adhere to best practices Conduct regular status meetings and provide transparent reporting on project progress Manage project scope, timeline, and budget to ensure successful delivery within constraints Identify and mitigate project risks proactively Drive client satisfaction throughout the engagement lifecycle Identify opportunities for additional services and project extensions Facilitate knowledge transfer to client teams Document project outcomes, lessons learned, and best practices Qualifications 6+ years of experience managing data projects, with a focus on Agile delivery methodologies Proven track record of successfully implementing data solutions in the healthcare industry Experience with Databricks platform implementations and migrations from legacy systems Strong understanding of modern data architecture, including Lakehouse concepts Expertise in Agile project management frameworks (Scrum, Kanban, SAFe) Experience with modern project tracking tools and dependency management approaches Excellent client relationship management skills Strong communication and presentation abilities at all organizational levels Ability to translate complex technical concepts into business value Experience managing cross-functional teams Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred Professional certifications such as PMP, PMI-ACP, CSM, or equivalent Technical Knowledge Basic understanding of Databricks platform and ecosystem Modern data architecture principles Agile project management methodologies and tools Data migration strategies and approaches Cloud platforms (AWS, Azure, GCP) Data engineering and analytics workflows Project tracking and dependency management tools Risk management frameworks Budget and resource management

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Mumbai, Nagpur, Thane

Work from Office

Product and Data Manager Job Summary The Data Manager will be responsible for designing, implementing, and maintaining our data management systems, policies, and procedures. This includes ensuring data quality, integrity, and security, as well as providing data insights and analytics to support business and program decisions and impact communication. Roles and Responsibilities Data Governance: Develop, implement, and maintain data management policies, procedures, and standards to ensure data quality, integrity, and security. The objective is to streamline and aggregate all org data into one system and platform. Data Architecture: Design and implement data management systems for all verticals and projects, including data collection tools and dashboards for visualisation, communication, and dissemination to the right stakeholders. Data Quality: Develop and implement data quality metrics and monitoring/auditing systems to ensure data accuracy, completeness, and consistency (an illustrative sketch follows this listing). Data Security: Ensure data security and compliance with relevant regulations. Data Analytics: Provide data insights and analytics regularly to support business decisions, including data visualization, reporting, and analysis. Stakeholder Management: Collaborate with stakeholders, including business leaders, data analysts, IT teams, and external consultants to understand data requirements, keep updated on new tools and technologies, and provide data solutions. Data Documentation: Maintain accurate and up-to-date data documentation Team Management: Supervise and mentor team members across verticals to drive a culture of data-driven decision making in the organization. Qualifications Education: Bachelor's degree in Computer Science, Information Technology, or related field. Experience: Minimum 5 years of experience in data management, data analytics, or related field. Skills: Strong knowledge of data management principles, data governance, and data quality. Experience with data management tools. Proficiency in data analysis and visualization tools, such as Tableau, Power BI, or D3.js. Strong understanding of data security and compliance regulations. Excellent communication and project management skills.
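A minimal pandas sketch of the kind of data-quality metrics such a monitoring system might compute: per-column completeness and a duplicate-key rate. The file name and columns are illustrative assumptions:

```python
import pandas as pd

# Hypothetical org dataset aggregated from the verticals.
df = pd.read_csv("program_beneficiaries.csv")

# Completeness: share of non-null values per column.
completeness = (1 - df.isna().mean()).round(3)

# Consistency check: rate of duplicated primary keys.
duplicate_rate = df.duplicated(subset=["beneficiary_id"]).mean()

print(pd.DataFrame({"completeness": completeness}))
print(f"Duplicate beneficiary_id rate: {duplicate_rate:.1%}")
```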

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Pune

Work from Office

Job Description: As a Senior Data Architect, you will be instrumental in shaping the bank's enterprise data landscape, supporting teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions. You will also serve as the go-to expert and trusted advisor on what good looks like in data architecture, helping to set high standards and drive continuous improvement across the organization. This role is ideal for an experienced data professional with deep technical expertise, strong solution architecture skills, and a proven ability to influence design decisions across both business and technology teams. Responsibilities 1. Enterprise Data Architecture & Solution Design Support teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions. Serve as the go-to person for data architecture best practices and standards, helping to define and communicate what good looks like to ensure consistency and quality. Lead and contribute to solution architecture for key programs, ensuring architectural decisions are well-documented, justified, and aligned to enterprise principles. Work with engineering and platform teams to design end-to-end data flows, integration patterns, data processing pipelines, and storage strategies across structured and unstructured data. Drive the application of modern data architecture principles including event-driven architecture, data mesh, streaming, and decoupled data services (an illustrative streaming sketch follows this listing). 2. Data Modelling and Semantics Provide hands-on leadership in data modelling efforts, including the occasional creation and stewardship of conceptual, logical, and physical models that support enterprise data domains. Partner with product and engineering teams to ensure data models are fit-for-purpose, extensible, and aligned with enterprise vocabularies and semantics. Support modelling use cases across regulatory, operational, and analytical data assets. 3. Architecture Standards & Frameworks Define and continuously improve data architecture standards, patterns, and reference architectures that support consistency and interoperability across platforms. Embed standards into engineering workflows and tooling to encourage automation and reduce delivery friction. Measure and report on adoption of architectural principles using architecture KPIs and compliance metrics. 4. Leadership, Collaboration & Strategy Act as a technical advisor and architectural leader across initiatives, mentoring junior architects and supporting federated architecture teams in delivery. Build strong partnerships with senior stakeholders across the business, CDIO, engineering, and infrastructure teams to ensure alignment and adoption of architecture strategy. Stay current with industry trends, regulatory changes, and emerging technologies, advising on their potential impact and application. Skills Extensive experience in data architecture, data engineering, or enterprise architecture, preferably within a global financial institution. Deep understanding of data platforms, integration technologies, and architectural patterns for real-time and batch processing. Proficiency with data architecture tools such as Sparx Enterprise Architect, ERwin, or similar. Experience designing solutions in cloud and hybrid environments (e.g. GCP, AWS, or Azure), with knowledge of associated data services.
Hands-on experience with data modelling, semantic layer design, and metadata-driven architecture approaches. Strong grasp of data governance, privacy, security, and regulatory compliance, especially as they intersect with architectural decision-making. Strategic mindset, with the ability to connect architectural goals to business value and communicate effectively with technical and non-technical stakeholders. Experience working across business domains including Risk, Finance, Treasury, or Front Office functions. Well-being & Benefits Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health. Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness. A professional, passionate, and fun workplace with flexible Work from Home options. A modern office with fun and relaxing areas to boost creativity. Continuous learning culture with coaching and support from team experts. Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive. Private healthcare and life insurance with premium benefits for you and discounts for your loved ones. Socially connected: we strongly believe in collaboration, inclusion and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing. Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours. Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more. Financially secure: we support you to meet personal financial goals during your active career and for the future. Competitive income, performance-based promotions, and a sense of purpose. 24 days holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).
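A minimal Spark Structured Streaming sketch of the event-driven ingestion pattern referenced above (Kafka in, incremental landing out). The broker, topic, and paths are illustrative assumptions, not a reference design:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming_ingest_example").getOrCreate()

# Consume events from a hypothetical Kafka topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "trade-events")
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("event_ts"),
    )
)

# Land the stream incrementally; the checkpoint makes the job restartable.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/raw/trade_events")
    .option("checkpointLocation", "/chk/trade_events")
    .trigger(processingTime="1 minute")
    .start()
)
# query.awaitTermination()  # block here in a real job
```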

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Pune

Work from Office

Job Description: This role is for a motivated and curious Data Architect / Data Engineer to join the Group Architecture team. This is a hands-on role focused on the development of tools, prototypes, and reference solutions that support enterprise data architecture standards. The successful candidate will work with senior architects and engineers to enable the adoption of best practices across data platforms, pipelines, and domains, helping to ensure scalable, secure, and consistent data delivery across the organization. Group Architecture is responsible for setting the strategic direction for technology architecture across the enterprise. The team partners with all business divisions to define architecture principles and standards, evaluate emerging technologies, and guide implementation through hands-on support, tooling, and governance. Responsibilities Design and develop lightweight tools, scripts, and utilities that support the implementation and adoption of data architecture standards (e.g., metadata enrichment, model validation, lineage capture, standard compliance checks; an illustrative sketch follows this listing). Contribute to the development of reference implementations and prototypes demonstrating approved data architecture patterns. Support the creation and enhancement of data pipelines, APIs, and other data integration components across various platforms. Assist in the evaluation and testing of new tools, frameworks, or services for potential use in the data architecture landscape. Collaborate with senior architects, engineers, and business stakeholders to gather requirements and deliver technical solutions that meet enterprise standards. Prepare and maintain documentation, dashboards, and visual materials to communicate technical concepts and track adoption of architecture standards. Participate in architecture review forums and support data governance processes as needed. Skills Foundational experience in data engineering or software development, with the ability to write clean, maintainable code in Python, SQL, or other languages. Exposure to cloud platforms (such as GCP, AWS, or Azure) and experience with relevant data services and APIs. Interest in or experience developing internal tools or automation scripts to improve engineering workflows. Familiarity with concepts such as data lineage, metadata, data quality, or governance is a plus. Understanding of basic architecture principles and willingness to apply them in practical solution design. Ability to work collaboratively in a cross-functional team, take initiative, and communicate effectively with technical and non-technical stakeholders. Exposure to business intelligence tools like Looker, Tableau, or similar. Understanding of data modeling, even at a high level, is beneficial but not a core focus. Experience with Git, CI/CD, or cloud-native development practices. Well-being & Benefits Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health. Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness. A professional, passionate, and fun workplace with flexible Work from Home options. A modern office with fun and relaxing areas to boost creativity. Continuous learning culture with coaching and support from team experts.
Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive. Private healthcare and life insurance with premium benefits for you and discounts for your loved ones. Socially connected: we strongly believe in collaboration, inclusion and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing. Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours. Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more. Financially secure: we support you to meet personal financial goals during your active career and for the future. Competitive income, performance-based promotions, and a sense of purpose. 24 days holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).
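A minimal sketch of the kind of lightweight standards-compliance utility the listing describes: scan catalog metadata and flag columns that break naming or documentation rules. The metadata shape and the rules themselves are illustrative assumptions:

```python
import re

# Hypothetical metadata extract, e.g. exported from a data catalog.
tables = [
    {"name": "customer_accounts",
     "columns": [{"name": "AccountID", "comment": ""},
                 {"name": "opened_on", "comment": "Account opening date"}]},
]

SNAKE_CASE = re.compile(r"^[a-z][a-z0-9_]*$")

def check_table(table: dict) -> list[str]:
    """Return human-readable standards violations for one table."""
    findings = []
    for col in table["columns"]:
        if not SNAKE_CASE.match(col["name"]):
            findings.append(f"{table['name']}.{col['name']}: not snake_case")
        if not col["comment"]:
            findings.append(f"{table['name']}.{col['name']}: missing description")
    return findings

for t in tables:
    for finding in check_table(t):
        print(finding)
```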

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office

Position Summary: We are seeking a highly motivated and experienced Business Analyst (BA) to act as a critical liaison between our Clients and the Rackspace technical delivery team. The BA will be responsible for eliciting, analyzing, validating, and documenting business requirements related to data ingestion, processing, storage, reporting, and analytics. This role requires a strong understanding of business analysis principles, data concepts, and the ability to quickly grasp the nuances of airline operations (both passenger and cargo) and their supporting systems. Key Responsibilities: Requirement Elicitation & Analysis: Collaborate closely with client stakeholders across various departments to understand their business processes, pain points, and data needs. Conduct workshops, interviews, and document analysis to elicit detailed functional and non-functional requirements for the data platform. Analyze data originating from diverse source systems Translate business needs into clear, concise, and actionable requirements documentation (e.g., user stories, use cases, business process models, data mapping specifications). Data Focus: Analyze source system data structures and data relationships relevant to business requirements. Define business rules for data transformation, data quality, and data validation. Develop detailed source-to-target data mapping specifications in collaboration with data architects and engineers (an illustrative sketch follows this listing). Define requirements for reporting, dashboards, and analytical use cases, identifying key metrics and KPIs. Contribute to the definition of data governance policies and procedures from a business perspective Stakeholder Management & Communication: Serve as the primary bridge between the airline client's business users and the Rackspace technical team (Data Engineers, Data Architects). Clearly articulate business requirements and context to the technical team and translate technical considerations back to the business stakeholders. Facilitate effective communication and collaboration sessions. Documentation & Support: Create and maintain comprehensive requirements documentation throughout the project. Develop process flow diagrams (As-Is and To-Be) to visualize data flows. Assist in the creation of test cases and scenarios. Support User Acceptance Testing (UAT) by clarifying requirements and validating results against business needs. Support project management activities, including scope management and change request analysis. Required Qualifications Bachelor's degree in Business Administration, Information Systems, Computer Science, or a related field. 5+ years of experience as a Business Analyst, with a proven track record on data-centric projects (e.g., Data Warehousing, Business Intelligence, Data Analytics, Data Migration, Data Platform implementation). Strong analytical and problem-solving skills with the ability to understand complex business processes and data landscapes. Excellent requirements elicitation techniques (interviews, workshops, surveys, document analysis). Proficiency in creating standard BA artifacts (BRDs, User Stories, Use Cases, Process Flows, Data Mapping). Exceptional communication (written and verbal), presentation, and interpersonal skills. Experience working directly with business stakeholders at various levels. Ability to manage ambiguity and work effectively in a fast-paced, client-facing environment. Understanding of data modelling principles.
Preferred Qualifications Experience working within the healthcare industry (knowledge of clinical workflows, EHR/EMR systems, medical billing, patient data privacy, care coordination, or public health analytics is a significant plus). Specific experience analyzing data from or integrating with systems like Epic, Cerner, Meditech, Allscripts, or other healthcare-specific platforms. Proficiency in SQL for data analysis and querying. Familiarity with Agile/Scrum methodologies. Experience with BI and data visualization tools (e.g., Tableau, Power BI, Qlik). CBAP or similar Business Analysis certification.
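A minimal sketch of how a source-to-target mapping specification might be captured in machine-checkable form rather than a spreadsheet; the columns and transformation rules are illustrative assumptions:

```python
# (source column, target column, transformation rule)
mapping_spec = [
    ("pax_name", "passenger_name", "trim + title case"),
    ("dep_dt",   "departure_date", "parse DD/MM/YYYY to ISO date"),
    ("cargo_kg", "cargo_weight",   "cast to decimal(10,2)"),
]

def render_mapping_doc(spec: list[tuple[str, str, str]]) -> str:
    """Render the spec as the kind of artifact a BA would maintain and review."""
    lines = [f"{src:<10} -> {tgt:<16} | {rule}" for src, tgt, rule in spec]
    return "\n".join(lines)

print(render_mapping_doc(mapping_spec))
```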

Posted 1 week ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Gurugram

Work from Office

Overview We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs. About the Role As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards. Key Responsibilities Design and implement logical and physical data models for Databricks Lakehouse implementations Translate business requirements into efficient, scalable data models Create and maintain data dictionaries, entity relationship diagrams, and model documentation Develop dimensional models, data vault models, and other modeling approaches as appropriate (an illustrative star-schema sketch follows this listing) Support the migration of data models from legacy systems to Databricks platform Collaborate with data architects to ensure alignment with overall data architecture Work with data engineers to implement and optimize data models Ensure data models comply with healthcare industry regulations and standards Implement data modeling best practices and standards Provide guidance on data modeling approaches and techniques Participate in data governance initiatives and data quality assessments Stay current with evolving data modeling techniques and industry trends Qualifications Extensive experience in data modeling for analytics and reporting systems Strong knowledge of dimensional modeling, data vault, and other modeling methodologies Experience with Databricks platform and Delta Lake architecture Expertise in healthcare data modeling and industry standards Experience migrating data models from legacy systems to modern platforms Strong SQL skills and experience with data definition languages Understanding of data governance principles and practices Experience with data modeling tools and technologies Knowledge of performance optimization techniques for data models Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred Professional certifications in data modeling or related areas Technical Skills Data modeling methodologies (dimensional, data vault, etc.) Databricks platform and Delta Lake SQL and data definition languages Data modeling tools (erwin, ER/Studio, etc.) Data warehousing concepts and principles ETL/ELT processes and data integration Performance tuning for data models Metadata management and data cataloging Cloud platforms (AWS, Azure, GCP) Big data technologies and distributed computing Healthcare Industry Knowledge Healthcare data structures and relationships Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.) Healthcare data standards (HL7, FHIR, etc.) Healthcare analytics use cases and requirements Optionally, healthcare regulatory requirements (HIPAA, HITECH, etc.)
Clinical and operational data modeling challenges Population health and value-based care data needs Personal Attributes Strong analytical and problem-solving skills Excellent attention to detail and data quality focus Ability to translate complex business requirements into technical solutions Effective communication skills with both technical and non-technical stakeholders Collaborative approach to working with cross-functional teams Self-motivated with ability to work independently Continuous learner who stays current with industry trends What We Offer Opportunity to design data models for cutting-edge healthcare analytics Collaborative and innovative work environment Competitive compensation package Professional development opportunities Work with leading technologies in the data space This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
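A minimal star-schema sketch on Delta Lake of the dimensional modeling work this listing describes: one fact table keyed to a conformed dimension. The schema, names, and columns are illustrative healthcare-flavored assumptions:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_example").getOrCreate()

# Conformed patient dimension (assumed columns).
spark.sql("""
CREATE TABLE IF NOT EXISTS dim_patient (
    patient_key BIGINT,
    patient_id  STRING,   -- natural key from the source system
    birth_year  INT,
    gender      STRING
) USING DELTA
""")

# Encounter fact table, partitioned for query performance (assumed grain).
spark.sql("""
CREATE TABLE IF NOT EXISTS fact_encounter (
    encounter_key  BIGINT,
    patient_key    BIGINT,        -- FK to dim_patient
    encounter_date DATE,
    icd10_code     STRING,        -- diagnosis coding (ICD-10)
    charge_amount  DECIMAL(12,2)
) USING DELTA
PARTITIONED BY (encounter_date)
""")
```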

Posted 1 week ago

Apply

10.0 - 17.0 years

20 - 35 Lacs

Gurugram

Hybrid

Who We Are: As the world's leading sustainability consulting firm, ERM is uniquely positioned to contribute to the environment and society through the expertise and energy of our employees worldwide. Sustainability is what we do, and is at the heart of both our service offerings and how we operate our business. For our people, our vision means attracting, inspiring, developing and rewarding our people to work with the best clients and on the biggest challenges, thus creating valuable careers. We achieve our vision in a sustainable manner by maintaining and living our ERM values that include Accountability, Caring for our People, Client Focus, Collaboration, Empowerment, and Transparency. ERM does not accept recruiting agency resumes. Please do not forward resumes to our jobs alias, ERM employees or any other company location. ERM is not responsible for any fees related to unsolicited resumes. ERM is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, age, marital status or disability status. Job Description An exciting opportunity has emerged for a seasoned Data Architect to become a vital member of our ERM Technology team. You will report to the Lead Enterprise Architect and join a dynamic team focused on delivering corporate and technology strategic initiatives. The role demands high-level analytical, problem-solving, and communication skills, along with a strong commitment to customer service. As the Data Architect for ERM, you will work closely with both business and technology stakeholders, utilizing your expertise in business intelligence, analytics, data engineering, data management, and data integration to significantly advance our data strategy and ecosystem. Key responsibilities include: Empowered to define the data and information management architecture for ERM. Collaborate with product owners, engineers, data scientists, and business stakeholders to understand data needs across the full product lifecycle. Ensure a shared understanding of our data, including its quality, ownership, and lineage throughout its lifecycle, from initial capture via client interaction to final consumption by internal and external processes and stakeholders. Ensure that our data landscape effectively meets corporate and regulatory reporting requirements. Establish clear ownership and governance for comprehensive data domain models, encompassing both data in motion and data at rest. Provide expert guidance on solution architecture, engineering principles, and the implementation of data applications utilizing both existing and cutting-edge technology platforms. Build a robust data community by collaborating with architects and engineers, leveraging this community to implement solutions that enhance client and business outcomes through data. The successful candidate will have: Proven experience as an enterprise data architect. Experience in end-to-end implementation of data-intensive analytics-based projects encompassing data acquisition, ingestion, integration, transformation and consumption. Proven experience in the design, development, and implementation of data engineering technologies. Strong knowledge of data management and governance principles. A strong understanding of Azure and AWS service landscapes, particularly data services. Proven experience with various data modelling techniques. Understanding of big data architectures and emerging trends in technology.
A solid familiarity with Agile methodologies, test-driven development, source control management, and automated testing. Thank you for your interest in ERM.

Posted 1 week ago

Apply

11.0 - 15.0 years

50 - 100 Lacs

Bengaluru

Work from Office

"The Comms Data Engineering team enables high-quality, data-driven decision-making that improves the way Uber supports Communications, the cost incurred and optimizations.. We are responsible for building and maintaining the foundational data infrastructure around Commumications platform. Our systems serve thousands of support professionals and power real-time dashboards, operational alerts, ML-driven recommendations, and long-term strategy planning. As a Staff Data Engineer, you will be a technical anchor for the team setting the vision for scalable data architecture and ensuring technical excellence across every layer of the stack. Youll have a unique opportunity to influence not just engineering outcomes, but how the company understands and evolves the Communication experience We are looking for engineers who are passionate about solving complex data problems at scale, enjoy working in high-ownership environments, and want to make a real difference in how we support users at Uber. "The Comms Data Engineering team enables high-quality, data-driven decision-making that improves the way Uber supports Communications, the cost incurred and optimizations.. We are responsible for building and maintaining the foundational data infrastructure around Commumications platform. Our systems serve thousands of support professionals and power real-time dashboards, operational alerts, ML-driven recommendations, and long-term strategy planning. As a Staff Data Engineer, you will be a technical anchor for the team setting the vision for scalable data architecture and ensuring technical excellence across every layer of the stack. Youll have a unique opportunity to influence not just engineering outcomes, but how the company understands and evolves the Communication experience We are looking for engineers who are passionate about solving complex data problems at scale, enjoy working in high-ownership environments, and want to make a real difference in how we support users at Uber. *Accommodations may be available based on religious and/or medical conditions, or as required by applicable law.

Posted 1 week ago

Apply

15.0 - 20.0 years

35 - 40 Lacs

Pune

Work from Office

What you'll do: Key Areas of Responsibility: Strategic Planning and Execution: Owns complete responsibility for DHE deliverables from teams in EIIC, including the end-to-end DHE development, test, and delivery approach and strategy. Ensures first-time-right sprint deliverables from the team by adopting the best development practices Oversee the development and delivery of critical platform features as defined by platform roadmaps. Leverage work from platform adopters by bringing their work into the platform through inner-sourcing. The manager plays a key role in forecasting future resource needs and aligning them with profit planning. Contribute to the overall embedded software platforms strategy to maximize business impact. Cross-Functional Collaboration: DHE currently manages 30+ NPI programs, expected to grow to 50 by Q2 2026. This role requires coordination with product teams, NPI programs, and other platform stakeholders to ensure seamless integration and delivery. Analyze delivery plans for schedule risks, and develop and communicate alternate solutions proactively. People Leadership: Build capability in the RTOS, Linux, and QA platform teams. Manages the hiring, upskilling, and competency management of the DHE team with the help of managers under this role Responsible for mentoring and developing talent, conducting performance reviews, and fostering a culture of accountability and innovation Process Ownership: Oversees the DHE engagement operating model, including SAFe release train participation, delivery reporting, and cost/chargeback mechanisms. Continuously improve the development process, quality attainment, automation, DevOps, and AI-enabled development tools used to deliver software, with year-over-year improvements to productivity Qualifications: B.E. / B Tech / M Tech 15+ years of experience Skills: The DHE Manager is expected to bring a blend of technical, strategic, and leadership capabilities: Technical Expertise: Deep understanding of embedded systems, RTOS, Linux platform evolution, DevOps practices, and test automation. The manager must ensure that all contributions meet rigorous quality and compliance standards Leadership and Collaboration: Proven track record of leading cross-functional teams and driving the adoption of best practices across the organization. Strong stakeholder engagement skills to manage expectations and ensure alignment between technical architecture and business objectives. Problem-Solving Skills: Excellent analytical and problem-solving abilities to identify risks within the data architecture and take proactive steps to mitigate potential issues. Communication Skills: Strong verbal and written communication skills to effectively convey complex technical concepts to both technical and non-technical stakeholders. Emotional Intelligence, Ownership & Commitment, Stakeholder Partnership, Network Performance, Customer Centricity, Judgment and Learning Agility

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
