
1781 Data Architecture Jobs - Page 31

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Career Category: Information Systems

Job Description

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Specialist IS Architect

What you will do

Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. You will design and implement information system architectures to support business needs: analyzing requirements, developing architectural designs, evaluating technology solutions, and ensuring alignment with industry best practices and standards. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects are involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.
Roles & Responsibilities:
- Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives for Corporate Functions data architecture
- Collaborate closely with business clients and key collaborators to align solutions with strategic objectives
- Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities for Corporate Functions data architecture
- Establish and enforce architectural standards, policies, and governance frameworks
- Evaluate emerging technologies and assess their potential impact on the solution architecture
- Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency
- Work with partners to gather and analyze requirements, ensuring that solutions meet both business and technical needs
- Ensure seamless integration between systems and platforms, both within the organization and with external partners
- Design systems that can scale to meet growing business needs and performance demands
- Deliver high-quality Salesforce solutions using LWC, Apex, Flows, and other Salesforce technologies
- Ensure alignment to established standard methodologies and definitions of done, maintaining high-quality standards in work
- Create architectural designs and data models per business requirements and Salesforce standard methodologies
- Proactively identify technical debt and collaborate with the Principal Architect and Product Owner to prioritize and address it effectively
- Negotiate solutions to complex problems with both the product teams and third-party service providers
- Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project

What we expect of you

We are all different, yet we all use our unique contributions to serve patients. Doctorate, Master's, or Bachelor's degree and 8 to 13 years of Computer Science, IT, or related field experience.

Preferred Qualifications:
- Strong architectural design and modeling skills
- Proficiency in Salesforce Health Cloud / Service Cloud implementation for a call center
- Solid hands-on experience implementing Salesforce configurations, Apex, LWC, and integrations
- Solid understanding of declarative tools like Flows and Process Builder
- Proficiency in Salesforce tools such as Data Loader and Salesforce Inspector to query, manipulate, and export data
- Experience in developing differentiated and deliverable solutions
- Ability to analyze client requirements and translate them into solutions
- Ability to train and guide junior developers in standard methodologies
- Familiarity with Agile practices such as user story creation and sprint planning
- Experience creating proofs of concept (PoCs) to validate new ideas or backlog items
Professional Certifications:
- Salesforce Admin
- Salesforce Advanced Administrator
- Salesforce Platform Developer 1 (mandatory)
- Salesforce Platform Developer 2
- Platform Builder
- Salesforce Application Architect
- Salesforce Health Cloud Accredited Professional (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
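The Salesforce integration skills this posting asks for lend themselves to a small illustration. The sketch below builds (but does not send) a REST SOQL query URL; it is a hedged example in which the org instance URL is hypothetical, while the `/services/data/<version>/query` endpoint shape follows Salesforce's documented REST API.

```python
from urllib.parse import quote

def soql_query_url(instance_url: str, soql: str, api_version: str = "v59.0") -> str:
    """Build the Salesforce REST endpoint URL for a SOQL query.

    Only constructs the URL; an authenticated HTTP GET (with a Bearer
    token) would be needed to actually run the query.
    """
    return f"{instance_url}/services/data/{api_version}/query?q={quote(soql)}"

# Hypothetical org instance; real code would read this from configuration.
url = soql_query_url(
    "https://example.my.salesforce.com",
    "SELECT Id, Name FROM Account LIMIT 5",
)
print(url)
```

Tools like Data Loader and Salesforce Inspector, also named above, wrap this same query endpoint behind a UI.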

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

About the job

We are seeking a highly skilled and motivated AI/ML Lead with expertise in Generative AI and LLMs to join our team. As a Generative AI and LLM expert, you will play a crucial role in developing and implementing cutting-edge generative models and algorithms to solve complex problems and generate high-quality outputs. You will collaborate with a multidisciplinary team of researchers, engineers, and data scientists to explore innovative applications of generative AI across various domains.

Responsibilities:
- Research and Development: Stay up to date with the latest advancements in generative AI, including LLMs, GPTs, GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and other related techniques. Conduct research to identify and develop novel generative models and algorithms.
- Model Development: Design, develop, and optimize generative models to generate realistic and diverse outputs. Implement and fine-tune state-of-the-art generative AI architectures to achieve desired performance metrics.
- Data Processing and Preparation: Collect, preprocess, and curate large-scale datasets suitable for training generative models. Apply data augmentation techniques and explore strategies to handle complex data types and distributions.
- Training and Evaluation: Train generative models using appropriate deep learning frameworks and libraries. Evaluate model performance using quantitative and qualitative metrics. Iterate and improve models based on feedback and analysis of results.
- Collaboration: Collaborate with cross-functional teams, including researchers, engineers, and data scientists, to understand project requirements, define objectives, and identify opportunities to leverage generative AI techniques. Provide technical guidance and support to team members.
- Innovation and Problem Solving: Identify and tackle challenges related to generative AI, such as mode collapse, training instability, and generating diverse and high-quality outputs.
Propose innovative solutions and approaches to address these challenges.
- Documentation and Communication: Document research findings, methodologies, and model architectures. Prepare technical reports, papers, and presentations to communicate results and insights to both technical and non-technical stakeholders.

Requirements:
- Education: A Master's or Ph.D. degree in Computer Science, Artificial Intelligence, or a related field. A strong background in deep learning, generative models, and computer vision is preferred.
- Experience: Proven experience in designing and implementing generative models using deep learning frameworks (e.g., TensorFlow, PyTorch). Demonstrated expertise in working with GPTs, GANs, VAEs, or other generative AI techniques. Experience with large-scale dataset handling and training deep neural networks is highly desirable.
- Technical Skills: Proficiency in programming languages such as Python, and familiarity with relevant libraries and tools. Strong mathematical and statistical skills, including linear algebra and probability theory. Experience with cloud computing platforms and GPU acceleration is a plus.
- Research and Publication: Track record of research contributions in generative AI, demonstrated through publications in top-tier conferences or journals. Active participation in the AI research community, such as attending conferences or workshops, is highly valued.
- Analytical and Problem-Solving Abilities: Strong analytical thinking and problem-solving skills to tackle complex challenges in generative AI. Ability to think creatively and propose innovative solutions. Attention to detail and the ability to analyze and interpret experimental results.
- Collaboration and Communication: Excellent teamwork and communication skills to effectively collaborate with cross-functional teams. Ability to explain complex technical concepts to both technical and non-technical stakeholders. Strong written and verbal communication skills.
- Adaptability and Learning: Enthusiasm for staying updated with the latest advancements in AI and generative models. Willingness to learn new techniques and adapt to evolving technologies and methodologies.
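Among the techniques the posting names, VAEs rest on the reparameterization trick. The following is a toy, stdlib-only sketch of that idea; a real implementation would use PyTorch or TensorFlow tensors so gradients can flow through mu and log_var.

```python
import math
import random

def reparameterize(mu: float, log_var: float, rng: random.Random) -> float:
    """Sample z = mu + sigma * eps with eps ~ N(0, 1).

    Randomness is isolated in eps, which is what lets a VAE backpropagate
    through mu and log_var (shown here numerically, without autograd).
    """
    sigma = math.exp(0.5 * log_var)
    eps = rng.gauss(0.0, 1.0)
    return mu + sigma * eps

rng = random.Random(0)
samples = [reparameterize(mu=1.0, log_var=0.0, rng=rng) for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(round(mean, 1))  # sample mean lands close to mu
```

The same decomposition of "noise source" from "learned parameters" is also the usual mental model for diagnosing training instability in these models.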

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

We are seeking a skilled and motivated Data Engineer with hands-on experience in Snowflake, Azure Data Factory (ADF), and Fivetran. The ideal candidate will be responsible for building and optimizing data pipelines, ensuring efficient data integration and transformation to support analytics and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines using Fivetran, ADF, and other ETL tools
- Build and manage scalable data models and data warehouses on Snowflake
- Integrate data from various sources into Snowflake using automated workflows
- Implement data transformation and cleansing processes to ensure data quality and integrity
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements
- Monitor pipeline performance, troubleshoot issues, and optimize for efficiency
- Maintain documentation related to data architecture, processes, and workflows
- Ensure data security and compliance with company policies and industry standards

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 3+ years of experience in data engineering or a similar role
- Proficiency with Snowflake, including architecture, SQL scripting, and performance tuning
- Hands-on experience with Azure Data Factory (ADF) for pipeline orchestration and data integration
- Experience with Fivetran or similar ELT/ETL automation tools
- Strong SQL skills and familiarity with data warehousing best practices
- Knowledge of cloud platforms, preferably Microsoft Azure
- Familiarity with version control tools (e.g., Git) and CI/CD practices
- Excellent communication and problem-solving skills

Preferred Qualifications:
- Experience with Python, dbt, or other data transformation tools
- Understanding of data governance, data quality, and compliance frameworks
- Knowledge of additional data tools (e.g., Power BI, Databricks, Kafka) is a plus
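The "transformation and cleansing" responsibility above can be sketched in miniature. Field names and rules below are invented for illustration; in practice this logic would run inside an ADF or Snowflake pipeline step, typically downstream of a Fivetran load.

```python
def cleanse(rows):
    """Drop rows missing required fields, then dedupe on the primary key,
    keeping the latest record per id (a common ELT cleansing step)."""
    required = ("id", "email")
    valid = [r for r in rows if all(r.get(k) for k in required)]
    latest = {}
    for r in sorted(valid, key=lambda r: r["updated_at"]):
        latest[r["id"]] = r  # later timestamps overwrite earlier ones
    return list(latest.values())

raw = [
    {"id": 1, "email": "a@x.com", "updated_at": "2024-01-01"},
    {"id": 1, "email": "a@y.com", "updated_at": "2024-02-01"},
    {"id": 2, "email": None, "updated_at": "2024-01-15"},
]
clean = cleanse(raw)
print(len(clean), clean[0]["email"])  # 1 a@y.com
```

Keeping the latest record per key mirrors the merge/upsert pattern commonly applied when raw landed data is deduplicated into warehouse models.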

Posted 1 month ago

Apply

3.0 - 8.0 years

55 - 60 Lacs

Bengaluru

Work from Office

Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself.

Job Summary:

At Empower, a Sr. Architect is a mix of a leadership position and a thought-leadership role. A Sr. Architect works with enterprise architects, and both business and IT teams, to align solutions to the technology vision they help create. This role supports enterprise architects in the development of technology strategies, reference architectures, solutions, best practices, and guidance across the entire IT development organization, all the while addressing total cost of ownership, stability, performance, and efficiency. The candidate will also work with the Empower Innovation Lab team as it experiments with emerging technologies, such as Generative AI and advanced analytics. In this fast-paced environment, the person must possess a "can-do" attitude while demonstrating a strong work ethic, and should have a strong aptitude for helping drive decisions. He or she will be actively involved in influencing the strategic direction of technology at Empower Retirement. There will be collaboration across all teams, including IT Infrastructure, the PMO office, the business, and third-party integrators, in reviewing, evaluating, designing, and implementing solutions. The Architect must understand available technology options and educate and influence technology teams to leverage them where appropriate.
The Architect will recognize and propose alternatives, make recommendations, and describe any necessary trade-offs. In some cases, particularly on key initiatives, the Architect will participate in the design and implementation of end-to-end solutions directly with development teams. The ideal candidate will leverage their technical leadership and direction-setting skills with the development organization to prove technical concepts quickly using a variety of tools, methods, and frameworks.

Responsibilities:
- Help the Enterprise Architect, and work with peer Sr. Architects and more junior resources, to define and execute on the business-aligned IT strategy and vision
- Develop, document, and provide input into the technology roadmap for Empower
- Create reference architectures that demonstrate an understanding of technology components and the relationships between them
- Design and modernize complex systems into cloud-compatible or cloud-native applications where applicable
- Create strategies and designs for migrating applications to cloud systems
- Participate in the evaluation of new applications and technical options, challenge the status quo, create solid business cases, and influence direction while establishing partnerships with key constituencies
- Implement best practices, standards, and guidance, then subsequently provide coaching of technology team members
- Make leadership recommendations regarding strategic architectural considerations related to process design and process orchestration
- Provide strong leadership and direction in development/engineering practices
- Collaborate with other business and technology teams on architecture and design issues
- Respond to evolving and changing security conditions; implement and recommend security guidelines
- Provide thought leadership, advocacy, articulation, assurance, and maintenance of the enterprise architecture discipline
- Provide solution guidance and implementation assistance within full-stack development teams
- Recommend long-term, scalable, and performant architecture changes while keeping cost in control

Preferred Qualifications:
- 12+ years of experience in the development and delivery of data systems. This experience should be relevant to roles such as Data Analyst, ETL (Extract, Transform and Load) Developer (Data Engineer), Database Administrator (DBA), Business Intelligence Developer (BI Engineer), Machine Learning Developer (ML Engineer), Data Scientist, Data Architect, Data Governance Analyst, or a managerial position overseeing any of these functions
- 3+ years of experience creating solution architectures and strategies across multiple architecture domains (business, application, data, integration, infrastructure, and security)
- Solid experience with the following technology disciplines: Python, cloud architectures, AWS (Amazon Web Services), big data (300+ TB), advanced analytics, advanced SQL skills, data warehouse systems (Redshift or Snowflake), advanced programming, NoSQL, distributed computing, real-time streaming
- Nice to have: experience in Java, Kubernetes, Argo, Aurora, Google Analytics, META Analytics, integration with 3rd-party APIs, SOA & microservices design, and modern integration methods (API gateway/web services, messaging & RESTful architectures)
- Familiarity with BI tools such as Tableau/QuickSight
- Experience with code coverage tools
- Working knowledge of addressing architectural cross-cutting concerns and their tradeoffs, including topics such as caching, monitoring, operational surround, high availability, security, etc.
- Demonstrates competency applying architecture frameworks and development methods
- Understanding of business process analysis and business process management (BPM)
- Excellent written and verbal communication skills
- Experience mentoring junior team members through code reviews and recommending adherence to best practices
- Experience working with global, distributed teams
- Interacts with people constantly, demonstrating strong people skills
- Able to motivate and inspire, influencing and evangelizing a set of ideals within the enterprise
- Requires a high degree of independence, proactively achieving objectives without direct supervision
- Negotiates effectively at the decision-making table to accomplish goals
- Evaluates and solves complex and unique problems with strong problem-solving skills
- Thinks broadly, avoiding tunnel vision and considering problems from multiple angles
- Possesses a general understanding of the wealth management industry, comprehending how technology impacts the business
- Stays on top of the latest technologies and trends through continuous learning, including reading, training, and networking with industry colleagues

Key skill areas:
- Data Architecture: Proficiency in platform design and data architecture, ensuring scalable, efficient, and secure data systems that support business objectives
- Data Modeling: Expertise in designing data models that accurately represent business processes and facilitate efficient data retrieval and analysis
- Cost Management: Ability to manage costs associated with data storage and processing, optimizing resource usage and ensuring budget adherence
- Disaster Recovery Planning: Planning for data disaster recovery to ensure business continuity and data integrity in case of unexpected events
- SQL Optimization/Performance Improvements: Advanced skills in optimizing SQL queries for performance, reducing query execution time, and improving overall system efficiency
- CI/CD: Knowledge of continuous integration and continuous deployment processes, ensuring rapid and reliable delivery of data solutions
- Data Encryption: Implementing data encryption techniques to protect sensitive information and ensure data privacy and security
- Data Obfuscation/Masking: Techniques for data obfuscation and masking to protect sensitive data while maintaining its usability for testing and analysis
- Reporting: Experience with static and dynamic reporting to provide comprehensive and up-to-date information to business users
- Dashboards and Visualizations: Creating dashboards and visualizations to present data in an intuitive and accessible manner, facilitating data-driven insights
- Generative AI / Machine Learning: Understanding of generative artificial intelligence and machine learning to develop advanced predictive models and automate decision-making processes; understanding of machine learning algorithms, deep learning frameworks, and AI model architectures; understanding of ethical AI principles and practices; experience implementing AI transparency and explainability techniques; knowledge of popular RAG frameworks and tools (e.g., LangChain, LlamaIndex); familiarity with fairness metrics and techniques to mitigate bias in AI models

Sample technologies:
- Cloud Platforms: AWS (preferred), Azure, or Google Cloud
- Databases: Oracle, Postgres, MySQL (preferred), RDS, DynamoDB (preferred), Snowflake or Redshift (preferred)
- Data Engineering (ETL, ELT): Informatica, Talend, Glue, Python (must), Jupyter
- Streaming: Kafka or Kinesis
- CI/CD Pipeline: Jenkins, GitHub, GitLab, or ArgoCD
- Business Intelligence: QuickSight (preferred), Tableau (preferred), Business Objects, MicroStrategy, Qlik, Power BI, Looker
- Advanced Analytics: AWS SageMaker (preferred), TensorFlow, PyTorch, R, scikit-learn
- Monitoring tools: DataDog (preferred), AppDynamics, or Splunk
- Big data technologies: Apache Spark (must), EMR (preferred)
- Container Management technologies: Kubernetes, EKS (preferred), Docker, Helm

Preferred Certifications:
- AWS Solution Architect
- AWS Data Engineer
- AWS Machine Learning Engineer
- AWS Machine Learning

Education: Bachelor's and/or master's degree in computer science or a related field (information systems, mathematics, software engineering).

We are an equal opportunity employer with a commitment to diversity.
All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to age, race, color, national origin, ancestry, sex, sexual orientation, gender, gender identity, gender expression, marital status, pregnancy, religion, physical or mental disability, military or veteran status, genetic information, or any other status protected by applicable state or local law.
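The data obfuscation/masking skill listed in this role can be illustrated with a minimal sketch. The masking rule below is an invented example for demonstration, not any employer's actual policy.

```python
def mask_email(email: str) -> str:
    """Mask the local part of an email, keeping the first and last
    characters so masked data stays recognizable for testing
    without exposing the underlying PII."""
    local, _, domain = email.partition("@")
    if len(local) <= 2:
        masked = "*" * len(local)
    else:
        masked = local[0] + "*" * (len(local) - 2) + local[-1]
    return f"{masked}@{domain}"

print(mask_email("jane.doe@example.com"))  # j******e@example.com
```

Unlike encryption, masking like this is deliberately irreversible: the goal is usable test data, not recoverable data, which is why the two appear as separate skills above.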

Posted 1 month ago

Apply

10.0 - 20.0 years

30 - 45 Lacs

Hyderabad, Jaipur

Hybrid

Data Architect - AI & Azure - Lead & Coach Teams

Job Description:
Job Title: Data Architect - AI & Azure - Lead & Coach Teams
Shift Timings: 12 PM - 9 PM
Location: Jaipur, hybrid
Experience required: 10 to 20 years

Experience:
- Total Experience: 5+ years in data architecture and implementation
- Pre-Sales Experience: Minimum 1 year in a client-facing pre-sales or technical solutioning role is mandatory

Must-Have Skills & Qualifications:
- Technical Expertise: In-depth knowledge of the Microsoft Azure data platform (Azure Synapse Analytics, Azure Data Factory, Azure SQL, Azure Data Lake Storage)
- Modern Data Platforms: Hands-on experience with Databricks and/or Snowflake
- AI Acumen: Strong understanding of AI workflows and data requirements; must have a solid grasp of Gen AI applications and concepts
- Leadership: Experience in mentoring, coaching, or leading technical teams or project initiation phases
- Solutioning: Proven ability to create high-quality technical proposals, respond to RFPs, and design end-to-end data solutions
- Communication: Exceptional English communication and presentation skills are essential for this client-facing role

If interested, please share your resume at shivam.gaurav@programmers.io

Posted 1 month ago

Apply

2.0 - 5.0 years

12 - 16 Lacs

Pune

Work from Office

Overview

We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams.

Responsibilities
- Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks
- Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization
- Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues
- Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability
- Ensure high standards of data quality and reliability via robust validation and cleansing processes
- Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management
- Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly
- Mentor junior team members and guide best practices across the team

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- 6+ years of experience in data engineering or related roles
- Advanced proficiency in Python, PySpark, and SQL
- Strong experience with Databricks, BigQuery, and modern data lakehouse design
- Hands-on knowledge of Azure or GCP data services
- Proven experience in performance tuning and large-scale data processing
- Strong communication skills and the ability to work independently in uncertain or evolving contexts

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing
- Flexible working arrangements, advanced technology, and collaborative workspaces
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios.
We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
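The "partitioning ... and Spark optimization" responsibility in this posting comes down to distributing rows by a stable hash of a key. Below is a toy, stdlib-only sketch of the idea; in PySpark itself this corresponds to `df.repartition(n, "rating_id")`, and the column name here is invented for illustration.

```python
import zlib

def partition_rows(rows, key, n_partitions):
    """Assign rows to partitions by a stable hash of the key, the same
    idea Spark uses when repartitioning so that equal keys land in the
    same partition (avoiding shuffles on later joins/aggregations)."""
    parts = [[] for _ in range(n_partitions)]
    for row in rows:
        bucket = zlib.crc32(str(row[key]).encode()) % n_partitions
        parts[bucket].append(row)
    return parts

rows = [{"rating_id": i, "score": i % 5} for i in range(100)]
parts = partition_rows(rows, key="rating_id", n_partitions=4)
print([len(p) for p in parts], sum(len(p) for p in parts))
```

Because the hash is deterministic, repeated runs place each key in the same bucket, which is exactly the property that makes partitioned joins cheap.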

Posted 1 month ago

Apply

6.0 - 11.0 years

17 - 20 Lacs

Mumbai

Work from Office

We are looking for a highly skilled and experienced professional with 6 to 11 years of experience to join our team as a Manager - Business Transformation in Mumbai.

Roles and Responsibility:
- Develop and implement automation processes to enhance efficiency and productivity
- Create MIS dashboards using Tableau for business insights and decision-making
- Design analytical models, including scorecards, based on business requirements
- Conduct deviation analytics to identify areas for improvement
- Collaborate with cross-functional teams to drive business transformation initiatives
- Analyze data to provide actionable recommendations to stakeholders

Job Requirements:
- Graduate with a strong understanding of MIS and visualization tools
- Proven experience in process automation and data analysis
- Strong knowledge of Tableau and other data visualization tools
- Excellent analytical and problem-solving skills
- Ability to work collaboratively with cross-functional teams
- Strong communication and interpersonal skills

Posted 1 month ago

Apply

10.0 - 15.0 years

25 - 40 Lacs

Noida, Pune, Delhi / NCR

Hybrid

Role & responsibilities

As an Architect, you will work to solve some of the most complex and captivating data management problems that enable clients to become data-driven organizations, seamlessly switching between the roles of individual contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage clients and understand the business requirements to translate them into data models
- Analyze customer problems, propose solutions from a data structural perspective, and estimate and deliver proposed solutions
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights
- Use a data modelling tool to create appropriate data models
- Create and maintain the source-to-target data mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries
- Ideate, design, and guide the teams in building automations and accelerators
- Maintain data models, as well as capture data models from existing databases and record descriptive information
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis
- Use version control to maintain versions of data models
- Collaborate with data engineers to design and develop data extraction and integration code modules
- Partner with the data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing
- Ideate to design and develop the next-gen data platform by collaborating with cross-functional stakeholders
Work with the client to define, establish and implement the right modelling approach as per the requirement Help define the standards and best practices Involve in monitoring the project progress to keep the leadership teams informed on the milestones, impediments, etc. Coach team members, and review code artifacts. Contribute to proposals and RFPs Preferred candidate profile 10+ years of experience in Data space. Decent SQL knowledge Able to suggest modeling approaches for a given problem. Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server) Real-time experience working in OLAP & OLTP database models (Dimensional models). Comprehensive understanding of Star schema, Snowflake schema, and Data Vault Modelling. Also, on any ETL tool, Data Governance, and Data quality. Eye to analyze data & comfortable with following agile methodology. Adept understanding of any of the cloud services is preferred (Azure, AWS & GCP) Enthuse to coach team members & collaborate with various stakeholders across the organization and take complete ownership of deliverables. Experience in contributing to proposals and RFPs Good experience in stakeholder management Decent communication and experience in leading the team You are important to us, lets stay connected! Every individual comes with a different set of skills and qualities so even if you don’t tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry . Perks and benefits
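As a rough illustration of the star-schema dimensional modeling this profile calls for, here is a minimal fact-and-dimension layout using Python's standard sqlite3 module. Table and column names are invented for the example, not taken from any real engagement.

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimensions.
# All names here are illustrative assumptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units       INTEGER,
    revenue     REAL
);
""")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240101, 3, 29.97)")

# A typical analytical query: slice the fact by dimension attributes.
row = cur.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    JOIN dim_date d USING (date_key)
    GROUP BY p.category, d.year
""").fetchone()
print(row)
```

The design choice star schemas encode, i.e. denormalized dimensions around a narrow fact table, is what keeps such group-by queries simple and fast at warehouse scale.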

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:
As an Architect, you will solve some of the most complex and captivating data management problems, enabling clients to operate as data-driven organizations. You will seamlessly switch between the roles of individual contributor, team member, and data modeling architect as each project demands, to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage clients, understand the business requirements, and translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
- Create and maintain Logical Data Models (LDM) and Physical Data Models (PDM), applying best practices to provide business insights.
- Use a data modeling tool to create appropriate data models.
- Create and maintain Source-to-Target data mapping documents covering all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide teams in building automations and accelerators.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Contribute to building data warehouses and data marts (on the cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Collaborate with cross-functional stakeholders to design and develop the next-gen data platform.
- Work with the client to define, establish, and implement the right modeling approach for each requirement.
- Help define standards and best practices.
- Monitor project progress and keep leadership informed of milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

Preferred candidate profile:
- 10+ years of experience in the data space.
- Solid SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modeling, plus exposure to ETL tooling, data governance, and data quality.
- An eye for analyzing data and comfort working with agile methodology.
- Working understanding of at least one cloud platform (Azure, AWS, or GCP) preferred.
- Enthusiasm for coaching team members, collaborating with stakeholders across the organization, and taking complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Clear communication and experience leading a team.

You are important to us, let's stay connected! Every individual brings a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
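One of the dimension-handling patterns a candidate at this level is expected to know is the Slowly Changing Dimension Type 2 merge: when a tracked attribute changes, the current row is closed out and a new current row is appended, preserving history. Below is a minimal pure-Python sketch; the field names (`id`, `city`, `valid_from`, `valid_to`, `is_current`) are illustrative assumptions.

```python
from datetime import date

def scd2_merge(dim_rows, incoming, today):
    """SCD Type 2 sketch: dim_rows is a list of dicts with keys
    id, city, valid_from, valid_to, is_current."""
    out = list(dim_rows)
    for rec in incoming:
        current = next((r for r in out if r["id"] == rec["id"] and r["is_current"]), None)
        if current is None:
            # brand-new member of the dimension
            out.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        elif current["city"] != rec["city"]:
            # attribute changed: close the old row, open a new current one
            current["valid_to"] = today
            current["is_current"] = False
            out.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged records are left as-is
    return out

dim = [{"id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, [{"id": 1, "city": "Mumbai"}], date(2024, 6, 1))
print(len(dim), dim[-1]["city"])  # 2 Mumbai
```

In a real warehouse the same logic is usually expressed as a MERGE statement or a dbt snapshot rather than row-by-row Python, but the bookkeeping is identical.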

Posted 1 month ago

Apply

10.0 - 17.0 years

25 - 40 Lacs

Chennai

Work from Office

Extensive experience in big data architecture, with a focus on cloud-native and/or cloud-based services and solutions; data processing technologies such as Hadoop, Spark, and Kafka in the cloud ecosystem; and the major cloud platforms: AWS, Azure, and GCP.

Posted 1 month ago

Apply

15.0 - 20.0 years

13 - 17 Lacs

Noida

Work from Office

We are looking for a skilled Senior Data Architect with 15 to 20 years of experience to lead our data warehousing function, setting the vision and direction for driving actionable insights across revenue, subscriptions, paid marketing channels, and operational functions. This is a remote role.

Roles and Responsibility:
- Define and execute the long-term strategy for our data warehousing platform using medallion architecture and modern cloud-based solutions.
- Oversee end-to-end pipeline design, implementation, and maintenance for seamless integration with business intelligence tools.
- Champion best practices in data modeling, including the effective use of DBT packages to streamline complex transformations.
- Establish rigorous data quality standards, governance policies, and automated validation frameworks across all data streams.
- Develop frameworks to reconcile revenue discrepancies and unify validation across Finance, SEM, and Analytics teams.
- Implement robust monitoring and alerting systems to quickly identify, diagnose, and resolve data pipeline issues.
- Lead, mentor, and grow a high-performing team of data warehousing specialists, fostering a culture of accountability, innovation, and continuous improvement.
- Partner with RevOps, Analytics, SEM, Finance, and Product teams to align the data infrastructure with business objectives, serving as the primary data warehouse expert in discussions around revenue attribution and paid marketing channel performance.
- Translate complex technical concepts into clear business insights for both technical and non-technical stakeholders.
- Oversee deployment processes, including staging, QA, and rollback strategies, to ensure minimal disruption during updates.
- Regularly assess and optimize data pipelines for performance, scalability, and reliability while reducing operational overhead.
- Lead initiatives to transition from legacy on-premise systems to modern cloud-based architectures for improved agility and cost efficiency.
- Stay abreast of emerging trends and technologies in data warehousing, analytics, and cloud solutions.
- Propose and lead innovative projects to enhance our data capabilities, with a particular focus on predictive and prescriptive analytics.
- Represent the data warehousing function in senior leadership discussions and strategic planning sessions.

Job Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Proven track record in designing and implementing scalable data warehousing solutions in cloud environments.
- Deep experience with medallion architecture and modern data pipeline tools, including DBT (and DBT packages), Databricks, SQL, and cloud-based data platforms.
- Strong understanding of ETL/ELT best practices, data modeling (logical and physical), and large-scale data processing.
- Hands-on experience with BI tools (e.g., Tableau, Looker) and familiarity with Google Analytics and other tracking systems.
- Solid understanding of attribution models (first-touch, last-touch, multi-touch) and experience working with paid marketing channels.
- Excellent leadership and team management skills, with the ability to mentor and inspire cross-functional teams.
- Outstanding communication skills, capable of distilling complex technical information into clear business insights.
- Demonstrated ability to lead strategic initiatives, manage competing priorities, and deliver results in a fast-paced environment.
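The medallion architecture this role centers on layers data as bronze (raw, as-landed), silver (cleaned and typed), and gold (business-level aggregates). A minimal pure-Python sketch of that flow is below; real implementations would use DBT or Databricks, and the field names and rejection rule are illustrative assumptions.

```python
# Medallion layering sketch: bronze = raw, silver = cleaned/typed,
# gold = business aggregate. Field names are assumptions.

bronze = [
    {"order_id": "1", "amount": "100.50", "channel": "sem"},
    {"order_id": "2", "amount": "bad",    "channel": "sem"},     # rejected in silver
    {"order_id": "3", "amount": "49.50",  "channel": "organic"},
]

def to_silver(rows):
    """Cast raw strings to proper types, dropping rows that fail validation."""
    silver = []
    for r in rows:
        try:
            silver.append({"order_id": int(r["order_id"]),
                           "amount": float(r["amount"]),
                           "channel": r["channel"]})
        except ValueError:
            pass  # in production, route to a quarantine/rejects table instead
    return silver

def to_gold(rows):
    """Aggregate cleaned rows into revenue per marketing channel."""
    revenue = {}
    for r in rows:
        revenue[r["channel"]] = revenue.get(r["channel"], 0.0) + r["amount"]
    return revenue

gold = to_gold(to_silver(bronze))
print(gold)  # {'sem': 100.5, 'organic': 49.5}
```

The layering buys auditability (bronze is never mutated) and lets quality rules and business logic evolve independently, which is exactly what the validation-framework responsibilities above require.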

Posted 1 month ago

Apply

10.0 - 17.0 years

35 - 60 Lacs

Noida, Gurugram, Bengaluru

Hybrid

This is an individual contributor role. We are considering candidates from a Product / Life Science / Pharma / Consulting background only.

POSITION: Data Architect.
LOCATION: NCR/Bangalore/Gurugram.
PRODUCT: Axtria DataMAx is a global cloud-based data management product specifically designed for the life sciences industry. It facilitates the rapid integration of both structured and unstructured data sources, enabling accelerated and actionable business insights from trusted data. The product is particularly useful for pharmaceutical companies looking to streamline their data processes and enhance decision-making capabilities.

JOB OBJECTIVE: To leverage expertise in data architecture and management to design, implement, and optimize a robust data warehousing platform for the pharmaceutical industry. The goal is to ensure seamless integration of diverse data sources, maintain high standards of data quality and governance, and enable advanced analytics through the definition and management of semantic and common data layers. Utilizing Axtria DataMAx and generative AI technologies, the aim is to accelerate business insights and support regulatory compliance, ultimately enhancing decision-making and operational efficiency.

Key Responsibilities:
- Data Modeling: Design logical and physical data models to ensure efficient data storage and retrieval.
- ETL Processes: Develop and optimize ETL processes to accurately and efficiently move data from various sources into the data warehouse.
- Infrastructure Design: Plan and implement the technical infrastructure, including hardware, software, and network components.
- Data Governance: Ensure compliance with regulatory standards and implement data governance policies to maintain data quality and security.
- Performance Optimization: Continuously monitor and improve the performance of the data warehouse to handle large volumes of data and complex queries.
- Semantic Layer Definition: Define and manage the semantic layer architecture and technology stack, managing the lifecycle of semantic constructs including their consumption by downstream systems.
- Common Data Layer Management: Integrate data from multiple sources into a centralized repository, ensuring consistency and accessibility.
- Enterprise Architecture: Apply deep expertise in architecting enterprise-grade software systems that are performant, scalable, resilient, and manageable; experience architecting GenAI-based systems is a plus.
- Advanced Analytics: Enable advanced analytics and machine learning to identify patterns in genomic data, optimize clinical trials, and personalize medication.
- Generative AI: Should have worked on a production-ready GenAI-based data use case.
- Stakeholder Engagement: Work closely with business stakeholders to understand their data needs and translate them into technical solutions.
- Cross-Functional Collaboration: Collaborate with IT, data scientists, and business analysts to ensure the data warehouse supports various analytical and operational needs.

Required Skills:
- Data Modeling: Strong expertise in data modeling, with the ability to design complex data models from the ground up and clearly articulate the rationale behind design choices.
- ETL Processes: Must have worked with different loading strategies for facts and dimensions, such as SCD, full load, incremental load, upsert, append-only, rolling window, etc.
- Cloud Warehouse Skills: Expertise in leading cloud data warehouse platforms (Snowflake, Databricks, and Amazon Redshift), with a deep understanding of their architectural nuances, strengths, and limitations, enabling the design and deployment of scalable, high-performance data solutions aligned with business objectives.

Qualifications:
- Proven experience in data architecture and data warehousing, preferably in the pharmaceutical industry.
- Strong knowledge of data modeling, ETL processes, and infrastructure design.
- Experience with data governance and regulatory compliance in the life sciences sector.
- Proficiency in using Axtria DataMAx or similar data management products.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration skills.

Preferred Skills:
- Familiarity with advanced analytics and machine learning techniques.
- Experience in managing semantic and common data layers.
- Knowledge of FDA guidelines, HIPAA regulations, and other relevant regulatory standards.
- Experience with generative AI technologies and their application in data warehousing.
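Among the loading strategies the skills list names, the rolling-window load is perhaps the least self-explanatory: each run, the trailing N days of the target are discarded and reloaded from source, so late-arriving corrections within the window are picked up without a full reload. A pure-Python sketch under assumed field names:

```python
from datetime import date, timedelta

# Rolling-window load sketch: keep target rows older than the window
# untouched; rows inside the window are dropped and reloaded from source.
# Record shapes are illustrative assumptions.

def rolling_window_load(target, source, run_date, window_days=7):
    cutoff = run_date - timedelta(days=window_days)
    kept = [r for r in target if r["event_date"] < cutoff]
    refreshed = [r for r in source if cutoff <= r["event_date"] <= run_date]
    return kept + refreshed

run = date(2024, 6, 30)
target = [
    {"event_date": date(2024, 6, 10), "amount": 100},   # outside window: kept
    {"event_date": date(2024, 6, 27), "amount": 50},    # inside window: replaced
]
source = [
    {"event_date": date(2024, 6, 27), "amount": 55},    # restated figure
    {"event_date": date(2024, 6, 29), "amount": 20},    # new row
]
loaded = rolling_window_load(target, source, run)
print(len(loaded), sum(r["amount"] for r in loaded))  # 3 175
```

The trade-off versus a pure incremental load is re-processing cost inside the window in exchange for automatic handling of restatements, a common pattern for pharma sales data that arrives with corrections.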

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. (Employment Firms/Recruitment Services Firms)
Job Title: SQL + ADF (ref 6566294)
Job Location: Gurgaon
Job Type: Full Time
Experience: 5 to 10 years

Strong experience in SQL development, along with experience in AWS cloud and good experience in ADF.

Job Summary: We are looking for a skilled SQL + Azure Data Factory (ADF) Developer to join our data engineering team. The ideal candidate will have strong experience in writing complex SQL queries, developing ETL pipelines using Azure Data Factory, and integrating data from multiple sources into cloud-based data solutions. This role will support data warehousing, analytics, and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain data integration pipelines using Azure Data Factory (ADF).
- Write optimized and complex SQL queries, stored procedures, and functions for data transformation and reporting.
- Extract data from various structured and unstructured sources and load it into Azure-based data platforms (e.g., Azure SQL Database, Azure Data Lake).
- Schedule and monitor ADF pipelines, ensuring data quality, accuracy, and availability.
- Collaborate with data analysts, data architects, and business stakeholders to gather requirements and deliver solutions.
- Troubleshoot data issues and implement corrective actions to resolve pipeline or data quality problems.
- Implement and maintain data lineage, metadata, and documentation for pipelines.
- Participate in code reviews, performance tuning, and optimization of ETL processes.
- Ensure compliance with data governance, privacy, and security standards.

Requirements:
- Hands-on experience with T-SQL / SQL Server.
- Experience working with Azure Data Factory (ADF) and Azure SQL.
- Strong understanding of ETL processes, data warehousing concepts, and cloud data architecture.
- Experience working with Azure services such as Azure Data Lake, Blob Storage, and Azure Synapse Analytics (preferred).
- Familiarity with Git/DevOps CI/CD pipelines for ADF deployments is a plus.
- Excellent problem-solving, analytical, and communication skills.
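A pattern commonly used behind ADF incremental copy activities is the watermark: store the highest modified timestamp loaded so far, and each run copy only rows newer than it, then advance the watermark. The sketch below simulates this with Python's sqlite3 standing in for the source and staging databases; it is not ADF itself, and all table and column names are assumptions.

```python
import sqlite3

# Watermark-based incremental copy sketch (the logic ADF's incremental
# copy pattern implements). Table/column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (id INTEGER, modified TEXT);
CREATE TABLE stg_orders (id INTEGER, modified TEXT);
CREATE TABLE watermark (last_modified TEXT);
INSERT INTO watermark VALUES ('2024-01-01T00:00:00');
INSERT INTO src_orders VALUES (1, '2023-12-31T09:00:00'), (2, '2024-01-02T10:00:00');
""")

def incremental_copy(cur):
    """Copy source rows newer than the watermark, then advance it."""
    (wm,) = cur.execute("SELECT last_modified FROM watermark").fetchone()
    rows = cur.execute(
        "SELECT id, modified FROM src_orders WHERE modified > ?", (wm,)).fetchall()
    cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
    if rows:
        cur.execute("UPDATE watermark SET last_modified = ?",
                    (max(m for _, m in rows),))
    return len(rows)

copied = incremental_copy(cur)
print(copied)  # 1 -- only the row modified after the watermark
```

Re-running the copy immediately moves nothing, because the watermark has advanced; that idempotence is what makes scheduled, restartable pipelines safe.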

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 15 Lacs

Kochi

Remote

We are seeking a highly skilled ETL/Data Engineer with expertise in Informatica DEI BDM to design and implement robust data pipelines handling medium to large-scale datasets. The role involves building efficient ETL frameworks that support batch processing.

Posted 1 month ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Chandigarh

Work from Office

We are looking for a highly skilled and experienced Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical skills and attention to detail.

Roles and Responsibility:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain complex data analysis systems and reports.
- Provide expert-level support for data analysis and reporting needs.
- Identify trends and patterns in large datasets to inform business decisions.
- Design and implement process improvements to increase efficiency and productivity.
- Develop and maintain technical documentation for data analysis systems.

Job Requirements:
- Strong understanding of data analysis principles and techniques.
- Proficiency in data visualization tools and programming languages.
- Excellent communication and problem-solving skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong attention to detail and organizational skills.
- Experience working with large datasets and developing complex reports.

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Bengaluru

Work from Office

We're Hiring: Data Engineer
Experience: 8+ Years
Location: Bangalore / Chennai / Gurugram
Company: Derisk360

Are you passionate about building scalable data systems and driving data quality across complex ecosystems? Join Derisk360 to work on advanced cloud and data engineering initiatives that power intelligent business decision-making.

What You'll Do:
- Work with a broad stack of AWS services: S3, AWS Glue, Glue Catalog, Lambda, Step Functions, EventBridge, and more.
- Develop and implement robust data quality checks using DQ libraries.
- Lead efforts in data modeling and manage relational and NoSQL databases.
- Build and automate ETL workflows using Informatica, Python, and Unix scripting.
- Apply DevOps and Agile methodologies, including use of CI/CD tools and code repositories.
- Engineer scalable big data solutions with Hadoop and Apache Spark.
- Design impactful dashboards using Tableau, Amazon QuickSight, and Microsoft Power BI.
- Work extensively on PostgreSQL and MongoDB databases.
- Integrate real-time data pipelines with StreamSets and Kafka.
- Drive data sourcing strategies, including real-time integration solutions.
- Spearhead cloud migration efforts to Snowflake and Azure Data Lake, including data transitions from on-premise environments.

What You Bring:
- 8+ years of hands-on experience in data engineering roles.
- Proficiency in AWS cloud services and modern ETL technologies.
- Solid programming experience in Python and Unix.
- Strong understanding of data architecture, quality frameworks, and reporting tools.
- Experience working in Agile environments and using version control/CI pipelines.
- Exposure to big data frameworks, real-time integration tools, and cloud data platforms.

What You'll Get:
- Competitive compensation.
- The chance to lead and contribute to mission-critical data engineering projects.
- Work in a high-performance team at the intersection of cloud, data, and AI.
- A continuous learning environment with access to cutting-edge technologies.
- A collaborative work culture backed by technical excellence.
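The data quality checks the role mentions boil down to a handful of rule types (completeness, uniqueness, validity ranges) that DQ libraries such as Deequ or Great Expectations package up. A minimal hand-rolled sketch, with column names as illustrative assumptions:

```python
# Minimal data-quality rule sketch: completeness, uniqueness, and range
# checks over a list of dict rows. Column names are assumptions.

def check_not_null(rows, col):
    """Completeness: every row has a non-null value in col."""
    return all(r.get(col) is not None for r in rows)

def check_unique(rows, col):
    """Uniqueness: no duplicated values in col."""
    vals = [r[col] for r in rows]
    return len(vals) == len(set(vals))

def check_range(rows, col, lo, hi):
    """Validity: every value in col falls inside [lo, hi]."""
    return all(lo <= r[col] <= hi for r in rows)

rows = [{"id": 1, "age": 34}, {"id": 2, "age": 29}]
results = {
    "id_not_null": check_not_null(rows, "id"),
    "id_unique": check_unique(rows, "id"),
    "age_in_range": check_range(rows, "age", 0, 120),
}
print(results)
```

In a pipeline, a failed rule would typically fail the run or quarantine the offending partition rather than silently loading bad rows downstream.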

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 12 Lacs

Pune

Work from Office

The headlines
Job Title: Senior Data Consultant (Delivery)
Start Date: Mid-July 2025
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹2,300,000 to ₹3,800,000/annum

A bit about the role
We're looking for passionate Senior Data Consultants to join our Delivery team: a thriving and fast-growing community of some of the industry's best cloud data engineers, ranging from interns and graduates up to seasoned experts. In this role, you'll combine deep technical expertise with strategic leadership and client engagement, acting as a trusted advisor to senior stakeholders. You'll take ownership of solution architecture, project planning, and business development opportunities, driving the successful delivery of high-impact data solutions. You'll have the opportunity to lead and mentor teams, shape best practices, and contribute to internal initiatives, thought leadership, and go-to-market propositions. With a culture that values collaboration, innovation, and professional growth, this is the perfect opportunity for a data leader looking to make a real impact within an international, industry-leading consultancy.

What you'll be doing
- Leading the design and delivery of enterprise-scale data solutions, ensuring alignment with business objectives.
- Building and managing client relationships at a senior level, acting as a trusted advisor to stakeholders.
- Owning and driving solution architecture, contributing to proposal development and project planning.
- Managing and mentoring teams, ensuring high-quality project execution and professional growth of team members.
- Identifying new business opportunities by understanding client needs and proactively proposing solutions.
- Driving internal initiatives such as capability development, internal products, and best practice frameworks.
- Contributing to pre-sales efforts, presenting at client meetings, industry events, and marketing initiatives.
- Establishing thought leadership: writing blogs, publishing articles, and presenting at external events.

What you'll need to succeed
- Expertise in data warehousing, cloud analytics, and modern data architectures (Snowflake, Matillion, Databricks, or similar).
- Proven ability to engage and influence senior stakeholders, providing strategic guidance and technical leadership.
- Strong consulting and client management experience, with a track record of delivering high-impact data projects.
- Leadership and team management skills, with experience guiding multiple teams or large-scale projects.
- An ability to manage complex project priorities, balancing resources effectively and ensuring on-time delivery.
- A passion for innovation and continuous improvement, with the ability to identify and implement best practices.
- Strong communication and influencing skills, capable of managing high-tension situations and facilitating negotiations.

So, what's in it for you?
- The chance to work on cutting-edge cloud data projects for leading enterprise clients.
- An opportunity to shape strategy and drive business impact, with the autonomy to influence major decisions.
- A chance to lead and mentor talented consultants, fostering a culture of excellence and knowledge-sharing.
- Opportunities to contribute to thought leadership through blogging, speaking engagements, and industry networking.
- A fast-growing, dynamic company culture that values innovation, collaboration, and personal development.

About Snap Analytics
We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Databricks, Matillion, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants; we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Pune

Work from Office

Join Barclays as a Senior Data Engineer. At Barclays, we are building the bank of tomorrow. As a Strategy and Transformation Lead, you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with:
- Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success.
- A strong understanding of AWS and distributed computing paradigms, and the ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake, and Databricks.
- The ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.
- Hands-on programming experience in Python and PySpark.
- An understanding of DevOps pipelines using Jenkins and GitLab; strong data modelling and data architecture concepts; and familiarity with project management tools and Agile methodology.
- Sound knowledge of data governance principles and tools (Alation/Glue data quality, mesh), and the ability to suggest solution architecture for diverse technology applications.

Additional relevant skills given below are highly valued:
- Experience working in the financial services industry, including Settlements and Sub-ledger functions such as PNS, Stock Record and Settlements, and PNL.
- Knowledge of the BPS, IMPACT, and Gloss products from Broadridge, and of creating ML models using Python, Spark, and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations:
- Contribute to or set strategy, drive requirements, and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
- For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks, and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work, and with business-aligned support areas, to keep up to speed with business activity and the business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
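The real-time ingestion skills listed above (Kafka plus Spark Streaming) usually come down to windowed aggregation over an event stream. The sketch below reduces a tumbling-window sum to plain Python so the mechanics are visible; in Spark Structured Streaming the same thing is a `groupBy(window(...))`. The event shape is an illustrative assumption.

```python
from collections import defaultdict

# Tumbling-window aggregation sketch: bucket events into fixed,
# non-overlapping windows and sum per bucket. Event shape is an assumption:
# (epoch_seconds, amount) tuples.

def tumbling_sum(events, window_secs):
    buckets = defaultdict(float)
    for ts, amount in events:
        window_start = ts - ts % window_secs   # floor to window boundary
        buckets[window_start] += amount
    return dict(buckets)

events = [(3, 10.0), (7, 5.0), (12, 2.5)]
print(tumbling_sum(events, 10))  # {0: 15.0, 10: 2.5}
```

Production streaming adds the hard parts this sketch omits: late/out-of-order events, watermarking, and exactly-once state management.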

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring, and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing, and supply chain, and for managing manufacturing data.

Job Description - Grade Specific: Focus on Digital Continuity and Manufacturing. Fully competent in own area. Acts as a key contributor in a more complex, critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach, and moves teams forward.

Posted 1 month ago

Apply

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description This role is for a Senior business functional analyst for Group Architecture. This role will be instrumental in establishing and maintaining bank wide data policies, principles, standards and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture against the enterprise data architecture principles, apply agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that Architecture is defined, delivered, and managed in alignment with the banks strategy and in accordance with the organizations architectural standards. Your key responsibilities Data Architecture: The candidate will work closely with stakeholders to understand their data needs and break out business requirements into implementable building blocks and design the solution's target architecture. AI/ML: Identity and support the creation of AI use cases focused on delivery the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage, data quality. Embed AI-powered data quality, detection and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used. GCP Data Architecture & Migration: A strong working experience on GCP Data architecture is must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...). Appropriate GCP architecture level certification. Experience in handling hybrid architecture & patterns addressing non- functional requirements like data residency, compliance like GDPR and security & access control. Experience in developing reusable components and reference architecture using IaaC (Infrastructure as a code) platforms such as terraform. 
Data Mesh: The candidate is expected to be proficient in Data Mesh design strategies that embrace the decentralized nature of data ownership, and must have good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value. Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain. Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions. Your skills and experience Demonstrable experience in designing and deploying AI tooling architectures and use cases. Extensive experience in data architecture within Financial Services. Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart architectures, caching patterns, and policy-based, fine-grained data access. Proven experience with data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh. Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience with systematic, structured review of data models to enforce conformance to standards. High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance. Proficiency at data modelling and experience with different data modelling tools. Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingestion. Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.
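The data quality and lineage responsibilities above can be sketched as a minimal rule-based quality check of the kind often embedded in an ingestion pipeline. This is an illustrative sketch only, not tied to any vendor tool mentioned in the listing; the column names and thresholds are invented for the example.

```python
# Minimal rule-based data-quality gate, as might run inside an ingestion pipeline.
# Records failing a rule are flagged rather than silently loaded downstream.

def profile_column(rows, column):
    """Return null rate and distinct ratio for one column of a list-of-dicts dataset."""
    values = [r.get(column) for r in rows]
    total = len(values)
    nulls = sum(1 for v in values if v is None)
    non_null = [v for v in values if v is not None]
    return {
        "null_rate": nulls / total if total else 0.0,
        "distinct_ratio": len(set(non_null)) / len(non_null) if non_null else 0.0,
    }

def check_quality(rows, rules):
    """rules: {column: {"max_null_rate": float}} -> list of violation messages."""
    violations = []
    for column, rule in rules.items():
        stats = profile_column(rows, column)
        if stats["null_rate"] > rule.get("max_null_rate", 1.0):
            violations.append(f"{column}: null rate {stats['null_rate']:.2f} exceeds limit")
    return violations

records = [
    {"customer_id": 1, "country": "DE"},
    {"customer_id": 2, "country": None},
    {"customer_id": None, "country": "FR"},
]
issues = check_quality(records, {"customer_id": {"max_null_rate": 0.0},
                                 "country": {"max_null_rate": 0.5}})
```

In a real pipeline the rule set would live in the governance tooling (catalogue/lineage platform) rather than in code, but the gate-before-load shape is the same.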

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Pune

Remote

Design Databases & Data Warehouses, Power BI Solutions, Support Enterprise Business Intelligence, Strong Team Player & Contributor, Continuous Improvement. Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, Microsoft Fabric. Required Candidate profile: Source system data structures, data extraction, data transformation, warehousing, DB administration, query development, and Power BI. WORK FROM HOME

Posted 1 month ago

Apply

3.0 - 8.0 years

12 - 13 Lacs

Pune

Work from Office

Join us as a Data Records Governance Analyst at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with Data and Records Management Governance, Data Lineage, and Data Controls, as well as job-specific skillsets. To be successful as a Data Records Governance Analyst, you should have experience with: Basic/ Essential Qualifications: Strategic Vision and Leadership. Data Governance and Quality Management. Knowledge that includes data architecture, integration, analytics, Artificial Intelligence, or Cloud computing. Desirable skillsets/ good to have: Data Modelling. Knowledge of Data Architecture or experience working with Data Architects. Data Sourcing & Provisioning. Data Analytics. Data Privacy and Security. This role will be based out of Pune. Purpose of the role To develop, implement, and maintain effective governance frameworks for all data and records across the bank's global operations. Accountabilities Development and maintenance of a comprehensive data and records governance framework aligned with regulatory requirements and industry standards. Monitoring of data quality and records metrics and compliance with standards across the organisation. Identification and addressing of data and records management risks and gaps. Development and implementation of a records management programme that ensures the proper identification, classification, storage, retention, retrieval, and disposal of records. Development and implementation of a data governance strategy that aligns with the bank's overall data management strategy and business objectives. Provision of Group-wide guidance and training on Data and Records Management standard requirements. 
Analyst Expectations To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, with a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

Posted 1 month ago

Apply

4.0 - 9.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Career Category Information Systems Job Description Join Amgen's Mission of Serving Patients At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas -Oncology, Inflammation, General Medicine, and Rare Disease- we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What you will do Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. 
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member assisting in the design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience. Must-have skills: Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning on big data processing. Proficiency in data analysis tools (e.g. SQL). Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores. Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development. Strong understanding of data modeling, data warehousing, and data integration concepts. Proven ability to optimize query performance on big data platforms. Preferred Qualifications: Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing. Knowledge of Python/R, Databricks, and cloud data platforms. Strong understanding of data governance frameworks, tools, and best practices. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA). Professional Certifications: AWS Certified Data Engineer preferred. Databricks Certification preferred. Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. 
Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
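The extract-transform-load work this role centers on can be illustrated with a toy pipeline step. SQLite stands in here for the source and target stores, and the table and column names are invented for the example; a production pipeline on Databricks/Spark follows the same shape at scale.

```python
import sqlite3

# Toy ETL step: extract raw orders, normalize dollar amounts to integer cents,
# drop unusable records, and load the result into a clean target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount_usd TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, "19.99"), (2, "5.00"), (3, None)])

# Extract
rows = conn.execute("SELECT order_id, amount_usd FROM raw_orders").fetchall()

# Transform: discard records with missing amounts, convert dollars to cents
clean = [(oid, round(float(amt) * 100)) for oid, amt in rows if amt is not None]

# Load
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount_cents INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)

total_cents = conn.execute("SELECT SUM(amount_cents) FROM orders").fetchone()[0]
row_count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

Storing money as integer cents in the target table sidesteps floating-point rounding, one of the small data-quality decisions the listing's "ensure data quality by implementing ETL processes" line covers.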

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 4 Lacs

Nagercoil

Work from Office

We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role will be responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads. Required Skills and Qualifications: 1+ years of experience in data migration, ETL, or database development roles. Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling). Proven experience using Salesforce Data Loader for bulk data operations. Solid understanding of Salesforce CRM architecture, including object relationships and schema design. Strong background in data transformation and cleansing techniques. Nice to Have: Experience with large-scale data migration projects involving CRM or ERP systems. Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts. Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus. Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus. Key Responsibilities: Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL). Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation. Utilize Salesforce Data Loader and/or Apex Data Loader CLI to manage high-volume data imports and exports. Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail). Perform data cleansing, de-duplication, and transformation to ensure quality and consistency. Troubleshoot and resolve data-related issues, load failures, and anomalies.
Collaborate with cross-functional teams to gather data mapping requirements and ensure accurate system integration. Ensure data integrity, adherence to compliance standards, and documentation of migration processes and mappings. Independently analyze, troubleshoot, and resolve data-related issues effectively. Follow best practices for data security, performance tuning, and migration efficiency.
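The de-duplication and cleansing work described above can be sketched as a pre-load pass of the sort that might precede a bulk Data Loader import: records are keyed on a normalized email and the most recently modified copy wins. The record fields and the "latest wins" survivorship rule are assumptions for illustration, not a prescribed Salesforce procedure.

```python
from datetime import date

# Hypothetical contact records extracted from a source system; note the
# duplicate emails differing only in case and trailing whitespace.
contacts = [
    {"email": "Ada@Example.com ",  "name": "Ada L.",       "modified": date(2024, 1, 5)},
    {"email": "ada@example.com",   "name": "Ada Lovelace", "modified": date(2024, 3, 1)},
    {"email": "grace@example.com", "name": "Grace H.",     "modified": date(2024, 2, 1)},
]

def dedupe(records):
    """Keep one record per normalized email, preferring the latest 'modified'."""
    best = {}
    for rec in records:
        key = rec["email"].strip().lower()   # normalize before comparing
        if key not in best or rec["modified"] > best[key]["modified"]:
            best[key] = rec
    return list(best.values())

unique = dedupe(contacts)
```

Running checks like this before the load is much cheaper than untangling duplicate records inside the CRM afterwards.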

Posted 1 month ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Data Analyst II Join the industry leader to design the next generation of breakthroughs. When you join Honeywell, you become a member of our global team of thinkers, innovators, dreamers, and doers who make the things that make the future. That means changing the way we fly, fueling jets in an eco-friendly way, keeping buildings smart and safe, and driving automation with software-embedded products. Working at Honeywell isn't just about developing cool things, which is why all our employees enjoy access to dynamic career opportunities across different fields and industries. We offer amazing opportunities for career growth with a world-class team of diverse experts. Are you ready to help us make the future? Join a team that is elevating our strategy to drive advanced analytics and visualization tools across the Commercial enterprise. In this role, Advanced Data Analyst - CX, you will design, implement, and manage the data architecture, systems, and processes to effectively collect, store, process, and analyze high-volume, high-dimensional data to provide strategic insight into complex business problems. This will involve creating and maintaining scalable, efficient, and secure data pipelines, data warehouses, and data lakes. You need to ensure consistency in data quality and availability for analysis and reporting, including compliance with data governance and security standards. 
YOU MUST HAVE 6 or more years of relevant experience in Data Engineering, ETL Development, and Visualization. Hands-on experience in Power BI development. Expertise in scripting and querying languages such as Python and SQL. Experience with both structured and unstructured data. Experience in Snowflake. SFDC or SAP business and technical knowledge. Knowledge of Agile development methodology. Adaptability to changing business priorities and flexibility with work time management. WE VALUE Predictive analysis / trend analysis with large data. Knowledge of databases, data warehouse platforms (Snowflake), and cloud-based tools. Experience in using data integration tools for ETL processes. Demonstrated experience in Adobe Analytics or Google Analytics. Ability to develop and communicate a technical vision for projects and initiatives that can be understood by stakeholders and management. Proven mentoring ability to drive results and technical growth in peers. Effective communication skills (verbal, written, and presentation) for interacting with customers and peers. Demonstrated application of statistics, statistical modeling, and statistical process control. Duties and Responsibilities Work on complex data science and analytics projects in support of the Customer Experience organization. Work with the GDM owner to identify data requirements and design, maintain, and optimize data pipelines to ingest, transform, and load structured and unstructured data from various sources into the data warehouse or data lake. Design and implement data models to support analytical and reporting requirements. Develop, operate, and maintain advanced Power BI reporting for visualization. Develop and maintain ETL (Extract, Transform, Load) processes. Develop and maintain complex SQL queries. Perform exploratory data analysis to solve complex business problems. Enforce data governance policies, standards, and best practices to maintain data quality, privacy, and security, and perform audits of the same. Create and maintain comprehensive documentation for data architecture, processes, and systems. Troubleshoot and resolve data-related problems and optimize system performance. Partner with the IT support team on production processes, continuous improvement, and production deployments.
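As a rough sketch of the "complex SQL queries" that reporting layers such as Power BI typically sit on top of, the snippet below aggregates a support-events table by region and month. The schema and data are invented for illustration, and SQLite stands in for the warehouse; the same GROUP BY shape applies in Snowflake or any other SQL store.

```python
import sqlite3

# Illustrative reporting query: monthly ticket counts per region.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER, region TEXT, opened TEXT)")
conn.executemany("INSERT INTO tickets VALUES (?, ?, ?)", [
    (1, "EMEA", "2024-01-14"),
    (2, "EMEA", "2024-01-20"),
    (3, "APAC", "2024-01-03"),
    (4, "EMEA", "2024-02-02"),
])

# Bucket ISO dates into year-month and count tickets per (region, month).
report = conn.execute("""
    SELECT region, strftime('%Y-%m', opened) AS month, COUNT(*) AS tickets
    FROM tickets
    GROUP BY region, month
    ORDER BY region, month
""").fetchall()
```

Pre-aggregating in the warehouse like this keeps the visualization layer thin, which is usually the point of pairing warehouse SQL with a BI tool.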

Posted 1 month ago

Apply