
1779 Data Architecture Jobs - Page 29

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Responsibilities:
- Design and implement cloud-based integrations using APIs.
- Design and provide subject-matter expertise for the development of custom implementations using Java-based solutions, including the Vault Java SDK.
- Apply proficiency in cloud-based DAM architectures (AWS, Azure, Google Cloud).
- Develop internal and external tools to help our customers and our consultants.
- Manage integration and development projects and resources.
- Create the framework of technological systems; plan and design the structure of a technology solution.
- Communicate system requirements to software development teams; lead technical discussions and provide clear documentation.
- Evaluate and select appropriate software or hardware and suggest integration methods.
- Oversee assigned programs (e.g., conduct code reviews) and provide guidance to team members.
- Assist with solving technical problems when they arise.
- Apply experience in project management and stakeholder engagement.
- Ensure the implementation of the agreed architecture and infrastructure; address technical concerns, ideas, and suggestions.
- Monitor systems to ensure they meet both user needs and business goals.

Desired profile (key skills):
- 7+ years of experience with integration architecture, design, and development for content management systems (Veeva, Aprimo, OpenText, SharePoint, Salesforce, etc.).
- Experience with commercial solutions, architecture design, and data architecture.
- 8+ years of development experience in one of the following languages: Java, .NET, Python, C#, or C++.
- Experience building solutions and interfaces in the cloud and in SaaS applications using REST APIs and the Java SDK.
- Work experience in the life sciences.
- Aptitude for identifying and resolving DAM system issues.
- Flexibility to adapt to changing DAM requirements and technologies.

Good to have: certifications in Adobe AEM, Azure, AWS, Veeva Vault Administration, or DAM-related platforms are a plus.

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Pune

Work from Office

Project description
We are a leading international bank going through a significant transformation of its front-to-back operations, marked as one of the bank's top 3 transformation agendas. F2B Business Architecture is a small central global team in CIB-CTB that supports the delivery of these key front-to-back (F2B) transformation priorities of the management board. The Data Architecture team will play the central role of defining the data model that aligns business processes, ensures data lineage and effective controls, and implements efficient client strategy and reporting solutions. This will require building strong relationships with key stakeholders and helping deliver tangible value. The role reports to the India Head of Investment Bank and Cross Product F2B Operations.

Responsibilities:
- Be part of the CTB team to define and manage data models used to implement solutions that automate F2B business processes and controls.
- Ensure the models follow the bank's data modelling standards and principles, and influence those standards as necessary.
- Actively partner with functional leads and teams to socialize the data models towards adoption and execution of front-to-back solutions.

Skills
Must have:
- 10+ years in financial services, preferably in strategy and solutions in the corporate and investment banking domain.
- Strong knowledge of transaction banking processes and controls for banking and trading businesses, to drive conversations with business SMEs. Experience in developing models for transaction banking products is preferable.
- Knowledge of loans or cash/deposits lifecycles and/or the customer lifecycle, and of the related business data required to manage operations and analytics, is desirable.
- Well-developed business requirements analysis skills, including good communication (both speaking and listening), influencing, and stakeholder management (all levels up to managing director).
- Ability to partner with technology and business teams to understand current issues and articulate recommendations and solutions.
- Experience working in an enterprise agile environment in a matrix organization.
- Critical problem-solving skills; able to think tactically and strategically.
- Strong design experience and skill in defining solutions.
- Knowledge of banking industry data models/best practices is a plus.
- Ability to consolidate process, data, and existing architecture to drive recommendations and solutions.
- Strong data analysis skills, SQL/Python experience, and the ability to build data models are desirable.

Nice to have: good tech stack knowledge.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Project description
We are looking for an experienced Senior ServiceNow (SNOW) Engineer to join our IT Operations team. You will be responsible for designing robust data models, developing custom reports, and building seamless API integrations within the ServiceNow platform. You should have a strong background in ITSM processes and data architecture, and hands-on experience with ServiceNow development and automation. You will play a pivotal role in optimizing our ServiceNow environment to enhance service delivery, operational visibility, and integration with enterprise systems.

Responsibilities:
Internal data structures & configuration
- Design, build, and maintain data models, tables, and relationships within the ServiceNow platform.
- Extend and customize out-of-the-box modules (e.g., CMDB, Incident, Change, Request) to meet business requirements.
- Ensure data integrity, normalization, and performance optimization across the ServiceNow environment.
- Collaborate with stakeholders to translate business requirements into scalable ServiceNow configurations or custom applications.
Reporting & dashboards
- Develop real-time dashboards and reports using ServiceNow reporting tools and Performance Analytics.
- Deliver insights into key ITSM metrics such as SLAs, incident trends, and operational KPIs.
- Automate the generation and distribution of recurring reports to stakeholders.
- Work with business and technical teams to define and implement reporting frameworks tailored to their needs.
Automated feeds & API integration
- Develop and manage robust data integrations using ServiceNow REST/SOAP APIs (a hedged example of the REST style follows this listing).
- Build and maintain data pipelines to and from external systems (e.g., CMDB, HRIS, ERP, Flexera).
- Implement secure, scalable automation for data exchange with appropriate error handling, logging, and monitoring.
- Troubleshoot and resolve integration-related issues to ensure smooth system interoperability.

Skills
Must have:
- Minimum 6+ years of hands-on experience with ServiceNow, including ITSM, CMDB, and integrations.
- Technical expertise: advanced knowledge of ServiceNow architecture, configuration, and scripting (JavaScript, Glide); strong experience with REST/SOAP APIs for ServiceNow integrations; solid understanding of relational databases, data normalization, and model optimization; familiarity with common enterprise systems such as ERP, HRIS, Flexera, and CMDB tools.
- Reporting skills: proficiency in ServiceNow Performance Analytics, standard reporting, and dashboard design; experience defining KPIs and building automated reporting solutions.
- Soft skills: strong communication and collaboration skills; proven ability to translate business requirements into scalable ServiceNow solutions; analytical and detail-oriented mindset with a problem-solving approach.
Nice to have: N/A.
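The integration work this posting describes centers on the ServiceNow REST API. As a hedged illustration only (not part of the posting), here is a minimal Python sketch of querying the ServiceNow Table API; the instance URL, credentials, table, and fields are hypothetical placeholders.

```python
# Minimal sketch: read incidents via the ServiceNow Table API with basic auth.
# Everything instance-specific below is a hypothetical placeholder.
import requests

INSTANCE = "https://example-instance.service-now.com"  # hypothetical instance

def fetch_open_incidents(user: str, password: str) -> list[dict]:
    """Return a page of active incidents as dictionaries."""
    resp = requests.get(
        f"{INSTANCE}/api/now/table/incident",
        params={
            "sysparm_query": "active=true^ORDERBYDESCsys_created_on",
            "sysparm_fields": "number,short_description,priority",
            "sysparm_limit": 10,
        },
        auth=(user, password),
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    for incident in fetch_open_incidents("integration.user", "secret"):
        print(incident["number"], incident["short_description"])
```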

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Graduate with a minimum of 6+ years of related experience.
- Experience in modelling and business system design.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- A strong team player and leader, able to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 1 month ago

Apply

10.0 - 16.0 years

20 - 25 Lacs

Hosur, Bengaluru

Work from Office

Position overview
We are seeking an experienced Data Architect to design, implement, and optimize cutting-edge data solutions with a focus on Big Data technologies. This role will involve creating robust and scalable data architectures to support analytics, AI, and business intelligence initiatives. The ideal candidate will have deep expertise in data modeling, integration, governance, and the advanced tools and frameworks used in Big Data environments.

Key responsibilities:
Data architecture design
- Define and design Big Data architecture solutions, including data lakes, data warehouses, and real-time processing systems.
- Architect and implement scalable, secure, and high-performance data pipelines and data integration solutions.
- Ensure alignment with industry best practices and organizational goals for data architecture.
Big Data ecosystem management
- Develop and manage workflows using Big Data tools like Hadoop, Spark, Kafka, Hive, and Flink.
- Leverage cloud-based Big Data services (AWS EMR, Azure Synapse, GCP BigQuery, or similar) to optimize performance and scalability.
- Oversee the implementation of streaming data platforms to support real-time analytics.
Data modeling and integration
- Design and maintain data models (conceptual, logical, and physical) that support structured and unstructured data.
- Build robust ETL/ELT processes to ingest, process, and integrate large volumes of diverse data sources.
- Implement APIs and frameworks for seamless data sharing and consumption.
Data governance and security
- Establish frameworks to ensure data quality, lineage, and governance across the data lifecycle.
- Implement security measures for data at rest and in motion using encryption and access controls.
- Ensure compliance with global data regulations such as GDPR, CCPA, or similar.
Collaboration and stakeholder engagement
- Partner with data engineers, data scientists, business analysts, and IT teams to align architecture with business needs.
- Translate complex technical concepts into actionable insights for stakeholders.
Performance optimization and monitoring
- Monitor and optimize the performance of Big Data systems, ensuring low latency and high reliability.
- Troubleshoot and resolve performance bottlenecks in distributed data environments.
Emerging technology and innovation
- Evaluate and implement emerging technologies, such as graph databases, NoSQL systems, and AI-driven analytics platforms.
- Continuously explore innovations in the Big Data ecosystem to drive efficiency and competitive advantage.

Success criteria:
- Successful implementation of robust, secure, and scalable Big Data solutions.
- Improved performance and cost-efficiency of data architectures.
- Positive stakeholder feedback and high business alignment of solutions.
- Continuous adherence to governance and security standards.

Preferred qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Experience: 10+ years in data architecture, with at least 3+ years focusing on Big Data technologies; 5+ years as a Data Architect with proficiency working in environments supporting solutions design; proven track record of delivering end-to-end Big Data solutions in enterprise environments.
- Technical expertise: strong understanding of Big Data frameworks like Hadoop, Spark, Kafka, Hive, Flink, and Presto; proficiency in cloud-based Big Data platforms (AWS EMR, Azure Synapse, GCP BigQuery, or Databricks).

Posted 1 month ago

Apply

10.0 - 16.0 years

20 - 25 Lacs

Hosur, Bengaluru

Work from Office

Key responsibilities:
Data architecture design
- Define and design Big Data architecture solutions, including data lakes, data warehouses, and real-time processing systems.
- Architect and implement scalable, secure, and high-performance data pipelines and data integration solutions.
- Ensure alignment with industry best practices and organizational goals for data architecture.
Big Data ecosystem management
- Develop and manage workflows using Big Data tools like Hadoop, Spark, Kafka, Hive, and Flink.
- Leverage cloud-based Big Data services (AWS EMR, Azure Synapse, GCP BigQuery, or similar) to optimize performance and scalability.
- Oversee the implementation of streaming data platforms to support real-time analytics (a hedged streaming sketch follows this listing).
Data modeling and integration
- Design and maintain data models (conceptual, logical, and physical) that support structured and unstructured data.
- Build robust ETL/ELT processes to ingest, process, and integrate large volumes of diverse data sources.
- Implement APIs and frameworks for seamless data sharing and consumption.
Data governance and security
- Establish frameworks to ensure data quality, lineage, and governance across the data lifecycle.
- Implement security measures for data at rest and in motion using encryption and access controls.
- Ensure compliance with global data regulations such as GDPR, CCPA, or similar.
- Gen AI exposure/experience is mandatory.
Collaboration and stakeholder engagement
- Partner with data engineers, data scientists, business analysts, and IT teams to align architecture with business needs.
- Translate complex technical concepts into actionable insights for stakeholders.
Performance optimization and monitoring
- Monitor and optimize the performance of Big Data systems, ensuring low latency and high reliability.
- Troubleshoot and resolve performance bottlenecks in distributed data environments.
Emerging technology and innovation
- Evaluate and implement emerging technologies, such as graph databases, NoSQL systems, and AI-driven analytics platforms.
- Continuously explore innovations in the Big Data ecosystem to drive efficiency and competitive advantage.

Success criteria:
- Explore different tech stacks and architecture designs.
- Document supporting evidence with KPIs for solution design decisions, and document product guidelines and protocols so teams can seamlessly use the framework.
- Prior experience with ETL and Big Data to set up scalable pipelines that process data in real time and in batch.
- Develop configurable solutions that support cross-functional requirements and multiple platforms.
- Experience managing a cross-functional team, requirements gathering, and day-to-day relationships with clients and stakeholders, supporting clients to achieve better outcomes.

Preferred qualifications:
- Experience: 10+ years in data architecture, with at least 3+ years focusing on Big Data technologies; 5+ years as a Data Architect with proficiency working in environments supporting solutions design; proven track record of delivering end-to-end Big Data solutions in enterprise environments.
- Technical expertise: strong understanding of Big Data frameworks like Hadoop, Spark, Kafka, Hive, Flink, and Presto; proficiency in cloud-based Big Data platforms (AWS EMR, Azure Synapse, GCP BigQuery, or Databricks); expertise in database systems, both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra); hands-on experience with ETL tools like Talend, Informatica, or Apache NiFi; familiarity with data visualization tools (e.g., Tableau, Power BI) and analytics platforms.
- Certifications: certifications such as AWS Certified Data Analytics, Azure Data Engineer Associate, GCP Professional Data Engineer, or Hadoop certifications are highly desirable.

Key attributes:
- Strong analytical and problem-solving skills with a passion for data-driven innovation.
- Excellent communication and collaboration abilities to engage both technical and non-technical stakeholders.
- Strategic mindset with a focus on scalability, performance, and alignment with business objectives.
- Ability to thrive in fast-paced environments and handle multiple priorities effectively.
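Both of these near-identical postings call for streaming platforms built on Kafka and Spark. The sketch below shows one common shape of that pattern, Spark Structured Streaming reading from Kafka into a Delta table. It is illustrative only: it assumes a Spark environment with the Kafka and Delta Lake packages available (e.g., Databricks), and the broker address, topic, and storage paths are hypothetical.

```python
# Minimal sketch: Kafka -> Spark Structured Streaming -> Delta table.
# Broker address, topic name, and storage paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Read raw events from a Kafka topic as an unbounded stream.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
    .select(
        col("key").cast("string").alias("order_key"),
        col("value").cast("string").alias("payload"),
        col("timestamp"),
    )
)

# Append to a Delta table; the checkpoint gives the stream restartable,
# exactly-once bookkeeping.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .outputMode("append")
    .start("/tmp/delta/orders")
)
query.awaitTermination()
```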

Posted 1 month ago

Apply

1.0 - 6.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank - Payments Technology team, you are an integral part of an agile team that works to enhance, build, and deliver trusted, market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems.
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
- Contributes to software engineering communities of practice and events that explore new and emerging technologies.
- Provides guidance to the immediate team of software engineers on daily tasks and activities.
- Sets the overall guidance and expectations for team output, practices, and collaboration.
- Anticipates dependencies with other teams to deliver products and applications in line with business requirements.
- Manages stakeholder relationships and the team's work in accordance with compliance standards, service level agreements, and business requirements.
- Creates a culture of diversity, equity, inclusion, and respect for team members and prioritizes diverse representation.

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 5+ years of applied experience.
- Hands-on experience in data mapping, data architecture, and data modeling on Databricks.
- Extensive experience in AWS and in the design, implementation, and maintenance of data pipelines using Python and PySpark on Databricks (a hedged curation sketch follows this listing).
- Proficiency in Python and PySpark; able to write and execute complex queries to perform curation and build views required by end users (single- and multi-dimensional).
- Extensive experience in Databricks data engineering (job runs, data ingestion, Delta Live Tables, Spark Streaming).
- Experience standing up and maintaining EC2 instances, Kubernetes clusters, and Lambda services.
- Experience building notebooks with complex code structures and debugging failed jobs.
- Proven experience in performance tuning to ensure jobs run at optimal levels with no performance bottlenecks.
- Proven ability to deliver high-quality features into production systems in a rapid-paced, iterative development environment.

Preferred qualifications, capabilities, and skills:
- Exposure to cloud technologies.
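The curation-and-views work described above typically looks like the following on Databricks. This is a hedged sketch, not JPMorgan's actual code: the table names, columns, and aggregation are hypothetical stand-ins for a PySpark job that filters, reshapes, and publishes a summary table for end users.

```python
# Minimal sketch: curate a raw table into a user-facing summary with PySpark.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("payments-curation").getOrCreate()

payments = spark.read.table("raw.payments")  # hypothetical source table

curated = (
    payments
    .filter(F.col("status") == "SETTLED")                 # keep settled payments only
    .withColumn("settled_date", F.to_date("settled_ts"))  # normalize the timestamp
    .groupBy("settled_date", "currency")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Publish the curated result where analysts and downstream views can read it.
curated.write.mode("overwrite").saveAsTable("analytics.daily_payment_summary")
```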

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Description: Lead Data Engineer
Does working with data on a day-to-day basis excite you? Are you interested in building robust data architecture to identify data patterns and optimize data consumption for our customers, who will forecast and predict what actions to undertake based on data? If this is what excites you, then you'll love working in our intelligent automation team. Schneider AI Hub is leading the AI transformation of Schneider Electric by building AI-powered solutions. We are looking for a savvy Data Engineer to join our growing team of AI and machine learning experts. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software engineers, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Responsibilities:
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional requirements.
- Design the right schema to support the functional requirements and consumption patterns.
- Design and build production data pipelines from ingestion to consumption (a hedged orchestration sketch follows this listing).
- Create the necessary preprocessing and postprocessing for various forms of data, for training/retraining and inference ingestion as required.
- Create data visualization and business intelligence tools for stakeholders and data scientists, providing the necessary business and solution insights.
- Identify, design, and implement internal process improvements: automating manual data processes, optimizing data delivery, etc.
- Ensure our data is separated and secure across national boundaries through multiple data centers.

Requirements and skills:
- A bachelor's or master's degree in Computer Science, Information Technology, or another quantitative field.
- At least 8 years working as a data engineer supporting large data transformation initiatives related to machine learning, with experience building and optimizing pipelines and data sets.
- Strong analytic skills related to working with unstructured datasets.
- Experience with Azure cloud services: ADF, ADLS, HDInsight, Databricks, App Insights, etc.
- Experience handling ETL using Spark.
- Experience with object-oriented/object-function scripting languages: Python, PySpark, etc.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- A good team player, committed to the success of the team and the overall project.

About us
Schneider Electric creates connected technologies that reshape industries, transform cities, and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software, and services improve the way our customers manage and automate their operations. Great people make Schneider Electric a great company.

Schedule: Full-time
Req: 0098TK
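Since the posting lists workflow managers such as Airflow, here is a minimal, hedged sketch of the orchestration pattern involved: a two-task daily DAG. It assumes Airflow 2.x (2.4+ for the `schedule` parameter); the task bodies and names are hypothetical placeholders for real ingestion and transformation logic.

```python
# Minimal sketch: a daily two-step pipeline expressed as an Airflow DAG.
# Task logic below is a hypothetical placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest() -> None:
    print("pull raw files into the landing zone")

def transform() -> None:
    print("clean and reshape data for training and inference")

with DAG(
    dag_id="ml_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # transform runs only after ingest succeeds
```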

Posted 1 month ago

Apply

7.0 - 14.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Specialism: Data, Analytics & AI

Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Job summary
We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key responsibilities:
- Design and implement ETL/ELT pipelines using Databricks and PySpark.
- Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets (a hedged governance sketch follows this listing).
- Develop high-performance SQL queries and optimize Spark jobs.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Ensure data quality and compliance across all stages of the data lifecycle.
- Implement best practices for data security and lineage within the Databricks ecosystem.
- Participate in CI/CD, version control, and testing practices for data pipelines.

Required skills:
- Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
- Strong hands-on skills with PySpark and Spark SQL.
- Solid experience writing and optimizing complex SQL queries.
- Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
- Experience with cloud platforms like Azure or AWS.
- Understanding of data governance, RBAC, and data security standards.

Preferred qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with tools like Airflow, Git, Azure Data Factory, or dbt.
- Exposure to streaming data and real-time processing.
- Knowledge of DevOps practices for data engineering.

Mandatory skill sets: Databricks
Preferred skill sets: Databricks
Years of experience required: 7-14 years
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering, Bachelor of Technology
Required skills: Databricks Platform
Other listed skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more}
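Unity Catalog governance, as this posting uses the term, is expressed through three-level namespaces and SQL grants. The following is a hedged sketch assumed to run from a Databricks workspace with Unity Catalog enabled; the catalog, schema, table, and group names are hypothetical.

```python
# Minimal sketch: create governed objects and grant access under Unity Catalog.
# Assumes a Databricks cluster attached to a Unity Catalog metastore;
# all object and principal names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("CREATE CATALOG IF NOT EXISTS finance")
spark.sql("CREATE SCHEMA IF NOT EXISTS finance.curated")
spark.sql(
    """
    CREATE TABLE IF NOT EXISTS finance.curated.payments (
        id BIGINT,
        amount DECIMAL(18, 2),
        status STRING
    )
    """
)

# Access control: grant read-only access to a hypothetical account-level group.
spark.sql("GRANT SELECT ON TABLE finance.curated.payments TO `data-analysts`")
```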

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office

We are seeking a Lead Snowflake Engineer to join our dynamic Data Engineering team. This role involves owning the architecture, implementation, and optimization of our Snowflake-based data warehouse solutions while mentoring a team of engineers and driving project success. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with dbt (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Key responsibilities:
- Design and implement scalable and efficient Snowflake data warehouse architectures and ELT pipelines (a hedged ELT sketch follows this listing).
- Leverage dbt to build and manage data transformation workflows within Snowflake.
- Lead data modeling efforts to support analytics and reporting needs across the organization.
- Optimize Snowflake performance, including query tuning, resource scaling, and storage usage.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver high-quality data solutions.
- Manage and mentor a team of data engineers; provide technical guidance, code reviews, and career development support.
- Establish and enforce best practices for data engineering, including version control, CI/CD, documentation, and data quality.
- Ensure data solutions are secure, compliant, and aligned with privacy regulations (e.g., GDPR, CCPA).
- Continuously evaluate emerging tools and technologies to enhance our data ecosystem.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering, including at least 2+ years of hands-on experience with Snowflake.
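As a hedged illustration of the ELT style this role owns, the sketch below pushes a transformation into Snowflake with the official Python connector; in practice the same step would often be a dbt model. The account, credentials, and table names are hypothetical placeholders.

```python
# Minimal sketch: run an ELT step inside Snowflake from Python.
# Connection parameters and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # hypothetical account locator
    user="ETL_USER",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    with conn.cursor() as cur:
        # ELT: the SQL executes on Snowflake's compute, not on the client.
        cur.execute(
            """
            CREATE OR REPLACE TABLE CURATED.DAILY_ORDERS AS
            SELECT order_date,
                   COUNT(*)    AS order_count,
                   SUM(amount) AS revenue
            FROM RAW.ORDERS
            GROUP BY order_date
            """
        )
finally:
    conn.close()
```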

Posted 1 month ago

Apply

7.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Job Description
We are seeking a Lead Snowflake Engineer to join our dynamic Data Engineering team. This role involves owning the architecture, implementation, and optimization of our Snowflake-based data warehouse solutions while mentoring a team of engineers and driving project success. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with dbt (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Key responsibilities:
- Design and implement scalable and efficient Snowflake data warehouse architectures and ELT pipelines.
- Leverage dbt to build and manage data transformation workflows within Snowflake.
- Lead data modeling efforts to support analytics and reporting needs across the organization.
- Optimize Snowflake performance, including query tuning, resource scaling, and storage usage.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver high-quality data solutions.
- Manage and mentor a team of data engineers; provide technical guidance, code reviews, and career development support.
- Establish and enforce best practices for data engineering, including version control, CI/CD, documentation, and data quality.
- Ensure data solutions are secure, compliant, and aligned with privacy regulations (e.g., GDPR, CCPA).
- Continuously evaluate emerging tools and technologies to enhance our data ecosystem.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering, including at least 2+ years of hands-on experience with Snowflake.

Posted 1 month ago

Apply

5.0 - 10.0 years

13 - 23 Lacs

Bengaluru

Work from Office

Job Title: Data Architect - Supply Chain
Please apply using the link below:
Link to apply: https://jobs.exxonmobil.com/ExxonMobil/job/Bengaluru-Data-Architect-Supply-Chain-KA/1296382000/

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

What role you will play in our team
The CDO Data Architect will work towards building and/or providing knowledge in support of the overall Central Data Office mission and data strategy. The job will be based at the Bangalore (Whitefield) office, working from office (WFO) five days a week.

What you will do
The Data Architect will work closely with other Data Architects, subject matter experts (SMEs), data engineers and developers to design enterprise data sets, and will consult with projects to ensure data architecture is aligned with enterprise data principles and standards. The position reports to the Data Architect Manager. To achieve these goals, the Data Architect will be required to analyze current-state data architectures, conceive desired future-state data architectures, and identify the activities needed to close the gap.
- Support development of conceptual and logical data models.
- Work with Data Governance teams to ensure business glossaries, data dictionaries and data catalogs are created and maintained.
- Drive strategies/approaches and principles for data management (including master/reference data and identification of key data domains, data governance framework, etc.).
- Ensure the data architecture of delivered solutions supports proper security/access, retention and classification.
- Consult with projects and ensure solutions are aligned with the CDO vision/mission and enterprise data architecture principles, using established data architecture assessment methodologies.
- Partner with business leads and data governance groups to support business data needs.
- Lead assessments of business/data requirements for projects, products, MVPs and other efforts to validate that the overall design satisfies business needs and adheres to key standards and principles.
- Provide architectural leadership, guidance, consulting, mentoring, education and support to other architects, data professionals, and project and product teams.
- Provide input to Management of Change (MOC) plans to ensure enterprise data set adoption and sustainability.
- Analyze business processes and identify/implement improvement opportunities.

About you
Required skills and qualifications:
- Master's or Bachelor's degree in business, computer science, engineering, systems analysis or a related field.
- Minimum 3 years of experience in data design/architecture and a strong willingness to continue learning.
- Recent experience developing reference data architecture, data modeling (conceptual, logical and physical), data profiling, data quality analysis, and building business data glossaries and data catalogs.
- Knowledge of data governance and master/reference data management programs.
- Experience using SQL Server, the SQL query language and the ER/Studio data modeling tool.
- Experience working with agile delivery teams.
- Effective planning, communication, collaboration and persuasion skills to drive needed change across the company.
- Expert written and verbal communication skills; familiarity with SharePoint team sites for collaboration; a self-starter who takes initiative and can work in a fast-paced environment.

Preferred qualifications/experience:
- Knowledge of TOGAF (The Open Group Architecture Framework) and the DAMA (Data Management Association) DMBoK (Data Management Body of Knowledge) v2 desirable (key differentiator).
- Experience with MDM tools such as Precisely EnterWorks and cataloging tools such as Collibra.
- Understanding of large data store technologies (data lakes, data warehouses, data hubs, etc.).
- Knowledge of XML, JSON, Python; understanding of API concepts and integration architecture.
- Leadership: strong interpersonal skills with the ability to influence without direct authority; able to effectively interact with a large and diverse user community.
- Strong background in data, analytics, systems and tools: good working knowledge of SAP database technologies (ERP, HANA, etc.) and related data extraction tools.

Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
- Competitive compensation
- Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
- Retirement benefits
- Global networking and cross-functional opportunities
- Annual vacations and holidays
- Day care assistance program
- Training and development program
- Tuition assistance program
- Workplace flexibility policy
- Relocation program
- Transportation facility
Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India: visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the link to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Posted 1 month ago

Apply

4.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

At Elanco (NYSE: ELAN) it all starts with animals! As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We're driven by our vision of Food and Companionship Enriching Life and our approach to sustainability, the Elanco Healthy Purpose, to advance the health of animals, people, the planet and our enterprise. Making animals' lives better makes life better. Join our team today!

Your role: Sr. Data Engineer
The data engineer's role is delivery focused. The person in this role will drive data pipeline and data product delivery through data architecture, modeling, design, and development of professional-grade solutions on premise and/or on the Microsoft Azure cloud. You will partner with data scientists and statisticians across Elanco's global business functions to help prepare and transform their data into data products that further drive scientific and business knowledge discovery, insights, and forecasting. Data engineers will be part of a highly collaborative and cross-functional team of technology and data experts working on solving complex scientific and business challenges in animal health using cutting-edge data and analytics technologies.

Your responsibilities:
- Provide data engineering subject matter expertise, plus hands-on data capture, ingestion, curation, and pipeline development expertise, on Azure to deliver cloud-optimized data solutions.
- Provide expert data PaaS knowledge covering Azure storage; big data platform services; serverless architectures; Azure SQL DB; NoSQL databases; and secure, automated data pipelines.
- Participate in data/data-pipeline architectural discussions to help build cloud-native solutions or migrate existing data applications from on premise to the Azure platform.
- Perform current-state (as-is) and future-state (to-be) analysis.
- Participate in and help develop the data engineering community of practice as a global go-to expert panel/resource.
- Develop and evolve new or existing data engineering methods and procedures to create possible alternative, agile solutions to moderately complex problems.

What you need to succeed (minimum qualifications):
- At least 2 years of data pipeline and data product design, development, and delivery experience deploying ETL/ELT solutions on Azure Data Factory.
- Education: Bachelor's or higher degree in Computer Science or a related discipline.

What will give you a competitive edge (preferred qualifications):
- Experience with Azure-native data/big-data tools, technologies and services, including Storage Blobs, ADLS, Azure SQL DB, Cosmos DB, NoSQL and SQL Data Warehouse.
- Sound problem-solving skills developing data pipelines using Databricks, Stream Analytics and Power BI.
- Minimum of 2 years of hands-on experience in programming languages and Azure and big data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/SparkSQL, Hive, and streaming technologies like Kafka, Event Hubs, etc.

Additional information:
Travel: 0%
Location: India, Bangalore

Don't meet every single requirement? Studies have shown underrecognized groups are less likely to apply to jobs unless they meet every single qualification. At Elanco we are dedicated to building a diverse and inclusive work environment. If you think you might be a good fit for a role but don't necessarily meet every requirement, we encourage you to apply. You may be the right candidate for this role or other roles!

Elanco is an EEO/Affirmative Action Employer and does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability or any other legally protected status.

Posted 1 month ago

Apply

12.0 - 22.0 years

20 - 32 Lacs

Pune, Chennai, Bengaluru

Work from Office

Location: Bengaluru, Chennai, Pune
Note: Immediate joiners only.

Core qualifications:
- 12+ years in software and data architecture with hands-on delivery.
- Agentic AI & AWS Bedrock (must-have): practical experience designing, deploying, and operating agentic AI solutions using AWS Bedrock and Bedrock Agents (a hedged invocation sketch follows this listing).
- Cloud-native AWS expertise: deep knowledge across compute, storage, networking, and security.
- Modern architectures: proven success in defining stacks for microservices, event-driven systems, and data platforms (e.g., Snowflake, Databricks).
- DevOps & IaC: skilled in CI/CD pipelines and Infrastructure as Code using Azure DevOps and Terraform.
- Data & integration: strong in data modeling, REST/GraphQL API design, ETL/ELT, CDC, and messaging integration.
- Stakeholder engagement: excellent communicator with the ability to align technical solutions to business outcomes.

Preferred:
- Experience in media or broadcasting.
- Familiarity with Salesforce or enterprise iPaaS platforms.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.

Have questions? I'm happy to help; connect with me on 9899080360 or email admin@spearheadps.com.
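For context on the must-have skill, here is a hedged sketch (not from the posting) of invoking a pre-configured Amazon Bedrock Agent with boto3. The agent and alias IDs are hypothetical placeholders, and the response-stream handling assumes the bedrock-agent-runtime API's chunked event-stream shape.

```python
# Hedged sketch: call an existing Bedrock Agent and assemble its streamed reply.
# Agent/alias IDs and the prompt are hypothetical; assumes the agent is already
# configured in the AWS account and region.
import uuid

import boto3

client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = client.invoke_agent(
    agentId="AGENT1234",          # hypothetical agent ID
    agentAliasId="ALIAS5678",     # hypothetical alias ID
    sessionId=str(uuid.uuid4()),  # one session per conversation
    inputText="Summarise yesterday's failed payment runs.",
)

# The completion arrives as an event stream of chunks.
answer = "".join(
    event["chunk"]["bytes"].decode("utf-8")
    for event in response["completion"]
    if "chunk" in event
)
print(answer)
```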

Posted 1 month ago

Apply

19.0 - 23.0 years

60 - 75 Lacs

Bengaluru, Delhi / NCR

Hybrid

Preferred candidate profile:
We are looking for a Solution Architect - Data to lead the design and implementation of scalable, secure, and high-performance data solutions. You will play a key role in defining the data architecture and strategy across enterprise platforms, ensuring alignment with business goals and IT standards.
- 18+ years of IT experience, with at least 5 years working as an architect.
- Experience working as a Data Architect.
- Experience architecting reporting and analytics solutions.
- Experience architecting AI & ML solutions.
- Experience with Databricks.

Posted 1 month ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Architect
Location: Bangalore, Hyderabad, Chennai, Pune, Gurgaon (hybrid; 2-3 days WFO)
Experience: 8+ years

Position overview:
We are seeking a highly skilled and strategic Data Architect to design, build, and maintain the organization's data architecture. The ideal candidate will be responsible for aligning data solutions with business needs, ensuring data integrity, and enabling scalable and efficient data flows across the enterprise. This role requires deep expertise in data modeling, data integration, cloud data platforms, and governance practices.

Key responsibilities:
- Architectural design: define and implement enterprise data architecture strategies, including data warehousing, data lakes, and real-time data systems.
- Data modeling: develop and maintain logical, physical, and conceptual data models to support analytics, reporting, and operational systems.
- Platform management: select and oversee implementation of cloud and on-premises data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, Databricks).
- Integration & ETL: design robust ETL/ELT pipelines and data integration frameworks using tools such as Apache Airflow, Informatica, dbt, or native cloud services.
- Data governance: collaborate with stakeholders to implement data quality, data lineage, metadata management, and security best practices.
- Collaboration: work closely with data engineers, analysts, software developers, and business teams to ensure seamless and secure data access.
- Performance optimization: tune databases, queries, and storage strategies for performance, scalability, and cost-efficiency.
- Documentation: maintain comprehensive documentation for data structures, standards, and architectural decisions.

Required qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data architecture, data engineering, or database development.
- Strong expertise in data modeling and in relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- Experience with modern data platforms and cloud ecosystems (AWS, Azure, or GCP).
- Hands-on experience with data warehousing solutions and tools (e.g., Snowflake, Redshift, BigQuery).
- Proficiency in SQL and data scripting languages (e.g., Python, Scala).
- Familiarity with data privacy regulations (e.g., GDPR, HIPAA) and security standards.

Tech stack:
- AWS Cloud: S3, EC2, EMR, Lambda, IAM, Snowflake DB
- Databricks: Spark/PySpark, Python
- Good knowledge of Bedrock and Mistral AI
- RAG & NLP: LangChain and LangRAG
- LLMs: Anthropic Claude, Mistral, LLaMA, etc.

Posted 1 month ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Lucknow

Work from Office

Product Manager - Backend Services & Integrations

Role overview:
As the Backend Services and Integration Product Manager, you will own the technical backbone of Urban IQ's platform. You'll drive the development of scalable, reliable backend services and integrations, ensuring our platform can seamlessly integrate with IoT devices and external systems.

Key responsibilities:
- Own and define platform capabilities, focusing on backend scalability, APIs, and data pipelines.
- Collaborate with engineering teams to implement backend solutions that integrate AI and IoT technologies.
- Lead feasibility testing for backend services, ensuring performance and scalability meet customer requirements.
- Develop product specifications and user stories for backend services, using tools like Jira for quick, iterative cycles.
- Engage with customers to understand technical integration needs, working with cross-functional teams to ensure successful deployments.
- Partner with sales and marketing teams to build technical collateral for market-facing initiatives, showcasing the platform's backend strengths.
- Drive collaboration across departments, ensuring alignment on technical capabilities, customer needs, and market opportunities.

Qualifications:
- 1+ years of product management experience, with a focus on backend technologies and integrations.
- Hands-on experience working with scrum teams, driving technical projects from conception to delivery.
- Strong expertise in cloud platforms, data architecture, API development, and integrations.
- Good understanding of AI concepts and how to use agentic frameworks to solve complex business and operational problems.
- Proven ability to lead technical feasibility testing and develop product specifications in agile environments.
- Experience collaborating with sales, marketing, and engineering to build comprehensive product strategies.
- Excellent technical understanding, problem-solving skills, and leadership abilities.
- Engineering degree (BS and/or Master's) and prior technical experience preferred; an MBA would be a big plus in this role.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Summary: Skima is seeking a skilled Data Engineer to join our dynamic team in Mumbai. The ideal candidate will have 3 years of experience in data engineering, with a strong background in designing, building, and maintaining scalable data pipelines and systems. As a Data Engineer at Skima, you will be responsible for developing and optimizing our data architecture, ensuring the reliability and efficiency of our data processes. You will work closely with data scientists, analysts, and other stakeholders to support their data needs and drive data-driven decision-making across the organization. The role requires proficiency in SQL, Python, and big data technologies such as Hadoop, Spark, and Kafka. Experience with cloud platforms like AWS, Azure, or Google Cloud is highly desirable. This is an in-office position, offering a competitive CTC range of 50,000 to 70,000. If you are passionate about data engineering and eager to contribute to a forward-thinking company, we encourage you to apply and become a part of Skima's innovative team.

Responsibilities:
- Design, build, and maintain scalable data pipelines and systems.
- Develop and optimize data architecture to ensure reliability and efficiency.
- Collaborate with data scientists, analysts, and other stakeholders to support their data needs.
- Drive data-driven decision-making across the organization.
- Ensure data quality and integrity through robust data validation and monitoring processes.

Requirements:
- 3 years of experience in data engineering.
- Strong background in designing, building, and maintaining scalable data pipelines and systems.
- Proficiency in SQL.
- Proficiency in Python.
- Experience with big data technologies such as Hadoop, Spark, and Kafka.

Posted 1 month ago

Apply

7.0 - 10.0 years

25 - 35 Lacs

Hyderabad

Work from Office

Job Summary
As a Senior Data Engineer, you will play a key role in developing and maintaining the databases and scripts that power Creditsafe's products and websites. You will be responsible for handling large datasets, designing scalable data pipelines, and ensuring seamless data processing across cloud environments. This role provides an excellent opportunity to contribute to an exciting, fast-paced, and rapidly expanding organization.

Key responsibilities:
- Develop and maintain scalable, metadata-driven, event-based distributed data processing platforms.
- Design and implement data solutions using Python, Airflow, Redshift, DynamoDB, AWS Glue, and S3 (a hedged serverless sketch follows this listing).
- Build and optimize APIs to securely handle over 1,000 transactions per second using serverless technologies.
- Participate in peer reviews and contribute to a clean, efficient, and high-performance codebase.
- Implement best practices such as continuous integration, test-driven development, and cloud optimization.
- Understand company and domain data to suggest improvements to existing products.
- Provide mentorship and technical leadership to the engineering team.

Skills & qualifications:
- Proficiency in Python and experience building scalable data pipelines.
- Experience working in cloud environments such as AWS (S3, Glue, Redshift, DynamoDB).
- Strong understanding of data architecture, database design, and event-driven data processing.
- Ability to write clean, efficient, and maintainable code.
- Excellent communication skills and the ability to collaborate within a team.
- Experience mentoring engineers and providing leadership on complex technology issues.

Benefits:
- Competitive salary and performance bonus scheme.
- Hybrid working model for better work-life balance.
- 20 days annual leave plus 10 bank holidays.
- Healthcare, company pension, gratuity, and parental insurance.
- Cab services for women for enhanced safety.
- Global company gatherings and career growth opportunities.
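To make the serverless pattern in this posting concrete, here is a hedged sketch (not Creditsafe's code) of an AWS Lambda handler persisting an API event to DynamoDB with boto3; the table name and payload fields are hypothetical.

```python
# Minimal sketch: an AWS Lambda handler writing one event to DynamoDB.
# Table and field names are hypothetical placeholders.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("company-events")  # hypothetical table

def lambda_handler(event, context):
    """API Gateway-style entry point: store the posted record and confirm."""
    record = json.loads(event["body"])
    table.put_item(
        Item={
            "company_id": record["company_id"],  # partition key (hypothetical)
            "event_ts": record["event_ts"],      # sort key (hypothetical)
            "payload": json.dumps(record),
        }
    )
    return {"statusCode": 201, "body": json.dumps({"status": "stored"})}
```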

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Job Summary We are seeking a skilled and motivated AWS Glue Data Engineer to join our data engineering team. The ideal candidate will have hands-on experience designing and implementing scalable ETL pipelines using AWS Glue and a strong understanding of cloud-based data architecture. You will play a key role in transforming raw data into actionable insights that drive business decisions. Key Responsibilities Design, develop, and maintain ETL pipelines using AWS Databricks/Glue, PySpark, and other AWS services. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Optimize data workflows for performance, scalability, and cost-efficiency. Implement data quality checks, validation, and monitoring processes. Integrate data from various sources, including S3, RDS, Redshift, and external APIs. Ensure data security and compliance with organizational and regulatory standards. Document technical solutions and maintain data engineering best practices. Required Qualifications Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 3 years of experience in data engineering, with at least 1-2 years using AWS Glue. Proficiency in Python, PySpark, and SQL. Strong understanding of AWS services such as S3, Lambda, Redshift, Athena, and IAM. Experience with data modeling, data warehousing, and big data technologies. Familiarity with CI/CD pipelines and version control (e.g., Git). Preferred Qualifications AWS Certified Data Analytics or Solutions Architect certification. Experience with orchestration tools like Apache Airflow or AWS Step Functions. Knowledge of data governance and metadata management tools. Exposure to DevOps practices and infrastructure-as-code (e.g., CloudFormation, Terraform). What We Offer Competitive salary and benefits. Flexible work environment. Opportunities for professional growth and certification. Collaborative and inclusive team culture.
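A Glue ETL job of the kind this role builds reads from the Glue Data Catalog, transforms with PySpark, and writes back to S3. As a minimal sketch under assumed names (the catalog database, table, key column, and output bucket below are hypothetical; the script only runs inside a Glue job environment), it might look like this:

```python
# Sketch of an AWS Glue PySpark job: read a cataloged table, drop rows
# missing the key, and write Parquet to S3. Names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a raw table registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",     # placeholder catalog database
    table_name="orders",   # placeholder table
)

# Basic data quality step: drop rows with no primary key.
cleaned_df = raw.toDF().dropna(subset=["order_id"])
cleaned = DynamicFrame.fromDF(cleaned_df, glue_context, "cleaned")

# Land the curated output as Parquet in S3.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```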

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Mumbai

Work from Office

Are you curious, excited by experimentation and always looking to innovate? Do you want to work in embedded payments where you can keep learning and developing whilst getting hands-on experience? Do you want to have the opportunity to play an important role in a rapidly growing and exciting Fintech business? If so, we would love to connect and collaborate! We want to hire ambitious, value-adding talent into Modulr, one of the fastest growing payments businesses in the UK and Europe. Modulr is experiencing significant growth, and in this role you will work in a cross-functional team that is asked to solve a problem rather than handed a task to do. This is an excellent opportunity to work in a high-growth environment with a fast-paced and collaborative culture where you will have great opportunities to work on challenging problems. About us At Modulr, our vision is a world where all businesses are powered by embedded payments. Modulr enables businesses, from SMEs to Enterprise, initially across the UK and Europe, to efficiently pay-in, collect and disburse funds instantly via a range of payment schemes, accounts, and card products. We have created an industry-leading API platform with comprehensive online tools and access to meet the demands of daily business payments. We have two routes to market. Our Core Business Payments product allows customers in any sector to connect to us and our expanding network of accounting and payroll platforms, including Sage, Xero, BrightPay and IRIS, to automate payments. Our Vertical Solutions product targets a growing range of industry verticals which directly connect their IT platforms to our APIs and webhooks. We solve complex payment problems for hundreds of clients in a range of industries, including Travel, Lending, Wage Advance, and Investment & Wealth. We are deeply integrated into the payment eco-system. In the UK, we are direct participants of Faster Payments and Bacs. Modulr holds settlement accounts at the Bank of England. Our payment network connectivity includes CHAPS, Open Banking, SEPA, SWIFT and account issuance in multiple currencies. We are principal issuing members of Visa and Mastercard schemes across the UK and Europe. Our regulatory permissions and governance structure are the foundations of our business. We are regulated and supervised as an Authorised Electronic Money Institution (AEMI) in the UK by the Financial Conduct Authority and in the Netherlands by De Nederlandsche Bank. Our founding team has a wealth of experience in the payments industry and in growing successful businesses. Modulr is backed by the venture arms of payments giants PayPal and FIS, as well as growth investors Blenheim Chalcot, General Atlantic, Frog Capital and Highland Europe. Modulr now has over 400 employees spread globally across offices in London, Edinburgh, Amsterdam, and Mumbai. Modulr values Building the extraordinary; going that extra mile. Owning the opportunity; being passionate and proud of the time you invest. Moving at pace; reaching goals faster whilst supported on your career journey. Achieving it together; working collaboratively and being a Modulite. The team The Data team at Modulr is a dynamic and innovative group that is responsible for managing Modulr's data warehouse, reporting, analytics, and data science capabilities. This role will report to the Principal Data Engineer and will work closely with Product Managers, Business Stakeholders and other cross-functional teams.
You will have the opportunity to mentor other data team members and users, and contribute to the growth and development of the team. Summary The Data Engineer is a vital role within Modulr, and this role will support the continuous improvement and innovation of our data platform, ensuring processes are robust, efficient and scalable. Specific duties Extract and integrate data from various sources, including APIs, internal databases, and third-party platforms. Design and build efficient analytical data models using dimensional modelling methodologies and best practices. Build and maintain semantic models to enable self-service access to data for internal users. Write clean, maintainable, and well-documented code following best practices. Write and execute tests and data quality checks. Collaborate with cross-functional teams to understand data requirements and develop scalable and effective solutions. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery and re-designing infrastructure for greater scalability. About You A successful Data Engineer will have a track record of delivering results in a fast-moving business and hence be comfortable with change and uncertainty. Excellent stakeholder management experience is essential to being successful in this role. 4+ years of experience in a Data Engineering role. Extensive experience with Python and SQL. Excellent knowledge of data architecture, data modelling and ETL methodologies. Experience using ETL and data orchestration tools. Experience integrating with BI tools and building semantic models. Proven experience in supporting and working with cross-functional teams in a dynamic environment. Comfortable working in fast-paced, agile environments with a focus on getting things done efficiently. Understanding of agile methodologies and practices. Nice to Have Experience using Snowflake. Experience using DBT (or similar). Experience using semantic layer tools (Cube.dev, AtScale or similar). Experience using PowerBI, Streamlit. Experience using Data Quality tools (Soda, Great Expectations or similar). Building and using CI/CD pipelines. Understanding of AI/ML and GenAI frameworks. Experience with AWS (if not, then other cloud platforms). ModInclusion At Modulr, we are working hard to build a more positive, diverse and inclusive culture that helps everyone feel they belong and can truly bring their whole self to work. Not only is it the right thing to do for everyone in the Modulr team, it's also the right thing to do for our business, the community we operate in and attracting future talent. As part of our approach, we actively welcome applications from candidates with diverse backgrounds. By submitting your CV you understand that we have a legitimate interest to use your personal data for the purposes of assessing your eligibility for this role. This means that we may use your personal data to contact you to discuss your CV or arrange an interview, or transfer your CV to the hiring manager(s) of the role you have applied for. You can ask us at any time to remove your CV from our database by emailing peopleops@modulrfinance.com - but please note that this means we will no longer consider you for the role you have applied for.
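The dimensional-modelling duty above follows the classic fact/dimension split. As an illustrative sketch only (the record layout, column names, and surrogate-key scheme below are hypothetical, not Modulr's schema), separating raw payment records into a fact table and a customer dimension might look like this:

```python
# Sketch of a dimensional-modelling step in pandas: split raw payment
# records into a fact table plus a customer dimension keyed by a
# surrogate id. All column names are illustrative assumptions.
import pandas as pd

raw = pd.DataFrame({
    "payment_id": [1, 2, 3],
    "customer_name": ["Acme Ltd", "Beta BV", "Acme Ltd"],
    "amount": [120.0, 75.5, 300.0],
})

# Dimension: one row per distinct customer, with a generated surrogate key.
dim_customer = (
    raw[["customer_name"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact: replace the natural key with the surrogate key from the dimension.
fact_payments = raw.merge(dim_customer, on="customer_name")[
    ["payment_id", "customer_key", "amount"]
]

print(dim_customer)
print(fact_payments)
```

In production the same split is usually expressed as dbt models over the warehouse rather than in-memory dataframes, but the key idea, surrogate keys in dimensions referenced by slim fact tables, is identical.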

Posted 1 month ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Title: Senior Adobe Solution Architect Date: 1 Jul 2025 Location: Bangalore, KA, IN Job Description We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com Looking to jump-start your career? We understand how important the first few years of your career are, as they create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven. We enable healthcare organizations to be future-ready, and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. If this excites you, then apply below. Role: Senior Adobe Solution Architect Description: Key Responsibilities: Serve as a senior Adobe technology expert, architecting robust, scalable Adobe Experience Cloud solutions tailored to the pharma and life sciences industry. Design and deliver end-to-end digital solutions across Adobe Experience Manager (AEM), Adobe Analytics, Adobe Target, and Campaign. Translate business needs into technical architectures and solution designs in compliance with healthcare industry standards and regulatory requirements. Collaborate with product, delivery, and client engagement teams to ensure successful implementation and integration of Adobe solutions. Conduct technical workshops and stakeholder sessions to define solution strategies, integration requirements, and data architecture. Lead RFP solutioning and estimation efforts, working closely with sales, pre-sales, and marketing teams to support new business development. Mentor and coach junior architects and technical leads, promoting technical excellence and innovation. Ensure adherence to best practices, governance, and security protocols across Adobe solution implementations. Stay current with Adobe product updates, releases, and industry trends to advise clients on future-ready strategies. Drive innovation by developing reusable assets, accelerators, and frameworks that enhance delivery efficiency. Must Have 10+ years of hands-on experience with Adobe Experience Cloud solutions, including deep expertise in AEM Sites and Assets, Adobe Analytics, and Target. Proven track record in architecting complex digital solutions and leading Adobe platform implementations, preferably in pharma/life sciences. Strong understanding of martech ecosystems, data integration, and personalization strategies. Adept at stakeholder management, with experience working with C-level executives and cross-functional teams. Experience with Agile, DevOps, and CI/CD best practices. Excellent communication, presentation, and problem-solving skills. Adobe Certified Expert (e.g., AEM Architect, Adobe Analytics Business Practitioner) preferred.
Good to have: PMP, ITIL, or Agile certifications are a plus.

Posted 1 month ago

Apply

11.0 - 17.0 years

35 - 40 Lacs

Chennai

Work from Office

Senior Management | Full-Time | Supply Chain and Logistics Are you a visionary leader ready to transform the future of renewable energy with data and AI? At Vestas, we're looking for a Vice President of AI and Data Products to accelerate execution of our Data & AI strategy and deliver impact on a global scale. In this role, you will collaborate closely with senior leadership and digital experts to rethink how we leverage data and AI, positioning Vestas at the forefront of digital transformation in the energy sector. If this sounds interesting, we'd like to hear from you. Digital Solutions & Development > Data Domain & AI At Vestas, the Data & AI unit is crucial to our digital transformation, cultivating innovation and operational excellence by leveraging data and artificial intelligence. This essential team plays a vital role in enabling strategic decision-making, optimising business processes, and delivering measurable value throughout the organisation. With primary hubs in Aarhus and Copenhagen (Denmark) and Chennai (India), the unit consists of 75 skilled professionals, including product owners, chapter leads, business analysts, data engineers, information architects, and Scrum masters. Together, they form a collaborative ecosystem that bridges technology and business to unlock new opportunities. In your role as Vice President of AI and Data Products, you will oversee this forward-thinking organisation, transforming the landscape of data and AI at Vestas. Alongside your leadership team, you will foster a culture of collaboration, accountability, simplicity, and passion, empowering the team, together with our full digital capacity, to reach ambitious goals and accelerate our journey toward autonomous products and operations. Responsibilities Oversee the development and operations of Data & AI products to rethink Vestas' business and drive automation and profitability. Ensure quality and scalability across our data and AI practices within digital transformation initiatives. Partner with senior business leaders to identify high-impact opportunities and deliver AI-driven solutions that create tangible business value. Act as a visible leader and role model for Data & AI across Vestas, influencing stakeholders and driving adoption. Launch and scale reusable, reliable data products built on a modular architecture and governed by enterprise business objects. Deliver on Vestas' AI Big Bets: multi-year AI transformations that create competitive advantage. Build and lead an effective, cross-border team, optimising productivity, quality, and speed to meet growing demands. Establish effective partnerships and networks to accelerate value and impact. Competencies Strategic thinker with solid business acumen and the ability to translate complex data challenges into business opportunities. Deep knowledge of data architecture, governance, and AI/ML technologies. Focused on promoting creativity, prioritising continuous advancement, and generating valuable contributions. Extensive experience leading data and AI functions within large-scale digital transformation programs, with a strong track record of delivering enterprise-grade data products and AI solutions. Proficient in managing relationships with stakeholders and effectively communicating across all levels. Proven experience in building and scaling data and AI teams within global, matrix-driven organisations, while managing large, diverse teams across multiple geographies.
Fluent in English; additional languages are highly beneficial. Additional information The work location for this position is Chennai, India, or Copenhagen or Aarhus, Denmark. Applications are handled on an ongoing basis. Please apply online with your letter of motivation and CV as soon as possible, but no later than 1st August 2025. For any additional information, please reach out to Vips Patel, Vippa@vestas.com BEWARE - RECRUITMENT FRAUD It has come to our attention that there are a number of fraudulent emails from people pretending to work for Vestas. Read more via this link: https://www.vestas.com/en/careers/our-recruitment-process DEIB Statement At Vestas, we recognise the value of diversity, equity, and inclusion in driving innovation and success. We strongly encourage individuals from all backgrounds to apply, particularly those who may hesitate due to their identity or feel they do not meet every criterion. As our CEO states, "Expertise and talent come in many forms, and a diverse workforce enhances our ability to think differently and solve the complex challenges of our industry". Your unique perspective is what will help us power the solution for a sustainable, green energy future. About Vestas Across the globe, we have installed more wind power than anyone else. We consider ourselves pioneers within the industry, as we continuously aim to design new solutions and technologies to create a more sustainable future for all of us. With more than 185 GW of wind power installed worldwide and 40+ years of experience in wind energy, we have an unmatched track record demonstrating our expertise within the field. With 30,000 employees globally, we are a diverse team united by a common goal: to power the solution - today, tomorrow, and far into the future. Vestas promotes a diverse workforce which embraces all social identities and is free of any discrimination. We commit to creating and sustaining an environment that acknowledges and harvests different experiences, skills, and perspectives.

Posted 1 month ago

Apply

15.0 - 20.0 years

45 - 55 Lacs

Hyderabad

Work from Office

Silicon Labs (NASDAQ: SLAB) is the leading innovator in low-power wireless connectivity, building embedded technology that connects devices and improves lives. Merging cutting-edge technology into the world's most highly integrated SoCs, Silicon Labs provides device makers the solutions, support, and ecosystems needed to create advanced edge connectivity applications. Headquartered in Austin, Texas, Silicon Labs has operations in over 16 countries and is the trusted partner for innovative solutions in the smart home, industrial IoT, and smart cities markets. Learn more at www.silabs.com . What we are looking for: A Director of Data Analytics - IT at Silicon Labs, who will lead the strategy, governance, and execution of data and analytics initiatives across the enterprise. Reporting to the Chief Information Officer, they will oversee a global team, deliver trusted insights, and support both strategic and day-to-day reporting needs for Sales, Operations, Finance, and other key functions. Their leadership will ensure data quality and integrity and enable data-driven decision-making at all levels of the organization. This role is expected to define the data strategy for AI readiness with ROI in mind. Meet The Team: Our global Data Analytics team is a high-impact group embedded within the IT organization at Silicon Labs. Spanning locations across North America, Europe, and Asia, the team partners cross-functionally with Sales, Marketing, Operations, Finance, Engineering, and HR to deliver trusted, actionable insights. From real-time dashboards to long-term data governance, we enable data-informed decisions that drive business success. We're a collaborative, technically skilled team that thrives on solving complex data challenges using tools like Azure, SQL, Tableau, and Python while fostering a culture of mentorship, innovation, and continuous improvement. Key Responsibilities: Lead & Shape Strategy Define and execute the global data analytics strategy for IT, driving alignment with Silicon Labs' overall business goals. Build, mentor, and grow a high-performing team of data analysts, data engineers, and BI developers. Promote a strong data-driven culture that empowers teams across IT and the broader organization. Drive Data Architecture & Governance Oversee the design and management of scalable, secure data architectures and pipelines, both cloud-based and on-premises. Implement and enforce best practices for data governance, privacy, and quality to ensure trusted, compliant data. Own and optimize enterprise data warehousing, reporting platforms, and analytics tools. Deliver Business Intelligence & Advanced Analytics Lead the delivery of comprehensive BI solutions, including operational, financial, and executive dashboards. Enable cutting-edge analytics initiatives such as predictive modeling, machine learning, and self-service analytics. Collaborate with business partners to transform strategic objectives into clear, actionable insights. Manage Programs & Ensure Impact Drive multiple analytics projects simultaneously, ensuring on-time delivery with measurable outcomes. Work cross-functionally to prioritize analytics use cases based on business value and ROI. Ensure analytics platforms and solutions are highly available, scalable, and performant. Skills you will need: Proven experience with cloud data platforms such as Azure, AWS, or GCP, including expertise in data lakes and enterprise data warehouses.
Experience building lakehouse platforms with technologies like Databricks or Snowflake. Hands-on experience guiding teams to design architectures that prepare data for future AI/ML models. Strong proficiency in tools and technologies including Power BI, Tableau, Snowflake, SQL, Python, and modern ETL frameworks. Demonstrated success leading data transformation programs in mid- to large-scale, global organizations. Ability to approach data and systems from an enterprise architecture perspective, ensuring alignment with overall business strategy. Experience in semiconductor, IoT, or high-tech industries is highly desirable. Education and/or Experience: 15+ years of experience in IT, with at least 8 years in a leadership role focused on data analytics or business intelligence. Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field. MBA or equivalent business training is a plus. Benefits & Perks: Not only will you be joining a highly skilled and tight-knit team where every engineer makes a significant impact on the product; we also strive for excellent work/life balance and to make our environment welcoming and fun. Equity Rewards (Restricted Stock Units) Employee Stock Purchase Plan (ESPP) Insurance plans with Outpatient cover National Pension Scheme (NPS) Flexible work policy Childcare support Silicon Labs is an equal opportunity employer and values the diversity of our employees. Employment decisions are made on the basis of qualifications and job-related criteria without regard to race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status, or any other characteristic protected by applicable law.

Posted 1 month ago

Apply

13.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

[{"Salary":null , "Remote_Job":false , "Posting_Title":"Data Architect" , "Is_Locked":false , "City":"Bangalore" , "Industry":"Technology" , "Job_Description":" KeyResponsibilities: Design and architect end-to-end data solutions usingMicrosoft Fabric, Azure Data Factory, Azure Synapse Analytics, and other Azuredata services Develop comprehensive data architecture blueprints,including logical and physical data models Create data integration patterns and establish bestpractices for data ingestion, transformation, and consumption Design data lake and lakehousearchitectures optimized for performance, cost, and governance Lead implementation of Microsoft Fabric solutions includingData Factory, Data Activator, Power BI, and Real-Time Analytics Design and implement medallion architecture (Bronze,Silver, Gold layers) within Fabric Optimize OneLake storage and data organization strategies Configure and manage Fabricworkspaces, capacity, and security models Architect complex ETL/ELT pipelines using Azure DataFactory and Fabric Data Factory Design real-time and batch data processing solutions Implement data quality frameworks andmonitoring solutions RequiredQualifications: Overall, 13-15 years of experience; 5+ years ofexperience in data architecture and analytics solutions Hands-on experience with MicrosoftFabric, Expert-level proficiency in Azure data services(Azure Data Factory, Synapse Analytics, Azure SQL Database, Cosmos DB) Strong experience with Power BI development andadministration Proficiency in SQL, Python, and/or Scala for dataprocessing Experience with Delta Lake and Apache Spark Proficiency in data cataloging tools and techniques Experience in data governance using Purview or UnityCatalog like tools Expertise in Azure DataBricks in conjunction with AzureData Factory and Synapse Implementation and optimization using Medallionarchitecture Experience with EventHub and IoT data (streaming) Strong understanding of Azure cloud architecture andservices Knowledge of Git, Azure DevOps, andCI/CD pipelines for data solutions Understanding of containerization andorchestration technologies Hands-on experience with Fabric Data Factory pipelines Experience with Fabric Data Activator for real-timemonitoring Knowledge of Fabric Real-Time Analytics (KQL databases) Understanding of Fabric capacity management andoptimization Experience with OneLake and Fabric

Posted 1 month ago

Apply