
21 Data Mesh Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the employer's job portal.

4.0 - 8.0 years

0 Lacs

Delhi

On-site

The ideal candidate should possess extensive expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms such as Databricks or Snowflake. You will be responsible for designing scalable data models, managing reliable data workflows, and ensuring the integrity and performance of critical financial datasets. Collaboration with engineering, analytics, product, and compliance teams is a key aspect of this role.

Responsibilities:
- Design, implement, and maintain logical and physical data models for transactional, analytical, and reporting systems.
- Develop and oversee scalable ETL/ELT pipelines to process large volumes of financial transaction data.
- Optimize SQL queries, stored procedures, and data transformations for enhanced performance.
- Create and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
- Architect data lakes and warehouses utilizing platforms such as Databricks, Snowflake, BigQuery, or Redshift.
- Ensure adherence to data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
- Work closely with data engineers, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Conduct data profiling, validation, and quality assurance to maintain clean and consistent data.
- Maintain comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications:
- Proficiency in advanced SQL, including query tuning, indexing, and performance optimization.
- Experience developing ETL/ELT workflows with tools like Spark, dbt, Talend, or Informatica.
- Familiarity with data orchestration frameworks such as Airflow, Dagster, or Luigi.
- Hands-on experience with cloud-based data platforms like Databricks, Snowflake, or similar technologies.
- Deep understanding of data warehousing principles such as star/snowflake schemas and slowly changing dimensions.
- Knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
- Strong analytical and problem-solving skills in high-scale environments.

Preferred Qualifications:
- Exposure to real-time data pipelines (e.g., Kafka, Spark Streaming).
- Knowledge of data mesh or data fabric architecture paradigms.
- Certifications in Snowflake, Databricks, or relevant cloud platforms.
- Familiarity with Python or Scala for data engineering tasks.
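The "slowly changing dimensions" requirement above refers to versioning dimension rows instead of overwriting them, so history is preserved. A minimal Type 2 sketch in plain Python; the `customer_id`/`segment` fields and in-memory list are illustrative only, not anything from the posting:

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Apply a Type 2 slowly-changing-dimension update: when a tracked
    attribute changes, close the current row and append a new version."""
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["segment"] == incoming["segment"]:
                return dimension               # no change, nothing to do
            row["is_current"] = False          # close the old version
            row["valid_to"] = today
    dimension.append({
        "customer_id": incoming["customer_id"],
        "segment": incoming["segment"],
        "valid_from": today,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"customer_id": 1, "segment": "retail",
        "valid_from": date(2024, 1, 1), "valid_to": None, "is_current": True}]
scd2_apply(dim, {"customer_id": 1, "segment": "premium"}, date(2024, 6, 1))
# dim now holds two versions: the closed "retail" row and a current "premium" row
```

In a real warehouse the same logic would be a MERGE statement or a dbt snapshot rather than a Python loop.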

Posted 1 day ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform, Data Engineering teams, and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across various teams within Blackbaud, and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and beyond collaboratively.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .NET/Java and microservice architecture.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.

As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that develops high-quality data architecture solutions for various software applications, platforms, and data products. You will drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.

Responsibilities:
- Represent the data architecture team at technical governance bodies and provide feedback on proposed improvements to data architecture governance practices.
- Evaluate new and current technologies using existing data architecture standards and frameworks.
- Regularly provide technical guidance and direction to support the business and its technical teams, contractors, and vendors.
- Design secure, high-quality, scalable solutions and review architecture solutions designed by others.
- Drive data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes.
- Serve as a function-wide subject matter expert in one or more areas of focus.
- Actively contribute to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle.
- Influence peers and project decision-makers to consider the use and application of leading-edge technologies.
- Advise junior architects and technologists.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge of the data architecture discipline and its solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design).
- Practical cloud-based data architecture and deployment experience, preferably on AWS.
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores.
- Advanced skills in one or more data engineering disciplines, e.g., streaming, ELT, event processing.
- Ability to tackle design and functionality problems independently with little to no oversight.
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience; card and banking a big plus.
- Practical experience with modern data processing technologies, e.g., Kafka streaming, dbt, Spark, Airflow.
- Practical experience with data mesh and/or data lakes.
- Practical experience in machine learning/AI; Python development a big plus.
- Practical experience with graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin.
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis.
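The "streaming, ELT, event processing" discipline this posting names often starts with windowed aggregation. A toy, library-free sketch of a tumbling-window count; the event shapes and keys are invented for illustration, and a real system would use Kafka Streams, Flink, or Spark Structured Streaming instead:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count events per key per window -- the simplest stream aggregation."""
    windows = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        windows[(window_start, key)] += 1
    return dict(windows)

events = [(0, "card"), (5, "card"), (12, "loan"), (14, "card"), (21, "loan")]
counts = tumbling_window_counts(events, 10)
# counts == {(0, "card"): 2, (10, "loan"): 1, (10, "card"): 1, (20, "loan"): 1}
```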

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be joining Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd, as a Data Strategy Consultant focusing on Data Lake. The role is based in Bangalore and follows a hybrid work model. You should have 5-7 years of relevant experience and be ready to join within a notice period of immediate to 15 days. Your primary responsibilities will include providing data strategy and consulting services, with a focus on Data Lake implementation. Expertise in Data Lake, Data Mesh, PowerPoint, and consulting is essential; strong communication skills are a plus. The interview process will be conducted virtually, and the work model is remote. If you are excited about this opportunity, please share your resume with netra.s@twsol.com.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Software Architect at our organization, you will own the software architecture vision, principles, and technology standards across the organization. Working closely with engineering leadership and product management, you will craft roadmaps and success criteria to ensure alignment with the wider target architecture.

Your primary responsibilities will include developing and leading the architectural model for a unit, directing and leading teams, and designing interaction points between application components and applications. You will evaluate and recommend toolsets, standardize the use of third-party components and libraries, and help developers understand business and functional requirements. Additionally, you will periodically review the scalability and resiliency of application components, recommend steps for refinement and improvement, and enable reusable components to be shared across the enterprise.

In this role, you will devise technology and architecture solutions that propel engineering excellence across the organization, simplify complex problems, and address key aspects such as portability, usability, scalability, and security. You will also extend your influence across the organization, leading distributed teams to make strong architecture decisions independently through documentation, mentorship, and training.

Moreover, you will drive engineering architecture definition using multi-disciplinary knowledge, including cloud engineering, middleware engineering, data engineering, and security engineering. Understanding how to apply Agile, Lean, and principles of fast flow to drive engineering department efficiency and productivity will be essential. You will provide and oversee high-level estimates for scoping large features using Wideband Delphi, and actively participate in the engineering process to evolve an architecture practice that supports the department.

To excel in this role, you should be able to depict technical information conceptually, logically, and visually, and bring a strong customer and business focus. Your leadership, communication, and problem-solving skills will be crucial for influencing others and retaining composure in environments of rapid change. A forward-thinking mindset to keep the technology modern for value delivery will be key.

Qualifications:
- Minimum of 10 years of software engineering experience, primarily in back-end or full-stack development.
- At least 5 years of experience as a Software Senior Architect or Principal Architect using microservices.
- Experience in a Lean Agile development environment.
- Deep understanding of event-driven architectures and knowledge of REST, gRPC, and GraphQL architecture.
- Extensive background in public cloud platforms, modular JavaScript frameworks, databases, caching solutions, and search technologies.
- Strong skills in containerization, including Docker, Kubernetes, and service mesh.
- Ability to articulate an architecture or technical design concept.
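Wideband Delphi, mentioned in this posting, estimates work by iterating anonymous expert estimates until they converge. A hedged sketch of one summarisation round; the convergence rule (relative range vs. mean) and the threshold are assumptions for illustration, not part of any standard:

```python
def delphi_round(estimates, spread_threshold=0.2):
    """One Wideband Delphi round: anonymised estimates are summarised,
    and the panel re-estimates until the spread is acceptably narrow."""
    mean = sum(estimates) / len(estimates)
    spread = (max(estimates) - min(estimates)) / mean  # relative disagreement
    return {"mean": round(mean, 1), "spread": round(spread, 2),
            "converged": spread <= spread_threshold}

round1 = delphi_round([8, 13, 21, 10])   # wide disagreement -> iterate again
round2 = delphi_round([11, 12, 13, 12])  # narrow spread -> accept the estimate
```

The real process also circulates assumption lists between rounds; the math above is only the aggregation step.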

Posted 1 week ago

Apply

8.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Job Title: IT Lead Engineer/Architect (Azure Lake)
Years of Experience: 8-10 years
Mandatory Skills: Azure, Data Lake, Databricks, SAP BW

Key Responsibilities:
- Lead the development and maintenance of the data architecture strategy, including design and architecture validation reviews with all stakeholders.
- Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure, high-performing, and cost-effective solutions.
- Establish comprehensive data governance frameworks and promote best practices for data quality and enterprise compliance.
- Act as a technical leader on complex data projects and drive the adoption of new technologies, including AI/ML.
- Collaborate extensively with business stakeholders to translate needs into architectural solutions and define project scope.
- Support a wide range of data lake and lakehouse technologies (SQL, Synapse, Databricks, Power BI, Fabric).

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 3 years in a leadership role in data architecture.
- Proven ability leading architecture/AI/ML projects from conception to deployment.
- Deep knowledge of cloud data platforms (Microsoft Azure, Fabric, Databricks), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- Experience designing and implementing AI solutions within cloud architecture.
- 3 years as a project lead in large-scale projects.
- 5 years in development with Azure, Synapse, and Databricks.
- Excellent communication and stakeholder management skills.
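Lakehouse platforms such as Databricks commonly organise data into bronze (raw), silver (cleansed), and gold (business-level) layers, often called the medallion architecture. A pure-Python simulation of that flow, with invented field names, just to show the shape of each layer; in practice this would be Spark or SQL over Delta tables:

```python
def to_silver(bronze_rows):
    """Silver layer: cleanse raw bronze records -- drop rows missing keys,
    normalise types, deduplicate on the business key."""
    seen, silver = set(), []
    for r in bronze_rows:
        if not r.get("order_id") or r.get("amount") is None:
            continue                       # quarantine malformed records
        if r["order_id"] in seen:
            continue                       # deduplicate on business key
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"],
                       "region": r.get("region", "UNKNOWN").upper(),
                       "amount": float(r["amount"])})
    return silver

def to_gold(silver_rows):
    """Gold layer: business-level aggregate, here revenue per region."""
    gold = {}
    for r in silver_rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

bronze = [{"order_id": "A1", "region": "emea", "amount": "100.5"},
          {"order_id": "A1", "region": "emea", "amount": "100.5"},  # duplicate
          {"order_id": None, "amount": 50},                          # malformed
          {"order_id": "B2", "amount": 40}]                          # no region
gold = to_gold(to_silver(bronze))
# gold == {"EMEA": 100.5, "UNKNOWN": 40.0}
```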

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

The role of Data Lead at LERA Technologies involves owning the data strategy, architecture, and engineering roadmap for key client engagements.

Responsibilities:
- Lead the design and development of scalable, secure, and high-performance data pipelines, marts, and warehouses.
- Mentor a team of data engineers and collaborate with BI/reporting teams and solution architects.
- Oversee data ingestion, transformation, consolidation, and validation across cloud and hybrid environments.
- Champion best practices for data quality, data lineage, and metadata management.
- Evaluate emerging tools, technologies, and frameworks to enhance platform capabilities.
- Engage with business and technical stakeholders to translate analytics needs into scalable data solutions.
- Monitor performance and optimize storage and processing layers for efficiency and scalability.

Required qualifications:
- 7+ years of experience in Data Engineering, including proficiency in SQL/PLSQL/TSQL, ETL development, and data pipeline architecture.
- Strong command of ETL tools such as SSIS or equivalent, and of data warehousing concepts.
- Expertise in data modeling, architecture, and integration frameworks.
- Experience leading data teams and managing end-to-end data delivery across projects.
- Hands-on knowledge of BI tools such as Power BI, Tableau, SAP BO, or OBIEE, and their backend integration.
- Proficiency in big data technologies and cloud platforms such as Azure, AWS, or GCP.
- Programming experience in Python, Java, or equivalent languages.
- Proven experience in performance tuning and optimization of large datasets.
- Strong understanding of data governance, data security, and compliance best practices.
- Excellent communication, stakeholder management, and team mentoring abilities.

Desirable skills:
- Leadership experience in building and managing high-performing data teams.
- Exposure to data mesh, data lakehouse architectures, or modern data platforms.
- Experience defining and enforcing data quality and lifecycle management practices.
- Familiarity with CI/CD for data pipelines and infrastructure-as-code.

At LERA Technologies, you will have the opportunity to embrace innovation, creativity, and experimentation while significantly impacting our clients' success across various industries. You will thrive in a workplace that values diversity and inclusive excellence, benefit from extensive opportunities for career advancement, and lead cutting-edge projects with an agile and visionary team. If you are ready to lead data-driven transformation and shape the future of enterprise data, apply now to join LERA Technologies as a Data Lead.
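The data-quality practices this posting champions usually reduce to machine-checkable column rules run against each batch. A minimal rule-based validator; the rules and column names are invented for illustration, and production teams would reach for a framework like Great Expectations or dbt tests:

```python
def validate(rows, rules):
    """Run simple column-level data-quality rules and report
    (row_index, column) pairs for every failed check."""
    failures = []
    for i, row in enumerate(rows):
        for column, check in rules.items():
            if not check(row.get(column)):
                failures.append((i, column))
    return failures

rules = {
    "id":     lambda v: isinstance(v, int) and v > 0,
    "email":  lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: v is not None and v >= 0,
}
rows = [{"id": 1, "email": "a@x.com", "amount": 10.0},
        {"id": -3, "email": "broken", "amount": None}]
bad = validate(rows, rules)
# bad == [(1, "id"), (1, "email"), (1, "amount")]
```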

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and deployment of enterprise-grade cloud data solutions. The ideal candidate has a robust background in data architecture, cloud data platforms, and Snowflake implementation; hands-on experience in end-to-end data pipeline and data warehouse design is essential.

Responsibilities:
- Lead the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions.
- Define data modeling standards, best practices, and governance frameworks.
- Design and optimize ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or dbt.
- Collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions.
- Implement data security, privacy, and role-based access controls within Snowflake.
- Guide development teams on performance tuning, query optimization, and cost management in Snowflake.
- Ensure high availability, fault tolerance, and compliance across data platforms.
- Mentor developers and junior architects on Snowflake capabilities.

Skills & Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience.
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization.
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP).
- Hands-on experience with ETL/ELT tools such as ADF, Informatica, Talend, dbt, or Matillion.
- Good understanding of data lakes, data mesh, and modern data stack principles.
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus.
- Solid knowledge of data governance, metadata management, and cataloging is beneficial.

Preferred qualifications:
- Snowflake certification (e.g., SnowPro Core or Advanced Architect).
- Familiarity with Apache Airflow, Kafka, or event-driven data ingestion.
- Knowledge of data visualization tools such as Power BI, Tableau, or Looker.
- Experience in healthcare, BFSI, or retail domain projects.

If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
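Role-based access control in Snowflake is granted per object: a read-only reporting role typically needs USAGE on the database and schema plus SELECT on tables. A small sketch that generates those GRANT statements; the database, schema, and role names are placeholders, and the exact grant set varies by deployment:

```python
def read_only_grants(database, schema, role):
    """Emit the Snowflake GRANT statements a read-only reporting role
    typically needs: usage on the containers, select on the tables."""
    fq = f"{database}.{schema}"
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {fq} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq} TO ROLE {role};",
        # FUTURE grant covers tables created after the role is provisioned
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {fq} TO ROLE {role};",
    ]

stmts = read_only_grants("ANALYTICS", "MARTS", "REPORTING_RO")
for s in stmts:
    print(s)
```

Generating grants from code keeps the RBAC model reviewable and repeatable rather than hand-typed per environment.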

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together.

Individual Accountabilities

Collaboration:
- Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs.
- Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented.
- Contributes to prototype or proof-of-concept efforts.
- Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions.
- Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution.
- Suggests architecture designs with the Ontologies and MDM teams.

Technical Skills & Design:
- Significant experience working with structured and unstructured data at scale, and comfort with a variety of stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses.
- Deep understanding of modern data services in leading cloud environments, able to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity.
- Creates data architecture artifacts such as architecture diagrams, data models, and design documents.
- Guides domain architects on the value of a modern data and analytics platform.
- Researches, designs, tests, and evaluates new technologies, platforms, and third-party products.
- Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a great asset.
- Expert troubleshooting skills and experience.

Leadership:
- Mentors aspiring data architects, typically operating in data engineering and software engineering roles.

Key Shared Accountabilities:
- Leads medium to large data services projects.
- Provides technical partnership to product owners.
- Shares stewardship, with domain architects, of the Arcadis data ecosystem.
- Actively participates in the Arcadis Tech Architect community.

Key Profile Requirements:
- Minimum of 7 years of experience designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines.
- Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies to help guide and steer decisions.
- Experience working in large-scale development and cloud environments.

Why Arcadis: We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark, on your career, your colleagues, your clients, your life and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy.

Our Commitment to Equality, Diversity, Inclusion & Belonging: We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people.

Posted 3 weeks ago

Apply

10.0 - 18.0 years

9 - 19 Lacs

Pune

Work from Office

Role: Cloud Solution Architect
Location: Pune
Experience: 10+ years

Requirements:
- Working experience across the financial or banking sector for at least 10 years in roles relevant to the understanding and management of an asset management architecture.
- Knowledgeable in cloud services including databases (Azure SQL, PostgreSQL), caching, object and block storage, scaling, load balancers, networking, etc.
- Work across Asset Management to review requirements, ensuring fit-for-purpose solutions with high levels of security, scalability, and performance.
- Drive the next phase of our enterprise architecture and data architecture.
- Ensure the architecture of solutions is aligned to business and IT requirements.
- Experience of architecture patterns such as event-driven design, microservices, data mesh, etc.
- Understanding of security best practices, including encryption, firewalls, and identity management.
- Proven development experience, including engineering best practices.
- Play a part in the wider collective management of technology; based on significant relevant industry experience, develop the architecture agenda and help to drive simplification, efficiency, and effectiveness.
- Maintain high-level design documents, architectural standards, and other technical documentation.
- Awareness of business continuity and disaster recovery concepts.
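The event-driven pattern this posting lists decouples producers from consumers: publishers emit events to a topic without knowing who reacts. A minimal in-process publish/subscribe sketch, not a production bus (a real system would sit on Kafka or a cloud event service, and the topic and event fields here are made up):

```python
class EventBus:
    """Minimal in-process publish/subscribe bus illustrating
    event-driven decoupling between producers and consumers."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, event):
        # every handler registered for the topic sees the event, in order
        for handler in self._subscribers.get(topic, []):
            handler(event)

bus = EventBus()
audit_log = []
bus.subscribe("trade.settled", lambda e: audit_log.append(("audit", e["id"])))
bus.subscribe("trade.settled", lambda e: audit_log.append(("notify", e["id"])))
bus.publish("trade.settled", {"id": "T-42"})
# audit_log == [("audit", "T-42"), ("notify", "T-42")]
```

Adding a new consumer is just another `subscribe` call; the publisher never changes, which is the core appeal of the pattern.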

Posted 1 month ago

Apply

8.0 - 10.0 years

12 - 16 Lacs

Pune

Work from Office

Requirements:
- Working experience across the financial or banking sector for at least 10 years in roles relevant to the understanding and management of an asset management architecture.
- Knowledgeable in cloud services including databases (Azure SQL, PostgreSQL), caching, object and block storage, scaling, load balancers, networking, etc.
- Work across Asset Management to review requirements, ensuring fit-for-purpose solutions with high levels of security, scalability, and performance.
- Drive the next phase of our enterprise architecture and data architecture.
- Ensure the architecture of solutions is aligned to business and IT requirements.
- Experience of architecture patterns such as event-driven design, microservices, data mesh, etc.
- Understanding of security best practices, including encryption, firewalls, and identity management.
- Proven development experience, including engineering best practices.
- Play a part in the wider collective management of technology; based on significant relevant industry experience, develop the architecture agenda and help to drive simplification, efficiency, and effectiveness.
- Maintain high-level design documents, architectural standards, and other technical documentation.
- Awareness of business continuity and disaster recovery concepts.

Note: Notice period should not be more than 10-15 days.

Posted 1 month ago

Apply

6.0 - 10.0 years

7 - 14 Lacs

Bengaluru

Hybrid

Roles and Responsibilities:
- Architect and incorporate an effective data framework enabling an end-to-end data solution.
- Understand business needs, use cases, and drivers for insights, and translate them into detailed technical specifications.
- Create epics, features, and user stories with clear acceptance criteria for execution and delivery by the data engineering team.
- Create scalable and robust data solution designs that incorporate governance, security, and compliance aspects.
- Develop and maintain logical and physical data models, and work closely with data engineers, data analysts, and data testers for their successful implementation.
- Analyze, assess, and design data integration strategies across various sources and platforms.
- Create project plans and timelines while monitoring and mitigating risks and controlling the progress of the project.
- Conduct daily scrum with the team, with a clear focus on meeting sprint goals and timely resolution of impediments.
- Act as a liaison between technical teams and business stakeholders.
- Guide and mentor the team on best practices for data solutions and delivery frameworks.
- Actively work with, facilitate, and support stakeholders/clients to complete User Acceptance Testing, and ensure strong adoption of the data products after launch.
- Define and measure KPIs/KRAs for features, ensuring the data roadmap is verified through measurable outcomes.

Prerequisites:
- 5 to 8 years of professional, hands-on experience building end-to-end data solutions on cloud-based data platforms, including 2+ years in a Data Architect role.
- Proven hands-on experience building pipelines for data lakes, data lakehouses, data warehouses, and data visualization solutions.
- Sound understanding of modern data technologies like Databricks, Snowflake, Data Mesh, and Data Fabric.
- Experience managing the data life cycle in a fast-paced, Agile/Scrum environment.
- Excellent spoken and written communication, receptive listening skills, and the ability to convey complex ideas clearly and concisely to technical and non-technical audiences.
- Ability to collaborate and work effectively with cross-functional teams, project stakeholders, and end users to produce quality deliverables within stipulated timelines.
- Ability to manage, coach, and mentor a team of data engineers, data testers, and data analysts.
- Strong process driver with expertise in the Agile/Scrum framework on tools like Azure DevOps, Jira, or Confluence.
- Exposure to machine learning, generative AI, and modern AI-based solutions.

Experience: Technical Lead, Data Analytics, with 6+ years of overall experience, of which 2+ years is in data architecture.

Education: Engineering degree from a Tier 1 institute preferred.

Compensation: The compensation structure will be as per industry standards.
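In a Data Mesh, which this posting names among its required technologies, each domain publishes data products with explicit, machine-checkable contracts so consumers can rely on the schema. A hedged sketch of such a contract check; the product name, owner, and fields are hypothetical:

```python
CONTRACT = {          # illustrative contract for a hypothetical data product
    "name": "orders_daily",
    "owner": "sales-domain",
    "schema": {"order_id": str, "order_date": str, "amount": float},
}

def conforms(record, contract):
    """Check that a record matches the product's declared schema --
    the kind of machine-checkable contract a data mesh relies on."""
    schema = contract["schema"]
    return (set(record) == set(schema)
            and all(isinstance(record[c], t) for c, t in schema.items()))

ok  = conforms({"order_id": "O1", "order_date": "2024-06-01", "amount": 99.0},
               CONTRACT)
bad = conforms({"order_id": "O1", "amount": "99"}, CONTRACT)  # missing field, wrong type
```

Real implementations attach such contracts to the product's published interface (e.g., as JSON Schema or Avro) and enforce them in CI, but the idea is the same: the producing domain owns and guarantees the shape of its data.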

Posted 1 month ago

Apply

4.0 - 8.0 years

15 - 22 Lacs

Hyderabad

Work from Office

Senior Data Engineer (Cloud & Modern Data Architectures)

Role Overview: We are looking for a Senior Data Engineer with expertise in ETL/ELT, data engineering, data warehousing, data lakes, and Data Mesh and Data Fabric architectures. The ideal candidate should have hands-on experience with at least one or two cloud data platforms (AWS, GCP, Azure, Snowflake, or Databricks) and a strong foundation in building PoCs, mentoring freshers, and contributing to accelerators and IPs.

Must-Have:
- 5-8 years of experience in data engineering and cloud data services.
- Hands-on with AWS (Redshift, Glue), GCP (BigQuery, Dataflow), Azure (Synapse, Data Factory), Snowflake, or Databricks.
- Strong SQL and Python or Scala skills.
- Knowledge of Data Mesh and Data Fabric principles.

Nice-to-Have:
- Exposure to MLOps, AI integrations, and Terraform/Kubernetes for DataOps.
- Contributions to open source, accelerators, or internal data frameworks.

Interested candidates, please share your CV with dikshith.nalapatla@motivitylabs.com, including the details below for a quick response:
- Total experience:
- Relevant DE experience:
- SQL experience:
- SQL rating out of 5:
- Python experience:
- Do you have experience in any two clouds (yes/no):
- Mention the cloud experience you have (AWS, Azure, GCP):
- Current role/skill set:
- Current CTC:
- Fixed:
- Payroll company (name):
- Client company (name):
- Expected CTC:
- Official notice period (if negotiable, mention up to how many days):
- Serving notice (yes/no):
- CTC of offer in hand:
- Last working day (in current organization):
- Location of the offer in hand:

Note: 5 days work from office.

Posted 1 month ago

Apply

3.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

Senior Manager Information Systems Automation What you will do We are seeking a hands-on , experienced and dynamic Technical Infrastructure Automation Manager to lead and manage our infrastructure automation initiatives. The ideal candidate will have a strong hands-on background in IT infrastructure, cloud services, and automation tools, along with leadership skills to guide a team towards improving operational efficiency, reducing manual processes, and ensuring scalability of systems. This role will lead a team of engineers across multiple functions, including Ansible Development, ServiceNow Development, Process Automation, and Site Reliability Engineering (SRE). This role will be responsible for ensuring the reliability, scalability, and security of automation services. The Infrastructure Automation team will be responsible for automating infrastructure provisioning, deployment, configuration management, and monitoring. You will work closely with development, operations, and security teams to drive automation solutions that enhance the overall infrastructures efficiency and reliability. This role demands the ability to drive and deliver against key organizational strategic initiatives, foster a collaborative environment, and deliver high-quality results in a matrixed organizational structure. Please note, this is an onsite role based in Hyderabad. Roles & Responsibilities: Automation Strategy & Leadership : Lead the development and implementation of infrastructure automation strategies. Collaborate with key collaborators (DevOps, IT Operations, Security, etc.) to define automation goals and ensure alignment with company objectives. Provide leadership and mentorship to a team of engineers, ensuring continuous growth and skill development. Infrastructure Automation : Design and implement automation frameworks for infrastructure provisioning, configuration management, and orchestration (e.g., using tools like Terraform, Ansible, Puppet, Chef, etc.). 
Manage and optimize CI/CD pipelines for infrastructure as code (IaC) to ensure seamless delivery and updates. Work with cloud providers (AWS, Azure, GCP) to implement automation solutions for managing cloud resources and services.

Process Improvement: Identify areas for process improvement by analyzing current workflows, systems, and infrastructure operations. Create and implement solutions to reduce operational overhead and increase system reliability, scalability, and security. Automate and streamline recurring tasks, including patch management, backups, and system monitoring.

Collaboration & Communication: Collaborate with multi-functional teams (Development, IT Operations, Security, etc.) to ensure infrastructure automation aligns with business needs. Regularly communicate progress, challenges, and successes to management, offering insights on how automation is driving efficiencies.

Documentation & Standards: Maintain proper documentation for automation scripts, infrastructure configurations, and processes. Develop and enforce best practices and standards for automation and infrastructure management.

What we expect of you

We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:

Master's degree with 8-10 years of experience in Observability Operations, with at least 3 years in management OR Bachelor's degree with 10-14 years of experience in Observability Operations, with at least 4 years in management OR Diploma with 14-18 years of experience in Observability Operations, with at least 5 years in management. 12+ years of experience in IT infrastructure management, with at least 4+ years in a leadership or managerial role. Strong expertise in automation tools and frameworks such as Terraform, Ansible, Chef, Puppet, or similar. Proficiency in scripting languages (e.g., Python, Bash, PowerShell). Hands-on experience with cloud platforms (AWS) and containerization technologies (Docker, Kubernetes).
Hands-on experience with Infrastructure as Code (IaC) principles and CI/CD pipeline implementation. Experience with ServiceNow Development and Administration. Solid understanding of networking, security protocols, and infrastructure design. Excellent problem-solving skills and the ability to troubleshoot complex infrastructure issues. Strong leadership and communication skills, with the ability to work effectively across teams.

Professional Certifications (Preferred): ITIL or PMP Certification; Red Hat Certified System Administrator; ServiceNow Certified System Administrator; AWS Certified Solutions Architect.

Preferred Qualifications: Strong experience with Ansible, including playbooks, roles, and modules. Strong experience with infrastructure-as-code concepts and other automation tools like Terraform or Puppet. Strong understanding of user-centered design and building scalable, high-performing web and mobile interfaces on the ServiceNow platform. Proficiency with both Windows and Linux/Unix-based operating systems. Knowledge of cloud platforms (AWS, Azure, Google Cloud) and automation techniques in those environments. Familiarity with CI/CD tools and processes, particularly with integration of Ansible in pipelines. Understanding of version control systems (Git). Strong troubleshooting, debugging, and performance optimization skills. Experience with hybrid cloud environments and multi-cloud strategies. Familiarity with DevOps practices and tools. Experience operating within a validated systems environment (FDA, European Agency for the Evaluation of Medicinal Products, Ministry of Health, etc.)

Soft Skills: Excellent leadership and team management skills. Change management expertise. Crisis management capabilities. Strong presentation and public speaking skills. Analytical mindset with a focus on continuous improvement. Detail-oriented with the capacity to manage multiple projects and priorities. Self-motivated and able to work independently or as part of a team.
Strong communication skills to effectively interact with both technical and non-technical collaborators. Ability to work effectively with global, virtual teams Shift Information: This position is an onsite role and may require working during later hours to align with business hours. Candidates must be willing and able to work outside of standard hours as required to meet business needs.
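The IaC principles this role centers on hinge on idempotency: applying the same desired state twice should produce no further changes. A minimal, tool-agnostic sketch of the "plan" step behind tools like Terraform or Ansible (all resource names and attributes here are invented for illustration, not tied to any real environment):

```python
# Hypothetical illustration of the idempotent "plan" step used by IaC
# tools: diff desired state against actual state and emit a change set.

def plan_changes(desired: dict, actual: dict) -> dict:
    """Compute create/update/delete actions, Terraform-plan style."""
    to_create = {k: v for k, v in desired.items() if k not in actual}
    to_update = {k: v for k, v in desired.items()
                 if k in actual and actual[k] != v}
    to_delete = {k: actual[k] for k in actual if k not in desired}
    return {"create": to_create, "update": to_update, "delete": to_delete}

# Invented example state: one drifted host, one missing, one undeclared
desired = {"web-01": {"size": "m5.large"}, "web-02": {"size": "m5.large"}}
actual  = {"web-01": {"size": "t3.small"}, "db-01": {"size": "r5.xlarge"}}

plan = plan_changes(desired, actual)
print(plan["create"])  # {'web-02': {'size': 'm5.large'}}
```

Re-running the diff after the plan has been applied yields an empty change set, which is the idempotency property that makes automated patch management, backups, and drift correction safe to schedule repeatedly.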

Posted 1 month ago

Apply

8.0 - 13.0 years

30 - 45 Lacs

Bengaluru

Hybrid

Job Title: Enterprise Data Architect | Immediate Joiner Experience: 8-15 Years Location: Bengaluru (Onsite/Hybrid) Joining Time: Immediate Joiners Only (0-15 Days) Job Description We are looking for an experienced Enterprise Data Architect to join our dynamic team in Bengaluru. This is an exciting opportunity to shape modern data architecture across finance and colleague (HR) domains using the latest technologies and design patterns. Key Responsibilities Design and implement conceptual and logical data models for finance and colleague domains. Define complex as-is and to-be data architectures, including transition states. Develop and maintain data standards, principles, and architecture artifacts. Build scalable solutions using data lakes, data warehouses, and data governance platforms. Ensure data lineage, quality, and consistency across platforms. Translate business requirements into technical solutions for data acquisition, storage, transformation, and governance. Collaborate with cross-functional teams for data solution design and delivery. Required Skills Strong communication and stakeholder engagement. Hands-on experience with Kimball dimensional modeling and/or Snowflake modeling. Expertise in modern cloud data platforms and architecture (AWS, Azure, or GCP). Proficient in building solutions for web, mobile, and tablet platforms. Background in Finance and/or Colleague Technology (HR systems) is a strong plus. Preferred Qualifications Bachelor's/Master's degree in Computer Science, Engineering, or a related field. 8–15 years of experience in data architecture and solution design.
Important Notes Immediate Joiners Only (Notice period max of 15 days) Do not apply if you’ve recently applied or are currently in the Xebia interview process Location: Bengaluru – candidates must be based in or open to relocating immediately To Apply Send your updated resume with the following details to vijay.s@xebia.com: Full Name: Total Experience: Current CTC: Expected CTC: Current Location: Preferred Location: Notice Period / Last Working Day (if serving notice): Primary Skill Set: LinkedIn URL: Apply now and be part of our exciting transformation journey at Xebia!

Posted 1 month ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Req ID: 323226 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Sr. Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN). Key Responsibilities: Design data platform architectures (data lakes, lakehouses, DWH) using modern cloud-native tools (e.g., Databricks, Snowflake, BigQuery, Synapse, Redshift). Architect data ingestion, transformation, and consumption pipelines using batch and streaming methods. Enable real-time analytics and machine learning through scalable and modular data frameworks. Define data governance models, metadata management, lineage tracking, and access controls. Collaborate with AI/ML, application, and business teams to identify high-impact use cases and optimize data usage. Lead modernization initiatives from legacy data warehouses to cloud-native and distributed architectures. Enforce data quality and observability practices for mission-critical workloads. Required Skills: 10+ years in data architecture, with strong grounding in modern data platforms and pipelines. Deep knowledge of SQL/NoSQL, Spark, Delta Lake, Kafka, ETL/ELT frameworks. Hands-on experience with cloud data platforms (AWS, Azure, GCP). Understanding of data privacy, security, lineage, and compliance (GDPR, HIPAA, etc.). Experience implementing data mesh/data fabric concepts is a plus. Expertise in technical solutions writing and presenting using tools such as Word, PowerPoint, Excel, Visio, etc. High level of executive presence to be able to articulate the solutions to CXO-level executives. Preferred Qualifications: Certifications in Snowflake, Databricks, or cloud-native data platforms. Exposure to AI/ML data pipelines, MLOps, and real-time data applications. Familiarity with data visualization and BI tools (Power BI, Tableau, Looker, etc.).
About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.

Posted 2 months ago

Apply

13.0 - 21.0 years

45 - 60 Lacs

Hyderabad

Hybrid

Job Description Summary: As a Data Architect, you will play a pivotal role in defining and implementing common data models, API standards, and leveraging the Common Information Model (CIM) standard across a portfolio of products deployed in Critical National Infrastructure (CNI) environments globally. GE Vernova is the leading software provider for the operations of national and regional electricity grids worldwide. Our software solutions range from supporting electricity markets, enabling grid and network planning, to real-time electricity grid operations. In this senior technical role, you will collaborate closely with lead software architects to ensure secure, performant, and composable designs and implementations across our portfolio. Job Description Grid Software (a division of GE Vernova) is driving the vision of GridOS - a portfolio of software running on a common platform to meet the fast-changing needs of the energy sector and support the energy transition. Grid Software has extensive and well-established software stacks that are progressively being ported to a common microservice architecture, delivering a composable suite of applications. Simultaneously, new applications are being designed and built on the same common platform to provide innovative solutions that enable our customers to accelerate the energy transition. This role is for a senior data architect who understands the core designs, principles, and technologies of GridOS. Key responsibilities include: Formalizing Data Models and API Standards : Lead the formalization and standardization of data models and API standards across products to ensure interoperability and efficiency. Leveraging CIM Standards : Implement and advocate for the Common Information Model (CIM) standards to ensure consistent data representation and exchange across systems. 
Architecture Reviews and Coordination : Contribute to architecture reviews across the organization as part of Architecture Review Boards (ARB) and the Architecture Decision Record (ADR) process. Knowledge Transfer and Collaboration : Work with the Architecture SteerCo and Developer Standard Practices team to establish standard practices around data modeling and API design. Documentation : Ensure that data modeling and API standards are accurately documented and maintained in collaboration with documentation teams. Backlog Planning and Dependency Management : Work across software teams to prepare backlog planning and to identify and manage cross-team dependencies when it comes to data modeling and API requirements. Key Knowledge Areas and Expertise Data Architecture and Modeling : Extensive experience in designing and implementing data architectures and common data models. API Standards : Expertise in defining and implementing API standards to ensure seamless integration and data exchange between systems. Common Information Model (CIM) : In-depth knowledge of CIM standards and their application within the energy sector. Data Mesh and Data Fabric : Understanding of data mesh and data fabric principles, enabling software composability and data-centric design trade-offs. Microservice Architecture : Understanding of microservice architecture and software development. Kubernetes : Understanding of Kubernetes, including software development in an orchestrated microservice architecture. This includes Kubernetes API, custom resources, API aggregation, Helm, and manifest standardization. CI/CD and DevSecOps : Experience with CI/CD pipelines, DevSecOps practices, and GitOps, especially in secure, air-gapped environments. Mobile Software Architecture : Knowledge of mobile software architecture for field crew operations, offline support, and near-real-time operation.
Additional Knowledge (Advantageous but not Essential) Energy Industry Technologies : Familiarity with key technologies specific to the energy industry, such as Supervisory Control and Data Acquisition (SCADA), Geospatial network modeling, etc. This is a critical role within Grid Software, requiring a broad range of knowledge and strong organizational and communication skills to drive common architecture, software standards, and principles across the organization.

Posted 2 months ago

Apply

8.0 - 10.0 years

25 - 35 Lacs

Pune

Hybrid

Warm Greetings from Dataceria Software Solutions Pvt Ltd. We are Looking For: Senior Data Scientist. Immediate joiners: send your resume to careers@dataceria.com. -------------------------------------------------------------------------------------------------------------- We are seeking a highly skilled Senior Data Scientist to lead the development of classification models and customer segmentation strategies for a major investment bank. This role is central to profiling the existing client base on legacy infrastructure and supporting business stakeholders in defining clear migration paths to modern platforms such as Data Mesh. Responsibilities: Lead the design and implementation of classification models to categorize clients based on product usage, service engagement, and behavioral patterns. Develop robust customer segmentation strategies to support personalization and migration strategies. Collaborate closely with stakeholders across business and technology to understand legacy data landscapes and future-state architecture. Oversee and mentor a small team of data scientists and analysts. Conduct advanced analysis using Python and SQL; apply machine learning techniques for predictive insights. Translate complex data findings into clear business narratives and actionable outcomes. What we're looking for in our applicants: 8+ years of experience in data science, preferably in financial services or investment banking. Proven expertise in machine learning, particularly classification and clustering models. Advanced proficiency in Python (Pandas, Scikit-learn, etc.) and SQL. Experience leading technical teams or mentoring junior data professionals. Strong understanding of client lifecycle in financial institutions and regulatory considerations. Familiarity with data migration strategies and legacy modernization projects is a plus.
---------------------------------------------------------------------------------------------------------------------------------------------------- Joining: Immediate. Work location: Pune (hybrid). Open Positions: Senior Data Scientist. If interested, please share your updated resume to careers@dataceria.com. We welcome applications from skilled candidates who are open to working in a hybrid model. Candidates with less experience but strong technical abilities are also encouraged to apply. ----------------------------------------------------------------------------------------------------- Thanks, Dataceria Software Solutions Pvt Ltd Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/ Email: careers@dataceria.com ------------------------------------------------------------------------------------------------------
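The segmentation work described above can be made concrete with a deliberately simplified sketch. The thresholds, segment names, and migration interpretations below are invented for illustration; a real engagement would fit classification and clustering models (e.g. with scikit-learn) on product-usage and engagement features rather than hand-coded rules:

```python
# Toy rule-based stand-in for client segmentation ahead of a platform
# migration. All thresholds and segment labels are hypothetical.

def segment_client(products_used: int, monthly_logins: int) -> str:
    """Assign a client to a migration segment from simple usage features."""
    if products_used >= 5 and monthly_logins >= 20:
        return "high-touch"   # heavy users: migrate last, with support
    if products_used >= 2:
        return "standard"     # migrate in scheduled batch waves
    return "dormant"          # candidate for early, low-risk migration

clients = [
    {"id": "C1", "products_used": 6, "monthly_logins": 25},
    {"id": "C2", "products_used": 3, "monthly_logins": 4},
    {"id": "C3", "products_used": 1, "monthly_logins": 0},
]
segments = {c["id"]: segment_client(c["products_used"], c["monthly_logins"])
            for c in clients}
print(segments)  # {'C1': 'high-touch', 'C2': 'standard', 'C3': 'dormant'}
```

In practice the segment boundaries would come from clustering the client base first, then training a classifier so new or changed clients can be routed to a migration wave automatically.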

Posted 2 months ago

Apply

7.0 - 12.0 years

0 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Required Skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow. Good to Have: DBT, Data Mesh. Job Title: Senior GCP Engineer: Data Mesh & Data Product Specialist. We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability. Key Responsibilities * Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture. * Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products. * Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions. * Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks. * Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency. * Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables. * Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption. Required Skills & Experience * 10+ years (Lead) or 7+ years (Developer) of hands-on experience designing and implementing ETL workflows in large-scale environments. * Advanced proficiency in Python for scripting, automation, and data processing. * Expert-level knowledge of SQL for querying large datasets with performance optimization techniques. * Deep experience working with modern transformation tools like dbt in production environments. * Strong expertise in cloud platforms like Google Cloud Platform (GCP) with hands-on experience using BigQuery. * Familiarity with Data Mesh principles and distributed data architectures is mandatory. * Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards. * Exceptional problem-solving skills with a strong focus on delivering results. What We Expect This is a demanding role that requires: 1. A proactive mindset – you take initiative without waiting for instructions. 2. A commitment to excellence – no shortcuts or compromises on quality. 3. Accountability – you own your work end-to-end and deliver on time. 4. Attention to detail – precision matters; mistakes are not acceptable.
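The "validation checks and anomaly detection mechanisms" this posting asks for can be sketched in plain Python. In a real GCP pipeline these would typically be dbt tests or assertions inside an Airflow task running against BigQuery; the column names and rules below are invented for the sketch:

```python
# Simplified stand-in for pipeline data-quality checks. In production
# these rules would live as dbt tests or an Airflow validation task
# against BigQuery; "order_id" and "amount" are hypothetical columns.

def validate_rows(rows: list[dict]) -> list[str]:
    """Return human-readable violations found in a batch of rows."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        key = row.get("order_id")
        if key is None:
            errors.append(f"row {i}: null order_id")
        elif key in seen_ids:
            errors.append(f"row {i}: duplicate order_id {key}")
        else:
            seen_ids.add(key)
        if row.get("amount", 0) < 0:
            errors.append(f"row {i}: negative amount")
    return errors

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 5.0},    # duplicate key
    {"order_id": None, "amount": -2},  # null key and negative amount
]
print(validate_rows(batch))
```

A pipeline step like this would fail the load (or route bad rows to a quarantine table) when the violation list is non-empty, which is what enforcing "strict data quality standards" amounts to operationally.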

Posted 2 months ago

Apply

8.0 - 10.0 years

25 - 35 Lacs

Chennai, Bengaluru

Hybrid

Warm Greetings from Dataceria Software Solutions Pvt Ltd. We are Looking For: Senior Data Scientist. -------------------------------------------------------------------------------------------------------------- We are seeking a highly skilled Senior Data Scientist to lead the development of classification models and customer segmentation strategies for a major investment bank. This role is central to profiling the existing client base on legacy infrastructure and supporting business stakeholders in defining clear migration paths to modern platforms such as Data Mesh. Responsibilities: Lead the design and implementation of classification models to categorize clients based on product usage, service engagement, and behavioral patterns. Develop robust customer segmentation strategies to support personalization and migration strategies. Collaborate closely with stakeholders across business and technology to understand legacy data landscapes and future-state architecture. Oversee and mentor a small team of data scientists and analysts. Conduct advanced analysis using Python and SQL; apply machine learning techniques for predictive insights. Translate complex data findings into clear business narratives and actionable outcomes. What we're looking for in our applicants: 8+ years of experience in data science, preferably in financial services or investment banking. Proven expertise in machine learning, particularly classification and clustering models. Advanced proficiency in Python (Pandas, Scikit-learn, etc.) and SQL. Experience leading technical teams or mentoring junior data professionals. Strong understanding of client lifecycle in financial institutions and regulatory considerations. Familiarity with data migration strategies and legacy modernization projects is a plus.
---------------------------------------------------------------------------------------------------------------------------------------------------- Joining: Immediate. Work location: Bangalore (hybrid), Chennai. Open Positions: Senior Data Scientist. If interested, please share your updated resume to careers@dataceria.com. We welcome applications from skilled candidates who are open to working in a hybrid model. Candidates with less experience but strong technical abilities are also encouraged to apply. ----------------------------------------------------------------------------------------------------- Thanks, Dataceria Software Solutions Pvt Ltd Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/ Email: careers@dataceria.com ------------------------------------------------------------------------------------------------------

Posted 2 months ago

Apply

8 - 13 years

25 - 30 Lacs

Chennai, Bangalore Rural, Hyderabad

Work from Office

Company Name: One of the leading General Insurance companies in India (Chennai). Industry: General Insurance. Years of Experience: 7+ Years. Location: Chennai. Mail at manjeet.kaur@mounttalent.com

Purpose: The candidate is responsible for designing, creating, deploying, and maintaining an organization's data architecture; for ensuring that the organization's data assets are managed effectively and efficiently and are used to support the organization's goals and objectives; and for ensuring that the organization's data is secure and that appropriate data governance policies and procedures are in place to protect the organization's data assets.

Key Responsibilities: Responsibilities will include but will not be restricted to: Designing and implementing a data architecture that supports the organization's business goals and objectives. Developing data models, defining data standards and guidelines, and establishing processes for data integration, migration, and management. Creating and maintaining data dictionaries, which are a comprehensive set of data definitions and metadata that provide context and understanding of the organization's data assets. Ensuring that the data is accurate, consistent, and reliable across the organization; this includes establishing data quality metrics and monitoring data quality on an ongoing basis. Ensuring that the organization's data is secure and that appropriate data governance policies and procedures are in place to protect the organization's data assets. Working closely with other IT professionals, including database administrators, data analysts, and developers, to ensure that the organization's data architecture is integrated and aligned with other IT systems and applications. Staying up to date with new technologies and trends in data management and architecture and evaluating their potential impact on the organization's data architecture.
Communicate with stakeholders across the organization to understand their data needs and ensure that the organization's data architecture is aligned with the organization's strategic goals and objectives.

Technical Requirements: Bachelor's or Master's degree in Computer Science or a related field. Certificates in Database Management will be preferred. Expertise in data modeling and design, including conceptual, logical, and physical data models, and must be able to translate business requirements into data models. Proficient in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes. Expertise in ETL processes, including data extraction, transformation, and loading, and must be able to design and implement data integration processes. Experience with data analysis and reporting tools and techniques and must be able to design and implement data analysis and reporting processes. Familiar with industry-standard data architecture frameworks, such as TOGAF and Zachman, and must be able to apply them to the organization's data architecture. Familiar with cloud computing technologies, including public and private clouds, and must be able to design and implement data architectures that leverage cloud computing.

Qualitative Requirements: Able to effectively communicate complex technical concepts to both technical and non-technical stakeholders. Strong analytical and problem-solving skills. Must be able to inspire and motivate their team to achieve organizational goals. The following skills can be deemed good to have but are not necessary: Databricks, Snowflake, Redshift, Data Mesh, Medallion, Lambda
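The data-dictionary responsibility named above is easy to picture as a small machine-readable catalog. The field names and the example table below are invented for illustration, assuming a simple in-house format rather than any particular catalog product:

```python
# Minimal sketch of a machine-readable data dictionary entry, the kind
# of metadata catalog a data architect maintains. The "policy" table
# and its columns are hypothetical examples.
from dataclasses import dataclass, asdict

@dataclass
class ColumnDefinition:
    name: str
    dtype: str
    description: str
    nullable: bool = True
    pii: bool = False  # governance flag for personally identifiable data

policy_columns = [
    ColumnDefinition("policy_id", "STRING", "Unique policy key", nullable=False),
    ColumnDefinition("holder_name", "STRING", "Policyholder full name", pii=True),
    ColumnDefinition("premium_inr", "NUMERIC", "Annual premium in INR"),
]

# Export the dictionary for publication alongside the data model
dictionary = {c.name: asdict(c) for c in policy_columns}
print(dictionary["holder_name"]["pii"])  # True
```

Keeping the dictionary in code like this lets data-quality checks and governance tooling (e.g. masking of `pii` columns) read the same definitions the documentation is generated from, so the metadata cannot drift from what pipelines enforce.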

Posted 2 months ago

Apply