6.0 - 10.0 years
25 - 40 Lacs
panchkula, gurugram, bengaluru
Work from Office
Appian Technical Lead / Technical Solution Architect
Location: Gurgaon, Bangalore, Panchkula, Hyderabad (India)

We are seeking an accomplished Appian Technical Lead (or Solution Architect) with a strong track record in designing and delivering enterprise-scale BPM solutions. The ideal candidate will lead technical strategy, provide architectural vision, and mentor development teams to deliver high-quality, scalable, and secure Appian applications.

KEY RESPONSIBILITIES
- Lead the architecture, design, and development of complex Appian-based systems, ensuring scalability and robustness.
- Mentor and coach Appian developers, promoting best practices and team excellence.
- Conduct code reviews and enforce standards for performance, security, scalability, and maintainability.
- Integrate Appian applications with external systems using REST/SOAP APIs, Connected Systems, and Integration Objects (a small illustration follows this posting).
- Drive platform upgrades, DevOps initiatives, CI/CD pipelines, and automated deployments.
- Perform unit testing, debugging, and performance optimization to ensure quality deliverables.
- Collaborate with business stakeholders to translate requirements into technical solutions.
- Actively participate in Agile/Scrum ceremonies, including sprint planning, retrospectives, and demos.

REQUIRED EXPERIENCE AND SKILLSET
- 10 years of IT delivery experience, with 5+ years in Appian architecture or leadership roles.
- Strong experience in Agile methodologies (Scrum) with disciplined delivery.
- Expertise in Appian modules: SAIL, Process Models, Interfaces, Records, Sites, Reports, with advanced capabilities in AI Skills, RPA, Data Fabric, and plug-ins.
- Deep architectural understanding of integration techniques (REST/SOAP, JWT, SSO/SAML, LDAP), cloud deployments (AWS, Azure, GCP), security best practices, and data modeling.
- Proven leadership, problem-solving, and stakeholder management skills.
- Excellent communication and collaboration abilities.

PREFERRED QUALIFICATIONS
- Appian Level 2 or Level 3 Lead Developer/Architect Certification
- Hands-on experience with Appian RPA, AI Skills, Data Fabric, and plug-in development
- Proficiency in DevOps, CI/CD tools (e.g., Jenkins, Git, ANT), and automated deployment methodologies for Appian
- Background in cloud environments (AWS, Azure, GCP)

EDUCATION
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Interested candidates, kindly email us at hr@mvida.in
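For context on the integration bullet above, here is a minimal Python sketch of an external system invoking an Appian web API over REST. The tenant URL, endpoint name, payload fields, and use of an API key are assumptions for illustration, not details from the posting.

import requests

# Hypothetical Appian tenant and web API endpoint; in practice the key
# would come from a secrets manager, not a literal in code.
APPIAN_BASE = "https://example.appian.cloud/suite/webapi"
API_KEY = "redacted"

resp = requests.post(
    f"{APPIAN_BASE}/startOrderCase",  # hypothetical endpoint name
    headers={"Appian-API-Key": API_KEY},
    json={"orderId": "ORD-1001", "priority": "HIGH"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())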
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a driven individual to join the core Product Management team of Infor's technology platform, Infor OS. You will work on Infor's data strategy, focusing on Data Fabric, streaming technologies, Data Catalogue, Data Ledger, and data ingestion and extraction services. Your responsibilities will include gathering requirements from internal and external stakeholders, establishing relationships, and driving adoption. You will collaborate with industry analysts and customers to understand the latest data trends, shape the product vision, and work with senior colleagues to define the direction for the future.

Your role will involve ensuring a well-defined vision, partnering with engineering on realization, releasing the product periodically, and managing the release control process. Additionally, you will be responsible for enabling and educating field-facing professionals at Infor, partners, and customers on new feature launches. You should possess an entrepreneurial drive to self-start initiatives and excellent communication and presentation skills to articulate complex topics effectively. Working independently on large technical projects, networking with key contacts, exercising sound judgment, providing input to business cases, working hands-on, giving demos, and owning product enablement and training are essential functions of this role.

The ideal candidate should have a Bachelor's degree in Business Administration, Computer Science, or a relevant engineering discipline, along with 2 to 3 years of experience in product management or functional/operational business analysis for software product development. Technical aptitude in data streaming protocols, SQL, JSON, and the ETL domain, as well as knowledge of AWS and data platforms like Snowflake and Databricks, is preferred. Strong problem-resolution skills, interpersonal skills, and proficiency in English communication are required. This role may involve occasional travel to visit development teams or present the product at conferences.

Join us at Infor, a global leader in business cloud software products, where we prioritize Principle Based Management and uphold values such as integrity, stewardship, compliance, transformation, entrepreneurship, knowledge, humility, respect, and self-actualization. We are committed to creating a diverse environment that reflects the markets, customers, partners, and communities we serve.
Posted 2 weeks ago
12.0 - 17.0 years
2 - 6 Lacs
Hyderabad, Telangana, India
On-site
Roles & Responsibilities:
- Maintain strategic relationships and strong communication with the leadership team about IS services and service roadmaps to ensure that all customers feel informed and engaged.
- Lead and manage large, diverse teams within a matrixed organization.
- Collaborate with geographically dispersed teams, including those in the US and other international locations.
- Foster a culture of collaboration, innovation, and continuous improvement.
- Develop talent, motivate the team, delegate effectively, champion diversity within the team, and act as a role model of servant leadership.
- Manage, grow, and develop the Amgen Technology team in India, ensuring global ways of working are embedded in the local organization.
- Understand the decision-making process, workflows, and business and information needs of business partners and collaborators.
- Contribute to and define business outcomes, requirements, technology solutions, and services.
- Improve activities being measured by crafting, monitoring, and optimizing relevant feedback loops through test-and-learn activities.
- Work with Product Owners, Service Owners, and/or delivery teams to ensure that delivery matches commitments, acting as an escalation point and facilitating communication when service commitments are not met.
- Ensure communication of key performance metrics and analysis of unmet needs.
- Participate in collaborator and other leadership meetings, working with other parts of the organization and functional groups to ensure successful delivery.
- Ensure ongoing alignment with strategy, compliance, and regulatory requirements for technology investments and services.
- Facilitate standard practice sharing, ensuring ongoing alignment with the Technology & Digital strategy.
- Provide education to new partners regarding IT service offerings.
- Remain accountable for overall organizational compliance with quality/compliance requirements such as GxP and Privacy.

Basic Qualifications:
Doctorate, Master's, or Bachelor's degree and 12 to 17 years of experience in Business, Engineering, IT, or a related field.

Preferred Qualifications:
Must-Have Skills:
- Strategize, plan, and implement various phases of a PLM roadmap using the Dassault Systemes 3DEXPERIENCE platform, integrating with key enterprise platforms.
- Strong technical/functional experience solutioning and supporting GMP applications across Engineering, Clinical, Commercial, and Quality functions.
- Demonstrated hands-on experience managing technology solutions involving one of the leading PLM solutions (Windchill, Teamcenter, or 3DEXPERIENCE).
- Experience integrating PLM with enterprise systems such as Data Fabric, Veeva Vault, SAP, MES, and SCADA systems.
- Experience in people management and leading highly skilled matrixed teams, with a passion for mentorship, culture, and fostering the development of talent.
- Flexibility to manage multiple activities and priorities with minimal direction in a rapidly changing and demanding environment.
- Exceptional collaboration, communication, and interpersonal skills to effectively manage collaborator relationships and build new partnerships.
- Experience applying technology standard process methodologies: Scaled Agile (SAFe).

Good-to-Have Skills:
- Experience in a leadership role within a pharmaceutical or technology organization.
- Experience configuring and customizing solutions for Requirements, CAD, Risk, EBOM, MBOM, Configuration/Variants, Recipe, MPP, Document, and Change management.
- Experience leading data migration from various sources to 3DEXPERIENCE platforms.
- Experience creating solutions using the Enterprise Integration Framework and middleware systems such as MuleSoft, Boomi, or TIBCO.
- Extensive experience in the software development lifecycle.
- Experience using and adopting the Scaled Agile Framework (SAFe).
- Strong analytical/critical-thinking and decision-making abilities.
- Ability to work effectively in a fast-paced, dynamic environment.
- Established business partnerships and IS governance practices involving senior business stakeholders.
- Broad working knowledge of key IS domains and layers.

Professional Certifications:
- Scaled Agile Framework (SAFe) for Teams (preferred)

Soft Skills:
- Excellent leadership and team management skills.
- Strong transformation and change management experience.
- Exceptional collaboration and communication skills.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented with a focus on achieving team goals.
- Strong presentation and public speaking skills.
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
Posted 1 month ago
13.0 - 20.0 years
35 - 70 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Required Skills and Experience
- 13+ years of overall experience is a must, with 7+ years of relevant experience working on Big Data Platform technologies.
- Proven technical skills across Cloudera, Teradata, Databricks, MS Data Fabric, Apache Hadoop, BigQuery, and AWS Big Data solutions (EMR, Redshift, Kinesis, Qlik).
- Good domain experience in the BFSI or Manufacturing area.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Must be organized, self-sufficient, and able to manage multiple initiatives simultaneously.
- Must have the ability to coordinate with other teams independently.
- Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity.

Note: If you have experience in the BFSI domain, then the location will be Mumbai only. If you have experience in the Manufacturing domain, then the location will be Mumbai & Bangalore only.

Interested candidates can share their updated resumes at shradha.madali@sdnaglobal.com
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
At Moody's, we aim to unite the brightest minds to transform today's risks into tomorrow's opportunities. We strive to cultivate an inclusive environment where everyone is encouraged to express their true selves, exchange ideas freely, think innovatively, and engage with each other and customers in meaningful ways. If you are enthusiastic about this opportunity, even if you do not meet every requirement listed, we encourage you to apply. You may still be a great fit for this role or other available positions. We are looking for candidates who embody our values: investing in every relationship, leading with curiosity, championing diverse perspectives, turning ideas into actions, and upholding trust through integrity.

Skills and Competencies
- Experience in utilizing industry-standard data transformation, low-code automation, business intelligence solutions, and operational responsibilities with tools like Power BI, Alteryx, and Automation Anywhere.
- Familiarity with Python, Data Fabric, and a working understanding of Hyperion/OneStream (EPM) would be advantageous.
- Proficiency in SQL for working with both structured and unstructured data.
- Strong knowledge of data structures, algorithms, and software development best practices.
- Understanding of version control systems such as Git and agile methodologies.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud is a plus.
- Effective communication skills to articulate technical concepts to non-technical stakeholders.
- Strong problem-solving abilities and the capability to work independently or collaboratively within a team.

Education
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.

Responsibilities
- Support operational processes within the Data and Analytics team.
- Contribute to the creation of process documentation (SOPs) and aid in establishing guidelines to standardize and enhance operations.
- Provide assistance for Automation/Data/BI servers to ensure system performance and reliability.
- Monitor and support automation, BI, and data processes to maintain seamless operations and address issues promptly.
- Aid in managing security and access controls to safeguard data and uphold data integrity.
- Track automation, Gen AI, data, and BI use cases across the team.
- Support Gen AI application environments, security, and post-production operations.
- Assist in BI incident management and efficiently route issues to the appropriate area.
- Monitor and report on metrics to evaluate performance and identify areas for enhancement.
- Maintain comprehensive documentation and communication to ensure transparency.

About The Team
The Automation Operations & Innovation Team is a dynamic group within the Data and Analytics department focused on enhancing operational efficiency and driving digital transformation through automation, GenAI, business intelligence, and data management. The team collaborates with developers, analysts, and process managers to design and implement scalable solutions using tools like Alteryx, Power BI, Automation Anywhere, and Python. Through collaboration, innovation, and continuous improvement, the team supports strategic initiatives by streamlining workflows, enhancing data quality, and integrating emerging technologies.

Candidates applying to Moody's Corporation may be required to disclose securities holdings in accordance with Moody's Policy for Securities Trading and the position's requirements. Employment is subject to adherence to the Policy, including addressing any necessary remediation of positions in those holdings.
Posted 1 month ago
9.0 - 12.0 years
9 - 12 Lacs
Hyderabad, Telangana, India
On-site
The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
- Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
- Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
- Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
- Ensure data security, compliance, and role-based access control (RBAC) across data environments.
- Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets (see the sketch after this posting).
- Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
- Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Deep expertise in the Biotech & Pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications
- 9 to 12 years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.
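To make the pipeline and partitioning responsibilities above concrete, here is a minimal PySpark batch ELT sketch. The S3 paths and the orders schema (order_id, order_ts, amount) are assumptions for illustration only.

from pyspark.sql import SparkSession, functions as F

# Minimal batch ELT sketch; paths, columns, and app name are illustrative.
spark = SparkSession.builder.appName("orders-elt").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")  # hypothetical source

cleaned = (
    raw.dropDuplicates(["order_id"])                  # dedupe on the business key
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)                   # basic quality filter
)
cleaned.cache()  # reused below by both the write and the row count

# Partitioning by date lets downstream queries prune files instead of full scans.
(cleaned.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/orders/"))

print(f"wrote {cleaned.count()} rows")

Choosing the partition column to match the dominant query predicate (here, date) is the main lever for the query-performance work the posting describes.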
Posted 2 months ago
5.0 - 10.0 years
4 - 10 Lacs
Hyderabad, Telangana, India
On-site
Key Deliverables:
- Design and deploy scalable ETL/ELT pipelines for structured and unstructured data
- Implement real-time and batch data processing on the Enterprise Data Fabric (see the streaming sketch after this posting)
- Optimize big data performance using Apache Spark and the AWS stack
- Enable enterprise-wide data discovery, governance, and CI/CD integration

Role Responsibilities:
- Collaborate across teams to align data engineering with business strategy
- Ensure data security, compliance, and access control in distributed environments
- Build metadata-driven data pipelines with version control and monitoring
- Integrate diverse data sources into a unified, governed architecture
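As a companion to the batch example earlier, here is an illustrative real-time leg of such a pipeline: a Kafka topic landed into a Delta table with Spark Structured Streaming. The broker address, topic name, and storage paths are assumptions, and the cluster is assumed to have the Kafka and Delta packages available.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Read the raw Kafka records and keep the payload as a string plus the event time.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker.example.com:9092")
         .option("subscribe", "orders")
         .load()
         .select(
             F.col("value").cast("string").alias("payload"),
             F.col("timestamp").alias("event_time"),
         )
)

# The checkpoint directory is what makes the stream restartable without data loss.
query = (
    events.writeStream.format("delta")
          .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")
          .start("s3://example-bucket/bronze/orders/")
)
query.awaitTermination()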
Posted 2 months ago
3.0 - 13.0 years
3 - 13 Lacs
Hyderabad, Telangana, India
On-site
Key Deliverables:
- Design and deploy scalable ETL/ELT pipelines for structured and unstructured data
- Implement real-time and batch data processing on the Enterprise Data Fabric
- Optimize big data performance using Apache Spark and the AWS stack
- Enable enterprise-wide data discovery, governance, and CI/CD integration

Role Responsibilities:
- Collaborate across teams to align data engineering with business strategy
- Ensure data security, compliance, and role-based access across systems
- Drive performance tuning and metadata-driven architecture development
- Adopt and implement emerging data technologies and DevOps practices
Posted 2 months ago
5.0 - 9.0 years
15 - 30 Lacs
Pune, Bengaluru
Work from Office
Hiring for Appian developer for Wipro Limited
* Excellent English communication
* 5+ years of hands-on experience in Appian BPM
* Knowledge of or working experience with SAP or enterprise systems
* Notice period: Immediate to 60 days
HR Kanchan 9691001643
Required Candidate profile
1. Appian developer: L2 certification is mandatory (B3)
2. Appian developer: L3 certification is mandatory (C1)
Lead or support solution design discussions with onshore leads based in the UK/NL
Posted 3 months ago
4.0 - 9.0 years
15 - 30 Lacs
Pune, Bengaluru
Work from Office
Hiring for Appian developer for Wipro Limited
* Excellent English communication
* 5+ years of hands-on experience in Appian BPM
* Knowledge of or working experience with SAP or enterprise systems
* Notice period: Immediate to 60 days
HR Kanchan 9691001643
Required Candidate profile
Experience with Appian design patterns, objects, interfaces, and best practices
Ability to work independently and replace senior consultants effectively
Posted 3 months ago
4.0 - 9.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Work from Office
Hiring for Appian developer for Wipro Limited
* Excellent English communication
* 5+ years of hands-on experience in Appian BPM
* Knowledge of or working experience with SAP or enterprise systems
* Notice period: Immediate to 60 days
HR Kanchan 9691001643
Required Candidate profile
Experience with Appian design patterns, objects, interfaces, and best practices
Ability to work independently and replace senior consultants effectively
Posted 3 months ago
3.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
Senior Manager Information Systems Automation

What you will do
We are seeking a hands-on, experienced, and dynamic Technical Infrastructure Automation Manager to lead and manage our infrastructure automation initiatives. The ideal candidate will have a strong hands-on background in IT infrastructure, cloud services, and automation tools, along with the leadership skills to guide a team towards improving operational efficiency, reducing manual processes, and ensuring scalability of systems. This role will lead a team of engineers across multiple functions, including Ansible development, ServiceNow development, process automation, and Site Reliability Engineering (SRE), and will be responsible for ensuring the reliability, scalability, and security of automation services.

The Infrastructure Automation team will be responsible for automating infrastructure provisioning, deployment, configuration management, and monitoring. You will work closely with development, operations, and security teams to drive automation solutions that enhance the overall infrastructure's efficiency and reliability. This role demands the ability to drive and deliver against key organizational strategic initiatives, foster a collaborative environment, and deliver high-quality results in a matrixed organizational structure. Please note, this is an onsite role based in Hyderabad.

Roles & Responsibilities:
Automation Strategy & Leadership: Lead the development and implementation of infrastructure automation strategies. Collaborate with key collaborators (DevOps, IT Operations, Security, etc.) to define automation goals and ensure alignment with company objectives. Provide leadership and mentorship to a team of engineers, ensuring continuous growth and skill development.
Infrastructure Automation: Design and implement automation frameworks for infrastructure provisioning, configuration management, and orchestration (e.g., using tools like Terraform, Ansible, Puppet, Chef, etc.). Manage and optimize CI/CD pipelines for infrastructure as code (IaC) to ensure seamless delivery and updates. Work with cloud providers (AWS, Azure, GCP) to implement automation solutions for managing cloud resources and services.
Process Improvement: Identify areas for process improvement by analyzing current workflows, systems, and infrastructure operations. Create and implement solutions to reduce operational overhead and increase system reliability, scalability, and security. Automate and streamline recurring tasks, including patch management, backups, and system monitoring (a small example of this kind of audit scripting follows this posting).
Collaboration & Communication: Collaborate with multi-functional teams (Development, IT Operations, Security, etc.) to ensure infrastructure automation aligns with business needs. Regularly communicate progress, challenges, and successes to management, offering insights on how automation is driving efficiencies.
Documentation & Standards: Maintain proper documentation for automation scripts, infrastructure configurations, and processes. Develop and enforce best practices and standards for automation and infrastructure management.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree with 8-10 years of experience in Observability Operations, with at least 3 years in management; OR Bachelor's degree with 10-14 years of experience in Observability Operations, with at least 4 years in management; OR Diploma with 14-18 years of experience in Observability Operations, with at least 5 years in management
- 12+ years of experience in IT infrastructure management, with at least 4+ years in a leadership or managerial role
- Strong expertise in automation tools and frameworks such as Terraform, Ansible, Chef, Puppet, or similar
- Proficiency in scripting languages (e.g., Python, Bash, PowerShell)
- Hands-on experience with cloud platforms (AWS) and containerization technologies (Docker, Kubernetes)
- Hands-on knowledge of Infrastructure as Code (IaC) principles and CI/CD pipeline implementation
- Experience with ServiceNow development and administration
- Solid understanding of networking, security protocols, and infrastructure design
- Excellent problem-solving skills and the ability to troubleshoot complex infrastructure issues
- Strong leadership and communication skills, with the ability to work effectively across teams

Professional Certifications (Preferred):
- ITIL or PMP Certification
- Red Hat Certified System Administrator
- ServiceNow Certified System Administrator
- AWS Certified Solutions Architect

Preferred Qualifications:
- Strong experience with Ansible, including playbooks, roles, and modules
- Strong experience with infrastructure-as-code concepts and other automation tools like Terraform or Puppet
- Strong understanding of user-centered design and building scalable, high-performing web and mobile interfaces on the ServiceNow platform
- Proficiency with both Windows and Linux/Unix-based operating systems
- Knowledge of cloud platforms (AWS, Azure, Google Cloud) and automation techniques in those environments
- Familiarity with CI/CD tools and processes, particularly the integration of Ansible in pipelines
- Understanding of version control systems (Git)
- Strong troubleshooting, debugging, and performance optimization skills
- Experience with hybrid cloud environments and multi-cloud strategies
- Familiarity with DevOps practices and tools
- Experience operating within a validated systems environment (FDA, European Agency for the Evaluation of Medicinal Products, Ministry of Health, etc.)

Soft Skills:
- Excellent leadership and team management skills
- Change management expertise
- Crisis management capabilities
- Strong presentation and public speaking skills
- Analytical mindset with a focus on continuous improvement
- Detail-oriented with the capacity to manage multiple projects and priorities
- Self-motivated and able to work independently or as part of a team
- Strong communication skills to effectively interact with both technical and non-technical collaborators
- Ability to work effectively with global, virtual teams

Shift Information: This position is an onsite role and may require working during later hours to align with business hours. Candidates must be willing and able to work outside of standard hours as required to meet business needs.
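As a small illustration of the recurring-task automation described above, here is a Python sketch using boto3 to flag EC2 instances missing a patch-group tag. The region and the "PatchGroup" tag name are assumptions for illustration.

import boto3

# Flag EC2 instances missing a hypothetical "PatchGroup" tag - the kind of
# recurring compliance audit this role would automate and schedule.
ec2 = boto3.client("ec2", region_name="us-east-1")

untagged = []
for page in ec2.get_paginator("describe_instances").paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if "PatchGroup" not in tags:
                untagged.append(instance["InstanceId"])

print(f"{len(untagged)} instances missing a PatchGroup tag: {untagged}")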
Posted 3 months ago
10.0 - 12.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 323226

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Sr. Advisor to join our team in Bengaluru, Karnataka (IN-KA), India (IN).

Key Responsibilities:
- Design data platform architectures (data lakes, lakehouses, DWH) using modern cloud-native tools (e.g., Databricks, Snowflake, BigQuery, Synapse, Redshift).
- Architect data ingestion, transformation, and consumption pipelines using batch and streaming methods.
- Enable real-time analytics and machine learning through scalable and modular data frameworks.
- Define data governance models, metadata management, lineage tracking, and access controls.
- Collaborate with AI/ML, application, and business teams to identify high-impact use cases and optimize data usage.
- Lead modernization initiatives from legacy data warehouses to cloud-native and distributed architectures.
- Enforce data quality and observability practices for mission-critical workloads (a small illustration follows this posting).

Required Skills:
- 10+ years in data architecture, with strong grounding in modern data platforms and pipelines.
- Deep knowledge of SQL/NoSQL, Spark, Delta Lake, Kafka, and ETL/ELT frameworks.
- Hands-on experience with cloud data platforms (AWS, Azure, GCP).
- Understanding of data privacy, security, lineage, and compliance (GDPR, HIPAA, etc.).
- Experience implementing data mesh / data fabric concepts is a plus.
- Expertise in technical solution writing and presenting using tools such as Word, PowerPoint, Excel, Visio, etc.
- High level of executive presence and the ability to articulate solutions to CXO-level executives.

Preferred Qualifications:
- Certifications in Snowflake, Databricks, or cloud-native data platforms.
- Exposure to AI/ML data pipelines, MLOps, and real-time data applications.
- Familiarity with data visualization and BI tools (Power BI, Tableau, Looker, etc.).

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
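To illustrate the data-quality enforcement named in the responsibilities above, here is a minimal PySpark quality-gate sketch. The table path, column names, and thresholds are assumptions for illustration.

from pyspark.sql import SparkSession, functions as F

# A simple quality gate: abort the run if key-column nulls or duplicate
# rates in a hypothetical curated table exceed agreed thresholds.
spark = SparkSession.builder.appName("dq-gate").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/customers/")

total = df.count()
null_keys = df.filter(F.col("customer_id").isNull()).count()
dupes = total - df.dropDuplicates(["customer_id"]).count()

assert null_keys == 0, f"{null_keys} rows with null customer_id"
assert dupes / max(total, 1) < 0.01, f"duplicate rate too high: {dupes}/{total}"
print(f"quality gate passed for {total} rows")

In practice a check like this would run as a pipeline stage and publish its counts to an observability dashboard rather than just asserting.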
Posted 3 months ago
13.0 - 21.0 years
45 - 60 Lacs
Hyderabad
Hybrid
Job Description Summary:
As a Data Architect, you will play a pivotal role in defining and implementing common data models and API standards, and in leveraging the Common Information Model (CIM) standard, across a portfolio of products deployed in Critical National Infrastructure (CNI) environments globally. GE Vernova is the leading software provider for the operations of national and regional electricity grids worldwide. Our software solutions range from supporting electricity markets, to enabling grid and network planning, to real-time electricity grid operations. In this senior technical role, you will collaborate closely with lead software architects to ensure secure, performant, and composable designs and implementations across our portfolio.

Job Description
Grid Software (a division of GE Vernova) is driving the vision of GridOS - a portfolio of software running on a common platform to meet the fast-changing needs of the energy sector and support the energy transition. Grid Software has extensive and well-established software stacks that are progressively being ported to a common microservice architecture, delivering a composable suite of applications. Simultaneously, new applications are being designed and built on the same common platform to provide innovative solutions that enable our customers to accelerate the energy transition. This role is for a senior data architect who understands the core designs, principles, and technologies of GridOS.

Key responsibilities include:
- Formalizing Data Models and API Standards: Lead the formalization and standardization of data models and API standards across products to ensure interoperability and efficiency.
- Leveraging CIM Standards: Implement and advocate for the Common Information Model (CIM) standards to ensure consistent data representation and exchange across systems (see the sketch after this posting).
- Architecture Reviews and Coordination: Contribute to architecture reviews across the organization as part of Architecture Review Boards (ARB) and the Architecture Decision Record (ADR) process.
- Knowledge Transfer and Collaboration: Work with the Architecture SteerCo and Developer Standard Practices team to establish standard practices around data modeling and API design.
- Documentation: Ensure that data modeling and API standards are accurately documented and maintained in collaboration with documentation teams.
- Backlog Planning and Dependency Management: Work across software teams to prepare backlog planning and to identify and manage cross-team dependencies for data modeling and API requirements.

Key Knowledge Areas and Expertise
- Data Architecture and Modeling: Extensive experience in designing and implementing data architectures and common data models.
- API Standards: Expertise in defining and implementing API standards to ensure seamless integration and data exchange between systems.
- Common Information Model (CIM): In-depth knowledge of CIM standards and their application within the energy sector.
- Data Mesh and Data Fabric: Understanding of data mesh and data fabric principles, enabling software composability and data-centric design trade-offs.
- Microservice Architecture: Understanding of microservice architecture and software development.
- Kubernetes: Understanding of Kubernetes, including software development in an orchestrated microservice architecture. This includes the Kubernetes API, custom resources, API aggregation, Helm, and manifest standardization.
- CI/CD and DevSecOps: Experience with CI/CD pipelines, DevSecOps practices, and GitOps, especially in secure, air-gapped environments.
- Mobile Software Architecture: Knowledge of mobile software architecture for field crew operations, offline support, and near-real-time operation.

Additional Knowledge (Advantageous but not Essential)
- Energy Industry Technologies: Familiarity with key technologies specific to the energy industry, such as Supervisory Control and Data Acquisition (SCADA), geospatial network modeling, etc.

This is a critical role within Grid Software, requiring a broad range of knowledge and strong organizational and communication skills to drive common architecture, software standards, and principles across the organization.
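To make the common-data-model idea concrete, here is a heavily simplified Python sketch in the spirit of CIM (IEC 61970) naming. Real CIM defines far richer class hierarchies and associations; the class names (IdentifiedObject, ACLineSegment, Substation) and the mRID identifier follow the standard loosely, while the attribute set here is pared down for illustration.

from dataclasses import dataclass, field
from typing import List

@dataclass
class IdentifiedObject:
    mRID: str          # master resource identifier, the CIM-wide primary key
    name: str = ""

@dataclass
class ACLineSegment(IdentifiedObject):
    r: float = 0.0     # series resistance (ohms)
    x: float = 0.0     # series reactance (ohms)
    length: float = 0.0

@dataclass
class Substation(IdentifiedObject):
    lines: List[ACLineSegment] = field(default_factory=list)

# Hypothetical usage: two products exchanging this shape agree on semantics.
sub = Substation(mRID="sub-001", name="North GIS")
sub.lines.append(ACLineSegment(mRID="line-042", name="Feeder 7",
                               r=0.12, x=0.38, length=3.4))
print(sub)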
Posted 3 months ago
5.0 - 10.0 years
13 - 23 Lacs
hyderabad, delhi / ncr, mumbai (all areas)
Hybrid
Role & responsibilities
The practitioner should be able to architect and implement a Data Fabric solution layer on the client platform, advise on the optimal solution, and work on optimization.

Primary Skills Required for the Role: Azure Data Engineer, Data Fabric, Power BI, Python data engineering, AWS
Good to Have Skills: 4-5 years of hands-on experience overall; 2.5 to 3 years of hands-on experience with Data Fabric infrastructure on Azure or another cloud; exposure to Power BI, Python/PySpark, and SQL; experience in cloud technologies such as AWS, Azure, or GCP
Posted Date not available
5.0 - 8.0 years
9 - 19 Lacs
bengaluru
Hybrid
Role & responsibilities
- 5-8 years of experience with data virtualization, data mesh, data fabric, and federated querying platforms such as Denodo, Starburst, or OSS alternatives is highly desirable (a federated query sketch follows this posting).
- 5+ years of experience working as a Report Visualization Engineer with Power BI, Tableau, or similar reporting platforms, with end-to-end delivery.
- 3+ years of experience implementing data modelling, data governance, and RLS (row-level security).
- 3+ years of experience with enterprise deployment strategies and migration of legacy platform reports to modern reporting platforms.
- Extract, transform, and load large volumes of structured and unstructured data from various sources into AWS data lakes or modern data platforms like Snowflake.
- Assist the Data Management Engineering team (either for Data Pipelines Engineering or Data Service & Data Access Engineering) with ETL or BI design and other framework-related items.
- Solid understanding of data modeling, database design, and ETL principles.
- Experience working with data lakes, data warehouses, and distributed computing systems.
- Familiarity with data governance, data security, and compliance practices in cloud environments.
- Strong problem-solving skills and the ability to optimize and fine-tune data pipelines and Spark jobs for performance.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
- Tableau/Power BI/Snowflake/Starburst certifications on data-related specialties are a plus.
- Power Platform experience (Power Apps, Power Automate) is a plus.
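For a sense of what federated querying looks like in practice: since Starburst is built on Trino, a query can join tables that live in different backends through catalogs. This sketch uses the trino Python client; the host, user, catalog, and table names are all assumptions for illustration.

from trino.dbapi import connect  # pip install trino

# Hypothetical federated query across two catalogs registered in a
# Starburst/Trino cluster - one backed by Snowflake, one by Hive.
conn = connect(
    host="starburst.example.com",
    port=443,
    user="analyst",
    http_scheme="https",
)
cur = conn.cursor()
cur.execute("""
    SELECT o.region, SUM(o.amount) AS revenue
    FROM snowflake_catalog.sales.orders AS o
    JOIN hive_catalog.reference.regions AS r ON o.region = r.code
    GROUP BY o.region
""")
for row in cur.fetchall():
    print(row)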
Posted Date not available
15.0 - 20.0 years
20 - 30 Lacs
noida
Work from Office
Job Title: DBA Architect – Azure ADF, Databricks – 15+ Years – Noida – Immediate Joiners
Payroll: Venpa Staffing Services (Preferred) | Permanent also acceptable
Work Location: Noida (Onsite)
Shift: EST Shift
Notice Period: Immediate Joiners Only
CTC: As per market standards

Job Summary: BrickRed Systems, through Venpa Staffing Services, is urgently hiring a DBA Architect with deep expertise in Azure Data Factory (ADF), Azure Databricks, and Data Fabric technologies. This is a high-impact onsite role for experienced architects with 15+ years of end-to-end data platform expertise and leadership experience.

Key Responsibilities:
- Architect and implement scalable, secure, and enterprise-grade Azure-based data solutions
- Lead design-to-delivery of data architectures including ADF, Databricks, Synapse, Data Lake, and Data Fabric
- Drive data governance, ingestion, transformation, and security initiatives
- Collaborate with engineers, analysts, and DevOps teams for solution alignment
- Define architecture best practices, patterns, and governance frameworks
- Migrate legacy/on-prem data platforms to Azure cloud environments
- Optimize performance, cost, and reliability of data pipelines
- Mentor and lead junior engineers and architects

Required Skills & Experience:
- 15+ years in database technology and architecture
- 3–5+ years in a dedicated Architect role
- Expert in Azure ADF, Azure Databricks, and Data Fabric solutions
- Strong hands-on experience in data modeling, warehousing, and ETL/ELT
- Proficient in SQL, NoSQL, and Big Data technologies
- Knowledge of CI/CD, cost optimization, and data monitoring on Azure
- Strong leadership, governance, and compliance experience

Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate, Solutions Architect)
- Experience with Microsoft Fabric
- Exposure to AI/ML integration in data pipelines
- Excellent communication & stakeholder management

Note: Candidates must be available immediately.
Apply Now: If you're a seasoned data architect ready to make an impact, apply today or email your resume to karthika@venpastaffing.com (+91 9036237987) with the subject line "DBA Architect – Noida – Immediate Joiner".
Posted Date not available
3.0 - 7.0 years
12 - 16 Lacs
noida, bengaluru, mumbai (all areas)
Work from Office
About the Role:
We are seeking a detail-oriented and technically fluent Project & Data Analyst to support the execution and operationalization of the NextGen SIEM and Data Fabric programs. This role bridges project management, data analysis, and compliance documentation to ensure timely delivery, data-driven decision-making, and audit readiness across both initiatives.

Key Responsibilities:
Data Analysis & Reporting
- Analyze ingestion patterns, data quality, and usage metrics across SIEM and Data Fabric platforms (a small metrics sketch follows this posting).
- Develop dashboards and reports to track operational KPIs, compliance status, and integration progress.
- Support detection engineering and platform teams with data validation and enrichment insights.

Documentation & Compliance
- Own the creation and maintenance of DRNs and related compliance documentation.
- Ensure traceability of data sources, transformation logic, and routing decisions.
- Collaborate with governance and audit teams to ensure documentation aligns with regulatory and internal standards.

Project Management Support
- Maintain project plans, RAID logs, and milestone trackers for both SIEM and Data Fabric workstreams.
- Coordinate cross-functional meetings, track action items, and escalate risks or blockers.
- Support vendor coordination and internal stakeholder alignment, including with engineering leads and compliance officers.

Cross-Functional Collaboration
- Work closely with engineering, detection, compliance, and platform teams to align priorities and timelines.
- Assist in the preparation of executive updates and program reviews (e.g., AI-224, SG4) using insights from AI-224 Executive Review v2 and the SG4 Project Close Presentation (AR-2.e).

Qualifications
- 3+ years of experience in project coordination, data analysis, or technical program support.
- Familiarity with SIEM platforms, data pipeline tools, and compliance frameworks.
- Proficiency in Excel, Power BI/Tableau, and project management tools (e.g., MS Project, Jira).
- Strong organizational and communication skills with a bias for action and clarity.
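As a small illustration of the ingestion-metrics analysis above, here is a pandas sketch that rolls a log up into a daily failure-rate KPI. The CSV file name and its schema (source, records_in, records_failed, ingested_at) are assumptions for illustration.

import pandas as pd

# Roll an ingestion log up to daily per-source volumes and failure rates,
# the kind of KPI a SIEM/Data Fabric dashboard would track.
df = pd.read_csv("ingestion_log.csv", parse_dates=["ingested_at"])

daily = (
    df.assign(day=df["ingested_at"].dt.date)
      .groupby(["day", "source"])
      .agg(records=("records_in", "sum"), failures=("records_failed", "sum"))
      .reset_index()
)
daily["failure_rate"] = daily["failures"] / daily["records"].clip(lower=1)

# Surface the worst offenders first for triage.
print(daily.sort_values("failure_rate", ascending=False).head(10))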
Posted Date not available
6.0 - 11.0 years
15 - 30 Lacs
pune, bengaluru
Work from Office
Hiring for Appian developer for Wipro Limited
* Excellent English communication
* 5+ years of hands-on experience in Appian BPM
* Knowledge of or working experience with SAP or enterprise systems
* Notice period: Immediate to 60 days
HR Kanchan 9691001643
Required Candidate profile
1. Appian developer: L2 certification is mandatory (B3)
2. Appian developer: L3 certification is mandatory (C1)
Lead or support solution design discussions with onshore leads based in the UK/NL
Posted Date not available