Jobs
Interviews

263 RBAC Jobs - Page 4

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 10.0 years

7 - 11 Lacs

Chennai

Work from Office

Job Title: DevOps with Harness OSP or OTP
Experience: 8-10 Years
Location: Remote

Harness CI/CD Expertise:
- Proficiency in setting up and managing Harness CI/CD pipelines for enterprise systems.
- Experience in creating CI/CD pipelines across any 2 technologies (Java, .NET, Lambda, Python and Database) with a focus on build, automation and deployment optimization.
- Hands-on experience integrating SaaS platforms (e.g., JIRA, JFrog, DataPower, F5) and hosted platforms (e.g., GitHub, Jenkins, SonarQube, Checkmarx) into Harness pipelines.
- Knowledge of deployment patterns such as Blue-Green and Canary.
- Familiarity with quality gates (e.g., SonarQube, Checkmarx) and artifact repositories like JFrog Artifactory.
- Expertise in configuring Harness Delegate provisioning, RBAC rules, and LDAP integration setups.
Cloud and Infrastructure as Code (IaC):
- Strong expertise in AWS and cloud infrastructure management.
- 4 years of experience with Terraform for provisioning across development, staging, and production environments.
CI/CD Tools and Pipelines:
- Experience with Jenkins, Harness, or similar CI/CD tools for pipeline development.
- Proficiency in pipeline scripting, including Jenkins and Harness-specific pipelines.
Version Control:
- Expertise in GitHub, including enterprise-level setups and integration with CI/CD pipelines.
Programming and Scripting:
- 4 years of experience with scripting languages such as Python, Shell, and Terraform for automation and troubleshooting.
Containers and Orchestration:
- 1 year of hands-on experience with Docker and Kubernetes (CKA certification preferred).
- Knowledge of container image repositories such as ECR.

Secondary Skills
Migration and Integration Support:
- Hands-on experience with pipeline migration strategies from Jenkins or other CI/CD tools to Harness.
- Familiarity with integrating SSO, SailPoint, ServiceNow, Jira, SonarQube, and other enterprise tools into the Harness ecosystem.
Configuration and Automation:
- Experience customizing accounts with roles, policies, OPA, and security groups using Terraform and other automation tools.
Observability and Monitoring Tools:
- Knowledge of monitoring and reporting mechanisms for CI/CD pipeline performance and reliability.
Data and Messaging Tools:
- Familiarity with Kafka, Zookeeper, and the ELK Stack for application monitoring and data flow optimization.
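The "Blue-Green and Canary" deployment patterns this listing asks for both hinge on splitting traffic between an old and a new version. As a tool-agnostic illustration (this is not Harness's API; all names here are hypothetical), a canary rollout is essentially a weighted router whose canary share is ramped up as health checks pass:

```python
import random

def make_canary_router(canary_weight, seed=None):
    """Return a routing function that sends roughly `canary_weight` of
    requests to the new ("canary") version and the rest to "stable".
    Ramping the weight (e.g. 5% -> 25% -> 100%) is the essence of a
    canary rollout; a blue-green cutover is the special case of
    flipping the weight from 0.0 straight to 1.0."""
    rng = random.Random(seed)  # seeded for a reproducible demo

    def route(_request_id):
        return "canary" if rng.random() < canary_weight else "stable"

    return route

# Demo: with a 25% weight, about a quarter of traffic hits the canary.
route = make_canary_router(canary_weight=0.25, seed=42)
sample = [route(f"req-{i}") for i in range(10_000)]
share = sample.count("canary") / len(sample)
print(f"canary share: {share:.2%}")  # close to 25%
```

In a real pipeline the weight change would be a step in the CD tool and the "health check" a metrics query, not in-process code.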

Posted 2 weeks ago

Apply

8.0 - 10.0 years

4 - 8 Lacs

Chennai

Work from Office

Job Title: DevOps with Harness
Experience: 8-10 Years
Location: Remote

Harness CI/CD Expertise:
- Proficiency in setting up and managing Harness CI/CD pipelines for enterprise systems.
- Experience in creating CI/CD pipelines across any 2 technologies (Java, .NET, Lambda, Python and Database) with a focus on build, automation and deployment optimization.
- Hands-on experience integrating SaaS platforms (e.g., JIRA, JFrog, DataPower, F5) and hosted platforms (e.g., GitHub, Jenkins, SonarQube, Checkmarx) into Harness pipelines.
- Knowledge of deployment patterns such as Blue-Green and Canary.
- Familiarity with quality gates (e.g., SonarQube, Checkmarx) and artifact repositories like JFrog Artifactory.
- Expertise in configuring Harness Delegate provisioning, RBAC rules, and LDAP integration setups.
Cloud and Infrastructure as Code (IaC):
- Strong expertise in AWS and cloud infrastructure management.
- 4 years of experience with Terraform for provisioning across development, staging, and production environments.
CI/CD Tools and Pipelines:
- Experience with Jenkins, Harness, or similar CI/CD tools for pipeline development.
- Proficiency in pipeline scripting, including Jenkins and Harness-specific pipelines.
Version Control:
- Expertise in GitHub, including enterprise-level setups and integration with CI/CD pipelines.
Programming and Scripting:
- 4 years of experience with scripting languages such as Python, Shell, and Terraform for automation and troubleshooting.
Containers and Orchestration:
- 1 year of hands-on experience with Docker and Kubernetes (CKA certification preferred).
- Knowledge of container image repositories such as ECR.

Secondary Skills
Migration and Integration Support:
- Hands-on experience with pipeline migration strategies from Jenkins or other CI/CD tools to Harness.
- Familiarity with integrating SSO, SailPoint, ServiceNow, Jira, SonarQube, and other enterprise tools into the Harness ecosystem.
Configuration and Automation:
- Experience customizing accounts with roles, policies, OPA, and security groups using Terraform and other automation tools.
Observability and Monitoring Tools:
- Knowledge of monitoring and reporting mechanisms for CI/CD pipeline performance and reliability.
Data and Messaging Tools:
- Familiarity with Kafka, Zookeeper, and the ELK Stack for application monitoring and data flow optimization.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

2 - 6 Lacs

Chennai

Work from Office

Job Title: Harness Developers - SOS
Experience: 4-8 Years
Location: Remote

Primary Skills
Harness CI/CD Expertise:
- Proficiency in setting up and managing Harness CI/CD pipelines for enterprise systems.
- Experience in creating CI/CD pipelines across any 2 technologies (Java, .NET, Lambda, Python and Database) with a focus on build, automation and deployment optimization.
- Hands-on experience integrating SaaS platforms (e.g., JIRA, JFrog, DataPower, F5) and hosted platforms (e.g., GitHub, Jenkins, SonarQube, Checkmarx) into Harness pipelines.
- Knowledge of deployment patterns such as Blue-Green and Canary.
- Familiarity with quality gates (e.g., SonarQube, Checkmarx) and artifact repositories like JFrog Artifactory.
- Expertise in configuring Harness Delegate provisioning, RBAC rules, and LDAP integration setups.
Cloud and Infrastructure as Code (IaC):
- Strong expertise in AWS and cloud infrastructure management.
- 4+ years of experience with Terraform for provisioning across development, staging, and production environments.
CI/CD Tools and Pipelines:
- Experience with Jenkins, Harness, or similar CI/CD tools for pipeline development.
- Proficiency in pipeline scripting, including Jenkins and Harness-specific pipelines.
Version Control:
- Expertise in GitHub, including enterprise-level setups and integration with CI/CD pipelines.
Programming and Scripting:
- 4+ years of experience with scripting languages such as Python, Shell, and Terraform for automation and troubleshooting.
Containers and Orchestration:
- 1+ years of hands-on experience with Docker and Kubernetes (CKA certification preferred).
- Knowledge of container image repositories such as ECR.

Secondary Skills
Migration and Integration Support:
- Hands-on experience with pipeline migration strategies from Jenkins or other CI/CD tools to Harness.
- Familiarity with integrating SSO, SailPoint, ServiceNow, Jira, SonarQube, and other enterprise tools into the Harness ecosystem.
Configuration and Automation:
- Experience customizing accounts with roles, policies, OPA, and security groups using Terraform and other automation tools.
Observability and Monitoring Tools:
- Knowledge of monitoring and reporting mechanisms for CI/CD pipeline performance and reliability.
Data and Messaging Tools:
- Familiarity with Kafka, Zookeeper, and the ELK Stack for application monitoring and data flow optimization.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 14 Lacs

Telangana

Work from Office

Primary Skills:
- Strong experience with Microsoft Sentinel architecture, including workspaces, playbooks, and automation.
- Expertise in Azure Cloud, including Defender for Cloud, XDR (MDE/MDI/MDO), and Entra ID.
- Proficiency in KQL and other scripting languages for automation.

Secondary Skills:
- Experience with SIEM solutions and security monitoring tools such as Splunk.
- Knowledge of network security, identity management, and cloud security best practices.
- Strong analytical and problem-solving skills.
- Certifications such as Microsoft Certified: Security Operations Analyst Associate or Azure Security Engineer Associate are a plus.
- Experience with Azure Key Vault creation, configuration, and maintenance.
- Experience with Private Endpoints, VNets, and Subnets.
- Experience with Entra ID, including creating users, user groups, Service Principals, and access management.
- Experience with RBAC mapping and modelling.
- Experience with Storage Accounts and Log Analytics Workspaces.
- Excellent documentation and communication skills.

Key Responsibilities:
- Configure Microsoft Sentinel solutions to monitor and respond to security threats.
- Create and optimize Sentinel playbooks, workbooks, and hunting queries for proactive threat detection.
- Manage data connectors and integrate Sentinel with various security tools and logs.
- Automate security processes using KQL.
- Collaborate with SOC teams to enhance security monitoring and incident response.
- Ensure compliance with industry security standards and best practices.
- Conduct security assessments and recommend improvements for cloud security posture.
- Configure analytic rules, install connectors, and monitor.

Bonus to have:
- Experience working in a SOC environment.
- Familiarity with incident response frameworks.
- Hands-on experience with Infrastructure as Code (IaC) using Terraform or Bicep.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

25 - 27 Lacs

Bengaluru

Hybrid

Key Skills: SAP S/4HANA Security, RBAC, GRC Access Control, SAP BTP Security, IAM, SU01, PFCG, SU24, ABAP Security, SAP Fiori Security, Identity Authentication Service, Single Sign-On (SSO), OAuth, SAML

Roles & Responsibilities:
- Design, implement, and maintain role-based access control (RBAC) concepts in SAP S/4HANA.
- Develop and maintain authorization concepts aligned with business requirements and compliance standards.
- Create and maintain custom roles and authorization objects.
- Perform security audits and access reviews.
- Troubleshoot authorization-related issues.
- Provide documentation for authorization concepts and role designs.
- Support SAP security implementations and upgrade projects.
- Collaborate with business process owners for role design and access management.
- Handle user access management and periodic access reviews.

Experience Requirement:
- 5-10 years of experience in SAP Security and Authorization.
- Strong expertise in SAP S/4HANA security and authorization concepts.
- In-depth knowledge of GRC Access Control.
- Experience with User Access Management and Role Design.
- Proficiency in authorization trace analysis and troubleshooting.
- Understanding of security audit logs and their implementation.
- Knowledge of SAP security best practices and industry standards.
- Experience with SAP BTP security and authorization concepts.
- Knowledge of Identity and Access Management (IAM) principles.
- Familiarity with cloud security concepts.
- SAP security certifications.
- Experience with SAP Fiori security.
- Knowledge of ABAP security.
- Understanding of OAuth, SAML, and other authentication protocols.

Education: Masters, B.Tech/M.Tech (Dual), MCA, B.Tech, M.Tech.
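The core RBAC idea behind this listing — users receive roles, roles bundle permissions, and access reviews look for toxic combinations — is the same in any system. In SAP the "role" is a PFCG role and the "permission" maps loosely to an authorization object, but the sketch below is deliberately generic; every role, user, and permission name in it is hypothetical:

```python
# Generic RBAC model: users -> roles -> permissions.
ROLE_PERMISSIONS = {
    "AP_CLERK":   {"invoice:create", "invoice:display"},
    "AP_MANAGER": {"invoice:display", "invoice:approve"},
    "AUDITOR":    {"invoice:display", "auditlog:display"},
}

USER_ROLES = {
    "alice": {"AP_CLERK"},
    "bob":   {"AP_CLERK", "AP_MANAGER"},  # segregation-of-duties risk
    "carol": {"AUDITOR"},
}

def permissions_for(user):
    """Union of the permissions granted by all of the user's roles."""
    return set().union(*(ROLE_PERMISSIONS[r] for r in USER_ROLES.get(user, set())))

def is_authorized(user, permission):
    return permission in permissions_for(user)

def sod_conflict(user, conflicting=("invoice:create", "invoice:approve")):
    """A basic access-review check: can one user both create and approve?
    This is the kind of rule a periodic access review would flag."""
    perms = permissions_for(user)
    return all(p in perms for p in conflicting)

print(is_authorized("alice", "invoice:approve"))  # False
print(sod_conflict("bob"))                        # True
```

A real GRC Access Control ruleset works on far richer data (org levels, field values, mitigating controls), but the review logic reduces to checks of this shape.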

Posted 2 weeks ago

Apply

5.0 - 10.0 years

25 - 27 Lacs

Bengaluru

Hybrid

Key Skills: SAP S/4HANA Security, RBAC, GRC Access Control, SAP BTP Security, IAM, SU01, PFCG, SU24, ABAP Security, SAP Fiori Security, Identity Authentication Service, Single Sign-On (SSO), OAuth, SAML.

Roles & Responsibilities:
- Design, implement, and maintain role-based access control (RBAC) concepts in SAP S/4HANA.
- Develop and maintain authorization concepts aligned with business requirements and compliance standards.
- Create and maintain custom roles and authorization objects.
- Perform security audits and access reviews.
- Troubleshoot authorization-related issues.
- Provide documentation for authorization concepts and role designs.
- Support SAP security implementations and upgrade projects.
- Collaborate with business process owners for role design and access management.
- Handle user access management and periodic access reviews.

Experience Requirement:
- 5-10 years of experience in SAP Security and Authorization.
- Strong expertise in SAP S/4HANA security and authorization concepts.
- In-depth knowledge of GRC Access Control.
- Experience with User Access Management and Role Design.
- Proficiency in authorization trace analysis and troubleshooting.
- Understanding of security audit logs and their implementation.
- Knowledge of SAP security best practices and industry standards.

Education: Masters, B.Tech/M.Tech (Dual), MCA, B.Tech, M.Tech.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

2 - 5 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Design, deploy, and maintain cloud infrastructure and services on Microsoft Azure.
- Manage and monitor Azure resources such as Virtual Machines, App Services, Azure SQL Database, Storage, and Networking.
- Implement Infrastructure as Code (IaC) using tools like Azure Resource Manager (ARM) templates, Terraform, or Bicep.
- Configure and manage Azure security, including identity and access management (Azure AD), network security groups, and key vaults.
- Support migration of on-premises applications and data to the Azure cloud.
- Automate deployment pipelines using Azure DevOps, GitHub Actions, or other CI/CD tools.
- Monitor cloud performance, troubleshoot issues, and optimize costs.
- Collaborate with developers, security teams, and stakeholders on cloud adoption and governance.

Key Skills Required:
- Strong experience with Microsoft Azure services (VMs, App Services, Azure SQL, Azure Functions, Azure Storage, Azure Networking)
- Proficiency in Infrastructure as Code (ARM templates, Terraform, Bicep)
- Experience with Azure Active Directory, Role-Based Access Control (RBAC), and security best practices
- Familiarity with Azure DevOps or other CI/CD pipelines
- Knowledge of scripting languages such as PowerShell, Azure CLI, or Python
- Understanding of cloud architecture patterns and best practices

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a hands-on backend expert, you will take our FastAPI-based platform to the next level by building production-grade model-inference services, agentic AI workflows, and seamless integrations with third-party LLMs and NLP tooling. Please note that this role is being hired for one of our client companies; the company name will be disclosed during the interview process.

In this role, you will work on the following key areas:

Core Backend Enhancements:
- Build APIs
- Harden security with OAuth2/JWT, rate-limiting, and SecretManager, and add observability with structured logging and tracing
- Implement CI/CD, test automation, health checks, and SLO dashboards

Awesome UI Interfaces:
- Develop UI interfaces using React.js/Next.js, Redux/Context, and CSS frameworks such as Tailwind, MUI, custom CSS, and Shadcn

LLM & Agentic Services:
- Design micro/mini-services to host and route to OpenAI, Anthropic, local HF models, embeddings, and RAG pipelines
- Implement autonomous/recursive agents that orchestrate multi-step chains including tools, memory, and planning

Model-Inference Infrastructure:
- Set up GPU/CPU inference servers behind an API gateway
- Optimize throughput with batching, streaming, quantization, and caching using technologies like Redis and pgvector

NLP & Data Services:
- Own the NLP stack, focusing on Transformers for classification, extraction, and embedding generation
- Build data pipelines that combine aggregated business metrics with model telemetry for analytics

Tech stack:
- Python, FastAPI, Starlette, Pydantic
- Async SQLAlchemy, Postgres, Alembic, pgvector
- Docker, Kubernetes, or ECS/Fargate on AWS or GCP
- Redis, RabbitMQ, Celery for jobs and caching
- Prometheus, Grafana, OpenTelemetry
- HuggingFace Transformers, LangChain, Torch, TensorRT
- OpenAI, Anthropic, Azure OpenAI, Cohere APIs
- Pytest, GitHub Actions
- Terraform or CDK

To be successful in this role, you must have:
- 3+ years of experience building production Python REST APIs using FastAPI, Flask, or Django-REST
- Strong SQL schema design and query optimization skills in Postgres
- Deep knowledge of async patterns and concurrency
- Hands-on experience with UI applications that integrate with backend APIs
- Experience with RAG, LLM/embedding workflows, prompt engineering, and agent-ops frameworks
- Cloud container orchestration experience
- Proficiency in CI/CD pipelines and infrastructure-as-code

Nice-to-have experience includes familiarity with streaming protocols, NGINX Ingress, RBAC, multi-tenant SaaS security, data privacy, event-sourced data models, and more.

This role is crucial as our products are live and evolving rapidly. You will own systems end-to-end, scale AI services, work closely with the founder, and shape the future of our platform. If you are seeking meaningful ownership and enjoy challenging, forward-looking problems, this role is for you.
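The "rate-limiting" hardening item in this posting is most often a token bucket placed in front of the API. A minimal framework-agnostic sketch follows; in a multi-worker FastAPI deployment the bucket state would typically live in Redis rather than process memory, and per-client buckets would be keyed by API key or IP (the fake clock here exists only to keep the demo deterministic):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: allows bursts up to `capacity`,
    refilling at `rate` tokens per second."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)  # start full
        self.clock = clock             # injectable for tests/demos
        self.last = clock()

    def allow(self):
        """Consume one token if available; return whether the request passes."""
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Deterministic demo with a fake clock: a burst of 5 is allowed, the
# 6th request is rejected, then one token refills after a simulated second.
t = [0.0]
bucket = TokenBucket(rate=1.0, capacity=5, clock=lambda: t[0])
results = [bucket.allow() for _ in range(6)]
print(results)  # five True, then False
t[0] += 1.0     # advance fake time by 1s -> one token refilled
print(bucket.allow())  # True
```

Wired into FastAPI, `allow()` would run in a dependency or middleware and a `False` result would return HTTP 429.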

Posted 2 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Mumbai, Gurugram, Chennai

Work from Office

Your role
We are seeking a skilled and experienced IAM Engineer to join our team. The ideal candidate will be responsible for implementing, operating, and maintaining enterprise Entra ID (Azure Active Directory), Intune, and related core technologies. This role involves designing and supporting enterprise infrastructure solutions and collaborating with the Security team to ensure robust protection of systems and data.

- Implement, operate, and maintain Entra ID, Intune, and associated technologies.
- Design and support enterprise infrastructure solutions.
- Act as a Subject Matter Expert (SME) for identity and access management, including federation, access management, authentication, access control, and service provisioning.
- Collaborate with the Security team to implement and manage security measures.
- Troubleshoot and resolve Entra ID (Azure AD) related issues and incidents.
- Understand system dependencies and their impact on business operations.
- Perform capacity planning and infrastructure analysis.
- Analyze complex cross-functional and cross-platform issues.

Your profile
- Expertise in Entra ID (Azure Active Directory) and Microsoft Intune, including configuration, management, and troubleshooting.
- Proven experience with Azure Single Sign-On (SSO), Conditional Access, RBAC, and the Microsoft Security Stack.
- Strong hands-on knowledge of Active Directory, GPO, DNS, DHCP, AAD Connect, ADCS, and Azure App Proxy.
- Proficient in PowerShell and Azure CLI for automation, reporting, and infrastructure management.
- Familiarity with Microsoft 365, cloud technologies, and implementing Multi-Factor Authentication (MFA) using Entra MFA.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, and partner coverage or new parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.

Location: Gurugram, Chennai, Mumbai, Pune, Hyderabad, Bengaluru

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 20 Lacs

Bengaluru

Remote

Greetings! We have an urgent opening for an Azure DevOps L1 Engineer (Remote).

Role: Azure DevOps L1 Engineer
Location: Remote
Duration: Long-term contract
Budget: 18 LPA
Shift: Rotational (9am-6pm, 12pm-9pm, 6pm-3am)
Notice: Immediate to 15-day joiners

JD:
- Certifications: A professional-level Azure certification (DevOps or Architect).
- Version Control: Expertise in version control systems, particularly Git, for managing and tracking code changes.
- Scripting Skills: Strong PowerShell, Bash, or Python scripting skills for automating tasks and processes.
- Containerization Knowledge: Understanding of containerization technologies like Docker and orchestration platforms like Kubernetes on Azure Kubernetes Service (AKS).
- Security Best Practices: Familiarity with security best practices, including role-based access control (RBAC), Azure Policy, and managing secrets with tools like Azure Key Vault.
- Problem-Solving Abilities: Strong problem-solving abilities to troubleshoot and resolve complex technical issues related to DevOps processes.
- Knowledge of other clouds and AI is a plus.

If you're interested, please send your resume to suhas@iitjobs.com.
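The scripting skills this role asks for usually mean small glue utilities around flaky cloud APIs and CLIs. One staple of such automation, sketched here in plain Python with no Azure dependency (the "flaky operation" is a stand-in for any transient-failure-prone call), is retry with exponential backoff:

```python
import time

def retry(func, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call `func` until it succeeds, sleeping base_delay * 2**n between
    failures (0.5s, 1s, 2s, ...). Re-raises the last error once all
    attempts are exhausted. `sleep` is injectable so demos/tests run
    instantly instead of actually waiting."""
    for n in range(attempts):
        try:
            return func()
        except Exception:
            if n == attempts - 1:
                raise
            sleep(base_delay * (2 ** n))

# Demo: an operation that fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

delays = []  # record the backoff schedule instead of sleeping
result = retry(flaky, sleep=delays.append)
print(result, delays)  # ok [0.5, 1.0]
```

Production versions add jitter and retry only on specific exception types (e.g. throttling errors), but the structure is the same.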

Posted 2 weeks ago

Apply

6.0 - 10.0 years

25 - 27 Lacs

Noida

Work from Office

Hiring an OpenShift L3 Support Engineer with 6+ years of experience for 24x7 onsite support in Noida. The role involves full lifecycle OpenShift cluster management, CI/CD, monitoring, patching, and RCA delivery.

Required Candidate Profile: Experienced OpenShift L3 Engineer with 6+ years in container management, CI/CD, monitoring (Zabbix/Grafana), patching, and RCA reporting. OCP certified and available for 24x7 onsite support in Noida.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

12 - 13 Lacs

Noida

Work from Office

Looking for an OpenShift L2 Support Engineer with 4-6 years of experience for onsite support in Noida/Delhi. Must manage OpenShift clusters, container lifecycle, CI/CD, monitoring, patching, and incident resolution in a 24x7 setup.

Required Candidate Profile: Experienced L2 Support Engineer with 4-6 years in the OpenShift container platform, CI/CD pipelines, monitoring, OS patching, IAM, and troubleshooting. Available for 24x7 onsite support in Noida/Delhi.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 20 Lacs

Noida

Work from Office

SDG is a global cybersecurity, identity governance, risk consulting, and advisory company that advises and partners with clients to address their complex security, compliance, and technology needs, and delivers on strategy, transformation, and long-term management of their cybersecurity and IAM programs. We help some of the largest brands in the world realize their business vision through a mix of strategic advice, expert systems integration, relevant technology recommendations, and smart managed services. Our value proposition to our customers is that we bring thought leadership to the table in each of our domains, a passion for customer success, and an eye to risk management in everything we do. We are looking for you to join our SDG family!

We are seeking a highly skilled and experienced IdentityNow Engineer to join our team. As an IdentityNow Engineer, you will be responsible for designing, implementing, and maintaining the identity and access management (IAM) infrastructure. You will collaborate with cross-functional teams to ensure the secure and efficient management of identities, access rights, and application onboarding processes. Looking for candidates serving notice or immediately available.

Position: SailPoint ISC
Experience: 2+ years
Location: Noida or Remote
Shift Timing: 1PM to 10PM or 4PM to 1AM

Key Responsibilities:
- 2 to 4 years of industry experience in Identity and Access Management (IAM).
- 2 to 4 years of experience in developing, implementing, or architecting information systems.
- 1 to 2 years of experience with technical architecture, including integrating identity management and access governance software into client infrastructures and applications.
- Practical experience using IAM or Access Governance platforms.
- Preferable working knowledge and/or experience with tools such as SailPoint (Identity Security Cloud), ForgeRock, CyberArk, and OutSystems.
- Understanding and familiarity with operating systems (Windows, Unix, Linux).
- Relevant experience with programming languages including Java, JavaScript, SQL, and Python.
- Required experience in Amazon Web Services (AWS), including services such as EC2, RDS, S3, Route 53, SES, VPC, Security Hub, WAF, AWS ALB or NLB, Secrets Manager, CloudWatch, Lambda, and AWS Glue.
- A bachelor's degree in Computer Science, Cyber Security, Information Security, or a related field is highly recommended.

Posted 2 weeks ago

Apply

10.0 - 11.0 years

22 - 25 Lacs

Pune

Work from Office

Define the overall architecture using Azure App Services, Blob Storage, Azure Cognitive Services, and RBAC security. Design for scalability, modularity, and secure document handling. Plan metadata models, tagging, and content relationships. Integrate with an identity provider.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

30 - 45 Lacs

Noida

Remote

NP: 30 days max. Strong experience in Azure VNet architecture, Azure Landing Zones, CAF, Azure App Services and networking, Firewall, compliance frameworks, and RBAC. Architecture diagrams and documentation. Share CV at kavita.singh@elevancesystems.com

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have an exciting opportunity to join our team as a Splunk Enterprise Security Specialist in Hyderabad. You should have 5-8 years of experience and expertise in Splunk ES architecture. Your responsibilities will include integrating Splunk with various security tools and technologies across domains such as the Process Control Domain (OT) and the Operations Domain (IT). You will administer and manage the Splunk deployment for optimal performance, implement RBAC, and develop custom Splunk add-ons for ingesting, parsing, and filtering incoming logs.

Collaborating with SOC team members, you will understand security requirements and objectives, implementing Splunk solutions to enhance threat detection and incident response capabilities. You will integrate different security controls and devices such as firewalls, Endpoint Detection and Response (EDR) systems, proxies, Active Directory (AD), and threat intelligence platforms. Your role will involve developing custom Splunk correlation searches, dashboards, and reports to identify security incidents, investigate alerts, and provide actionable insights to SOC analysts. You will also create highly efficient custom dashboards for different teams to facilitate investigations of security risks, threats, and vulnerabilities.

Additionally, you will conduct threat hunting exercises using Splunk to proactively identify and mitigate potential security threats and vulnerabilities. You will assist in the development and refinement of SOC processes and procedures, leveraging Splunk to streamline workflows and enhance operational efficiency. Your responsibilities will also include implementing Splunk for automation of SOC SOP workflows. To be successful in this role, you should have experience in designing and implementing Splunk ES architecture, integration with security tools and technologies, security monitoring, incident response, security analytics, and reporting, along with strong collaboration and communication skills.

You will also be responsible for the implementation and management of Splunk Enterprise Security, migration/scaling of the Splunk environment from Windows to Linux, and enhancing performance, reliability, and availability. Finally, you will implement and integrate the SOAR platform (Splunk Phantom) and User Behavior Analytics (Splunk UBA/UEBA) with the existing Splunk infrastructure, supporting and enhancing operations with automation wherever possible.
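Correlation searches of the kind this role develops are written in Splunk's SPL, not Python; purely to show the underlying detection logic, here is a plain-Python stand-in for a classic brute-force-login rule (threshold of failed logins per user within a time window — the field names, threshold, and window are illustrative, not from any real ruleset):

```python
from collections import defaultdict

def brute_force_alerts(events, threshold=5, window=300):
    """Flag users with >= `threshold` failed logins inside a `window`
    (seconds). Equivalent in spirit to an SPL search that buckets
    failed-auth events by user over a 5-minute span and alerts on the
    count. `events` are (timestamp, user, outcome) tuples, time-sorted."""
    failures = defaultdict(list)  # user -> timestamps of recent failures
    alerts = set()
    for ts, user, outcome in events:
        if outcome != "failure":
            continue
        # Keep only failures still inside the sliding window, then add this one.
        recent = [t for t in failures[user] if ts - t <= window]
        recent.append(ts)
        failures[user] = recent
        if len(recent) >= threshold:
            alerts.add(user)
    return alerts

# Demo: six rapid failures for one user trips the rule; a single failure
# followed by a success for another user does not.
events = [(i * 10, "mallory", "failure") for i in range(6)] + \
         [(100, "alice", "failure"), (200, "alice", "success")]
print(brute_force_alerts(events))  # {'mallory'}
```

In production the same rule would also carry suppression/throttling so a single noisy host doesn't re-fire the alert every event.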

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As a Senior Manager specializing in Data Analytics & AI, you will be a pivotal member of the EY Data, Analytics & AI Ireland team. Your role as a Databricks Platform Architect will involve enabling clients to extract significant value from their information assets through innovative data analytics solutions. You will have the opportunity to work across various industries, collaborating with diverse teams and leading the design and implementation of data architecture strategies aligned with client goals. Your key responsibilities will include leading teams with varying skill sets in utilizing different Data and Analytics technologies, adapting your leadership style to meet client needs, creating a positive learning culture, engaging with clients to understand their data requirements, and developing data artefacts based on industry best practices. Additionally, you will assess existing data architectures, develop data migration strategies, and ensure data integrity and minimal disruption during migration activities. To qualify for this role, you must possess a strong academic background in computer science or related fields, along with at least 7 years of experience as a Data Architect or similar role in a consulting environment. Hands-on experience with cloud services, data modeling techniques, data management concepts, Python, Spark, Docker, Kubernetes, and cloud security controls is essential. Ideally, you will have the ability to effectively communicate technical concepts to non-technical stakeholders, lead the design and optimization of the Databricks platform, work closely with the data engineering team, maintain a comprehensive understanding of the data pipeline, and stay updated on new and emerging technologies in the field. EY offers a competitive remuneration package, flexible working options, career development opportunities, and a comprehensive Total Rewards package. 
Additionally, you will benefit from support, coaching, opportunities for skill development, and a diverse and inclusive culture that values individual contributions.

If you are passionate about leveraging data to solve complex problems, drive business outcomes, and contribute to a better working world, consider joining EY as a Databricks Platform Architect. Apply now to be part of a dynamic team dedicated to innovation and excellence.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Databricks platform administrator, you will be responsible for managing the Databricks platform and associated cloud resources. Your primary focus will be on ensuring the optimal performance, security, and efficiency of Databricks clusters and workspaces. This is a long-term contract role based in Bengaluru, Karnataka, with a hybrid work arrangement. You must have at least 5 years of experience working with the Databricks platform specifically as an administrator, not as a data engineer. In addition, cloud experience is required for this role.

Your responsibilities will include configuring, deploying, and maintaining Databricks clusters and workspaces using tools like Terraform. You will monitor cluster performance, troubleshoot issues, and optimize configurations for performance and cost-effectiveness. Security is a key aspect of the role, as you will manage access controls and encryption mechanisms and implement security policies to protect sensitive data.

Collaboration is essential in this role, as you will work closely with application development teams, data engineers, data scientists, and business analysts to understand their requirements and provide technical solutions. You will also conduct training sessions to educate users on platform best practices and capabilities. In addition, you will be responsible for managing platform costs, implementing backup and disaster recovery strategies, and integrating Databricks with other data sources, data warehouses, and data lakes. Working within an Agile delivery/DevOps methodology, you will support the application development teams in debugging and issue resolution.

Overall, as a Databricks platform administrator, you will play a crucial role in ensuring the smooth operation and continuous improvement of the Databricks platform to meet the organization's data processing and analytics needs.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Job Title: Cloud Architect
Industry: [Insert Industry, e.g., IT Services, Financial Services, Healthcare]

Job Summary: We are seeking a skilled and strategic Cloud Architect with deep expertise in Azure (preferred), GCP, and AWS to lead cloud transformation initiatives. The ideal candidate will have a strong background in Infrastructure as Code (IaC), cloud security, governance, and CI/CD pipelines, with a good understanding of Data and AI workloads. This role demands excellent stakeholder management and the ability to thrive in challenging and ambiguous environments.

Key Responsibilities:
Design and implement scalable, secure, and cost-effective cloud architectures across Azure, GCP, and AWS.
Lead cloud strategy and transformation initiatives aligned with business goals.
Develop and maintain IaC using Terraform, Bicep, and ARM templates.
Implement and manage cloud security using Azure Policy, Key Vault, and Defender for Cloud.
Establish CI/CD pipelines using GitHub Actions and Azure DevOps.
Define and enforce governance models including RBAC, custom policies, and Zero Trust architectures.
Collaborate with data and AI teams to support infrastructure needs for advanced workloads.
Optimize cloud cost management and ensure compliance with organizational policies.
Provide technical leadership and mentorship to engineering teams.
Engage with stakeholders to understand requirements, communicate solutions, and drive adoption.

Required Skills & Qualifications:
Proven experience with Azure (preferred), GCP, and AWS.
Strong proficiency in Terraform, Bicep, and ARM templates.
Hands-on experience with Azure Policy, Key Vault, and Defender for Cloud.
Expertise in CI/CD tools: GitHub Actions, Azure DevOps.
Deep understanding of cloud governance, RBAC, and Zero Trust models.
Familiarity with cloud infrastructure for Data and AI workloads (preferred).
Excellent stakeholder management and communication skills.
Ability to work effectively in challenging and ambiguous environments.
Strong problem-solving and analytical skills.

Preferred Certifications:
Microsoft Certified: Azure Solutions Architect Expert
Google Cloud Certified: Professional Cloud Architect
AWS Certified Solutions Architect - Professional
Certified Terraform Associate (HashiCorp)
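The governance requirements above center on RBAC. As a minimal illustration of the concept, a role-based access check reduces to set membership; the role and permission names below are purely hypothetical and not tied to any particular cloud provider's model:

```python
# Minimal RBAC sketch: each role maps to a set of permissions, and an
# access check is a set-membership test. Names are illustrative only.
ROLE_PERMISSIONS = {
    "reader": {"storage:read"},
    "contributor": {"storage:read", "storage:write"},
    "owner": {"storage:read", "storage:write", "rbac:assign"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Return True if the given role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("reader", "storage:write"))  # False
print(is_allowed("owner", "rbac:assign"))     # True
```

Real cloud RBAC adds scopes (subscription, resource group, resource) and role inheritance on top of this basic mapping, but the allow/deny decision is the same lookup at its core.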

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description
We are seeking an experienced Azure DevOps Engineer to develop and manage Infrastructure as Code (IaC) using Bicep for Azure PaaS (Platform as a Service) solutions with a focus on private networking. In this role, you will collaborate with development, security, and infrastructure teams to automate and streamline the deployment of secure, scalable, and resilient cloud environments.

Responsibilities
Infrastructure as Code (IaC) Development:
Design, develop, and maintain Bicep templates to provision and manage Azure PaaS resources such as App Services, Azure Functions, Logic Apps, Event Hubs, Service Bus, Azure SQL, Azure Storage, Key Vault, API Management, and Cosmos DB.
Ensure IaC templates are modular, reusable, and follow best practices.
Implement parameterization, modules, and consistent naming conventions to enhance template flexibility.

Private Networking & Security:
Architect and deploy private networking solutions using Private Endpoints, Private Link, Virtual Networks (VNets), Subnets, and Network Security Groups (NSGs).
Configure service integrations with private networking, ensuring traffic stays within the Azure backbone.
Use Azure Web Application Firewall and policies to enhance security.
Use Azure Key Vault to securely store and manage secrets, certificates, and keys.

CI/CD Pipelines & Automation:
Build and manage CI/CD pipelines in Azure DevOps to automate the deployment of Bicep-based PaaS environments.
Integrate reusable Bicep modules into IaC pipelines for consistent deployment.
Use release gates, approvals, and checks to ensure compliance and security in deployment processes.

Monitoring & Optimization:
Implement Azure Monitor, Application Insights, and Log Analytics to monitor PaaS environments.
Create alerts and dashboards to ensure performance, availability, and security visibility.
Optimize PaaS resources for cost, performance, and reliability.

Collaboration & Documentation:
Collaborate with cloud architects, security teams, and application developers to design and implement PaaS solutions.
Document infrastructure, deployment processes, and Bicep modules.
Provide guidance and training to development teams on Bicep and Azure networking best practices.

Skills
Must have
Education & Experience:
Bachelor's degree in Computer Science, Information Technology, or a related field.
5+ years of experience in Azure DevOps engineering with a focus on PaaS solutions.
Hands-on experience with Azure Bicep and private networking.
Technical Skills:
Strong proficiency with Bicep for IaC, including modular templates and reusable components.
Experience with Azure PaaS services (App Services, Functions, SQL, Storage, API Management).
Expertise in Azure networking, including Service Endpoints, Private Endpoints, Private Link, VNets, Network Security Groups (NSGs), and Application Gateway.
Proficiency in CI/CD pipelines using Azure DevOps.
Scripting skills in PowerShell, Bash, or Python for automation tasks.
Familiarity with Azure RBAC and role-based security models.
Soft Skills:
Strong problem-solving and troubleshooting skills.
Effective collaboration and communication abilities.
Detail-oriented with a focus on cloud security and performance.
Nice to have
Azure certifications (e.g., Azure DevOps Engineer Expert, Azure Solutions Architect Expert, or Azure Administrator Associate).
Knowledge of microservices architecture and containerization (AKS).
Familiarity with Azure Policy and Azure Blueprints.
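The parameterization practice described above (per-environment settings feeding modular templates) can be sketched with a small helper that emits an ARM-style parameters document. The environment names and parameter values below are hypothetical, but the document shape follows the public ARM deployment-parameters schema:

```python
import json

# Hypothetical per-environment settings; a real project would keep these
# under version control alongside the Bicep templates they drive.
ENVIRONMENTS = {
    "dev":  {"skuName": "S1",   "enablePrivateEndpoint": False},
    "prod": {"skuName": "P1v3", "enablePrivateEndpoint": True},
}

def build_parameters(env: str) -> dict:
    """Build an ARM-style deployment parameters document for one environment."""
    settings = ENVIRONMENTS[env]
    return {
        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {name: {"value": value} for name, value in settings.items()},
    }

print(json.dumps(build_parameters("prod"), indent=2))
```

Generating parameter files this way keeps environment drift visible in one place; the same Bicep template then deploys identically to dev and prod with only the parameter document changing.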

Posted 3 weeks ago

Apply

6.0 - 8.0 years

13 - 17 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description
Luxoft is looking for experienced and enthusiastic Appian developers to be part of our growing Appian Digital Automation Practice in India. You will have the opportunity to work with our reputed global clientele on change and digital transformation projects and be part of the Appian CoE for our clients. You will be part of high-calibre project teams comprising engineers with deep technical expertise and domain experience, and will have a chance to learn, grow, and progress in your career. We encourage cross-training in other BPM/RPA and related technologies through certification reimbursements. Luxoft offers a competitive compensation and benefits package for motivated and deserving candidates. The insurance benefits of Luxoft India are among the best in the industry. As you progress through your career with Luxoft India, you will also have the opportunity to apply for roles in overseas locations of Luxoft through our flagship Internal Mobility (IM) program.

Responsibilities
Collaborate with business partners to understand their needs and challenges, gathering requirements.
Design, develop, configure, test, deploy, implement, maintain, and enhance Appian-based custom Business Process Management (BPM) workflows across the enterprise.
Design and implement complex enterprise integrations with industry-standard technologies, including LDAP, Active Directory, and other internal systems in the enterprise.
Design and develop database objects to support BPM applications.
Maintain database performance by identifying and resolving production and application development problems.
Participate in workflow analysis and modelling utilizing SQL Server and SQL Management Studio.
Translate BPM requirement specifications into Appian process model prototypes and solutions, ensuring the highest quality and performance.
Deliver projects in either Agile or Waterfall software development methodologies, as the project dictates.
Conduct and participate in detailed design reviews and validate that the design follows the approved architecture.
Participate in the day-to-day activities of the Appian BPM development teams. The work is primarily in the development area; however, you may also be required to perform Appian installation, environment management, new environment creation, Appian upgrades, deployments, database performance management, and related activities.
Collaborate with application support teams throughout the development, deployment, and support phases.

Skills
Must have
You are expected to:
Possess a bachelor's degree in Computer Science, Engineering, or a related technical field from an accredited university or college.
Have 6-8 years of software engineering experience, demonstrating proficiency in one or more of the following technologies: Appian, Scripting, REST API services, Java, and JavaScript.
Have completed end-to-end implementation for at least 3 Appian projects.
Exhibit expertise in integrations utilizing Web APIs and Integrations, with the capability to tailor these functionalities to suit external system requirements.
Showcase a robust grasp of database principles, coupled with practical experience in stored procedures, MySQL, or SQL Server.
Demonstrate experience in Appian Process Models, Rules & Expressions, Data Management including Complex Data Types, XSDs, SQL, Inter-Process Communication, RBAC, and Process Reporting on Appian 20.X and above.
Possess Appian certification as per your experience level.
Embrace an agile mindset, have familiarity with Agile methodologies, and have a knack for crafting solutions that facilitate business process automation.
Offer a blend of technical prowess, strong business acumen, leadership abilities, and a track record of effective knowledge transfer.
Nice to have
Appian, SQL, Integrations, Agile Methodology, Scripting

Posted 3 weeks ago

Apply

7.0 - 12.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project description
Luxoft has been asked to contract a Developer in support of a number of customer initiatives. The primary objective is to develop solutions based on client requirements in a Telecom/network environment.

Responsibilities
A Data Engineer with experience in the following technologies:
Databricks and Azure; Apache Spark-based development with hands-on Python, SQL, and Apache Airflow.
Databricks clusters for ETL processes.
Integration with ADLS and Blob Storage.
Efficiently ingest data from various sources, including on-premises databases, cloud storage, APIs, and streaming data.
Use Azure Key Vault for managing secrets.
Hands-on experience working with APIs.
Hands-on experience with Kafka/Azure Event Hubs streaming.
Hands-on experience with Databricks Delta APIs and Unity Catalog.
Hands-on experience working with version control tools (GitHub).
Data Analytics: support for various ML frameworks and integration with Databricks for model training.
On-prem exposure: Linux-based systems and Unix scripting.

Skills
Must have
Python, Apache Airflow, Microsoft Azure and Databricks, SQL, Databricks clusters for ETL, ADLS, Blob Storage, ingestion from various sources including databases, cloud storage, APIs, and streaming data, Kafka/Azure Event Hubs, Databricks Delta APIs and Unity Catalog.
Education: Typically, a Bachelor's degree in Computer Science (preferably M.Sc. in Computer Science), Software Engineering, or a related field is required.
Experience: 7+ years of experience in development or related fields.
Problem-Solving Skills: Ability to troubleshoot and resolve issues related to application development and deployment.
Communication Skills: Ability to effectively communicate technical concepts to team members and stakeholders, both in writing and verbally.
Teamwork: Ability to work effectively in teams with diverse individuals and skill sets.
Continuous Learning: Given the rapidly evolving nature of data technologies, a commitment to learning and adapting to new technologies and methodologies is crucial.
Nice to have
Snowflake, PostgreSQL, Redis exposure.
GenAI exposure.
Good understanding of RBAC.
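The ingestion duties above (pulling only new records from databases, APIs, or streams into Delta tables) commonly rely on a watermark column. A minimal stdlib sketch of that pattern, with hypothetical record shapes standing in for real source rows:

```python
from datetime import datetime

# Hypothetical source rows; in practice these would come from a database,
# an API page, or a streaming micro-batch.
SOURCE = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
    {"id": 3, "updated_at": datetime(2024, 1, 9)},
]

def extract_incremental(rows, watermark):
    """Return rows newer than the last-seen watermark, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

fresh, wm = extract_incremental(SOURCE, datetime(2024, 1, 3))
print(len(fresh), wm)  # 2 2024-01-09 00:00:00
```

In a Databricks pipeline the same idea is expressed through Delta change feeds or Structured Streaming checkpoints; the persisted watermark is what makes reruns idempotent.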

Posted 3 weeks ago

Apply

6.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project description
Luxoft is seeking skilled and passionate Appian developers to join our expanding Appian Digital Automation Practice in India. As part of our team, you'll collaborate with esteemed global clients on change and digital transformation initiatives and contribute to our clients' Appian Centers of Excellence. You'll work alongside accomplished project teams comprised of engineers with profound technical knowledge and domain expertise, providing ample opportunities for learning, advancement, and career development. We actively promote cross-training in other BPM/RPA and related technologies by offering certification reimbursements. Luxoft offers a competitive compensation and benefits package for dedicated and deserving individuals. Luxoft India's insurance benefits are recognized as some of the best in the industry. Furthermore, as you progress in your career with Luxoft India, you'll have the chance to explore roles in Luxoft's international locations through our flagship Internal Mobility (IM) program.

Responsibilities
Collaborate with business partners to understand their needs and challenges, gathering requirements.
Design, develop, test, deploy, maintain, and enhance Appian-based solutions based on prioritized stakeholder requirements, ensuring the highest quality and performance.
Assist with DevOps and other stretch initiatives as advised by clients from time to time. The work is performed in the change area; however, you will be required to perform production verification testing of the code you have created during the production release cycle.
Lead various aspects of solution delivery in partnership with internal teams and external partners.
Proactively work with team members and independently to assess and explore new technologies, conduct proofs of concept to test these, and share findings with the wider team within the Appian Practice.
Proactively identify and communicate options and solution recommendations to business and IT leaders, with benefits and drawbacks.
Take the initiative to identify and communicate issues, risks, and progress to both business and IT stakeholders, presenting available options and recommending next steps.
Foster innovation in solution creation.
Collaborate with application support teams throughout the development, deployment, and support phases.
Continuously learn and seek opportunities to enhance technical skills and expand business knowledge within the Luxoft Appian and Automation Practice.
Assist in talent selection and hiring activities.

Skills
Must have
You are expected to:
Have 6-8 years of software engineering experience, demonstrating proficiency in one or more of the following technologies: Appian, Scripting, REST API services, Java, and JavaScript.
Possess 6+ years of experience in Appian and have completed end-to-end implementation for at least 3 Appian projects.
Exhibit expertise in integrations utilizing Web APIs and Integrations, with the capability to tailor these functionalities to suit external system requirements.
Showcase a robust grasp of database principles, coupled with practical experience in stored procedures, MySQL, or SQL Server.
Demonstrate experience in Appian Process Models, Rules & Expressions, Data Management including Complex Data Types, XSDs, SQL, Inter-Process Communication, RBAC, and Process Reporting on Appian 23.X and above.
Have knowledge of available Appian Components, Custom Components, Plug-ins, Smart Services, Patterns, and use of the Design Library.
Possess Appian certification as per your experience level.
Embrace an agile mindset and possess a knack for crafting solutions that facilitate business process automation.
Offer a blend of technical prowess, strong business acumen, leadership abilities, and a track record of effective knowledge transfer.
Possess a bachelor's degree in Computer Science, Engineering, or a related technical field from an accredited university or college.
Nice to have
Java, RPA, Solution Mindset, Requirements Analysis, Logical Reasoning

Posted 3 weeks ago

Apply

7.0 - 8.0 years

27 - 42 Lacs

Chennai

Work from Office

Job Summary
As an App Security Specialist, you will play a crucial role in safeguarding our digital assets by implementing and managing identity and access management solutions. With a focus on SailPoint IdentityIQ, you will ensure robust identity governance and management practices. This hybrid role requires adaptability to rotational shifts, offering a dynamic work environment.

Responsibilities
Hands-on experience in the implementation, administration, configuration, and support of SailPoint and supporting technologies.
Experience in integrating SailPoint with HR systems (Workday, PeopleSoft, etc.), enterprise infrastructure platforms (Unix, databases, Active Directory, LDAP, ACF2, etc.), and business applications.
Strong understanding of user life cycle, RBAC policies, enterprise roles, rules, lifecycle events, and provisioning workflows to enable the engineering and onboarding of systems and applications on the SailPoint platform.
Working knowledge of IAM industry standards and protocols, including SAML, OpenID Connect, OAuth, RBAC, LDAP, Kerberos, etc.
Strong programming skills (Java, BeanShell, JSP/Servlets, Perl, Unix shell scripts, Batch, PowerShell, VBScript, SQL, PL/SQL, etc.) in a DevOps environment.
Experience with multiple operating systems such as UNIX, Windows, Linux, AIX, etc.
Web technologies (web services, RESTful API frameworks, application servers like Tomcat/JBoss, JSON, etc.).
Database technologies (Oracle, SQL Server).
Single Sign-On, MFA, SCIM, and federation.
Directory integration, including Active Directory, LDAP, and virtual directories.
Automation and/or scripting skills.
Must have a working knowledge of virtualization (e.g., VMware, Hyper-V) and LAN/WAN/firewall/VPN network technologies, monitoring, and support best practices.
Good understanding of the current regulatory environment and related implications for identity management, security, and audit compliance.
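The user-lifecycle and RBAC policy work described above typically starts with birthright provisioning: mapping HR attributes to the baseline roles a joiner receives automatically. A simplified sketch of that idea (hypothetical department-to-role rules, not SailPoint's actual API or policy language):

```python
# Hypothetical birthright rules: HR attributes determine the baseline roles a
# new joiner is provisioned with. Real IGA platforms express this as policy
# objects evaluated on lifecycle events (joiner/mover/leaver).
BIRTHRIGHT_RULES = {
    "Engineering": ["vpn-user", "git-user"],
    "Finance": ["vpn-user", "erp-user"],
}

def birthright_roles(identity: dict) -> list:
    """Compute baseline roles for a new joiner from HR attributes."""
    roles = ["email-user"]  # every joiner gets a mailbox
    roles += BIRTHRIGHT_RULES.get(identity.get("department", ""), [])
    return roles

print(birthright_roles({"name": "A. Dev", "department": "Engineering"}))
```

Mover and leaver events then re-evaluate the same rules, revoking roles the new attribute set no longer justifies; that re-evaluation loop is what keeps RBAC assignments aligned with the HR source of truth.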

Posted 3 weeks ago

Apply

5.0 - 9.0 years

6 - 16 Lacs

Gurugram

Work from Office

We are looking for a highly skilled DDI Engineer with expertise in Infoblox to manage and support enterprise-wide DNS, DHCP, and IPAM (DDI) infrastructure. The ideal candidate will be responsible for the configuration, administration, and lifecycle management of Infoblox appliances, ensuring a secure, reliable, and scalable IP address infrastructure.

Experience - 5 years
Location - Gurugram
Notice Period - Immediate Joiner

Key Responsibilities:
Design, implement, and maintain DDI services using Infoblox appliances (Grid Masters, Members).
Manage authoritative and recursive DNS zones, DHCP scopes, IP address reservations, and IPAM.
Administer the Infoblox NIOS platform, including upgrades, HA configuration, and patching.
Automate and orchestrate IP address provisioning and clean-up tasks.
Ensure DNSSEC, Anycast, and TSIG configurations are secure and compliant.
Integrate Infoblox with Active Directory, DHCP failover, and external DNS platforms.
Troubleshoot DDI-related issues such as name resolution failures, DHCP lease conflicts, or replication problems.
Implement IPAM policies and track IP usage across the network.
Collaborate with network, cloud, and cybersecurity teams to support project rollouts and DDI automation.
Monitor logs and alerts, analyse trends, and generate reports for capacity planning and audits.
Create and maintain SOPs, topology diagrams, and configuration documentation.

Required Skills & Qualifications:
Bachelor's degree in Information Technology, Computer Science, or a related field.
Strong understanding of:
DNS (authoritative, recursive, forwarding, reverse zones)
DHCP (failover, reservations, relay agents)
IP Address Management (IPAM)
Experience with Infoblox NIOS, Grid Master, HA pairs, and cloud-integrated IPAM.
Knowledge of networking concepts like subnets, routing, NAT, VLANs, and firewalls.
Experience with Role-Based Access Control (RBAC) in Infoblox.

Preferred Skills & Certifications:
Experience with Infoblox RESTful APIs.
Exposure to cloud platforms (AWS Route 53, Azure DNS, GCP Cloud DNS).
Familiarity with ITIL processes, change management, and incident response.
Understanding of security compliance standards (GDPR, etc.)
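The automation and RESTful API items above usually mean driving the Infoblox WAPI. As a sketch, the snippet below builds the URL and payload for creating a host record that takes the next available IP in a subnet; the grid hostname and FQDN are hypothetical, and the WAPI version and field names should be checked against your grid's WAPI documentation before use:

```python
# Sketch of an Infoblox WAPI request for a new host record using the
# next-available-IP function. No network call is made here; a real client
# would POST this payload with authentication and TLS verification.
def host_record_request(grid: str, fqdn: str, cidr: str, wapi_version: str = "v2.12"):
    """Build the (url, payload) pair for a record:host creation request."""
    url = f"https://{grid}/wapi/{wapi_version}/record:host"
    payload = {
        "name": fqdn,
        "ipv4addrs": [{"ipv4addr": f"func:nextavailableip:{cidr}"}],
    }
    return url, payload

url, payload = host_record_request("gm.example.com", "app01.example.com", "10.10.0.0/24")
print(url)
print(payload["ipv4addrs"][0]["ipv4addr"])
```

Wrapping payload construction in a helper like this keeps provisioning scripts testable without touching the grid, which matters for the clean-up and audit automation the role calls for.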

Posted 3 weeks ago

Apply