Jobs
Interviews

9 Key Vaults Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Network Operations Center (NOC) Analyst at Inspire Brands, you will oversee all technology aspects of the organization. You will act as the main technology expert for the NOC team, detecting and resolving production issues before they impact large-scale operations. You will ensure that the services provided by the Inspire Digital Platform (IDP) meet user needs for reliability, uptime, and continuous improvement, and you will help deliver an outstanding customer experience by establishing service level agreements that align with the business model.

On the technical side, you will develop and monitor dashboards to identify problems related to applications, infrastructure, and potential security incidents, and provide operational support for multiple large, distributed software applications. Your deep troubleshooting skills will be essential in improving availability, performance, and security to ensure 24/7 operational readiness. You will conduct thorough postmortems on production incidents to evaluate business impact and capture learnings for the Engineering team.

You will also create dashboards and alerts for monitoring the platform, define key metrics and service level indicators, and ensure the collection of relevant metric data so that actionable alerts reach the responsible teams. Participation in the 24/7 on-call rotation and automation of tasks to streamline application deployment and third-party tool integration will be crucial. You will analyze major incidents, collaborate with other teams on permanent solutions, and establish and publish regular KPIs and metrics for measuring performance, stability, and customer satisfaction.

Qualifications: a 4-year degree in Computer Science, Information Technology, or a related field; a minimum of 5 years of experience in a production support role, specifically supporting large-scale SaaS B2C or B2B cloud platforms, with a strong background in problem-solving and troubleshooting; and knowledge of technologies such as Java, TypeScript, Python, Azure cloud services, monitoring tools like Splunk and Prometheus, containers, Kubernetes, Helm, cloud networking, and firewalls.

Overall, this role requires strong technical expertise, effective communication skills, and a proactive approach to keeping Inspire Brands' technology infrastructure running smoothly.
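
To make the "service level indicators and actionable alerts" part concrete, here is a minimal sketch in Python of an availability SLI check; this is an illustration, not part of the posting, and every name, number, and the 99.9% objective below are hypothetical:

# Minimal sketch of an availability SLI check. The SLO target and the
# request counts are hypothetical illustrations, not posting details.

def availability_sli(successful_requests: int, total_requests: int) -> float:
    """Fraction of requests that succeeded over the measurement window."""
    if total_requests == 0:
        return 1.0  # no traffic: treat the objective as met
    return successful_requests / total_requests

SLO_TARGET = 0.999  # hypothetical "three nines" availability objective

def should_alert(successful_requests: int, total_requests: int) -> bool:
    """Fire an actionable alert only when the SLI drops below the SLO."""
    return availability_sli(successful_requests, total_requests) < SLO_TARGET

if __name__ == "__main__":
    # 99,850 of 100,000 requests succeeded -> SLI = 0.9985 < 0.999 -> alert
    print(should_alert(successful_requests=99_850, total_requests=100_000))

Gating alerts on an SLO threshold rather than on raw error counts is one common way to keep alerts actionable for the on-call rotation the posting describes.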

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking a Senior Python Backend Developer to join our team and take charge of building REST APIs and serverless functions in Azure using Python. Your main responsibility will be developing high-performance, responsive REST APIs to serve front-end requests. You will collaborate with team members working on various layers of the application and integrate front-end elements provided by co-workers, so a basic understanding of front-end technologies is important for this role.

Your duties will include independently delivering high-quality working software, writing secure and efficient code, and designing low-latency, high-availability applications. You will integrate user-facing elements developed by front-end developers, implement security measures, and ensure data protection. You will also work with Azure services such as Azure Functions, APIM, Azure Storage, and SQL and NoSQL databases, write automated tests, and integrate with Azure APIM, tracing, and monitoring services.

To excel in this role, you should have experience building Azure Functions with Python, be proficient in Python with knowledge of at least one Python web framework, and be familiar with ORM libraries. You should be able to integrate multiple data sources and databases into a cohesive system, have a basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3, and have experience with OAuth2/OIDC for securing the backend using Azure AD B2C. Expertise in Azure services such as Key Vault, Cost Management, Budgets, Application Insights, Azure Monitor, and VNet; fundamental design principles for scalable applications; database schema creation; unit testing; debugging; Git; Postman; Swagger/OpenAPI; Gen-AI, LangChain, vectorization, and LLMs; NoSQL databases like MongoDB; REST APIs; microservices; and Azure DevOps for CI/CD will be beneficial.

If you are passionate about backend development, enjoy solving problems collaboratively, and are committed to delivering high-quality software, we invite you to apply for this exciting opportunity. (Note: This job description is sourced from hirist.tech.)
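
Since the posting combines Azure Functions in Python with Key Vault, here is a minimal, hedged sketch of an HTTP-triggered function (Functions v2 Python programming model) that reads a secret using the azure-identity and azure-keyvault-secrets packages. The vault URL, route, and secret name are hypothetical placeholders, not details from the posting:

import os
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

app = func.FunctionApp()

# Hypothetical vault URL, typically supplied via app settings.
VAULT_URL = os.environ.get("KEY_VAULT_URL", "https://example-vault.vault.azure.net")

@app.route(route="config", auth_level=func.AuthLevel.FUNCTION)
def read_config(req: func.HttpRequest) -> func.HttpResponse:
    # In Azure, DefaultAzureCredential picks up the function app's
    # managed identity; locally it falls back to developer credentials.
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=VAULT_URL, credential=credential)
    secret = client.get_secret("app-config")  # hypothetical secret name
    # Never echo real secret values back to callers; illustration only.
    return func.HttpResponse(f"Fetched secret '{secret.name}' successfully.")

Using a managed identity with DefaultAzureCredential keeps credentials out of code and configuration, which matches the posting's emphasis on secure code and data protection.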

Posted 3 weeks ago

Apply

7.0 - 12.0 years

3 - 7 Lacs

Gurugram

Work from Office

Responsibilities: Detailing and reviewing business requirements and creating epics, features, and user stories accordingly. Analyzing individual customer requirements and designing solutions for MS Dynamics 365 Customer Service and Field Service. Advising on best practices for CRM, development, and integration processes. Orchestrating and managing the deployment of releases and hotfixes to staging and production environments. Contributing to international rollouts of the Dynamics CRM solution to the subsidiaries. Taking ownership of the realization of new features or IT demands in collaboration with the development team. Providing global second-level support for Dynamics CRM users and ensuring ongoing operation of the Dynamics CRM landscape.

Technical skills: Very good Power Platform knowledge (model-driven and canvas apps, Power Automate, Power Pages, Power BI). Good knowledge and experience of building PCF controls. Knowledge of integrating Power Pages with Azure B2C would be a plus. Very good knowledge of development on the Azure platform and its security aspects (Azure resource groups, Azure Functions, Key Vaults, Storage, service principals, etc.). Quality assurance of the technical and functional design. Expert technical knowledge of Dynamics customization and configuration, JavaScript, C#, .NET, and HTML, including plugins, scripting, and form creation.

Profile: Degree in computer science or comparable. Several years (minimum 7) of professional experience with MS Dynamics 365 CRM Customer Engagement. Several years of experience in consulting and implementation of solutions based on MS Dynamics 365 Customer Service and Field Service. Very good knowledge of creating functional and technical concepts. Strong understanding of customer service and field service processes and the out-of-the-box features available in MS Dynamics 365 CRM. Project experience in service processes and their digitization. Knowledge and experience with integrating Dynamics with SAP ERP would be a plus. Good knowledge of configuring DevOps CI/CD pipelines. Very good analytical and communication skills. Very good written and spoken English. Team spirit and an efficient, solution-oriented way of working. Good understanding of Agile methodologies.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 18 Lacs

Bengaluru

Work from Office

Mandatory skills: Strong understanding of Azure PaaS concepts (Azure Functions, Key Vault, Service Bus, etc.). Experience with React JS. Proficiency in .NET development with .NET Core, .NET Framework 4.5 or later, ASP.NET MVC 5, ASP.NET Core, Angular 2+, C#, WCF, Web API, LINQ, Entity Framework, and SQL Server 2008 or above. OOPS and software design patterns. Azure concepts (mandatory) with programming knowledge of Azure PowerShell and Azure DevOps: Azure Stack, App Services, Azure Search, Azure SQL, Logic Apps, Scheduled Job Collection, Redis Cache, AAD & Graph API, Key Vault, Azure Data Factory, Application Insights, and Azure Service Bus.

Posted 1 month ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions using Azure technologies. Manage and orchestrate workflows with Azure Data Factory (ADF), Databricks, ADLS Gen2, and Azure Key Vault. Design, maintain, and optimize secure and efficient data lake architectures. Collaborate with stakeholders to gather requirements and translate them into detailed technical specifications. Implement CI/CD pipelines to enable automated, seamless data deployment leveraging Azure DevOps. Monitor and troubleshoot data quality, performance bottlenecks, and scalability issues in production pipelines. Write clean, modular, and reusable PySpark code adhering to Agile development methodologies. Maintain thorough documentation of data pipelines, architecture designs, and best practices for team reuse. Must-Have Skills: 6+ years of experience in Data Engineering roles. Strong expertise with SQL, Python, PySpark, Apache Spark. Hands-on experience with Azure Databricks, Azure Data Factory (ADF), ADLS Gen2, Azure DevOps, and Azure Key Vault. Deep knowledge of data warehousing concepts, ETL development, data modeling, and governance. Familiarity with Agile software development lifecycle (SDLC) and containerization tools like Docker. Commitment to clean coding practices and maintaining high-quality codebases. Good to Have Skills: Experience with Azure Event Hubs and Logic Apps. Exposure to Power BI for data visualization. Strong problem-solving skills with a background in logic building and competitive programming.
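
To make the Key Vault piece of this stack concrete: in Databricks, secrets from an Azure Key Vault-backed secret scope are typically read with dbutils.secrets.get and used, for example, to authenticate to ADLS Gen2. A minimal PySpark sketch follows; the scope, key, storage account, and container names are all hypothetical placeholders, and spark and dbutils are the globals Databricks provides inside a notebook:

# Minimal sketch: read a secret from a Key Vault-backed Databricks secret
# scope and use it to access ADLS Gen2. Scope, key, storage account, and
# container names are hypothetical, not taken from the posting.

storage_account = "examplelake"  # hypothetical ADLS Gen2 account name
account_key = dbutils.secrets.get(scope="kv-scope", key="adls-account-key")

# Configure Spark to authenticate to the storage account with the key.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    account_key,
)

# Read raw data from a hypothetical container/path in the data lake.
df = (
    spark.read.format("parquet")
    .load(f"abfss://raw@{storage_account}.dfs.core.windows.net/events/")
)
df.show(5)

Key Vault-backed secret scopes keep credentials out of notebook code, and Databricks redacts secret values in notebook output; in production the account key is often replaced by a service principal or managed identity.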

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Senior Data Engineer (Remote, Contract, 6 Months) - Databricks, ADF, and PySpark. We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse. Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance. Agile, SDLC, Containerization (Docker), Clean coding practices. Good-to-Have Skills: Event Hubs, Logic Apps. Power BI. Strong logic building and competitive programming background. Location: Remote; Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, Contract, 6 Months). Remote | Contract Duration: 6 Months | Experience: 6-8 Years. We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse. Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance. Agile, SDLC, Containerization (Docker), Clean coding practices. Good-to-Have Skills: Event Hubs, Logic Apps. Power BI. Strong logic building and competitive programming background. Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune; Remote

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, Contract, 6 Months). Remote | Contract Duration: 6 Months | Experience: 6-8 Years. We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse. Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance. Agile, SDLC, Containerization (Docker), Clean coding practices. Good-to-Have Skills: Event Hubs, Logic Apps. Power BI. Strong logic building and competitive programming background. Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune; Remote

Posted 2 months ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, Contract, 6 Months). Remote | Contract Duration: 6 Months | Experience: 6-8 Years. We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse. Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance. Agile, SDLC, Containerization (Docker), Clean coding practices. Good-to-Have Skills: Event Hubs, Logic Apps. Power BI. Strong logic building and competitive programming background. Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune; Remote

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.