
83 Blob Storage Jobs

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 9.0 years

4 - 7 Lacs

Gurgaon, Haryana, India

On-site

Requirement Analysis: Collaborate with business stakeholders to gather requirements and define solution objectives aligned with business goals.
Development and Implementation: Lead AI solution implementation, including data integration, model development, and deployment. Engineer and improve AI systems and tools, and build prototypes, POCs, and MVPs to deliver solutions in a structured manner.
Security and Compliance: Ensure AI solutions adhere to security best practices and data privacy regulations.
Documentation and Training: Create technical documentation and best-practice guidelines, and provide training on AI technologies.
Monitoring and Maintenance: Implement monitoring to track AI solution performance and proactively address issues.
Architecture: Design scalable, secure, and compliant AI solutions in collaboration with AI architects, integrating smoothly with existing IT infrastructure.
Development and Implementation: Develop, test, and implement solutions following SDLC best practices, including version control, code review, and CI/CD processes.
Quality Assurance and Testing: Execute comprehensive testing strategies and resolve defects.
Integration and System Connectivity: Manage AI solution integration with internal and external systems, ensuring data consistency and interoperability.
Documentation and Change Management: Produce detailed documentation for AI processes and handle changes with minimal disruption. Contribute to best-practice development.
What you need:
Required Qualifications: Minimum 10 years in AI or data science roles, with a focus on building AI solutions on Azure. Proficient in Python or C#, with strong Azure cloud expertise. Skilled in Azure AI services (Copilot Studio, Cognitive Services, Machine Learning, OpenAI, Search, Blob Storage, Web App, Functions). Experience with LLMs, Generative AI, machine learning, NLP, and computer vision models.
Preferred Qualifications: Knowledge of DevOps practices and CI/CD pipelines. Azure certifications such as Azure AI Engineer Associate are a plus. Bachelor's degree in Computer Science, Computer Engineering, Electrical Engineering, Mathematics, or a related field (Master's degree in Science preferred).

Posted 10 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be joining our dynamic team as an Azure Data Engineer - L3 with 5-7 years of experience, based in either Hyderabad or Bangalore, working shift timings of 2 PM-11 PM IST. Your responsibilities will include:
- Utilizing your expertise in Azure Data Factory, Databricks, Azure Data Lake, and Azure SQL Server.
- Developing ETL/ELT processes using SSIS and/or Azure Data Factory.
- Building complex pipelines and dataflows with Azure Data Factory.
- Designing and implementing data pipelines using Azure Data Factory (ADF).
- Enhancing the functionality and performance of existing data pipelines.
- Fine-tuning processes that deal with very large data sets.
- Configuring and deploying ADF packages.
- Proficiency in the usage of ARM templates, Key Vault, and integration runtimes.
- Adaptability to ETL frameworks and standards.
- Demonstrating strong analytical and troubleshooting skills to identify root causes and find solutions.
- Proposing innovative and feasible solutions for business requirements.
- Knowledge of Azure technologies/services such as Blob Storage, ADLS, Logic Apps, Azure SQL, and WebJobs.
- Expertise in ServiceNow, incident management, and JIRA.
- Exposure to Agile methodology.
- Proficiency in understanding and building Power BI reports using the latest methodologies.
Your key skills should include:
- Azure
- Azure Data Factory
- Databricks
- Migration project experience
Qualifications:
- Engineering graduate
Certifications:
- Preferable: Azure certification, Databricks
Join us and be a part of our exciting journey as we continue to provide end-to-end solutions across industry verticals, with a global presence and a track record of successful project deliveries for Fortune 500 companies.
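Roles like this lean heavily on incremental (delta) loads in ADF. The standard pattern is a high-watermark copy: read only the rows modified since the last successful run, then advance the watermark. Below is a minimal plain-Python sketch of that logic, with an in-memory source table and watermark in place of the ADF control table and Copy activity (table contents and column names are illustrative assumptions):

```python
from datetime import datetime

# Hypothetical source table; in ADF the watermark would live in a control
# table and the copy would be a Copy Data activity with a dynamic query.
SOURCE = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 2, 1)},
    {"id": 3, "modified": datetime(2024, 3, 1)},
]

def incremental_load(watermark):
    """Return rows changed since the watermark, plus the new watermark."""
    changed = [r for r in SOURCE if r["modified"] > watermark]
    # Advance the watermark only as far as the data actually seen.
    new_watermark = max((r["modified"] for r in changed), default=watermark)
    return changed, new_watermark

rows, wm = incremental_load(datetime(2024, 1, 15))
print(len(rows), wm)  # picks up rows 2 and 3; watermark advances to 2024-03-01
```

Re-running with the returned watermark yields no rows, which is what makes the pipeline safely re-runnable.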

Posted 19 hours ago

Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

You are a highly experienced and hands-on Technical Architect with expertise in the Microsoft technology stack, particularly .NET MVC and Azure Cloud Services. In this role, you will be responsible for leading the design and development of scalable enterprise applications. Your strong architectural mindset, exceptional problem-solving skills, and ability to guide development teams will be crucial in delivering high-quality solutions. Your key responsibilities will include leading the architecture, design, and implementation of enterprise-grade applications using .NET MVC and Azure, defining and enforcing coding standards and best practices, collaborating with stakeholders to translate business requirements into technical solutions, evaluating and recommending tools and technologies to support application development, providing technical leadership and mentorship to development teams, conducting code reviews to ensure adherence to standards, designing and implementing CI/CD pipelines and DevOps practices in Azure, and troubleshooting and resolving technical issues across the application lifecycle. To excel in this role, you should have 10-12 years of experience in software development and architecture, strong expertise in .NET Framework, .NET Core, MVC, C#, and Entity Framework, proven experience with Azure services such as App Services, Azure Functions, Azure DevOps, Azure SQL, and Blob Storage, a solid understanding of microservices architecture, RESTful APIs, and cloud-native design principles, experience with containerization (Docker, Kubernetes), excellent communication and stakeholder management skills, the ability to work independently and lead cross-functional teams, and a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 
Preferred qualifications for this role include Microsoft certification as an Azure Solutions Architect Expert (or similar), experience in Agile/Scrum environments, and exposure to frontend technologies such as Angular or React.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer, you will be responsible for designing, developing, and delivering ADF pipelines for the Accounting & Reporting stream. Your role will involve creating and maintaining scalable data pipelines using PySpark and ETL workflows in Azure Databricks and Azure Data Factory. You will also work on data modeling and architecture to optimize data structures for analytics and business requirements. Your responsibilities will include monitoring, tuning, and troubleshooting pipeline performance for efficiency and reliability. Collaboration with business analysts and stakeholders is key to understanding data needs and delivering actionable insights. Implementing data governance practices to ensure data quality, security, and compliance with regulations is essential. You will also be required to develop and maintain documentation for data pipelines and architecture. Experience in testing and test automation is necessary for this role, as is collaboration with cross-functional teams to understand data requirements and provide technical advice. A strong background in data engineering is required, with proficiency in SQL, Azure Databricks, Blob Storage, Azure Data Factory, and programming languages such as Python or Scala, plus knowledge of Logic Apps and Key Vault. Strong problem-solving skills and the ability to communicate complex technical concepts to non-technical stakeholders are essential for effective communication within the team.
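The data-governance duty above usually starts with a data-quality gate: count nulls in required columns and flag duplicate keys before data lands downstream. Here is a toy plain-Python sketch of such a check (in Databricks the same logic would typically run as PySpark DataFrame operations; the records and column names are illustrative assumptions):

```python
# Sample batch with one null amount and one duplicate key (assumed shape).
records = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "A2", "amount": None},
    {"order_id": "A1", "amount": 120.0},  # duplicate key
]

def quality_report(rows, key, required):
    """Report duplicate key values and null counts for required columns."""
    seen, dup_keys = set(), set()
    null_counts = {c: 0 for c in required}
    for row in rows:
        k = row[key]
        if k in seen:
            dup_keys.add(k)   # key already seen: flag as duplicate
        else:
            seen.add(k)
        for col in required:
            if row.get(col) is None:
                null_counts[col] += 1
    return {"duplicates": sorted(dup_keys), "nulls": null_counts}

report = quality_report(records, key="order_id", required=["amount"])
print(report)  # {'duplicates': ['A1'], 'nulls': {'amount': 1}}
```

A pipeline can fail fast, quarantine rows, or just log metrics based on this report, depending on the governance policy.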

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As DevOps Operational Support on this project, you will have the opportunity to contribute to the data management architecture of industry-leading software. You will work closely with cross-functional teams and regional experts to design, implement, and support solutions with a focus on data security and global availability, facilitating data-driven decisions for our customers. This is your chance to work on a stable, long-term project with a global client, focusing on digital transformation and change management. You will work in squads under our customer's direction, using Agile methodologies and Scrum, and contribute to an innovative application that guides and documents the sales order process, aids in market analysis, and ensures competitive pricing. Be part of a team that integrates digital and human approvals, ensuring seamless integration with a broader ecosystem of applications. Collaborate with reputed global clients delivering top-notch solutions, and join high-caliber project teams with front-end, back-end, and database developers that offer ample opportunities to learn, grow, and advance your career. If you possess strong technical skills, effective communication abilities, and a commitment to security, we want you on our team! Ready to make an impact? Apply now and be part of our journey to success!
Responsibilities:
- Operational challenges: Work with global teams to find creative solutions for customers across our software catalog.
- Customer deployments: Plan, provision, and configure enterprise-level solutions for customers on a global scale.
- Monitoring and troubleshooting: Monitor customer environments to proactively identify and resolve issues, and provide support for incidents.
- Automation: Leverage and maintain automation pipelines to handle all stages of the software lifecycle.
- Documentation: Write and maintain documentation for processes, configurations, and procedures.
- SRE & MTTR goals: Lead the team in troubleshooting environment failures within SRE MTTR goals.
- Collaboration: Work closely with stakeholders to define project requirements and deliverables and understand their needs and challenges.
- Best practices: Ensure the highest standards in coding and security, with a strong emphasis on protecting systems and data.
- Strategy and planning: Take an active role in defect triage, strategy, and architecture planning.
- Performance: Ensure database performance and resolve development problems.
- Quality delivery: Translate requirements into high-quality solutions, adhering to Agile methodologies.
- Review and validation: Conduct detailed design reviews to ensure alignment with approved architecture.
- Cross-phase collaboration: Work with application development teams throughout development, deployment, and support.
Mandatory Skills:
Technical Skills:
- Database technologies: RDBMS (Postgres preferred), NoSQL (Cassandra preferred)
- Software languages: Java, Python, NodeJS, Angular
- Cloud platforms: AWS
- Cloud managed services: messaging, serverless computing, blob storage
- Provisioning (Terraform, Helm)
- Containerization (Docker, Kubernetes preferred)
- Version control: Git
Qualifications and Soft Skills:
- Bachelor's degree in Computer Science, Software Engineering, or a related field
- Customer-driven and result-oriented focus
- Excellent problem-solving and troubleshooting skills
- Ability to work independently and as part of a team
- Strong communication and collaboration skills
- Strong desire to stay up to date with the latest trends and technologies in the field
Nice-to-Have Skills:
- Cloud technologies: RDS, Azure
- Knowledge of the E&P domain (Geology, Geophysics, Well, Seismic, or Production data types)
- GIS experience is desirable
Languages: English: C2 Proficient

Posted 2 days ago

Apply

9.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: .NET Architect (.NET Core)
Experience: 9+ years (minimum 5 years in .NET Core development)
Location: Coimbatore / Chennai
Mandatory Skills: .NET Core, .NET MVC, Web API, LINQ, SQL Server, project architecture & documentation (HLD/LLD), Azure (App Services, AKS), CI/CD pipelines.
Required Skills:
- .NET Core, ASP.NET MVC / ASPX, C#/VB.NET, Web API
- LINQ, Entity Framework / ADO.NET
- Strong in object-oriented programming (OOP) and architectural design patterns (e.g., SOLID, Repository, DI)
- Deep expertise in SQL Server and database design
- Hands-on experience with Azure services: Azure App Services, AKS, Azure SQL, Blob Storage, Azure AD, Key Vault
- CI/CD automation using Azure DevOps, GitHub Actions, or TFS
- Strong documentation skills: HLD/LLD creation and architectural artifacts
- Front-end integration: HTML5, CSS3 (basic familiarity)
Good to Have / Preferred Skills:
- Experience collaborating directly with client technical teams
- Familiarity with third-party tools: Telerik, DevExpress
- Exposure to Agile management tools: JIRA, TFS
- Working knowledge of cloud-native architecture, containerization (Docker), Helm charts, YAML
- Knowledge of testing practices, static code analysis tools, and performance monitoring
Soft Skills & Attributes:
- Strong analytical, problem-solving, and communication skills
- Excellent email and professional communication etiquette
- Flexible and quick to learn new tools and frameworks
- Strong ownership mindset and a passion for delivering quality solutions
- Able to work independently or as part of a team in a dynamic environment

Posted 3 days ago

Apply

4.0 - 7.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Must Have:
- 4-6 years total experience
- 3+ years of hands-on experience with Python server-side scripting and API development
- 3+ years of experience with Python frameworks such as Flask, Django, or FastAPI
- 2+ years of experience with Azure cloud development, with good hands-on experience with widely used Azure services
- 2+ years of Angular/React web-UI development experience
- 2+ years of hands-on experience with JavaScript, HTML, and CSS
Good to Have:
- Knowledge of Generative AI
Experience: 4-6 years
Work location: Hyderabad
Shift timings: 11 AM to 8 PM
Roles and Responsibilities:
- Develop scalable and robust backend APIs aligned with business requirements.
- Develop and maintain high-quality, responsive user interfaces for web applications.
- Conduct code reviews, provide feedback, and mentor junior developers as needed.
- Be a self-driven individual who always looks for future opportunities.
- Ensure delivery of work with minimal errors, within the agreed time frame.
- Ensure stakeholder satisfaction and demonstrate good communication skills.
- Generate innovative ideas; operate effectively in a fast-changing environment and embrace change.
- Take a broad approach to problem solving and decision making by analyzing, thinking ahead, and planning.
- Act with a sense of urgency, practicality, and integrity, and demonstrate the ability to make decisions.
- Contribute to and participate in decision making and create a participative decision-making environment.
- Seek understanding of impacts and monitor and review these on an ongoing basis.
- Be accountable for ensuring that results are achieved, demonstrating a strong orientation toward achievement.
- Provide input to decision-making processes.
- Demonstrate an understanding and appreciation of the cultural diversity present in the global business environment.
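The backend-API work this role describes boils down to mapping request paths to handlers that return JSON. Frameworks like Flask and FastAPI do this with decorators; the sketch below shows the same registration-and-dispatch idea using only the standard library (routes and payloads are illustrative assumptions, not part of the posting):

```python
import json

ROUTES = {}

def route(path):
    """Decorator that registers a handler for a path, Flask-style."""
    def register(fn):
        ROUTES[path] = fn
        return fn
    return register

@route("/health")
def health():
    return {"status": "ok"}

@route("/users/42")
def get_user():
    return {"id": 42, "name": "demo"}

def dispatch(path):
    """Look up the handler and return (status_code, json_body)."""
    handler = ROUTES.get(path)
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(handler())

print(dispatch("/health"))   # (200, '{"status": "ok"}')
print(dispatch("/missing"))  # (404, '{"error": "not found"}')
```

In Flask the `route` decorator and dispatch loop are provided for you; the value of seeing them spelled out is understanding what the framework is doing on each request.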

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Microsoft Azure PaaS
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to best practices and coding standards.
Professional & Technical Skills:
- Must-have skills: proficiency in Microsoft Azure PaaS.
- Azure Logic Apps: designing and deploying workflows
- Azure services: Service Bus, Event Grid, Blob Storage, SQL DB, Azure Functions
- API Management (APIM): creating and managing APIs and custom connectors
- Monitoring & diagnostics: Azure Monitor, Log Analytics, Application Insights
- Automation: PowerShell, Azure CLI
- CI/CD: Azure DevOps pipelines for Logic App deployments
- Tools: Visual Studio, VS Code, Postman, SQL Server Management Studio
- Strong debugging, error handling, and workflow optimization skills
Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure PaaS.
- This position is based at our Chennai office.
- A 15-year full-time education is required.
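Service Bus and Event Grid, both listed above, deliver messages at-least-once, so handlers behind Logic Apps or Functions must tolerate redelivery. A common defense is an idempotency check on the message ID before doing any side effect. The sketch below models this in plain Python; the message shape is a simplified assumption, and in production the seen-ID set would be a durable store (e.g. SQL DB or Cosmos DB), not an in-memory set:

```python
processed_ids = set()  # stand-in for a durable idempotency store
ledger = []            # stand-in for the real side effect (DB write, API call)

def handle_message(msg):
    """Process a message at most once per message_id, even if redelivered."""
    if msg["message_id"] in processed_ids:
        return "skipped"              # duplicate delivery: no side effects
    ledger.append(msg["body"])        # perform the side effect
    processed_ids.add(msg["message_id"])
    return "processed"

print(handle_message({"message_id": "m1", "body": "create-order"}))  # processed
print(handle_message({"message_id": "m1", "body": "create-order"}))  # skipped
```

The ledger ends up with exactly one "create-order" entry despite two deliveries, which is the behavior the broker alone cannot guarantee.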

Posted 4 days ago

Apply

4.0 - 6.0 years

8 - 13 Lacs

Pune, Ahmedabad, Surat

Work from Office

Role & Responsibilities:
- Design, develop, and maintain REST APIs using .NET Core, OpenAPI, and Node.js
- Apply SOLID principles, design patterns, and best practices across application design
- Participate in architectural discussions involving SOA, microservices, event-driven, and serverless patterns
- Integrate secure authentication/authorization via Azure B2C, IdentityServer, Keycloak, or equivalent
- Implement message-driven systems using RabbitMQ, Azure Service Bus, or Kafka
- Implement robust micro-frontend solutions using React.js, AngularJS, Next.js, TypeScript, Webpack, and Storybook
- Develop secure backend services, including Windows Services, JWT-based authentication, and entity/data access layers using Entity Framework and LINQ
- Architect and maintain real-time features using SignalR, Socket.io, and Twilio
- Work with MongoDB, MS SQL, and Redis for scalable data storage solutions
- Translate UI/UX designs using HTML, CSS, Bootstrap, Tailwind, MUI, and Ant Design
- Manage codebases using Git, Azure Git, and GitHub integrated with Azure DevOps/Jenkins CI/CD pipelines
- Leverage Azure services such as Functions, Blob Storage, API Management, Web Apps, and Azure AI Services
Preferred candidate profile:
- Strong analytical and reasoning abilities
- Effective communication with technical and non-technical stakeholders
- Ownership and accountability for deliverables
- Good documentation and debugging discipline
- Team-first, mentor mindset
Nice to Have:
- Experience in distributed systems or enterprise-scale software
- Exposure to AI/ML APIs in Azure or OpenAI
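The JWT-based authentication mentioned above rests on one mechanism: an HMAC signature over the base64url-encoded header and payload. The sketch below builds HS256 signing and verification from scratch, purely to illustrate the mechanism; a real service should use a vetted library (e.g. PyJWT) and validate claims such as `exp`, and the secret here is an illustrative assumption:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustrative only; keep real secrets in Key Vault

def b64url(data: bytes) -> bytes:
    """Base64url-encode without padding, per the JWT convention."""
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign(payload: dict) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    msg = header + b"." + body
    sig = b64url(hmac.new(SECRET, msg, hashlib.sha256).digest())
    return (msg + b"." + sig).decode()

def verify(token: str) -> bool:
    msg, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(SECRET, msg.encode(), hashlib.sha256).digest())
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected.decode(), sig)

token = sign({"sub": "user-42"})
print(verify(token))        # True
print(verify(token + "x"))  # False: tampering breaks the signature
```

Verification recomputes the signature over exactly the bytes that were signed, which is why any change to header, payload, or signature invalidates the token.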

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the Market Research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. Moreover, you will be actively involved in project management and ensuring timely delivery of projects. To excel in this role, you should have a minimum of 5 years of experience in software development, out of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential for this role. Your technical skills should encompass a wide range of areas including Cloud & Infrastructure (Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN), Development Stack (C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration), Data & Integration (SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration), CI/CD & IaC (Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing), Security & Compliance (TLS/SSL certificate management, API gateway policies, encryption standards), and Monitoring & Performance (Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load testing tools). Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, PMP or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience in building low-code/no-code integration platforms or automation engines is also beneficial. 
Exposure to alternative clouds such as AWS or on-prem virtualization platforms such as VMware and OpenShift will be a plus. Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

Kolkata, West Bengal

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself and a better working world for all. We're seeking a skilled Custom Copilot engineer to join our NCLC Consulting team. You'll be instrumental in designing, developing, and deploying robust, scalable applications leveraging the power of Microsoft Azure and generative AI. This is a great opportunity to be part of a leading firm and play a key role in the growth of our service offerings.
Your Key Responsibilities:
- Produce high-quality solution or infrastructure deliverables in accordance with project timelines and specifications, using sound coding and programming skills.
- Perform coding, debugging, testing, and troubleshooting throughout the development process, contributing to moderately complex aspects of a project.
- Maintain and enhance systems by fixing complicated errors, raising risks, and escalating issues where necessary.
- Work with users to capture detailed requirements, translating designs and solution architecture into design specifications.
- Monitor and report on potential risks/opportunities of emerging technologies and seek areas for continuous improvement.
- Ensure all activities adhere to the relevant processes, procedures, standards, and technical design.
- Develop and promote best practices for usage, operations, and development.
- Strong analytical and communication skills with an intense drive to learn and adapt.
Skills and attributes:
- Design and customize robust AI solutions leveraging Azure AI Services, including Azure AI Document Intelligence, Azure AI Vision, Azure AI Language, Azure AI Translator, and Azure OpenAI Service.
- Build custom AI agents through Copilot Studio and/or Azure AI Foundry via a plugin architecture, API function calling, Logic Apps, storage of prompts and conversation history (Azure Cosmos DB), Azure Key Vault, Container Apps, and Blob Storage.
- Implement intelligent solutions using SDKs such as Azure AI Foundry, AutoGen, LangChain, and Semantic Kernel.
- Design multi-step prompt workflows, leverage RAG patterns to integrate LLMs with enterprise data, and design intelligent agents that are task-based and/or role-based, following responsible AI principles.
- Design and implement solutions on Microsoft Azure, including Azure Functions, Azure App Service, Logic Apps, Azure SQL Database, Azure Cosmos DB, and Azure AI Foundry.
- Fine-tune and optimize AI models, selecting the most appropriate architectures for performance and efficiency.
- Knowledge of cloud computing and Azure infrastructure.
- Craft precise and effective prompts to guide AI models and improve their outputs.
- Monitor and analyze Azure AI costs to ensure efficient resource utilization.
- Develop and integrate custom tools within AI agents to extend their capabilities.
- Demonstrate strong expertise in Azure AI Foundry and prompt engineering for building declarative and multi-modal agents.
- Integrate AI-powered features into full-stack web applications to improve user experience and efficiency.
- Collaborate with data scientists and machine learning engineers to deploy AI models into production.
- Stay up to date with the latest advancements in .NET Core, Azure, and AI technologies.
- Adhere to best practices for software development, including code reviews, unit testing, continuous integration/continuous delivery (CI/CD), and MLOps.
Required Skills and Experience:
- Overall 4+ years of experience in Azure, .NET Core, Python, and the M365 suite of products.
- Minimum 2 years of experience in developing Generative AI applications using Azure OpenAI.
- Proficiency in languages such as C#, Python, JavaScript, and TypeScript.
- Strong proficiency in Microsoft Azure, including Azure Functions, Azure App Service, Azure SQL Database, Azure Cosmos DB, and Azure AI Studio.
- Hands-on experience with generative AI technologies, such as OpenAI and Large Language Models (LLMs).
- Proficiency in prompt engineering, fine-tuning AI models, and developing RAG applications.
- Experience with front-end technologies such as HTML, CSS, and JavaScript (React or Angular) for full-stack web application development.
- Understanding of CI/CD pipelines and DevOps practices.
- Strong problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.
Desired Skills:
- Knowledge of low-code technologies such as Logic Apps and Microsoft Power Platform (Power Apps, Power Automate) is good to have.
- Understanding of cloud-native development principles and microservices architecture.
- Experience with containerization technologies like Docker and Kubernetes.
- Understanding of security best practices for cloud applications.
To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 4+ years of experience, preferably with a background in a professional services firm.
- Strong knowledge of the M365 suite of products.
- Excellent communication skills; consulting experience preferred.
Ideally, you'll also have:
- The analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.
What Working At EY Offers: At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development.
We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
EY | Building a Better Working World: EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
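The RAG pattern this listing calls for has one core retrieval step: score stored documents against the user question, keep the best matches, and splice them into the prompt. The toy sketch below uses bag-of-words cosine similarity in place of real embeddings; a production system would embed via a model (e.g. through Azure OpenAI) and query a vector index such as Azure AI Search, and the documents and prompt template here are illustrative assumptions:

```python
import math
from collections import Counter

# Tiny stand-in corpus (assumed content, not from any real index).
DOCS = [
    "Blob Storage stores unstructured data as objects in containers.",
    "Azure Functions run event-driven code without managing servers.",
    "Cosmos DB is a globally distributed multi-model database.",
]

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 1):
    """Return the k documents most similar to the question."""
    q = Counter(question.lower().split())
    return sorted(
        DOCS,
        key=lambda d: cosine(q, Counter(d.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(question: str) -> str:
    """Ground the LLM call by pasting retrieved context above the question."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("what is blob storage"))
```

Swapping `cosine` over word counts for embedding-vector similarity turns this sketch into the standard retrieve-then-generate loop without changing its shape.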

Posted 1 week ago

Apply

7.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Our Client in India is one of the leading providers of risk, financial services and business advisory, internal audit, corporate governance, and tax and regulatory services. Our Client was established in India in September 1993, and has rapidly built a significant competitive presence in the country. The firm operates from its offices in Mumbai, Pune, Delhi, Kolkata, Chennai, Bangalore, Hyderabad, Kochi, Chandigarh and Ahmedabad, and offers its clients a full range of services, including financial and business advisory, tax and regulatory. Our client has a client base of over 2700 companies. Their global approach to service delivery helps provide value-added services to clients. The firm serves leading information technology companies and has a strong presence in the financial services sector in India while serving a number of market leaders in other industry segments.
Job Requirements
Mandatory Skills:
- Bachelor's or higher degree in Computer Science or a related discipline, or equivalent (minimum 7+ years of work experience).
- At least 6+ years of consulting or client service delivery experience in Azure Microsoft data engineering.
- At least 4+ years of experience in developing data ingestion, data processing, and analytical pipelines for big data, relational databases such as SQL Server, and data warehouse solutions such as Synapse/Azure Databricks and Microsoft Fabric.
- Hands-on experience implementing data ingestion, ETL, and data processing using Azure services: Fabric, OneLake, ADLS, Azure Data Factory, Azure Functions, services in Microsoft Fabric, etc.
- Minimum of 5+ years of hands-on experience in Azure and big data technologies such as Fabric, Databricks, Python, SQL, ADLS/Blob, PySpark/Spark SQL.
- Minimum of 3+ years of RDBMS experience.
- Experience in using big data file formats and compression techniques.
- Experience working with developer tools such as Azure DevOps, Visual Studio Team Server, Git, etc.
Preferred Skills Technical Leadership & Demo Delivery: oProvide technical leadership to the data engineering team, guiding the design and implementation of data solutions. oDeliver compelling and clear demonstrations of data engineering solutions to stakeholders and clients, showcasing functionality and business value. oCommunicate fluently in English with clients, translating complex technical concepts into business-friendly language during presentations, meetings, and consultations. ETL Development & Deployment on Azure Cloud: oDesign, develop, and deploy robust ETL (Extract, Transform, Load) pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Notebooks, Azure Functions, and other Azure services. oEnsure scalable, efficient, and secure data integration workflows that meet business requirements. oPreferably to have following skills Azure doc intelligence, custom app, blob storage oDesign and develop data quality frameworks to validate, cleanse, and monitor data integrity. oPerform advanced data transformations, including Slowly Changing Dimensions (SCD Type 1 and Type 2), using Fabric Notebooks or Databricks. oPreferably to have following skills Azure doc intelligence, custom app, blob storage Microsoft Certifications: oHold relevant role-based Microsoft certifications, such as: DP-203: Data Engineering on Microsoft Azure AI-900: Microsoft Azure AI Fundamentals. oAdditional certifications in related areas (e.g., PL-300 for Power BI) are a plus. Azure Security & Access Management: oStrong knowledge of Azure Role-Based Access Control (RBAC) and Identity and Access Management (IAM). oImplement and manage access controls, ensuring data security and compliance with organizational and regulatory standards on Azure Cloud. Additional Responsibilities & Skills: oTeam Collaboration: Mentor junior engineers, fostering a culture of continuous learning and knowledge sharing within the team. 
- Project Management: Oversee data engineering projects, ensuring timely delivery within scope and budget, while coordinating with cross-functional teams.
- Data Governance: Implement data governance practices, including data lineage, cataloging, and compliance with standards like GDPR or CCPA.
- Performance Optimization: Optimize ETL pipelines and data workflows for performance, cost-efficiency, and scalability on Azure platforms.
- Cross-Platform Knowledge: Familiarity with integrating Azure services with other cloud platforms (e.g., AWS, GCP) or hybrid environments is an added advantage.
Soft Skills & Client Engagement:
- Exceptional problem-solving skills with a proactive approach to addressing technical challenges.
- Strong interpersonal skills to build trusted relationships with clients and stakeholders.
- Ability to manage multiple priorities in a fast-paced environment, ensuring high-quality deliverables.
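The SCD Type 2 responsibility above is normally implemented in Fabric Notebooks or Databricks; as an illustration of the underlying merge logic only, here is a minimal pure-Python sketch (the `scd2_merge` helper, the field names, and the date handling are hypothetical choices for the example, not part of the role):

```python
from datetime import date

def scd2_merge(dimension, incoming, key, today=None):
    """Apply an SCD Type 2 merge: expire changed rows, append new versions.

    `dimension` rows carry `valid_from`, `valid_to` (None = still current)
    and an `is_current` flag; `incoming` rows carry business attributes only.
    Note: superseded rows in `dimension` are mutated in place.
    """
    today = today or date.today().isoformat()
    current = {row[key]: row for row in dimension if row["is_current"]}
    out = list(dimension)
    for rec in incoming:
        old = current.get(rec[key])
        if old is not None and all(old.get(k) == v for k, v in rec.items()):
            continue  # unchanged record: keep the current version as-is
        if old is not None:
            old["valid_to"] = today    # expire the superseded version
            old["is_current"] = False
        out.append({**rec, "valid_from": today, "valid_to": None,
                    "is_current": True})
    return out
```

In PySpark the same logic is usually expressed as a `MERGE INTO` on a Delta table, with the expire/insert branches as `WHEN MATCHED` / `WHEN NOT MATCHED` clauses.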

Posted 1 week ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

Hyderabad, Telangana, India

On-site

EY GDS Data and Analytics (D&A) Azure Data Engineer - Staff As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance. The opportunity We're looking for candidates with strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your key responsibilities Develop and deploy big data pipelines in a cloud environment using Azure Cloud services ETL design, development and migration of existing on-prem ETL routines to cloud services Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model code for faster execution Skills and attributes for success Overall 1-3 years of IT experience, with 1+ years of relevant experience in Azure Data Factory (ADF) and good hands-on exposure to the latest ADF version Hands-on experience with Azure Functions and Azure Synapse (formerly SQL Data Warehouse) Project experience with Azure Data Lake / Blob (for storage) Basic understanding of Batch Account configuration and its various control options Sound knowledge of Databricks and Logic Apps Able to coordinate independently with business stakeholders, understand the business requirements, and implement them using ADF To qualify for the role, you must Be a computer science graduate or equivalent with 1-3 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communicator (written and verbal, formal and informal). Participate in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Ideally, you'll also have client management skills. What we look for People with technical experience and enthusiasm to learn new things in this fast-moving environment. What working at EY offers At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.
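The ETL design and migration responsibilities above all follow the same extract-transform-load shape, whichever Azure service ends up hosting them; a minimal, standard-library-only Python sketch of that shape (the function names, columns, and the reject-row policy are illustrative assumptions):

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse source rows (stand-in for an ADF copy activity)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cast types, drop invalid rows, normalise values."""
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to a reject path
        out.append({"customer": r["customer"].strip().title(),
                    "amount": round(amount, 2)})
    return out

def load(rows, sink):
    """Load: append to a target (stand-in for a Synapse or lake table)."""
    sink.extend(rows)
    return len(rows)

sink = []
raw = "customer,amount\n alice ,10.5\nbob,not-a-number\n"
loaded = load(transform(extract(raw)), sink)
```

In ADF the three stages map onto a copy activity, a data flow or notebook activity, and a sink dataset, chained inside one pipeline.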

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You are an experienced Senior dbt Engineer with a strong background in Snowflake and Azure cloud platforms. Your primary responsibility will be to lead the design and development of scalable, governed, and efficient data transformation pipelines using dbt. You will collaborate across functions to deliver business-ready data solutions. With at least 8 years of experience in data engineering, analytics engineering, or similar roles, you have proven expertise in dbt (Data Build Tool) and modern data transformation practices. Your advanced proficiency in SQL and deep understanding of dimensional modeling, medallion architecture, and ELT principles will be crucial for success in this role. You must have strong hands-on experience with Snowflake, including query optimization, and be proficient with Azure cloud services such as Azure Data Factory and Blob Storage. Your communication and collaboration skills should be exemplary, and you should also have familiarity with data governance, metadata management, and data quality frameworks. As a Senior dbt Engineer, your key responsibilities will include leading the design, development, and maintenance of dbt models and transformation layers. You will define and enforce data modeling standards, best practices, and development guidelines while driving the end-to-end ELT process to ensure reliability and data quality across all layers. Collaboration with data product owners, analysts, and stakeholders to translate complex business needs into clean, reusable data assets is essential. You will utilize best practices on Snowflake to build scalable and robust dbt models and integrate dbt workflows with orchestration tools like Azure Data Factory, Apache Airflow, or dbt Cloud for robust monitoring and alerting. Supporting CI/CD implementation for dbt deployments using tools like GitHub Actions, Azure DevOps, or similar will also be part of your responsibilities. 
If you are looking for a challenging opportunity to leverage your expertise in dbt, Snowflake, and Azure cloud platforms to drive digital transformation and deliver impactful data solutions, this role is perfect for you.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

maharashtra

On-site

We are M&G Global Services Private Limited (formerly known as 10FA India Private Limited, and prior to that Prudential Global Services Private Limited). We are a fully owned subsidiary of the M&G plc group of companies, operating as a Global Capability Centre providing a range of value-adding services to the Group since 2003. At M&G our purpose is to give everyone real confidence to put their money to work. As an international savings and investments business with roots stretching back more than 170 years, we offer a range of financial products and services through Asset Management, Life and Wealth. All three operating segments work together to deliver attractive financial outcomes for our clients, and superior shareholder returns. M&G Global Services has rapidly transformed itself into a powerhouse of capability that is playing an important role in M&G plc's ambition to be the best loved and most successful savings and investments company in the world. Our diversified service offerings, extending from Digital Services (Digital Engineering, AI, Advanced Analytics, RPA, and BI & Insights), Business Transformation, Management Consulting & Strategy, Finance, Actuarial, Quants, Research, Information Technology, Customer Service, Risk & Compliance and Audit, provide our people with exciting career growth opportunities. Through our behaviours of telling it like it is, owning it now, and moving it forward together with care and integrity, we are creating an exceptional place to work for exceptional talent. Job Title Lead Test Engineer Investment Data Platform Grade 2B Level Experienced colleague Job Function Asset Management Tech & Change Job Sub Function Investments Data Platform Reports to Lead/Sr. Manager Investments Data Platform (India) Location Mumbai Business Area M&G Global Services Overall Job Purpose Working with M&G Plc.
means becoming part of a brand with a global reputation, and our purpose is to help people manage and grow their savings and investments, responsibly. M&G plc is a firm built on a rich and long history, with a commitment to an innovative future centred on the needs of customers and clients. There is a genuine opportunity to drive competitive advantage with value creation through the formation of this new organisation. The Lead Test Engineer is expected to deliver platform testing for the investment data platform and carry out the development of automation test frameworks across one or more interrelated data platform teams, promoting the use of the best tools for a particular job while being proficient in multiple test scripting languages. You will work closely with leads across teams and collaborate to design test strategies that support complex deliveries. You will be required to manage communications between teams and resolve dependencies, whilst promoting knowledge sharing and adoption of good practice. Your key priorities will centre on developing solutions to support the Investment Data Platform to meet increased demands across the investment teams, clients and regulators. Accountabilities/Responsibilities Key accountabilities and responsibilities Be accountable for delivering and maintaining a significant portion of the Investment Data Platform test capabilities, and design/develop testing frameworks in support of the same. Apply judgement to deliver outcomes, evaluating solutions and considering the impact for customers, cost and risk. Manage conflicts that may impact delivery. Promote the use of the best tools for a particular job and keep up to date with the domain, business, programming languages (e.g. scripting languages, C#, Java) and testing tools & technology (e.g. Groovy, SpecFlow, Selenium, Cucumber, etc.). Explore different types of automated/technical testing techniques to drive the test approach.
Understand and build key testing design principles, and be able to create testing tools from scratch to aid with testing. Be able to build, package and deploy software through environments; manage configuration settings and check the success of a deployment, whilst being able to maintain the deployment framework. Creation, maintenance and subsequent running of functional and non-functional tests within the team, and analysing and presenting results to stakeholders. Perform root cause analysis against issues to provide as much information as possible in the report, and create automated tests where relevant for defects to ensure they do not reoccur. Drive the code/test review process across the whole discipline and participate in code/test reviews written by other engineers. Consistently identify improvement opportunities within the team and implement processes to address them. Influence testing process standardisation and stimulate knowledge sharing initiatives within the team. Key Stakeholder Management Internal All M&G Plc Business Areas M&G Plc Support Groups External Partner(s)/Vendor(s) Knowledge, Skills, Experience & Educational Qualification Knowledge & Skills (Key): Good knowledge of Object-Oriented programming and Relational & Non-Relational database management systems. Experience in building test automation frameworks using Groovy, SpecFlow, Selenium, Cucumber etc. Experience working in an Agile environment with TDD and BDD methodologies. Familiarity with unit testing frameworks (e.g. MS Unit Testing Framework, NUnit, Mocha) as well as BDD tools (e.g. Spock for Groovy). Experience in Microsoft Azure - App Services, Service Bus, Azure VM, Azure SQL, Blob Storage, Azure Data Factory, Azure Data Lake Store, Azure Function Apps. API testing, and an understanding of smoke testing and performance testing.
Good interpersonal skills, with the ability to communicate clearly and effectively, both written and orally, within a project team. Excellent attention to detail, and ability to prioritize and work efficiently to project deadlines. Knowledge & Skills (Desirable): Exposure to financial markets and asset management processes, and understanding of analysis across a wide variety of asset classes and associated analytics (e.g. Equity, Fixed Income, Private Assets etc.). Experience in Snowflake. Experience: 8+ years of total experience in Technology/Software Development/Data Engineering. 4+ years of experience in a Test Engineer role. Educational Qualification: Graduate in any discipline. M&G Behaviours relevant to all roles: Note: *We are in hybrid working with a minimum of three days' work from the office (subject to policy change).
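The unit-testing frameworks named above (MS Unit Testing Framework, NUnit, Mocha, Spock) all share the same arrange-act-assert pattern; purely as a neutral illustration, here it is in Python's built-in unittest (the `net_asset_value` function under test is invented for the example):

```python
import unittest

def net_asset_value(positions):
    """Toy function under test: sum of quantity * price per position."""
    return sum(p["qty"] * p["price"] for p in positions)

class NavTests(unittest.TestCase):
    def test_happy_path(self):
        # Arrange
        positions = [{"qty": 10, "price": 2.5}, {"qty": 4, "price": 1.0}]
        # Act
        nav = net_asset_value(positions)
        # Assert
        self.assertAlmostEqual(nav, 29.0)

    def test_empty_portfolio(self):
        self.assertEqual(net_asset_value([]), 0)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(NavTests))
```

The same two tests translate almost line for line into NUnit's `[Test]` methods or Spock's `expect:` blocks; only the assertion syntax changes.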

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

YASH Technologies is looking to hire .NET Core professionals with 6-8 years of experience in Hyderabad, Indore, and Pune. As a .NET Developer, you will be responsible for developing clean, scalable code using .NET programming languages. You should have expertise in the ASP.NET framework, C#, SQL Server, and design/architectural patterns. Additionally, experience in UI technologies like HTML5, CSS3, JavaScript, React.js, and Angular 10+ is required. You will be working with Azure development/deployment environments, containerized applications, and code versioning tools. Strong debugging skills, knowledge of OOP, and familiarity with REST and RPC APIs are essential. The role requires proficiency in Agile development methodologies and the ability to adhere to application architecture and design. Personal skills such as good communication, attention to detail, integrity, ownership, flexibility, teamwork, analytical thinking, and problem-solving are crucial for this role. You should be able to complete assigned tasks in a timely manner and work independently as an individual contributor. Technical and functional competencies include requirement gathering and analysis, application design, architecture tools and frameworks, estimation and resource planning, product/technology knowledge, test management, customer management, project management, and domain/industry knowledge. Additionally, basic knowledge of marketing and pre-sales activities is beneficial. Behavioral competencies required for this role include accountability, collaboration, agility, customer focus, communication, driving results, and conflict resolution. Certification in relevant areas is mandatory. At YASH, you will have the opportunity to create a career path tailored to your goals in an inclusive team environment. Our Hyperlearning workplace emphasizes flexible work arrangements, emotional positivity, self-determination, trust, transparency, open collaboration, and support for career development.
Join us to be a part of our stable employment with a great atmosphere and an ethical corporate culture.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

As a Developer contracted by Luxoft to support customer initiatives, your main task will involve developing solutions based on client requirements within the telecom/network work environment. You will be responsible for utilizing technologies such as Databricks on Azure, Apache Spark, Python, SQL, and Apache Airflow to create and manage Databricks clusters for ETL processes. Integration with ADLS and Blob Storage, and efficient data ingestion from various sources including on-premises databases, cloud storage, APIs, and streaming data will also be part of your role. Moreover, you will work on handling secrets using Azure Key Vault, interacting with APIs, and gaining hands-on experience with Kafka/Azure Event Hubs streaming. Your expertise in Databricks Delta APIs, the Unity Catalog, and version control tools like GitHub will be crucial. Additionally, you will be involved in data analytics, supporting ML frameworks, and integrating with Databricks for model training. Proficiency in Python, Apache Airflow, Microsoft Azure, Databricks, SQL, ADLS, Blob Storage, Kafka/Azure Event Hubs, and various other related skills is a must. The ideal candidate should hold a Bachelor's degree in Computer Science or a related field and possess at least 7 years of experience in development. Problem-solving skills, effective communication abilities, teamwork, and a commitment to continuous learning are essential traits for this role. Desirable skills include exposure to Snowflake, PostgreSQL, Redis, GenAI, and a good understanding of RBAC. Proficiency in English at C2 level is required for this senior-level position based in Bengaluru, India. This opportunity falls under the Big Data Development category within Cross Industry Solutions and is expected to be effective from 06/05/2025.

Posted 1 week ago

Apply

11.0 - 15.0 years

0 Lacs

karnataka

On-site

The Solution Architect + Azure AI Architect role requires an individual with over 11 years of experience to lead the architecture and design of intelligent, secure, and scalable enterprise applications leveraging Azure technologies. As a highly experienced professional, you will define end-to-end architecture using Azure App Services, Blob Storage, Azure Cognitive Search, and/or Azure OpenAI. Your responsibilities include designing highly scalable, modular, and secure document processing and management systems. You will create metadata models, define tagging strategies, and plan complex content relationships. It is essential to ensure seamless integration with Azure Active Directory (Azure AD), DevOps pipelines, and administrative workflows. Additionally, you will lead the design of AI-powered search and recommendation engines to provide intelligent and personalized user experiences. Driving best practices in cloud security, performance optimization, and data governance across the solution is a key aspect of this role. The ideal candidate must possess proven experience in architecting cloud solutions on Microsoft Azure. Expertise in Azure Cognitive Services, Azure OpenAI, and Cognitive Search is crucial. A solid understanding of RBAC security, identity integration, and cloud-based content lifecycle management is required. Moreover, experience in working with large-scale metadata-driven architectures will be advantageous. This is an exciting opportunity to work remotely on cutting-edge Azure technologies and contribute to the development of innovative solutions.

Posted 1 week ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Hyderabad, Pune

Work from Office

Solutions Architect / Technical Lead - AI & Automation Key Responsibilities Solution Architecture & Development: Design end-to-end solutions using Node.js (backend) and Vue.js (frontend) for custom portals and administration interfaces. Integrate Azure AI services, Google OCR, and Azure OCR into client workflows. AI/ML Engineering: Develop and optimize vision-based AI models (Layout Parsing/LP, Layout Inference/LI, Layout Transformation/LT) using Python. Implement NLP pipelines for document extraction, classification, and data enrichment. Cloud & Database Management: Architect and optimize MongoDB databases hosted on Azure for scalability, security, and performance. Manage cloud infrastructure (Azure) for AI workloads, including containerization and serverless deployments. Technical Leadership: Lead cross-functional teams (AI engineers, DevOps, BAs) in solution delivery. Troubleshoot complex technical issues in OCR accuracy, AI model drift, or system integration. Client Enablement: Advise clients on technical best practices for scaling AI solutions. Document architectures, conduct knowledge transfers, and mentor junior engineers. Required Technical Expertise Frontend/Portal: Vue.js (advanced components, state management), Node.js (Express, REST/GraphQL APIs). AI/ML Stack: Python (PyTorch/TensorFlow), Azure AI (Cognitive Services, Computer Vision), NLP techniques (NER, summarization). Layout Engineering: LP/LI/LT for complex documents (invoices, contracts). OCR Technologies: Production experience with Google Vision OCR and Azure Form Recognizer. Database & Cloud: MongoDB (sharding, aggregation, indexing) hosted on Azure (Cosmos DB, Blob Storage, AKS). Infrastructure-as-Code (Terraform/Bicep), CI/CD pipelines (Azure DevOps). Experience: 10+ years in software development, including 5+ years specializing in AI/ML, OCR, or document automation. Proven track record deploying enterprise-scale solutions in cloud environments (Azure preferred).
Preferred Qualifications Certifications: Azure Solutions Architect Expert, MongoDB Certified Developer, or Google Cloud AI/ML. Experience with alternative OCR tools (ABBYY, Tesseract) or AI platforms (GCP Vertex AI, AWS SageMaker). Knowledge of DocuSign CLM, Coupa, or SAP Ariba integrations. Familiarity with Kubernetes, Docker, and MLOps practices.

Posted 2 weeks ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Azure, Linux OS. Work on advanced technical principles, theories, and concepts. 2. Work closely with geo-aligned Tower leads, SMEs and Build managers depending on the stage of opportunity. 3. Work on Azure expertise: IaaS, PaaS, VM migrations, VNet, Traffic Manager, Azure Cloud Services, SQL Azure, Active Directory, ADFS, Data Factory, Data Lake, HDInsight, ExpressRoute, PowerShell, OMS, Security Center, Service Bus, Blob Storage. 4. Work on MS Servers: Windows 2000/2003/2008/2012, MOSS 2007, SharePoint 2010/2013, Office 365, Windows Cluster Server, SQL Server, IIS Server, File Server, Proxy Server, Exchange Server, SMS Server, Terminal Server. 5. Work on Networking: LB, TCP/IP configuration, Ethernet, Firewall, VPN, Wireless Networking, Hyper-V, Virtualization, Cloud storage, Clustering. 6. Work on other tools/technologies: PowerShell/Shell, MS Visio, Azure AD Connect, VMware, IP Subnetting.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

8 - 14 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibilities: Data Pipeline Development: Design, build, and maintain scalable ETL (Extract, Transform, Load) data pipelines using Azure Databricks, Apache Spark, and Python. Spark Optimization: Develop and optimize Spark jobs for large-scale data processing on Databricks. Ensure that the jobs run efficiently, leveraging the capabilities of distributed computing for optimal performance. Data Integration: Integrate data from various sources, including structured and unstructured data, into the Azure cloud environment using Databricks and related tools. Collaboration with Data Scientists & Analysts: Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver robust data solutions that enable advanced analytics, machine learning, and reporting. Azure Integration: Work closely with Azure services such as Azure Data Lake, Azure SQL Database, Azure Blob Storage, Azure Synapse Analytics, and Azure Data Factory for comprehensive data processing solutions. Data Transformation: Use Spark SQL, PySpark, and Databricks notebooks to perform data transformations and enable the conversion of raw data into actionable insights. Automation & Scheduling: Implement automated job scheduling and orchestration for regular data processing tasks, ensuring data is consistently processed and available for downstream consumption. Performance Tuning & Troubleshooting: Optimize the performance of data workflows and Spark applications on Databricks. Troubleshoot and resolve data-related issues and bottlenecks. Cloud Security: Ensure that data security and compliance standards are followed for cloud-based solutions, including managing data access, encryption, and auditing within the Azure Databricks environment. Monitoring & Logging: Implement logging and monitoring practices for the Azure Databricks environment to track job performance, failures, and troubleshooting efforts.
Documentation & Best Practices: Maintain proper documentation for data pipelines, processes, and technical workflows. Follow best practices for coding, version control, and deployment. Stay Updated with Technology Trends: Keep up to date with the latest developments in Azure Databricks, Apache Spark, and related technologies. Apply new techniques to improve performance and scalability. Required Qualifications & Skills: 3-5 years of hands-on experience in data engineering and working with Azure Databricks. Strong proficiency in Apache Spark, particularly in Databricks, for building large-scale data pipelines and distributed data processing applications. Solid experience with Azure cloud services, including Azure Data Lake, Azure SQL Database, Azure Blob Storage, Azure Synapse, and Azure Data Factory. Proficiency in Python, Scala, or SQL for data engineering tasks, with a focus on PySpark for data processing. Experience working with structured and unstructured data from a variety of sources, including relational databases, APIs, and flat files. Familiarity with Databricks notebooks for developing, testing, and sharing data workflows, and using them for collaboration. In-depth understanding of ETL processes, data pipelines, and data transformation techniques. Hands-on experience with cloud-based data storage solutions (e.g., Azure Data Lake, Blob Storage) and data warehousing concepts. Knowledge of data security best practices in a cloud environment (e.g., data encryption, access controls, Azure Active Directory). Experience with CI/CD pipelines and version control systems like Git. Familiarity with containerization and deployment practices using Docker and Kubernetes is a plus. Strong debugging, performance tuning, and problem-solving skills. Excellent written and verbal communication skills, with the ability to collaborate effectively across teams. Bachelor's degree in Computer Science, Information Technology, or a related field.
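The data transformation responsibility above is typically expressed in Spark SQL or PySpark; one very common pattern is keeping only the latest record per key (a window with `row_number() == 1` in Spark). A standard-library Python sketch of that logic, with hypothetical field names:

```python
from itertools import groupby
from operator import itemgetter

def latest_per_key(events, key="id", ts="updated_at"):
    """Keep only the most recent event per key — the equivalent of a Spark
    window partitioned by `key`, ordered by `ts` desc, filtered to row 1."""
    ordered = sorted(events, key=itemgetter(key, ts))
    return [max(group, key=itemgetter(ts))
            for _, group in groupby(ordered, key=itemgetter(key))]

events = [
    {"id": "a", "updated_at": "2025-01-01", "status": "new"},
    {"id": "a", "updated_at": "2025-02-01", "status": "done"},
    {"id": "b", "updated_at": "2025-01-15", "status": "new"},
]
```

On Databricks the same result comes from `Window.partitionBy("id").orderBy(F.col("updated_at").desc())` plus a `row_number()` filter, which scales out where this in-memory sketch cannot.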

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The project duration for this role is 6 months with a monthly rate of 1.60 Lac. The ideal candidate should possess 4-7 years of experience; the work location is Bangalore with a hybrid setup. Key Responsibilities: - Demonstrated strong proficiency in Python, LLMs, LangChain, prompt engineering, and related GenAI technologies. - Proficiency in working with Azure Databricks. - Ability to showcase strong analytical skills, problem-solving capabilities, and effective stakeholder communication. - A solid understanding of data governance frameworks, compliance requirements, and internal controls. - Hands-on experience in data quality rule development, profiling, and implementation. - Familiarity with Azure Data Services such as Data Lake, Synapse, and Blob Storage. Preferred Qualifications: - Previous experience in supporting AI/ML pipelines, particularly with GenAI or LLM-based models. - Proficiency in Python, PySpark, SQL, and knowledge of Delta Lake architecture. - Hands-on experience with Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics. - Prior experience in data engineering, with strong expertise in Databricks.
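The data quality rule development and profiling experience asked for above can be illustrated with a tiny framework-free sketch; in practice such rules would run in Databricks over Delta tables, and the column names and thresholds here are invented:

```python
def profile_nulls(rows, columns):
    """Profile: fraction of null/missing values per column."""
    n = len(rows) or 1
    return {c: sum(1 for r in rows if r.get(c) in (None, "")) / n
            for c in columns}

def evaluate_rules(rows, rules):
    """Each rule is (column, max_null_fraction); return failing columns."""
    stats = profile_nulls(rows, [c for c, _ in rules])
    return [c for c, threshold in rules if stats[c] > threshold]

rows = [{"email": "a@x.com", "phone": None},
        {"email": "", "phone": "123"},
        {"email": "b@x.com", "phone": None}]
failures = evaluate_rules(rows, [("email", 0.5), ("phone", 0.5)])
```

A production version would express the same rule as an expectation on a table (for example via Delta Live Tables expectations or a data quality library) and write the profile stats to a monitoring sink instead of returning them.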

Posted 2 weeks ago

Apply

10.0 - 18.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Job Title: Azure Cloud Engineer with DevOps Experience Location: Hyderabad Job Type: Full-time Experience Level: Senior (10+ years) Job Description: We are seeking an experienced Azure Cloud Engineer with DevOps expertise to design, implement, and manage scalable cloud solutions. The ideal candidate will have extensive experience in Microsoft Azure, DevOps methodologies, automation, CI/CD pipelines, and cloud security best practices. This role requires a proactive approach to infrastructure management, continuous integration/deployment, and optimizing cloud environments for performance and cost-effectiveness. Key Responsibilities To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill and/or ability required. Design, deploy, and manage cloud infrastructure in Microsoft Azure. Implement and manage CI/CD pipelines for automated deployment. Develop and maintain Infrastructure as Code (IaC) using Terraform, ARM templates, or Bicep. Monitor system performance and ensure high availability and scalability. Implement security best practices for cloud applications and infrastructure. Optimize cost, performance, and reliability of cloud resources. Troubleshoot system issues and provide Level 3 support. Collaborate with software development teams to integrate DevOps best practices. Ensure compliance with industry standards and security policies. Automate cloud deployments and maintenance tasks using PowerShell, Python, or Bash. Conduct cloud performance analysis and optimize cloud costs. Required Skills/Qualifications 10+ years of experience in cloud engineering, DevOps, or related fields. 
Strong expertise in Microsoft Azure, including Azure Virtual Machines, Logic Apps, Blob Storage, Entra ID, Azure DevOps, Azure Functions, App Service, and VNet integration. Configure and set up authentication (using Microsoft Entra) on Azure-hosted apps. Build and deploy applications using Azure DevOps with YAML-based pipelines. Hands-on experience with CI/CD tools such as Azure DevOps, GitHub Actions, Jenkins, or GitLab CI/CD. Proficiency in Infrastructure as Code (IaC) tools like Terraform, ARM Templates or Bicep. Manage code deployments to Azure Web/Function Apps using the zip deploy method. Strong knowledge of networking, security, and identity management in Azure. Scripting experience in PowerShell, Python, or Bash. Experience with monitoring and logging tools like Azure Monitor, Prometheus, Grafana, or the ELK Stack. Excellent problem-solving skills and ability to work in a fast-paced, agile environment. Strong understanding of cloud security principles and compliance standards (ISO 27001, NIST, SOC 2, etc.). Certifications such as Microsoft Certified: Azure Solutions Architect Expert, Azure DevOps Engineer Expert, or Certified Kubernetes Administrator (CKA) are a plus. Preferred Qualifications Experience with hybrid cloud environments. Familiarity with serverless computing and microservices architecture. Knowledge of data storage solutions (SQL, NoSQL, Blob Storage, Cosmos DB, etc.). Experience with machine learning workloads in Azure. Background in performance tuning and cost optimization in cloud environments. Expertise in containerization using Docker and orchestration with Kubernetes (AKS). Experience with configuration management tools such as Ansible, Chef, or Puppet.
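The zip deploy method mentioned above takes a deployment archive of the app; here is a sketch of just the packaging step using Python's standard library (the folder layout and exclusion list are assumptions, and the actual upload would go through an Azure DevOps pipeline task or `az webapp deploy`):

```python
import pathlib
import tempfile
import zipfile

def package_for_zip_deploy(app_dir, out_zip, exclude=(".git", "__pycache__")):
    """Zip an app folder with paths relative to its root, skipping junk
    directories and the archive itself."""
    app_dir, out_zip = pathlib.Path(app_dir), pathlib.Path(out_zip)
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(app_dir.rglob("*")):
            if path == out_zip or not path.is_file():
                continue
            if any(part in exclude for part in path.parts):
                continue
            zf.write(path, path.relative_to(app_dir))
    return out_zip

# Demo against a throwaway folder
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "app.py").write_text("print('hello')\n")
(tmp / "__pycache__").mkdir()
(tmp / "__pycache__" / "junk.pyc").write_bytes(b"x")
archive = package_for_zip_deploy(tmp, tmp / "deploy.zip")
```

Arcnames are made relative to the app root because App Service expands the archive at the site root; an archive of absolute paths would deploy nothing useful.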

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The project is expected to last for 6 months with a monthly rate of 1.60 Lac. The ideal candidate should have 4-7 years of experience, and the work location will be in Bangalore with hybrid working options available. As a candidate, you are required to have strong proficiency in Python, LLMs, LangChain, prompt engineering, and related GenAI technologies. Additionally, you should have proficiency with Azure Databricks and possess strong analytical, problem-solving, and stakeholder communication skills. A solid understanding of data governance frameworks, compliance, and internal controls is essential. Your experience should include data quality rule development, profiling, and implementation, as well as familiarity with Azure Data Services such as Data Lake, Synapse, and Blob Storage. Preferred qualifications for this role include experience in supporting AI/ML pipelines, particularly with GenAI or LLM-based models. Proficiency in Python, PySpark, SQL, and Delta Lake architecture is desired, along with hands-on experience in Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics. A background in data engineering with strong expertise in Databricks would be beneficial for this position.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

14 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Must have: IoT domain/testing experience (IoT cloud-to-device/device-to-cloud, preferably in Azure). Fundamental understanding of Azure Blob Storage, IoT Hub, Key Vault, DevOps pipelines, and DPS (Device Provisioning Service). Good experience with Python scripting. Nice to have: Life sciences domain, validation testing experience, pytest, JIRA. IoT SD Tester / IoT Tester / MS Intune / Blazor with Azure PaaS is mandatory, along with life sciences domain experience.
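Device-to-cloud testing with Python scripting, as described above, largely comes down to asserting on telemetry payloads received from IoT Hub; a minimal sketch of such a payload validator (the schema fields and temperature range are invented for illustration):

```python
import json

# Hypothetical telemetry schema: field name -> expected JSON type(s)
REQUIRED = {"device_id": str, "temperature": (int, float), "ts": str}

def validate_telemetry(raw):
    """Return a list of problems found in a device-to-cloud JSON payload."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e.msg}"]
    problems = []
    for field, typ in REQUIRED.items():
        if field not in msg:
            problems.append(f"missing field: {field}")
        elif not isinstance(msg[field], typ):
            problems.append(f"bad type for {field}")
    temp = msg.get("temperature")
    if isinstance(temp, (int, float)) and not -40 <= temp <= 125:
        problems.append("temperature out of range")
    return problems
```

In a real suite these checks would run inside pytest cases against messages read from the IoT Hub built-in endpoint, with DPS-provisioned test devices producing the traffic.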

Posted 2 weeks ago

Apply