8.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
ECMS Number: -
Skill: ADB (Azure Databricks) + ADF and Power BI
Role: Tech Lead
Experience level: Overall 8+ years, relevant 5+ years in ADF and Databricks, followed by Power BI
Project Description (min 50 words): High proficiency and 8-10 years of experience in designing/developing data analytics and data warehouse solutions with Python, Azure Data Factory (ADF) and Azure Databricks. He/she tends to test code manually and does not utilize automated testing frameworks for Python and PySpark. While he has a foundational understanding of the Spark architecture and the Spark execution model, he has limited experience in optimizing code based on Spark monitoring features. Experience in designing large data distribution, integration with service-oriented architecture and/or data warehouse solutions, and Data Lake solutions using Azure Databricks with large, multi-format data. Ability to translate a working solution into an implementable package on the Azure platform. Good understanding of Azure Storage Gen2. Hands-on experience with the Azure stack (minimum 5 years), Azure Databricks and Azure Data Factory. Proficient coding experience using Spark (Scala/Python) and T-SQL. Understanding of the services related to Azure Analytics, Azure SQL, Azure Function Apps and Logic Apps. Should be able to demonstrate a constant, quick learning ability and handle pressure situations without compromising on quality. Power BI report development using PBI and analysis of SSRS reports; PBI data modelling experience is an advantage. Work involves report development as well as migration of SSRS reports.
Must Have: Power BI (cloud SaaS), Paginated Report Builder, Power Query, data modeling. Strong SQL scripting is required. Well organized and able to manage multiple projects in a fast-paced, demanding environment. Attention to detail and quality; excellent problem-solving and communication skills. Ability and willingness to learn new tools and applications.
Work Location (with zip code): Pune
BGC completion timeline (before/post onboarding): Before onboarding
Vendor billing: Max. 13,000 INR/day
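As a rough illustration of the ADF + Databricks development this listing calls for, the sketch below ingests a raw CSV extract from ADLS Gen2 into a Delta table with PySpark. It is a minimal sketch only: the storage account, container, column and table names are hypothetical, and cluster credentials for the storage account are assumed to be configured.

```python
# Illustrative only: storage account, container, column and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Read raw CSV files landed in ADLS Gen2 (assumes the cluster has credentials for the account).
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/")
)

# Light transformation: type the amount column and stamp the load time.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_ts", F.current_timestamp())
)

# Persist as a Delta table for downstream Power BI / SQL consumption.
cleaned.write.format("delta").mode("append").saveAsTable("analytics.sales_raw")
```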
Posted 2 months ago
6.0 - 10.0 years
4 - 7 Lacs
Chennai
Work from Office
Job Information
Job Opening ID: ZR_1666_JOB
Date Opened: 19/12/2022
Industry: Technology
Job Type: -
Work Experience: 6-10 years
Job Title: Azure Data Engineer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600001
Number of Positions: 4
Skills: Azure Data Factory, Azure Databricks, Azure SQL Database, Synapse Analytics, Logic Apps, Azure Functions, Azure Analysis Services, Active Directory, Azure DevOps, Python, PySpark
Posted 2 months ago
5.0 - 8.0 years
5 - 9 Lacs
Mumbai
Work from Office
Job Information
Job Opening ID: ZR_1624_JOB
Date Opened: 08/12/2022
Industry: Technology
Job Type: -
Work Experience: 5-8 years
Job Title: Azure ADF & Power BI Developer
City: Mumbai
Province: Maharashtra
Country: India
Postal Code: 400001
Number of Positions: 4
Roles & Responsibilities:
Resource must have 5+ years of hands-on experience in Azure cloud development (ADF + Databricks) - mandatory.
Strong in Azure SQL; good to have knowledge of Synapse / Analytics.
Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies.
Good communication skills - written and verbal; can work directly with the customer.
Ready to work in the 2nd shift; flexible.
Defines, designs, develops and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark.
Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark.
Strong T-SQL skills with experience in Azure SQL DW.
Experience handling structured and unstructured datasets.
Experience in data modeling and advanced SQL techniques.
Experience implementing Azure Data Factory pipelines using the latest technologies and techniques.
Good exposure to application development.
The candidate should work independently with minimal supervision.
Posted 2 months ago
3.0 - 7.0 years
10 - 20 Lacs
Kochi
Hybrid
Skills and attributes for success
3 to 7 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions.
Extensive hands-on experience implementing data migration and data processing using Azure services: Databricks, ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, etc.
Hands-on programming experience in Python/PySpark.
Good knowledge of DWH concepts and implementation knowledge of Snowflake.
Well versed in DevOps and CI/CD deployments.
Must have hands-on experience in SQL and procedural SQL languages.
Strong analytical skills and enjoys solving complex technical problems.
Please apply on the link below for the further interview process: https://careers.ey.com/job-invite/1537161/
Posted 2 months ago
3.0 - 7.0 years
10 - 15 Lacs
Bengaluru
Work from Office
We are seeking an experienced Power BI Developer with AI skills and 4-7 years of experience to join our dynamic team. The ideal candidate will bring extensive expertise in Power BI, a strong command of DAX queries, and strong knowledge of using AI/ML embeddings in Power BI. You should have a proven track record of designing, developing and optimizing complex dashboards. If you are passionate about leveraging data to drive business insights and thrive in a collaborative environment, we'd love to hear from you!
Roles & Responsibilities:
Understand business requirements in a BI context and design data models to transform raw data into meaningful insights.
Create dashboards and visually interactive reports using Power BI.
Create relationships between data sources and develop data models accordingly.
Apply DAX functions and write advanced DAX queries for custom calculations and performance optimization.
Awareness of Star and Snowflake schemas and core concepts of Data Warehousing (DWH).
Work independently in rotational shifts, managing tasks without close supervision.
Understand the business functionality of applications to provide accurate and relevant insights.
Participate in bridge calls for critical incidents and provide clarifications as needed.
Ensure on-time delivery of solutions as per the agreed ETAs.
Maintain high quality in deliverables through careful validation and testing.
Demonstrate effective verbal and written communication.
Handle multiple tasks simultaneously while maintaining attention to detail.
Possess strong interpersonal and collaboration skills to work across teams.
Posted 2 months ago
6.0 - 8.0 years
10 - 14 Lacs
Ahmedabad
Remote
Hiring Senior Azure Data Architect & Performance Engineer (Remote, 6 PM–3 AM IST). Expert in SQL Server, Azure, T-SQL, PowerShell, performance tuning, Oracle to SQL migration, Snowflake. 6–8 yrs exp. Strong DB internals & Azure skills required.
Posted 2 months ago
6.0 - 8.0 years
15 - 20 Lacs
Ahmedabad
Remote
Hiring Senior Azure Data Architect & Performance Engineer (Remote, 6 PM–3 AM IST). Expert in SQL Server, Azure, T-SQL, PowerShell, performance tuning, Oracle to SQL migration, Snowflake. 6–8 yrs exp. Strong DB internals & Azure skills required.
Posted 2 months ago
4.0 - 8.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Databricks Administration, Terraform, Python/Bash, Azure
Required Skills: Power BI, Azure Monitor, ServiceNow
Posted 2 months ago
7.0 - 12.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Azure Cloud Architecture, Databricks Administration, Azure Networking, Global Load Balancing, HA/DR, Fault Tolerance
Required Skills: Terraform, DR Drill, Power BI
Posted 2 months ago
2.0 - 3.0 years
3 - 4 Lacs
Jewar
Work from Office
Responsibilities: Design, deploy, and manage secure Azure infrastructure; maintain VMs, storage, and databases; use IaC (ARM, Terraform).
Requirements: 2-3 yrs Azure experience; strong with VMs, VNets, storage, App Services.
Nice to Have: Fintech/PCI-DSS knowledge.
Posted 2 months ago
15.0 - 19.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Apache Spark, PySpark
Minimum 15 year(s) of experience is required.
Educational Qualification: Graduate
Key Responsibilities:
1. Azure DevOps CI/CD Integration Specialist to help set up the end-to-end technical Continuous Integration/Continuous Deployment framework in Azure for ADF, Databricks code, SQL and AAS, and to embed the surrounding processes with the team.
2. Build processes supporting data transformation, data structures, metadata, dependency and workload management.
3. Azure Data Factory, Azure Data Lake Storage, Azure SQL, PySpark.
Technical Experience:
1. Extensive experience with Azure Databricks; good to have Synapse, SQL, PySpark.
2. Experience with Azure: Azure Data Factory, Azure Data Lake Storage, Databricks, Stream Analytics, Azure Functions, serverless architecture, ARM templates.
3. Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark SQL.
4. Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of data.
Professional Attributes:
1. Strong project management and organizational skills.
2. Experience supporting and working with cross-functional teams in a dynamic environment.
3. Analytical bent of mind.
4. Ability to manage interaction with business stakeholders and others within the organization.
5. Good communication and documentation skills.
Qualification: Graduate
Posted 2 months ago
3.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Connected Vehicles
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will perform maintenance, enhancements, and/or development work, contributing to the overall success of the projects.
Roles & Responsibilities:
1. Develop, deploy, maintain, and manage microservices built on Java Spring Boot on a cloud platform, along with corresponding CI/CD pipelines.
2. Provide support to junior members of the team.
Professional & Technical Skills:
1. 3-6 years of experience and a strong foundation in Java Spring Boot with exposure to microservices architecture.
2. Cloud experience is preferred on any of the CSPs such as Azure/AWS/GCP.
3. Should have knowledge of using CI/CD pipelines and deploying to AKS/ECS.
4. Good to have experience and knowledge of Event Hubs/Kafka, SQL/NoSQL DBs, Redis, Azure Functions or AWS Lambda, APIs through APIM/API Gateway, App Insights, Log Analytics.
5. Good communication skills; diligent and proactive in working, troubleshooting and keeping leads informed.
Additional Information:
1. The candidate should have a minimum of 3 years of experience in Spring Boot.
2. This position is based in Hyderabad.
3. 15 years of full-time education is required.
Qualification: 15 years full time education
Posted 2 months ago
4.0 - 7.0 years
7 - 12 Lacs
Gurugram
Hybrid
Role & responsibilities
Design and build effective solutions using the primary key skills required for the profile.
Support the Enterprise Data Environment team, particularly for data quality and production support.
Collaborate on a data migration strategy for existing systems that need to migrate to a next-generation Cloud / AWS application software platform.
Collaborate with teams as a key contributor of data architecture directives and documentation, including data models, technology roadmaps, standards, guidelines, and best practices.
Focus on data quality throughout the ETL and data pipelines, driving improvements to data management processes, data storage, and data security to meet the needs of business customers.
Preferred candidate profile
Education: Bachelor's; Field of study: Information Technology
Experience: 4+ years of total experience in the IT industry as a developer/senior developer/data engineer.
3+ years of experience working extensively with Azure services such as Azure Data Factory, Azure Synapse and Azure Data Lake.
3+ years of experience working extensively with Azure SQL and MS SQL Server, with good exposure to writing complex SQL queries.
1+ years of experience working with the production support operations team as a production support engineer.
Good knowledge of and exposure to important SQL concepts such as query optimization, data modelling and data governance.
Working knowledge of the CI/CD process using Azure DevOps and Azure Logic Apps.
Very good written and verbal communication skills.
Perks and Benefits
Transportation Services: Convenient and reliable commute options to ensure a hassle-free journey to and from work.
Meal Facilities: Nutritious and delicious meals provided to keep you energized throughout the day.
Career Growth Opportunities: Clear pathways for professional development and advancement within the organization.
Captive Unit Advantage: Work in a stable, secure environment with long-term projects and consistent workflow.
Continuous Learning: Access to training programs, workshops, and resources to support your personal and professional growth.
Link to apply: https://encore.wd1.myworkdayjobs.com/externalnew/job/Gurgaon---Candor-Tech-Space-IT---ITES-SEZ/Senior-Data-Engineer_HR-18537
Or share your CV at Anjali.panchwan@mcmcg.com
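As a small illustration of the Azure SQL / MS SQL Server querying mentioned above, the sketch below runs a simple data-quality check with pyodbc. It is a hedged example only: the server, database, credentials and table name are hypothetical, and it assumes the pyodbc package plus the Microsoft ODBC Driver 18 for SQL Server are installed.

```python
# Hypothetical server, database, credentials and table; requires pyodbc and ODBC Driver 18.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:example-server.database.windows.net,1433;"
    "Database=exampledb;"
    "Uid=example_user;Pwd=example_password;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # A simple data-quality check: rows loaded per day over the last week.
    cursor.execute(
        """
        SELECT CAST(load_ts AS date) AS load_date, COUNT(*) AS row_count
        FROM dbo.sales_raw
        WHERE load_ts >= DATEADD(day, -7, GETDATE())
        GROUP BY CAST(load_ts AS date)
        ORDER BY load_date
        """
    )
    for load_date, row_count in cursor.fetchall():
        print(load_date, row_count)
```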
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Noida
Work from Office
Expertise in .NET Core programming and web development technologies.
Developing and implementing complex web applications and RESTful APIs.
Deep understanding of database technologies such as SQL Server or Oracle.
Knowledge of software development methodologies and best practices.
Familiarity with cloud technologies such as Azure or AWS.
Strong communication and collaboration skills.
Secondary Skills: C#, ASP.NET, Entity Framework; SQL Server; Web API, RESTful services; Angular, React, or Vue.js.
Soft Skills and Professional Attributes: time management; self-motivation and self-learning; adaptability to changing environments; attention to detail; willingness to learn and improve; positive attitude; ownership and accountability.
Engineering Skills: strong coding, debugging, and problem-solving skills; experience with agile methodologies; experience with code reviews and code quality tools; familiarity with SDLC, DevOps, and ALM tools; ability to work in a team environment; good communication skills.
Job Responsibilities:
Design and develop high-quality .NET Core applications.
Collaborate with cross-functional teams to define, design, and ship new features.
Write clean, scalable, and maintainable code.
Conduct code reviews and maintain code quality standards.
Troubleshoot and debug issues as they arise.
Ensure timely delivery of assigned tasks.
Participate in agile ceremonies such as sprint planning, stand-ups, and retrospectives.
EXPERIENCE: 6-8 years
SKILLS: Primary Skill: .NET Development; Sub Skill(s): .NET Development; Additional Skill(s): .NET Core, ASP.NET, C#, MySQL, SQL, Azure SQL Development
Posted 2 months ago
2.0 - 6.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Job: Digital Solutions | Schedule: Full-time | Employment Type: Agency Contractor | Job Level: Experienced | Job Posting: May 28, 2025 | Unposting Date: Jun 27, 2025 | Reporting Manager Title: Director - Digital Customer Solutions
We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.
The Role
Develop and implement data pipelines for ingesting and collecting data from various sources into a centralized data platform.
Develop and maintain ETL jobs using AWS Glue services to process and transform data at scale (an illustrative Glue job sketch follows this listing).
Optimize and troubleshoot AWS Glue jobs for performance and reliability.
Utilize Python and PySpark to efficiently handle large volumes of data during the ingestion process.
Collaborate with data architects to design and implement data models that support business requirements.
Create and maintain ETL processes using Airflow, Python and PySpark to move and transform data between different systems.
Implement monitoring solutions to track data pipeline performance and proactively identify and address issues.
Manage and optimize databases, both SQL and NoSQL, to support data storage and retrieval needs.
Familiarity with Infrastructure as Code (IaC) tools like Terraform, AWS CDK and others.
Proficiency in event-driven integrations, batch-based and API-led data integrations.
Proficiency in CI/CD pipelines such as Azure DevOps, AWS pipelines or GitHub Actions.
About You
To be considered for this role it is envisaged you will possess the following attributes:
Strong hands-on experience in implementing microservice-based solutions.
Expertise in Azure services like Azure Kubernetes Service (AKS), Azure Functions, Azure Blob Storage, Azure SQL, Azure APIM and Application Gateways.
Good understanding of OWASP vulnerabilities and their remediation.
Strong understanding of microservices architecture principles, containerization (e.g., Docker, Kubernetes), and API design.
Strong understanding of data architecture, data modelling, data management, and data integration patterns and challenges.
Experience with scalable data platforms and solutions integrating and standardising data from different enterprise applications.
One or more certifications as a Solution Architect in any of the leading cloud platforms.
Experience with EPC (Engineering, Procurement and Construction) customers.
Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard.
We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.
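The AWS Glue ETL work described in the listing above typically resembles the following minimal job sketch. It is illustrative only: the Glue Data Catalog database, table name, and S3 bucket are hypothetical, and the boilerplate assumes the script runs as a Glue Spark job.

```python
# Hypothetical catalog database, table and S3 path; standard Glue Spark job boilerplate.
import sys

from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Transform with plain Spark, then write the curated output as Parquet.
df = dyf.toDF().dropDuplicates(["order_id"])
df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```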
Posted 2 months ago
3.0 - 6.0 years
14 - 18 Lacs
Pune
Work from Office
Establish and implement best practices for DBT workflows, ensuring efficiency, reliability, and maintainability.
Collaborate with data analysts, engineers, and business teams to align data transformations with business needs.
Monitor and troubleshoot data pipelines to ensure accuracy and performance.
Work with Azure-based cloud technologies to support data storage, transformation, and processing.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Strong MS SQL and Azure Databricks experience.
Implement and manage data models in DBT, covering data transformation and alignment with business requirements.
Ingest raw, unstructured data as structured datasets into a cloud object store.
Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
Preferred technical and professional experience:
Establish best DBT processes to improve performance, scalability, and reliability.
Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Databricks.
Proven interpersonal skills while contributing to team effort by accomplishing related results as required.
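DBT transformations are usually authored as SQL models; on Databricks, dbt also supports Python models, which can help when a step is easier to express in PySpark. Below is a minimal sketch under those assumptions: the dbt-databricks adapter is in use, and the upstream model name and columns are hypothetical.

```python
# models/orders_daily.py -- minimal dbt Python model sketch (dbt-databricks).
# The upstream model "stg_orders" and its columns are hypothetical.
import pyspark.sql.functions as F


def model(dbt, session):
    # Materialize the result as a table in the target schema.
    dbt.config(materialized="table")

    # On the Databricks adapter, dbt.ref() returns a PySpark DataFrame.
    orders = dbt.ref("stg_orders")

    # Aggregate to one row per customer per day.
    return (
        orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
              .agg(F.sum("amount").alias("daily_amount"))
    )
```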
Posted 2 months ago
4.0 - 8.0 years
4 - 8 Lacs
Gurugram
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.
Your Role
Proficiency in MS Fabric, Azure Data Factory, Azure Synapse Analytics, Azure Databricks.
Extensive knowledge of MS Fabric components: Lakehouses, OneLake, Data Pipelines, Real-Time Analytics, Power BI integration, Semantic Model.
Integrate Fabric capabilities for seamless data flow, governance, and collaboration across teams.
Strong understanding of Delta Lake, Parquet, and distributed data systems.
Strong programming skills in Python, PySpark, Scala or Spark SQL/T-SQL for data transformations.
Your Profile
Strong experience in implementation and management of a lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL).
Proficiency in data integration techniques, ETL processes and data pipeline architectures.
Understanding of machine learning algorithms and AI/ML frameworks (e.g., TensorFlow, PyTorch) and Power BI is an added advantage.
MS Fabric and PySpark are a must.
What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance.
At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 2 months ago
5.0 - 10.0 years
14 - 20 Lacs
Hyderabad
Work from Office
Job Title: Lead Engineer / Senior Lead Engineer - Full Stack Cloud Developer
Years of Experience: 5-10 years
Role Overview: A full stack cloud developer in an individual contributor role will handle development of engineering cloud applications and APIs by leveraging the latest cloud technologies. The developer shall be able to analyze the requirements, develop the software design and implement error-free software as per OTIS coding standards.
On a typical day you will:
Analyze the requirements, develop the software design and implement error-free software as per OTIS coding standards.
Develop and execute unit test scenarios before hand-off to QA engineers.
Debug and troubleshoot issues that arise from the various phases of software development lifecycle activities.
Share project updates with leadership and the PMO.
Follow SAFe methodologies and OTIS software development processes.
What you will need to be successful:
Education: BE/B.Tech/M.Tech/MCA
5+ years of software development experience as a full stack developer using .NET Core.
Experience in Angular 10 or the latest version.
Working experience in cloud computing (App Services, WebJobs, Function Apps, etc.).
Experience in REST API and cloud services development.
Data management components: Azure SQL, SQL Server, Azure Blob Storage, Azure Data Tables.
Good to have skills:
Experience with messaging and integration components: Storage Queues, Service Bus (Queues, Relay & Topics).
Troubleshooting Azure applications.
C++ and .NET CLI integration.
Posted 2 months ago
6.0 - 9.0 years
5 - 14 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & responsibilities
Databricks skill set with PySpark and SQL.
Strong proficiency in PySpark and SQL.
Understanding of data warehousing concepts.
ETL processes / data pipeline building with ADB/ADF.
Experience with the Azure cloud platform and knowledge of data manipulation techniques.
Experience working with business teams to convert requirements into technical stories for migration.
Leading technical discussions and implementing the solution.
Experience with multi-tenant architecture, having delivered projects on the Databricks + Azure combination.
Exposure to Unity Catalog is useful.
Posted 2 months ago
7.0 - 11.0 years
20 - 27 Lacs
Noida, Greater Noida
Work from Office
Description - Primary Responsibilities:
Design, develop, and implement scalable data pipelines using Azure Databricks.
Develop PySpark-based data transformations and integrate structured and unstructured data from various sources.
Optimize Databricks clusters for performance, scalability, and cost-efficiency within the Azure ecosystem.
Monitor, troubleshoot, and resolve performance bottlenecks in Databricks workloads.
Manage orchestration and scheduling of end-to-end data pipelines using tools like Apache Airflow, ADF scheduling, and Logic Apps.
Collaborate effectively with the architecture team in designing solutions and with product owners in validating the implementations.
Implement best practices to enable data quality, monitoring, logging, and alerting for failure scenarios and exception handling.
Document step-by-step processes to troubleshoot potential issues and deliver cost-optimized cloud solutions.
Provide technical leadership, mentorship, and best practices for junior data engineers.
Stay up to date with Azure and Databricks advancements to continuously improve data engineering capabilities.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
B.Tech or equivalent.
7+ years of overall experience in the IT industry and 6+ years of experience in data engineering, with 3+ years of hands-on experience in Azure Databricks.
Hands-on experience with Delta Lake, Lakehouse architecture, and data versioning.
Experience with CI/CD pipelines for data engineering solutions (Azure DevOps, Git).
Solid knowledge of performance tuning, partitioning, caching, and cost optimization in Databricks.
Deep understanding of data warehousing, data modeling (Kimball/Inmon), and big data processing.
Solid expertise in the Azure ecosystem, including Azure Synapse, Azure SQL, ADLS, and Azure Functions.
Proficiency in PySpark, Python and SQL for data processing in Databricks.
Proven excellent written and verbal communication skills.
Proven excellent problem-solving skills and ability to work independently.
Proven ability to balance multiple and competing priorities and execute accordingly.
Proven highly self-motivated with excellent interpersonal and collaborative skills.
Proven ability to anticipate risks and obstacles and develop plans for mitigation.
Proven excellent documentation experience and skills.
Preferred Qualifications:
Azure certifications (DP-203, AZ-304, etc.).
Experience in infrastructure as code, scheduling as code, and automating operational activities using Terraform scripts.
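As a hedged sketch of the orchestration work described above, the following Airflow 2.x DAG submits a Databricks notebook run once a day. The connection id, cluster spec, and notebook path are hypothetical, and the apache-airflow-providers-databricks package is assumed to be installed.

```python
# Illustrative orchestration sketch: connection id, notebook path and cluster spec are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="daily_databricks_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a one-off Databricks run that executes an ingestion notebook.
    run_notebook = DatabricksSubmitRunOperator(
        task_id="run_ingest_notebook",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Repos/data/ingest_orders"},
    )
```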
Posted 2 months ago
5.0 - 7.0 years
12 - 18 Lacs
Pune
Work from Office
Proven experience as a Data Engineer.
Must have strong knowledge of T-SQL.
Should have an Azure ADF background; strong expertise in Snowflake and Data Factory.
Proficiency in Azure SQL and experience with data modeling.
Experience with ETL tools.
Benefits: health insurance, flexible working.
Posted 2 months ago
2.0 - 5.0 years
3 - 6 Lacs
Bhimavaram
Work from Office
Client Server Tech is looking for an Azure SAN Engineer to join our dynamic team and embark on a rewarding career journey.
Analyzing customer needs to determine appropriate solutions for complex technical issues.
Creating technical diagrams, flowcharts, formulas, and other written documentation to support projects.
Providing guidance to junior engineers on projects within their areas of expertise.
Conducting research on new technologies and products in order to recommend improvements to current processes.
Developing designs for new products or systems based on customer specifications.
Researching existing technologies to determine how they could be applied in new ways to solve problems.
Reviewing existing products or concepts to ensure compliance with industry standards, regulations, and company policies.
Preparing proposals for new projects, identifying potential problems, and proposing solutions.
Estimating costs and scheduling requirements for projects and evaluating results.
Posted 2 months ago
3.0 - 7.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Any Azure certification is an advantage. Keywords: Azure Logic Apps, Azure DevOps, Terraform.
Job Description:
Create and configure Azure Logic Apps, Functions, Azure Communication Services and Storage accounts to implement integration with cloud and on-premises components.
Implement Azure monitoring and Application Insights.
Employ Terraform/Bicep to define and deploy Azure infrastructure as code (IaC), ensuring consistency and repeatability across environments.
Establish and maintain Azure Pipelines for seamless and automated infrastructure promotion through various stages (e.g., development, staging, production).
Apply your knowledge of virtual networks and hub-and-spoke network topology to ensure secure and efficient network connectivity within Azure.
Implement and adhere to Azure security best practices to safeguard our cloud resources.
Skills Required:
Proven hands-on experience in creating, configuring, and managing Azure Logic Apps, Functions, Azure Blob Storage, Azure Monitoring, and Azure Communication Services.
Significant experience with Terraform/Bicep for infrastructure provisioning and management.
Demonstrated ability to set up and manage Azure Pipelines for CI/CD of infrastructure.
A solid understanding of Azure virtual networks and hub-and-spoke network architecture.
A strong grasp of Azure security principles and best practices.
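To illustrate the Azure Functions side of this integration work, here is a minimal HTTP-triggered function in the Python v1 programming model. It is a sketch only: the query parameter and response payload are hypothetical placeholders for whatever downstream integration (Logic App, Service Bus, storage) the real solution would call.

```python
# Minimal HTTP-triggered Azure Function sketch (Python v1 model); payload shape is hypothetical.
import json
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Integration endpoint invoked.")

    order_id = req.params.get("order_id")
    if not order_id:
        return func.HttpResponse("Missing order_id", status_code=400)

    # In a real integration this is where the function would call downstream
    # services (e.g., a Logic App, Service Bus queue, or storage account).
    payload = {"order_id": order_id, "status": "received"}
    return func.HttpResponse(
        json.dumps(payload), mimetype="application/json", status_code=200
    )
```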
Posted 2 months ago
5.0 - 8.0 years
13 - 17 Lacs
Gurugram
Work from Office
KPMG India is looking for a Senior - Azure Data Engineering professional to join our dynamic team and embark on a rewarding career journey.
Assure that data is cleansed, mapped, transformed, and otherwise optimised for storage and use according to business and technical requirements.
Solution design using Microsoft Azure services and other tools.
The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning etc.).
Load transformed data into storage and reporting structures in destinations including data warehouses, high-speed indexes, real-time reporting systems and analytics applications.
Build data pipelines to collectively bring together data.
Other responsibilities include extracting data, troubleshooting and maintaining the data warehouse.
Posted 2 months ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do:
Oversee and support the process by reviewing daily transactions on performance parameters.
Review the performance dashboard and the scores for the team.
Support the team in improving performance parameters by providing technical support and process guidance.
Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions.
Ensure standard processes and procedures are followed to resolve all client queries.
Resolve client queries as per the SLAs defined in the contract.
Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
Document and analyze call logs to spot the most frequently occurring trends to prevent future problems.
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution.
Ensure all product information and disclosures are given to clients before and after the call/email requests.
Avoid legal challenges by monitoring compliance with service agreements.
Handle technical escalations through effective diagnosis and troubleshooting of client queries.
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements.
If unable to resolve the issues, escalate them to TA/SES in a timely manner.
Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions.
Troubleshoot all client queries in a user-friendly, courteous and professional manner.
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
Mentor and guide Production Specialists on improving technical knowledge.
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
Develop and conduct trainings (triages) within products for Production Specialists as per target.
Inform the client about the triages being conducted.
Undertake product trainings to stay current with product features, changes and updates.
Enroll in product-specific and any other trainings per client requirements/recommendations.
Identify and document the most common problems and recommend appropriate resolutions to the team.
Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliver:
1. Process - No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT.
2. Team Management - Productivity, efficiency, absenteeism.
3. Capability development - Triages completed, Technical Test performance.
Mandatory Skills: Azure Integration.
Experience: 5-8 years.
Posted 2 months ago