
378 Azure Synapse Jobs - Page 14

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 7.0 years

15 - 20 Lacs

Pune

Work from Office

Roles and Responsibilities:
- Review and analyze structured, semi-structured, and unstructured data sources for quality, completeness, and business value.
- Design, architect, implement, and test rapid prototypes that demonstrate the value of the data, and present them to diverse audiences.
- Participate in early-stage design and feature-definition activities.
- Implement robust data pipelines using the Microsoft/Databricks stack; create reusable and scalable data pipelines.
- Collaborate with team members across multiple engineering teams to support the integration of proven prototypes into core intelligence products.
- Communicate complex data insights effectively to non-technical stakeholders.

Critical Skills to Possess:
- Advanced working knowledge of and experience with relational and non-relational databases.
- Advanced working knowledge of and experience with API data providers.
- Experience building and optimizing Big Data pipelines, architectures, and datasets.
- Strong analytic skills for working with structured and unstructured datasets.
- Hands-on experience developing ETL pipelines with Spark on Azure Databricks.
- Strong proficiency in data analysis, manipulation, and statistical modeling using tools such as Spark, Python, Scala, SQL, or similar.
- Strong experience with Azure Data Lake Storage Gen2, Azure Data Factory, Databricks, Event Hub, and Azure Synapse.
- Familiarity with several of the following technologies: Event Hub, Docker, Azure Kubernetes Service, Azure DWH, API Azure, Azure Functions, Power BI, Azure Cognitive Services.
- Azure DevOps experience deploying data pipelines through CI/CD.

Skills: Azure Databricks, Azure Data Factory, Big Data Pipelines, PySpark, Azure Synapse, Azure DevOps, Azure Data Lake Storage Gen2, Event Hub, Azure DWH, API Azure.

Experience: Minimum 5-7 years of practical experience as a Data Engineer, with in-production experience on the Azure cloud stack.

Preferred Qualifications: BS degree in Computer Science or Engineering, or equivalent experience.
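The first responsibility above, reviewing raw sources for quality and completeness, is usually the opening step of any such pipeline. As a minimal, framework-agnostic sketch (in production this would be a Spark job on Azure Databricks; the records and field names here are hypothetical):

```python
# Field-level completeness profiling over raw records: the fraction of
# non-null, non-empty values per field. Illustrative stand-in for a
# data-quality check that would normally run on Spark.
from collections import Counter

def completeness(records):
    """Return the fraction of non-null values per field across records."""
    total = len(records)
    non_null = Counter()
    for rec in records:
        for field, value in rec.items():
            if value not in (None, ""):
                non_null[field] += 1
    return {field: count / total for field, count in non_null.items()}

raw = [  # hypothetical raw records
    {"id": 1, "city": "Pune", "salary": None},
    {"id": 2, "city": "", "salary": 1500000},
    {"id": 3, "city": "Mumbai", "salary": 1800000},
]
print(completeness(raw))  # per-field non-null ratios, e.g. id -> 1.0
```

Fields whose completeness falls below a threshold would then be flagged before the data enters downstream transformations.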

Posted 2 months ago

Apply

7.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Hybrid

Overall 7 to 9 years of experience in cloud data and analytics platforms such as AWS, Azure, or GCP, including:
• 3+ years of experience with Azure cloud analytical tools (must-have)
• 5+ years of experience with data and analytics concepts such as SQL, ETL, ELT, reporting and report building, data visualization, data lineage, data importing and exporting, and data warehousing
• 3+ years of experience with general IT concepts such as integrations, encryption, authentication and authorization, batch processing, real-time processing, CI/CD, and automation
• Advanced knowledge of cloud technologies and services, specifically Azure data analytics tools:
  o Azure Functions (Compute)
  o Azure Blob Storage (Storage)
  o Azure Cosmos DB (Databases)
  o Azure Synapse Analytics (Databases)
  o Azure Data Factory (Analytics)
  o Azure Synapse Serverless SQL Pools (Analytics)

Posted 2 months ago

Apply

10.0 - 12.0 years

25 - 30 Lacs

Hyderabad, Bengaluru

Hybrid

Excellent communication and interpersonal skills, with the ability to explain technical concepts to non-technical stakeholders. Azure certifications such as Azure Solutions Architect Expert (AZ-30x) or equivalent are preferred. Knowledge of hybrid cloud architecture and migration techniques.

Mandatory Skills: Microsoft Fabric, Azure Data Factory, Azure Synapse Analytics, Azure SQL DB, Azure Functions, Azure Cosmos DB
Nice to Have: .NET experience, Azure AI skills

Posted 2 months ago

Apply

5.0 - 6.0 years

8 - 13 Lacs

Hyderabad

Work from Office

About the Role
- We are seeking a highly skilled and experienced Senior Azure Databricks Engineer to join our dynamic data engineering team.
- As a Senior Azure Databricks Engineer, you will play a critical role in designing, developing, and implementing data solutions on the Azure Databricks platform.
- You will be responsible for building and maintaining high-performance data pipelines, transforming raw data into valuable insights, and ensuring data quality and reliability.

Key Responsibilities
- Design, develop, and implement data pipelines and ETL/ELT processes using Azure Databricks.
- Develop and optimize Spark applications using Scala or Python for data ingestion, transformation, and analysis.
- Leverage Delta Lake for data versioning, ACID transactions, and data sharing.
- Utilize Delta Live Tables for building robust and reliable data pipelines.
- Design and implement data models for data warehousing and data lakes.
- Optimize data structures and schemas for performance and query efficiency.
- Ensure data quality and integrity throughout the data lifecycle.
- Integrate Azure Databricks with other Azure services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure Blob Storage).
- Leverage cloud-based data services to enhance data processing and analysis capabilities.

Performance Optimization & Troubleshooting
- Monitor and analyze data pipeline performance.
- Identify and troubleshoot performance bottlenecks.
- Optimize data processing jobs for speed and efficiency.
- Collaborate effectively with data engineers, data scientists, data analysts, and other stakeholders.
- Communicate technical information clearly and concisely.
- Participate in code reviews and contribute to the improvement of development processes.

Qualifications (Essential)
- 5+ years of experience in data engineering, with at least 2 years of hands-on experience with Azure Databricks.
- Strong proficiency in Python and SQL.
- Expertise in Apache Spark and its core concepts (RDDs, DataFrames, Datasets).
- In-depth knowledge of Delta Lake and its features (e.g., ACID transactions, time travel).
- Experience with data warehousing concepts and ETL/ELT processes.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
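The Delta Lake knowledge asked for here centers on MERGE (upsert) semantics: update rows whose key already exists in the target, insert rows whose key does not. On Databricks this is a single `MERGE INTO` statement (or `DeltaTable.merge` in PySpark); the plain-Python model below only illustrates the semantics, and the table contents are hypothetical.

```python
# Upsert semantics modeled with dictionaries: matched keys are updated,
# unmatched keys are inserted, everything else is left alone.
def merge_upsert(target, updates, key="id"):
    """Return target with updates applied: update matches, insert the rest."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 10}, {"id": 2, "qty": 5}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
print(merge_upsert(target, updates))
# [{'id': 1, 'qty': 10}, {'id': 2, 'qty': 7}, {'id': 3, 'qty': 1}]
```

In Delta Lake the same operation is transactional (ACID), and each MERGE produces a new table version that time travel can read back.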

Posted 2 months ago

Apply

3.0 - 4.0 years

10 - 20 Lacs

Hyderabad

Remote

Experience Required: 3 to 4 years
Mode of Work: Remote
Skills Required: Azure Databricks, Azure Data Factory, PySpark, Python, SQL, Spark
Notice Period: Immediate joiners; permanent/contract role (must be able to join by June 15th)

- 3 to 4+ years of experience with Big Data technologies
- Databricks experience is a must, along with Python scripting and SQL knowledge
- Strong knowledge of and experience with the Microsoft Azure cloud platform
- Proficiency in SQL and experience with SQL-based database systems
- Experience with batch and streaming data processing
- Hands-on experience with Azure data services such as Azure SQL Database, Azure Data Lake, and Azure Blob Storage
- Experience using Azure Databricks in real-world scenarios is preferred
- Experience with data integration and ETL (Extract, Transform, Load) processes
- Strong analytical and problem-solving skills
- Good understanding of data engineering principles and best practices
- Experience with programming languages such as PySpark/Python
- Relevant certifications in Azure data services or data engineering are a plus

Interested candidates can share their resume (or refer a friend) to Pavithra.tr@enabledata.com for a quick response.

Posted 2 months ago

Apply

3.0 - 7.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Key Skills: Azure Synapse, Azure Databricks, Azure, Azure DevOps, Azure AI, Azure API, Azure AD, PL/SQL

Roles and Responsibilities:
- Design and Develop Data Pipelines: Build and maintain scalable data pipelines using Azure Data Factory, ensuring efficient and reliable data movement and transformation.
- File-Based Data Management: Handle data ingestion and management from various file sources, including CSV, JSON, and Parquet formats, ensuring data accuracy and consistency.
- ETL Implementation: Implement and optimize ETL (Extract, Transform, Load) processes using tools such as Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Cloud Storage Management: Work with Azure Data Lake Storage to manage and utilize cloud storage solutions, ensuring data is securely stored and easily accessible.
- Automation with Data Factory: Leverage Azure Data Factory's automation capabilities to schedule and monitor data workflows, ensuring timely execution and error-free operations.
- Performance Monitoring: Continuously monitor and optimize data pipeline performance, troubleshoot issues, and implement best practices to enhance efficiency.
- Team Collaboration: Collaborate with Technical Architects, Business Analysts, and other engineers to build scalable and reliable end-to-end data solutions for reporting and analytics.
- DevOps Framework: Define and implement a DevOps framework using CI/CD pipelines.
- SQL Development: Write efficient, clean, and well-documented SQL queries for data extraction, manipulation, and analysis.
- SQL Performance Optimization: Optimize the performance of SQL-based queries, stored procedures, and jobs in Azure environments.
- Data Security & Compliance: Implement data security best practices and ensure compliance with data privacy regulations (HIPAA, etc.).
- Technical Leadership: Provide technical leadership and mentoring to junior engineers and team members.
- Technology Adoption: Stay current with emerging Azure technologies and trends, recommending improvements to existing systems and solutions.

Skills Required:
- Strong expertise in data analytics for analyzing and interpreting large datasets.
- Proficiency in Azure Boards/GitHub for managing project tasks and source code version control.
- Extensive experience with Azure Data Factory for building and managing scalable data pipelines.
- In-depth knowledge of Azure Data Lake for managing cloud storage solutions and data access.
- Hands-on experience with Azure Synapse for data integration and analytics solutions.
- Proficiency in Azure DevOps for implementing CI/CD pipelines and automating deployments.

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related technical field.
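The file-based ingestion responsibility above amounts to reading heterogeneous sources (CSV, JSON, Parquet) into one consistent, typed schema. A hedged standard-library sketch of that step (in ADF this would be a Copy activity with a schema mapping; the column names are invented for illustration):

```python
# Read CSV and JSON sources and normalize both into a single typed
# record shape, the way a mapping step in an ingestion pipeline would.
import csv
import io
import json

def from_csv(text):
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def from_json(text):
    return json.loads(text)

def normalize(rec):
    """Cast the raw string fields into the target schema's types."""
    return {"order_id": int(rec["order_id"]), "amount": float(rec["amount"])}

csv_src = "order_id,amount\n1,99.5\n2,10.0\n"
json_src = '[{"order_id": "3", "amount": "7.25"}]'
rows = [normalize(r) for r in from_csv(csv_src) + from_json(json_src)]
print(rows)  # three records with int ids and float amounts
```

Parquet would need a library such as pyarrow, but the normalization step is the same once records are in memory.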

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 13 Lacs

Chennai

Work from Office

Roles and Responsibilities Design, develop, test, deploy, and maintain Azure Data Factory (ADF) pipelines for data integration. Collaborate with cross-functional teams to gather requirements and design solutions using ADF. Develop complex data transformations using SQL Server Integration Services (SSIS), DDL/DML statements, and other tools. Troubleshoot issues related to pipeline failures or errors in the pipeline execution process. Optimize pipeline performance by analyzing logs, identifying bottlenecks, and implementing improvements.
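The SQL-side transformations mentioned here (DDL/DML plus a cleanup load) can be sketched end to end against an in-memory SQLite database rather than SQL Server/SSIS; the staging and fact tables are made up for illustration.

```python
# A typical load step: filter invalid rows and default nulls while
# moving data from a staging table into a fact table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging (id INTEGER, amount REAL, status TEXT);
    INSERT INTO staging VALUES (1, 100.0, 'ok'), (2, NULL, 'ok'),
                               (3, 50.0, 'cancelled');
    CREATE TABLE fact_orders (id INTEGER PRIMARY KEY, amount REAL);
""")
# Load only valid rows, defaulting null amounts to zero.
conn.execute("""
    INSERT INTO fact_orders
    SELECT id, COALESCE(amount, 0) FROM staging WHERE status = 'ok'
""")
rows = conn.execute("SELECT * FROM fact_orders ORDER BY id").fetchall()
print(rows)  # [(1, 100.0), (2, 0.0)]
```

In an ADF pipeline the same statement would typically live in a stored procedure invoked by a Stored Procedure activity after the copy completes.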

Posted 2 months ago

Apply

8.0 - 13.0 years

16 - 27 Lacs

Hyderabad

Remote

Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai (Hyderabad preferred)

- At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies such as ADB (Azure Databricks), ADF, SQL (including writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on with Python, PySpark, or Spark SQL
- Hands-on with Azure Analytics and DevOps
- Takes part in Proofs of Concept (POCs) and pilot solution preparation
- Able to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Experience in business process mapping for data and analytics solutions

Posted 2 months ago

Apply

8.0 - 13.0 years

16 - 27 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

Kanerika Inc. is a premier global software products and services firm specializing in innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI/ML, GenAI/LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.

Designation: Lead Data Engineer
Location: Hyderabad, Indore, Ahmedabad
Experience: 8 years

What You Will Do:
• Analyze business requirements.
• Analyze the data model, perform gap analysis against business requirements and Power BI, and design and model the Power BI schema.
• Transform data in Power BI/SQL/ETL tools.
• Create DAX formulas, reports, and dashboards.
• Write SQL queries and stored procedures.
• Design effective Power BI solutions based on business requirements.
• Manage a team of Power BI developers and guide their work.
• Integrate data from various sources into Power BI for analysis.
• Optimize the performance of reports and dashboards for smooth usage.
• Collaborate with stakeholders to align Power BI projects with goals.
• Knowledge of data warehousing (must); data engineering is a plus.

What We Need:
• B.Tech in computer science or equivalent
• Minimum 5+ years of relevant experience

Posted 2 months ago

Apply

5.0 - 10.0 years

13 - 23 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Hi, we are excited to announce that #LTIMindtree is currently recruiting #DataEngineers!

Roles Available:
- Specialist - Data Engineering: 5 to 8 years of experience
- Senior Specialist - Data Engineering: 8 to 12 years of experience

Location: Bangalore, Pune, Mumbai, Kolkata, Hyderabad, Chennai, and Delhi NCR
Work Mode: Hybrid
Notice Period: Up to 60 days
Link to share your details: https://lnkd.in/daty4F25

Job Summary: We are seeking an experienced and strategic Data Engineer to design, build, and optimize scalable, secure, and high-performance data solutions. You will play a pivotal role in shaping our data infrastructure, working with technologies such as Databricks, Azure Data Factory, Unity Catalog, and Spark, while aligning with best practices in data governance, pipeline automation, and performance optimization.

Key Responsibilities:
• Design and develop scalable data pipelines using Databricks and the Medallion Architecture (Bronze, Silver, Gold layers).
• Architect and implement data governance frameworks using Unity Catalog and related tools.
• Write efficient PySpark and SQL code for data transformation, cleansing, and enrichment.
• Build and manage data workflows in Azure Data Factory (ADF), including triggers, linked services, and integration runtimes.
• Optimize queries and data structures for performance and cost-efficiency.
• Develop and maintain CI/CD pipelines using GitHub for automated deployment and version control.
• Collaborate with cross-functional teams to define data strategies and drive data quality initiatives.
• Implement best practices for DevOps, CI/CD, and infrastructure-as-code in data engineering.
• Troubleshoot and resolve performance bottlenecks across Spark, ADF, and Databricks pipelines.
• Maintain comprehensive documentation of architecture, processes, and workflows.

Requirements:
• Bachelor's or master's degree in computer science, Information Systems, or a related field.
• Proven experience as a Data Architect or Senior Data Engineer.
• Strong knowledge of Databricks, Azure Data Factory, Spark (PySpark), and SQL.
• Hands-on experience with data governance, security frameworks, and catalog management.
• Proficiency in cloud platforms (preferably Azure).
• Experience with CI/CD tools and version control systems such as GitHub.
• Strong communication and collaboration skills.
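The Medallion Architecture named above moves data through three layers: Bronze holds raw ingested records, Silver holds cleansed and typed records, Gold holds aggregates ready for consumption. A toy rendering in plain Python (on Databricks each layer would be a Delta table written by PySpark; the event records are hypothetical):

```python
# Bronze -> Silver -> Gold in miniature: raw, possibly dirty records are
# cleansed and type-cast at Silver, then aggregated at Gold.
from collections import defaultdict

bronze = [  # raw events, possibly dirty
    {"user": "a", "amount": "10"},
    {"user": "a", "amount": "5"},
    {"user": None, "amount": "3"},   # dropped at Silver: missing key
    {"user": "b", "amount": "bad"},  # dropped at Silver: bad value
]

def to_silver(rows):
    """Cleanse: keep rows with a user, cast amount to float."""
    out = []
    for r in rows:
        try:
            if r["user"]:
                out.append({"user": r["user"], "amount": float(r["amount"])})
        except ValueError:
            pass
    return out

def to_gold(rows):
    """Aggregate cleansed rows into per-user totals."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["user"]] += r["amount"]
    return dict(totals)

print(to_gold(to_silver(bronze)))  # {'a': 15.0}
```

Keeping Bronze untouched is the point of the pattern: when a cleansing rule changes, Silver and Gold can be rebuilt from the raw layer.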

Posted 2 months ago

Apply

7.0 - 11.0 years

15 - 20 Lacs

Mumbai

Work from Office

This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack.

Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform.
Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI.
Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory.
Data Governance: Establish and enforce data governance policies and standards.

Primary Skills and Experience:
- 15+ years of relevant experience in data warehousing, BI, and data governance.
- Proven track record of delivering successful data solutions on the Microsoft stack.
- Experience working with diverse teams and stakeholders.

Required Technical Skills:
- Strong proficiency in data warehousing concepts and methodologies.
- Expertise in Microsoft Power BI.
- Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
- Knowledge of SQL and scripting languages (Python, PowerShell).
- Strong understanding of data modeling and ETL/ELT processes.

Secondary (Soft) Skills:
- Excellent communication and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Ability to work independently and as part of a team.
- Strong attention to detail and organizational skills.

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Greetings from Tech Mahindra! With reference to your profile on the Naukri portal, we are contacting you about a job opportunity for the role of SQL Developer with our organization.

COMPANY PROFILE: Tech Mahindra is an Indian multinational information technology services and consulting company. Website: www.techmahindra.com

Job Details:
- Experience: 5+ years
- Education: Any
- Work timings: Normal shift
- Mode: Hybrid (open to all locations)
- Working days: 5 days

Required Skills:
- 5+ years of experience with SQL DB/Server, including building SQL databases, database design, data modelling, and data warehousing
- Strong experience creating complex stored procedures, functions, and dynamic SQL
- Strong experience in performance tuning activities
- Must have experience with Azure Data Factory V2, Azure Synapse, Azure Databricks, and SSIS
- Strong Azure SQL Database and Azure SQL Data Warehouse concepts
- Strong verbal and written communication skills

Interested candidates, please forward your updated resume to ps00874998@techmahindra.com with the following details: total years of experience; relevant experience as SQL Developer; relevant experience in Azure Data Factory; relevant experience in Azure Databricks; offer amount (if holding any offer); location of offer; reason for seeking another offer; notice period (or LWD if serving); current location; preferred location; CTC; expected CTC; availability for interview (time/date); how soon you can join.

Best Regards,
Prerna Sharma
Business Associate | RMG
Tech Mahindra | PS00874998@TechMahindra.com

Posted 2 months ago

Apply

4.0 - 6.0 years

10 - 18 Lacs

Bengaluru

Work from Office

Primary Responsibilities:
- Interact closely with business stakeholders to understand their business requirements and convert them into opportunities.
- Lead POCs to create breakthrough technical solutions, performing exploratory and targeted data analyses.
- Manage and support existing applications, implementing best practices in a timely manner.
- Analyze results to generate actionable insights and present findings to business users for informed decision making.
- Understand business requirements and develop dashboards to meet business needs.
- Adapt to changing business requirements and support the development and implementation of best-known methods for data analytics.
- Perform data mining that provides actionable data in response to changing business requirements.
- Migrate data into standardized platforms (Power BI) and build critical data models to improve process performance and product quality.
- Own technical implementation and documentation associated with datasets.
- Provide updates on project progress, perform root cause analysis on completed projects, and work on identified improvement areas (process, product quality, performance, etc.).
- Provide post-implementation support and ensure target project benefits are successfully delivered in a robust and sustainable fashion.
- Build relationships and partner effectively with cross-functional teams to ensure available data is accurate, consistent, and timely.

Mandatory Skills:
- Knowledge of the software development lifecycle; expert in translating business requirements into technical solutions; fanatical about quality, usability, security, and scalability.
- Specialist in the Power Platform (Power Apps & Power Automate).
- Experience with JavaScript, Power Fx, plugin creation, custom apps and canvas/model-driven page development, web API creation, data/cloud flows, and Dataverse.
- Expert in report and dashboard development (Power BI).
- Knowledge of SAP systems (SAP ECC T-codes and navigation).
- Experience in database development, troubleshooting, and problem solving (SQL Server, SAP HANA, Azure Synapse).
- Experience in project requirements gathering and converting business requirements into analytical and technical specs.
- Good understanding of business processes and experience in Manufacturing/Inventory Management domains.
- Knowledge of performing root cause analysis and corrective actions.
- Excellent verbal and written communication and presentation skills; able to communicate cross-functionally.

Eligibility Criteria:
- Years of experience: Minimum 5-7 years
- Job experience: Expert with the Power Platform (Power Apps, Power Automate & Power BI); experience with database and data warehouse technologies (Azure Synapse/SQL Server/SAP HANA); data analysis/data profiling/data visualization

Posted 2 months ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Pune

Work from Office

Work Mode: Full-time, office-based

Job Summary: Transform raw data into compelling stories that drive business decisions. You will design, build, and optimize interactive dashboards and reports with Power BI, partner with business stakeholders to define KPIs and data models, and ensure every analytic deliverable meets enterprise reporting standards for accuracy, usability, and performance.

Key Responsibilities:
- Collaborate with business teams to gather requirements, identify key performance indicators, and translate them into intuitive Power BI reports and dashboards.
- Build robust semantic models (star/snowflake schemas, measures, and calculated tables) to support self-service analytics.
- Develop advanced DAX calculations, optimized queries, and dynamic visual interactions that deliver near-real-time insights.
- Continuously tune data models, visuals, and refresh schedules to maximise performance and minimise cost.
- Establish and maintain report governance standards (naming conventions, documentation, version control, and accessibility compliance).
- Mentor analysts and citizen developers on Power BI best practices and storytelling techniques.
- Partner with data engineering teams to validate data quality, source new data sets, and enhance the analytics pipeline.

Must-Have Skills:
- 6-8 years in BI/reporting roles, with 3+ years of hands-on Power BI design and development experience.
- Expertise in data modelling concepts (star/snowflake, slowly changing dimensions) and strong command of DAX and Power Query (M).
- Proven ability to translate complex business needs into intuitive KPIs, visuals, and interactive drill-downs.
- Solid SQL skills and familiarity with data warehouse/ETL processes (Azure Synapse, Snowflake, or similar).
- Experience optimising report performance: query folding, aggregation tables, incremental refresh, composite models, etc.
- Strong understanding of data visualisation best practices, UX design, and storytelling principles.
- Excellent stakeholder management, requirements gathering, and presentation abilities.

Preferred Certifications:
- Microsoft Certified: Power BI Data Analyst Associate (PL-300)
- Microsoft Certified: Azure Enterprise Data Analyst Associate (DP-500)
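The star-schema/measure idea behind these responsibilities is a fact table keyed to dimension tables, with measures computed by joining and grouping. In Power BI this would be a DAX measure over a model relationship; the toy tables below are invented to show only the shape of the computation.

```python
# "Total sales by region": join the fact table to the store dimension
# on store_id, then sum amounts per region.
from collections import defaultdict

dim_store = {1: {"region": "West"}, 2: {"region": "East"}}  # dimension
fact_sales = [                                              # fact
    {"store_id": 1, "amount": 100.0},
    {"store_id": 2, "amount": 40.0},
    {"store_id": 1, "amount": 60.0},
]

def sales_by_region(fact, dim):
    """Group fact rows by the region of their related dimension row."""
    totals = defaultdict(float)
    for row in fact:
        totals[dim[row["store_id"]]["region"]] += row["amount"]
    return dict(totals)

print(sales_by_region(fact_sales, dim_store))  # {'West': 160.0, 'East': 40.0}
```

In a real model the relationship lets a single measure (e.g. a sum over the amount column) be sliced by any dimension attribute without rewriting the aggregation.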

Posted 2 months ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Chennai, Bengaluru

Hybrid

Hiring for Big Data Lead
Experience: 6 - 12+ yrs
Work Location: Chennai and Bangalore
Shift Timings: 12:30pm - 9:30pm
Work Mode: 5 days WFO
Primary Skills: Azure, Databricks, ADF, PySpark, SQL

Must Have:
- 6+ years of IT experience in data warehousing and ETL
- Hands-on data experience with cloud technologies on Azure: ADF, Synapse, PySpark/Python
- Ability to understand design and source-to-target mapping (STTM) and create specification documents
- Flexibility to operate from client office locations
- Able to mentor and guide junior resources as needed

Nice to Have:
- Any relevant certifications
- Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail

Kindly share the following details: updated CV, relevant skills, total experience, current company, current CTC, expected CTC, notice period, current location, and preferred location.

Posted 2 months ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects: ADF pipelines, configuration, parameters, variables, integration runtimes
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup) and Data Flows
- ADF data ingestion and integration with other services

Azure Databricks:
- Experience with Big Data components such as Kafka, Spark SQL, DataFrames, and Hive implemented using Azure Databricks is preferred
- Azure Databricks integration with other services
- Reading and writing data in Azure Databricks
- Best practices in Azure Databricks

Synapse Analytics:
- Import data into Azure Synapse Analytics with and without PolyBase
- Implement a data warehouse with Azure Synapse Analytics
- Query data in Azure Synapse Analytics

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL Data Warehouse; Spark on Azure is available in HDInsight and Databricks.
- Good customer communication.
- Good analytical skills.

Posted 2 months ago

Apply

4.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

About the Role:
- Minimum 4 years of experience in a relevant field.
- Hands-on experience in Databricks, SQL, Azure Data Factory, and Azure DevOps.
- Strong expertise in Microsoft Azure cloud platform services (Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Synapse Analytics).
- Proficient in CI/CD pipelines in Azure DevOps for automated deployments.
- Good at performance optimization techniques such as temp tables, CTEs, indexing, merge statements, and joins.
- Familiarity with advanced SQL and programming skills (e.g., Python, PySpark).
- Familiarity with data warehousing and data modelling concepts.
- Good at data management and deployment processes using Azure Data Factory, Databricks, and Azure DevOps.
- Knowledge of integrating Azure services with DevOps.
- Experience in designing and implementing scalable data architectures.
- Proficient in ETL processes and tools.
- Strong communication and collaboration skills.
- Certifications in relevant Azure technologies are a plus.

Location: Bangalore/Hyderabad
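Among the optimization techniques this posting lists, CTEs and indexing combine naturally: a CTE isolates an intermediate result (here, each user's latest event) while an index supports the lookups on the grouping key. A small SQLite illustration with a hypothetical schema:

```python
# A CTE that keeps each user's latest event, with an index on the key.
# Note: selecting a bare column alongside MAX() within a GROUP BY is a
# documented SQLite-specific guarantee (the value comes from the row
# holding the max); standard SQL would use a window function instead.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, ts INTEGER, action TEXT);
    CREATE INDEX idx_events_user ON events(user_id);
    INSERT INTO events VALUES (1, 10, 'login'), (1, 20, 'buy'),
                              (2, 15, 'login');
""")
latest = conn.execute("""
    WITH latest AS (
        SELECT user_id, action, MAX(ts) AS ts
        FROM events GROUP BY user_id
    )
    SELECT user_id, action FROM latest ORDER BY user_id
""").fetchall()
print(latest)  # [(1, 'buy'), (2, 'login')]
```

On SQL Server or Azure SQL the same shape applies, though the correctness of the bare-column trick does not carry over and a `ROW_NUMBER()` window would be the portable form.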

Posted 2 months ago

Apply

5.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

About the Role
- We are seeking a highly skilled and experienced Senior Azure Databricks Engineer to join our dynamic data engineering team.
- As a Senior Azure Databricks Engineer, you will play a critical role in designing, developing, and implementing data solutions on the Azure Databricks platform.
- You will be responsible for building and maintaining high-performance data pipelines, transforming raw data into valuable insights, and ensuring data quality and reliability.

Key Responsibilities
- Design, develop, and implement data pipelines and ETL/ELT processes using Azure Databricks.
- Develop and optimize Spark applications using Scala or Python for data ingestion, transformation, and analysis.
- Leverage Delta Lake for data versioning, ACID transactions, and data sharing.
- Utilize Delta Live Tables for building robust and reliable data pipelines.
- Design and implement data models for data warehousing and data lakes.
- Optimize data structures and schemas for performance and query efficiency.
- Ensure data quality and integrity throughout the data lifecycle.
- Integrate Azure Databricks with other Azure services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure Blob Storage).
- Leverage cloud-based data services to enhance data processing and analysis capabilities.

Performance Optimization & Troubleshooting
- Monitor and analyze data pipeline performance.
- Identify and troubleshoot performance bottlenecks.
- Optimize data processing jobs for speed and efficiency.
- Collaborate effectively with data engineers, data scientists, data analysts, and other stakeholders.
- Communicate technical information clearly and concisely.
- Participate in code reviews and contribute to the improvement of development processes.

Qualifications (Essential)
- 5+ years of experience in data engineering, with at least 2 years of hands-on experience with Azure Databricks.
- Strong proficiency in Python and SQL.
- Expertise in Apache Spark and its core concepts (RDDs, DataFrames, Datasets).
- In-depth knowledge of Delta Lake and its features (e.g., ACID transactions, time travel).
- Experience with data warehousing concepts and ETL/ELT processes.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.

Posted 2 months ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Noida, Hyderabad

Work from Office

Azure Data Factory, Azure Databricks, SQL, PySpark, Python, Synapse

Posted 2 months ago

Apply

8 - 12 years

18 - 25 Lacs

Pune

Work from Office

We are looking for an experienced Tech Lead with a deep understanding of the Microsoft data technology stack. The candidate should have 8-10 years of professional experience, proven leadership skills, and the ability to manage and mentor a team of 5 to 8 people.

Preferred Candidate Profile:
- Experience: 8-10 years in the Data and Analytics domain with expertise in the Microsoft data tech stack.
- Leadership: Experience managing teams of 8-10 members.
- Technical Skills: Expertise in tools such as Microsoft Fabric, Azure Synapse Analytics, Azure Data Factory, Power BI, SQL Server, and Azure Databricks; strong understanding of data architecture, pipelines, and governance. Understanding of another data platform such as Snowflake, Google BigQuery, or Amazon Redshift is a plus.
- Tech stack: DBT and Databricks or Snowflake; Microsoft BI (Power BI, Synapse, and Fabric).
- Project Management: Proficiency in project management methodologies (Agile, Scrum, or Waterfall).
- Communication: Excellent interpersonal, written, and verbal communication skills.
- Education: Bachelor's or master's degree in computer science, Information Technology, or a related field.

Interested candidates can forward their profile to karthik@busisol.net or WhatsApp @ 9791876677.

Posted 2 months ago

Apply

5 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Oversee and support the process:
- Review daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Azure Data Factory. Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 months ago

Apply

5 - 8 years

9 - 12 Lacs

Pune

Work from Office

About The Role
Role purpose, responsibilities, and performance parameters are identical to the Wipro Production Specialist role described above.

Mandatory Skills: Azure Synapse Analytics. Experience: 5-8 Years. Applications from people with disabilities are explicitly welcome.

Posted 2 months ago

Apply

5 - 8 years

9 - 14 Lacs

Pune

Work from Office

About The Role
Role purpose, responsibilities, and performance parameters are identical to the Wipro Production Specialist role described above.

Mandatory Skills: Azure Synapse Analytics. Experience: 5-8 Years. Applications from people with disabilities are explicitly welcome.

Posted 2 months ago

Apply

2 - 7 years

6 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Exciting Azure Developer Job Opportunity at Infosys! We are looking for skilled Azure Developers to join our dynamic team PAN India. If you have a passion for technology and a minimum of 2 to 9 years of hands-on experience in Azure development, this is your chance to make an impact. At Infosys, we value innovation, collaboration, and diversity. We believe that a diverse workforce drives creativity and fosters a richer company culture. Therefore, we strongly encourage applications from all genders and backgrounds. Ready to take your career to the next level? Join us in shaping the future of technology. Visit our careers page for more details on how to apply.

Posted 2 months ago

Apply