Jobs
Interviews

378 Azure Synapse Jobs - Page 5

JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Responsibilities: Oversee and support the process by reviewing daily transactions against performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, the problem-solving steps taken, and the total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyse call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous, and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes, and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Azure Synapse Analytics. Experience: 5-8 years.

Posted 1 month ago

Apply

2.0 - 3.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Detailed job description - Skill set: Technically strong and hands-on. Self-driven. Good client communication skills. Able to work independently and a good team player. Flexible to work PST hours (overlap for some hours). Past development experience for the Cisco client is preferred.

Posted 1 month ago

Apply

10.0 - 16.0 years

18 - 30 Lacs

Chennai

Work from Office

Hi, we have a vacancy for a Senior Data Engineer. Location: Chennai. Experience: 10+ years. Salary: up to 30 LPA.

We are seeking an experienced Senior Data Engineer to join our dynamic team. The ideal candidate will design and implement the data engineering framework.

Responsibilities: Strong skills in BigQuery, GCP Cloud Data Fusion (for ETL/ELT), and Power BI. Strong skills in data pipelines. Able to work with Power BI and Power BI reporting. Design and implement the data engineering framework and data pipelines using Databricks and Azure Data Factory. Document the high-level design components of the Databricks data pipeline framework. Evaluate and document the current dependencies on the existing DEI toolset and agree a migration plan. Lead the design and implementation of an MVP Databricks framework. Document and agree an aligned set of standards to support the implementation of a candidate pipeline under the new framework. Support integrating a test automation approach into the Databricks framework, in conjunction with the test engineering function, to support CI/CD and automated testing. Support the development team's capability building by establishing an L&D and knowledge-transition approach. Support the implementation of data pipelines against the new framework in line with the agreed migration plan. Ensure data quality management, including profiling, cleansing, and deduplication, to support the build of data products for clients.

Skill set: Experience working in Azure Cloud using Azure SQL, Azure Databricks, Azure Data Lake, Delta Lake, and Azure DevOps. Proficient in Python, PySpark, and SQL. Data profiling and data modelling experience on large data transformation projects, creating data products and data pipelines. Experience creating metadata- and business-rules-driven data management frameworks and data pipelines using Databricks. Experience reviewing datasets for data products in terms of data quality management and populating data schemas set by Data Modellers. Experience with data profiling, data quality management, and data cleansing tools.

Immediate joining or short notice is required. For more information, please call Varsha at 7200847046.
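Several of the listings above ask for data quality management: profiling, cleansing, and deduplication ahead of building data products. In practice this is usually done in PySpark or SQL; the following library-free Python sketch shows the three steps in miniature (the record shape and column names are invented for illustration):

```python
from collections import Counter

def profile(rows, column):
    """Basic profile of one column: row count, null count, distinct values."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),
    }

def cleanse(rows):
    """Trim whitespace and normalise empty strings to None."""
    out = []
    for r in rows:
        cleaned = {}
        for k, v in r.items():
            if isinstance(v, str):
                v = v.strip() or None
            cleaned[k] = v
        out.append(cleaned)
    return out

def deduplicate(rows, key):
    """Keep the first row seen for each business key."""
    seen, out = set(), []
    for r in rows:
        k = r.get(key)
        if k not in seen:
            seen.add(k)
            out.append(r)
    return out

# Hypothetical input with a duplicate key and a blank value
customers = [
    {"id": 1, "email": " a@x.com "},
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},
]
clean = deduplicate(cleanse(customers), "id")
print(profile(clean, "email"))
```

The same three functions map naturally onto DataFrame operations (`dropDuplicates`, `trim`, aggregate counts) once the logic is agreed.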

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune

Remote

Design, develop, and optimize data pipelines and ETL/ELT workflows using Microsoft Fabric, Azure Data Factory, and Azure Synapse Analytics. Implement Lakehouse and Warehouse architectures within Microsoft Fabric, supporting medallion (bronze-silver-gold) data layers. Collaborate with business and analytics teams to build scalable and reliable data models (star/snowflake) using Azure SQL, Power BI, and DAX. Utilize Azure Analysis Services, Power BI Semantic Models, and Microsoft Fabric Dataflows for analytics delivery. Use Python hands-on for data transformation and processing. Apply CI/CD best practices and manage code through Git version control. Ensure data security, lineage, and quality using data governance best practices and Microsoft Purview (if applicable). Troubleshoot and improve performance of existing data pipelines and models. Participate in code reviews, testing, and deployment activities. Communicate effectively with stakeholders across geographies and time zones.

Required Skills: Strong knowledge of Azure Synapse Analytics, Azure Data Factory, Azure SQL, and Azure Analysis Services. Proficiency in Power BI and DAX for data visualization and analytics modeling. Strong Python skills for scripting and data manipulation. Experience in dimensional modeling, star/snowflake schemas, and Kimball methodologies. Familiarity with CI/CD pipelines, DevOps, and Git-based versioning. Understanding of data governance, data cataloging, and quality management practices. Excellent verbal and written communication skills.
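The star/snowflake modelling this listing asks for can be illustrated without any BI tooling: a fact table carries foreign keys and additive measures, and each dimension is a lookup keyed by a surrogate key. A toy sketch, with all table and column names invented:

```python
# Dimension tables: surrogate key -> descriptive attributes
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
}
dim_date = {
    20240101: {"year": 2024, "month": 1},
}

# Fact table: foreign keys into the dimensions plus additive measures
fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 100.0},
    {"product_key": 2, "date_key": 20240101, "amount": 50.0},
    {"product_key": 1, "date_key": 20240101, "amount": 25.0},
]

def sales_by(attribute):
    """Aggregate the fact table by one product attribute (a 'slice')."""
    totals = {}
    for row in fact_sales:
        attr = dim_product[row["product_key"]][attribute]
        totals[attr] = totals.get(attr, 0.0) + row["amount"]
    return totals

print(sales_by("name"))      # totals per product name
print(sales_by("category"))  # totals per product category
```

In Synapse or Power BI the same shape becomes a fact table joined to dimension tables, with DAX measures replacing the hand-rolled aggregation.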

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 9 Lacs

Pune

Remote

Azure Data Engineer: The Data Engineer builds and maintains data pipelines and infrastructure within Microsoft Fabric, enabling a seamless migration from Oracle/Informatica. This offshore role requires deep expertise in data engineering techniques to support enterprise data needs. The successful candidate will excel in creating scalable data solutions. Responsibilities: Develop and maintain data pipelines for Microsoft Fabric, handling ETL processes from Oracle/Informatica. Ensure seamless data flow, integrity, and performance in the new platform. Collaborate with the Offshore Data Modeler and Onsite Data Modernization Architect to align with modernization goals. Optimize code and queries for performance using tools like PySpark and SQL. Conduct unit testing and debugging to ensure robust pipeline functionality. Report technical progress and issues to the Offshore Project Manager. Skills: Bachelor's degree in computer science, data engineering, or a related field. 4+ years of data engineering experience with PySpark, Python, and SQL. Strong knowledge of Microsoft Fabric, Azure services (e.g., Data Lake, Synapse), and ETL processes. Experience with code versioning (e.g., Git) and optimization techniques. Ability to refactor legacy code and write unit tests for reliability. Problem-solving skills with a focus on scalability and performance.
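This posting asks for the ability to refactor legacy code and write unit tests for reliability. A common refactoring move is to isolate the transformation rule in a pure function so it can be tested without a cluster or a database; a hedged sketch follows (the normalisation rule itself is invented):

```python
def normalise_record(record):
    """Pure transformation: uppercase the country code and coerce the
    amount to a float, defaulting to 0.0. Keeping this free of I/O is
    what makes it unit-testable."""
    return {
        "customer_id": record["customer_id"],
        "country": (record.get("country") or "").strip().upper(),
        "amount": float(record.get("amount") or 0.0),
    }

def transform(records):
    """Apply the rule to a batch; in PySpark the same function could be
    mapped over an RDD or wrapped in a UDF."""
    return [normalise_record(r) for r in records]

# A tiny unit test, runnable with any test runner or plain assert:
sample = [{"customer_id": 7, "country": " in ", "amount": "12.5"}]
assert transform(sample) == [
    {"customer_id": 7, "country": "IN", "amount": 12.5}
]
```

Legacy Informatica mappings that mix extraction, rules, and loading in one flow are typically split along exactly this seam during migration.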

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

You will be joining a company that values innovation and maintains an open, friendly culture while benefiting from the support of a well-established parent company with a strong ethical reputation. The company is dedicated to guiding customers towards the future by leveraging the potential of their data and applications to address digital challenges, ultimately delivering positive outcomes for both business and society. As an Infor M3 Support professional, you will play a crucial role in providing technical and functional support for the Infor M3 Cloud platform. Your responsibilities will require expertise in M3 integrations, data engineering, analytics, and cloud technologies. You will be primarily involved in L2/L3 technical support, minor enhancements, API management, and data pipeline optimizations to ensure smooth business operations across functions such as Manufacturing, Supply Chain, Procurement, Sales, and Finance within the Food & Beverage industry. Key requirements: Proficiency in API services and integration management, including platforms such as Azure, AWS, Kafka, and EDI. Ability to design and maintain data pipelines using tools like Azure Synapse, Databricks, or AWS Glue. Experience supporting the personalization of M3 UI elements such as Homepages, Smart Office, Enterprise Search, and XtendM3. If you have a minimum of 8 years of experience and can join immediately, this role based in PAN India (work from office) could be the next step in your career.

Posted 1 month ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

New Delhi, Pune

Hybrid

Role & responsibilities. Experience: 10-14 years. Key skills: Azure Data Architect, SQL, data modeling, dimensional data modeling, Databricks or Synapse.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Ahmedabad

Work from Office

Greetings from Dev Information Technology Ltd!

Company details: We are trusted as one of the leading IT-enabled services providers, with a remarkable track record of consistently delivering workable and robust solutions. This is possible because we adopt continual innovation, remain committed to quality, implement and refine processes, and leverage technological prowess. With the best software and hardware environments coupled with state-of-the-art communication facilities, our offices are fully equipped to work as virtual extensions of clients' environments, providing 24x7 services. Founded in 1997 in Ahmedabad, India, one of the fastest-growing metros of India. Branch offices in India, the USA, and Canada. Multi-million US$ turnover with a CAGR of 20%. 1000+ certified and skilled professionals serving more than 300 clients globally. Offering end-to-end solutions to meet the IT and ICT needs of clients. Website: http://www.devitpl.com/

Profile summary. Designation: Project Lead (Data). Experience: 5+ years. Work location: Ahmedabad.

KEY RESPONSIBILITIES: Translate business needs to technical specifications. Design, build, and deploy BI solutions (e.g. reporting tools). Maintain and support data analytics platforms. Collaborate with teams to integrate systems. Develop and execute database queries and conduct analyses. Create visualizations and reports for requested projects. Develop and update technical documentation.

SKILLS AND EXPERIENCE: 5+ years of experience in designing and implementing reports/dashboards, ETL, and warehouses. 3+ years of direct management/supervisory experience. In-depth understanding of data warehousing and database concepts. Understands the lifecycle for report development work. In-depth understanding of BI fundamentals. Experience in Microsoft SQL Server, SSIS, SSRS, Azure Data Factory, and Azure Synapse. Expertise in Power BI. Define all aspects of software development, from appropriate technology and workflow to coding standards. Communicate all concepts and guidelines successfully to the development team. Provide technical guidance and coaching to the reporting team. Oversee progress of report/dashboard development to ensure consistency with DW/RDBMS design. Engage with stakeholders to identify business KPIs, with the correct tools/mechanisms to record them, and present actionable insights through reports and dashboards. Proven analytical and problem-solving abilities. Excellent interpersonal and written communication skills.

QUALIFICATIONS AND CERTIFICATIONS: BE/MCA/B.Tech/M.Tech.

Perks & benefits: Health insurance, employee rewards and recognition, flexible working hours, gratuity, professional development, food coupons, and comprehensive leave benefits.

Posted 1 month ago

Apply

5.0 - 8.0 years

2 - 7 Lacs

Ahmedabad

Work from Office

KEY RESPONSIBILITIES: Translate business needs to technical specifications. Design, build, and deploy BI solutions (e.g. reporting tools). Maintain and support data analytics platforms (e.g. MicroStrategy). Create tools to store data (e.g. OLAP cubes). Collaborate with teams to integrate systems. Develop and execute database queries and conduct analyses. Create visualizations and reports for requested projects. Design, implement, and maintain databases to ensure optimal data storage. Work closely with stakeholders to understand business requirements and translate them into technical solutions. MANDATORY SKILLS: Proficiency in Power BI, Azure Data Factory, Azure Synapse, and SSIS. Solid understanding of data warehousing. Experience with database design and management (SQL). OPTIONAL SKILLS: Azure Databricks, AWS Glue, SSAS, Azure Analysis Services.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

About the role. Project role: Application Lead. Project role description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Microsoft Azure Analytics Services. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational qualification: BE.

Summary: As an Application Lead for Packaged Application Development, you will be responsible for designing, building, and configuring applications using Microsoft Azure Analytics Services. Your typical day will involve leading the effort to deliver high-quality applications, acting as the primary point of contact for the project team, and ensuring timely delivery of project milestones.

Roles & responsibilities: Lead the effort to design, build, and configure applications using Microsoft Azure Analytics Services. Act as the primary point of contact for the project team, ensuring timely delivery of project milestones. Collaborate with cross-functional teams to ensure the successful delivery of high-quality applications. Provide technical guidance and mentorship to team members, ensuring adherence to best practices and standards.

Professional & technical skills. Must-have skills: Strong experience with Microsoft Azure Analytics Services; Databricks and PySpark skills. Good-to-have skills: Experience with other Azure services such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics. Experience in designing, building, and configuring applications using Microsoft Azure Analytics Services. Strong understanding of data warehousing concepts and best practices. Experience with ETL processes and tools such as SSIS or Azure Data Factory. Experience with SQL and NoSQL databases. Experience with Agile development methodologies.

Additional information: The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality applications. This position is based at our Bengaluru office. Qualification: BE.

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 27 Lacs

Pune

Hybrid

Notice period: immediate joiners. Responsibilities: Lead, develop, and support analytical pipelines to acquire, ingest, and process data from multiple sources. Debug, profile, and optimize integrations and ETL/ELT processes. Design and build data models that conform to our data architecture. Collaborate with various teams to deliver effective, high-value reporting solutions by leveraging an established DataOps delivery methodology. Continually recommend and implement process improvements and tools for data collection, analysis, and visualization. Address production support issues promptly, keeping stakeholders informed of status and resolutions. Partner closely with onshore and offshore technical resources. Provide on-call support outside normal business hours as needed. Provide status updates to stakeholders; identify obstacles and seek assistance with enough lead time to ensure on-time delivery. Demonstrate technical ability, thoroughness, and accuracy in all assignments. Document and communicate proper operations, standards, policies, and procedures. Keep abreast of new tools and technologies related to our enterprise data architecture. Foster a positive work environment by promoting teamwork and open communication.

Skills/qualifications: Bachelor's degree in computer science, preferably with a focus on data engineering. 6+ years of experience in data warehouse development, building and managing data pipelines in cloud computing environments. Strong proficiency in SQL and Python. Experience with Azure cloud services, including Azure Data Lake Storage, Data Factory, and Databricks. Expertise in Snowflake or similar cloud warehousing technologies. Experience with GitHub, including GitHub Actions. Familiarity with data visualization tools such as Power BI or Spotfire. Excellent written and verbal communication skills. Strong team player with the interpersonal skills to interact at all levels. Ability to translate technical information for both technical and non-technical audiences. Proactive mindset with a sense of urgency and initiative. Adaptability to changing priorities and needs.

If you are interested, share your updated resume at recruit5@focusonit.com. Please also share this message across your network.

Posted 1 month ago

Apply

6.0 - 10.0 years

17 - 32 Lacs

Ghaziabad, Hyderabad, Delhi / NCR

Hybrid

Job role: Azure Data Engineer. Location: Hyderabad. Experience: 5 to 10 years.

Skills required: Azure products and services (Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, etc.), PySpark, SQL, Python.

Job responsibilities: Work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights for better decisions. Implement data migration and data engineering solutions using Azure products and services (Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, etc.) and traditional data warehouse tools. Perform multiple aspects of the development lifecycle: design, cloud engineering (infrastructure, network, security, and administration), ingestion, preparation, data modeling, testing, CI/CD pipelines, performance tuning, deployments, consumption, BI, alerting, and production support. Provide technical leadership, and collaborate within a team environment as well as work independently. Be part of a DevOps team that completely owns and supports its product. Implement batch and streaming data pipelines using cloud technologies. Lead development of coding standards, best practices, and privacy and security guidelines. Mentor others on technical and domain skills to create multi-functional teams.

Minimum qualifications: 1. Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering or a related technical discipline, or equivalent experience/training. 2. 3 years of data engineering experience using SQL. 3. 2 years of cloud development (Microsoft Azure preferred), including Azure Event Hub, Azure Data Factory, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Power Apps, and Power BI. 4. A combination of development, administration, and support experience in several of the following tools/platforms: a. Scripting: Python, PySpark, Unix, SQL. b. Data platforms: Teradata, SQL Server. c. Azure Data Explorer (administration skills are a plus). d. Azure cloud technologies.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Microsoft Azure Engineer in Bangalore (Hybrid) with 5+ years of experience, you will be responsible for building and optimizing cloud solutions on Microsoft Azure. Your expertise in Azure Synapse, Azure Data Factory, and related cloud technologies will be crucial in ensuring scalability, security, and automation. Your key responsibilities will include: Cloud Data Engineering & Processing: - Designing and optimizing ETL/ELT pipelines using Azure Synapse and Data Factory. - Developing and managing data pipelines, data lakes, and workflows within the Azure ecosystem. - Implementing data security, governance, and compliance best practices. Backend & Application Development: - Developing scalable cloud applications using Azure Functions, Service Bus, and Event Grid. - Building RESTful APIs and microservices for cloud-based data processing. - Integrating Azure services to enhance data accessibility and processing. Cloud & DevOps: - Deploying and managing solutions using Azure DevOps, CI/CD, and Infrastructure as Code (Terraform, Bicep). - Optimizing cloud costs and ensuring high availability of data platforms. - Implementing logging, monitoring, and security best practices. Required Skills & Experience: - 5+ years of experience in Azure cloud engineering and development. - Strong expertise in Azure Synapse, Data Factory, and Microsoft Fabric. - Proficiency in CI/CD, Azure DevOps, and related tools. - Experience with Infrastructure as Code (Terraform, Bicep). - Hands-on knowledge of Azure Functions, Service Bus, Event Grid, and API development. - Familiarity with SQL, T-SQL, Cosmos DB, and relational databases. - Strong experience in data security and compliance. Preferred Skills (Good to Have): - Knowledge of Databricks, Python, and ML models for data processing. - Familiarity with event-driven architectures (Kafka, Event Hubs). - Azure certifications (e.g., DP-203, AZ-204). 
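Event-driven designs built on Service Bus or Event Grid, as listed above, generally assume at-least-once delivery, so consumers need to be idempotent. A minimal, library-free sketch of that pattern (the event shape and handler are stand-ins, and a production version would persist the processed-id set durably):

```python
class IdempotentConsumer:
    """Processes each event id at most once, the usual guard against
    duplicate deliveries from at-least-once queues."""

    def __init__(self, handler):
        self.handler = handler
        self.processed = set()  # in production: a durable store

    def on_event(self, event):
        if event["id"] in self.processed:
            return False          # duplicate delivery, skipped
        self.handler(event["body"])
        self.processed.add(event["id"])
        return True

seen = []
consumer = IdempotentConsumer(seen.append)
consumer.on_event({"id": "e1", "body": "order-created"})
consumer.on_event({"id": "e1", "body": "order-created"})  # redelivery, ignored
```

Service Bus also offers broker-side duplicate detection, but a consumer-side guard like this stays correct even when events arrive through more than one channel.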
Apply now if you are ready to leverage your expertise in Microsoft Azure to contribute to building robust cloud solutions and optimizing data processing workflows.

Posted 1 month ago

Apply

8.0 - 13.0 years

0 Lacs

Hyderabad, Telangana

On-site

At Techwave, we are committed to fostering a culture of growth and inclusivity. We ensure that every individual associated with our brand is challenged at every step and provided with the necessary opportunities to excel in their professional and personal lives. People are at the core of everything we do. Techwave is a leading global IT and engineering services and solutions company dedicated to revolutionizing digital transformations. Our mission is to enable clients to maximize their potential and achieve a greater market share through a wide array of technology services, including Enterprise Resource Planning, Application Development, Analytics, Digital solutions, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to help businesses accelerate their growth. We are a team of dreamers and doers who constantly push the boundaries of what's possible, and we want YOU to be a part of it.

Job title: Data Lead. Experience: 10+ years. Mode of hire: full-time.

As a senior-level ETL developer with 10-13 years of experience, you will be responsible for building relational and data warehousing applications. Your primary role will involve supporting the existing EDW, designing and developing the various layers of our data, and testing, documenting, and optimizing the ETL process. You will collaborate within a team environment to design and develop frameworks and services according to specifications. Your responsibilities will also include preparing detailed system documentation, performing unit and system tests, coordinating with Operations staff on application deployment, and ensuring that all activities are performed to quality and compliance standards. Additionally, you will design and implement ETL batches that meet SLAs; develop data collection, staging, movement, quality, and archiving strategies; and design automation processes to control data access and movement.

To excel in this role, you must have 8-10 years of ETL/ELT experience, strong SQL skills, and proficiency in stored procedures and database development. Experience in Azure Data Lake, Synapse, Azure Data Factory, and Databricks, as well as Snowflake, is essential. You should possess a good understanding of data warehouse ETL and ELT design best practices, be able to work independently, and have strong database experience with DB2, SQL Server, and Azure. Furthermore, you should be adept at designing relational and dimensional data models, have a good grasp of enterprise reporting (particularly Power BI), and understand Agile practices and methodologies. Your role will also involve assisting in analyzing and extracting relevant information from historical business data to support Business Intelligence initiatives, conducting proofs of concept for new technology selection, and proposing data warehouse architecture enhancements.

If you are a self-starter with the required skills and experience, we invite you to join our dynamic team at Techwave and be a part of our journey towards innovation and excellence.

Posted 1 month ago

Apply

5.0 - 10.0 years

32 - 37 Lacs

Pune

Work from Office

Job title: AFC Transaction Monitoring - Senior Engineer, VP. Location: Pune, India.

Role description: You will be joining the Anti-Financial Crime (AFC) Technology team and will work as part of a multi-skilled agile squad, specializing in designing, developing, and testing engineering solutions, as well as troubleshooting and resolving technical issues, to enable the Transaction Monitoring (TM) systems to identify money laundering or terrorism financing. You will have the opportunity to work on challenging problems with large, complex datasets and play a crucial role in managing and optimizing the data flows within Transaction Monitoring. You will work across Cloud and Big Data technologies, optimizing the performance of existing data pipelines as well as designing and creating new ETL frameworks and solutions, building high-performance systems to process large volumes of data using the latest technologies.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team, bringing an innovative approach to software development and focusing on the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities: As a Vice President, your role will include management and leadership responsibilities, such as: leading by example by creating efficient ETL workflows to extract data from multiple sources, transform it according to business requirements, and load it into the TM systems; implementing data validation and cleansing techniques to maintain high data quality, and detective controls to ensure the integrity and completeness of data being prepared through our data pipelines; working closely with other developers and architects to design and implement solutions that meet business needs while ensuring that solutions are scalable, supportable, and sustainable; ensuring that all engineering work complies with industry and DB standards, regulations, and best practices.

Your skills and experience: Good analytical problem-solving capabilities, with excellent written and oral communication skills enabling the authoring of documents that support a technical team in performing development work. Experience in Google Cloud Platform is preferred, but other cloud solutions such as AWS would be considered. 5+ years of experience in Oracle, Control-M, Linux, and Agile methodology, and prior experience working in an environment using internally engineered components (database, operating system, etc.). 5+ years of experience in Hadoop, Hive, Oracle, Control-M, and Java development is required, while experience in OpenShift and PySpark is preferred. Strong understanding of designing and delivering complex ETL pipelines in a regulatory space.

How we'll support you
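The detective controls for integrity and completeness mentioned above are often implemented as reconciliation and mandatory-field checks between pipeline stages. A simplified sketch of both controls; the field names and tolerance are assumptions, not the bank's actual rules:

```python
def completeness_check(source_count, loaded_count, tolerance=0.0):
    """Detective control: fail the batch if loaded rows fall short of
    the source count by more than the tolerated fraction."""
    if source_count == 0:
        return loaded_count == 0
    shortfall = (source_count - loaded_count) / source_count
    return shortfall <= tolerance

def validate_batch(rows, required_fields):
    """Integrity control: reject records missing mandatory fields,
    returning (valid, rejected) so rejects can be quarantined."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(f) not in (None, "") for f in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

batch = [
    {"txn_id": "T1", "amount": 100},
    {"txn_id": "", "amount": 200},   # missing mandatory key, quarantined
]
good, bad = validate_batch(batch, ["txn_id", "amount"])
passed = completeness_check(source_count=2, loaded_count=len(good), tolerance=0.5)
```

In a regulated TM context the quarantined records and the reconciliation result would both be logged as evidence, not silently dropped.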

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 5 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Job title: Azure Databricks Developer. Experience: 5+ years. Location: PAN India (remote/hybrid as per project requirement). Employment type: full-time.

Job summary: We are hiring an experienced Azure Databricks Developer to join our dynamic data engineering team. The ideal candidate will have strong expertise in building and optimizing big data solutions using Azure Databricks, Spark, and other Azure data services.

Key responsibilities: Design, develop, and maintain scalable data pipelines using Azure Databricks and Apache Spark. Integrate and manage large datasets using Azure Data Lake, Azure Data Factory, and other Azure services. Implement Delta Lake for efficient data versioning and performance optimization. Collaborate with cross-functional teams, including data scientists and BI developers. Ensure best practices for data security, governance, and compliance. Monitor performance and troubleshoot Spark clusters and data pipelines.

Skills & requirements: Minimum 5 years of experience in data engineering, with at least 2+ years in Azure Databricks. Proficiency in Apache Spark (PySpark/Scala). Strong hands-on experience with Azure services: ADF, ADLS, Synapse Analytics. Expertise in building and managing ETL/ELT pipelines. Strong SQL skills and experience with performance tuning. Experience with CI/CD pipelines and Azure DevOps is a plus. Good understanding of data modeling, partitioning, and data lake architecture.
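The partitioning and data lake architecture this listing mentions pay off because query engines can prune partitions they do not need, which only works if data lands in a predictable folder layout. A small sketch of the year=/month=/day= convention used by Spark and Delta tables (the paths and row shape are illustrative):

```python
from collections import defaultdict

def partition_paths(rows, base="/lake/sales"):
    """Group rows into date-partitioned folders, mirroring the
    year=/month=/day= layout Spark writes with partitionBy."""
    parts = defaultdict(list)
    for r in rows:
        y, m, d = r["date"].split("-")
        parts[f"{base}/year={y}/month={m}/day={d}"].append(r)
    return dict(parts)

rows = [
    {"date": "2024-01-01", "amount": 10},
    {"date": "2024-01-01", "amount": 20},
    {"date": "2024-01-02", "amount": 30},
]
layout = partition_paths(rows)
# A query filtered to one day now only has to read one of the two folders.
```

Choosing the partition column is the data modeling decision: it should match the most common filter predicate, and avoid high-cardinality keys that would create millions of tiny folders.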

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Gurugram

Work from Office

Capgemini Invent Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role Proficiency in MS Fabric, Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Extensive knowledge of MS Fabric components: Lakehouses, OneLake, Data Pipelines, Real-Time Analytics, Power BI Integration, and the Semantic Model. Integrate Fabric capabilities for seamless data flow, governance, and collaboration across teams. Strong understanding of Delta Lake, Parquet, and distributed data systems. Strong programming skills in Python, PySpark, Scala, or Spark SQL/T-SQL for data transformations.

Your Profile Strong experience in implementation and management of a lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL). Proficiency in data integration techniques, ETL processes, and data pipeline architectures. Understanding of machine learning algorithms and AI/ML frameworks (e.g., TensorFlow, PyTorch) and Power BI is an added advantage. MS Fabric and PySpark are a must.

What you will love about working here We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.

About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society.
It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
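
Several of these listings ask for an understanding of partitioning and data lake architecture. A small illustrative helper (the storage account and table names below are made up) shows the Hive-style `column=value` path layout that ADLS and Delta/Parquet tables commonly use, which lets query engines prune partitions by reading only matching folders:

```python
def partition_path(base, table, **partitions):
    """Build a Hive-style partition path, e.g. .../sales/year=2024/month=5.
    Keyword-argument order (Python 3.7+) fixes the partition hierarchy."""
    parts = "/".join(f"{k}={v}" for k, v in partitions.items())
    return f"{base}/{table}/{parts}"

# Hypothetical ADLS Gen2 location; only the path convention matters here.
path = partition_path(
    "abfss://lake@myaccount.dfs.core.windows.net", "sales", year=2024, month=5
)
```

A query filtering on `year = 2024 AND month = 5` can then skip every other folder entirely, which is why partition columns are chosen to match the most common filter predicates.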

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 18 Lacs

Noida, Pune, Chennai

Work from Office

We seek an experienced Azure Data Engineer to implement robust data pipelines using Azure Data Factory, Azure Synapse Analytics, and Databricks, and to manage data storage solutions using Azure Data Lake, Azure SQL Database, and Cosmos DB.

Posted 1 month ago

Apply

10.0 - 15.0 years

1 - 2 Lacs

Hyderabad

Work from Office

Experience needed: 10-15 years Type: Full-Time Mode: WFO Shift: General Shift IST Location: Hyderabad NP: Immediate Joinee - 30 days

Job Summary: We are seeking a highly experienced and results-driven Power BI Architect to lead the design, development, and implementation of enterprise-level BI solutions. The ideal candidate will have deep expertise in Power BI architecture, data modeling, visualization, DAX, and Power BI/Fabric administration, along with a solid foundation in Microsoft Azure and Microsoft Entra. You will work closely with data engineers, analysts, and stakeholders to build a scalable and secure data visualization ecosystem.

Key Responsibilities: Design end-to-end Power BI architecture, including data ingestion, modeling, visualization, and governance. Lead implementation of dimensional data models to support enterprise reporting and analytics needs. Develop and optimize Power BI reports and dashboards using DAX, M Language (Power Query), and advanced visualizations. Architect and manage the Power BI Service environment, including workspaces, datasets, dataflows, gateways, and security. Define and implement Power BI SDLC processes, including versioning, deployment pipelines, and documentation. Manage Power BI/Fabric administration tasks, including tenant settings, capacity management, and usage monitoring. Ensure best practices in report performance tuning, data refresh optimization, and data security. Collaborate with Azure teams to integrate Power BI solutions with Microsoft Azure services (Data Lake, Synapse, Data Factory, etc.). Implement Microsoft Entra (Azure AD) role-based access controls and security for BI content. Provide thought leadership and mentorship to BI developers and analysts. Stay current on Microsoft's data and analytics roadmap and assess its applicability to ongoing projects.

Required Skills & Qualifications: Strong experience with Power BI Desktop, Power BI Service, and Power BI Premium/Fabric. Expertise in DAX and Power Query (M Language). Proven experience with dimensional modeling and data warehousing concepts. Proficient in ETL processes and integrating data from multiple sources. Demonstrated success in leading enterprise BI implementations. Solid understanding of and experience with Power BI governance, security, and lifecycle management. Experience with the Microsoft Azure platform, especially Azure Data Services. Knowledge of Microsoft Entra (Azure AD) for authentication and access management. Strong communication and stakeholder management skills.

Preferred Qualifications: Microsoft Certified: Power BI Data Analyst Associate or Azure Data Engineer Associate. Familiarity with DevOps and CI/CD pipelines for Power BI deployments. Experience working in Agile/Scrum environments.
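
The dimensional modeling this role centres on pairs a fact table with dimension tables via key lookups — the join a Power BI relationship performs at query time in a star schema. A plain-Python sketch with invented sample columns (not taken from the posting) illustrates the mechanics:

```python
def star_join(fact_rows, dim_rows, fk, dim_key):
    """Resolve each fact row's foreign key against a dimension table,
    attaching the dimension's descriptive attributes to the fact —
    the lookup a star-schema relationship performs."""
    dim = {d[dim_key]: d for d in dim_rows}
    return [{**f, **dim[f[fk]]} for f in fact_rows]

# Hypothetical sales fact joined to a date dimension on date_key.
enriched = star_join(
    [{"sale_id": 1, "date_key": 20240105, "amount": 10.0}],
    [{"date_key": 20240105, "month": "Jan", "year": 2024}],
    fk="date_key",
    dim_key="date_key",
)
```

Keeping descriptive attributes in narrow dimensions and measures in the fact table is what lets the BI engine slice a single fact by many dimensions without duplicating data.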

Posted 1 month ago

Apply

12.0 - 18.0 years

35 - 45 Lacs

Pune, Chennai

Work from Office

Job Summary We are looking for an experienced Azure Data Architect who will lead the architecture, design, and implementation of enterprise data solutions on Microsoft Azure. The ideal candidate should have strong experience in cloud data architecture, big data solutions, and modern data platforms, with a focus on scalability, security, and performance.

Key Responsibilities Architect, design, and implement end-to-end data solutions on Azure, including Azure Data Factory, Synapse Analytics, Azure SQL, Data Lake Storage, and more. Define data architecture strategy, governance models, and frameworks for enterprise-grade solutions. Lead data modernization and migration projects from on-premises systems to Azure Cloud. Design data integration and transformation pipelines for batch and real-time data. Collaborate with stakeholders, data engineers, data scientists, and business analysts to align architecture with business goals. Ensure data solutions follow security, compliance, and regulatory standards. Conduct architecture reviews, performance tuning, and troubleshooting of cloud data environments. Mentor and guide data engineering and BI teams on Azure best practices.

Required Skills & Experience 12+ years of overall IT experience, with at least 5 years as a Data Architect on Azure. Hands-on expertise with Azure Data Factory, Synapse Analytics, Azure SQL, Data Lake Gen2, Azure Databricks, Power BI, Azure Analysis Services, and CI/CD pipelines for data (Azure DevOps). Strong experience with data modeling, ETL/ELT design, and data warehousing concepts. Experience in building data lakes, data vaults, and lakehouse architectures. Proficiency in SQL, Python, PySpark, and other scripting languages for data workflows. Good knowledge of data security, encryption, RBAC, and compliance standards. Experience with Kafka, Event Hubs, or other streaming technologies is a plus. Strong understanding of Agile delivery, DevOps, and automation in data projects.

Please note: We are strictly considering candidates with shorter notice periods only.
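
The incremental ETL/ELT design mentioned in these roles is often implemented with a high-water-mark pattern: extract only rows modified since the last run, then advance the watermark. A minimal sketch, assuming an ISO-8601 `modified_at` column (the names are invented for illustration):

```python
def incremental_extract(rows, watermark, ts_field="modified_at"):
    """Return the rows changed since `watermark` plus the new watermark.
    ISO-8601 timestamps compare correctly as strings, so no parsing
    is needed for this sketch."""
    changed = [r for r in rows if r[ts_field] > watermark]
    new_watermark = max((r[ts_field] for r in changed), default=watermark)
    return changed, new_watermark

source = [
    {"id": 1, "modified_at": "2024-01-01T09:00:00"},
    {"id": 2, "modified_at": "2024-01-03T14:30:00"},
]
changed, wm = incremental_extract(source, "2024-01-02T00:00:00")
```

Azure Data Factory's incremental-copy patterns follow the same shape: the watermark is persisted between runs (e.g. in a control table) and the source query filters on it, so each run moves only the delta.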

Posted 1 month ago

Apply

5.0 - 10.0 years

16 - 30 Lacs

Bengaluru

Work from Office

About Client Hiring for One of the Most Prestigious Multinational Corporations Job Title: Azure Databricks Qualification: Any Graduate or Above Experience: 5 to 15 Yrs Location: Bangalore

Key Responsibilities: Design and build scalable and robust data pipelines using Azure Databricks, PySpark, and Spark SQL. Integrate data from various structured and unstructured data sources using Azure Data Factory, ADLS, Azure Synapse, etc. Develop and maintain ETL/ELT processes for ingestion, transformation, and storage of data. Collaborate with data scientists, analysts, and other engineers to deliver data products and solutions. Monitor, troubleshoot, and optimize existing pipelines for performance and reliability. Ensure data quality, governance, and security compliance in all solutions. Participate in architectural decisions and cloud data solutioning.

Required Skills: 5+ years of experience in data engineering or related fields. Strong hands-on experience with Azure Databricks and Apache Spark. Proficiency in Python (PySpark), SQL, and performance tuning techniques. Experience with Azure Data Factory, Azure Data Lake Storage (ADLS), and Azure Synapse Analytics. Solid understanding of data modeling, data warehousing, and data lakes. Familiarity with DevOps practices, CI/CD pipelines, and version control (e.g., Git).

Notice period: immediate or serving notice Mode of Work: Hybrid Mode of Interview: Face to Face

Thanks & Regards Hasan Black and White Business Solutions Pvt. Ltd. Bengaluru, Karnataka, India. Number: 8067432495 hasan.s@blackwhite.in | www.blackwhite.in

Posted 1 month ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description: Exp: 6-12 yrs Location: PAN India Skill: Azure Data Factory/SSIS

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline: Full Name as per PAN: Mobile No: Alt No/WhatsApp No: Total Exp: Relevant Exp in Data Factory: Rel Exp in Synapse: Rel Exp in SSIS: Rel Exp in Python/PySpark: Current CTC: Expected CTC: Notice Period (Official): Notice Period (Negotiable)/Reason: Date of Birth: PAN Number: Reason for Job Change: Offer in Pipeline (Current Status): Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time): Current Res Location: Preferred Job Location: Whether educational % in 10th std, 12th std, UG is all above 50%? Do you have any gaps in between your education or career? If having a gap, please mention the duration in months/years:

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

About Client Hiring for One of the Most Prestigious Multinational Corporations Job Title: Azure Databricks Qualification: Any Graduate or Above Experience: 5 to 15 Yrs Location: Bangalore

Key Responsibilities: Design and build scalable and robust data pipelines using Azure Databricks, PySpark, and Spark SQL. Integrate data from various structured and unstructured data sources using Azure Data Factory, ADLS, Azure Synapse, etc. Develop and maintain ETL/ELT processes for ingestion, transformation, and storage of data. Collaborate with data scientists, analysts, and other engineers to deliver data products and solutions. Monitor, troubleshoot, and optimize existing pipelines for performance and reliability. Ensure data quality, governance, and security compliance in all solutions. Participate in architectural decisions and cloud data solutioning.

Required Skills: 5+ years of experience in data engineering or related fields. Strong hands-on experience with Azure Databricks and Apache Spark. Proficiency in Python (PySpark), SQL, and performance tuning techniques. Experience with Azure Data Factory, Azure Data Lake Storage (ADLS), and Azure Synapse Analytics. Solid understanding of data modeling, data warehousing, and data lakes. Familiarity with DevOps practices, CI/CD pipelines, and version control (e.g., Git).

Notice period: immediate or serving notice Mode of Work: Hybrid Mode of Interview: Face to Face

Thanks & Regards Bhavana B Black and White Business Solutions Pvt. Ltd. Bangalore, Karnataka, India. Direct Number: 8067432454 bhavana.b@blackwhite.in | www.blackwhite.in

Posted 1 month ago

Apply

5.0 - 10.0 years

0 - 1 Lacs

Hyderabad, Pune, Ahmedabad

Hybrid

Contractual (Project-Based) Notice Period: Immediate - 15 Days Fill this form: https://forms.office.com/Pages/ResponsePage.aspx?id=hLjynUM4c0C8vhY4bzh6ZJ5WkWrYFoFOu2ZF3Vr0DXVUQlpCTURUVlJNS0c1VUlPNEI3UVlZUFZMMC4u Resume: shweta.soni@panthsoftech.com

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Chennai

Work from Office

Warm Greetings from SP Staffing Services Pvt Ltd!!

Experience: 8-15 yrs Work Location: Chennai

Job Description: Required Technical Skill Set: Azure native technology, Synapse and Databricks, Python Desired Experience Range: 8+ Years Location of Requirement: Chennai

Required Skills: Previous experience as a data engineer or in a similar role. Must have experience with MS Azure services such as Data Lake Storage, Data Factory, Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Functions. Technical expertise with data models, data mining, analytics, and segmentation techniques. Knowledge of programming languages and environments such as Python, Java, Scala, R, and .NET/C#. Hands-on experience with SQL database design. Great numerical and analytical skills. Degree in Computer Science, IT, or a similar field; a master's is a plus. Experience working in integrating Azure PaaS services.

Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.

Posted 1 month ago

Apply