
1265 Azure Databricks Jobs - Page 37

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Design, develop, and implement data solutions using Microsoft Fabric, including data pipelines, data warehouses, and data marts. Develop data pipelines, data transformations, and data workflows using Microsoft Fabric.
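As a rough illustration of this kind of Fabric work, here is a minimal PySpark sketch of a notebook step that feeds a data mart; the table names (sales_raw, sales_mart) are hypothetical placeholders, not from the listing.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Fabric notebooks provide this session

raw = spark.read.table("sales_raw")                    # lakehouse source table (assumed)
mart = (raw
        .filter(F.col("amount") > 0)                   # drop invalid rows
        .groupBy("region", "order_date")
        .agg(F.sum("amount").alias("daily_revenue")))  # aggregate to mart grain

mart.write.mode("overwrite").saveAsTable("sales_mart")  # publish to the data mart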

Posted 1 month ago

Apply

8.0 - 11.0 years

11 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Company: Tech Mahindra
Experience: 8-11 years
Location: Bangalore/Hyderabad (hybrid model)
Interview Mode: Virtual
Interview Rounds: 2-3
Notice Period: Immediate to 15 days

Responsibilities:
• Design, develop, and maintain large-scale data pipelines using Azure Data Factory (ADF) to extract, transform, and load data from various sources into Azure Databricks.
• Collaborate with cross-functional teams to gather requirements and design solutions for complex business problems.
• Develop high-quality PySpark code to process large datasets stored in Azure Data Lake Storage.
• Troubleshoot ADF pipeline failures and optimize performance for improved efficiency.

Requirements:
• 8-11 years of experience as an Azure Data Engineer or in a similar role.
• Strong expertise in Azure Data Factory (ADF), Azure Databricks, and PySpark.
• Experience on big data processing projects involving ETL with Spark-based technologies.
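For context, a minimal sketch of the Databricks step such an ADF pipeline would typically invoke; the ADLS Gen2 paths and column names are assumptions for illustration.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw orders landed in ADLS Gen2 by the ADF copy activity (path assumed)
orders = spark.read.parquet("abfss://raw@mydatalake.dfs.core.windows.net/orders/")

clean = (orders
         .dropDuplicates(["order_id"])                  # basic dedupe
         .withColumn("order_date", F.to_date("order_ts")))

# Write a curated, partitioned copy for downstream consumers
clean.write.mode("append").partitionBy("order_date").parquet(
    "abfss://curated@mydatalake.dfs.core.windows.net/orders/")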

Posted 1 month ago

Apply

4.0 - 8.0 years

5 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Hiring for Azure Data Engineer. The client is looking for immediate joiners who can start within 30 days.

Posted 1 month ago

Apply

14.0 - 20.0 years

25 - 40 Lacs

Hyderabad

Hybrid

Senior Cloud Architect
Requisition ID: 133521
Posting Start Date: 03-Jun-2025
Posting End Date: 30-Jun-2025

About the Role
Are you looking for an exciting opportunity in Solution Architecture? Are you passionate about everything Azure Cloud? Then join us as a Senior Cloud Architect.

Your main responsibilities:
• Design and deliver Azure solution architecture for application workloads.
• Design disaster recovery and backup plans based on RTO, RPO and other non-functional requirements to deliver resilient solutions on the public cloud.
• Assist engineering teams in delivering infrastructure architecture and designing cloud solutions.
• Build and maintain relationships with application teams, understanding their context and assisting them in achieving their cloud transformation roadmap.
• Engage with subject matter experts in Security, Enterprise Architecture and Governance teams to contribute to the cloud technology roadmap and ensure adherence to best practices.

About you
The following proven technical skills are required:
• Expertise in designing application workloads using cloud platform services (SaaS and PaaS).
• Expertise in technology selection based on architecture decision records and other standards within the organization.
• Expertise in Azure AI Services (Foundry, ML, OpenAI, Anomaly Detection, Bot Services, LUIS), AKS (Azure Kubernetes Service), App Services, Databricks, ADF (Azure Data Factory), ASB (Azure Service Bus), Event Hub, Key Vault, Storage Accounts, Container Registry, Azure Functions, Redis, Logic Apps, Azure Firewall, VNET (Virtual Network), Private Endpoint, Service Endpoint, SQL Server, Cosmos DB, MongoDB.
• Experience in designing IaC using Terraform on Azure DevOps.
• Expertise in designing Azure disaster recovery and backup scenarios that meet NFRs.
• Azure Well-Architected Framework or cloud design patterns.
• Experience in one or more programming languages: .NET/C#, Java, Python or Ruby.
• Experience in Azure landing zone design, platform automation design, DevSecOps tooling, network topology & connectivity, access management & privileged identity design, platform monitoring, security architecture, and high-availability architecture design.
• Experience in DevOps methodology (preferably DevSecOps), both technically and organisationally, including continuous deployment, delivery pipelines and test environments.
• Good stakeholder communication; ability to work with Product Owners, Product Managers, Architects and engineering teams.
• At ease working in a transformational and complex environment at a fast pace and getting things done.
• Proficiency in English is required.

About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking.
We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.
Reference Code: 133521
Interested candidates contact: 7207997185

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Coimbatore

Work from Office

MKS Vision Pvt Ltd

About us: MKS Vision is a full-spectrum Information Technology and engineering service provider. We exist to provide the increased efficiency and flexibility that accelerate business performance by adopting the latest cutting-edge technologies for our customers. Our services bring tangible benefits to our customers. Website: https://www.mksvision.com/

Job Location: Coimbatore

Role: Risk Data Analyst. Below are the skills we are looking for:
• Knowledge of lending data systems and data structures (credit applications, loan origination, collections, payments, dialers, credit bureau data).
• Proficient in SQL for ETL.
• Strong knowledge of Power BI.
• Experience with tools for ETL scheduling/automation.
• Proficient in data standardization and cleanup, ensuring data integrity and diagnostics.
• Knowledge of database schema design and normalization.
• Knowledge of cloud-based systems (Azure) for data warehouse building.
• Intermediate knowledge of Python for data cleanup and transformation.
• Knowledge of data documentation and exception/error handling and remediation.
• Ability to build/test/deploy APIs to ingest data from internal and external databases.
• Proficient in JSON, XML and text formats to ingest, parse, transform and load databases.
• Preferred minimum 7+ years of experience; BS degree in computer science, management information systems, statistics, data science, or similar experience.
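A small illustrative sketch of the Python JSON-ingest and data-cleanup work this role describes; the field names, file name and validation rules are assumptions, not from the listing.

import json

def standardize(record: dict) -> dict:
    """Normalize one credit-application record before loading (fields assumed)."""
    return {
        "application_id": str(record.get("appId", "")).strip(),
        "applicant_name": (record.get("name") or "").strip().title(),
        "loan_amount": float(record.get("loanAmount") or 0.0),
    }

def is_valid(row: dict) -> bool:
    # Basic integrity check; failing rows go to an exception file for remediation
    return bool(row["application_id"]) and row["loan_amount"] > 0

with open("applications.json") as fh:
    rows = [standardize(r) for r in json.load(fh)]

valid = [r for r in rows if is_valid(r)]
errors = [r for r in rows if not is_valid(r)]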

Posted 1 month ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Role & responsibilities:

Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark/Spark SQL on cloud distributions like AWS. Must have: AWS Databricks. Good to have: PySpark, Snowflake, Talend.

Requirements:
• Expertise in processing data pipelines using Databricks and Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc.
• Very proficient in large-scale data operations using Databricks and overall very comfortable with Python.
• Familiarity with AWS compute, storage and IAM concepts.
• Experience working with S3 Data Lake as the storage tier.
• Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
• Cloud warehouse experience (Snowflake, etc.) is a huge plus.
• Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources.
• Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit.

Skills:
• Hands-on experience with Databricks, Spark SQL and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
• Experience with shell scripting.
• Exceptionally strong analytical and problem-solving skills.
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
• Strong experience with relational databases and data access methods, especially SQL.
• Excellent collaboration and cross-functional leadership skills.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Working knowledge of migrating relational and dimensional databases to the AWS cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.
Note: Only immediate joiners or candidates serving notice period need apply. Interested candidates can apply.
Regards, HR Manager
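Illustrative only: a short Databricks-on-AWS snippet showing the S3 + Spark SQL pattern this listing calls for; the bucket, view and column names are assumed.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Register S3-resident parquet data as a temp view for Spark SQL (bucket assumed)
spark.read.parquet("s3://example-data-lake/events/").createOrReplaceTempView("events")

daily = spark.sql("""
    SELECT event_date, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date
    ORDER BY event_date
""")

daily.write.mode("overwrite").parquet("s3://example-data-lake/marts/daily_events/")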

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad

Work from Office

Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
• Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
• Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
• Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
• Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
• Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
• Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
• Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
• Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
• Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
• Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
• Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
• Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
• Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
• Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
• Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
• Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
• Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
• Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
• Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
• Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
• Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
• Support the development and automation of operational policies and procedures, improving efficiency and resilience.
• Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
• Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
• Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
• Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
• Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
• 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
• 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
• 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
• Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
• Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
• Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
• Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
• Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
• Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
• Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
• Understanding of operational excellence in complex, high-availability data environments.
• Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
• Basic understanding of data management concepts, including master data management, data governance, and analytics.
• Knowledge of data acquisition, data catalogs, data standards, and data management tools.
• Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
• Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
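As a hedged illustration of the data-observability work described above, a simple quality gate that could run as a Databricks job step; the table name and checks are assumptions.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("curated.orders")  # monitored table (assumed)

row_count = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()

# Fail the run loudly so the orchestrator (e.g. ADF) can alert and retry
assert row_count > 0, "orders table is empty"
assert null_ids == 0, f"{null_ids} rows have a NULL order_id"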

Posted 2 months ago

Apply

3.0 - 5.0 years

14 - 19 Lacs

Hyderabad

Work from Office

Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
• Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
• Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
• Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
• Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
• Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
• Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
• Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
• Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
• Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
• Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
• Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
• Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
• Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
• Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
• Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
• Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
• Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
• Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
• Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
• Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
• Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
• Support the development and automation of operational policies and procedures, improving efficiency and resilience.
• Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
• Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
• Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
• Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
• Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
• 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
• 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
• 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
• Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
• Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
• Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
• Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
• Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
• Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
• Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
• Understanding of operational excellence in complex, high-availability data environments.
• Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
• Basic understanding of data management concepts, including master data management, data governance, and analytics.
• Knowledge of data acquisition, data catalogs, data standards, and data management tools.
• Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
• Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
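One way to picture the "CI/CD for data pipelines, automated testing" item in this listing: a pytest unit test for a small pipeline transformation using a local Spark session. The transformation itself is a made-up example.

import pytest
from pyspark.sql import SparkSession

def dedupe_orders(df):
    """Example pipeline transformation under test (assumed)."""
    return df.dropDuplicates(["order_id"])

@pytest.fixture(scope="session")
def spark():
    # Local single-threaded session is enough for CI
    return SparkSession.builder.master("local[1]").getOrCreate()

def test_dedupe_orders(spark):
    df = spark.createDataFrame([(1, "a"), (1, "a"), (2, "b")], ["order_id", "item"])
    assert dedupe_orders(df).count() == 2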

Posted 2 months ago

Apply

5.0 - 9.0 years

14 - 19 Lacs

Hyderabad

Work from Office

Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
• Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
• Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
• Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
• Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
• Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
• Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
• Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
• Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
• Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
• Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
• Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
• Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
• Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
• Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
• Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
• Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
• Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
• Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
• Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
• Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
• Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
• Support the development and automation of operational policies and procedures, improving efficiency and resilience.
• Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
• Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
• Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
• Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
• Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
• 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
• 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
• 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
• Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
• Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
• Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
• Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
• Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
• Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
• Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
• Understanding of operational excellence in complex, high-availability data environments.
• Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
• Basic understanding of data management concepts, including master data management, data governance, and analytics.
• Knowledge of data acquisition, data catalogs, data standards, and data management tools.
• Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
• Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
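The "self-healing mechanisms" item in this listing might look like the following stdlib-only retry wrapper around a flaky pipeline step; the retry policy and the step itself are assumptions.

import time
import logging

def run_with_retries(step, attempts=3, backoff_seconds=30):
    """Run a pipeline step, retrying with linear backoff before escalating."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            logging.exception("Step failed (attempt %d/%d)", attempt, attempts)
            if attempt == attempts:
                raise  # out of retries: escalate to incident response
            time.sleep(backoff_seconds * attempt)

# Usage sketch: run_with_retries(lambda: load_orders_batch())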

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop, and nurture new solutions; a self-starter who can work in a diverse and fast-paced environment to support, maintain and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional and global level by utilizing in-depth knowledge of cloud infrastructure technologies and platform engineering experience.

Responsibilities
• Work with the application teams to prioritize new requests for functionality: new user-facing functionality (e.g., the ability to ingest IoT data, subscription-based consumption, etc.) and internal functionality (e.g., monitoring and alerting based on application performance, automated testing frameworks, etc.).
• Manage the respective support queues (e.g., Ingest, Prepare, Storage and Consume, etc.). Note: agreed-upon SLAs will be established post burn-in period.
• Manage the backlog via effective sprint planning based on feedback from the application teams.
• Mentor and coach the application teams on tools, technology and design patterns.
• Ensure that the production environment is well built and that there is a clear escalation path for production issues.
• Ensure solution architecture meets JLL's requirements including, but not limited to, those regarding cloud spend, scalability and performance.
• Develop infrastructure that is scalable, reliable, and monitored.
• Build relationships with cloud providers to take advantage of their most appropriate technology offerings.
• Collaborate with the application team leads to ensure that the application teams' needs are met through the CI/CD framework, component monitoring and stats, and incident escalation.
• Lead teams in discovery and architecture workshops; influence client architects and IT personnel.
• Guide other architects working with you in the team.
• Adapt communications and approaches to conclude technical scope discussions with various partners, resulting in common agreements.
• Deliver an optimized infrastructure services design leveraging public, private, and hybrid cloud architectures and services.
• Act as subject matter and implementation expert for the client as related to technical architecture and implementation of the proposed solution using cloud services.
• Inculcate an "infrastructure as code" mentality in the platform team overall.
• Create and maintain incident management requests to the product group/engineering group.
• Analyse complex application landscapes, anticipate potential problems and future trends, and assess potential solutions, impacts and risks to propose a cloud roadmap and solution architecture.
• Develop and implement cloud architecture solutions based on AWS/Azure/GCP when assigned to work on delivery projects.
• Analyse client requirements and propose overall application modernization, migrations and greenfield implementations.
• Experience in implementing and deploying a DevOps-based, end-to-end cloud application.

Sounds like you? To apply you need to be:

Experience & Education
• Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business, or social science.
• Worked with a cloud delivery team to provide a technical solutions and services roadmap for the customer.
• Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability and performance.

Technical Skills & Competencies
• Must have experience in application development on Azure Cloud.
• Must possess a business-level understanding of enterprise application systems to drive innovations and transformations.
• Very good understanding of cloud-native services and how they map to application requirements.
• Minimum of 3-5 years of relevant experience with API ingestion, file ingestion, batch transformation, metadata management, monitoring, pub/sub consumption, RDBMS ingestion and real-time transformation.
• Minimum of 3-5 years using the following technology or equivalent: GitHub Actions, Azure DevOps, Azure Functions, Azure Batch using Python, C#, or NodeJS, Azure APIM, Azure Event Hub, Azure Data Lake Storage (Gen 2), Azure Monitor, Azure Table Storage, Azure Databricks, Azure SQL Database, Azure Search, Azure Cosmos DB, and Azure SignalR.
• Work with the infrastructure team to deploy applications on cloud using blue-green or brownfield deployments.
• Ability to provide holistic, right-scale cloud solutions that address scalability, availability, service continuity (DR), performance and security requirements.
• Help customers by supporting scalable and highly available applications leveraging cloud services.
• Scripting and automation skills using CLI, Python, PowerShell.
• Clear understanding of IAM roles and policies and how to attach them to business entities and users.
• Deep development knowledge with respect to cloud architecture and design patterns.
• Design understanding and experience in RDBMS, NoSQL and RDS.
• Exposure to PaaS technologies and containers like Docker and Kubernetes.
• Should understand the costing of different cloud services.
• Should have experience with Azure cloud infrastructure.
• Should have experience with CI/CD tools such as Azure DevOps, GitHub, GitHub Actions.
• Understanding of application architecture and enterprise architecture is a must.

What we can do for you: You'll join an entrepreneurial, inclusive culture. One where we succeed together, across the desk and around the globe. Where like-minded people work naturally together to achieve great things. Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits and pay. Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where JLL can take you... Apply today!

Location: On-site, Bengaluru, KA
Scheduled Weekly Hours: 40

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely. For more information about how JLL processes your personal data, please view our privacy notice. For additional details please see our career site pages for each country. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us; this contact channel is only to request an accommodation. Please direct any other general recruiting inquiries to our "I want to work for JLL" page.
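As a loose sketch of the API-ingestion work in the listing above, an HTTP endpoint in the Azure Functions Python v2 programming model; the route, payload shape and downstream handling are assumptions.

import json
import azure.functions as func

app = func.FunctionApp()

@app.route(route="ingest", methods=["POST"])
def ingest(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_json()  # assumes a JSON array of records
    # In a real pipeline this would land in ADLS or Event Hub for downstream prep
    return func.HttpResponse(
        json.dumps({"received": len(payload)}),
        status_code=202,
    )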

Posted 2 months ago

Apply

10.0 - 19.0 years

16 - 31 Lacs

Kochi

Remote

10+ years of experience, with more than 2 years in Azure: Azure Databricks, Data Lakehouse, Azure Data Factory, Azure SQL, Spark, PySpark, Python. Optimizing data workflows and data solutions, automating CI/CD tooling, and data governance. Design and implement pipelines using Databricks and Spark.
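A minimal lakehouse upsert sketch of the kind this role implies, using Delta Lake's MERGE API on Databricks; the table and path names are assumptions.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incremental batch landed by an upstream pipeline (path assumed)
updates = spark.read.parquet("/mnt/raw/customers_delta_batch/")

(DeltaTable.forName(spark, "lakehouse.customers")
    .alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # refresh changed customers
    .whenNotMatchedInsertAll()   # add new customers
    .execute())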

Posted 2 months ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad

Remote

Hiring for a top MNC (long-term contract): Data Engineer - Palantir

Technical capability:
• Foundry Certified (Data Engineering)
• Foundry Certified (Foundational)
• Time-series data, equipment & sensors - O&G context and engineering
• Ontology Manager
• Pipeline Builder
• Data Lineage
• Object Explorer
• Python & Spark (PySpark) - specifically PySpark, the extension of the big data platform Spark that Foundry uses
• SQL
• Mesa (Palantir proprietary language)

Experience: 5+ years

Soft skills:
• Strong communication skills (focus on O&G engineering); ability to engage with multiple Product Managers.
• Ability to work independently and be a voice of authority.

Interested candidates can share their resume: tejasri.m@i-q.co
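For orientation, a skeletal Foundry Python transform using the transforms.api decorators the role references; the dataset paths and the filter threshold are hypothetical.

from transforms.api import transform_df, Input, Output
from pyspark.sql import functions as F

@transform_df(
    Output("/Company/clean/sensor_readings"),   # output dataset path (assumed)
    source=Input("/Company/raw/sensor_readings"),  # input dataset path (assumed)
)
def compute(source):
    # Keep in-range time-series sensor readings only; threshold is illustrative
    return source.filter(F.col("pressure_psi").between(0, 5000))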

Posted 2 months ago

Apply

2.0 - 8.0 years

19 - 23 Lacs

Hyderabad

Work from Office

In this role you will be joining the Enterprise Data Solutions team within the Digital & Information Technology organization. Driven by a passion for operational excellence and innovation, we partner with all functional groups to provide the expertise and technologies which will enable the company to digitalize, simplify, and scale for the future. We are seeking an experienced Sr. Data Engineer to join our Enterprise Data Solutions team. The ideal candidate will have a strong background in data engineering, data analysis, business intelligence, and data management. This role will be responsible for the ingestion, processing, and storage of data in our Azure Databricks Data Lake and SQL Server data warehouses.

OVERVIEW: The Enterprise Data Solutions team provides Skillsoft with the data backbone needed to seamlessly connect systems and enable data-driven business insights through democratized and analytics-ready data sets. Our mission is to deliver analytics-ready data sets that enhance business insights, drive decision making, and foster a culture of data-driven innovation, and to set a gold standard for process, collaboration, and communication.

OPPORTUNITY HIGHLIGHTS:
• Lead the identification of business data requirements, create data models, design processes that align to the business logic, and regularly communicate with business stakeholders to ensure delivery meets business needs.
• Design ETL processes, develop source-to-target mappings/integration workflows, and manage load processes to support regular and ad hoc activities, considering the needs of downstream systems, functions and visualizations.
• Work with the latest open-source tools, libraries, platforms and languages to build data products enabling other analysts to explore and interact with large and complex data sets.
• Build robust systems and reusable code modules to solve problems across the team and organization, with an eye on the long-term maintenance and support of the application.
• Perform routine testing of your own and others' work to guarantee accurate, complete processes that support business needs.
• Maintain awareness of and compliance with all organizational development standards, industry best practices, and business, security, privacy, and retention requirements.
• Routinely monitor performance, and diagnose and implement tuning/optimization strategies to guarantee a highly efficient data structure.
• Collaborate with other engineers through active participation in code reviews and challenge the team to deliver with precision, consistency and speed.
• Document data flows and technical designs to ensure compliance with organization, business and security best practices.
• Regularly monitor timelines and workload; ensure delivery promises are met or exceeded.
• Support the BI mission through learning new technologies and supporting other projects as needed.
• Provide code reviews and technical guidance to the team.
• Collaborate closely with the SA and TPO to gather requirements and develop enterprise solutions.

SKILLS & QUALIFICATIONS:
• Bachelor's degree in a quantitative field: engineering, finance, data science, statistics, economics, or another quantitative discipline.
• 5+ years of experience in the Data Engineering/Data Management space, working with enterprise-level production data warehouses.
• 5+ years of experience working with Azure Databricks.
• 5+ years of experience in SQL and PySpark.
• Ability to work in an Agile methodology environment.
• Experience and interest in cloud migration/journey to the cloud for data platforms and landscapes.
• Strong business acumen, analytical skills, and technical abilities.
• Practical problem-solving skills and the ability to move complex projects forward.
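A brief sketch of how a source-to-target mapping (STTM) of the kind mentioned above might be applied in PySpark; the mapping and table names are invented for illustration.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

STTM = {  # source column -> target column (assumed mapping)
    "cust_nm": "customer_name",
    "acct_no": "account_number",
    "bal_amt": "balance",
}

src = spark.read.table("staging.accounts")
tgt = src.select([F.col(s).alias(t) for s, t in STTM.items()])
tgt.write.mode("overwrite").saveAsTable("warehouse.dim_account")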

Posted 2 months ago

Apply

2.0 - 5.0 years

6 - 16 Lacs

Kolkata, Hyderabad, Pune

Hybrid

Role: Azure Databricks Developer. Skills: Databricks, Azure Databricks, Azure Databricks engineering. Experience: 2+ years. Location: PAN India.

Posted 2 months ago

Apply

2.0 - 5.0 years

6 - 16 Lacs

Kolkata, Pune, Bengaluru

Hybrid

Role: Azure Databricks Developer. Skills: Databricks, Azure Databricks, Azure Databricks engineering. Experience: 2+ years. Location: PAN India.

Posted 2 months ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Chennai

Hybrid

Join our Mega Tech Recruitment Drive at TaskUs Chennai, where bold ideas, real impact, and ridiculous innovation come together.

Who are we hiring for? Developers, Senior Developers, Leads, Architects, and more.
When is it happening? 24th June 2025, 9 AM to 4 PM IST.

Which skills are we hiring for?
• .NET Full Stack: AWS/Azure + Angular/React/Vue.js
• Oracle Fusion: Functional Finance (AP, AR, GL, CM and Tax)
• Senior Data Engineer: Tableau dashboards / QlikView / Power BI, Azure Databricks, PySpark, Databricks SQL, JupyterHub/PyCharm
• SQL Server Database Administrator: SQL Server admin (both cloud and on-prem)
• Workday Integration Developer: Workday integration tools (Studio, EIB), Workday Matrix, XML, XSLT
• Workday Configuration Lead Developer: Workday configuration tools (Studio, EIB), Workday Matrix, XML, XSLT, xPath, Simple, Matrix, Composite, Advanced

About TaskUs: TaskUs is a provider of outsourced digital services and next-generation customer experience to fast-growing technology companies, helping its clients represent, protect and grow their brands. Leveraging a cloud-based infrastructure, TaskUs serves clients in the fastest-growing sectors, including social media, e-commerce, gaming, streaming media, food delivery, ride-sharing, HiTech, FinTech, and HealthTech. The People First culture at TaskUs has enabled the company to expand its workforce to approximately 45,000 employees globally. Presently, we have a presence in twenty-three locations across twelve countries, which include the Philippines, India, and the United States.

What We Offer: At TaskUs, we prioritize our employees' well-being by offering competitive industry salaries and comprehensive benefits packages. Our commitment to a People First culture is reflected in the various departments we have established, including Total Rewards, Wellness, HR, and Diversity. We take pride in our inclusive environment and positive impact on the community. Moreover, we actively encourage internal mobility and professional growth at all stages of an employee's career within TaskUs. Join our team today and experience firsthand our dedication to supporting People First.

Posted 2 months ago

Apply

10.0 - 12.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Job Information
Job Opening ID: ZR_2063_JOB
Date Opened: 17/11/2023
Industry: Technology
Work Experience: 10-12 years
Job Title: Azure Data Architect
City: Hyderabad
Province: Telangana
Country: India
Postal Code: 500003
Number of Positions: 4
Location: Coimbatore & Hyderabad

Key skills (mandatory): Azure + SQL + ADF + Databricks + design + architecture
• 10+ years of total experience in the data management area, with Azure cloud data platform experience.
• Architect with the Azure stack (ADLS, AALS, Azure Databricks, Azure Stream Analytics, Azure Data Factory, Cosmos DB & Azure Synapse); mandatory expertise in Azure Stream Analytics, Databricks, Azure Synapse and Azure Cosmos DB.
• Worked on a large Azure data platform and dealt with high-volume Azure streaming analytics.
• Experience in designing cloud data platform architecture and designing large-scale environments.
• 5+ years of experience architecting and building cloud data lakes (specifically Azure data analytics technologies and architecture) and enterprise analytics solutions, and optimising real-time "big data" data pipelines, architectures and data sets.
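As a hedged example of the streaming side of this role, a Structured Streaming read from Event Hubs through its Kafka-compatible endpoint into a bronze Delta table; the namespace, topic, paths and secret handling are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
    .option("subscribe", "telemetry")  # Event Hub name, exposed as a Kafka topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config",
            'org.apache.kafka.common.security.plain.PlainLoginModule required '
            'username="$ConnectionString" password="<connection-string>";')
    .load())

query = (stream.selectExpr("CAST(value AS STRING) AS body")
    .writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")
    .start("/mnt/bronze/telemetry"))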

Posted 2 months ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Chennai

Work from Office

Job Information
Job Opening ID: ZR_1666_JOB
Date Opened: 19/12/2022
Industry: Technology
Work Experience: 6-10 years
Job Title: Azure Data Engineer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600001
Number of Positions: 4

Skills: Azure Data Factory, Azure Databricks, Azure SQL Database, Synapse Analytics, Logic Apps, Azure Functions, Azure Analysis Services, Active Directory, Azure DevOps, Python, PySpark.

Posted 2 months ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Mumbai

Work from Office

Job Information
Job Opening ID: ZR_1624_JOB
Date Opened: 08/12/2022
Industry: Technology
Work Experience: 5-8 years
Job Title: Azure ADF & Power BI Developer
City: Mumbai
Province: Maharashtra
Country: India
Postal Code: 400001
Number of Positions: 4

Roles & Responsibilities:
• 5+ years of hands-on experience in Azure cloud development (ADF + Databricks) - mandatory.
• Strong in Azure SQL; good to have knowledge of Synapse/Analytics.
• Experience working on Agile projects; familiar with Scrum/SAFe ceremonies.
• Good communication skills, written and verbal; can work directly with the customer.
• Ready to work in 2nd shift; flexible.
• Defines, designs, develops and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark.
• Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark.
• Strong T-SQL skills with experience in Azure SQL DW.
• Experience handling structured and unstructured datasets.
• Experience in data modeling and advanced SQL techniques.
• Experience implementing Azure Data Factory pipelines using the latest technologies and techniques.
• Good exposure to application development.
• Should work independently with minimal supervision.

Posted 2 months ago

Apply

5.0 - 8.0 years

2 - 6 Lacs

Pune

Work from Office

Job Information
Job Opening ID: ZR_2098_JOB
Date Opened: 13/01/2024
Industry: Technology
Job Type: Contract
Work Experience: 5-8 years
Job Title: DCT Data Engineer
City: Pune
Province: Maharashtra
Country: India
Postal Code: 411001
Number of Positions: 4
Locations: Pune, Bangalore, Indore
Work mode: Work from Office

Skills: Informatica Data Quality (IDQ), Azure Databricks, Azure Data Lake, Azure Data Factory, API integration.

Posted 2 months ago

Apply

3.0 - 7.0 years

10 - 20 Lacs

Kochi

Hybrid

Skills and attributes for success
• 3 to 7 years of experience developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions.
• Extensive hands-on experience implementing data migration and data processing using Azure services: Databricks, ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, etc.
• Hands-on programming experience in Python/PySpark.
• Good knowledge of DWH concepts and implementation knowledge of Snowflake.
• Well versed in DevOps and CI/CD deployments.
• Hands-on experience in SQL and procedural SQL languages.
• Strong analytical skills; enjoys solving complex technical problems.

Please apply on the link below for the interview process: https://careers.ey.com/job-invite/1537161/
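To ground the Snowflake requirement above, a small sketch reading a Snowflake table from Databricks, assuming the Databricks-bundled Spark-Snowflake connector; all connection values are placeholders and credentials would come from a secret scope in practice.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

sf_options = {
    "sfUrl": "myaccount.snowflakecomputing.com",  # account URL (placeholder)
    "sfUser": "etl_user",
    "sfPassword": "<secret>",                     # fetch from a secret scope
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

df = (spark.read.format("snowflake")
      .options(**sf_options)
      .option("dbtable", "CUSTOMERS")
      .load())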

Posted 2 months ago

Apply

8.0 - 13.0 years

8 - 18 Lacs

Pune

Remote

Data Engineer with good experience in Azure data engineering: PySpark, Python, Azure Databricks, Azure Data Factory, SQL. Note: candidates from Hyderabad and Bangalore will not be considered.

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai, Bengaluru

Work from Office

Immediate hiring for Azure Data Engineers / Leads - Hexaware Technologies

Primary skill set: Azure Databricks, PySpark
Total experience required: 4 to 12 years
Location: Chennai & Bangalore only
Work mode: 5 days work from office
Shift timing: 1 pm to 10 pm
Notice: Immediate & early joiners only preferred

Job Description (primary: Azure Databricks, ADF, PySpark/Python)

Must have:
• 6+ years of IT experience in data warehousing and ETL.
• Hands-on data experience with cloud technologies on Azure: ADF, Synapse, PySpark/Python.
• Ability to understand design and source-to-target mapping (STTM) and create specification documents.
• Flexibility to operate from client office locations.
• Able to mentor and guide junior resources as needed.

Nice to have:
• Any relevant certifications.
• Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail.

Interested candidates, kindly share your updated resume to ramyar2@hexaware.com with the details below.
Full Name:
Contact No:
Total Exp:
Rel Exp in PLSQL:
Current & Joining Location:
Notice Period (if serving, mention LWD):
Current CTC:
Expected CTC:

Posted 2 months ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Hyderabad, Bengaluru

Work from Office

Must have:
• Azure Data Factory (ADF)
• Experience with Python/PySpark - mandatory
• Databricks - mandatory
• Excellent knowledge of ADF; ability to develop and configure complex ADF pipelines.
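A short hedged example of driving such an ADF pipeline programmatically with the azure-mgmt-datafactory SDK; the resource names and parameters are assumptions.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-data",       # placeholder resource group
    factory_name="adf-prod",             # placeholder factory
    pipeline_name="pl_ingest_orders",    # placeholder pipeline
    parameters={"load_date": "2024-01-01"},
)
print(run.run_id)  # track progress via client.pipeline_runs.get(...)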

Posted 2 months ago

Apply

10.0 - 13.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Azure Data Architect

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

