
1265 Azure Databricks Jobs - Page 32

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 7.0 years

10 - 17 Lacs

Pune

Hybrid

Greetings for the day! We found your profile suitable for the opening below; please go through the JD and reach out to us if you are interested.

About Us: Incorporated in 2006, we are an 18-year-old recruitment and staffing company, providing junior, middle, and executive talent to several Fortune 500 companies.

About Client: Hiring for one of the most prestigious multinational corporations.

Job Title: Azure Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 4+ years
Required Skills: Azure Databricks, Python, PySpark, SQL, Azure Cloud, Power BI (basic + debugging)
Location: Pune
CTC Range: 10-17 Lakhs per annum
Mode of Work: Hybrid

Contact: Joel, IT Staffing, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. Phone: 8067432416 | Email: joel.manivasan@blackwhite.in | www.blackwhite.in

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Notice Period: Immediate to 30 days

Job Description: We are seeking an experienced Database Administrator (DBA) to join our dynamic team. The ideal candidate will have extensive knowledge of relational database management systems, with a strong focus on optimization, administration, and support.

Key Responsibilities:
- Administer and maintain both standalone and clustered database environments.
- Optimize database performance through efficient indexing, partitioning strategies, and query optimization techniques.
- Manage and support relational databases (e.g., MySQL, MS SQL).
- Ensure data integrity, availability, security, and scalability of databases.
- Implement and manage data storage solutions such as S3/Parquet and Delta tables.
- Monitor database performance, troubleshoot issues, and implement solutions.
- Collaborate with development teams to design and implement database solutions.
- Implement and manage database backups, restores, and recovery models.
- Perform routine database maintenance tasks such as upgrades, patches, and migrations.
- Document database processes, procedures, and configurations.

Required:
- 5-10 years of proven experience as a Database Administrator (DBA).
- Strong understanding of database management principles.
- Proficiency in relational databases (e.g., MySQL, MS SQL).
- Experience optimizing database performance and implementing efficient query processing.
- Familiarity with Linux environments and basic administration tasks.
- Knowledge of Parquet file structures as relational data stores.

Preferred:
- Experience with RDS (AWS) and Databricks (AWS).
- Understanding of Databricks and Unity Catalog.
- Experience with S3/Parquet and Delta tables.
- Knowledge of Apache Drill and Trino DB connectors.
- Prior experience with Hadoop, Parquet file formats, Impala, and Hive.
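The indexing and query-optimization work this posting describes can be seen in miniature with SQLite (used here purely as a stand-in for MySQL/MS SQL; the table, column, and index names are illustrative assumptions, not from the posting):

```python
import sqlite3

# In-memory database standing in for a production MySQL/MS SQL instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [(f"cust{i % 100}", i * 1.5) for i in range(1000)])

def plan(query):
    """Return the query plan as a single string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    return " ".join(r[-1] for r in rows)

query = "SELECT total FROM orders WHERE customer = 'cust7'"
before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer)")
after = plan(query)    # with the index: an index search instead

print(before)  # e.g. "SCAN orders"
print(after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer=?)"
```

The same before/after comparison (via `EXPLAIN` in MySQL or an execution plan in MS SQL) is the everyday tool behind the "query optimization techniques" bullet.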

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

We are looking for a skilled Data Engineer with expertise in Azure Data Factory, ADLS, SQL, PySpark, Python, and Azure Databricks to design, build, and optimize data pipelines, ensuring efficient data ingestion, transformation, and storage.

Key Responsibilities:
- Design and develop data pipelines using Azure Data Factory and SSIS.
- Manage cloud data storage and processing (AWS S3, ADLS, etc.).
- Write complex SQL queries and optimize their performance.
- Process large datasets using PySpark.
- Develop scripts for data processing, automation, and API integration using Python.
- Develop Databricks notebooks, manage workflows, and implement Delta Lake for data transactions.
- Orchestrate and monitor pipelines.
- Knowledge of CI/CD using Azure DevOps/GitHub.
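The ingestion-transformation-storage loop this posting describes can be sketched in miniature with plain Python; a real implementation would use PySpark/ADF, and the CSV content and field names below are illustrative assumptions:

```python
import csv
import io
import json

# Miniature ingest -> transform -> load sketch, standing in for one stage
# of an ADF/Databricks pipeline. Data and field names are illustrative.
raw = io.StringIO("id,amount,country\n1,10.5,IN\n2,,US\n3,7.25,IN\n")

def ingest(src):
    """Read raw CSV rows into dictionaries."""
    return list(csv.DictReader(src))

def transform(rows):
    """Cleanse (drop rows missing an amount), cast, and aggregate per country,
    the kind of step a PySpark job expresses as filter/groupBy/agg."""
    clean = [r for r in rows if r["amount"]]
    totals = {}
    for r in clean:
        totals[r["country"]] = totals.get(r["country"], 0.0) + float(r["amount"])
    return totals

def load(totals, sink):
    """Write the aggregated result to the sink (here, an in-memory buffer)."""
    json.dump(totals, sink)

sink = io.StringIO()
totals = transform(ingest(raw))
load(totals, sink)
print(sink.getvalue())  # {"IN": 17.75} -- the US row is dropped for its missing amount
```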

Posted 1 month ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Job Role & Responsibilities:
- Understand operational needs by collaborating with specialized teams and supporting key business operations, including architecture design and improvements.
- Ensure data integrity and build data models.
- Design and implement agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.

Technical Skills, Qualifications, and Experience Required:
- 4-10 years of experience in data modelling; proficiency with data modelling tools (Erwin), including building ER diagrams.
- Hands-on experience with Erwin/Visio.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools such as Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results orientation and time management.
- A true team player, comfortable working in a global, multidisciplinary, fast-paced environment.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and capability for innovation.

* Immediate joiners preferred.

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 17 Lacs

Pune

Work from Office

Job Overview: Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office.

Qualifications: B.E./B.Tech in Computer Science, IT, or a related discipline; MCS/MCA or equivalent preferred.

Key Responsibilities:
- Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub.
- Define standards and best practices for data ingestion, transformation, and storage.
- Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines.
- Lead Snowflake environment setup, configuration, performance tuning, and optimization.
- Integrate Azure Data Services with Snowflake to support diverse business use cases.
- Implement governance, metadata management, and security policies.
- Mentor junior developers and data engineers on cloud data technologies and best practices.

Experience and Skills Required:
- 5 to 9 years of overall experience in data architecture or data engineering roles.
- Strong hands-on expertise in Snowflake, including design, development, and performance tuning.
- Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.).
- Understanding of cloud data integration techniques and ELT/ETL frameworks.
- Familiarity with data orchestration tools such as DBT, Airflow, or Azure Data Factory.
- Proven ability to handle structured, semi-structured, and unstructured data.
- Strong analytical, problem-solving, and communication skills.

Nice to Have:
- Certifications in Snowflake and/or Microsoft Azure.
- Experience with CI/CD tools like GitHub for code versioning and deployment.
- Familiarity with real-time or near-real-time data ingestion.

Why Join Diacto Technologies?
- Work with a cutting-edge tech stack and cloud-native architectures.
- Be part of a data-driven culture with opportunities for continuous learning.
- Collaborate with industry experts and build transformative data solutions.
- Competitive salary and benefits in a collaborative work environment in Baner, Pune.

How to Apply:
Option 1 (preferred): Copy and paste the following link into your browser and submit your application for the automated interview process: https://app.candidhr.ai/app/candidate/gAAAAABoRrcIhRQqJKDXiCEfrQG8Rtsk46Etg4-K8eiwqJ_GELL6ewSC9vl4BjaTwUAHzXZTE3nOtgaiQLCso_vWzieLkoV9Nw==/
Option 2:
1. Visit the career section of our website at https://www.diacto.com/careers/
2. Scroll down to the "Who are we looking for?" section.
3. Find the listing for "Data Architect (Snowflake)".
4. Proceed with the virtual interview by clicking "Apply Now".

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Chandigarh, Pune, Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion
- As part of the Infosys delivery team, your primary role will be to interface with the client for quality assurance and issue resolution, ensuring high customer satisfaction.
- You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
- You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit-test plan reviews.
- You will lead and guide your teams in developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
- You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology -> Cloud Platform -> Azure Analytics Services -> Azure Data Lake
Preferred skills: Technology -> Cloud Platform -> Azure Analytics Services -> Azure Data Lake

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain scalable cloud-based solutions on the Microsoft Azure platform.
- Collaborate with cross-functional teams to identify business requirements and design technical solutions.
- Implement DevOps practices using Azure Pipelines to ensure efficient deployment of applications.
- Troubleshoot issues related to application performance, scalability, and security.
- Stay up to date with the latest technologies and best practices in Azure development.

Desired Candidate Profile:
- 4-9 years of experience as an Azure Developer or in a similar role.
- BCA (any specialization) or an equivalent bachelor's degree from a recognized university.
- Strong understanding of Azure services, including App Service, Functions, Logic Apps, Key Vault, etc.
- Experience with containerization using Docker and orchestration using Kubernetes.

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom, BSc
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion
- As part of the Infosys delivery team, your primary role will be to interface with the client for quality assurance and issue resolution, ensuring high customer satisfaction.
- You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
- You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit-test plan reviews.
- You will lead and guide your teams in developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
- You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology.
- Basics of architecture and design fundamentals.
- Knowledge of testing tools.
- Knowledge of agile methodologies.
- Understanding of project life-cycle activities on development and maintenance projects.
- Understanding of one or more estimation methodologies; knowledge of quality processes.
- Basics of the business domain, to understand business requirements.
- Analytical abilities, strong technical skills, and good communication skills.
- Good understanding of the technology and domain.
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods.
- Awareness of the latest technologies and trends.
- Excellent problem-solving, analytical, and debugging skills.

Technical and Professional Requirements:
Primary skills: Technology -> Cloud Platform -> Azure Development & Solution Architecting
Preferred skills: Technology -> Cloud Platform -> Azure Development & Solution Architecting

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivalled opportunities to succeed and realize your full potential.

The Team: Deloitte's AI&D practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence and visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Work you'll do
Location: Bangalore/Mumbai/Pune/Delhi/Chennai/Hyderabad/Kolkata
Role: Databricks Data Engineering Senior Consultant

We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions that drive data-driven decision-making across a variety of customers.

Mandatory Skills: Databricks, Spark, Python/SQL

Responsibilities:
- Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake.
- Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
- Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
- Develop data models and schemas to support reporting and analytics needs.
- Ensure data quality, integrity, and security by implementing appropriate checks and controls.
- Monitor and optimize data processing performance, identifying and resolving bottlenecks.
- Stay up to date with the latest advancements in data engineering and Databricks technologies.

Qualifications:
- Bachelor's or master's degree in any field.
- 6-10 years of experience in designing, implementing, and maintaining data solutions on Databricks.
- Experience with at least one of the popular cloud platforms: Azure, AWS, or GCP.
- Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.
- Knowledge of data warehousing and data modelling concepts.
- Experience with Python or SQL.
- Experience with Delta Lake.
- Understanding of DevOps principles and practices.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and teamwork skills.

Your role as a leader: At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify the issues that are most important for our clients, our people, and society, and to make an impact that matters. In addition to living our purpose, Senior Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities.
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders.
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people.
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction.
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make.

How you will grow: At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose: Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips: We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.

Posted 1 month ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Company: PulseData Labs Pvt Ltd (captive unit for URUS, USA)

About URUS: We are the URUS family (US), a global leader in products and services for agritech.

SENIOR DATA ENGINEER
This role is responsible for the design, development, and maintenance of data integration and reporting solutions. The ideal candidate will possess expertise in Databricks; strong skills in SQL Server, SSIS, and SSRS; and experience with other modern data engineering tools such as Azure Data Factory. This position requires a proactive and results-oriented individual with a passion for data and a strong understanding of data warehousing principles.

Responsibilities

Data Integration:
- Design, develop, and maintain robust and efficient ETL pipelines and processes on Databricks.
- Troubleshoot and resolve Databricks pipeline errors and performance issues.
- Maintain legacy SSIS packages for ETL processes; troubleshoot and resolve SSIS package errors and performance issues.
- Optimize data flow performance and minimize data latency.
- Implement data quality checks and validations within ETL processes.

Databricks Development:
- Develop and maintain Databricks pipelines and datasets using Python, Spark, and SQL.
- Migrate legacy SSIS packages to Databricks pipelines.
- Optimize Databricks jobs for performance and cost-effectiveness.
- Integrate Databricks with other data sources and systems.
- Participate in the design and implementation of data lake architectures.

Data Warehousing:
- Participate in the design and implementation of data warehousing solutions.
- Support data quality initiatives and implement data cleansing procedures.

Reporting and Analytics:
- Collaborate with business users to understand data requirements for department-driven reporting needs.
- Maintain the existing library of complex SSRS reports, dashboards, and visualizations.
- Troubleshoot and resolve SSRS report issues, including performance bottlenecks and data inconsistencies.

Collaboration and Communication:
- Comfortable in an entrepreneurial, self-starting, and fast-paced environment, working both independently and with our highly skilled teams.
- Collaborate effectively with business users, data analysts, and other IT teams.
- Communicate technical information clearly and concisely, both verbally and in writing.
- Document all development work and procedures thoroughly.

Continuous Growth:
- Keep abreast of the latest advancements in data integration, reporting, and data engineering technologies.
- Continuously improve skills and knowledge through training and self-learning.

This job description reflects management's assignment of essential functions; it does not prescribe or restrict the tasks that may be assigned.

Requirements:
- Bachelor's degree in computer science, information systems, or a related field.
- 2+ years of experience in data integration and reporting.
- Extensive experience with Databricks, including Python, Spark, and Delta Lake.
- Strong proficiency in SQL Server, including T-SQL, stored procedures, and functions.
- Experience with SSIS (SQL Server Integration Services) development and maintenance.
- Experience with SSRS (SQL Server Reporting Services) report design and development.
- Experience with data warehousing concepts and best practices.
- Experience with the Microsoft Azure cloud platform and Microsoft Fabric desirable.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Experience with Agile methodologies.
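The "data quality checks and validations within ETL processes" this posting mentions typically mean null, duplicate-key, and range checks. A hedged plain-Python sketch (the rules, field names, and record shape are illustrative assumptions, not from the posting):

```python
# Illustrative ETL data-quality validation: null, duplicate-key, and
# range checks over a batch of records. All rules are assumptions.
def validate(records, key="id", required=("id", "amount"), amount_min=0):
    """Return a list of (row_index, issue) pairs instead of failing fast,
    so a pipeline can quarantine bad rows and continue."""
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        for field in required:                      # null checks
            if rec.get(field) is None:
                issues.append((i, f"missing {field}"))
        k = rec.get(key)                            # duplicate-key check
        if k in seen:
            issues.append((i, f"duplicate {key}={k}"))
        seen.add(k)
        amt = rec.get("amount")                     # range check
        if amt is not None and amt < amount_min:
            issues.append((i, "amount below minimum"))
    return issues

rows = [
    {"id": 1, "amount": 9.99},
    {"id": 1, "amount": -5.0},   # duplicate id and negative amount
    {"id": 2, "amount": None},   # missing amount
]
for idx, msg in validate(rows):
    print(idx, msg)
# 1 duplicate id=1
# 1 amount below minimum
# 2 missing amount
```

On Databricks the same idea is usually expressed as Delta Lake constraints or DataFrame filters, but the shape of the check is the same.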

Posted 1 month ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Pune, Chennai

Work from Office

Experience:
- Minimum 10+ years in AI/ML or data science, with at least 5 years in a leadership role.
- Proven experience in banking or financial services is highly preferred.
- Hands-on with AI/ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) and tools (Python, SQL, Spark).
- Experience with Azure ML tools: Databricks, etc.
- Experience with end-to-end model lifecycle management and MLOps.

Minimum Qualification: Master's degree/PhD in Computer Science, Econometrics, Statistics, or related fields.

Posted 1 month ago

Apply

7.0 - 10.0 years

20 - 30 Lacs

Bengaluru

Hybrid

Company: PulseData Labs Pvt Ltd (captive unit for URUS, USA)

About URUS: We are the URUS family (US), a global leader in products and services for agritech.

SENIOR DATA ENGINEER
This role is responsible for the design, development, and maintenance of data integration and reporting solutions. The ideal candidate will possess expertise in Databricks; strong skills in SQL Server, SSIS, and SSRS; and experience with other modern data engineering tools such as Azure Data Factory. This position requires a proactive and results-oriented individual with a passion for data and a strong understanding of data warehousing principles.

Responsibilities

Data Integration:
- Design, develop, and maintain robust and efficient ETL pipelines and processes on Databricks.
- Troubleshoot and resolve Databricks pipeline errors and performance issues.
- Maintain legacy SSIS packages for ETL processes; troubleshoot and resolve SSIS package errors and performance issues.
- Optimize data flow performance and minimize data latency.
- Implement data quality checks and validations within ETL processes.

Databricks Development:
- Develop and maintain Databricks pipelines and datasets using Python, Spark, and SQL.
- Migrate legacy SSIS packages to Databricks pipelines.
- Optimize Databricks jobs for performance and cost-effectiveness.
- Integrate Databricks with other data sources and systems.
- Participate in the design and implementation of data lake architectures.

Data Warehousing:
- Participate in the design and implementation of data warehousing solutions.
- Support data quality initiatives and implement data cleansing procedures.

Reporting and Analytics:
- Collaborate with business users to understand data requirements for department-driven reporting needs.
- Maintain the existing library of complex SSRS reports, dashboards, and visualizations.
- Troubleshoot and resolve SSRS report issues, including performance bottlenecks and data inconsistencies.

Collaboration and Communication:
- Comfortable in an entrepreneurial, self-starting, and fast-paced environment, working both independently and with our highly skilled teams.
- Collaborate effectively with business users, data analysts, and other IT teams.
- Communicate technical information clearly and concisely, both verbally and in writing.
- Document all development work and procedures thoroughly.

Continuous Growth:
- Keep abreast of the latest advancements in data integration, reporting, and data engineering technologies.
- Continuously improve skills and knowledge through training and self-learning.

This job description reflects management's assignment of essential functions; it does not prescribe or restrict the tasks that may be assigned.

Requirements:
- Bachelor's degree in computer science, information systems, or a related field.
- 7+ years of experience in data integration and reporting.
- Extensive experience with Databricks, including Python, Spark, and Delta Lake.
- Strong proficiency in SQL Server, including T-SQL, stored procedures, and functions.
- Experience with SSIS (SQL Server Integration Services) development and maintenance.
- Experience with SSRS (SQL Server Reporting Services) report design and development.
- Experience with data warehousing concepts and best practices.
- Experience with the Microsoft Azure cloud platform and Microsoft Fabric desirable.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Experience with Agile methodologies.

Posted 1 month ago

Apply

5.0 - 10.0 years

13 - 17 Lacs

Pune

Work from Office

Project Description: You will be working in a global team that manages and performs a global technical control. You'll be joining the Asset Management team, which looks after the asset management data foundation and operates a set of in-house developed tooling. As an IT engineer, you'll play an important role in ensuring the development methodology is followed, and lead technical design discussions with the architects. Our culture centers around partnership with our businesses, transparency, accountability and empowerment, and passion for the future.

Responsibilities:
- Design, develop, and maintain/support Power BI workflows that take data from multiple sources and make it ready for analytics and reporting.
- Optimize existing workflows to ensure performance, scalability, and reliability.
- Support the automation of manual processes to improve operational efficiency.
- Document workflows, processes, and best practices for knowledge sharing.
- Provide training and mentorship to other team members on Alteryx development.
- Collaborate with other members of the team to deliver data solutions for the program.

Skills

Must have:
- Proficiency in Power BI Desktop and Power BI Service (5+ years of experience); experience creating interactive dashboards, custom visuals, and reports.
- Data modeling: strong understanding of data modeling concepts, including relationships, calculated columns, measures, and hierarchies; expertise in using DAX (Data Analysis Expressions) for complex calculations.
- SQL and database management: proficiency in SQL to extract, manipulate, and analyze data from databases; knowledge of database design and querying.
- ETL (Extract, Transform, Load) tools: experience with data transformation and cleaning using tools like Power Query, SSIS, or other ETL tools.

Nice to have:
- Data architecture and engineering: design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business intelligence and data visualization: create insightful Power BI dashboards to help drive business decisions.

Other Languages: English (C1 Advanced)
Seniority: Senior
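The data-modeling concepts this posting lists (a fact table related to a dimension, a measure evaluated under a filter context) can be sketched in plain Python; in Power BI the same measure would be a DAX `SUM` narrowed by a slicer. Table contents and names here are illustrative assumptions:

```python
# Plain-Python stand-in for a Power BI model: a fact table related to a
# dimension table, and a "Total Sales" measure evaluated under an
# optional filter context. All data is illustrative.
dim_product = {1: "Widget", 2: "Gadget"}          # product_id -> name
fact_sales = [
    {"product_id": 1, "amount": 100.0},
    {"product_id": 2, "amount": 250.0},
    {"product_id": 1, "amount": 50.0},
]

def total_sales(facts, dim, product_name=None):
    """Sum of amount, optionally filtered via the related product name,
    mimicking how a measure responds to the current filter context."""
    return sum(f["amount"] for f in facts
               if product_name is None or dim[f["product_id"]] == product_name)

print(total_sales(fact_sales, dim_product))             # 400.0 (no filter)
print(total_sales(fact_sales, dim_product, "Widget"))   # 150.0 (slicer on Widget)
```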

Posted 1 month ago

Apply

0.0 - 4.0 years

9 - 13 Lacs

Pune

Work from Office

Project Description: You will be working in a global team that manages and performs a global technical control. You'll be joining the Asset Management team, which looks after the asset management data foundation and operates a set of in-house developed tooling. As an IT engineer, you'll play an important role in ensuring the development methodology is followed, and lead technical design discussions with the architects. Our culture centers around partnership with our businesses, transparency, accountability and empowerment, and passion for the future.

Responsibilities:
- Design, develop, and maintain scalable data solutions using Starburst.
- Collaborate with cross-functional teams to integrate Starburst with existing data sources and tools.
- Optimize query performance and ensure data security and compliance.
- Implement monitoring and alerting systems for data platform health.
- Stay updated with the latest developments in data engineering and analytics.

Skills

Must have:
- Bachelor's or master's degree in a related technical field, or equivalent professional experience.
- Prior experience as a software engineer applying new engineering principles to improve existing systems, including leading complex, well-defined projects.
- Strong knowledge of big-data languages, including SQL, Hive, Spark/PySpark, Presto, and Python.
- Strong knowledge of big-data platforms, such as the Apache Hadoop ecosystem, AWS EMR, Qubole, or Trino/Starburst.
- Good knowledge of and experience with cloud platforms such as AWS, GCP, or Azure.
- Continuous learner with the ability to apply previous experience and knowledge to quickly master new technologies.
- Demonstrates the ability to select among available technologies to implement and solve for need.
- Able to understand and design moderately complex systems.
- Understanding of testing and monitoring tools; ability to test, debug, and fix issues within established SLAs.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Understanding of data governance and compliance standards.

Nice to have:
- Data architecture and engineering: design and implement efficient and scalable data warehousing solutions using Azure Databricks and Microsoft Fabric.
- Business intelligence and data visualization: create insightful Power BI dashboards to help drive business decisions.

Other Languages: English (C1 Advanced)
Seniority: Senior

Posted 1 month ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Gurugram

Work from Office

Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work with data scientists and analysts to understand data needs and create effective data workflows. Create and maintain data storage solutions, including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and resolve data pipeline problems to guarantee consistency and availability of the data.

This role sits within Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications. The software engineer:
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance.
3. Exercises original thought and judgement, and can supervise the technical and administrative work of other software engineers.
4. Builds the skills and expertise of their software engineering discipline to meet the standard skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.
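The "monitor and resolve data pipeline problems" responsibility usually starts with making steps resilient to transient failures. A hedged sketch of a retry-with-exponential-backoff wrapper (the failing step is simulated, and delays and attempt counts are illustrative assumptions; ADF offers built-in activity retries for the same purpose):

```python
import time

# Illustrative retry wrapper for a pipeline step that may fail
# transiently (network blips, throttling). Parameters are assumptions.
def run_with_retries(step, attempts=3, base_delay=0.01):
    """Run step(), retrying with exponential backoff; re-raise on the
    final attempt so the orchestrator can alert."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

calls = {"n": 0}
def flaky_step():
    """Simulated step that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded 42 rows"

result = run_with_retries(flaky_step)
print(result)  # loaded 42 rows (after two simulated failures)
```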

Posted 1 month ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work together with data scientists and analysts to understand data needs and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and resolve data pipeline problems to guarantee consistency and availability of the data. Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications:
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. Exercises original thought and judgement, with the ability to supervise the technical and administrative work of other software engineers.
4. Builds skills and expertise in the software engineering discipline to reach the standard software engineer skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Dear Candidates, Greetings!! We are hiring for one of the global product-based motor vehicle manufacturing MNCs. Job Type: FTE. Job Role: Data Analyst. Experience: 3 to 5 Years. Location: Bangalore. Work Mode: Work from office. Notice Period: Immediate to 30 days. Budget: As per market standards. Mandatory Skills: Azure Databricks, Power BI, Tableau, SQL, Python. Interested candidates can share their updated resume on Gurpreet@selectiveglobalsearch.com

Posted 1 month ago

Apply

12.0 - 15.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must have skills: Microsoft Azure Databricks. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders. You will also engage in problem-solving activities, providing guidance and support to your team members while ensuring that best practices are followed throughout the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate training and mentorship for junior team members to enhance their skills and knowledge.
- Monitor project progress and implement necessary adjustments to meet deadlines and quality standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing principles and architecture.
- Experience with data integration and ETL processes.
- Familiarity with big data technologies and frameworks.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 12 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education

Posted 1 month ago

Apply

12.0 - 14.0 years

20 - 30 Lacs

Indore, Hyderabad

Work from Office

Microsoft Fabric Data Engineer. Experience: 12-14 Years. Location: Hyderabad/Indore. Notice Period: Immediate. Primary Skill: Microsoft Fabric. Secondary Skill: Azure Data Factory (ADF). 12+ years of experience in Microsoft Azure data engineering for analytical projects. Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks using Azure, Microsoft Fabric, and Databricks. Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations. Experience in Databricks cluster optimization and workflow management to ensure cost-effective and high-performance processing. Sound knowledge of data modelling, data governance, data quality management, and data modernization processes. Develop architecture blueprints and technical design documentation for Azure-based data solutions. Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions. Keep abreast of emerging Azure technologies and recommend enhancements to existing systems. Lead proofs of concept (PoCs) and adopt agile delivery methodologies for solution development and delivery. www.yash.com
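For context on the Delta Lake skills this posting lists: the upsert ("MERGE INTO") semantics that Delta-based pipelines rely on can be illustrated with a minimal plain-Python sketch. No PySpark dependency is assumed here, and the row shapes are hypothetical:

```python
def merge_upsert(target, updates, key="id"):
    """When a key matches, update the existing row; otherwise insert the
    new row - the core semantics of a Delta Lake MERGE INTO statement."""
    index = {row[key]: dict(row) for row in target}
    for row in updates:
        # merge update fields over any existing row with the same key
        index[row[key]] = {**index.get(row[key], {}), **row}
    return sorted(index.values(), key=lambda r: r[key])
```

In Databricks the same operation would be expressed declaratively (a MERGE over a Delta table), with the engine handling transaction logs and file compaction.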

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Pune

Work from Office

Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field. Experience: 6-10 years. 8+ years of experience in data engineering or a related field. Strong hands-on experience with Azure Databricks, Spark, Python/Scala, CI/CD, and scripting for data processing. Experience working with multiple file formats like Parquet, Delta, and Iceberg. Knowledge of Kafka or similar streaming technologies for real-time data ingestion. Experience with data governance and data security in Azure. Proven track record of building large-scale data ingestion and ETL pipelines in cloud environments, specifically Azure. Deep understanding of Azure Data Services (e.g., Azure Blob Storage, Azure Data Lake, Azure SQL Data Warehouse, Event Hubs, Functions, etc.). Familiarity with data lakes, data warehouses, and modern data architectures. Experience with CI/CD pipelines, version control (Git), Jenkins, and agile methodologies. Understanding of cloud infrastructure and architecture principles (especially within Azure).
Technical Skills: Expert-level proficiency in Spark and Spark Streaming, including optimization, debugging, and troubleshooting of Spark jobs. Solid knowledge of Azure Databricks for scalable, distributed data processing. Strong coding skills in Python and Scala for data processing. Experience working with SQL, especially for large datasets. Knowledge of data formats like Iceberg, Parquet, ORC, and Delta Lake.
Leadership Skills: Proven ability to lead and mentor a team of data engineers, ensuring adherence to best practices. Excellent communication skills, capable of interacting with both technical and non-technical stakeholders. Strong problem-solving, analytical, and troubleshooting abilities.
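The Spark Streaming skills listed above revolve around patterns such as event-time windowed aggregation. A toy plain-Python version of that pattern follows; the window size and event shape are illustrative only, not from the posting:

```python
from collections import defaultdict

def window_counts(events, window_sec=60):
    """Bucket (timestamp, payload) events into fixed event-time windows -
    the aggregation that Spark Structured Streaming maintains incrementally
    as new micro-batches arrive."""
    buckets = defaultdict(int)
    for ts, _payload in events:
        # floor the timestamp to the start of its window
        buckets[(ts // window_sec) * window_sec] += 1
    return dict(buckets)
```

A streaming engine adds watermarking and state management on top of this, but the per-window logic is the same.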

Posted 1 month ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job description: Hiring for Azure Developer with experience range 2 to 9 years. Mandatory Skills: Azure, ADF, ADB, Azure Synapse. Education: BE/B.Tech/BCA/B.Sc/MCA/M.Tech/M.Sc/MS. Location: Pan India. Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; identify any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must have skills: Microsoft Azure Databricks. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and enhance operational efficiency.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand project requirements and deliver high-quality solutions.
- Develop and maintain applications using Microsoft Azure Databricks.
- Troubleshoot and debug applications to ensure optimal performance.
- Implement best practices for application development and deployment.
- Stay updated with the latest technologies and trends in application development.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing principles and services.
- Experience with data processing and analytics using Azure services.
- Knowledge of programming languages such as Python, Scala, or SQL.
- Hands-on experience in building and deploying applications on the Azure cloud platform.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must have skills: Microsoft Azure Databricks. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education with Engineering or equivalent.
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the highest standards of quality and functionality.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Databricks.
- Good To Have Skills: Experience with cloud computing platforms.
- Strong understanding of application development methodologies.
- Experience in managing cross-functional teams.
- Familiarity with Agile and DevOps practices.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- A 15 years full time education with Engineering or equivalent is required.
Qualification: 15 years full time education with Engineering or equivalent

Posted 1 month ago

Apply

5.0 - 9.0 years

12 - 22 Lacs

Mohali

Remote

Company Overview: At Ensemble Health Partners, we're at the forefront of innovation, leveraging cutting-edge technology to drive meaningful impact in the Revenue Cycle Management landscape. Our future-forward technology combines tightly integrated data ingestion, workflow automation, and business intelligence solutions on a modern cloud architecture. We have the second-largest share in the RCM space in the US market, with 10,000+ professionals working in the organization. With 10 technology patents in our name, we believe the best results come from a combination of a skilled and experienced team, proven and repeatable processes, and modern and flexible technologies. As a leading player in the industry, we offer an environment that fosters growth, creativity, and collaboration, where your expertise will be valued and your contributions will make a difference.
Role & Responsibilities: Experience: 5-9 Years. Location: Remote/WFH.
Position Summary: Design and maintain scalable data pipelines, manage ETL processes and data warehouses, ensure data quality and governance, collaborate with cross-functional teams, support machine learning deployment, lead projects, mentor juniors, work with big data and cloud technologies, and bring expertise in Spark, Databricks, Streaming/Reactive/Event-driven systems, Agentic programming, and LLM application development.
Required Skills: Spark, Databricks, Streaming/Reactive/Event-driven, Agentic programming & LLM application experience. 5+ years of coding experience with Microsoft SQL.
3+ years working with big data technologies including but not limited to Databricks, Apache Spark, Python, and Microsoft Azure (Data Factory, Dataflows, Azure Functions, Azure Service Bus), with a willingness and ability to learn new ones. Excellent understanding of engineering fundamentals: testing automation, code reviews, telemetry, iterative delivery, and DevOps. Experience with polyglot storage architectures including relational, columnar, key-value, graph, or equivalent. Experience with Delta tables as well as Parquet files stored in ADLS. Experience delivering applications using componentized and distributed architectures with event-driven patterns. Demonstrated ability to communicate effectively to both technical and non-technical, globally distributed audiences. Solid foundations in formal architecture, design patterns, and best practices. Experience working with healthcare datasets.
Why Join Us? We adapt emerging technologies to practical uses to deliver concrete solutions that bring maximum impact to providers' bottom line. We currently have 10 technology patents in our name. We offer you a great organization to work for, where you will do the best work of your career and grow with the team that is shaping the future of Revenue Cycle Management. We have a strong focus on learning and development, with industry-standard professional development policies to support the learning goals of our associates, as well as flexible remote/work-from-home options.
Benefits: Health benefits and insurance coverage for family and parents. Accidental insurance for the associate. Compliance with all labor laws: maternity benefits, paternity leaves. Company swag: welcome packages, work anniversary kits. Exclusive referral policy. Professional development program and reimbursements. Remote work flexibility to work from home.
Please share your resume on yash.arora@ensemblehp.com with current CTC, expected CTC, and notice period.
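The event-driven patterns this role references can be reduced to a minimal in-process publish/subscribe sketch. The names here are illustrative only; a production system would sit on a broker such as Azure Service Bus rather than in-process dispatch:

```python
from collections import defaultdict

class EventBus:
    """Minimal synchronous pub/sub: handlers subscribe to a topic and are
    invoked on publish - the skeleton of an event-driven pipeline stage."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        # deliver the payload to every handler registered for this topic
        for handler in self._subs[topic]:
            handler(payload)
```

The design choice worth noting is decoupling: publishers never reference consumers directly, which is what lets pipeline stages be added or replaced independently.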

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Pune, Ahmedabad, Mumbai (All Areas)

Work from Office

5+ yrs in Azure data modeling, Azure (Databricks, Data Lake), SQL, Spark, Python. Exp in Salesforce, Oracle, SQL Server to Azure migration. Strong in ETL, star/snowflake schemas, governance, & cross-functional collaboration. Join within 30 days.
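The star/snowflake schema modelling this posting asks for boils down to fact rows keyed into dimension tables. A hedged plain-Python sketch of the lookup join at the heart of a star schema; the table and column names are invented for illustration:

```python
def star_join(facts, dim, key="customer_id"):
    """Enrich each fact row with its dimension attributes via a key
    lookup - the join at the heart of a star schema."""
    lookup = {d[key]: d for d in dim}
    return [{**f, **lookup.get(f[key], {})} for f in facts]
```

In a warehouse this is a SQL join of the fact table against each dimension; a snowflake schema simply normalizes the dimensions further, adding more hops to the same lookup.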

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies