5.0 - 7.0 years
5 - 5 Lacs
Mumbai, Chennai, Gurugram
Work from Office
We are seeking a skilled Site Reliability Engineer to support the administration of Azure Kubernetes Service (AKS) clusters running critical, always-on middleware that processes thousands of transactions per second (TPS). The ideal candidate will operate with a mindset aligned to achieving 99.999% (five-nines) availability.
Key Responsibilities:
- Own and manage AKS cluster deployments, cutovers, base image updates, and daily operational tasks.
- Test and implement Infrastructure as Code (IaC) changes using best practices.
- Apply software engineering principles to IT operations to maintain scalable and reliable production environments.
- Write and maintain IaC and automation code for: monitoring and alerting, log analysis, disaster recovery testing, incident response, and documentation-as-code.
Mandatory Skills:
- Strong experience with Terraform
- In-depth knowledge of Azure Cloud
- Proficiency in Kubernetes cluster creation and lifecycle management (deployment-only experience is not sufficient)
- Hands-on experience with CI/CD tools (GitHub Actions preferred)
- Bash and Python scripting skills
Desirable Skills:
- Exposure to Azure Databricks and Azure Data Factory
- Experience with secret management using HashiCorp Vault
- Familiarity with monitoring tools (any)
Required Skills: Azure, Kubernetes, Terraform, DevOps
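As an aside on the five-nines target mentioned in this posting (illustrative only, not part of the role description), an availability SLO translates directly into a downtime budget, which a short Python sketch makes concrete:

```python
# Illustrative sketch: converting an availability SLO (e.g. the 99.999%
# mentioned above) into an allowed-downtime budget. The function name is
# hypothetical, not from any SRE toolkit.

def downtime_budget_minutes(availability_pct: float,
                            period_minutes: float = 365 * 24 * 60) -> float:
    """Return the minutes of allowed downtime for a given availability percentage."""
    return period_minutes * (1 - availability_pct / 100)

# Five nines leaves only about 5.26 minutes of downtime per year.
print(round(downtime_budget_minutes(99.999), 2))  # → 5.26
```

This is why the posting stresses automation of incident response and disaster recovery testing: at that budget there is no room for manual recovery.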
Posted 4 days ago
8.0 - 12.0 years
30 - 35 Lacs
Chennai
Remote
Job Title: Sr. Python Data Engineer
Location: Chennai & Bangalore (Remote)
Job Type: Permanent Employee
Experience: 8 to 12 Years
Shift: 2 PM to 11 PM
Responsibilities:
- Design and develop data pipelines and ETL processes.
- Collaborate with data scientists and analysts to understand data needs.
- Maintain and optimize data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and implement data validation and cleansing routines.
- Work with large datasets from various sources.
- Automate repetitive data tasks and processes.
- Monitor data systems and troubleshoot issues as they arise.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or similar role (minimum 6+ years as a Data Engineer).
- Strong proficiency in Python and PySpark.
- Excellent problem-solving abilities.
- Strong communication skills to collaborate with team members and stakeholders.
- Individual contributor role.
Technical Skills Required:
- Expert: Python, PySpark, and SQL/Snowflake
- Advanced: data warehousing, data pipeline design
- Advanced: data quality, data validation, data cleansing
- Intermediate/Basic: Microsoft Fabric, ADF, Databricks, Master Data Management/Data Governance, Data Mesh, Data Lake/Lakehouse Architecture
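The validation and cleansing routines this posting mentions can, in their simplest form, be sketched in pure Python (field names such as "id" and "email" are invented for illustration; in practice this logic would typically live in PySpark transformations):

```python
# Hypothetical sketch of a record-level validation/cleansing routine.
# Field names are illustrative assumptions, not taken from the posting.

def cleanse(records):
    """Drop records missing required fields and normalise the rest."""
    cleaned = []
    for rec in records:
        # Validation: required fields must be present and non-empty.
        if not rec.get("id") or not rec.get("email"):
            continue
        # Cleansing: coerce types and normalise whitespace/case.
        cleaned.append({
            "id": int(rec["id"]),
            "email": rec["email"].strip().lower(),
        })
    return cleaned

rows = [{"id": "1", "email": " A@X.COM "}, {"id": None, "email": "b@x.com"}]
print(cleanse(rows))  # keeps only the valid record, normalised
```

The same keep/normalise split shows up in Spark as a filter followed by column expressions.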
Posted 4 days ago
3.0 - 8.0 years
13 - 18 Lacs
Pune
Work from Office
Enterprise Systems Administrator - Azure
Pune, India | Enterprise IT - 22752
Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
What you'll do:
- Assist in deploying, managing, and troubleshooting Azure resources, including virtual machines, networking, and storage.
- Monitor and respond to alerts to ensure optimal performance and availability of Azure services.
- Manage Azure networking services such as private endpoints, load balancers, Application Gateway, and ExpressRoute to ensure secure and efficient connectivity.
- Administer Azure governance features, including management groups, Azure Policy, and cost management.
- Oversee Power BI, Databricks, and Synapse administration, ensuring proper configuration and security.
- Ensure security compliance by managing Azure Defender, monitoring for vulnerabilities, and implementing security best practices.
- Handle ServiceNow tickets and resolve issues related to Azure services promptly.
- Maintain and update documentation related to Azure deployments, policies, and best practices.
What you'll bring:
- Strong knowledge of Azure infrastructure, including networking, storage, private endpoints, and load balancing (Application Gateway, WAF).
- Basic understanding of monitoring tools like Azure Monitor, along with setting up alerts and reports.
- Experience with Azure governance, including management groups, policies, and cost optimization strategies.
- Knowledge of security best practices and tools such as Azure Defender and role-based access control (RBAC).
- Experience in Power BI, Azure Databricks, Synapse administration, and ServiceNow ticket management.
- Familiarity with scripting and automation tools such as PowerShell and ARM templates.
Basic understanding of containerization and orchestration tools (Docker, Kubernetes) is a plus.
Additional Skills:
- 1-3 years of experience managing Azure cloud environments.
- Strong communication and problem-solving skills.
- Ability to manage multiple tasks and work both independently and within a team.
- Azure certifications such as Azure Administrator Associate or Azure Fundamentals are a plus.
Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel:
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.
ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To Complete Your Application:
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
NO AGENCY CALLS, PLEASE.
Posted 4 days ago
5.0 - 8.0 years
14 - 24 Lacs
Noida, Pune, Bengaluru
Hybrid
Mandatory skills: Azure Databricks, Data Factory, PySpark, SQL
Experience: 5 to 8 years
Location: Mumbai/Bangalore/Pune/Chennai/Hyderabad/Indore/Kolkata/Noida/Coimbatore/Bhubaneswar
Key Responsibilities:
- Design and build data pipelines and ETL/ELT workflows using Azure Databricks and Azure Data Factory
- Ingest, clean, transform, and process large datasets from diverse sources (structured and unstructured)
- Implement Delta Lake solutions and optimize Spark jobs for performance and reliability
- Integrate Azure Databricks with other Azure services, including Data Lake Storage, Synapse Analytics, and Event Hubs
Interested candidates, share your CV at himani.girnar@alikethoughts.com with the below details:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact no.
- Total experience
- Relevant experience
- Current organization
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card no.
Posted 4 days ago
6.0 - 11.0 years
12 - 17 Lacs
Pune
Work from Office
Roles and Responsibility:
The Senior Tech Lead - Databricks leads the design, development, and implementation of advanced data solutions. The role requires extensive experience in Databricks, cloud platforms, and data engineering, with a proven ability to lead teams and deliver complex projects.
Responsibilities:
- Lead the design and implementation of Databricks-based data solutions.
- Architect and optimize data pipelines for batch and streaming data.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and deliverables.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in Databricks environments.
- Stay updated on the latest Databricks features and industry trends.
Key Technical Skills & Responsibilities:
- Experience in data engineering using Databricks or Apache Spark-based platforms.
- Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion.
- Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse.
- Proficiency in programming languages such as Python, Scala, and SQL for data processing and transformation.
- Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing.
- Familiarity with Delta Lake, Delta Live Tables, and the medallion architecture for data lakehouse implementations.
- Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation.
- Ability to design and implement Azure Key Vault and scoped credentials.
- Knowledge of Git for source control and CI/CD integration for Databricks workflows, cost optimization, and performance tuning.
- Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups.
- Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus.
- Ability to define best practices, support multiple projects, and mentor junior engineers is a plus.
- Must have experience working with streaming data sources and Kafka (preferred).
Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- Extensive experience with Databricks, Delta Lake, PySpark, and SQL
- Databricks certification (e.g., Certified Data Engineer Professional)
- Experience with machine learning and AI integration in Databricks
- Strong understanding of cloud platforms (AWS, Azure, or GCP)
- Proven leadership experience in managing technical teams
- Excellent problem-solving and communication skills
Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs and work-life balance: integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture
Posted 4 days ago
5.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Please apply only if your notice period is less than 15 days.
Years of experience: 5+ years
Location: Bangalore
Data Engineer
Job Summary:
The Data Engineer is responsible for implementing and managing the operational aspects of cloud-native and hybrid data platform solutions built with Azure Databricks. They ensure the efficient and effective functioning of the Azure Databricks environment, including monitoring and troubleshooting data pipelines, managing data storage and access, and optimizing performance. They work closely with data engineers, data scientists, and other stakeholders to understand data requirements, design solutions, and implement data integration and transformation processes.
Key Responsibilities:
- Provide expertise and ownership of Azure Databricks development tasks within the scrum team.
- Interact effectively with clients and leadership, adapting communication for the appropriate audience.
- Read and comprehend software requirements, assisting with the development of agile user stories and tasks.
- Assist with troubleshooting configuration and performance issues.
- Assist with Azure Databricks deployments, testing, configuration, and installation.
- Ensure security is a priority and understand the various areas where security vulnerabilities arise with database technologies.
- Ensure database resiliency and disaster recovery capabilities.
Required Skills & Qualifications:
- 5+ years proven experience working with Azure Databricks analytics database capabilities, specifically Azure Databricks and other relational database technologies supported in Azure.
- 5+ years proven experience with Azure Data Lake Storage Gen 2, Azure Databricks, Azure Data Explorer, Azure Event Hubs, Spark pools, Python, PySpark, SQL, Azure Landing Zone, Azure networking services, and Microsoft Entra ID.
- 5+ years proven experience with Azure geo-redundancy and HA/failover technologies.
- 5+ years proven experience designing and implementing data pipelines using Azure Databricks for data cleaning, transformation, and loading into a data lakehouse.
- 5+ years proven experience with Infrastructure as Code (IaC) tools such as Terraform.
- 5+ years proven experience with programming languages such as Python and PySpark, and data constructs such as JSON or XML.
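On the JSON handling this posting asks for: flattening nested structures before loading is a common preparation step. A minimal stdlib-only sketch (the key names are invented for illustration, and the dot-separated naming is just one convention):

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts into dot-separated keys, a common pre-load step."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))  # recurse into nested objects
        else:
            out[name] = value
    return out

raw = json.loads('{"device": {"id": 7, "loc": {"city": "Pune"}}, "ts": 1}')
print(flatten(raw))  # → {'device.id': 7, 'device.loc.city': 'Pune', 'ts': 1}
```

The flattened records map naturally onto tabular columns in a warehouse or lakehouse table.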
Posted 4 days ago
8.0 - 10.0 years
20 - 35 Lacs
Ahmedabad
Remote
We are seeking a talented and experienced Senior Data Engineer to join our team and contribute to building a robust data platform on Azure Cloud. The ideal candidate will have hands-on experience designing and managing data pipelines, ensuring data quality, and leveraging cloud technologies for scalable and efficient data processing. The Data Engineer will design, develop, and maintain scalable data pipelines and systems to support the ingestion, transformation, and analysis of large datasets. The role requires a deep understanding of data workflows, cloud platforms (Azure), and strong problem-solving skills to ensure efficient and reliable data delivery.
Key Responsibilities:
Data Ingestion and Integration:
- Develop and maintain data ingestion pipelines using tools like Azure Data Factory, Databricks, and Azure Event Hubs.
- Integrate data from various sources, including APIs, databases, file systems, and streaming data.
ETL/ELT Development:
- Design and implement ETL/ELT workflows to transform and prepare data for analysis and storage in the data lake or data warehouse.
- Automate and optimize data processing workflows for performance and scalability.
Data Modeling and Storage:
- Design data models for efficient storage and retrieval in Azure Data Lake Storage and Azure Synapse Analytics.
- Implement best practices for partitioning, indexing, and versioning in data lakes and warehouses.
Quality Assurance:
- Implement data validation, monitoring, and reconciliation processes to ensure data accuracy and consistency.
- Troubleshoot and resolve issues in data pipelines to ensure seamless operation.
Collaboration and Documentation:
- Work closely with data architects, analysts, and other stakeholders to understand requirements and translate them into technical solutions.
- Document processes, workflows, and system configurations for maintenance and onboarding purposes.
Cloud Services and Infrastructure:
- Leverage Azure services like Azure Data Factory, Databricks, Azure Functions, and Logic Apps to create scalable and cost-effective solutions.
- Monitor and optimize Azure resources for performance and cost management.
Security and Governance:
- Ensure data pipelines comply with organizational security and governance policies.
- Implement security protocols using Azure IAM, encryption, and Azure Key Vault.
Continuous Improvement:
- Monitor existing pipelines and suggest improvements for better efficiency, reliability, and scalability.
- Stay updated on emerging technologies and recommend enhancements to the data platform.
Skills:
- Strong experience with Azure Data Factory, Databricks, and Azure Synapse Analytics.
- Proficiency in Python, SQL, and Spark.
- Hands-on experience with ETL/ELT processes and frameworks.
- Knowledge of data modeling, data warehousing, and data lake architectures.
- Familiarity with REST APIs, streaming data (Kafka, Event Hubs), and batch processing.
Good to Have:
- Experience with tools like Azure Purview, Delta Lake, or similar governance frameworks.
- Understanding of CI/CD pipelines and DevOps tools like Azure DevOps or Terraform.
- Familiarity with data visualization tools like Power BI.
Competencies: analytical thinking, clear and effective communication, time management, team collaboration, technical proficiency, supervising others, problem solving, risk management, organizing and task management, creativity/innovation, honesty/integrity.
Education:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- 8+ years of experience in a data engineering or similar role.
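The partitioning best practice mentioned under "Data Modeling and Storage" is often realised as date-based folder layouts in the lake. A hedged sketch of one common convention (the `year=/month=/day=` layout and the storage URL are assumptions for illustration, not this employer's standard):

```python
from datetime import date

def partition_path(root: str, table: str, d: date) -> str:
    """Build a year/month/day partitioned path, a common data-lake layout."""
    return f"{root}/{table}/year={d.year}/month={d.month:02d}/day={d.day:02d}"

# Hypothetical ADLS Gen2 root; the account and container names are invented.
root = "abfss://lake@example.dfs.core.windows.net"
print(partition_path(root, "sales", date(2024, 3, 7)))
# → abfss://lake@example.dfs.core.windows.net/sales/year=2024/month=03/day=07
```

Keeping partition keys in the path lets query engines prune whole folders instead of scanning every file.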
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
At EY, you will have the opportunity to shape a career that is as unique as you are, leveraging the global reach, support, inclusive environment, and cutting-edge technology to unleash your full potential. Your distinctive voice and perspective are crucial in our journey towards continuous improvement at EY. By joining us, you will not only craft an exceptional experience for yourself but also contribute to creating a better working world for all. The mission of EY's GDS Tax Technology team is to design, implement, and integrate technology solutions that enhance client service and support engagement teams. As a member of EY's core Tax practice, you will deepen your tax technical expertise while honing your database, data analytics, and programming skills. In a landscape of ever-evolving regulations, tax departments are faced with the challenge of collecting, organizing, and analyzing vast amounts of data. This data often needs to be sourced from various systems and departments within an organization. Managing the diversity and volume of data efficiently poses significant challenges and time constraints for companies. Collaborating closely with EY partners, clients, and tax technical experts, members of the GDS Tax Technology team develop and integrate technology solutions that add value, enhance efficiencies, and equip clients with disruptive and cutting-edge tools to support Tax functions. GDS Tax Technology collaborates with clients and professionals in areas such as Federal Business Tax Services, Partnership Compliance, Corporate Compliance, Indirect Tax Services, Human Capital, and Internal Tax Services. The team offers solution architecture, application development, testing, and maintenance support to the global TAX service line, both proactively and in response to specific requests. EY is currently looking for a Data Engineer - Staff to join our Tax Technology practice in India. Key Responsibilities: - Must have proficiency in Azure Databricks. 
- Strong command of Python and PySpark programming is essential.
- Solid understanding of Azure SQL Database and Azure SQL Data Warehouse concepts.
- Develop, maintain, and optimize all data layer components for new and existing systems, including databases, stored procedures, ETL packages, and SQL queries.
- Experience with Azure data platform offerings.
- Effective communication with team members and stakeholders.
Qualification & Experience Required:
- Candidates should possess 1.5 to 3 years of experience in the Azure Data Platform (Azure Databricks) with a strong grasp of Python and PySpark.
- Excellent verbal and written communication skills.
- Ability to work independently as a contributor.
- Experience with Azure Data Factory, SSIS, or other ETL tools.
Join EY in building a better working world, where diverse teams across 150 countries leverage data and technology to provide assurance, support growth, transformation, and operational excellence for clients. EY teams engage in assurance, consulting, law, strategy, tax, and transactions, asking insightful questions to address the complex challenges of today's world.
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Cloud Platform Engineer at Healthcare Intelligence within Providence, you will be responsible for Azure administration to ensure the availability and operational efficiency of the cloud platform. Your role will involve managing critical applications hosted on Azure infrastructure, including Azure SQL, ADF, AKS, Azure VM, and more. Your primary focus will be on maintaining the availability, reliability, and performance of the Azure cloud infrastructure.
In the position of Sr. Cloud Engineer, you will play a crucial role in Azure infrastructure administration, ensuring a highly available and stable environment with sustained performance. Your responsibilities will include working on various Azure services, implementing automation using an IaC approach, troubleshooting production issues, managing Azure resource utilization, and utilizing telemetry solutions for monitoring and alerting.
Your day-to-day activities will involve monitoring and addressing incidents and user requests related to Azure infrastructure, collaborating with product teams on application architecture and performance issues, working with enterprise infrastructure and security teams on policy implementation, and engaging with Microsoft support on severity issues.
To be successful in this role, you should have a Bachelor's degree in Engineering, a minimum of 5 years of experience in cloud infrastructure administration with at least 3 years in Azure administration, strong knowledge of Azure administration concepts, and experience with Infrastructure as Code deployment, Azure DevOps, CI/CD, system reliability, Azure Databricks, Azure AI Services, and more. Additionally, you should be proficient in incident management, source code control systems, and agile methodologies, and have excellent communication and collaborative skills.
Join our team of professionals who are dedicated to improving the patient and caregiver experience through innovative technologies and driving lasting social impact. If you are a pioneering and compassionate individual who is ready to plan for the future of healthcare, we look forward to working with you in re-imagining the future of care with cutting-edge technologies.
Posted 4 days ago
13.0 - 17.0 years
0 Lacs
maharashtra
On-site
Birlasoft is a powerhouse that brings together domain expertise, enterprise solutions, and digital technologies to redefine business processes. With a consultative and design thinking approach, we drive societal progress by enabling our customers to run businesses with efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, we have a team of 12,500+ professionals dedicated to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating our commitment to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.
As an Azure Tech PM at Birlasoft, you will be responsible for leading and delivering complex data analytics projects. With 13-15 years of experience, you will play a critical role in overseeing the planning, execution, and successful delivery of data analytics initiatives, while managing a team of 15+ skilled resources. You should have exceptional communication skills, a deep understanding of Agile methodologies, and a strong background in managing cross-functional teams in data analytics projects.
Key Responsibilities:
- Lead end-to-end planning, coordination, and execution of data analytics projects, ensuring adherence to project scope, timelines, and quality standards.
- Guide the team in defining project requirements, objectives, and success criteria using your extensive experience in data analytics.
- Apply Agile methodologies to create and maintain detailed project plans, sprint schedules, and resource allocation for efficient project delivery.
- Manage a team of 15+ technical resources, fostering collaboration and a culture of continuous improvement.
- Collaborate closely with cross-functional stakeholders to align project goals with business objectives.
- Monitor project progress, identify risks, issues, and bottlenecks, and implement mitigation strategies.
- Provide regular project updates to executive leadership, stakeholders, and project teams using excellent communication skills.
- Facilitate daily stand-ups, sprint planning, backlog grooming, and retrospective meetings to promote transparency and efficiency.
- Drive the implementation of best practices for data analytics, ensuring data quality, accuracy, and compliance with industry standards.
- Act as a point of escalation for project-related challenges and work with the team to resolve issues promptly.
- Collaborate with cross-functional teams to ensure successful project delivery, including testing, deployment, and documentation.
- Provide input to project estimation, resource planning, and risk management activities.
Mandatory Experience:
- Technical Project Manager experience of a minimum of 5+ years in Data Lake and Data Warehousing (DW).
- Strong understanding of DW process execution, from acquiring data to visualization.
- Exposure to Azure skills such as Azure ADF, Azure Databricks, Synapse, SQL, and Power BI for a minimum of 3+ years, or experience managing at least 2 end-to-end Azure Cloud projects.
Other Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 13-15 years of progressive experience in technical project management focusing on data analytics and data-driven initiatives.
- In-depth knowledge of data analytics concepts, tools, and technologies.
- Exceptional leadership, team management, interpersonal, and communication skills.
- Demonstrated success in delivering data analytics projects on time, within scope, and meeting quality expectations.
- Strong problem-solving skills and a proactive attitude towards identifying challenges.
- Project management certifications such as PMP, PMI-ACP, or CSM would be an added advantage.
- Ability to thrive in a dynamic and fast-paced environment, managing multiple projects simultaneously.
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a skilled professional, you will be responsible for designing, developing, and implementing data pipelines utilizing Azure Data Factory. Your primary focus will be on efficiently extracting, transforming, and loading data from diverse sources into Azure Data Lake Storage (ADLS). In addition to the mandatory skills mentioned above, it will be beneficial to have knowledge and experience in utilizing Azure Databricks, as well as proficiency in Python and PySpark. Your expertise in these areas will be crucial in ensuring the seamless flow of data and maintaining the integrity of the data pipelines within the Azure environment. Your contributions will play a key role in the successful management and utilization of data resources for the organization.
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
The Testing Specialist will collaborate closely with the Business and Delivery teams to implement the test strategy and fulfill identified business requirements, ensuring the delivery of business value. With 8 to 10 years of experience in Quality Assurance and ETL testing, you will lead the Quality Engineering and Assurance (QEA) team, responsible for ensuring the quality and reliability of software and data products within the Databricks environment.
Your responsibilities will include developing and overseeing the quality assurance strategy and test planning for Databricks' products and solutions. You should possess a good understanding of efficient data pipelines utilizing Azure Databricks and its native services such as Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Blob Storage. Continuous assessment and enhancement of quality assurance processes and methodologies will be crucial to improving the efficiency and effectiveness of the QA team. You will drive the automation of testing processes, including the development and maintenance of test scripts and frameworks, and manage the entire testing lifecycle from test case creation to defect tracking and reporting.
Collaboration with development and product management teams is essential to ensure software releases meet quality standards and are delivered on time. You will establish and monitor key quality metrics and performance indicators to gain insights into the quality of Databricks' products. Working with cross-functional teams, including development, data engineering, and data science, will be necessary to embed quality into all stages of product development. You will be responsible for ensuring that Databricks' products comply with security and compliance standards, conducting security testing as needed, and overseeing the identification and resolution of software defects by collaborating closely with development teams.
Maintaining detailed documentation of testing processes, test cases, and results will also be part of your role.
Qualifications:
- Bachelor's or Master's degree in computer science, software engineering, or a related field.
- Extensive experience in quality engineering, software testing, and quality assurance, with a proven track record of leadership and management.
- Strong knowledge of software testing methodologies, test automation tools, and quality assurance best practices.
- Experience with big data technologies, data processing, and analytics platforms relevant to Databricks.
- Strong leadership and communication skills to collaborate effectively with diverse teams and stakeholders.
- Familiarity with Databricks' platform and related technologies is often preferred.
- Experience with cloud platforms like AWS and Azure may be beneficial.
- Knowledge of data governance, security, and compliance standards.
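The defect tracking and ETL testing this posting describes often starts with a simple source-vs-target reconciliation check. A minimal illustrative sketch in plain Python (not a Databricks API; the key name "id" is an assumption):

```python
# Hypothetical sketch of a source-vs-target reconciliation check,
# the kind of assertion an ETL test suite runs after each load.

def reconcile(source_rows, target_rows, key="id"):
    """Compare source and target by key and report missing/extra records."""
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src - tgt),  # rows the load dropped
        "extra_in_target": sorted(tgt - src),    # rows the load duplicated/invented
    }

src = [{"id": 1}, {"id": 2}, {"id": 3}]
tgt = [{"id": 1}, {"id": 3}, {"id": 4}]
print(reconcile(src, tgt))  # → {'missing_in_target': [2], 'extra_in_target': [4]}
```

In a real pipeline the same set comparison would be expressed as anti-joins over the source and target tables.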
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines on the Microsoft Azure cloud platform. Your primary focus will be on utilizing technologies such as Azure Data Factory, Azure Synapse Analytics, PySpark, and Python to handle complex data processing tasks efficiently.
Your key responsibilities will include designing and implementing data pipelines using Azure Data Factory or other orchestration tools, writing SQL queries for ETL processes, and collaborating with data analysts to meet data requirements and ensure data quality. You will also need to implement data governance practices for security and compliance, monitor and optimize data pipelines for performance, and develop unit tests for code.
Working in an Agile environment, you will be part of a team that develops Modern Data Warehouse solutions using the Azure stack, coding in Spark (Scala or Python) and T-SQL. Proficiency in source code control systems like Git, designing solutions with Azure data services, and managing team governance are essential aspects of this role. Additionally, you will provide technical leadership, guidance, and support to team members, resolve blockers, and report progress to customers regularly.
Preferred skills and experience for this role include a good understanding of PySpark and Python, proficiency in Azure data engineering tools (Azure Data Factory, Databricks, Synapse Analytics), experience in handling large datasets, exposure to DevOps basics, and knowledge of Release Engineering fundamentals.
Posted 4 days ago
5.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are an experienced Azure Databricks Engineer who will be responsible for designing, developing, and maintaining scalable data pipelines and supporting data infrastructure in an Azure cloud environment. Your key responsibilities will include designing ETL pipelines using Azure Databricks, building robust data architectures on Azure, collaborating with stakeholders to define data requirements, optimizing data pipelines for performance and reliability, implementing data transformations and cleansing processes, managing Databricks clusters, and leveraging Azure services for data orchestration and storage. You must possess 5-10 years of experience in data engineering or a related field with extensive hands-on experience in Azure Databricks and Apache Spark. Strong knowledge of Azure cloud services such as Azure Data Lake, Data Factory, Azure SQL, and Azure Synapse Analytics is required. Experience with Python, Scala, or SQL for data manipulation, ETL frameworks, Delta Lake, Parquet formats, Azure DevOps, CI/CD pipelines, big data architecture, and distributed systems is essential. Knowledge of data modeling, performance tuning, and optimization of big data solutions is expected, along with problem-solving skills and the ability to work in a collaborative environment. Preferred qualifications include experience with real-time data streaming tools, Azure certifications, machine learning frameworks, integration with Databricks, and data visualization tools like Power BI. A bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field is required for this role.
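The Delta Lake experience this posting asks for centers on upsert semantics. The pattern behind Delta's MERGE can be sketched in plain Python (illustrative only; on Databricks this would be a `MERGE INTO` statement or `DeltaTable.merge`, not hand-rolled code):

```python
def merge_upsert(target, updates, key="id"):
    """Delta-style MERGE semantics: update rows whose key matches,
    insert rows whose key is new, leave everything else untouched."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # matched -> update, unmatched -> insert
    return sorted(merged.values(), key=lambda r: r[key])


if __name__ == "__main__":
    tgt = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
    upd = [{"id": 2, "v": "B"}, {"id": 3, "v": "c"}]
    print(merge_upsert(tgt, upd))
```

The real engine does this transactionally over Parquet files with a commit log; the sketch only shows the matched/unmatched logic an interviewer would expect a candidate to explain.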
Posted 4 days ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Job Title: ETL Lead Job Type: Full-time Location: Hyderabad / Work from office Experience: 8+ Years No. of positions: 1 Job Summary: We are looking for an experienced ETL Lead to manage and deliver enterprise-grade data integration solutions using Azure Data Factory (ADF), SSIS, SQL Querying, Azure SQL, Azure Data Lake, and preferably Azure Databricks. The role includes leading a team, building scalable ETL pipelines, and ensuring data quality and performance through efficient CI/CD practices. Key Responsibilities: Lead a team of engineers and manage ETL project lifecycles. Design, develop, and optimize ETL workflows using ADF and SSIS. Write complex SQL queries and perform performance tuning. Integrate data from varied sources into Azure SQL and Data Lake. Implement CI/CD pipelines for automated deployment and testing. Collaborate with stakeholders to translate business needs into technical solutions. Maintain documentation and enforce best practices. Requirements: 8+ years in ETL development and data integration. Strong expertise in ADF, SSIS, SQL Querying, Azure SQL, Azure Data Lake. Experience with CI/CD tools (e.g., Azure DevOps, Git). Exposure to Azure Databricks is a plus. Solid understanding of data warehousing and data modelling.
Posted 5 days ago
4.0 - 8.0 years
5 - 9 Lacs
Gurugram
Work from Office
4 Years+ experience in Azure (Must Have) - Excellent communication skills required. Graduate Budget - Up to 10 LPA (Fixed) 5 Days / US Shifts / Cabs (Immediate Joiners or Max Notice Period - 30 Days) Please Call - 9999869475 Required Candidate profile Candidate should be excellent with VBA, SQL, Azure Databricks
Posted 5 days ago
4.0 - 8.0 years
5 - 9 Lacs
Gurugram
Work from Office
4 Years+ experience in Data Analytics (Azure Databricks - Must Have) Graduate Budget - Up to 10 LPA (Fixed) 5 Days / US Shifts / Cabs (Immediate Joiners or Max Notice Period - 30 Days) Please Call - 9999869475 Required Candidate profile Candidate should be excellent with VBA, SQL, Azure Databricks Excellent communication skills required
Posted 5 days ago
3.0 - 8.0 years
5 - 9 Lacs
Gurugram
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Analyze business requirements & functional specifications Be able to determine the impact of changes in current functionality of the system Interaction with diverse Business Partners and Technical Workgroups Be flexible to collaborate with onshore business, during US business hours Be flexible to support project releases, during US business hours Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Undergraduate degree or equivalent experience 3+ years of working experience in Python, PySpark, Scala 3+ years of experience working on MS SQL Server and NoSQL DBs like Cassandra, etc.
Hands-on working experience in Azure Databricks Solid healthcare domain knowledge Exposure to following DevOps methodology and creating CI/CD deployment pipeline Exposure to following Agile methodology specifically using tools like Rally Ability to understand the existing application codebase, perform impact analysis and update the code when required based on the business logic or for optimization Proven excellent analytical and communication skills (both verbal and written) Preferred Qualification: Experience in the Streaming application (Kafka, Spark Streaming, etc.) At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #Gen #NJP
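The streaming experience listed as preferred usually comes down to windowed aggregation over an event stream. A minimal tumbling-window sketch in plain Python (the event shape and window size are assumptions for illustration; in Spark Structured Streaming this would be `groupBy(window(...), key).count()`):

```python
from collections import defaultdict


def tumbling_window_counts(events, window_secs=60):
    """Bucket (timestamp, key) events into fixed non-overlapping windows
    and count occurrences per key, mimicking a tumbling-window aggregation."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)


if __name__ == "__main__":
    events = [(5, "click"), (30, "click"), (65, "click"), (70, "view")]
    print(tumbling_window_counts(events))
```

A real Kafka/Spark pipeline adds watermarking for late data and fault-tolerant state, but the window-assignment arithmetic is the same.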
Posted 5 days ago
4.0 - 7.0 years
9 - 13 Lacs
Coimbatore
Work from Office
Project Role : Software Development Lead Project Role Description : Develop and configure software systems either end-to-end or for a specific stage of product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity. Must have skills : Microsoft Azure Databricks Good to have skills : NA. Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support project goals and client needs. You will engage in problem-solving activities, guiding your team through challenges while ensuring that the software development process aligns with best practices and client expectations. Your role will also include mentoring team members and fostering a collaborative environment to drive innovation and efficiency in software development. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure adherence to timelines and quality standards. Professional & Technical Skills: - Must Have Skills: Proficiency in Microsoft Azure Databricks.- Strong understanding of cloud computing principles and practices.- Experience with data engineering and ETL processes.- Familiarity with programming languages such as Python or Scala.- Ability to design and implement scalable data solutions.
Additional Information:- The candidate should have minimum 7.5 years of experience in Microsoft Azure Databricks.- This position is based in Coimbatore.- A 15 years full time education is required. Qualification 15 years full time education
Posted 5 days ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing application features, and ensuring that the applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality solutions that meet the needs of the organization and its stakeholders. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Assist in the documentation of application processes and workflows.- Engage in continuous learning to stay updated with the latest technologies and best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services.- Strong understanding of data integration techniques and ETL processes.- Experience with cloud-based data storage solutions and data management.- Familiarity with programming languages such as Python or Scala.- Ability to work with data visualization tools to present insights effectively.
Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Hyderabad office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 5 days ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving discussions, contribute to the overall project strategy, and adapt to evolving requirements while maintaining a focus on delivering high-quality applications that align with business objectives. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Assist in the documentation of application specifications and requirements.- Collaborate with cross-functional teams to ensure seamless integration of applications. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services.- Experience with data integration and ETL processes.- Strong understanding of application development methodologies.- Familiarity with cloud computing concepts and services.- Ability to troubleshoot and optimize application performance. Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Pune office.- A 15 years full time education is required.
Qualification 15 years full time education
Posted 5 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user needs and expectations. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Assist in the documentation of application specifications and user guides.- Collaborate with stakeholders to gather and analyze requirements for application development. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services.- Strong understanding of data integration techniques and ETL processes.- Experience with cloud-based data storage solutions and data warehousing.- Familiarity with programming languages such as Python or SQL for data manipulation.- Knowledge of application development methodologies and best practices.
Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 5 days ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development. Roles & Responsibilities:- Expected to be an SME, collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills: - At least 3+ years of overall experience. - Experience in building data solutions using Azure Databricks and PySpark. - Experience in building complex SQL queries and understanding of data warehousing concepts. - Experience with Azure DevOps CI/CD pipelines for automated deployment and release management. Good to have: - Experience with Snowflake data warehousing. - Experience building pipelines and Data Flows using Azure Data Factory. - Excellent problem-solving and analytical skills. - Ability to work independently as well as collaboratively in a team environment. - Strong communication and interpersonal skills. Additional Information:- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Hyderabad office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 5 days ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Microsoft Azure Databricks, Apache Spark, Microsoft Azure Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Assist in the documentation of application processes and workflows.- Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in Microsoft Azure Databricks, Apache Spark, Microsoft Azure Data Services.- Strong understanding of data integration techniques and ETL processes.- Experience with application development frameworks and methodologies.- Familiarity with cloud computing concepts and services.- Ability to troubleshoot and optimize application performance. Additional Information:- The candidate should have minimum 3 years of experience in Microsoft Azure Databricks.- This position is based in Pune.- A 15 years full time education is required. Qualification 15 years full time education
Posted 5 days ago
2.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services Good to have skills : NA. Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to gather requirements, developing application features, and ensuring that the applications are aligned with business objectives. You will also engage in problem-solving activities, providing innovative solutions to enhance application performance and user experience, while maintaining a focus on quality and efficiency throughout the development process. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Data Services, Microsoft Azure Databricks.- Good To Have Skills: Experience with data integration tools and ETL processes.- Strong understanding of cloud computing concepts and architecture.- Experience in application development using programming languages such as Python or Scala.- Familiarity with Agile methodologies and project management tools.
Additional Information:- The candidate should have minimum 7.5 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Hyderabad office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 5 days ago