Jobs
Interviews

7 Azure Datalake Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Databricks Engineer (Lead), you will design and develop ETL pipelines using Azure Data Factory for data ingestion and transformation. You will work with Azure stack modules such as Data Lake Storage and SQL Data Warehouse to build robust data solutions, and write efficient SQL, Python, and PySpark code for data processing and transformation. The role also involves translating business requirements into technical designs, developing mapping documents, and applying transformation rules as defined in the project scope. Clear communication with stakeholders is essential for smooth project execution.

To excel in this position, you should have 7-10 years of experience in data ingestion, data processing, and analytical pipelines involving big data and relational databases. Hands-on experience with Azure services such as Azure Data Lake Storage, Azure Databricks, Azure Data Factory, Azure Synapse Analytics, and Azure SQL Database is required, along with proficiency in SQL, Python, and PySpark for data manipulation. Familiarity with DevOps practices and CI/CD deployments is a plus. Strong communication skills and attention to detail, especially under pressure, are highly valued, and prior experience in the insurance or financial industry is preferred.

This role is based in Hyderabad and requires working from the office. If you are passionate about using Databricks, PySpark, SQL, and other Azure technologies to build innovative data solutions, this position offers an opportunity to lead and contribute to impactful projects.
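To make the transformation-rule work concrete, here is a minimal, hypothetical sketch in plain Python of the kind of mapping-document rule such a pipeline applies. The column names and lookup values are invented for illustration; a production pipeline would implement this in PySpark or ADF data flows.

```python
# Minimal sketch of a mapping-document transformation rule
# (hypothetical columns; a real pipeline would use PySpark DataFrames).

def apply_mapping_rule(row: dict) -> dict:
    """Trim strings, cast the amount to float, and derive a region code."""
    region_codes = {"Hyderabad": "IN-TG", "Pune": "IN-MH"}  # assumed lookup
    city = row.get("city", "").strip()
    return {
        "city": city,
        "amount": float(row.get("amount", "0") or 0),
        "region_code": region_codes.get(city, "UNKNOWN"),
    }

cleaned = apply_mapping_rule({"city": " Hyderabad ", "amount": "125.50"})
```

A mapping document would typically specify one such rule per target column, which is why prototyping them as small, testable functions before porting to PySpark can be useful.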

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Kolkata, West Bengal

On-site

You must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL; working knowledge of Azure DevOps and Git flow is an added advantage. Alternatively, working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift is acceptable. Demonstrable expertise in working with time-series data is essential, experience delivering data engineering/data science projects in Industry 4.0 is an added advantage, and knowledge of Palantir is required.

You must have strong problem-solving skills with a focus on sustainable and reusable development. Proficiency in statistical computing languages such as Python/PySpark with Pandas, NumPy, and seaborn/matplotlib is necessary; knowledge of Streamlit is a plus. Familiarity with Scala, Go, Java, and big data tools such as Hadoop, Spark, and Kafka is beneficial. Experience with relational databases (Microsoft SQL Server, MySQL, PostgreSQL, Oracle) and NoSQL databases (Hadoop, Cassandra, MongoDB) is expected, as is proficiency with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow. Experience building and optimizing big data pipelines, architectures, and data sets is crucial, along with strong analytical skills for working with unstructured datasets.

You will be expected to provide innovative solutions to data engineering problems, document technology choices and integration patterns, apply best practices for project delivery with clean code, and show initiative and proactiveness in meeting project requirements.

Reporting to: Director, Intelligent Insights and Data Strategy
Travel: Must be willing to be deployed at client locations worldwide for long and short terms, and be flexible for shorter durations within India and abroad.
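As a small, hypothetical illustration of the time-series expertise this listing asks for, the sketch below forward-fills gaps in an hourly sensor series in plain Python. Real Industry 4.0 projects would typically use Pandas resampling or Spark window functions for the same cleanup step.

```python
# Sketch: forward-fill gaps in an hourly time series (hypothetical sensor
# data), a common cleanup step before analytics on equipment telemetry.
from datetime import datetime, timedelta

def forward_fill(readings: dict, start: datetime, hours: int) -> list:
    """Return an hourly (timestamp, value) series, carrying the last
    observed value across missing hours."""
    out, last = [], None
    for i in range(hours):
        ts = start + timedelta(hours=i)
        last = readings.get(ts, last)  # keep previous value on a gap
        out.append((ts, last))
    return out

t0 = datetime(2024, 1, 1)
series = forward_fill({t0: 1.0, t0 + timedelta(hours=2): 3.0}, t0, 4)
```

The same logic maps directly onto a `last(..., ignorenulls=True)` over a time window in Spark SQL, which is how it would usually be expressed at scale.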

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

As a Data Engineer with 6 to 8 years of experience, you will design, develop, and maintain data pipelines and ETL/ELT processes using big data technologies, working extensively with Azure Data Services such as Azure Data Factory, Azure Synapse, Azure Data Lake, and Databricks. Strong knowledge of big data ecosystems such as Hadoop and Spark is expected, along with hands-on experience with ADF, Azure Data Lake, Synapse, and Databricks. Proficiency in SQL and in Python or Scala for data manipulation and pipeline development is crucial.

Experience with data modeling, data warehousing, and batch/stream processing is required to ensure the quality, integrity, and reliability of data across multiple sources and formats. You will handle large-scale data processing using distributed computing tools, and optimize the performance of data systems while ensuring security and compliance in the cloud environment.

Collaboration with data scientists, analysts, and business stakeholders is essential to deliver scalable data solutions, so an understanding of CI/CD, version control (Git), and Agile methodologies will be beneficial. If you have a passion for working with data and enjoy solving complex data engineering challenges, this role offers an exciting opportunity to contribute to the development of innovative data solutions.
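To give a concrete, purely illustrative sense of the data-quality work described above, the sketch below computes null rates for required fields over a small batch. The field names are assumptions, and at scale this kind of gate would run inside Spark or a framework such as Great Expectations rather than plain Python.

```python
# Sketch: a basic data-quality gate of the kind run before loading a batch.
# Field names are illustrative assumptions.

def quality_report(rows: list, required: list) -> dict:
    """Count missing values per required field and report null rates."""
    total = len(rows)
    missing = {
        f: sum(1 for r in rows if r.get(f) in (None, "")) for f in required
    }
    return {
        "rows": total,
        "null_rate": {f: (missing[f] / total if total else 0.0)
                      for f in required},
    }

batch = [{"id": 1, "name": "a"}, {"id": 2, "name": ""}]
report = quality_report(batch, ["id", "name"])
```

A pipeline would typically fail or quarantine the batch when a null rate crosses an agreed threshold, which keeps bad source data from propagating downstream.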

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Maharashtra

On-site

As an experienced professional with 12-14 years of experience, you will develop detailed project plans covering tasks, timelines, milestones, and dependencies. You will be responsible for solutions architecture design and implementation, understanding source systems, and outlining the ADF structure, with particular expertise in designing and scheduling packages using ADF.

Facilitating collaboration and communication within the team is essential for a smooth workflow. You will also focus on application performance optimization and monitor resource allocation to ensure tasks are adequately staffed. You will create detailed technical specifications, business requirements, and unit test report documents; ensure the project complies with best practices, coding standards, and technical requirements; and collaborate with technical leads to address technical issues and mitigate risks.

Your primary skill set should be Data Architecture, with additional expertise in Data Modeling, ETL, Azure Log Analytics, Analytics Architecture, BI & Visualization Architecture, Data Engineering, Costing Management, Databricks, Datadog, Apache Spark, Azure Data Lake, and Azure Data Factory. Your proficiency in these areas will be instrumental in successfully executing your responsibilities.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer with 3-5 years of experience in BI/data engineering, you will design, develop, and maintain Azure Analysis Services and Power BI solutions. Based in Bangalore, you will collaborate with cross-functional teams to gather and analyze business requirements and implement data modeling, ETL processes, and data visualization using Azure-based tools and technologies. You will optimize and enhance existing data pipelines and reporting dashboards for performance and scalability, troubleshoot and resolve data-related issues, and provide technical guidance to stakeholders. Staying current with the latest Azure and Power BI developments and best practices is essential to delivering efficient and effective solutions.

You must have proven experience as a Data Engineer with a focus on Azure Analysis Services and Power BI. Proficiency in SQL, data modeling, ETL, and data visualization is required, along with a strong understanding of Azure Data Factory, Azure SQL Database, and other Azure data services. Experience with DAX, M, and Power Query for data manipulation and transformation is necessary. Strong problem-solving skills, the ability to work in a fast-paced environment, and Microsoft Azure certifications (e.g., DP-200, DP-201) are advantageous.

You should also have strong knowledge of Azure Databricks, Azure Data Lake, and Azure Analysis Services, plus Power BI expertise covering data modeling, Tabular Model design, writing complex DAX formulas, and DAX optimization. Familiarity with DWBI and SQL, experience implementing on-premise/cloud BI solutions, and knowledge of Power Automate are beneficial.

If you join UST, a global digital transformation solutions provider, you will work alongside the world's best companies to drive real impact through transformation. With over 30,000 employees in 30 countries, UST is committed to partnering with clients from design to operation, embedding innovation and agility into organizations to create boundless impact and touch billions of lives.
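As a hedged illustration of what a simple DAX measure conceptually computes, the Python sketch below reproduces a filtered SUM over invented sales rows. A real solution would of course express this as a DAX measure against a Tabular model; the table, column, and filter values here are made up for the example.

```python
# Conceptual sketch: what a simple DAX measure such as
#   South Sales = CALCULATE(SUM(Sales[Amount]), Sales[Region] = "South")
# evaluates to over a tabular model, expressed in plain Python.

def filtered_sum(rows: list, amount_col: str, **filters) -> float:
    """Sum amount_col over rows matching every column=value filter."""
    return sum(
        r[amount_col]
        for r in rows
        if all(r.get(col) == val for col, val in filters.items())
    )

sales = [
    {"Region": "South", "Amount": 100},
    {"Region": "North", "Amount": 50},
    {"Region": "South", "Amount": 25},
]
south_total = filtered_sum(sales, "Amount", Region="South")
```

Thinking of a measure as "aggregate after applying the filter context" is also the intuition behind DAX optimization work such as minimizing context transitions.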

Posted 3 weeks ago

Apply

12.0 - 20.0 years

12 - 20 Lacs

Kolkata, West Bengal, India

On-site

We are seeking a highly skilled and client-focused Azure Data Engineer to join our team. The ideal candidate will have strong experience with Azure Data Factory and Azure Data Lake, focusing on data extraction, particularly from D365. This role requires an individual who can design, build, and maintain data pipelines within Azure, ensuring efficient data ingestion and basic transformations. Excellent communication skills are essential for this client-facing role.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines using Azure Data Factory (ADF) for data ingestion and transformation.
- Work extensively with Azure Data Lake for data storage and management.
- Perform data extraction from D365 (Dynamics 365) environments.
- Configure and manage Linked Services within Azure Data Factory to connect to various data sources.
- Create, schedule, and monitor data pipelines within Azure Data Factory to ensure timely and accurate data delivery.
- Use SQL for light data transformations, particularly for data ingested from Excel sheets.
- Collaborate directly with clients to understand data requirements, provide technical insights, and ensure solutions meet business needs.
- Apply knowledge of D365 techno-functional aspects, especially in relation to Sustainability and Power Apps, to understand data contexts and requirements.
- Ensure data quality, integrity, and security across all implemented solutions.
- Communicate technical concepts clearly and effectively to both technical and non-technical stakeholders.

Required Skills & Experience:
- Relevant experience: 5+ years in Azure data engineering roles.
- Strong proficiency in Azure Data Factory (ADF).
- Extensive experience with Azure Data Lake.
- Proven experience with data extraction from D365.
- Familiarity with Linked Services and data ingestion from various sources within Azure.
- Experience creating and scheduling data pipelines in Azure Data Factory.
- Proficiency in SQL for data querying and basic transformations.

Secondary Skills:
- Knowledge of D365 techno-functional aspects, particularly related to Sustainability.
- Understanding of Power Apps.

Communication: Excellent communication skills are mandatory for this client-facing role.
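As an illustrative sketch of the incremental-extraction pattern such pipelines commonly use (a high watermark on a modified-date column, as in ADF's incremental-copy templates), the Python below filters rows newer than the last recorded watermark. The column name `modifiedon` mirrors Dynamics 365 conventions but is an assumption here, as is the row shape.

```python
# Sketch: high-watermark incremental extraction. In ADF this logic lives in
# a lookup + copy activity pair; here it is shown as plain Python for clarity.
from datetime import datetime

def select_incremental(rows: list, watermark: datetime):
    """Return rows modified after the watermark, and the new watermark."""
    fresh = [r for r in rows if r["modifiedon"] > watermark]
    new_wm = max((r["modifiedon"] for r in fresh), default=watermark)
    return fresh, new_wm

rows = [
    {"id": 1, "modifiedon": datetime(2024, 1, 1)},
    {"id": 2, "modifiedon": datetime(2024, 1, 5)},
]
fresh, wm = select_incremental(rows, datetime(2024, 1, 2))
```

Persisting the returned watermark (ADF templates typically use a control table) is what lets each run pick up only the delta rather than re-extracting the full D365 entity.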

Posted 1 month ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Pune

Work from Office

The IT Technical Analyst - Senior provides technical expertise in applications or infrastructure, delivering detailed solution designs that translate business requirements into scalable IT solutions. This role supports the design, development, implementation, and maintenance of IT systems and infrastructure in alignment with Cummins architectural standards and security practices. The position involves technical analysis, design review, solution configuration, and cross-functional collaboration to deliver optimized and reliable solutions.

Key Responsibilities:
- Develop and manage technical specifications to guide application, infrastructure, or solution development.
- Analyze and evaluate solution options, including commercial off-the-shelf products versus custom-built solutions.
- Deliver and document detailed solution designs that meet business, performance, security, scalability, and maintainability requirements.
- Collaborate with infrastructure teams and service providers to ensure solution delivery meets specifications and quality standards.
- Participate in design and code reviews to ensure adherence to standards and requirements.
- Drive reuse of components and efficiency in build and deployment processes using automation and CI/CD pipelines.
- Support the creation of runbooks and documentation for end-user support and knowledge transfer.
- Assist in testing strategy and execution of test plans for solution validation.
- Provide advanced (Level 3) support for critical technical issues and participate in system remediation.
- Analyze existing systems for improvements, optimizations, or compliance needs.

Qualifications:
- College, university, or equivalent degree in Computer Science, Information Technology, Business, or a related field is required.
- Relevant industry certifications are a plus.
- May require licensing for compliance with export controls or sanctions regulations.

Experience:
- 8-10 years of relevant technical experience in IT solution development and infrastructure design.
- Intermediate level of experience with a demonstrated ability to design, implement, and support enterprise-level solutions.

Skills and Experience:
- Strong proficiency in: SQL, DAX, M Query, Python; SSIS, SSAS, Azure Data Lake, Databricks; Power BI service development and optimization; CI/CD pipelines (GitHub); Agile tools (Jira, Confluence).
- Familiarity with: C#, R, TensorFlow, PyTorch, NumPy; Azure Analysis Services (AAS).
- Experience in: data pipeline design and optimization; AI/ML model integration and LLM training within Databricks; data governance and data quality metrics; cybersecurity and secure solution design; technical documentation and stakeholder communication; advanced data modeling in SSAS, Azure, and Databricks.

Competencies:
- Customer Focus - builds strong relationships and delivers customer-centric solutions.
- Global Perspective - approaches issues with a global lens.
- Manages Complexity - resolves complex and sometimes contradictory data-driven problems.
- Manages Conflict - handles challenging interpersonal situations effectively.
- Optimizes Work Processes - continuously improves efficiency and effectiveness of processes.
- Solution Configuration & Design - creates, configures, and validates COTS and custom-built solutions using industry-standard tools and practices.
- Performance Tuning - optimizes systems and applications for maximum performance and scalability.
- Solution Validation Testing - ensures solutions meet business and technical requirements through structured testing.
- Values Differences - recognizes and leverages diverse perspectives and backgrounds.
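Purely as an illustration of the kind of resilience logic that recurs in Level 3 support of data pipelines, here is a minimal retry-with-backoff helper in Python. It is a generic sketch, not part of the role's described toolchain; production systems would normally rely on the platform's built-in retry policies.

```python
# Sketch: retry a flaky operation with exponential backoff, the pattern
# behind most activity-level retry policies in orchestration tools.
import time

def retry(fn, attempts: int = 3, base_delay: float = 0.0):
    """Call fn until it succeeds or attempts are exhausted."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * (2 ** i))  # exponential backoff

calls = {"n": 0}

def flaky():
    """Simulated transient failure: errors twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

result = retry(flaky, attempts=3)
```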

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies