
99 Data Factory Jobs

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

karnataka

On-site

As an experienced AI Analytics Engineer with expertise in Azure DevOps, you will design, implement, and optimize data pipelines, machine learning models, and analytics solutions, bridging the gap between data science, engineering, and DevOps practices to deliver scalable, production-ready AI/ML solutions.

Key Responsibilities:
- Design, develop, and deploy AI/ML models and analytics workflows.
- Build and manage end-to-end CI/CD pipelines in Azure DevOps for data and ML projects.
- Automate data ingestion, preprocessing, model training, testing, and deployment.
- Monitor model performance and implement retraining pipelines.
- Work closely with data scientists, data engineers, and business stakeholders to translate requirements into scalable solutions.
- Ensure solutions are secure, cost-optimized, and highly available on Azure.
- Perform root cause analysis and drive continuous improvement for production issues.

Qualifications Required:
- Hands-on experience with Azure DevOps (pipelines, repos, artifacts, boards).
- Strong programming skills in Python or R for AI/ML development.
- Experience with Azure Machine Learning, Databricks, Synapse, and Data Factory.
- Good understanding of MLOps principles and tools.
- Strong knowledge of data visualization and analytics (Power BI a plus).
- Familiarity with containerization (Docker, Kubernetes) for deploying ML workloads.
- Experience with version control (Git) and agile development practices.

Soft Skills:
- Excellent communication and collaboration skills.
- Ability to translate technical insights into business value.
- Strong analytical thinking and attention to detail.
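The retraining-pipeline responsibility described above can be sketched as a simple monitoring rule in Python. This is a hedged illustration only: the metric, threshold, and window size are hypothetical, not a specific Azure ML API.

```python
# Sketch of a model-performance monitor: if the rolling accuracy of
# recent scored batches drops below a threshold, flag the model for
# retraining (in Azure DevOps, this flag would trigger a pipeline run).

def should_retrain(recent_accuracies, threshold=0.85, window=3):
    """Retrain when the mean accuracy over the last `window` batches
    falls below `threshold`."""
    if len(recent_accuracies) < window:
        return False  # not enough evidence yet
    recent = recent_accuracies[-window:]
    return sum(recent) / window < threshold

# Hypothetical accuracy history for recent batches: a slow drift down.
history = [0.91, 0.90, 0.88, 0.84, 0.81, 0.79]
flag = should_retrain(history)  # mean of last 3 is ~0.81, below 0.85
```

In a real pipeline the same rule would run on logged metrics and raise a work item or queue a training run rather than return a boolean.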

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

bhopal, madhya pradesh

On-site

You will design and build ETL pipelines in Azure Databricks (PySpark, Delta Lake) to load, clean, and deliver data across Bronze, Silver, and Gold layers. The role also involves implementing a Data Lakehouse architecture on Azure Data Lake Gen2 with partitioning, schema management, and performance optimization, and developing data models (dimensional/star schema) for reporting in Synapse and Power BI. You will integrate Databricks with Azure services such as ADF, Key Vault, Event Hub, Synapse Analytics, Purview, and Logic Apps, and build and manage CI/CD pipelines in Azure DevOps (YAML, Git repos, pipelines). You will be expected to optimize performance through cluster tuning, caching, Z-ordering, Delta optimization, and job parallelization. Ensuring data security and compliance (row-level security, PII masking, GDPR/HIPAA, audit logging) and collaborating with data architects and analysts to translate business needs into technical solutions are vital aspects of this role.

Required Skills:
- Strong experience in Azure Databricks (Python, PySpark, SQL).
- Proficiency with Delta Lake (ACID transactions, schema evolution, incremental loads).
- Hands-on experience with the Azure ecosystem: Data Factory, ADLS Gen2, Key Vault, Event Hub, Synapse.
- Knowledge of data governance and lineage tools (Purview; Unity Catalog is a plus).
- Strong understanding of data warehouse design and star schema.
- Azure DevOps (YAML, Git repos, pipelines) experience.
- Good debugging skills for performance tuning and schema drift issues.

**Good to Have:**
- Experience with healthcare or financial data.
- Familiarity with FHIR, OMOP, OpenEHR (for healthcare projects).
- Exposure to AI/ML integration using Databricks ML runtime.
- Experience with Unity Catalog for governance across workspaces.

If you are ready to take the lead in building scalable data solutions with Azure Databricks, this full-time position in Bhopal, MP, India awaits you!
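The incremental-load pattern named above (Delta Lake upserts into a Silver table) can be sketched in plain Python to show the MERGE semantics without a Spark cluster. This is a minimal sketch: the table shape and column names are invented, and in Databricks the same operation would be a `MERGE INTO` statement or `DeltaTable.merge`.

```python
# Sketch of Delta-Lake-style MERGE (upsert) semantics in plain Python.
# Dicts keyed by the business key stand in for the Silver Delta table.

def merge_upsert(silver, updates, key="id"):
    """Apply MERGE semantics: update matching rows, insert new ones."""
    merged = {row[key]: row for row in silver}            # existing Silver rows
    for row in updates:                                   # incoming Bronze batch
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

silver = [{"id": 1, "name": "Asha", "city": "Bhopal"},
          {"id": 2, "name": "Ravi", "city": "Indore"}]
updates = [{"id": 2, "city": "Pune"},                     # updates existing row 2
           {"id": 3, "name": "Meena", "city": "Nagpur"}]  # inserts new row 3

result = merge_upsert(silver, updates)
```

The key design point the sketch captures is that an update only overwrites the columns present in the incoming batch; untouched columns on the matched row survive.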

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Role Overview: In this role at Kimberly-Clark, you will play a pivotal part in designing and building analytics solutions that facilitate informed decision-making. Your primary responsibility will be to design, develop, and maintain analytics solutions in close collaboration with various R&D and DTS technology teams. By leveraging your professional expertise and talent, you will help deliver better care for billions of people worldwide.

Key Responsibilities:
- Collaborate with engineering and architecture teams to identify, collect, and harmonize data from various sources.
- Design and develop ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) pipelines using technologies such as SQL Server, Azure Data Factory, and Databricks.
- Develop and maintain data models and data warehouses using platforms like SQL Server, Azure Data Factory, Snowflake, and Databricks.
- Implement data quality checks and validation frameworks to maintain high data standards.
- Build interpretable, scalable models that meet business needs and develop visualizations to present results to stakeholders using Microsoft Azure technologies.
- Work with key stakeholders to translate requirements into technical designs, and mentor junior engineers on data engineering techniques and best practices.
- Use Agile methodologies and tools to deliver products in a fast-paced environment, and collaborate with platform teams to automate pipeline construction, testing, and code migration.

Qualifications Required:

Mandatory:
- Bachelor's degree required.
- Fluency in English.
- 5+ years of experience in data engineering and in designing, developing, and building solutions on platforms like SQL, API, Power Query, Microsoft Power BI, Data Factory, and Databricks.
- Strong knowledge of data collection, analysis, and cleaning to build reports for stakeholders.

Nice to have:
- Experience with Python.
- Exposure to SQL Server, Databricks, HANA, Snowflake, and Teradata.
- Proven track record in building and maintaining data pipelines, data warehousing, and data modeling.
- Strong communication skills, both oral and written, with the ability to convey complex technical concepts to non-technical stakeholders.
- Ability to work independently and collaboratively in a team environment.
- Familiarity with DevOps tools and practices for continuous integration and deployment.

Additional Company Details: Kimberly-Clark is renowned for iconic brands like Huggies, Kleenex, Cottonelle, Scott, Kotex, Poise, Depend, and Kimberly-Clark Professional. With over 150 years of market leadership, the company is committed to driving innovation, growth, and impact. Kimberly-Clark values sustainability, inclusion, wellbeing, and career development, offering a performance culture fueled by authentic caring. The company encourages flexible work arrangements and prioritizes safety, mutual respect, and human dignity in its operations.
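The data-quality checks and validation frameworks mentioned above can be illustrated with a minimal Python sketch. The rule names, columns, and thresholds here are illustrative assumptions, not Kimberly-Clark's actual framework.

```python
# Minimal data-quality validation sketch: each rule returns a list of
# failure messages for the rows it rejects; an empty list means the
# batch passes and can be promoted downstream.

def check_not_null(rows, column):
    return [f"row {i}: {column} is null"
            for i, row in enumerate(rows) if row.get(column) is None]

def check_in_range(rows, column, lo, hi):
    return [f"row {i}: {column}={row[column]} outside [{lo}, {hi}]"
            for i, row in enumerate(rows)
            if row.get(column) is not None and not lo <= row[column] <= hi]

def validate(rows, rules):
    """Run every rule; collect all failures rather than stopping early."""
    failures = []
    for rule in rules:
        failures.extend(rule(rows))
    return failures

# Hypothetical batch: the second record fails both rules.
batch = [{"sku": "A1", "qty": 5}, {"sku": None, "qty": 250}]
failures = validate(batch, [
    lambda r: check_not_null(r, "sku"),
    lambda r: check_in_range(r, "qty", 0, 100),
])
```

Collecting all failures in one pass, rather than failing fast, is the usual choice in batch pipelines: one report per batch is easier to triage than repeated single-error reruns.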

Posted 4 days ago

Apply

8.0 - 13.0 years

20 - 32 Lacs

pune

Hybrid

Are you passionate about driving technical excellence, quality, and governance across a global enterprise integration landscape? Do you have strong leadership and problem-solving skills, combined with hands-on expertise in Microsoft Azure Integration Services? If so, this role could be the perfect next step in your career. We are looking for an experienced Technical Lead (Azure Integration) to guide and oversee the delivery of integration solutions across multiple vendors and teams. You will play a key role in ensuring adherence to best practices, enforcing technical governance, and resolving complex integration challenges.

Key Responsibilities:
- Provide technical oversight and governance for integration solutions delivered across multiple teams and vendors.
- Review, challenge, and validate vendor designs, estimates, and delivery timelines.
- Enforce coding standards, performance expectations, and cost-optimization guidelines.
- Collaborate with Integration Architects to ensure alignment with the overall integration strategy.
- Lead root cause analysis and resolution of integration issues across Azure services.
- Mentor and guide vendor teams to uphold best practices and quality standards.
- Drive continuous improvement through technical reviews and retrospectives.
- Troubleshoot and support integration flows built on: Azure Logic Apps, Azure API Management, Event Hub/Event Grid, Service Bus, Azure Data Factory, Azure Functions, and Azure Data Gateway.
- Monitor and report on delivery quality, risks, and deviations from standards.
- Reduce operational load on architects by leading detailed technical analyses.
- Foster knowledge sharing and best practices within the integration delivery community.

Key Requirements:
- 8+ years of overall IT experience in software development.
- 5+ years of experience in designing and delivering integrations and APIs using Microsoft Azure Integration Services.
- Strong understanding of cloud design patterns, distributed systems, and enterprise integrations.
- Hands-on experience with the Azure Well-Architected Framework (reliability, performance, cost, security, and operations).
- Proven ability to review and govern vendor deliverables, enforce standards, and identify improvement opportunities.
- Familiarity with Agile methodologies (Scrum/DevOps) and tools such as Azure DevOps, CI/CD pipelines, and automated testing.
- Excellent communication and vendor management skills.
- Experience working with international teams across time zones.
- Fluent in English, written and spoken.

Why Join Us?
- Opportunity to work on a world-class enterprise integration ecosystem.
- A chance to shape and enforce integration standards, governance, and best practices.
- A role that combines hands-on technical leadership with strategic impact.
- Collaborate with global teams and stakeholders, driving outcomes that directly support business growth.

Warm Regards,
Isha Shrivastava
Sr. Consultant
Isha_shrivastava@persolapac.com
https://www.persolapac.com/
Sriram Samanthu Chambers, #3287, 12th Main, Indiranagar, Bangalore 560038, India

By submitting your curriculum vitae or personal data to us in connection with your job application, you are deemed to have read and agreed to the terms of our Privacy Policy, and consented to the collection, use and disclosure of your personal data by us and our affiliates, in accordance with our Privacy Policy. Please visit www.persolindia.com for a copy of our Privacy Policy. If you wish to withdraw your consent, please drop us an email to let us know.

Posted 6 days ago

Apply

1.0 - 6.0 years

0 Lacs

andhra pradesh

On-site

As a Data Engineer working with Microsoft Fabric, you will design, develop, and optimize data pipelines, reporting solutions, and analytics frameworks. The role involves collaborating with stakeholders and technical teams to deliver scalable, secure, and high-performing analytics solutions.

You will work closely with data architects, analysts, and business stakeholders to gather analytics requirements and build data solutions using Microsoft Fabric components such as Data Factory, OneLake, Synapse, and Power BI. Your responsibilities will include developing and optimizing pipelines for ingestion, transformation, and integration, as well as creating and maintaining semantic models and datasets for reporting. Ensuring compliance with best practices for performance, governance, and security of Fabric solutions will also be a key aspect of the role. Additionally, you will support migration projects, conduct proof-of-concepts, create and maintain documentation of ETL processes, data flows, and data mappings, and guide and train client teams on Fabric adoption.

To excel in this role, you should have 4-6 years of experience in data analytics, BI, or cloud platforms, with at least 1 year of hands-on experience in Microsoft Fabric, specifically Data Factory, OneLake, Synapse, and Power BI semantic models and reporting. Strong SQL and data-modeling skills, experience with ETL/ELT and performance tuning, familiarity with Azure and cloud data platforms, and strong communication and client-facing skills are essential. Knowledge of the Azure Data Stack (ADF, Synapse, Databricks), governance, security, compliance, and consulting/IT services experience will be beneficial.

This is a full-time position located in Visakhapatnam, with health insurance and Provident Fund benefits provided. The work location is in person.

Posted 6 days ago

Apply

3.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Manager, Technology specializing in Data, Analytics & AI, you will oversee the health, performance, and growth of a diverse team of Data/BI/ML Engineers and Analysts. Your primary objective will be to create an environment for sustainable high performance by providing clear direction, fostering collaboration, and supporting career development within the team.

Your key responsibilities will include managing, coaching, and developing the team, and collaborating with Tech Leadership, Product, and Business stakeholders to translate requirements into scalable data models, pipelines, semantic layers, and dashboards. You will also shape the technical vision for your team and drive initiatives related to data quality, governance, documentation, and metric definitions. Furthermore, you will enforce engineering and analytics best practices, address complex technical and data issues, monitor project status and risks, and manage project forecast and spending. Your success in this role will be measured by metrics such as team health, delivery predictability, talent growth, operational maturity, stakeholder trust, and adoption and impact.

To excel in this position, you should have at least 8 years of experience in data engineering/BI/analytics, including 3 years of direct team management; a background in ML and data science is advantageous. A degree in Computer Science/Applications, Engineering, Information Technology/Systems, or an equivalent field is required, with a Master's degree preferred. You should have demonstrated proficiency in SQL, ELT/ETL patterns, and dimensional and semantic modeling, as well as experience with tools such as Snowflake, Databricks, Data Factory, and Oracle. Familiarity with programming languages such as Python, Spark, and Scala, and with streaming technologies, is desirable. Strong BI development skills, governance expertise, software engineering practices, stakeholder management abilities, and clear communication skills are essential, and your track record should reflect talent development, collaborative team-building, and the ability to deliver impactful solutions that drive business decisions.

Posted 6 days ago

Apply

7.0 - 10.0 years

15 - 25 Lacs

bengaluru

Work from Office

We are hiring for a Big 4 firm in Bangalore. Immediate joiners or candidates serving notice (preferred), or with a 45- or 60-day notice period.

Job Description:
- Bachelor's or higher degree in Computer Science or a related discipline, or equivalent (minimum 7 years' work experience).
- At least 4+ years of consulting or client service delivery experience on Azure.
- At least 4+ years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases such as SQL Server, and data warehouse solutions such as Azure Synapse.
- Extensive experience providing practical direction on using Azure native services.
- Extensive hands-on experience implementing data ingestion, ETL, and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Azure Logic Apps, Synapse/DW, Azure SQL DB, Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Databricks, Cosmos DB, etc.
- Minimum of 4+ years of hands-on experience in Azure and big data technologies such as Java, Python, SQL, ADLS/Blob, PySpark and Spark SQL, Databricks, HDInsight, and live-streaming technologies such as Event Hub and Azure Stream Analytics.
- Well versed in DevSecOps and CI/CD deployments.
- Cloud migration methodologies and processes, including tools like Azure Data Factory, Data Migration Service, etc.
- Minimum of 4+ years of RDBMS experience.
- Experience using big data file formats and compression techniques.
- Experience with developer tools such as Azure DevOps, Visual Studio Team Services, Git, Jenkins, etc.
- Experience with private and public cloud architectures, their pros/cons, and migration considerations.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Azure Databricks Lead with over 6 years of experience, based in Chennai, you will spearhead data engineering initiatives and provide technical leadership on Azure-based data platform projects. Your expertise in designing and building scalable big data pipelines using Azure Databricks, Spark, and related Azure services will be crucial to the success of these initiatives.

Your key responsibilities will include leading the design, development, and deployment of scalable data pipelines using Azure Databricks and Spark, and collaborating with stakeholders to understand business requirements and translate them into technical specifications. You will architect end-to-end data solutions leveraging Azure services such as Data Lake, Data Factory, Synapse, Key Vault, and Event Hub, and optimize existing data pipelines to ensure adherence to security, performance, and cost best practices. Guiding and mentoring a team of data engineers and developers will also be a significant part of the role, as will implementing data governance, quality checks, and CI/CD practices within the Databricks environment. Close collaboration with cross-functional teams, including data scientists, analysts, and BI developers, will be pivotal to seamless delivery and project success.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for an experienced Senior Data Engineer with expertise in Microsoft Fabric to contribute to our enterprise data modernization and analytics transformation efforts. You should possess a strong understanding of data pipelines, lakehouse architecture, Power BI, and Synapse integration, and the ability to modernize legacy data systems into cloud-native solutions. Your role will be crucial in developing scalable, secure, and high-performing data solutions within the Microsoft ecosystem.

Your responsibilities will include designing and implementing data pipelines using Microsoft Fabric's Data Factory, Synapse Data Engineering, and OneLake components, and building and managing lakehouse architectures using Delta Lake, Parquet, and OneLake within Microsoft Fabric. You will lead projects to modernize legacy ETL/ELT processes into cloud-native data pipelines. Collaboration with Data Architects, BI Developers, and Analysts will be essential to deliver scalable data models for analytics and reporting, and you will optimize Power BI datasets and reports through effective data modeling and DAX practices. Furthermore, you will implement data governance and security controls, incorporating tools like Microsoft Purview, role-based access, and lineage tracking. Working alongside cross-functional teams, you will contribute to cloud migration projects, particularly transitions from on-premises SQL/Oracle/Hadoop platforms to Microsoft Azure and Fabric, and you will evaluate and implement CI/CD practices for data pipelines using Azure DevOps or GitHub Actions.

The ideal candidate holds a Bachelor's/Master's degree in Computer Science, Information Systems, or a related field, with a minimum of 8 years of experience in data engineering. Proficiency in Microsoft Fabric components such as Data Factory, Lakehouse/OneLake, Synapse Data Engineering, and Power BI is crucial. Experience with data modeling, performance tuning in Power BI, modern data architecture patterns, and languages including SQL, PySpark, T-SQL, DAX, and Power Query is required. Familiarity with Azure ecosystem tools and strong experience with CI/CD pipelines are also essential, and knowledge of data security, GDPR, HIPAA, and enterprise data governance is preferred.

Preferred qualifications include Microsoft certifications such as Microsoft Certified: Fabric Analytics Engineer Associate and Azure Data Engineer Associate (DP-203), experience with DataOps and Agile delivery methods, and knowledge of machine learning/AI integration with Fabric. Hands-on experience with notebooks in Microsoft Fabric using Python or Scala would be a plus. Beyond technical skills, the ideal candidate should possess strong analytical and problem-solving abilities, excellent communication and stakeholder management skills, and the ability to lead projects, mentor junior engineers, and collaborate effectively with cross-functional teams.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

andhra pradesh

On-site

The ideal candidate for this position must possess excellent communication skills and should be available to work until 12pm ET to ensure overlapping working hours. Strong, proven work experience with the following Microsoft tech stack is essential:
- C#
- .NET Framework
- SQL
- REST APIs
- Azure resources (Data Factory, App Insights, Cosmos DB, Function Apps, Service Bus, Synapse)

The responsibilities of the role include designing, modifying, developing, writing, and implementing software applications, as well as supporting and/or installing software applications and operating systems. Participation in the testing process through test review, analysis, and certification of software is a key aspect of the role. The ideal candidate should be familiar with a variety of field concepts, practices, and procedures, relying on experience and judgment to plan and accomplish goals while performing a variety of complex tasks. The role also allows a high degree of creativity and latitude in decision-making.

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

bengaluru

Work from Office

Job Description: Azure Data Engineer

Company: CGI
Position: Azure Data Engineer
Experience: 4-8 Years
Location: Bangalore (Yemlur)
Interview Mode: Face-to-Face (6th September 2025, Saturday)
Compensation: Up to 16-17 LPA
Notice Period: Only September joiners considered
Job ID: J0725-0611

Role Overview: We are looking for an experienced Azure Data Engineer with strong expertise in modern data platforms, data integration, and advanced analytics. The ideal candidate must have hands-on experience in Azure Data Factory, Databricks, Python, and SQL, and should be able to design, develop, and optimize scalable data pipelines and solutions in Azure Cloud.

Key Responsibilities:
- Design and build highly scalable data pipelines using Azure Data Factory and Azure Databricks.
- Develop ETL workflows to extract, transform, and load data from multiple sources into Azure data platforms.
- Write optimized Python and SQL scripts for data cleansing, transformation, and reporting.
- Work with structured/unstructured data and implement best practices in data modeling.
- Collaborate with business analysts, architects, and data scientists to deliver high-quality solutions.
- Ensure data quality, security, and governance across the platform.
- Monitor and optimize system performance, troubleshoot production issues, and implement preventive measures.
- Stay updated on the latest Azure services and recommend improvements in architecture and processes.

Mandatory Skills:
- Azure Data Engineering (hands-on experience designing and deploying solutions).
- Azure Data Factory: data pipelines, orchestration, triggers, linked services.
- Azure Databricks: Spark-based data processing, notebooks, Delta Lake.
- Python: advanced scripting for data transformation and automation.
- SQL: strong expertise in writing complex queries, stored procedures, and performance tuning.

Nice to Have (Preferred):
- Experience with Azure Synapse Analytics.
- Knowledge of CI/CD pipelines and DevOps for data workflows.
- Familiarity with cloud security, compliance, and monitoring tools.
- Exposure to Agile methodology and working in distributed teams.

Candidate Requirements:
- Minimum 4+ years of overall experience, with 3+ years in Azure Data Engineering.
- Strong hands-on expertise across all mandatory skills (ADF, Databricks, Python, SQL).
- Must be available for a face-to-face interview on 6th Sept (Saturday) at CGI, Yemlur, Bangalore.
- Must be an immediate joiner (September 2025).
- Excellent communication skills and ability to work independently with minimal supervision.
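The "optimized Python scripts for data cleansing" responsibility above can be illustrated with a small, self-contained sketch. The column names and coercion rules here are hypothetical, chosen only to show the shape of such a script.

```python
# Small data-cleansing sketch: normalize keys, trim whitespace,
# coerce numeric strings, and drop records missing required fields.

def clean_record(raw, required=("customer_id",)):
    """Return a cleaned copy of one record, or None if it fails validation."""
    rec = {k.strip().lower(): v for k, v in raw.items()}
    for key, value in rec.items():
        if isinstance(value, str):
            rec[key] = value.strip()
    # Coerce amount-like fields from strings such as "1,200.50".
    if "amount" in rec and isinstance(rec["amount"], str):
        try:
            rec["amount"] = float(rec["amount"].replace(",", ""))
        except ValueError:
            rec["amount"] = None
    if any(not rec.get(field) for field in required):
        return None  # reject records missing required fields
    return rec

# Hypothetical raw input: messy key casing, thousands separators, one bad row.
rows = [{" Customer_ID ": "C01", "Amount": "1,200.50"},
        {"customer_id": "", "amount": "10"}]
cleaned = [r for r in (clean_record(x) for x in rows) if r]
```

In an ADF/Databricks pipeline this logic would typically run per-partition over a DataFrame rather than over a Python list, but the cleansing rules themselves are the same.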

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for a skilled and experienced Microsoft Fabric Engineer to join the data engineering team. Your main responsibilities will include designing, developing, and maintaining data solutions using Microsoft Fabric, working across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. A deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem is required.

Key responsibilities include designing and implementing scalable, secure data solutions; building and maintaining data pipelines using Dataflows Gen2 and Data Factory; working with Lakehouse architecture and managing datasets in OneLake; and developing and optimizing notebooks (PySpark or T-SQL) for data transformation and processing. You will collaborate with data analysts and business users to create interactive dashboards and reports using Power BI (within Fabric), leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics, monitor and optimize the performance of data pipelines and queries, and ensure that data quality, security, and governance practices are adhered to. You will also stay current with Microsoft Fabric updates and best practices in order to recommend enhancements.

To excel in this role, you should have at least 3 years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack. You must be proficient with Data Factory (Fabric), Synapse Data Warehouse/SQL analytics endpoints, Power BI integration, and DAX, and have a solid understanding of data modeling, ETL/ELT processes, and real-time data streaming. Experience with KQL (Kusto Query Language) is a plus, and familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous. Qualifications required for this role include proficiency in Microsoft Fabric, OneLake, Data Factory, Data Lake, and Data Mesh.

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

maharashtra

On-site

As an Azure Data Engineer specializing in Microsoft Fabric (Data Lake), based in Mumbai, you should have a minimum of 4 years of experience in the field, with at least 2 years dedicated to Microsoft Fabric technologies. Expertise in Azure services is key, specifically Data Lake, Synapse Analytics, Data Factory, Azure Storage, and Azure SQL. Your responsibilities will involve data modeling, ETL/ELT processes, and data integration patterns, and experience with Power BI integration is essential for effective data visualization. Proficiency in SQL and in Python or PySpark for data transformations is required, along with a solid understanding of data governance, security, and compliance in cloud environments. Previous experience in Agile/Scrum environments is a plus, and strong problem-solving skills with the ability to work both independently and collaboratively within a team are crucial for success in this position.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The ideal candidate will have a Bachelor's degree in Information Technology, Computer Science Engineering, or a related field, and at least 5 years of experience in infrastructure management and systems administration. Your primary responsibility will be to provide technical support for Azure cloud services.

We are looking for a Cloud Engineer with a strong background in both applications and infrastructure. You should have hands-on experience with Azure services such as Data Lake, Data Factory, and Databricks; proficiency in setting up data warehouses, ETL processes, reporting, and analytics is advantageous. Experience with tools like Control-M, SAS, SunGL, and Power BI is preferred. Proficiency in both Linux and Windows operating systems is mandatory, along with strong knowledge of API and firewall connectivity; expertise in Control-M is a must-have for this role. Preferred certifications include SSL, ITIL, HP, IAT, MS, VCP, Azure Engineer Certification II, CE, and CCNP. Familiarity with DevOps models/frameworks and tools such as Jenkins, Terraform, Bitbucket/Artifactory, Jira, and Confluence will be beneficial.

As an Azure Infrastructure Engineer, you will play a crucial role in managing and optimizing cloud services. This is a full-time, permanent position in the IT/Computers - Software industry. Key skills include proficiency in Azure Cloud, Data Lake, Data Factory, Databricks, and Control-M, expertise in DevOps models/frameworks, and experience in Linux and Windows environments. If you meet these requirements and are ready to take on this challenging role, apply with Job Code GO/JC/20985/2025. Recruiter Name: Kathiravan

Posted 1 week ago

Apply

2.0 - 8.0 years

0 Lacs

noida, uttar pradesh

On-site

As a part of the global organization Optum, you will have the opportunity to contribute towards delivering care with the help of technology to improve the health outcomes of millions of individuals. Your work within the team will involve connecting people with the necessary care, pharmacy benefits, data, and resources essential for their well-being. At Optum, we foster a culture driven by diversity, inclusion, and excellence. You will collaborate with talented peers, avail comprehensive benefits, and have access to numerous career development opportunities. Join us in making a positive impact on the communities we serve while striving to advance health equity on a global scale. Start your journey with us by Caring, Connecting, and Growing together. Your primary responsibilities will include architecting and designing various aspects such as extensibility, scalability, security, and design patterns while adhering to predefined checklists and ensuring the implementation of best practices. You will be expected to possess or acquire strong troubleshooting skills and demonstrate interest in resolving issues across different technologies and environments. Additionally, you will be involved in architecting with a modern technology stack and designing Public Cloud Applications using Azure. Your tasks will also involve designing and coding solutions utilizing Azure services like functions, databricks, servicebus, and eventhub. Application build and deployment using Jenkins/CICD Tool will be a part of your routine. 
To excel in this role, you should meet the following required qualifications: - At least 8 years of software development experience - Minimum 6 years of programming experience in Java, Python, or Scala - A minimum of 4 years of hands-on programming experience in Spark using Scala, Python, or Java - At least 2 years of hands-on working experience with Azure services including Azure Databricks, Azure Data Factory, Azure Functions, and Azure App Service - Experience in reading and loading files into a cloud environment from various file formats such as .csv, JSON, .txt, and Avro - Proficiency in Spark multi-threading, open-source technologies, design patterns, SQL queries, REST APIs, Java Spring Boot, Jenkins, GitHub, performance tuning, optimization, and debugging. If you are an external candidate, please proceed with the application process. Internal employees are also encouraged to apply for this exciting opportunity to be a part of our team at Optum.
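The posting above asks for experience loading files of several formats (.csv, JSON, .txt, Avro) into a cloud environment. As a rough local illustration of the format-dispatch pattern such ingestion code uses, here is a minimal Python sketch; the file names and the `load_records` helper are hypothetical, and a real Databricks pipeline would instead use Spark readers (e.g. `spark.read.csv` / `spark.read.json`) against cloud storage paths:

```python
import csv
import io
import json

def load_records(name: str, raw: str):
    """Dispatch on file extension and return a list of dict records.

    Local sketch only: production ingestion would use Spark readers
    against ADLS/Blob paths rather than in-memory strings.
    """
    if name.endswith(".csv"):
        return list(csv.DictReader(io.StringIO(raw)))
    if name.endswith(".json"):
        data = json.loads(raw)
        return data if isinstance(data, list) else [data]
    if name.endswith(".txt"):
        return [{"line": ln} for ln in raw.splitlines() if ln]
    raise ValueError(f"unsupported format: {name}")

csv_rows = load_records("events.csv", "id,score\n1,0.9\n2,0.7")
json_rows = load_records("events.json", '[{"id": 3, "score": 0.5}]')
print(csv_rows[0]["id"], len(json_rows))  # -> 1 1
```

Avro is omitted here because it needs a third-party reader (e.g. `fastavro`); the dispatch shape stays the same.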

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

Genpact is a global professional services and solutions firm with a workforce of over 125,000 individuals spread across more than 30 countries. The company is characterized by an innate curiosity, entrepreneurial agility, and a commitment to creating lasting value for clients. Fueled by the relentless pursuit of a world that works better for people, Genpact serves and transforms leading enterprises, including Fortune Global 500 companies, leveraging its deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are currently seeking applications for the position of Lead Consultant, Python Developer. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables. Additionally, you should demonstrate a willingness to learn new technologies. Key Responsibilities: - Develop program specifications and coded modules in compliance with specifications and client standards. - Capture requirements and design from Database, BI Setup, and Implementation. - Effectively communicate within a multi-disciplinary project team and with external agencies to ensure timely completion of assigned tasks. - Collaborate with Products and Strategy teams to comprehend and develop Python codes for product policies. - Hands-on approach to understanding technical issues, building client relationships, establishing process rigor, and acting as a transformation evangelist. Qualifications: Minimum Qualifications: - BE/B Tech/MCA - Excellent written and verbal communication skills Preferred Qualifications/ Skills: - Experience in Python and data analytics space. - Proficiency in Python programming with in-depth knowledge of pandas, numpy, flask, etc. - Experience in building Tableau/Power BI dashboards and creating customized dashboards for visual analytics. - Expertise in writing complex SQL queries, stored procedures, and scripts. - Strong analytical and problem-solving skills. 
- Familiarity with software design patterns, frameworks, unit testing, automated testing, performance/memory analysis. - Understanding of Agile development techniques and knowledge of systems engineering, build, and release management principles. - Ability to provide thought leadership to clients across business and technical project dimensions. - Well-versed in designing applications and infrastructure using various application servers, databases, security managers, etc. - Knowledge and expertise in recent trends like Cloud, AI/ML, Microservices, etc. - Experience in configuring CI/CD pipelines using GitHub Actions or Jenkins. - Integration of development and deployment tools in Azure. - Proficiency in Azure Data Factory, Azure Service Bus, and cloud messaging systems. Primary Job Details: - Job Title: Lead Consultant - Location: India-Mumbai - Schedule: Full-time - Education Level: Bachelor's / Graduation / Equivalent - Job Posting: Feb 27, 2025, 1:06:38 AM - Unposting Date: Mar 29, 2025, 1:29:00 PM Key Skills: Consulting Job Category: Full Time.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

25 - 27 Lacs

bengaluru

Work from Office

-> Utilize Azure data services and frameworks such as Data Factory, Stream Analytics, Data Lake Storage, and Databricks. -> Integrate and transform data from structured and unstructured data systems into schemas suitable for building analytics solutions. Required Candidate profile -> 3-5 years of experience as a Data Engineer -> Proficiency in Azure Databricks, SQL, Python, and PySpark -> BE/B.Tech degree mandatory -> Knowledge of parallel processing and data architecture

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

chennai, tamil nadu

On-site

You have a great opportunity to join Ashra Technologies as a Microsoft Fabric professional with a minimum of 7 years of experience. The role will require you to have hands-on experience with Microsoft Fabric components such as Lakehouse, Data Factory, and Synapse. You should also possess strong expertise in PySpark and Python for large-scale data processing and transformation. In this role, you will be expected to have deep knowledge of various Azure data services including ADLS Gen2, Azure Databricks, Synapse, ADF, and Azure SQL, among others. Your responsibilities will include designing, implementing, and optimizing end-to-end data pipelines on the Azure platform. It would be advantageous if you have an understanding of Azure infrastructure setup, including networking, security, and access management. While not mandatory, knowledge of the healthcare domain would be considered a plus. This is a full-time position based out of Pune, Mumbai, Chennai, or Bangalore. If you are interested in this exciting opportunity, please share your resume with us at akshitha@ashratech.com or contact us at 8688322632. We look forward to potentially having you join our team at Ashra Technologies.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer, you will be responsible for designing and implementing data models, warehouses, and databases using Microsoft Fabric, Azure Synapse Analytics, and Azure Data Lake. Your role will involve developing ETL pipelines utilizing tools such as SQL Server Integration Services (SSIS), Azure Synapse Pipelines, and Azure Data Factory. You will leverage Fabric Lakehouse for Power BI reporting, real-time analytics, and automation, while ensuring optimal data integration, governance, security, and performance. Collaboration with cross-functional teams to develop scalable data solutions will be a key aspect of your job. You will implement Medallion Architecture for efficient data processing and work in an Agile environment, applying DevOps principles for automation and CI/CD processes. Your skills should include proficiency in Microsoft Fabric and Azure Lakehouse, OneLake, Data Pipelines, Power BI, Synapse, Data Factory, and Data Lake. Experience in data warehousing and ETL, both on-premises and in the cloud, using SQL, Python, SSIS, and Synapse Pipelines is essential. Strong knowledge of data modeling and architecture, including Medallion Architecture, integration, governance, security, and performance tuning is required. In addition, you should have expertise in analytics and reporting tools such as Power BI, Excel (formulas, macros, pivots), and ERP systems like SAP and Oracle. Problem-solving skills, collaboration abilities in Agile and DevOps environments, and a degree in Computer Science, Engineering, or a related field are necessary. Familiarity with Azure DevOps, Agile, and Scrum methodologies, as well as Microsoft certifications, particularly Agile certification, would be advantageous for this role.
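The Medallion Architecture mentioned in this posting layers data as bronze (raw, as ingested), silver (deduplicated and validated), and gold (aggregated for reporting). Here is a minimal local sketch of that flow with hypothetical record shapes; in a real Fabric or Databricks implementation each layer would typically be a Delta table rather than in-memory lists:

```python
from collections import defaultdict

# Bronze: raw records as ingested, including duplicates and bad rows.
bronze = [
    {"order_id": 1, "region": "west", "amount": "120.50"},
    {"order_id": 1, "region": "west", "amount": "120.50"},  # duplicate
    {"order_id": 2, "region": "east", "amount": "bad"},     # unparseable
    {"order_id": 3, "region": "east", "amount": "75.00"},
]

# Silver: deduplicate on the business key and enforce types.
silver, seen = [], set()
for row in bronze:
    if row["order_id"] in seen:
        continue
    try:
        amount = float(row["amount"])
    except ValueError:
        continue  # a real pipeline would quarantine bad rows
    seen.add(row["order_id"])
    silver.append({"order_id": row["order_id"],
                   "region": row["region"],
                   "amount": amount})

# Gold: aggregate for reporting, e.g. feeding a Power BI model.
gold = defaultdict(float)
for row in silver:
    gold[row["region"]] += row["amount"]

print(dict(gold))  # -> {'west': 120.5, 'east': 75.0}
```

The point of the layering is that each stage is reproducible from the one below it, so a fix to the cleaning rules only requires rebuilding silver and gold, never re-ingesting bronze.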

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You will be responsible for supporting the gathering, analysis, and design of requirements, extracting necessary report data from various sources, and managing reports, dashboards, and visualizations to effectively communicate business data and insights. Conducting comprehensive data analysis to identify trends, patterns, and insights crucial for strategic business decision-making will be a key aspect of your role. Collaborating with stakeholders in business requirements gathering sessions to understand their needs and specifications for reporting and analysis is essential. Your duties will include administering and maintaining BI tools and platforms, managing user access, implementing security protocols, and optimizing performance. Creating impactful visualizations that transform complex data into clear, actionable insights for business users, and interpreting data analysis findings to present them to business stakeholders in a clear and actionable manner will be part of your responsibilities. Additionally, you will provide comprehensive training and ongoing support to empower business users in effectively utilizing BI tools and reports for self-service analytics. Monitoring the performance of enterprise and client reports, optimizing queries and processes to enhance efficiency and operational performance will be crucial. Maintaining documentation such as business requirements documents, data dictionaries, data catalogs, and data mapping documents is also expected. You may be required to perform additional tasks and duties as instructed by the manager or supervisor based on team and business needs, including administrative duties, project support, backup development roles, and other ad-hoc tasks. Required Competencies: - Bachelor's degree in Computer Science, Information Technology, Mathematics, or a related field. - Minimum of 5 years of hands-on experience developing BI solutions using Power BI. 
- At least 3 years of experience in data warehousing and data modeling, including normalization and denormalization. - Extensive experience performing ETL to extract data from various sources using SSIS, Data Factory, or Microsoft Fabric. - Proficiency in T-SQL scripting and strong technical knowledge of databases. - Expertise in data visualization techniques to create visually appealing and insightful dashboards and reports using Power BI. - Strong skills in designing and developing data models that structure data for analysis and reporting. - Solid knowledge of data warehousing concepts and architectures. - Ability to analyze user needs and data, translating them into technical specifications for BI solutions. - Knowledge of other data analysis languages, such as DAX, KQL, or their equivalents.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

We are seeking a Cloud Engineer to undertake the migration of millions of documents from Azure Blob Storage to TR Azure Blob Storage. This role involves meticulous planning, thorough testing, comprehensive documentation, and successful execution of the migration process. As a Cloud Engineer, your primary responsibilities will include migrating documents from Azure Blob Storage to TR Azure Blob Storage utilizing Azure Storage (Blob/Table), Data Factory, and ETL jobs. It is imperative to ensure data integrity throughout the migration, optimize performance, and promptly resolve any arising issues. Additionally, documenting the migration process in detail and collaborating effectively with cross-functional teams are crucial for the seamless execution of the project. The ideal candidate for this role should possess expertise in Azure Storage, specifically Blob and Table storage. Prior experience with data migration processes and proficiency in ETL/Data Factory jobs are essential requirements. Strong troubleshooting skills coupled with adept documentation abilities will be instrumental in successfully fulfilling the responsibilities of this position.
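The copy-then-verify pattern this migration role describes can be sketched locally. The following is a simulation only, with containers modeled as plain dicts: an actual Azure migration would copy blobs with `azure-storage-blob` (e.g. `start_copy_from_url`), AzCopy, or a Data Factory pipeline, and would compare Content-MD5 properties rather than rehashing bytes in memory:

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """Checksum used to verify each copied blob."""
    return hashlib.md5(data).hexdigest()

def migrate(source: dict, target: dict) -> list:
    """Copy every blob from source to target, then verify checksums.

    Returns the names of blobs whose checksum did not match after
    the copy, so failures can be retried or investigated.
    """
    failures = []
    for name, data in source.items():
        target[name] = data  # stand-in for the actual blob copy
        if md5_hex(target[name]) != md5_hex(data):
            failures.append(name)
    return failures

source_container = {"docs/a.pdf": b"alpha", "docs/b.pdf": b"bravo"}
target_container = {}
bad = migrate(source_container, target_container)
print(len(target_container), bad)  # -> 2 []
```

At the scale of millions of documents, the same loop would be batched and parallelized, and the failure list persisted so reruns only touch unverified blobs.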

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

You will be responsible for leading end-to-end architecture design for data and analytics solutions on Azure, defining and implementing Azure-based modern data platforms, and overseeing solution delivery to ensure timelines and milestones are met. Collaborating with pre-sales teams, you will support RFP responses, solutioning, and client presentations. Additionally, you will create and maintain comprehensive architectural documentation, review the work of implementation teams, and ensure alignment with security, compliance, and governance standards. Your role will also involve overseeing data integration pipelines, supporting application migration and modernization efforts on Azure, and providing technical leadership and mentoring to development and data engineering teams. Troubleshooting architectural issues and optimizing deployed solutions will be part of your regular tasks. You should have at least 4 years of hands-on experience with Microsoft Azure services such as Data Factory, Databricks, Synapse Analytics, and Azure Data Lake Storage. Proven expertise in designing and implementing enterprise-grade data and analytics solutions, experience with application migration and modernization on cloud platforms, and a strong understanding of cloud security, identity management, and compliance practices are essential. Proficiency in modern application architectures, a Bachelor's degree in Engineering with a solid foundation in software engineering and architectural design, and strong documentation, stakeholder communication, and project leadership skills are required. Preferred qualifications include Microsoft Azure certifications, experience with Azure Machine Learning, and familiarity with microservices, containers, and event-driven systems. Join Polestar Solutions, a data analytics and enterprise planning powerhouse, to help customers derive sophisticated insights from their data in a value-oriented manner.
The company offers a comprehensive range of services and opportunities for growth and learning in a dynamic environment.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

Greetings from Ashra Technologies! We are currently looking to hire a skilled professional for the role of Azure Microsoft Fabric with at least 7 years of experience. This position is based in Pune, Mumbai, Chennai, or Bangalore. Key Responsibilities & Requirements: - Hands-on experience with Microsoft Fabric, specifically Lakehouse, Data Factory, and Synapse. - Strong expertise in PySpark and Python for large-scale data processing and transformation. - Deep knowledge of Azure data services such as ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc. - Proven experience in designing, implementing, and optimizing end-to-end data pipelines on Azure. - Familiarity with Azure infrastructure setup, including networking, security, and access management, would be beneficial. - While not mandatory, domain knowledge in healthcare would be considered a plus. If you meet the above requirements and are interested in this opportunity, please share your resume with us at akshitha@ashratech.com or contact us at 8688322632. Thank you!

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

mumbai, delhi / ncr, bengaluru

Work from Office

Job Title: Microsoft Purview Specialist (Junior Level) Type: Contract (36 Months Project) Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote Experience: 1-3 Years Preferred Availability: Immediate Joiners Preferred We're looking for a Junior Microsoft Purview Specialist to support our data cataloging and governance initiatives in a fast-paced remote setup. Key Responsibilities: Assist in the configuration and management of Microsoft Purview Support data cataloging, classification, and lineage tracking Work with data owners to ensure proper tagging and metadata management Help implement data governance policies Assist in integrating Purview with Azure and on-premises sources Document governance processes and resolve Purview-related issues Collaborate with project teams for timely delivery Primary Skills Required: Microsoft Purview Data Cataloging & Classification Metadata Management Understanding of Data Governance Azure Data Services (basic knowledge is a plus) Strong communication and collaboration skills Preferred Qualifications: Certification/training in Microsoft Purview or related tools Exposure to the Azure ecosystem: Data Factory, Synapse, Data Lake Ability to work independently in a remote environment If interested, please share your profile with the following details: Full Name: Total Experience: Relevant Microsoft Purview Experience: Current CTC: Expected CTC: Notice Period / Availability: Current Location: Preferred Location (Remote):

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 7 Lacs

mumbai, delhi / ncr, bengaluru

Work from Office

Type: Contract (36 Months Project) Availability: Immediate Joiners Preferred We're looking for a Junior Microsoft Purview Specialist to support our data cataloging and governance initiatives in a fast-paced remote setup. Key Responsibilities: Assist in the configuration and management of Microsoft Purview Support data cataloging, classification, and lineage tracking Work with data owners to ensure proper tagging and metadata management Help implement data governance policies Assist in integrating Purview with Azure and on-premises sources Document governance processes and resolve Purview-related issues Collaborate with project teams for timely delivery Primary Skills Required: Microsoft Purview Data Cataloging & Classification Metadata Management Understanding of Data Governance Azure Data Services (basic knowledge is a plus) Strong communication and collaboration skills Preferred Qualifications: Certification/training in Microsoft Purview or related tools Exposure to the Azure ecosystem: Data Factory, Synapse, Data Lake Ability to work independently in a remote environment If interested, please share your profile with the following details: Full Name: Total Experience: Relevant Microsoft Purview Experience: Current CTC: Expected CTC: Notice Period / Availability: Current Location: Preferred Location (Remote): Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 2 weeks ago

Apply