
378 Azure Synapse Jobs - Page 8


5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Technical Skills

Microsoft Purview Expertise (Required)
- Unified Data Catalog: experience setting up and configuring the catalog, managing collections, classifications, glossary terms, and metadata curation.
- Data Quality (DQ): implementing DQ rules, defining metrics (accuracy, completeness, consistency), and using quality scorecards and reports.
- Data Map and Scans: ability to configure sources, schedule scans, manage ingestion, and troubleshoot scan issues.
- Data Insights and Lineage: experience visualizing data lineage and interpreting catalog insights.

Azure Platform Knowledge (Desirable)
- Azure Data Factory
- Azure Synapse Analytics
- Microsoft Fabric, including OneLake

Experience
- 3 to 5+ years in data governance or data platform projects, ideally with enterprise clients.
- 2+ years implementing Microsoft Purview or similar tools (Collibra, Informatica, Alation).
- Hands-on experience configuring and implementing Microsoft Purview Unified Catalog and Data Quality.
- Experience onboarding multiple data sources (on-prem and cloud).
- A background in data management, data architecture, or business intelligence is highly beneficial.

Certifications (Desirable)
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: Azure Solutions Architect Expert
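The DQ metrics named above (accuracy, completeness, consistency) are easy to sketch outside any specific tool. The following is a minimal, tool-agnostic Python illustration, not the Purview API; the sample records and the regex format rule are invented for the example:

```python
import re

def completeness(records, column):
    """Fraction of records where `column` is present and non-empty."""
    filled = sum(1 for r in records if r.get(column) not in (None, ""))
    return filled / len(records)

def consistency(records, column, pattern):
    """Fraction of non-empty values matching a regex format rule."""
    values = [r[column] for r in records if r.get(column)]
    if not values:
        return 0.0
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

# Hypothetical sample data for the illustration.
customers = [
    {"id": "C001", "email": "a@example.com"},
    {"id": "C002", "email": ""},
    {"id": "C003", "email": "not-an-email"},
]

print(completeness(customers, "email"))  # 2 of 3 emails are filled in
print(consistency(customers, "email", r"[^@]+@[^@]+\.[^@]+"))
```

Catalog tools roll such per-column scores up into the scorecards the posting mentions.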

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Analytics Services
Good-to-have skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: BE

Summary: As an Application Lead for Packaged Application Development, you will be responsible for designing, building, and configuring applications using Microsoft Azure Analytics Services. Your typical day will involve leading the effort to deliver high-quality applications, acting as the primary point of contact for the project team, and ensuring timely delivery of project milestones.

Roles & Responsibilities:
- Lead the effort to design, build, and configure applications using Microsoft Azure Analytics Services.
- Act as the primary point of contact for the project team, ensuring timely delivery of project milestones.
- Collaborate with cross-functional teams to ensure the successful delivery of high-quality applications.
- Provide technical guidance and mentorship to team members, ensuring adherence to best practices and standards.

Professional & Technical Skills:
- Must-have skills: strong experience with Microsoft Azure Analytics Services.
- Good-to-have skills: experience with other Azure services such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Experience in designing, building, and configuring applications using Microsoft Azure Analytics Services.
- Must have Databricks and PySpark skills.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and tools such as SSIS or Azure Data Factory.
- Experience with SQL and NoSQL databases.
- Experience with Agile development methodologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality applications.
- This position is based at our Bengaluru office.

Qualification: BE
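The Databricks/PySpark ETL work described above typically follows an extract-transform-load shape. As a sketch under stated assumptions, here is the transform step in plain Python (column names and rules are invented); a PySpark job would express the same logic with DataFrame `filter`/`withColumn` calls:

```python
from datetime import date

def transform(rows):
    """Toy ETL transform: drop rows missing the business key, cast the
    amount to float, and derive a load_date column. This mirrors the
    shape of a typical Spark transform, in pure Python for illustration."""
    out = []
    for r in rows:
        if not r.get("order_id"):
            continue  # reject records with no business key
        out.append({
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "load_date": date(2024, 1, 1).isoformat(),  # fixed for the demo
        })
    return out

raw = [{"order_id": "A1", "amount": "19.99"},
       {"order_id": "", "amount": "5.00"}]   # second row fails validation
print(transform(raw))
```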

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Gen AI IT is a global leader in industrial packaging products and services. We are committed to providing innovative solutions that enhance our customers' productivity and sustainability. Our team is dedicated to excellence, and we strive to create a collaborative and inclusive work environment.

Position Overview: We are seeking a skilled and motivated AI Engineer to join our dynamic team. The ideal candidate will have 2 to 4 years of experience in developing and deploying GenAI and AI/ML solutions to production. This requires hands-on experience with no-code, low-code, and SDK approaches to building AI systems. The candidate should be proficient in working with data platforms such as Microsoft Azure and Snowflake, and GenAI platforms such as Azure AI Foundry, Azure OpenAI, Copilot Studio, and ChatGPT. The ability to manage small projects with minimal supervision and a working knowledge of Agile methodology are essential. The candidate must be comfortable with ambiguity and a fast-paced PoC (Proof of Concept) delivery schedule.

Key Responsibilities:
- Focus on designing and developing proofs of concept (PoCs) and demonstrating solutions on a tight schedule.
- Utilize GenAI no-code, low-code, and SDK tooling to build robust GenAI agents that automate business processes.
- Work with data platforms such as Microsoft Azure and Snowflake, and integration services like Azure Data Factory, to build agentic workflows.
- Embed/integrate GenAI agents (Copilot agents) into business platforms such as Workday, Teams, etc.
- Manage small to medium-sized projects with minimal supervision.
- Apply Agile methodology to ensure efficient project delivery.
- Make informed decisions under uncertainty and adapt to changing project requirements.

Qualifications:
- Bachelor's or master's degree in AI, computer science, engineering, mathematics, or a related field.
- 2 to 4 years of experience in developing and deploying AI/ML solutions to production.
- Hands-on experience with no-code, low-code, and SDK approaches to AI system development.
- Proficiency in data platforms such as Microsoft Azure and Snowflake, and integration services like Azure Data Factory.
- Experience with Azure Cloud, Azure AI Foundry, Copilot Studio, and frameworks such as LangChain, LangGraph, and MCP for building agentic systems.
- Strong understanding of Agile methodology and project management.
- Ability to manage projects independently and make decisions under ambiguity.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Skills:
- Automation of complex processes using GenAI agents, especially within the Azure GenAI ecosystem.
- Advanced Python programming.
- Hands-on experience with data storage systems, especially Snowflake, Azure Data Factory, Azure Fabric, and Azure Synapse.
- Building Copilot agents and embedding them into systems such as Workday, Teams, etc.

Mandatory Skills: Gen AI; Python; data storage systems, especially Snowflake, Azure Data Factory, Azure Fabric, and Azure Synapse
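An "agentic workflow" of the kind this posting describes boils down to an observe-decide-act loop around a model call. The sketch below stubs the model with a hard-coded function so it runs standalone; frameworks like LangChain, LangGraph, and Copilot Studio generalize this loop. All names and the routing rule here are hypothetical, not any framework's API:

```python
def fake_llm(message, tools):
    """Stub standing in for a real LLM call (e.g. Azure OpenAI).
    Returns a tool invocation when the request mentions a known capability."""
    if "order status" in message and "lookup_order" in tools:
        return {"tool": "lookup_order", "args": {"order_id": "A1"}}
    return {"answer": "I can only check order status."}

def lookup_order(order_id):
    """Toy business-system lookup (a real agent might call Workday, etc.)."""
    return {"A1": "shipped"}.get(order_id, "unknown")

TOOLS = {"lookup_order": lookup_order}

def run_agent(user_message):
    """One turn of the observe -> decide -> act loop."""
    decision = fake_llm(user_message, TOOLS)
    if "tool" in decision:
        result = TOOLS[decision["tool"]](**decision["args"])
        return f"Tool {decision['tool']} says: {result}"
    return decision["answer"]

print(run_agent("What is the order status of A1?"))
```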

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Telangana

Work from Office

Immediate openings for Azure Data Engineer (Hyderabad, Contract)

Experience: 10+ years overall in support and development
Location: Hyderabad
Notice Period: Immediate
Employment Type: Contract

Primary Skills: Microsoft Azure Cloud Platform, Azure Admin (good to have), Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Azure SQL, Azure DevOps, Python or PySpark
Secondary Skills: Data Lake, Azure Blob Storage, Azure Data Warehouse as a Service (DWaaS), Azure Log Analytics, Oracle, Postgres, Microsoft Storage Explorer, ServiceNow

Posted 1 month ago

Apply

14.0 - 22.0 years

35 - 50 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & Responsibilities

Job Summary: We are seeking a highly experienced Azure Databricks Architect to design and implement large-scale data solutions on Azure. The ideal candidate will have a strong background in data architecture, data engineering, and data analytics, with a focus on Databricks.

Key Responsibilities:
- Design and implement end-to-end data solutions on Azure, leveraging Databricks, Azure Data Factory, Azure Storage, and other Azure services
- Lead data architecture initiatives, ensuring alignment with business objectives and best practices
- Collaborate with stakeholders to define data strategies, architectures, and roadmaps
- Develop and maintain data pipelines, ensuring seamless data integration and processing
- Optimize performance, scalability, and cost efficiency for Databricks clusters and data pipelines
- Ensure data security, governance, and compliance across Azure data services
- Provide technical leadership and mentorship to junior team members
- Stay up to date with industry trends and emerging technologies, applying that knowledge to improve data solutions

Requirements:
- 15+ years of experience in data architecture, data engineering, or a related field
- 6+ years of experience with Databricks, including Spark, Delta Lake, and other Databricks features
- Databricks certification (e.g., Databricks Certified Data Engineer or Databricks Certified Architect)
- Strong understanding of data architecture principles, data governance, and data security
- Experience with Azure services, including Azure Data Factory, Azure Storage, and Azure Synapse Analytics
- Programming skills in languages such as Python, Scala, or R
- Excellent communication and collaboration skills

Nice to Have:
- Experience with data migration from on-premises data warehouses (e.g., Oracle, Teradata) to Azure
- Knowledge of data analytics and machine learning use cases
- Familiarity with DevOps practices and tools (e.g., Azure DevOps, Git)

What We Offer:
- Competitive salary and benefits package
- Opportunity to work on large-scale data projects and contribute to the development of cutting-edge data solutions
- Collaborative and dynamic work environment
- Professional development and growth opportunities

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 15 Lacs

Gurugram

Work from Office

Experience: 5+ yrs
Location: Remote
Budget: 15 LPA
Contact: 9916086641, anamika@makevisionsoutsourcing.in

Posted 1 month ago

Apply

10.0 - 17.0 years

12 - 22 Lacs

Gurugram

Work from Office

We know the importance that food plays in people's lives, and the power it has to bring people, families and communities together. Our purpose is to bring enjoyment to people's lives through great tasting food, in a way which reflects our values. McCain has recently committed to implementing regenerative agriculture practices across 100 percent of our potato acreage by 2030. Ask us more about our commitment to sustainability.

OVERVIEW
McCain is embarking on a digital transformation. As part of this transformation, we are making significant investments into our data platforms, common data models, data structures and data policies to increase the quality of our data and the confidence of our business teams to use this data to make better decisions and drive value. We have a new and ambitious global Digital & Data group, which serves as a resource to the business teams in our regions and global functions. We are currently recruiting an experienced Data Architect to build the enterprise data model for McCain.

JOB PURPOSE:
Reporting to the Data Architect Lead, the Global Data Architect will take a lead role in creating the enterprise data model for McCain Foods, bringing together data assets across agriculture, manufacturing, supply chain and commercial. This data model will be the foundation for our analytics program, which seeks to bring together McCain's industry-leading operational data sets with 3rd party data sets to drive world-class analytics. Working with a diverse team of data governance experts, data integration architects, data engineers and our analytics team including data scientists, you will play a key role in creating the conceptual, logical and physical data models that underpin the Global Digital & Data team's activities.

JOB RESPONSIBILITIES:
- Develop an understanding of McCain's key data assets and work with the data governance team to document key data sets in our enterprise data catalog
- Work with business stakeholders to build a conceptual business model by understanding the end-to-end business process, challenges, and future business plans
- Collaborate with application architects to bring the analytics point of view into the design of end-user applications
- Develop the logical data model based on the business model and align it with business teams
- Work with technical teams to build the physical data model and data lineage, and keep all relevant documentation current
- Develop a process to manage all models and the appropriate controls
- With a use-case-driven approach, enhance and expand the enterprise data model based on legacy on-premises analytics products and new cloud data products, including advanced analytics models
- Design key enterprise conformed dimensions and ensure understanding across data engineering teams (including third parties); keep data catalog and wiki tools current
- Act as the primary point of contact for new Digital and IT programs, ensuring alignment to the enterprise data model
- Be a key player in shaping McCain's cloud migration strategy, enabling advanced analytics and world-leading Business Intelligence
- Work in close collaboration with data engineers, ensuring data modeling best practices are followed

MEASURES OF SUCCESS:
- Demonstrated history of driving change in a large, global organization
- A true passion for well-structured and well-governed data; you know, and can explain to others, the real business risk of too many mapping tables
- You live for a well-designed and well-structured conformed dimension table
- Focus on use-case-driven prioritization; you are comfortable pushing business teams for requirements that connect to business value, and you are also able to challenge requirements that will not achieve the business' goals
- Developing data models that are not just elegant, but truly optimized for analytics, for both advanced analytics use cases and dashboarding/BI tools
- A coaching mindset wherever you go, including with the business, data engineers and other architects
- An infectious enthusiasm for learning: about our business, deepening your technical knowledge and meeting our teams
- A "get things done" attitude: roll up your sleeves when necessary, and work with and through others as needed

KEY QUALIFICATIONS & EXPERIENCE:

Data Design and Governance
- At least 5 years of experience with data modeling to support business processes
- Ability to design complex data models connecting internal and external data
- At least 8 years of experience with requirements analysis; experience working with business stakeholders on data design
- Experience working with real-time data
- Ability to draft accurate documentation that supports the project management effort and coding
- Nice to have: ability to profile data for data quality requirements
- Nice to have: experience with data catalog tools

Technical Skills
- At least 5 years of experience designing and working in data warehouse solutions and building data models; S/4HANA knowledge preferred
- At least 2 years of experience with visualization tools, preferably Power BI or similar
- At least 2 years designing and working in cloud data warehouse solutions; preference for Azure Databricks, Azure Synapse or earlier Microsoft solutions
- Experience with Visio, PowerDesigner, or similar data modeling tools
- Experience working in an Azure cloud environment or a similar cloud environment
- Must have: ability to develop SQL queries for assessing, manipulating, and accessing data stored in relational databases; hands-on experience in PySpark and Python
- Nice to have: experience with data profiling tools (Informatica, Collibra or similar data quality tools)
- Nice to have: working experience with MDX
- Nice to have: ability to understand and work with unstructured data
- Nice to have: at least one successful enterprise-wide cloud migration as the data architect or data modeler, mainly focused on building data models
- Nice to have: experience with manufacturing/digital manufacturing
- Nice to have: experience designing enterprise data models for analytics, specifically in a Power BI environment
- Nice to have: experience with machine learning model design (Python preferred)

Behaviors and Attitudes
- Comfortable working with ambiguity and defining a way forward
- Experience challenging current ways of working
- A documented history of successfully driving projects to completion
- Excellent interpersonal and communication skills
- Attention to detail
- Comfortable leading others through change

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 32 Lacs

Bengaluru

Remote

Greetings!

We currently have an urgent opening for an Azure Architect role on one of our projects; this is a remote role.

Job Location: Remote (looking only for immediate joiners)

Must-have skills: Terraform, scripting, Azure migration activities. The candidate's two most recent projects should involve Azure migration work, and he/she needs solid architectural skills.

Job Description:
- Architect for migration intakes, collaborating with the Cloud Platform Engineering team on demand forecasting
- Primary skills: application migration, with a broad range of experience migrating apps from on-prem and AWS to Azure, and the ability to identify potential challenges to migration
- Experience sizing applications in terms of R-type and complexity
- Experience assessing applications to understand and articulate the correct technical approach to migration; Azure app migration experience
- Strong knowledge of Azure cloud products/services
- Experience range: at least 5 years' experience working on complex Azure migrations spanning multiple technologies (Windows, SQL, AKS, Oracle, .NET, Java)
- Must be experienced working with lead architects and app owners to articulate challenges, and be comfortable guiding them

Interested candidates, kindly revert with your updated resume to gsathish@sonata-software.com

Regards,
Sathish
Talent Acquisition - Sonata Software Services
9840669681
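"Sizing applications in terms of R-type" refers to the common migration "R" strategies (rehost, replatform, refactor, retire, and so on). A toy rule-based classifier, with decision criteria invented purely for illustration; real assessments weigh many more factors:

```python
def r_type(app):
    """Toy classifier for migration R-strategies.
    Rules are hypothetical: first matching condition wins."""
    if app.get("end_of_life"):
        return "retire"        # no longer worth migrating
    if app.get("needs_code_changes"):
        return "refactor"      # rework the app for cloud-native services
    if app.get("managed_service_available"):
        return "replatform"    # e.g. move a database to a managed offering
    return "rehost"            # lift-and-shift as-is

apps = [
    {"name": "legacy-report", "end_of_life": True},
    {"name": "orders-api", "needs_code_changes": True},
    {"name": "hr-db", "managed_service_available": True},
    {"name": "file-share"},
]
for a in apps:
    print(a["name"], "->", r_type(a))
```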

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Skillset:
- In-depth knowledge of Azure Synapse Analytics (with dedicated pools)
- Proficient in Azure Data Factory (ADF) for ETL processes
- Strong SQL skills for complex queries and data manipulation
- Knowledge of data warehousing and big data analytics
- Good analytical and problem-solving skills
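The "complex queries" called for above usually mean joins, aggregates, and window functions. A small window-function example run through SQLite (the schema and data are invented for the demo; the SQL itself is portable to Synapse):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('east', 100), ('east', 300), ('west', 50);
""")

# SUM(...) OVER (PARTITION BY ...) attaches each region's total to every
# row without collapsing the rows, unlike a plain GROUP BY.
rows = con.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM sales ORDER BY region, amount
""").fetchall()
print(rows)
```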

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

We are seeking an experienced Azure Data Engineer with 3-6 years of experience for a 6-month remote contract. The candidate will be responsible for developing and supporting IT solutions using technologies like Azure Data Factory, Azure Databricks, Azure Synapse, Python, PySpark, Teradata, and Snowflake. The role involves designing ETL pipelines, developing Databricks notebooks, handling CI/CD pipelines via Azure DevOps, and working on data warehouse modeling and integration. Strong skills in SQL, data lake storage, and deployment/monitoring are required. Prior experience in Power BI and the DP-203 certification is a plus. Location: Remote (Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai)
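Watermark-based incremental loading is the core pattern behind ETL pipelines like those described above: each run merges only rows newer than the last processed timestamp, then advances the watermark. A minimal, framework-free sketch (field names are hypothetical); ADF tumbling-window triggers and Databricks jobs implement the same idea at scale:

```python
def incremental_load(source_rows, target, watermark):
    """Merge rows newer than `watermark` into `target` (upsert by id),
    then return the new watermark for the next run."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for r in new_rows:
        target[r["id"]] = r  # upsert by business key
    return max((r["updated_at"] for r in new_rows), default=watermark)

target = {}
src = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
wm = incremental_load(src, target, watermark=15)
print(wm, sorted(target))  # only id 2 passed the watermark
```

Re-running with the same source and the returned watermark is a no-op, which is what makes the pattern safe to retry.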

Posted 1 month ago

Apply

7.0 - 12.0 years

24 - 30 Lacs

Bengaluru

Work from Office

- Expertise in Azure Data Factory (ADF) and Azure Synapse
- Strong proficiency in SQL and data modeling
- Experience with data lakes; Power BI preferred
- MS SQL Server experience
- Familiarity with CI/CD deployment for data pipelines

Posted 1 month ago

Apply

4.0 - 7.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Job Summary
We are seeking a skilled and detail-oriented Azure Data Engineer to join our data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and solutions on the Microsoft Azure cloud platform. You will collaborate with data analysts, the reporting team, and business stakeholders to ensure efficient data availability, quality, and governance.

Must-have skills: Strong hands-on experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL.
Good-to-have skills: Working knowledge of Databricks, Azure Synapse Analytics, Azure Functions, Logic Apps workflows, Log Analytics, and Azure DevOps.

Roles and Responsibilities
- Design and implement scalable data pipelines using Azure Data Factory, Azure SQL, Databricks, and other Azure services
- Develop and maintain data lakes and data warehouses on Azure
- Integrate data from various on-premises and cloud-based sources
- Create and manage ETL/ELT processes, ensuring data accuracy and performance
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security, compliance, and governance
- Collaborate with business stakeholders to define data requirements and deliver actionable insights
- Monitor and maintain Azure data services for performance and cost-efficiency
- Design, develop, and maintain SQL Server databases and ETL processes
- Write complex SQL queries, stored procedures, functions, and triggers to support application development and data analysis
- Optimize database performance through indexing, partitioning, and other performance tuning techniques
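The indexing work named in the last responsibility can be demonstrated end to end with SQLite: the query plan switches from a full table scan to an index search once the filtered column is indexed. A small sketch (schema invented for the demo); the same reasoning applies to SQL Server execution plans:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, f"c{i % 100}") for i in range(1000)])

def plan(sql):
    """Flatten EXPLAIN QUERY PLAN output into one string."""
    return " ".join(r[-1] for r in con.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT * FROM orders WHERE customer = 'c7'"
print(plan(q))  # full table scan: no index covers `customer` yet
con.execute("CREATE INDEX idx_orders_customer ON orders(customer)")
print(plan(q))  # the optimizer now searches the index instead
```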

Posted 1 month ago

Apply

6.0 - 8.0 years

6 - 15 Lacs

Hyderabad, Pune, Chennai

Work from Office

Experience: 6-8 yrs
Notice period: 15 days (max)

Posted 1 month ago

Apply

8.0 - 13.0 years

0 - 1 Lacs

Hyderabad, Bengaluru

Work from Office

Role & Responsibilities

Job Title: Solutions Architect - AI & Microsoft Ecosystem
Experience: 8+ years

Job Description: We are seeking a Solutions Architect with deep experience in designing AI and enterprise solutions within the Microsoft ecosystem. This role involves driving end-to-end solutioning across AI, data, and enterprise architecture using Azure and Microsoft technologies.

Responsibilities:
- Design AI-powered enterprise solutions leveraging Azure Cognitive Services, Azure Machine Learning, and Azure OpenAI
- Define architecture blueprints that align with Microsoft cloud best practices
- Guide development teams in implementing scalable and secure Azure-based solutions
- Provide thought leadership on AI strategy, data governance, and cloud adoption
- Engage in technical discussions with Microsoft stakeholders and clients

Required Skills:
- Strong architectural experience in Azure (IaaS, PaaS, AI/ML, Data Services)
- In-depth knowledge of Microsoft tools: Azure Synapse, Azure Data Factory, Power BI, and Logic Apps
- Experience with Azure API Management, Azure DevOps, and MLOps
- Understanding of security, compliance, and data privacy frameworks within Azure
- Proven ability to lead technical delivery teams and solution roadmaps

Preferred Candidate Profile:
- Microsoft Certified: Azure Solutions Architect Expert
- Familiarity with Microsoft Industry Clouds (e.g., Microsoft Cloud for Healthcare, Financial Services)
- Exposure to enterprise platforms like Microsoft Dynamics 365 or Power Platform

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 15 Lacs

Pune

Remote

Experience: 5+ yrs
Location: Remote
Budget: 15 LPA
Contact: 9916086641, anamika@makevisionsoutsourcing.in

Posted 1 month ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Pune

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Azure Synapse Analytics
Experience: 5-8 years

Posted 1 month ago

Apply

5.0 - 8.0 years

9 - 12 Lacs

Pune

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Azure Synapse Analytics
Experience: 5-8 years

Posted 1 month ago

Apply

8.0 - 12.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Purpose
As a Staff Infrastructure Engineer at LogixHealth, you will work with a globally distributed team of engineers to design and build cutting-edge solutions that directly improve the healthcare industry. You'll contribute to our fast-paced, collaborative environment and bring your expertise to continue delivering innovative technology solutions, while mentoring others.

Duties and Responsibilities
1. Lead and contribute to the creation of a cloud platform for delivering complex and challenging projects across software and data
2. Design and build cloud-native infrastructure using industry-leading practices and tools
3. Establish CI/CD processes, test frameworks, infrastructure-as-code tools, and monitoring/alerting (Git, Terraform, Azure DevOps / GitHub Actions / Jenkins, Azure Monitor / Datadog)
4. Ensure compute, network, storage, and security best principles for a highly available, reliable, secure and cost-efficient solution
5. Adhere to the Code of Conduct and be familiar with all compliance policies and procedures stored in LogixGarden relevant to this position

Qualifications
To perform this job successfully, an individual must be able to perform each duty satisfactorily. The requirements listed below are representative of the knowledge, skills, and/or ability required. Reasonable accommodation may be made to enable individuals with disabilities to perform the duties.

Experience
1. 8+ years infrastructure engineering experience
2. 3+ years in a senior, staff or principal engineer role
3. Experience designing and building cloud-native solutions (Azure, AWS, Google Cloud Platform)
4. Experience with Infrastructure as Code tools (Terraform, Pulumi, cloud-specific IaC tools)
5. Experience with configuration management tools (Ansible, Chef, SaltStack)
6. Experience with containerization and orchestration technologies (Docker, Kubernetes)
7. Experience leading projects within a team and across teams
8. Azure experience preferred
9. Azure Databricks implementation experience preferred
10. Experience with one or more CNCF projects preferred
11. Experience designing and implementing infrastructure security and governance platforms adhering to compliance standards (HIPAA, SOC 2) preferred

Specific Job Knowledge, Skill and Ability
1. Possess a passion for mentoring and guiding others
2. Strong programming skills in Python, TypeScript, Golang, C#, or other languages (Bash, PowerShell)
3. Strong written and verbal communication skills
4. Expert knowledge in architecting, designing and implementing infrastructure solutions to serve the needs of our data processes and software products
5. Ability to keep security, maintainability, and scalability in mind with the solutions built
6. Possess excellent interpersonal communication skills and an aptitude for continued learning

Posted 1 month ago

Apply

10.0 - 15.0 years

1 - 1 Lacs

Bengaluru

Remote

We are seeking a highly experienced Data Architect / Data Modeler to design and govern the data architecture of a critical MI platform. The role requires deep expertise in data modeling, data integration, and cloud-based data platforms.

Posted 1 month ago

Apply

6.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

Job Title: Azure Data Factory Engineer
Location State: Maharashtra
Location City: Pune
Experience Required: 6 to 8 Year(s)
CTC Range: 7 to 11 LPA
Shift: Day Shift
Work Mode: Onsite
Position Type: C2H
Openings: 2
Company Name: VARITE INDIA PRIVATE LIMITED

About The Client: The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is part of the Tata Group and operates in 150 locations across 46 countries.

About The Job: A minimum of 5 years' experience with large SQL data marts and expert relational database experience. The candidate should demonstrate the ability to navigate massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners. Experience in troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues and user access/management; designing database backup, archiving, and storage; performance tuning; ETL import of large volumes of data extracted from multiple systems; and capacity planning.

Essential Job Functions: Strong knowledge of Extraction, Transformation and Loading (ETL) processes using frameworks like Azure Data Factory, Synapse, or Databricks; establishing cloud connectivity between different systems such as ADLS, ADF, Synapse, and Databricks.

Qualifications: Skill Required: Digital : PySpark~Azure Data Factory

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE.
Exp Req - Referral Bonus
0 - 2 Yrs. - INR 5,000
2 - 6 Yrs. - INR 7,500
6+ Yrs. - INR 10,000

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Hyderabad

Hybrid

Key Responsibilities:
• Designing and building a scalable data warehouse using Azure Data Factory (ADF), Azure Synapse Pipelines, and SSIS.
• Creating visually appealing BI dashboards using Power BI and other reporting tools to deliver data-driven insights.
• Collaborating with cross-functional teams, communicating complex concepts, and ensuring data governance and quality standards.

Basic Qualifications:
• 9-12 years of strong Business Intelligence/Business Analytics experience or equivalency is preferred.
• B.Tech/B.E. - Any Specialization; M.E/M.Tech - Any Specialization
• Strong proficiency in SQL and experience with database technologies (e.g., SQL Server).
• Solid understanding of data modeling, data warehousing, and ETL concepts.
• Excellent analytical and problem-solving skills, with the ability to translate complex business requirements into practical solutions.
• Strong communication and collaboration skills, with the ability to effectively interact with stakeholders at all levels of the organization.
• Proven ability to work independently and manage multiple priorities in a fast-paced environment.
• Must have worked on ingesting data into the Enterprise Data Warehouse.
• Good experience in Business Intelligence and Reporting, including but not limited to on-prem and cloud technologies.
• Must have exposure to the complete MSBI stack, including Power BI, and be able to deliver end-to-end BI solutions independently.
• Must have technical expertise in creating data pipelines / data integration strategies using SSIS / ADF / Synapse Pipelines.

Preferred Qualifications:
• Hands-on experience with DBT and Fabric will be preferred.
• Proficiency in programming languages such as Python or R is a plus.

Posted 1 month ago

Apply

4.0 - 8.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Role & responsibilities
• Develop and implement generative AI solutions, leveraging Microsoft's cutting-edge technologies
• Work closely with cross-functional teams to prototype generative AI solutions and validate business needs and requirements
• Collaborate closely with stakeholders to align solutions to business needs
• Consider production deployment requirements as part of solution design
• Optimize solutions using the most appropriate models, designing for cost effectiveness, efficiency, and accuracy
• Collaborate with data scientists to integrate machine learning models into proposed solutions

Preferred candidate profile
Skills:
• Proficiency in Python and C# (especially in Azure-integrated environments)
• Experience with or exposure to the following Microsoft technologies: Azure AI Studio, Azure OpenAI Service, Azure AI Search (Cognitive Search), Fabric, Copilot Studio, Cognitive Services (Language, Speech, Vision APIs)
• Experience with foundation models such as GPT-3.5 Turbo, GPT-4, etc., and RAG (retrieval-augmented generation) using Azure AI Search or other vector databases
• Event-driven architectures and the cloud services that support them (e.g., Azure Service Bus, Kafka, Azure Event Grid, RabbitMQ)
• Cloud software development, including coding, testing, and debugging on the Microsoft technology stack: .NET (Core/6+), Entity Framework, C#, TypeScript, MS SQL, Azure SQL, Cosmos DB
• Prompt flow and evaluation flow for testing/validating models, and exposure to Azure Content Safety, is considered an asset
• Experience with the following is considered an asset: data modeling and analytics, Azure SQL, Azure Synapse, Azure Databricks, Azure Data Factory

Certifications:
1. Microsoft Certified: Azure AI Engineer Associate (AI-102), with focus on NLP, vision, responsible AI, Azure Cognitive Services, Azure OpenAI

Posted 1 month ago

Apply

7.0 - 12.0 years

18 - 27 Lacs

Hyderabad

Work from Office

Snowflake Data Engineering (Snowflake, DBT & ADF) Lead Programmer Analyst (Experience: 7 to 12 Years)
We are looking for a highly self-motivated Snowflake Data Engineering (Snowflake, DBT & ADF) Lead Programmer Analyst with:
• At least 5+ years of experience in designing and developing data pipelines and assets.
• At least 5 years of experience with at least one columnar MPP cloud data warehouse (Snowflake / Azure Synapse / Redshift).
• 4 years of experience with ETL tools like Azure Data Factory and Fivetran / DBT.
• Experience with Git and Azure DevOps.
• Experience in Agile, Jira, and Confluence.
• Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
• Working knowledge of Azure architecture and Data Lake.
• Willingness to contribute to documentation (e.g., mapping, defect logs), and ability to generate functional specs for code migration or ask the right questions thereof.
• Hands-on programming skills with a thorough understanding of performance-tuning techniques.
• Experience handling large data volume transformations (on the order of 100 GB monthly).
• Ability to create solutions/data flows to suit requirements and produce timely documentation (e.g., mapping, UTR, defect/KEDB logs).
• Self-starter and learner, able to understand and probe for requirements.
Tech experience expected - Primary: Snowflake, DBT (development & testing); Secondary: Python, ETL or any data processing tool. Nice to have: domain experience in Healthcare.
Should have good oral and written communication, be a good team player, and be proactive and adaptive.

Posted 1 month ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Mohali, Pune

Work from Office

Experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Storage, SQL, Git, CI/CD, Azure DevOps, RESTful APIs, Data APIs, event-driven architecture, data governance, lineage, security, and privacy best practices. Immediate joiners only.

Required Candidate Profile: Data Warehousing, Data Lake, Azure Cloud Services, Azure DevOps, ETL (SSIS, ADF, Synapse), SQL Server, Azure SQL, data transformation, modelling, and integration. Microsoft Certified: Azure Data Engineer.

Posted 1 month ago

Apply

9.0 - 14.0 years

27 - 40 Lacs

Hyderabad

Remote

Experience Required: 8+ Years
Mode of Work: Remote
Skills Required: Azure Databricks, Event Hubs, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark
Notice Period: Immediate joiners / permanent / contract role (can join by July 4th, 2025)

Responsibilities
• Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
• Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights.
• Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks.
• Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
• Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
• Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
• Provide technical guidance and expertise to junior data engineers and developers.
• Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering.
• Contribute to the continuous improvement of data engineering processes, tools, and best practices.

Requirements:
• Bachelor's or master's degree in computer science, engineering, or a related field.
• 10+ years of experience as a Data Engineer with a focus on building cloud-based data solutions.
• Mandatory skills: Azure Databricks, Event Hubs, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark.
• Strong experience with cloud platforms such as Azure or AWS.
• Proficiency in Apache Spark and Databricks for large-scale data processing and analytics.
• Experience in designing and implementing data processing pipelines using Spark and Databricks.
• Strong knowledge of SQL and experience with relational and NoSQL databases.
• Experience with data integration and ETL processes using tools like Apache Airflow or cloud-native orchestration services.
• Good understanding of data modelling and schema design principles.
• Experience with data governance and compliance frameworks.
• Excellent problem-solving and troubleshooting skills.
• Strong communication and collaboration skills to work effectively in a cross-functional team.

Interested candidates can share their resume, or refer a friend, to Pavithra.tr@enabledata.com for a quick response.
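As a loose illustration of the data-quality gating this role describes (ensuring quality and completeness at each stage of a pipeline), a minimal sketch in plain Python is shown below. The record shape, field names, and rules are hypothetical, not taken from the posting; in a real Spark/Databricks pipeline the same idea would be expressed with DataFrame filters or expectation frameworks rather than Python lists.

```python
# Hypothetical sketch of a data-quality gate in an ingestion pipeline.
# Field names and validation rules are illustrative only.

def validate_records(records, required_fields):
    """Split records into passing and failing sets based on completeness."""
    passed, failed = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        (failed if missing else passed).append(rec)
    return passed, failed

def completeness(passed, total):
    """Completeness metric: share of records that passed validation."""
    return len(passed) / total if total else 1.0

batch = [
    {"id": 1, "amount": 250.0, "currency": "INR"},
    {"id": 2, "amount": None, "currency": "INR"},   # fails: missing amount
    {"id": 3, "amount": 99.5, "currency": ""},      # fails: empty currency
]
good, bad = validate_records(batch, ["id", "amount", "currency"])
score = completeness(good, len(batch))  # 1 of 3 records passes
```

Failing records would typically be routed to a quarantine table or defect log rather than dropped, so the pipeline's quality score stays auditable.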

Posted 1 month ago

Apply