378 Azure Synapse Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 10.0 years

8 - 10 Lacs

Chennai

Remote

Title: Senior Data Architect
Years of Experience: 10+ years
Location: Onsite (the selected candidate is required to relocate to Kovilpatti/Chennai, Tamil Nadu, for the initial three-month project training session).

Job Description
The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments.

Key Responsibilities
• Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns
• Design logical and physical data models, semantic layers, and metadata frameworks
• Establish data quality, lineage, governance, and security policies
• Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks
• Integrate AI and analytics solutions with operational data platforms
• Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake
• Lead architecture reviews, design sessions, and CoE reference architecture development

Technical Skills
• Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift
• Data Modeling: ERWin, dbt, PowerDesigner
• Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark
• Integration: Azure Data Factory, Kafka, Event Grid, SSIS
• Metadata/Lineage: Purview, Collibra, Informatica
• BI Platforms: Power BI, Tableau, Looker
• Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA

Qualification
• Bachelor's or Master's in Computer Science, Information Systems, or Data Engineering
• Microsoft Certified: Azure Data Engineer / Azure Solutions Architect
• Strong experience building cloud-native data architectures
• Demonstrated ability to create data blueprints aligned with business strategy and compliance.

Posted 1 month ago

3.0 - 8.0 years

7 - 12 Lacs

Kochi

Hybrid

Role & Responsibilities
• Report and Dashboard Development: Design, develop, and deploy interactive Power BI reports and dashboards to meet the needs of various business units.
• Data Modeling: Develop and maintain data models to support business requirements, ensuring efficient data retrieval and reporting.
• Data Integration: Integrate data from various sources into SQL Server and Azure Synapse, ensuring data accuracy and consistency.
• Collaboration: Work closely with stakeholders, including business analysts, data scientists, and management, to gather requirements and deliver solutions that drive business insights.
• Documentation: Create and maintain comprehensive documentation for data models, reports, dashboards, and processes.
• Performance Monitoring: Monitor and optimize the performance of BI solutions, identifying and resolving issues proactively.
• Training and Support: Provide support to end users on Power BI functionalities and best practices.

Preferred Candidate Profile
• Excellent analytical, troubleshooting, problem-solving, and research skills.
• Able to multitask, with experience interacting with a diverse user/customer base.
• 2-3 years of experience in BI development and data analysis.
• 3-5 years of experience with Power BI report development and deployment.
• 2-3 years of experience using SQL Server and/or Azure Synapse Analytics.
• Excellent written, verbal, and interpersonal communication skills.
• 2-3 years of experience with data warehouse concepts, including the use of Extract, Transform, and Load (ETL) tools.
• Experience with cloud architecture, NoSQL databases, and R/Python is a plus.
• Experience building data pipelines that integrate with unstructured data sources is a plus.
• Sales/marketing business background is a plus.

Posted 1 month ago

16.0 - 18.0 years

30 - 36 Lacs

Bengaluru

Work from Office

• Data strategist, team leader, and financial advisor to clients
• Strong understanding of finance & accounting principles
• End-to-end BI lifecycle, from solution architecture to insights delivery
• Skills: DAX, ETL, SQL, Azure, Power BI, MSBI, DWH
Benefits: Provident fund, health insurance, annual bonus

Posted 1 month ago

6.0 - 11.0 years

9 - 19 Lacs

Chennai, Bengaluru

Hybrid

Hello, greetings from LTIMindtree! We are hiring for Azure Data Engineer (PAN India)!

Job Description
Notice Period: 0 to 60 days only
Experience: 5 to 12 years
Interview Mode: F2F (12th July 2025); Hybrid (2-3 days WFO)

Job Summary: We are seeking an experienced and strategic Data Engineer to design, build, and optimize scalable, secure, and high-performance data solutions. You will play a pivotal role in shaping our data infrastructure, working with technologies such as Databricks, Azure Data Factory, Unity Catalog, and Spark, while aligning with best practices in data governance, pipeline automation, and performance optimization.

Key Responsibilities:
• Design and develop scalable data pipelines using Databricks and the Medallion Architecture (Bronze, Silver, Gold layers) — a minimal sketch follows this listing.
• Architect and implement data governance frameworks using Unity Catalog and related tools.
• Write efficient PySpark and SQL code for data transformation, cleansing, and enrichment.
• Build and manage data workflows in Azure Data Factory (ADF), including triggers, linked services, and integration runtimes.
• Optimize queries and data structures for performance and cost efficiency.
• Develop and maintain CI/CD pipelines using GitHub for automated deployment and version control.
• Collaborate with cross-functional teams to define data strategies and drive data quality initiatives.
• Implement best practices for DevOps, CI/CD, and infrastructure-as-code in data engineering.
• Troubleshoot and resolve performance bottlenecks across Spark, ADF, and Databricks pipelines.
• Maintain comprehensive documentation of architecture, processes, and workflows.

Requirements:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Proven experience as a Data Architect or Senior Data Engineer.
• Strong knowledge of Databricks, Azure Data Factory, Spark (PySpark), and SQL.
• Hands-on experience with data governance, security frameworks, and catalog management.
• Proficiency in cloud platforms (preferably Azure).
• Experience with CI/CD tools and version control systems like GitHub.
• Strong communication and collaboration skills.
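For readers unfamiliar with the Medallion pattern named in this listing, here is a minimal, hedged sketch of a Bronze-to-Silver step in PySpark on Databricks with Delta Lake. The paths, database names, table names, and columns are hypothetical illustrations, not taken from the posting, and the sketch assumes the bronze/silver schemas already exist.

```python
# Minimal Medallion-style sketch (hypothetical names): land raw JSON in a Bronze Delta
# table, then cleanse and conform it into a Silver table with PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: ingest raw files as-is, adding an audit column for lineage.
bronze = (spark.read.json("/mnt/raw/orders/")            # hypothetical landing path
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: deduplicate, enforce types, and drop malformed rows.
silver = (spark.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("order_id").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```

A Gold layer would typically follow the same shape, aggregating Silver tables into business-level marts.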

Posted 1 month ago

2.0 - 7.0 years

6 - 16 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Exciting Azure Developer job opportunity at Infosys! We are looking for skilled Azure Developers to join our dynamic team PAN India. If you have a passion for technology and 2 to 9 years of hands-on experience in Azure development, this is your chance to make an impact. At Infosys, we value innovation, collaboration, and diversity. We believe that a diverse workforce drives creativity and fosters a richer company culture; therefore, we strongly encourage applications from all genders and backgrounds. Ready to take your career to the next level? Join us in shaping the future of technology. Visit our careers page for more details on how to apply.

Posted 1 month ago

6.0 - 11.0 years

10 - 20 Lacs

Chennai

Work from Office

Job Description
Primary: Azure, Databricks, ADF, PySpark/Python
Secondary: Data warehouse, SAS/Alteryx

Must Have
• 6+ years of IT experience in data warehousing and ETL
• Hands-on data experience with cloud technologies on Azure: ADF, Synapse, PySpark/Python
• Ability to understand design and source-to-target mapping (STTM) and create specification documents
• Flexibility to operate from client office locations
• Able to mentor and guide junior resources, as needed

Nice to Have
• Any relevant certifications
• Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail

Posted 1 month ago

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad/Secunderabad

Work from Office

Title: Sr. Azure Data Engineer
Experience: 5 to 8 years
Location: Hyderabad
Working Hours: 12.00 pm to 9.00 pm
Key Skills: SQL, Azure Data Factory, Azure Data Lake, Azure Databricks, Synapse, Data Fabric architecture

We are looking for an experienced data engineer to join our team. You will use various methods to transform raw data into useful data systems. For example, you'll create pipelines and help to design data warehousing systems. Overall, you'll strive for efficiency by aligning data systems with business goals. To succeed in this data engineering position, you should have strong analytical skills and the ability to combine data from different sources. Data engineer skills also include familiarity with several programming languages and knowledge of machine learning methods.

Responsibilities
• Analyze and organize raw data
• Build data systems and pipelines to support reports and dashboards in Power BI or other analytical tools
• Evaluate business needs and objectives
• Conduct complex data analysis and report on results
• Combine raw information from different sources
• Explore ways to enhance data quality and reliability
• Collaborate with data scientists and architects on several projects

Requirements and Skills
• Previous experience as a data engineer or in a similar role
• Experience with Data Fabric architecture
• Technical expertise with data models, data mining, and segmentation techniques
• Hands-on experience with SQL database design
• Experience with SQL Server querying and stored procedures
• Great numerical and analytical skills
• Experience with Azure Synapse, Snowflake, or another cloud-based data warehouse
• Experience with ETL and pipelines
• Degree in Computer Science, IT, or a similar field

Posted 1 month ago

7.0 - 12.0 years

1 - 2 Lacs

Hyderabad

Hybrid

Role & Responsibilities
5 to 8 years of experience in data engineering or a related field, with at least 2 years of hands-on experience with Databricks, PySpark, Azure Synapse, and cloud platforms (preferably Azure).

Technical Skills
• Strong expertise in Python/PySpark programming applied to data engineering.
• Solid experience with cloud platforms (Azure preferred; AWS/GCP) and their data services.
• Advanced SQL skills and familiarity with relational and NoSQL databases (e.g., MS SQL, MySQL, PostgreSQL).
• Strong understanding of CI/CD pipelines and automation practices in data engineering.
• Experience with large-scale data processing on cloud platforms.

Soft Skills
• Strong analytical and problem-solving skills.
• Excellent communication skills and the ability to work collaboratively across teams.
• High attention to detail with a proactive approach to improving systems and processes.

Education
Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.

Key Responsibilities
Data Infrastructure and Pipeline Development
• Design, develop, and maintain complex ETL/ELT pipelines using Azure Synapse.
• Design, build, and maintain data pipelines and APIs in the cloud environment, with a focus on the Azure cloud platform.
• Optimize data pipelines for performance, scalability, and cost efficiency.
• Implement data governance, quality, and security best practices throughout data workflows.
Cloud Platform Management
• Design and manage cloud-based data infrastructure on Azure and other cloud platforms.
• Utilize cloud-native tools and services to enhance data storage and processing capabilities.
• Build and manage CI/CD pipelines for data engineering projects to ensure smooth deployments and automation.
Programming and Automation
• Write and maintain high-quality, reusable code in Azure Synapse environments for data processing and automation.

Posted 1 month ago

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Gurugram

Work from Office

Job Description
About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations, drawing on our experience in retail, high technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Azure Data Engineer
Experience: 4+ years
Skill Set: Azure Synapse, PySpark, ADF, and SQL
Location: Pune, Hyderabad, Gurgaon

• 5+ years of experience in software development, technical operations, and running large-scale applications.
• 4+ years of experience in developing or supporting Azure Data Factory (API/APIM), Azure Databricks, Azure DevOps, Azure Data Lake Storage (ADLS), SQL and Synapse data warehouse, and Azure Cosmos DB.
• 2+ years of experience working in data engineering.
• Any experience in data virtualization products like Denodo is desirable.
• Azure Data Engineer or Solutions Architect certification is desirable.
• Should have a good understanding of container platforms like Docker and Kubernetes.
• Should be able to assess the application/platform from time to time for architectural improvements and provide inputs to the relevant teams.
• Very good troubleshooting skills (quick identification of application issues and quick resolutions with no or minimal user/business impact).
• Hands-on experience working with high-volume, mission-critical applications.
• Deep appreciation of IT tools, techniques, systems, and solutions.
• Excellent communication skills, along with experience driving triage calls involving different technical stakeholders.
• Creative problem-solving skills for cross-functional issues amidst changing priorities.
• Flexible and resourceful in swiftly managing changing operational goals and demands.
• Good experience handling escalations, taking complete responsibility and ownership of critical issues through to technical/logical closure.
• Good understanding of the IT Infrastructure Library (ITIL) framework and the various IT Service Management (ITSM) tools available in the marketplace.

Posted 1 month ago

4.0 - 9.0 years

4 - 9 Lacs

Kolkata, Hyderabad, Bengaluru

Work from Office

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
• You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
• You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
• You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of more than one technology
• Basics of architecture and design fundamentals
• Knowledge of testing tools
• Knowledge of agile methodologies
• Understanding of project life cycle activities on development and maintenance projects
• Understanding of one or more estimation methodologies; knowledge of quality processes
• Basics of the business domain to understand the business requirements
• Analytical abilities, strong technical skills, good communication skills
• Good understanding of the technology and domain
• Awareness of the latest technologies and trends
• Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
• Excellent problem solving, analytical, and debugging skills

Technical and Professional Requirements:
Primary skills: Azure Databricks
Preferred Skills: Technology -> Cloud Platform -> Azure Development & Solution Architecting

Posted 1 month ago

6.0 - 11.0 years

17 - 25 Lacs

Pune, Chennai, Bengaluru

Hybrid

Job Title: Data Engineer
Experience: 6+ years
Location: Chennai, Coimbatore, Bangalore, Pune

About the Role
We are seeking a skilled Data Engineer with hands-on experience in Microsoft Fabric and Azure Synapse Analytics to build scalable data pipelines, optimize data models, and modernize analytics platforms in a cloud-first environment.

Key Skills Required
• Microsoft Fabric components (Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI integration)
• Azure Synapse Analytics
• Azure data engineering
• Python / PySpark
• Good understanding of ETL/ELT processes and data warehouse best practices

Good to Have
• Azure and Microsoft Fabric certifications
• Experience with real-time and event-driven data processing
• Familiarity with data governance tools

Why Join Us?
• Join KANINI's award-winning Data Engineering Team, recognized as the "Outstanding Data Engineering Team" at DES 2025
• Work on real-world problems at the intersection of data, AI, and product development
• Be part of a collaborative, innovative, and growth-driven environment
• Access to cutting-edge tools, leadership support, and continuous learning opportunities
• Enjoy flexible work options and competitive compensation

Posted 1 month ago

3.0 - 7.0 years

10 - 16 Lacs

Gurugram

Hybrid

5+ years of total experience in the IT industry as a developer/senior developer/data engineer. 3+ years of experience working extensively with Azure services such as Azure Data Factory, Azure Synapse, Azure Data Lake, Azure SQL, and Data Management.

Required Candidate Profile
3+ years of experience working extensively with Azure SQL and MS SQL Server, and good exposure to writing complex SQL queries.
Call Vikas: 8527840989 | Email: vikasimaginators@gmail.com

Posted 1 month ago

9.0 - 14.0 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role & Responsibilities
• Design and implement end-to-end data solutions on Microsoft Azure, including data lakes, data warehouses, and ETL/ELT processes.
• Develop scalable and efficient data architectures that support large-scale data processing and analytics workloads.
• Ensure high performance, security, and compliance within Azure data solutions.
• Know various techniques (lakehouse, warehouse) and have experience implementing them.
• Evaluate and choose appropriate Azure services such as Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks (configuring, costing, etc.), Unity Catalog, and Azure Data Factory. Should have deep knowledge and hands-on experience with these Azure data services. Ideally, knowledgeable and experienced with Microsoft Fabric.
• Work closely with business and technical teams to understand and translate data needs into robust, scalable data architecture solutions.
• Experience with data governance, data privacy, and compliance requirements.
• Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
• Provide expertise and leadership to the development team implementing data engineering solutions.
• Collaborate with data scientists, analysts, and other stakeholders to ensure data architectures align with business goals and data analysis requirements.
• Optimize cloud-based data infrastructure for performance, cost-effectiveness, and scalability.
• Analyze data workloads and recommend optimizations for performance tuning, cost management, and reducing complexity.
• Monitor and address any issues related to performance and availability in cloud-based data solutions.
• Experience in programming languages (e.g., SQL, Python, Scala).
• Hands-on experience using MS SQL Server, Oracle, or a similar RDBMS platform.
• Experience in Azure DevOps and CI/CD pipeline development.
• Hands-on experience working at a high level in architecture, data science, or a combination of the two.
• In-depth understanding of database structure principles.
• Distributed data processing of big data batch or streaming pipelines.
• Familiarity with data visualization tools (e.g., Power BI, Tableau).
• Data modeling and strong analytics skills. The candidate must be able to take OLTP data structures and convert them into a star schema (a minimal sketch follows this listing). Ideally, the candidate should have dbt experience along with data modeling experience.
• Problem-solving attitude; highly self-motivated, self-directed, and attentive to detail; ability to prioritize and execute tasks effectively. Attitude and aptitude are highly important at Hitachi; we are a very collaborative group.

Preferred Candidate Profile
• Azure SQL Data Warehouse
• Azure Data Factory
• Azure Data Lake
• Azure Analysis Services
• Databricks/Spark
• Python or Scala (Python preferred)
• Data modeling
• Power BI
• Database migration from legacy systems to new solutions
• Design conceptual, logical, and physical data models using tools like ER Studio or Erwin
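As a rough illustration of the OLTP-to-star-schema conversion this listing asks for, the sketch below derives a customer dimension with surrogate keys and a sales fact table from normalized source tables using PySpark. All table and column names are hypothetical and the Delta/Spark setup is assumed, not specified by the posting.

```python
# Hypothetical OLTP-to-star-schema sketch: build dim_customer and fact_sales
# from normalized source tables, then persist them as Delta tables.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

orders = spark.table("oltp.orders")          # hypothetical normalized sources
customers = spark.table("oltp.customers")

# Dimension: one row per customer, with a generated surrogate key.
dim_customer = (customers
                .select("customer_id", "customer_name", "country")
                .dropDuplicates(["customer_id"])
                .withColumn("customer_sk",
                            F.row_number().over(Window.orderBy("customer_id"))))

# Fact: measures at order grain, keyed by the dimension's surrogate key.
fact_sales = (orders
              .join(dim_customer.select("customer_sk", "customer_id"), "customer_id")
              .select("customer_sk",
                      F.to_date("order_ts").alias("order_date"),
                      F.col("quantity").cast("int").alias("quantity"),
                      F.col("amount").cast("decimal(18,2)").alias("amount")))

dim_customer.write.format("delta").mode("overwrite").saveAsTable("gold.dim_customer")
fact_sales.write.format("delta").mode("overwrite").saveAsTable("gold.fact_sales")
```

In a dbt-based workflow the same dimension and fact logic would typically live in SQL models rather than a PySpark job; the shape of the transformation is the same.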

Posted 1 month ago

12.0 - 16.0 years

18 - 25 Lacs

Thane

Work from Office

• Architecting a modern data platform
• Experience in MDM platforms
• Manage end-to-end deliveries for data engineering, EDW, and data lake platforms
• Data modelling
• Maintain a robust data catalogue
• Manage the Azure cloud platform for data and analytics

Posted 1 month ago

3.0 - 7.0 years

10 - 16 Lacs

Gurugram

Work from Office

5+ years of total experience in the IT industry as a developer/senior developer/data engineer. 3+ years of experience working extensively with Azure services such as Azure Data Factory, Azure Synapse, and Azure Data Lake.

Required Candidate Profile
3+ years of experience working extensively with Azure SQL and MS SQL Server, and good exposure to writing complex SQL queries.
Call 7042331616 or send your CV to supreet.imaginators@gmail.com

Posted 1 month ago

9.0 - 14.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Location: Bangalore
Experience: 8+ years
Work Mode: Hybrid

Hands-on experience with:
• Azure Data Factory (ADF)
• Azure Synapse Analytics
• Azure SQL Database / SQL Server
• Azure Databricks or Apache Spark
• Azure Blob Storage / Data Lake Storage Gen2
Strong SQL skills and experience in performance tuning.
Familiarity with data modeling (star/snowflake schemas) and ETL best practices.

Posted 1 month ago

9.0 - 14.0 years

25 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & Responsibilities
Job Overview: We are looking for a Senior Data Engineer with strong expertise in SQL, Python, Azure Synapse, Azure Data Factory, Snowflake, and Databricks. The ideal candidate should have a solid understanding of SQL (DDL, DML, query optimization) and ETL pipelines, while demonstrating a learning mindset to adapt to evolving technologies.

Key Responsibilities:
• Collaborate with business and IT stakeholders to define business and functional requirements for data solutions.
• Design and implement scalable ETL/ELT pipelines using Azure Data Factory, Databricks, and Snowflake.
• Develop detailed technical designs, data flow diagrams, and future-state data architecture.
• Evangelize modern data modelling practices, including entity-relationship models, star schema, and the Kimball methodology.
• Ensure data governance, quality, and validation by working closely with quality engineering teams.
• Write, optimize, and troubleshoot complex SQL queries, including DDL, DML, and performance tuning.
• Work with Azure Synapse, Azure Data Lake, and Snowflake for large-scale data processing.
• Implement DevOps and CI/CD best practices for automated data pipeline deployments.
• Support real-time streaming data processing with Spark, Kafka, or similar technologies.
• Provide technical mentorship and guide team members on best practices in SQL, ETL, and cloud data solutions.
• Stay up to date with emerging cloud and data engineering technologies and demonstrate a continuous learning mindset.

Posted 1 month ago

3.0 - 8.0 years

0 Lacs

Chennai

Work from Office

Responsibilities:
• Design and implement data pipelines using Azure Synapse from a variety of data sources and file formats into a SQL Server database (a minimal sketch follows this listing)
• Implement batch and real-time data pipelines from various data sources to the data warehouse and data lake
• Work with the Data Architect on new data projects by building data pipelines and master data management
• Perform data analysis, extraction, cleansing, column mapping, data transformations, and data modelling based on business requirements
• Ensure data availability on the Azure SQL Data Warehouse by monitoring and troubleshooting data pipelines

Skillset:
• Must have 3+ years of experience in the design and development of ETL pipelines in Azure Synapse or Azure Data Factory
• Must have hands-on experience with related Azure services such as ADLS Gen2, Databricks, Azure SQL, and Logic Apps
• Must have strong implementation knowledge of PySpark and advanced SQL for data transformations
• Must have handled structured, semi-structured, and unstructured data formats
• Must have a clear understanding of data warehouse and data lake modelling, and of ETL performance optimization
• Good to have working knowledge of consuming APIs in ETL pipelines
• Good to have working knowledge of Power BI and manufacturing data analytics & reporting
• Bachelor's or Master's degree in Information Technology, Computer Science, or related disciplines
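To make the "varied file formats into SQL Server" step concrete, here is a minimal, hedged PySpark sketch: cleanse a CSV landed in ADLS Gen2 and load it into an Azure SQL Database table over JDBC. The storage path, server, table, and credentials are hypothetical, and the SQL Server JDBC driver is assumed to be available on the Spark cluster; in Synapse itself the same flow is often built with pipelines and dedicated connectors instead.

```python
# Minimal sketch (hypothetical names): CSV in ADLS Gen2 -> cleansed DataFrame -> Azure SQL.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-to-sql-sketch").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://landing@examplelake.dfs.core.windows.net/sales/"))  # hypothetical path

cleansed = (raw
            .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
            .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
            .dropna(subset=["order_id"])
            .dropDuplicates(["order_id"]))

(cleansed.write
 .format("jdbc")
 .option("url", "jdbc:sqlserver://example-server.database.windows.net:1433;database=dw")
 .option("dbtable", "stg.sales_orders")
 .option("user", "etl_user")          # in practice, pull credentials from Key Vault
 .option("password", "<secret>")
 .mode("append")
 .save())
```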

Posted 1 month ago

6.0 - 11.0 years

5 - 9 Lacs

Pune, Chennai, Bengaluru

Work from Office

Your Role
As a senior software engineer with Capgemini, you will have 6+ years of experience in Azure Fabric technology with a strong project track record. In this role you will bring:
• Strong customer orientation, decision making, problem solving, communication, and presentation skills
• Very good judgement skills and the ability to shape compelling solutions and solve unstructured problems with assumptions
• Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies
• Strong executive presence and entrepreneurial spirit
• Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority

Your Profile
• Design, develop, and maintain data pipelines using Azure Data Factory, Azure Databricks, and Azure Synapse
• Implement ETL solutions to integrate data from various sources into Azure Data Lake and the data warehouse
• Hands-on experience with SQL, Python, and PySpark for data processing
• Expertise in building Power BI dashboards and reports
• Strong DAX and Power Query skills
• Experience with Power BI Service, gateways, and embedding reports
• Develop Power BI datasets, semantic models, and row-level security for data access control

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.

About Capgemini
Location: Bengaluru, Pune, Chennai, Mumbai

Posted 1 month ago

15.0 - 20.0 years

17 - 20 Lacs

Mumbai

Work from Office

This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack.
• Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform.
• Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI.
• Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory.
• Data Governance: Establish and enforce data governance policies and standards.

Primary Skills
Experience:
• 15+ years of relevant experience in data warehousing, BI, and data governance.
• Proven track record of delivering successful data solutions on the Microsoft stack.
• Experience working with diverse teams and stakeholders.

Required Skills and Experience
Technical Skills:
• Strong proficiency in data warehousing concepts and methodologies.
• Expertise in Microsoft Power BI.
• Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
• Knowledge of SQL and scripting languages (Python, PowerShell).
• Strong understanding of data modeling and ETL/ELT processes.

Secondary Skills
Soft Skills:
• Excellent communication and interpersonal skills.
• Strong analytical and problem-solving abilities.
• Ability to work independently and as part of a team.
• Strong attention to detail and organizational skills.

Posted 1 month ago

4.0 - 9.0 years

6 - 10 Lacs

Kolkata

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role
• Develop and maintain data pipelines tailored to Azure environments, ensuring security and compliance with client data standards.
• Collaborate with cross-functional teams to gather data requirements, translate them into technical specifications, and develop data models.
• Leverage Python libraries for data handling, enhancing processing efficiency and robustness.
• Ensure SQL workflows meet client performance standards and handle large data volumes effectively.
• Build and maintain reliable ETL pipelines, supporting full and incremental loads and ensuring data integrity and scalability in ETL processes.
• Implement CI/CD pipelines for automated deployment and testing of data solutions.
• Optimize and tune data workflows and processes to ensure high performance and reliability.
• Monitor, troubleshoot, and optimize data processes for performance and reliability.
• Document data infrastructure and workflows, and maintain industry knowledge in data engineering and cloud tech.

Your Profile
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 4+ years of data engineering experience with a strong focus on Azure data services for client-centric solutions.
• Extensive expertise in Azure Synapse, Data Lake Storage, Data Factory, Databricks, and Blob Storage, ensuring secure, compliant data handling for clients.
• Good interpersonal communication skills.
• Skilled in designing and maintaining scalable data pipelines tailored to client needs in Azure environments.
• Proficient in SQL and PL/SQL for complex data processing and client-specific analytics.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

10.0 - 20.0 years

25 - 40 Lacs

Chennai, Pune, Gurgaon, Hyderabad/Bangalore

Work from Office

Function: Software Engineering - Big Data / DWH / ETL
Skills: Azure Data Factory, Azure Synapse, ETL, Spark, SQL, Scala

Responsibilities:
• Designing and implementing scalable and efficient data architectures.
• Creating data models and optimizing data structures for performance and usability.
• Implementing and managing data lakehouses and real-time analytics solutions using Microsoft Fabric.
• Leveraging Fabric's OneLake, Dataflows, and Synapse Data Engineering for seamless data management.
• Enabling end-to-end analytics and AI-powered insights.
• Developing and orchestrating data pipelines in Azure Data Factory.
• Managing ETL/ELT processes for data integration across various sources.
• Optimizing data workflows for performance and cost efficiency.
• Designing interactive dashboards and reports in Power BI.
• Implementing data models, DAX calculations, and performance optimizations.
• Ensuring data quality, security, and governance in reporting solutions.

Requirements:
Data Architect with 10+ years of experience and Microsoft Fabric skills, who designs and implements data solutions using Fabric, focusing on data integration, analytics, and automation, while ensuring data quality, security, and compliance.
Primary Skills (Must Have): Azure Data Pipeline, Apache Spark, ETL, Azure Data Factory, Azure Synapse, Azure Functions, Spark SQL, SQL.
Secondary Skills (Good to Have): Other Azure services, Python/Scala, DataStage (preferably), and Fabric.

Posted 1 month ago

6.0 - 8.0 years

0 Lacs

Hyderabad

Work from Office

We are looking for a skilled and analytical Power BI Consultant to join our team. The ideal candidate will have strong experience in business intelligence and data visualization using Power BI. You will be responsible for designing, developing, and deploying BI solutions that provide actionable insights and help drive strategic decisions across the organization.

Key Responsibilities:
• Work closely with business stakeholders to gather, understand, and document reporting requirements.
• Design, develop, and deploy Power BI dashboards and reports tailored to business needs.
• Transform raw data into meaningful and interactive visualizations.
• Develop and maintain datasets, data models, and data pipelines.
• Optimize data models and DAX queries for performance and usability.
• Perform data analysis and data validation to ensure data accuracy and integrity.
• Collaborate with data engineers, analysts, and IT teams to ensure seamless data integration.
• Provide user training and support for Power BI tools and reports.
• Monitor and troubleshoot Power BI reports and dashboards to ensure smooth operation.
• Recommend enhancements and best practices for Power BI architecture and governance.

Required Skills:
• Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
• Proven experience (typically 3+ years) as a Power BI Developer/Consultant.
• Strong proficiency in Power BI, including Power Query (M), DAX, and data modeling.
• Experience with SQL and relational databases (e.g., SQL Server, Azure SQL, PostgreSQL).
• Familiarity with ETL processes and tools.
• Good understanding of data warehousing and reporting concepts.
• Excellent analytical and problem-solving skills.
• Strong communication and stakeholder management skills.
• Experience with Power BI Service (publishing, workspaces, sharing, security).
• Knowledge of the Power Platform (Power Apps, Power Automate) is a plus.
• Experience with cloud data platforms like Azure Data Lake, Synapse, or AWS is a plus.

Posted 1 month ago

5.0 - 8.0 years

7 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Must-have skills: Data Governance, Lakehouse architecture, Medallion Architecture, Azure Databricks, Azure Synapse, Data Lake Storage, Azure Data Factory

Intelebee LLC is looking for a Data Engineer: We are seeking a skilled and hands-on Cloud Data Engineer with 5-8 years of experience to drive end-to-end data engineering solutions. The ideal candidate will have a deep understanding of dimensional modeling, data warehousing (DW), Lakehouse architecture, and the Medallion architecture. This role will focus on leveraging the Azure/AWS ecosystem to build scalable, efficient, and secure data solutions. You will work closely with customers to understand requirements, create technical specifications, and deliver solutions that scale across both on-premise and cloud environments.

Key Responsibilities: End-to-End Data Engineering
• Lead the design and development of data pipelines for large-scale data processing, utilizing Azure tools such as Azure Data Factory, Azure Synapse, Azure Functions, Logic Apps, Azure Databricks, and Data Lake Storage, or AWS tools such as AWS Lambda and AWS Glue.
• Develop and implement dimensional modeling techniques and data warehousing solutions for effective data analysis and reporting.
• Build and maintain Lakehouse and Medallion architecture solutions for streamlined, high-performance data processing.
• Implement and manage data lakes on Azure/AWS, ensuring that data storage and processing are both scalable and secure.
• Handle large-scale databases (both on-prem and cloud), ensuring high availability, security, and performance.
• Design and enforce data governance policies for data security, privacy, and compliance within the Azure ecosystem.

Posted 1 month ago

6.0 - 9.0 years

14 - 24 Lacs

Hyderabad, Bengaluru

Hybrid

Hiring for .NET Full Stack with cloud experience, 6-9 years.
Level: Assistant Manager (individual contributor role).
Skills and locations:
• .NET Core with Angular and Azure services (not Azure DevOps) for Bangalore / Hyderabad
• .NET Core with Angular and AWS for Hyderabad

Posted 1 month ago