4.0 - 8.0 years
0 Lacs
nagpur, maharashtra
On-site
You have a great opportunity to join Harrier Information Systems as a Sr. PowerBI Consultant in Nagpur. With 4 to 7 years of experience, you will be responsible for designing and developing Business Intelligence dashboards and reports using PowerBI. Your role will involve working with Power BI/MS Fabric, DAX, data models, Power Query, and Power BI Service. As a Sr. PowerBI Consultant, you should excel in data transformations, data modeling, and data visualization layers. Performance tuning of existing PowerBI dashboards and analyzing and proposing data model design improvements are key aspects of this role. Strong SQL skills for querying and data manipulation are required, along with experience in ETL processes and tools. Knowledge of data warehousing concepts and data modeling, and experience with various data sources such as Teradata and MSSQL, are desirable. Your compensation will be commensurate with your technology skills, communication skills, and problem-solving attitude. If you are interested in this opportunity, please send your updated resume with current and expected CTC and notice period to mohan.bangde@harriersys.com or career@harriersys.com. We look forward to hearing from you soon. Thank you. Mohan Bangde, Harrier Information Systems Pvt. Ltd. Mobile: +91-997-541-2556. Website: https://www.harriersys.com. Locations: India | UK | USA.
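As an illustration of the SQL-side performance tuning this role calls for, pre-aggregating data before it reaches a dashboard, here is a minimal, runnable sketch; the table and column names are invented, and an in-memory SQLite database stands in for whatever warehouse the dashboards actually query:

```python
import sqlite3

# Minimal sketch: push aggregation into SQL so a Power BI (or any BI)
# dashboard reads a small summary set instead of raw rows.
# All table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, order_date TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('West', '2024-01-05', 120.0),
        ('West', '2024-01-06', 80.0),
        ('East', '2024-01-05', 200.0);
""")

# Pre-aggregated result set suited to a dashboard visual.
rows = conn.execute("""
    SELECT region,
           strftime('%Y-%m', order_date) AS month,
           SUM(amount) AS total_sales,
           COUNT(*)    AS order_count
    FROM sales
    GROUP BY region, month
    ORDER BY region, month
""").fetchall()

for region, month, total, n in rows:
    print(f"{region} {month}: {total:.2f} across {n} orders")
```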
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
navi mumbai, maharashtra
On-site
Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together.

Individual Accountabilities

Collaboration: Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs. Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented. Contributes to prototype or proof-of-concept efforts. Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions. Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution. Suggests architecture designs in collaboration with the Ontologies and MDM teams.

Technical Skills & Design: Significant experience working with structured and unstructured data at scale, and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses. Deep understanding of modern data services in leading cloud environments, with the ability to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity. Creates data architecture artifacts such as architecture diagrams, data models, and design documents. Guides domain architects on the value of a modern data and analytics platform. Researches, designs, tests, and evaluates new technologies, platforms, and third-party products. Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a great asset. Expert troubleshooting skills and experience.

Leadership: Mentors aspiring data architects, typically operating in data engineering and software engineering roles.

Key Shared Accountabilities: Leads medium to large data services projects. Provides technical partnership to product owners. Shares stewardship, with domain architects, of the Arcadis data ecosystem. Actively participates in the Arcadis Tech Architect community.

Key Profile Requirements: Minimum of 7 years of experience designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines. Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies, to help guide and steer decisions. Experience working in large-scale development and cloud environments.

Why Arcadis: We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark, on your career, your colleagues, your clients, your life and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy.
Our Commitment to Equality, Diversity, Inclusion & Belonging: We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people.
Posted 2 months ago
2.0 - 5.0 years
0 - 3 Lacs
Jaipur
Work from Office
Job Role: Data Engineer. Job Location: Jaipur. Job Type: Permanent. Experience Required: 2-5 years.

As a Data Engineer, you will play a critical role in designing, developing, and maintaining our data pipelines and infrastructure. You will work closely with our data scientists, analysts, and other stakeholders to ensure data is accurate, timely, and accessible. Your contributions will directly impact our data-driven decision-making and support our growth.

Key Responsibilities: Data Pipeline Development: Design, develop, and implement data pipelines using Azure Data Factory and Databricks to support the ingestion, transformation, and movement of data. ETL Processes: Develop and optimize ETL (Extract, Transform, Load) processes to ensure efficient data flow and transformation. Data Lake Management: Develop and maintain Azure Data Lake solutions, ensuring efficient storage and retrieval of large datasets. Data Warehousing: Work with Azure Synapse Analytics to build and manage scalable data warehousing solutions that enable advanced analytics and reporting. Data Integration: Integrate various data sources into MS Fabric, ensuring data consistency, quality, and accessibility across different platforms. Performance Optimization: Optimize data processing workflows and storage solutions to improve performance and reduce costs. Database Management: Manage and optimize databases (SQL and NoSQL) to support high-performance queries and data storage requirements. Data Quality: Implement data quality checks and monitoring to ensure accuracy and consistency of data. Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver actionable insights. Documentation: Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices. Troubleshooting and Support: Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications: Experience: 2-4 years of experience in data engineering or a related field. Technical Skills: Proficiency with Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake. Experience with Microsoft Fabric is a plus. Strong SQL skills and experience with data warehousing (DWH) concepts. Knowledge of data modeling, ETL processes, and data integration. Experience with relational databases (e.g., MS-SQL, PostgreSQL, MySQL). Hands-on experience with ETL tools and frameworks (e.g., Apache Airflow, Talend). Knowledge of big data technologies (e.g., Hadoop, Spark) is a plus. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and associated data services (e.g., S3, Redshift, BigQuery). Familiarity with data visualization tools (e.g., Power BI) and experience with programming languages such as Python, Java, or Scala. Experience with schema design and dimensional data modeling. Analytical Skills: Strong problem-solving abilities and attention to detail. Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders. Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field. Advanced degrees or certifications are a plus.

Thanks & Regards, Sulabh Tailang, HR-Talent Acquisition Manager | Celebal Technologies | +91-9448844746 | Sulabh.tailang@celebaltech.com | LinkedIn: sulabhtailang | Twitter: Ersulabh | Website: www.celebaltech.com
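The pipeline responsibilities above follow a common extract-transform-validate-load shape. The sketch below illustrates that shape in PySpark, assuming a Spark runtime such as Databricks or Fabric; the paths and column names are invented for the example:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical paths; in practice these would point at ADLS / Lakehouse storage.
RAW_PATH = "/mnt/raw/orders"
CURATED_PATH = "/mnt/curated/orders"

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw files landed by an upstream ADF copy activity.
raw = spark.read.json(RAW_PATH)

# Transform: normalize types, derive columns, drop obvious bad rows.
curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
)

# Simple data quality gate: fail the job if too many rows lack an amount.
total = curated.count()
missing = curated.filter(F.col("amount").isNull()).count()
if total and missing / total > 0.01:
    raise ValueError(f"DQ check failed: {missing}/{total} rows missing amount")

# Load: write partitioned Parquet (or Delta on Databricks/Fabric).
curated.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```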
Posted 2 months ago
8.0 - 13.0 years
8 - 17 Lacs
Chennai
Remote
MS Fabric (Data Lake, OneLake, Lakehouse, Warehouse, Real-Time Analytics) and integration with Power BI, Synapse, and Azure Data Factory. DevOps knowledge. Team-leading experience.
Posted 2 months ago
5.0 - 7.0 years
20 - 25 Lacs
Mohali, Pune, Bengaluru
Hybrid
Required Skills and Experience: Collaborate with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data requirements and deliver effective solutions. Leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives. Expertise in data modeling, with a strong focus on data warehouse and lakehouse design. Design and implement data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services. Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar tools to prepare data for analysis and reporting. Implement data quality checks and governance practices to ensure accuracy, consistency, and security of data assets. Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads. Strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms. Experience with data integration and ETL tools like Azure Data Factory. Proven expertise in Microsoft Fabric or similar data platforms. In-depth knowledge of the Azure Cloud Platform, particularly in data warehousing and storage solutions.
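The data quality checks mentioned above typically boil down to a few assertions run before publishing data. A minimal, self-contained sketch follows; the columns and rules are invented for illustration:

```python
import pandas as pd

# Toy stand-in for a curated dataset; columns are hypothetical.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "order_total": [99.5, -5.0, 42.0, 10.0],
})

failures = []

# Completeness: the key column must not contain nulls.
if df["customer_id"].isna().any():
    failures.append("customer_id contains nulls")

# Uniqueness: duplicate keys suggest a bad join upstream.
if df["customer_id"].dropna().duplicated().any():
    failures.append("customer_id contains duplicates")

# Validity: order totals should be non-negative.
if (df["order_total"] < 0).any():
    failures.append("order_total has negative values")

# Fail loudly so the orchestrator (e.g., a Synapse/ADF pipeline) can alert.
if failures:
    raise ValueError("Data quality checks failed: " + "; ".join(failures))
```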
Posted 2 months ago
5.0 - 7.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Role Overview: We are seeking an experienced .NET Backend Developer with strong Azure Data Engineering skills to join our growing team in Hyderabad. You will work closely with cross-functional teams to build scalable backend systems, modern APIs, and data pipelines using cutting-edge tools like Azure Databricks and MS Fabric.

Technical Skills (Must-Have): Strong hands-on experience in C#, SQL Server, and OOP concepts. Proficiency with .NET Core, ASP.NET Core, Web API, Entity Framework (v6 or above). Strong understanding of Microservices Architecture. Experience with Azure Cloud technologies including Data Engineering, Azure Databricks, MS Fabric, Azure SQL, Blob Storage, etc. Experience with Snowflake or similar cloud data platforms. Experience working with NoSQL databases. Skilled in database performance tuning and design patterns. Working knowledge of Agile methodologies. Ability to write reusable libraries and modular, maintainable code. Excellent verbal and written communication skills (especially with US counterparts). Strong troubleshooting and debugging skills.

Nice-to-Have Skills: Experience with Angular, MongoDB, NPM. Familiarity with Azure DevOps CI/CD pipelines for build and release configuration. Self-starter attitude with strong analytical and problem-solving abilities. Willingness to work extra hours when needed to meet tight deadlines.

Why Join Us: Work with a passionate, high-performing team. Opportunity to grow your technical and leadership skills in a dynamic environment. Be part of global digital transformation initiatives with top-tier clients. Exposure to real-world enterprise data systems. Opportunity to work on cutting-edge Azure and cloud technologies. Performance-based growth and internal mobility opportunities.
Posted 3 months ago
7.0 - 12.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Individual Accountabilities

Collaboration: Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs. Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented. Contributes to prototype or proof-of-concept efforts. Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions. Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution. Suggests architecture designs in collaboration with the Ontologies and MDM teams.

Technical Skills & Design: Significant experience working with structured and unstructured data at scale, and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses. Deep understanding of modern data services in leading cloud environments, with the ability to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity. Creates data architecture artifacts such as architecture diagrams, data models, and design documents. Guides domain architects on the value of a modern data and analytics platform. Researches, designs, tests, and evaluates new technologies, platforms, and third-party products. Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a great asset. Expert troubleshooting skills and experience.

Leadership: Mentors aspiring data architects, typically operating in data engineering and software engineering roles.

Key Shared Accountabilities: Leads medium to large data services projects. Provides technical partnership to product owners. Shares stewardship, with domain architects, of the Arcadis data ecosystem. Actively participates in the Arcadis Tech Architect community.

Key Profile Requirements: Minimum of 7 years of experience designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines. Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies, to help guide and steer decisions. Experience working in large-scale development and cloud environments.
Posted 3 months ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
We Are Hiring: Senior .NET Backend Developer with Azure Data Engineering Experience. Job Location: Hyderabad, India. Work Mode: Onsite only. Experience: Minimum 6+ years. Qualification: B.Tech, B.E, MCA, M.Tech.

Role Overview: We are seeking an experienced .NET Backend Developer with strong Azure Data Engineering skills to join our growing team in Hyderabad. You will work closely with cross-functional teams to build scalable backend systems, modern APIs, and data pipelines using cutting-edge tools like Azure Databricks and MS Fabric.

Technical Skills (Must-Have): Strong hands-on experience in C#, SQL Server, and OOP concepts. Proficiency with .NET Core, ASP.NET Core, Web API, Entity Framework (v6 or above). Strong understanding of Microservices Architecture. Experience with Azure Cloud technologies including Data Engineering, Azure Databricks, MS Fabric, Azure SQL, Blob Storage, etc. Experience with Snowflake or similar cloud data platforms. Experience working with NoSQL databases. Skilled in database performance tuning and design patterns. Working knowledge of Agile methodologies. Ability to write reusable libraries and modular, maintainable code. Excellent verbal and written communication skills (especially with US counterparts). Strong troubleshooting and debugging skills.

Nice-to-Have Skills: Experience with Angular, MongoDB, NPM. Familiarity with Azure DevOps CI/CD pipelines for build and release configuration. Self-starter attitude with strong analytical and problem-solving abilities. Willingness to work extra hours when needed to meet tight deadlines.

Why Join Us: Work with a passionate, high-performing team. Opportunity to grow your technical and leadership skills in a dynamic environment. Be part of global digital transformation initiatives with top-tier clients. Exposure to real-world enterprise data systems. Opportunity to work on cutting-edge Azure and cloud technologies. Performance-based growth and internal mobility opportunities.

Tags: DotNetDeveloper, BackendDeveloper, AzureDataEngineering, Databricks, MSFabric, Snowflake, Microservices, CSharpJobs, HyderabadJobs, FullTimeJob, HiringNow, EntityFramework, ASPNetCore, CloudEngineering, SQLJobs, DevOps, DotNetCore, BackendJobs, SuzvaCareers, DataPlatformDeveloper, SoftwareJobsIndia
Posted 3 months ago
5.0 - 10.0 years
5 - 10 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Design, develop, and maintain cloud infrastructure using Azure and MS Fabric: Architect and implement cloud solutions leveraging Microsoft Azure services and MS Fabric. Ensure the infrastructure supports scalability, reliability, performance, and cost-efficiency.

Integrate containerization and orchestration technologies: Utilize Kubernetes and Docker for containerization and orchestration. Manage and optimize Azure Kubernetes Service (AKS) deployments.

Implement DevOps practices and automation: Develop CI/CD pipelines to automate code deployment and infrastructure provisioning. Use automation tools and Terraform to streamline operations and reduce manual intervention.

Collaborate with development teams to build and deploy cloud-native applications: Provide guidance and support for designing and implementing cloud-native applications. Ensure applications are optimized for cloud environments.

Monitor, troubleshoot, and optimize cloud infrastructure: Implement monitoring and alerting systems to ensure infrastructure health. Optimize resource usage and performance to reduce costs and improve efficiency. Develop cost optimization strategies for efficient use of Azure resources. Troubleshoot and resolve issues quickly to minimize impact on users. Ensure high availability and uptime of applications.

Enhance system security and compliance: Implement security best practices and ensure compliance with industry standards. Perform regular security assessments and audits.

Education: Bachelor's/Master's degree in computer science and information systems or related engineering.

Behavioral Competencies: Outstanding technical leader with proven hands-on experience configuring and deploying DevOps practices toward successful delivery. Innovative and aligned with new product development technologies and methods. Excellent communication skills, able to guide, influence, and convince others in a matrix organization. Demonstrated teamwork and collaboration in a professional setting. Proven capabilities with worldwide teams. Team player; prior experience working with European customers is preferable but not mandatory.

Requirements: 5 to 10 years in IT and/or digital companies or startups. Knowledge of Ansible. Extensive knowledge of cloud technologies, particularly Microsoft Azure and MS Fabric. Proven experience with containerization and orchestration tools such as Kubernetes and Docker. Experience with Azure Kubernetes Service (AKS), Terraform, and DevOps practices. Strong automation skills, including scripting and using automation tools. Proven track record in designing and implementing cloud infrastructure. Experience in optimizing cloud resource usage and performance. Proven experience in Azure cost optimization strategies. Proven experience ensuring uptime of applications and rapid troubleshooting in case of failures. Strong understanding of security best practices and compliance standards. Proven experience providing technical guidance to teams. Proven experience in managing customer expectations. Proven track record of driving decisions collaboratively, resolving conflicts, and ensuring follow-through. Extensive knowledge of software development and system operations. Proven experience in designing stable solutions, testing, and debugging. Demonstrated technical guidance with worldwide teams.
Proficient in English; proficiency in French is a plus.

Performance Measurements: On-Time Delivery (OTD); Infrastructure Reliability and Availability; Cost Optimization and Efficiency; Application Uptime and Failure Resolution.
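As a hedged illustration of the monitoring-and-alerting duty in this posting, the sketch below polls AKS pod health with the official kubernetes Python client; the namespace, restart threshold, and printing-to-stdout are placeholders, and it assumes kubeconfig credentials for the cluster (e.g., from `az aks get-credentials`):

```python
from kubernetes import client, config

# Assumes a local kubeconfig with access to the AKS cluster.
config.load_kube_config()
v1 = client.CoreV1Api()

unhealthy = []
for pod in v1.list_namespaced_pod("production").items:  # namespace is hypothetical
    # A pod that is not Running/Succeeded is worth flagging.
    if pod.status.phase not in ("Running", "Succeeded"):
        unhealthy.append((pod.metadata.name, pod.status.phase))
    # Restart counts hint at crash loops even when the phase looks healthy.
    for cs in (pod.status.container_statuses or []):
        if cs.restart_count > 5:
            unhealthy.append((pod.metadata.name, f"{cs.restart_count} restarts"))

if unhealthy:
    # A real setup would feed an alerting channel, not stdout.
    for name, reason in unhealthy:
        print(f"ALERT: {name}: {reason}")
```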
Posted 3 months ago
4.0 - 6.0 years
4 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Generate CPG business insights and reports using Excel, PowerPoint, and PowerBI. Develop and maintain Power BI reports, dashboards, and visualizations that provide meaningful insights to stakeholders. Create comprehensive content and presentations to support business decisions and strategies. Extract and analyze data from NIQ, Circana, Spins, and other relevant sources. Work with cross-functional teams to develop and implement data-driven solutions, including data visualizations, reports, and dashboards. Manage analytics projects and work streams, and build dashboards and reports. Provide expert-level support to stakeholders on analytics and data visualization. Present findings and recommendations to stakeholders in a clear and concise manner.

Required Education: Bachelor's Degree. Preferred Education: Master's Degree.

Required Technical and Professional Expertise: B.Tech, Bachelor's or Master's degree in Computer Science, Science, or relevant education. 4-6 years of experience in data analysis or a related field. Proficiency in MS Fabric, PowerBI, SQL, Excel, and experience with NIQ, Circana, and Spins.

Preferred Technical and Professional Experience: Strong analytical skills and attention to detail. Excellent communication and presentation abilities. Ability to manage multiple tasks and meet deadlines. Experience in the CPG industry is preferred.
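As an illustration of the insight-extraction work described above, here is a minimal pandas sketch computing quarter-over-quarter brand growth; the data is a fabricated toy frame, not an NIQ/Circana/Spins extract:

```python
import pandas as pd

# Toy stand-in for syndicated retail data; all values are fabricated.
sales = pd.DataFrame({
    "brand":   ["A", "A", "B", "B", "A", "B"],
    "period":  ["2024-Q1", "2024-Q2", "2024-Q1", "2024-Q2", "2024-Q3", "2024-Q3"],
    "dollars": [100.0, 110.0, 90.0, 99.0, 121.0, 95.0],
})

# Quarter-over-quarter growth per brand: a typical dashboard-ready cut.
pivot = sales.pivot_table(index="period", columns="brand", values="dollars")
growth = pivot.pct_change() * 100

print(pivot.round(1))
print(growth.round(1))  # e.g., brand A grows ~10% QoQ in this toy data
```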
Posted 3 months ago
4.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Design, develop, and implement data solutions using Microsoft Fabric, including data pipelines, data warehouses, and data marts. Develop data pipelines, data transformations, and data workflows using Microsoft Fabric.
Posted 3 months ago
3.0 - 8.0 years
9 - 19 Lacs
Bengaluru, Delhi / NCR
Work from Office
Key Responsibilities: Lead the implementation and optimization of Microsoft Purview across the client's data estate in MS Fabric / the Azure cloud platform (ADF, Databricks, etc.). Define and enforce data governance policies, data classification, sensitivity labeling, and data lineage to ensure readiness for GenAI use cases. Collaborate with data engineers, architects, and AI/ML teams to ensure data discoverability, compliance, and ethical AI readiness. Design and implement data cataloging strategies to support GenAI model training and inference. Provide guidance on data access controls, privacy, and regulatory compliance (e.g., GDPR, HIPAA). Conduct workshops and training sessions for client stakeholders on Purview capabilities and best practices. Monitor and report on data governance KPIs and GenAI readiness metrics.

Required Skills & Qualifications: Proven experience as a Microsoft Purview SME in enterprise environments. Strong knowledge of Microsoft Fabric, OneLake, and Synapse Data Engineering. Experience with data governance frameworks and metadata management. Hands-on experience with data classification, sensitivity labels, and data lineage tracking. Understanding of compliance standards and data privacy regulations. Excellent communication and stakeholder management skills.

Preferred Qualifications: Microsoft certifications in Azure Data, Purview, or Security & Compliance. Experience working with Azure OpenAI, Copilot integrations, or other GenAI platforms. Background in data science, AI ethics, or ML operations is a plus.
Posted 3 months ago
2.0 - 5.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Job Title: Power BI Developer. Experience: 2-3 years. Location: Bangalore, Indiranagar (work from office only). Employment Type: Full-time.

Job Description: We are looking for a Power BI Developer with 2-3 years of hands-on experience in designing and developing BI reports and dashboards using Power BI. Candidates with experience in Microsoft Fabric will be given preference. Strong communication skills are essential, as the role involves close collaboration with cross-functional teams.

Key Responsibilities: Develop, design, and maintain interactive dashboards and reports in Power BI. Work closely with stakeholders to gather requirements and translate them into effective data visualizations. Optimize data models for performance and usability. Implement row-level security and data governance best practices. Stay updated with Power BI and MS Fabric capabilities and best practices.

Requirements: 2-3 years of hands-on Power BI development experience. Familiarity with Power Query, DAX, and data modeling techniques. Experience in Microsoft Fabric is a plus. Strong analytical and problem-solving skills. Excellent verbal and written communication skills.

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com: 1) Present CTC (Fixed + VP) 2) Expected CTC 3) No. of years of experience 4) Notice period 5) Offer in hand 6) Reason for change 7) Present location
Posted 3 months ago
3.0 - 4.0 years
5 - 6 Lacs
Hyderabad
Work from Office
Overview: This role serves as an Associate Analyst on the GTM Data Analytics COE project development team. The role is a go-to resource for building and maintaining the key reports, data pipelines, and advanced analytics necessary to bring insights to light for senior leaders and Sector and field end users.

Responsibilities: The COE's core competencies are mastery of data visualization, data engineering, data transformation, and predictive and prescriptive analytics. Enhance data discovery, processes, testing, and data acquisition from multiple platforms. Apply detailed knowledge of PepsiCo's applications for root-cause problem-solving. Ensure compliance with PepsiCo IT governance rules and design best practices. Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes. Translate operational requirements into actionable data presentations. Support data recovery and integrity issue resolution between business and PepsiCo IT. Provide performance reporting for the GTM function, including ad-hoc requests using internal shipment data systems. Develop on-demand reports and scorecards for improved agility and visualization. Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities. Present insights and recommendations to the GTM Leadership team regularly. Manage expectations through effective communication with headquarters partners. Ensure timely and accurate data delivery per service level agreements (SLAs). Collaborate across functions to gather insights for action-oriented analysis. Identify and act on opportunities to improve work delivery. Implement process improvements, reporting standardization, and optimal technology use. Foster an inclusive and collaborative environment. Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications: Undergraduate degree in Business or a related technology field. 3-4 years of working experience in Power BI. 1-2 years of working experience in SQL and Python.

Preferred Qualifications: Information technology or analytics experience is a plus. Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, and MS Fabric. Requires analytical, critical-thinking, and problem-solving skills, as well as great attention to detail. Strong time management skills, with the ability to multitask, set priorities, and plan.
Posted 3 months ago
4.0 - 9.0 years
0 - 25 Lacs
Hyderabad, Pune, Greater Noida
Work from Office
Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using Azure Data Factory (ADF) to integrate various data sources into a centralized platform. Collaborate with cross-functional teams to gather requirements for data integrations and ensure seamless delivery of high-quality solutions. Develop complex SQL queries to extract insights from large datasets stored in relational databases such as PostgreSQL or MySQL. Troubleshoot issues related to data pipeline failures, identify root causes, and implement fixes to prevent future occurrences.

Job Requirements: 4-9 years of experience in designing and developing data integration solutions using ADF or similar tools like Informatica PowerCenter or Talend Open Studio. Strong understanding of Microsoft Azure services, including storage options (e.g., Blob Storage), compute resources (e.g., Virtual Machines), and networking concepts (e.g., VPN). Proficiency in writing complex SQL queries for querying large datasets stored in relational databases such as PostgreSQL or MySQL.
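For the ADF-centric responsibilities above, here is a hedged sketch of triggering and polling a pipeline run with the azure-mgmt-datafactory SDK; every identifier (subscription, resource group, factory, pipeline, parameter) is a placeholder:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All identifiers below are placeholders for illustration.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-ingestion"
PIPELINE_NAME = "pl_copy_orders"

# DefaultAzureCredential picks up CLI / managed-identity / env credentials.
client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a pipeline run, passing runtime parameters if the pipeline defines them.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-31"},
)

# Poll the run status; a production caller would add backoff and alerting.
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(status.status)  # e.g., Queued / InProgress / Succeeded / Failed
```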
Posted 3 months ago
8 - 12 years
19 - 30 Lacs
Pune, Bengaluru
Work from Office
About Position: We at Persistent are looking for a Data Engineering Lead with experience in MS Fabric, SQL, and Python, along with knowledge of data extraction and ETL processes. Role: Data Engineering Lead. Location: Pune, Bangalore. Experience: 8+ years. Job Type: Full-time employment.

What You'll Do: Work with the business to understand business requirements and translate them into low-level designs. Design and implement robust, fault-tolerant, scalable, and secure data pipelines using PySpark notebooks in MS Fabric. Review peers' code and mentor junior team members. Participate in sprint planning and other agile ceremonies. Drive automation and efficiency in data ingestion, data movement, and data access workflows. Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency.

Expertise You'll Bring: Around 8 to 12 years of experience, with at least 1 year in MS Fabric and the Azure cloud. Leadership: ability to lead and mentor junior data engineers and help with planning and estimation. Data migration: experience migrating and re-modeling large enterprise data from a legacy warehouse to a Lakehouse (Delta Lake) on MS Fabric or Databricks (see the CDC upsert sketch below). Strong data engineering skills: proficiency in data extraction, transformation, and loading (ETL) processes, data modeling, and database management; experience setting up pipelines using notebooks and ADF, and setting up monitoring and alert notifications. Experience with data lake technologies: MS Fabric, Azure, Databricks, Python, an orchestration tool such as Apache Airflow or Azure Data Factory, Azure Synapse along with stored procedures, and Azure Data Lake Storage. Data integration knowledge: familiarity with data integration techniques, including batch processing, streaming, real-time data ingestion, auto-loader, change data capture, and creation of fact and dimension tables. Programming skills: proficiency in SQL, Python, and PySpark for data manipulation and transformation. DP-700 certification is preferred.

Benefits: Competitive salary and benefits package. Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
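The change-data-capture and Lakehouse-migration experience called out above usually centers on Delta Lake's MERGE. A minimal sketch, assuming a Databricks or MS Fabric Spark runtime with Delta Lake available; the paths and key column are invented:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A batch of changed rows landed by an upstream feed (path is hypothetical).
updates = spark.read.parquet("/mnt/landing/customers_cdc")

# The Lakehouse target table (path is hypothetical).
target = DeltaTable.forPath(spark, "/mnt/lakehouse/dim_customer")

# Classic CDC upsert: update matched keys, insert new ones.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```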
Our company fosters a values-driven and people-centric work environment that enables our employees to: Accelerate growth, both professionally and personally Impact the world in powerful, positive ways, using the latest technologies Enjoy collaborative innovation, with diversity and work-life wellbeing at the core Unlock global opportunities to work and learn with the industry's best Let's unleash your full potential at Persistent "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
Posted 4 months ago
8.0 - 13.0 years
15 - 30 Lacs
pune, chennai, bengaluru
Hybrid
Role & Responsibilities. Mandatory skill set: MS Fabric, ADF, ADB.

Responsibilities: Data Integration: Design, develop, and maintain data integration solutions using Microsoft Fabric, ensuring seamless data flow across various systems. Data Modeling: Create and manage data models within Microsoft Fabric to support analytics and reporting needs. ETL Processes: Develop and optimize ETL (Extract, Transform, Load) processes to ensure data quality and consistency. Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions. Performance Tuning: Monitor and optimize the performance of data pipelines and workflows within Microsoft Fabric. Documentation: Document data integration processes, architectures, and workflows for future reference and compliance. Best Practices: Implement best practices for data engineering and ensure adherence to data governance policies. Troubleshooting: Identify and resolve issues related to data integration and performance.

Qualifications: Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Proven experience in data engineering, specifically with Microsoft Fabric and related technologies. Programming Skills: Proficiency in programming languages such as SQL, Python, or C#. Cloud Platforms: Experience with Microsoft Azure and its data services. Data Warehousing: Familiarity with data warehousing concepts and tools, particularly within the Microsoft ecosystem. Version Control: Experience with version control systems like Git. Communication Skills: Strong verbal and written communication skills to collaborate with cross-functional teams.

Preferred Skills: Experience with Power BI for data visualization and reporting. Knowledge of machine learning frameworks and libraries. Familiarity with CI/CD practices for data pipelines.
Posted Date not available
7.0 - 12.0 years
25 - 35 Lacs
mumbai, bengaluru, mumbai (all areas)
Work from Office
Data Modeling: Developing conceptual, logical, and physical data models that meet the business requirements (a physical star-schema sketch follows this listing). Database Design: Optimizing data storage, retrieval, and performance in relational databases, NoSQL databases, data warehouses, and data lakes (or lakehouses). Data Integration: Integrating data from various sources such as internal databases, external APIs, and third-party services. Data Migration: Must have experience with SQL Server to Azure migration and SSIS to ADF migration. Data Governance: Implementing data governance policies, standards, and procedures to ensure data catalog, data quality, data security (encryption and data masking if required), compliance, and privacy. Data Architecture Strategy: Developing data architecture for Data Warehouse / Data Lake / Data Lakehouse solutions using emerging technologies of the cloud big data stack (e.g., Azure ADLS Gen2, Spark ETL, Python, Data Factory and Synapse Analytics, and data governance and security tools). Collaboration: Must be able to work as an individual contributor, working closely with cross-functional teams including data engineers, data analysts, ETL and BI developers, and business stakeholders to understand data requirements. Performance Tuning: Monitoring and optimizing the performance of database systems, identifying bottlenecks, and implementing solutions to improve efficiency and scalability. Data Security: Implementing robust security measures to protect sensitive data from unauthorized access, breaches, and cyber threats; this involves implementing encryption, authentication, and access control mechanisms. API integration and EAI experience. Has worked on Synapse Analytics and understands MS Fabric. BizTalk, MuleSoft, etc. are not required; the candidate only needs to understand the concepts.
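To make the "physical data model" item concrete, here is a minimal star-schema sketch. The DDL is kept generic (SQLite is used only so the snippet runs self-contained) rather than SQL Server-specific, and all table and column names are invented:

```python
import sqlite3

# A minimal physical star schema: two dimensions and one fact table.
ddl = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- business key
    segment      TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,      -- e.g., 20240131
    full_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    quantity     INTEGER NOT NULL,
    amount       REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print([r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])
```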
Posted Date not available
5.0 - 7.0 years
20 - 30 Lacs
bengaluru
Work from Office
We are hiring for the role of Azure Data Engineer.

Key Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks (PySpark, Scala, SQL). Implement ETL/ELT workflows for large-scale data integration across cloud and on-premise environments. Leverage Microsoft Fabric (Data Factory, OneLake, Lakehouse, DirectLake, etc.) to build unified data solutions. Collaborate with data architects, analysts, and stakeholders to deliver business-critical data models and pipelines. Monitor and troubleshoot performance issues in data pipelines. Ensure data governance, quality, and security across all data assets. Work with Delta Lake, Unity Catalog, and other modern data lakehouse components. Automate and orchestrate workflows using Azure Data Factory, Databricks Workflows, or Microsoft Fabric pipelines. Participate in code reviews, CI/CD practices, and agile ceremonies.

Required Skills: 5-7 years of experience in data engineering, with strong exposure to Databricks. Proficient in PySpark, SQL, and performance tuning of Spark jobs (see the sketch below). Hands-on experience with Microsoft Fabric components. Experience with Azure Synapse, Data Factory, and Azure Data Lake. Understanding of Lakehouse architecture and modern data mesh principles. Familiarity with Power BI integration and semantic modeling (preferred). Knowledge of DevOps and CI/CD for data pipelines (e.g., using GitHub Actions, Azure DevOps). Excellent problem-solving, communication, and collaboration skills.

Muugddha Vanjarii, 7822804824, mugdha.vanjari@sunbrilotechnologies.com
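On "performance tuning of Spark jobs", two of the most common levers are broadcast joins and shuffle-partition sizing. A hedged sketch with invented paths and columns, assuming a Spark runtime such as Databricks:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical inputs: a large fact table and a small dimension table.
orders = spark.read.parquet("/mnt/curated/orders")       # large
regions = spark.read.parquet("/mnt/curated/dim_region")  # small

# Lever 1: broadcast the small side so Spark avoids shuffling the large table.
joined = orders.join(F.broadcast(regions), "region_id")

# Lever 2: right-size shuffle parallelism for the cluster and data volume
# (the default of 200 partitions is often wrong for small or huge jobs).
spark.conf.set("spark.sql.shuffle.partitions", "64")

agg = joined.groupBy("region_name").agg(F.sum("amount").alias("total"))
agg.explain()  # verify the plan shows a BroadcastHashJoin
```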
Posted Date not available