10.0 - 14.0 years
0 Lacs
karnataka
On-site
You are an experienced Data Architect responsible for leading the transformation of enterprise data solutions, particularly the migration of Alteryx workflows to Azure Databricks. Your expertise across the Microsoft Azure ecosystem (Azure Data Factory, Databricks, Synapse Analytics, Microsoft Fabric), together with a strong background in data architecture, governance, and distributed computing, will be crucial for this role. Your strategic thinking and hands-on architectural leadership will ensure the development of scalable, secure, and high-performance data solutions.

Your key responsibilities will include defining the migration strategy for transforming Alteryx workflows into scalable, cloud-native data solutions on Azure Databricks. You will architect end-to-end data frameworks leveraging Databricks, Delta Lake, Azure Data Lake, and Synapse, while establishing best practices, standards, and governance frameworks for pipeline design, orchestration, and data lifecycle management. Collaborating with business stakeholders, guiding engineering teams, overseeing data quality, lineage, and security compliance, and driving CI/CD adoption for Azure Databricks will also be part of your role. Furthermore, you will provide architectural leadership, design reviews, and mentorship to engineering and analytics teams, optimize solutions for performance, scalability, and cost-efficiency within Azure, participate in enterprise architecture forums, and influence data strategy across the organization.

To be successful in this role, you should have at least 10 years of experience in data architecture, engineering, or solution design. Proven expertise in Alteryx workflows and their modernization onto Azure Databricks, deep knowledge of the Microsoft Azure data ecosystem, a strong background in data governance, lineage, security, and compliance frameworks, and proficiency in Python, SQL, and Apache Spark are essential, as are excellent leadership, communication, and stakeholder management skills. Preferred qualifications include Microsoft Azure certifications, experience leading large-scale migration or modernization programs, familiarity with enterprise architecture frameworks, exposure to machine learning enablement on Azure Databricks, and an understanding of Agile delivery in multi-disciplinary teams.
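For illustration only, a minimal sketch of what one migrated Alteryx step (filter, join, output) might look like as a Databricks PySpark job. The storage account, container, table layout, and column names are invented for the example, not taken from the posting:

    # Hypothetical sketch: one Alteryx "Filter + Join + Output" workflow step
    # rewritten as a Databricks PySpark job. All paths and columns are assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("alteryx-migration-sketch").getOrCreate()

    orders = spark.read.format("delta").load("abfss://raw@storageacct.dfs.core.windows.net/orders")
    customers = spark.read.format("delta").load("abfss://raw@storageacct.dfs.core.windows.net/customers")

    # Alteryx Filter tool -> DataFrame filter; Join tool -> join; Output -> Delta write
    active_orders = (
        orders.filter(F.col("status") == "ACTIVE")
              .join(customers, on="customer_id", how="left")
              .withColumn("load_ts", F.current_timestamp())
    )

    (active_orders.write.format("delta")
        .mode("overwrite")
        .save("abfss://curated@storageacct.dfs.core.windows.net/active_orders"))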
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Withum is a place where talent thrives - where who you are matters. It's a place of endless opportunities for growth. A place where entrepreneurial energy plus inclusive teamwork equals exponential results. Withum empowers clients and our professional staff with innovative tools and solutions to address their accounting, tax and overall business management and operational needs. As a US nationally ranked Top 25 firm, we recruit only the best and brightest people with a genuine passion for the business.

We are seeking an experienced Lead Consultant, Data Engineering, with a strong background in consulting services and hands-on skills in building modern, scalable data platforms and pipelines. This is a client-facing, delivery-focused role. Please note that this position is centered around external client delivery and is not part of an internal IT or product engineering team. This is a foundational hire: you will be responsible for delivering hands-on client work, supporting our proprietary data products, and building the team underneath you.

Withum's brand is a reflection of our people, our culture, and our strength. Withum has become synonymous with teamwork and client service excellence. The cornerstone of our success can truly be credited to the dedicated professionals who work here every day - people who are easy to work with, have a sense of purpose, care for their co-workers, and whose mission is to help our clients grow and thrive. But our commitment goes beyond our clients, as we continue to live the Withum Way, promoting personal and professional growth for all team members, clients, and surrounding communities.

How You'll Spend Your Time:
- Architect, implement, and optimize data transformation pipelines, data lakes, and cloud-native warehouses for mid- and upper mid-market clients.
- Deliver hands-on engineering work across client environments - building fast, scalable, and well-documented pipelines that support both analytics and AI use cases.
- Lead technical design and execution using tools such as Tableau, Microsoft Fabric, Synapse, Power BI, Snowflake, and Databricks, along with solid hands-on familiarity with SQL databases.
- Optimize for sub-50GB datasets and local or lightweight cloud execution where appropriate - minimizing unnecessary reliance on cluster-based compute.
- Collaborate with subject-matter experts to understand business use cases before designing data models.
- Operate as a client-facing consultant: conduct discovery, define solutions, and lead agile project delivery.
- Switch context rapidly across 2-3 active clients or service streams in a single day.
- Provide support for our proprietary data products as needed.
- Provide advisory and strategic input to clients on data modernization, AI enablement, and FP&A transformation efforts.
- Deliver workshops, demos, and consultative training to business and technical stakeholders.
- Implement coding modifications to pre-existing code and procedures in a manner that results in a validated case study.
- Take full ownership of hiring, onboarding, and mentoring future data engineers and analysts within the India practice.
- During bench time, contribute to building internal data products and tooling - powering our own consulting operations (e.g., utilization dashboards, delivery intelligence, practice forecasting).
- Help define and scale delivery methodology, best practices, and reusable internal accelerators for future engagements.
- Communicate openly about conflicting deadlines to ensure prioritization aligns with client expectations, with ample time to reset client expectations as needed.
- Ensure code is properly commented to explain the logic or purpose behind more complex sections.

Requirements:
- 6+ years of hands-on experience in data engineering roles, with at least 3+ years in a consulting or client delivery environment.
- Proven ability to context-switch, self-prioritize, and communicate clearly under pressure.
- Demonstrated experience owning full lifecycle delivery, from architecture through implementation and client handoff.
- Strong experience designing and implementing ETL/ELT pipelines, preferably in SQL-first tools.
- Experience with Microsoft SQL Server / SSIS for maintenance and development of ETL processes.
- Real-world experience with SQL databases, Databricks, Snowflake, and/or Synapse - and a healthy skepticism of when to use them.
- Deep understanding of data warehousing, data lakes, data modeling, and incremental processing.
- Proficiency in Python for ETL scripting, automation, and integration work.
- Experience with tools such as dbt Core in production environments.
- Strong practices around data testing, version control, documentation, and team-based dev workflows.
- Working knowledge of Power BI, Tableau, Looker, or similar BI tools.
- Experience building platforms for AI/ML workflows or supporting agentic architectures.
- Familiarity with Microsoft Fabric's Lakehouse implementation, Delta Lake, Iceberg, and Parquet.
- Background in DataOps, CI/CD for data pipelines, and metadata management.
- Microsoft certifications (e.g., Azure Data Engineer, Fabric Analytics Engineer) are a plus.

Website: www.withum.com
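As a rough illustration of the "data testing" practice this posting asks for, a couple of pytest-style checks against a warehouse table might look like the sketch below. The connection string, schema, and freshness threshold are assumptions, not Withum specifics:

    # Illustrative post-load data tests, runnable with pytest. The DSN, table,
    # and 24-hour freshness window are placeholders for the example.
    import pyodbc  # assumes an ODBC Driver 18 for SQL Server installation

    CONN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;DATABASE=dw;Trusted_Connection=yes"

    def scalar(sql: str):
        with pyodbc.connect(CONN) as cn:
            return cn.execute(sql).fetchone()[0]

    def test_no_duplicate_keys():
        dupes = scalar("""
            SELECT COUNT(*) FROM (
                SELECT order_id FROM dw.fact_orders
                GROUP BY order_id HAVING COUNT(*) > 1
            ) d
        """)
        assert dupes == 0

    def test_incremental_load_is_fresh():
        # fails if the nightly pipeline has not landed data in the last 24 hours
        hours = scalar("SELECT DATEDIFF(HOUR, MAX(load_ts), SYSUTCDATETIME()) FROM dw.fact_orders")
        assert hours is not None and hours < 24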
Posted 6 days ago
8.0 - 11.0 years
10 - 15 Lacs
ahmedabad
Work from Office
Key Responsibilities:
- Design, build and maintain scalable data pipelines and transformation processes using Microsoft Fabric components including Data Factory, OneLake, Dataflows and Notebooks.
- Develop and manage data models and analytics solutions.
- Develop ETL processes and integrate data from diverse sources (cloud and on-premises) into centralised and governed environments using Fabric's capabilities.
- Ensure data quality, integrity, consistency, security, governance and compliance with industry standards (Uniclass & SFG20) across all storage layers.
- Monitor performance and optimise the data models and cost efficiency of Fabric components.
- Collaborate with BI developers, data analysts and developers to support initiatives, understanding business requirements and translating them into technical data solutions.
- Troubleshoot and resolve data-related issues.
- Implement best practices for data engineering, including coding standards, testing and deployment.
- Prepare and maintain documentation of data architecture, workflows and configurations.

Requirements:
- Proven experience as a data engineer or in a similar role.
- Strong proficiency in programming languages such as Python, T-SQL, Power Query and KQL.
- Strong understanding of Data Factory, data pipelines, Notebooks and Lakehouses within Fabric.
- Familiarity with Microsoft security and governance frameworks, including Purview and RBAC.
- Excellent analytical and problem-solving skills.
- Exposure to CI/CD practices and tools within Azure DevOps.

Desirable:
- An understanding of the lifecycle of machine learning models.
- Experience with Azure Machine Learning or other cloud-based ML platforms for deploying and managing models.
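For context, a bronze-to-silver cleansing step of the kind described above could look roughly like this in a Fabric notebook, where a `spark` session is predefined. The table names, columns, and date format are assumptions for illustration:

    # Minimal sketch of a Fabric notebook cell doing a bronze -> silver cleanse.
    # `spark` is the session Fabric notebooks provide; names are invented.
    from pyspark.sql import functions as F

    bronze = spark.read.table("bronze_maintenance_tasks")  # Lakehouse table

    silver = (
        bronze.dropDuplicates(["task_id"])
              .filter(F.col("task_id").isNotNull())
              .withColumn("completed_date", F.to_date("completed_date", "yyyy-MM-dd"))
              .withColumn("ingested_at", F.current_timestamp())
    )

    silver.write.mode("overwrite").format("delta").saveAsTable("silver_maintenance_tasks")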
Posted 6 days ago
6.0 - 11.0 years
6 - 14 Lacs
pune
Work from Office
My LinkedIn: linkedin.com/in/yashsharma1608
Contract period: 6-12 months
Payroll company: ASV Consulting (my company); client: Team Computers
Job location: Pune - onsite (WFO)
Budget: up to 12-13 LPA, depending on last (relevant) hike
Experience: 6+ years

JD:
Cloud Data Engineer / Azure Specialist role. The candidate should have good knowledge of the Azure cloud platform and hands-on experience with Microsoft Fabric.
Posted 6 days ago
5.0 - 10.0 years
5 - 10 Lacs
noida, uttar pradesh, india
On-site
As a Senior Finance Data Engineer, you will play a pivotal role in scaling our finance data team in Noida. You will leverage technologies like Microsoft Fabric, Power BI, Power Apps, Power Automate, DAX, Power Query, and advanced data modeling to deliver high-performance data pipelines and solutions for finance and revenue stream analysis. Your work will directly support business-critical financial decision-making, working closely with senior stakeholders and leadership in Finance and Revenue teams globally. Effective communication is essential, as you will be interacting with teams overseas, managing complex requirements, and providing strategic data insights.

Your duties and responsibilities

Data Engineering & Architecture: Set up, configure, and maintain data engineering infrastructure leveraging Microsoft Fabric for data integration and management. Design, build, maintain and optimize scalable data pipelines and data models to support BI and analytics initiatives for finance and revenue streams. Collaborate with other engineers to ensure that data architectures support high-performance reporting and data-driven decision-making.

Business Intelligence & Reporting: Develop and maintain Power BI reports, dashboards, and visualizations tailored to financial analysis, performance tracking, and revenue management. Use Power Query, DAX, and Power BI data models to provide actionable insights to stakeholders in Finance and Revenue departments.

Process Automation & Optimization: Leverage Power Automate and Power Apps to automate business processes, streamline workflows, and improve operational efficiency. Integrate process automation solutions with core data systems to enhance data accuracy and reduce manual effort.

Collaboration with Stakeholders: Work directly with internal leadership and finance teams located overseas, gathering requirements, designing solutions, and translating business needs into technical specifications. Ensure that data solutions align with business goals, regulatory standards, and industry best practices.

Data Governance & Quality Assurance: Ensure high data quality and integrity throughout the engineering lifecycle. Apply data governance principles to manage data access, privacy, and security, especially in financial contexts. Conduct quality checks to guarantee the accuracy and reliability of financial data.

Team Development & Leadership: As part of the finance data team, you will help mentor and train junior engineers and foster a collaborative and innovative environment. Contribute to building a strong team culture and sharing knowledge within the team to drive technical excellence.

Continuous Improvement: Stay up-to-date with the latest trends and advancements in data engineering, cloud technologies, and finance. Identify opportunities for performance improvements and make recommendations for optimizing data infrastructure and business intelligence tools.

Your skills, experience, and qualifications

Experience: A minimum of 5 years of hands-on experience in data engineering, with a focus on Microsoft Fabric, Power BI, Power Apps, Power Automate, DAX, and Power Query. Proven experience working with data solutions in the finance industry, ideally supporting revenue stream analysis, financial reporting, or risk management.

Technical Skills: Advanced experience in building and optimizing data models for large-scale reporting systems, using tools like Power BI and Microsoft Fabric. Expertise in Power Query and DAX for data transformation and analysis.
Strong proficiency in Power Apps and Power Automate for automating business processes and integrating systems. Hands-on experience with cloud-based data engineering technologies and platforms (e.g., Microsoft Azure, data lakes, and data warehouses). Familiarity with ETL processes, data pipeline orchestration, and large data sets.

Communication Skills: Excellent communication skills, with the ability to interact effectively with senior leadership, stakeholders, and cross-functional teams across different geographic locations. Ability to translate complex technical concepts into clear and actionable insights for non-technical audiences. Strong collaboration and interpersonal skills, particularly in working with overseas teams in different time zones.

Soft Skills: Strong problem-solving and critical thinking abilities. Self-motivated, detail-oriented, and able to work independently in a fast-paced environment. Ability to prioritize tasks and manage multiple competing demands effectively.

Preferred Qualifications: Experience in implementing data engineering solutions within finance, capital markets, or fintech industries. Knowledge of data security standards and regulatory compliance within financial services. Familiarity with other tools in the Microsoft Power Platform (Copilot, Power BI Service, etc.). Previous experience in team building, mentoring, or leading junior engineers.
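To make the pre-aggregation side of this role concrete, a hedged sketch of rolling transaction-level revenue up into a monthly fact table that a Power BI model (and its DAX measures) could consume might look like this in pandas; the file names and columns are invented:

    # Hypothetical monthly revenue roll-up for a Power BI semantic model.
    import pandas as pd

    tx = pd.read_parquet("revenue_transactions.parquet")  # placeholder extract

    tx["month"] = pd.to_datetime(tx["booking_date"]).dt.to_period("M").dt.to_timestamp()
    monthly = (
        tx.groupby(["month", "revenue_stream", "region"], as_index=False)
          .agg(gross_revenue=("amount", "sum"), deals=("deal_id", "nunique"))
    )

    monthly.to_parquet("fact_monthly_revenue.parquet", index=False)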
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana, india
On-site
Position: Senior Sales Manager (Techno/Functional)
Focus Areas: Presales, conversions, SAP, Microsoft Fabric, Microsoft product stack, Data & AI solutions
Reporting To: Chief Data & AI Officer
Location Preference: India - Hyderabad, with exposure to the USA & UAE markets

Preferred Attributes
- Strategic thinker with a consultative selling mindset
- Passion for AI-driven transformation and data-centric innovation
- Good exposure to the USA & UAE markets
- Comfortable navigating multi-cloud environments and hybrid architectures
- Strong interpersonal skills and a bias for action
- Strong team player with a commitment-to-cause attitude

Required Skills & Experience

1. Sales & Pre-Sales Experience
- Minimum 8 to 12 years of experience in technical sales, solution consulting, or pre-sales roles
- Demonstrated customer conversion success across enterprise accounts
- Skilled in inside sales enablement, lead qualification, and opportunity nurturing
- Experience conducting pre-sales discovery calls, demos, and solution walkthroughs with CXO-level stakeholders

2. Cross-Functional Collaboration
- Ability to work closely with account managers to shape customer strategy and drive pipeline velocity
- Partner with digital marketing teams to align campaigns with technical messaging and product positioning
- Collaborate with data research teams to tailor outreach based on industry-specific pain points and use cases

3. Communication & Enablement
- Exceptional presentation and storytelling skills for technical and business audiences
- Experience creating sales toolkits, pitch decks, and value proposition briefs
- Ability to translate complex technical concepts into customer-centric narratives

4. Technical Expertise
- Deep understanding of SAP architecture, integration strategies, and solution positioning
- Proven experience with Microsoft Fabric, including OneLake and Power BI integration
- Strong command of the Microsoft product stack: Azure, Power Platform, Dynamics 365, Purview, and Copilot
- Familiarity with data governance, AI/ML use cases, and cloud-native analytics across open Fabric, Databricks, and Snowflake

5. Global Exposure
- Proven engagement with clients or partners in the USA and UAE markets
- Understanding of regional compliance, procurement, and digital transformation trends
- A strong record of sales achievement and a customer success management mindset, year after year
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Job Title: Data Engineering Senior Associate - Microsoft Fabric, Azure (Databricks & ADF), PySpark
Experience: 4-10 Years
Location: PAN India

Job Summary: We are looking for a skilled and experienced Data Engineer with 4-10 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Microsoft Fabric and Azure Databricks, along with strong PySpark, Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts, and with end-to-end data pipelines, is essential.

Key Responsibilities:
- Requirement gathering and analysis
- Design and implement data pipelines using Microsoft Fabric & Databricks
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for any problem related to data engineering
- Work with a variety of sources such as relational databases, APIs, file systems, real-time streams, CDC, etc.
- Apply strong knowledge of Databricks and Delta tables

Required Skills:
- 4-10 years of experience in Data Engineering or related roles
- Hands-on experience in Microsoft Fabric
- Hands-on experience in Azure Databricks
- Proficiency in PySpark for data processing and scripting
- Strong command of Python & SQL - writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas)
- Hands-on experience in performance tuning & optimization on Databricks & MS Fabric
- Ability to ensure alignment with overall system architecture and data flow
- Understanding of CI/CD practices in a data engineering context
- Excellent problem-solving and communication skills
- Exposure to BI tools like Power BI, Tableau, or Looker

Good to Have:
- Experience in Azure DevOps
- Familiarity with data security and compliance in the cloud
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.
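As a rough sketch of the star-schema loading pattern this posting references (not the employer's actual code), a fact load that resolves surrogate keys against a dimension could look like the following; the paths, keys, and columns are assumptions:

    # Hypothetical fact load: resolve product surrogate keys, derive a date key,
    # and append to a Delta fact table. Paths and schema are invented.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    stage = spark.read.format("delta").load("/lake/stage/sales")          # assumed path
    dim_product = spark.read.format("delta").load("/lake/dw/dim_product")

    fact_sales = (
        stage.join(dim_product.select("product_code", "product_sk"), "product_code")
             .withColumn("date_sk", F.date_format("sale_date", "yyyyMMdd").cast("int"))
             .select("date_sk", "product_sk", "quantity", "net_amount")
    )

    fact_sales.write.format("delta").mode("append").save("/lake/dw/fact_sales")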
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
delhi
On-site
The Analytics and AI Presales Consultant role, based in Mumbai/Delhi, is a full-time position that requires a highly skilled and motivated individual to join our dynamic team. The ideal candidate has a strong background in analytics and artificial intelligence, along with a successful track record in presales engagements. This role demands excellent communication skills, technical expertise, and the ability to collaborate effectively with clients and internal teams.

Your key responsibilities will include collaborating with the sales team and customers to understand client requirements, conducting product demonstrations, presentations, and workshops to showcase our analytics and AI offerings, providing technical support during the sales process, developing compelling proposals, staying updated with the latest trends in analytics and AI technologies, building strong relationships with clients and stakeholders, assisting in the development of marketing materials, and participating in industry events to promote our solutions.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Data Science, Engineering, or a related field, with a Master's degree considered a plus. You should have a minimum of 5 years of experience in analytics, AI, or a related field, focusing on presales activities. Your expertise should include a strong understanding of analytics and AI technologies such as machine learning, data mining, and predictive analytics; proficiency in programming languages like Python, R, or SQL; experience with platforms and tools like TensorFlow, PyTorch, and Power BI; excellent communication and presentation skills; problem-solving abilities; and the capacity to work both independently and collaboratively in a fast-paced environment.

Preferred qualifications include experience in a consulting or client-facing position, knowledge of cloud platforms like AWS, Azure, or Google Cloud, and certifications in analytics or AI technologies such as Microsoft Certified: Azure AI Engineer Associate or Google Cloud Professional Data Engineer.

Joining us offers you the opportunity to work with cutting-edge analytics and AI technologies in a collaborative and inclusive work environment, along with a competitive salary and benefits package and professional development and growth opportunities.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for an experienced Senior Data Engineer with expertise in Microsoft Fabric to contribute to our enterprise data modernization and analytics transformation efforts. You should possess a strong understanding of data pipelines, lakehouse architecture, Power BI, and Synapse integration, and the ability to modernize legacy data systems into cloud-native solutions. Your role will be crucial in developing scalable, secure, and high-performing data solutions within the Microsoft ecosystem.

Your responsibilities will include designing and implementing data pipelines using Microsoft Fabric's Data Factory, Synapse Data Engineering, and OneLake components. You will construct and manage lakehouse architectures utilizing Delta Lake, Parquet, and OneLake within Microsoft Fabric, and lead projects that modernize legacy ETL/ELT processes into cloud-native data pipelines. Collaboration with Data Architects, BI Developers, and Analysts will be essential to deliver scalable data models for analytics and reporting, as will optimizing Power BI datasets and reports through effective data modeling and DAX practices. Furthermore, you will implement data governance and security controls, incorporating tools like Microsoft Purview, role-based access, and lineage tracking. Working alongside cross-functional teams, you will contribute to cloud migration projects, particularly transitions from on-premises SQL/Oracle/Hadoop platforms to Microsoft Azure and Fabric, and you will evaluate and implement CI/CD practices for data pipelines using Azure DevOps or GitHub Actions.

The ideal candidate should hold a Bachelor's/Master's degree in Computer Science, Information Systems, or a related field, along with a minimum of 8 years of experience in data engineering. Proficiency in Microsoft Fabric components such as Data Factory, Lakehouse/OneLake, Synapse Data Engineering, and Power BI is crucial. Experience with data modeling, performance tuning in Power BI, modern data architecture patterns, and languages such as SQL, PySpark, T-SQL, DAX, and Power Query is required, along with familiarity with Azure ecosystem tools and strong experience with CI/CD pipelines. Knowledge of data security, GDPR, HIPAA, and enterprise data governance is preferred.

Preferred qualifications include Microsoft certifications such as Microsoft Certified: Fabric Analytics Engineer Associate or Azure Data Engineer Associate (DP-203), experience with DataOps and Agile delivery methods, and knowledge of Machine Learning/AI integration with Fabric. Hands-on experience with notebooks in Microsoft Fabric using Python or Scala would be a plus. In addition to technical skills, the ideal candidate should possess strong analytical and problem-solving abilities, excellent communication and stakeholder management skills, and the ability to lead projects, mentor junior engineers, and collaborate effectively with cross-functional teams.
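For illustration, routine lakehouse table maintenance of the kind a role like this implies might look like the following notebook snippet. It assumes an ambient `spark` session and placeholder table names; OPTIMIZE and VACUUM are standard Delta Lake SQL commands issued through spark.sql:

    # Illustrative housekeeping pass over lakehouse tables; names are invented.
    tables = ["lakehouse.silver_claims", "lakehouse.gold_claims_summary"]

    for t in tables:
        spark.sql(f"OPTIMIZE {t}")                    # compact small files
        spark.sql(f"VACUUM {t} RETAIN 168 HOURS")     # drop stale files (7 days)
        row_count = spark.sql(f"SELECT COUNT(*) AS c FROM {t}").collect()[0]["c"]
        print(t, "rows:", row_count)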
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer at our Pune location, you will leverage your 5+ years of experience in data engineering to contribute to our team full-time, with shift timings from 3 PM IST to 12 AM IST. Your key responsibilities will include hands-on ETL work using PySpark, SQL, Microsoft Fabric, and other relevant technologies. You will collaborate with clients and stakeholders to understand data requirements, design efficient data models and solutions, optimize existing data pipelines for performance and scalability, and ensure data quality and integrity throughout the data pipeline. Additionally, you will document technical designs, processes, and procedures, stay updated on emerging technologies and best practices in data engineering, and build CI/CD pipelines using GitHub.

To excel in this role, you should hold a Bachelor's degree in computer science, engineering, or a related field, with at least 3 years of experience in data engineering or a similar role. A strong understanding of ETL concepts and best practices; proficiency in Azure Synapse, Microsoft Fabric, and other data processing technologies; experience with cloud-based data platforms such as Azure or AWS; and knowledge of data warehousing concepts and methodologies are essential. Proficiency in Python, PySpark, and SQL for data manipulation and scripting is also required. Nice-to-have qualifications include experience with data lake concepts, familiarity with data visualization tools such as Power BI or Tableau, and certifications in relevant technologies like Microsoft Certified: Azure Data Engineer Associate.

Joining Stratacent, an IT managed services firm focused on Financial Services, Insurance, Healthcare, and Life Sciences, means becoming part of our digital transformation journey. With headquarters in Jersey City, NJ, global delivery centers in the New York City and New Delhi areas, and offices in London, Canada, and Pune, India, we offer a dynamic work environment. Our partnerships with SAS, Automation Anywhere, Snowflake, Azure, AWS, and GCP ensure that you will have exposure to cutting-edge technologies. In addition, we provide benefits such as group medical insurance, cab facilities, meals/snacks, and a continuous learning program.
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Enhance enterprise-wide analytics with Microsoft Fabric

Are you interested in transforming data into actionable insights? As a Senior Data Engineer, you will play a pivotal role in designing and managing scalable data solutions using Microsoft Fabric's integrated services. This is a unique opportunity to be part of a global IT team supporting digital transformation and data-driven decision-making across the organization. You'll work with advanced technologies like OneLake, Synapse, and Power BI, enabling real-time analytics and business intelligence. Join us in advancing data capabilities in a collaborative, international environment. Do you want to be a key contributor in our digital growth journey?

Design and deliver scalable data solutions

As a Senior Data Engineer, you will be responsible for building robust data pipelines, managing enterprise data storage, and enabling advanced analytics through Microsoft Fabric. You'll collaborate closely with cross-functional teams to translate business needs into technical solutions that support strategic decision-making. Your responsibility will be to:

- Design and maintain data pipelines using Data Factory and Synapse
- Optimize OneLake storage for performance and consistency
- Build enterprise-grade data warehouses within Microsoft Fabric
- Develop interactive dashboards and reports using Power BI
- Ensure data governance, security, and compliance across platforms

You will report to the Chief Enterprise Architect and work closely with global IT teams and business stakeholders. The role is based in Chennai and may involve occasional international travel for training and collaboration.

Experienced Data Engineer with a global mindset

We are looking for a structured and results-oriented person who thrives in a collaborative environment. You take responsibility for your work, communicate effectively, and enjoy solving complex data challenges. You are someone who values teamwork, inclusivity, and continuous learning. You also have:

- 5+ years of experience in data engineering or BI development
- Experience with Microsoft Fabric (Data Factory, Synapse, Power BI)
- Proficiency in SQL, Python, and Spark/PySpark
- Solid understanding of data modeling, warehousing, and governance
- Familiarity with CI/CD, version control, and DevOps practices
- Preferred: experience with real-time data streaming and API integrations

Play a key role in the development of enterprise analytics

NKT is committed to developing a diverse organization and culture where people of diverse backgrounds can grow and are inspired to do their best. We are establishing gender diversity at NKT and encourage all interested candidates to apply even if you don't tick all the boxes described. We believe that a diverse organization enables long-term performance, and that an inclusive and welcoming culture creates a better work environment. At NKT, you'll be part of a collaborative team with opportunities to grow your skills in an international setting. We offer a work environment where innovation and knowledge sharing are encouraged. You'll have access to training, mentorship, and the chance to work on impactful projects that support our global digital strategy. "As a leader, I believe in empowering individuals to innovate and grow through collaboration and continuous learning," says Hiring Manager Sapna Anand. Read more about our offer and listen to some voices of NKT Connectors here! We will review applications continuously, but we recommend you apply no later than 30th of September.
Be aware that personality and cognitive tests might be included in the recruitment process. Please note that due to the GDPR regulations we cannot accept any applications via e-mail. Be a Connector of the green tomorrow! About NKT: NKT connects a greener world with high-quality power cable technology and takes centre stage as the world moves towards green energy. NKT designs, manufactures and installs low-, medium- and high-voltage power cable solutions enabling sustainable energy transmission. Since 1891, NKT has innovated the power cable technology building the infrastructure for the first light bulbs to the megawatts created by renewable energy today. NKT is headquartered in Denmark and employs 6,000 people. NKT is listed on Nasdaq Copenhagen and realised a revenue of EUR 3.3 billion in 2024. We connect a greener world. www.nkt.com
Posted 1 week ago
8.0 - 13.0 years
20 - 25 Lacs
pune
Remote
Design databases & data warehouses and Power BI solutions; support enterprise business intelligence; strong team player & contributor; continuous improvement. Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, Microsoft Fabric.

Required candidate profile: Source system data structures, data extraction, data transformation, warehousing, DB administration, query development, and Power BI. Work from home.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As part of our GDS Consulting team, you will be part of the NCLC team delivering specifically to the Microsoft account. You will work on the latest Microsoft BI technologies and collaborate with other teams within Consulting services.

The opportunity

We're looking for resources with expertise in Microsoft BI, Power BI, Azure Data Factory, and Databricks to join our Data Insights team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of our service offering.

Your key responsibilities
- Responsible for managing multiple client engagements.
- Understand and analyze business requirements by working with various stakeholders and create the appropriate information architecture, taxonomy, and solution approach.
- Work independently to gather requirements, and cleanse, extract and load data.
- Translate business and analyst requirements into technical code.
- Create interactive and insightful dashboards and reports using Power BI, connecting to various data sources and implementing DAX calculations.
- Design and build complete ETL/Azure Data Factory processes moving and transforming data for ODS, staging, and data warehousing.
- Design and develop solutions in Databricks, Scala, Spark, and SQL to process and analyze large datasets, perform data transformations, and build data models.
- Design SQL schemas, database schemas, stored procedures, functions, and T-SQL queries.

Skills and attributes for success
- Collaborating with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
- Able to manage senior stakeholders.
- Experience in leading teams to execute high-quality deliverables within stipulated timelines.
- Skills in Power BI, Azure Data Factory, Databricks, Azure Synapse, data modeling, DAX, Power Query, and Microsoft Fabric.
- Strong proficiency in Power BI, including data modeling, DAX, and creating interactive visualizations.
- Solid experience with Azure Databricks, including working with Spark, PySpark (or Scala), and optimizing big data processing.
- Good understanding of various Azure services relevant to data engineering, such as Azure Blob Storage, ADLS Gen2, and Azure SQL Database/Synapse Analytics.
- Strong SQL skills and experience with one of the following: Oracle, SQL, Azure SQL.
- Good to have experience in SSAS or Azure SSAS, and Agile project management.
- Basic knowledge of Azure Machine Learning services.
- Excellent written and verbal communication skills and the ability to deliver technical demonstrations.
- Quick learner with a can-do attitude.
- Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 4-7 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.

Ideally, you'll also have
- The analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.
What working at EY offers

At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a Power BI Architect - Data Engineer, you will play a crucial role in designing, implementing, and managing comprehensive business intelligence solutions. Your focus will be on data modeling, report development, and ensuring data security and compliance. Working within high-performing and collaborative teams, you will present data migration solutions and influence key stakeholders in client groups. Your expertise will help clients advance toward strategic data architecture goals by enhancing the coherence, quality, security, and availability of the organization's data assets through the development of data migration roadmaps.

Your responsibilities will include designing and leading real-time data architectures for large volumes of information, implementing integration flows with Data Lakes and Microsoft Fabric, optimizing and governing tabular models in Power BI, and ensuring high availability, security, and scalability. You will also coordinate data quality standards, with a focus on DataOps for continuous deployments and automation.

To be successful in this role, you should have demonstrable experience in master data management and at least 7 years of experience in designing and implementing BI solutions and data architectures. You must possess advanced modeling skills, proficiency in DAX, and expertise in optimization and governance. Strong knowledge of Data Lake, Microsoft Fabric, and real-time ingestion methods is essential, as is hands-on experience with Python or R for data manipulation, transformation, and automation. You should also have proven experience in tabular modeling, DAX queries, and report optimization in Power BI.

Your ability to plan, define, estimate, and manage the delivery of work packages will be crucial, as will excellent communication skills and the flexibility to respond to various program demands. You should have a deep understanding of key technical developments in your area of expertise and be able to lead the definition of information and data models, data governance structures, and processes. Experience working in complex environments across multiple business and technology domains is preferred, along with the ability to bridge the gap between functional and non-functional teams.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
vadodara, gujarat
On-site
You will be responsible for creating blueprints for data flow, storage, and integration to ensure scalability, security, and efficiency. Your focus will be on the overall structure and strategy for data systems: designing the architecture for data warehouses, lakes, and integration platforms. You will define data governance, security, and compliance standards, as well as create strategies for data management and scalability. Your role will also involve providing input into developing data models and schemas, and working closely with business stakeholders, data engineers, and analysts to define requirements and align the data strategy with business goals. Your outputs will include design documents, architecture diagrams, and data governance policies.

To be considered for this role, you must have a total of 8+ years of experience in developing and managing Microsoft data solutions, with at least 2+ years of experience in designing and architecting data solutions using Microsoft Azure. You should have expertise in database design and data modeling (e.g., star schema, snowflake schema), and hands-on experience with Azure data services such as Power BI for visualization, Azure Synapse (or AAS) for analytics, Azure Data Factory for ETL/pipelines, and Azure Data Lake Storage Gen1/Gen2 and Azure Blob for storage and warehousing. Knowledge of Microsoft Fabric (or Azure Synapse) and its components, such as notebooks, shortcuts, the data lake, and data warehouse administration, is mandatory, as is proficiency in conceptual and logical design tools (e.g., Lucidchart).

Desired qualifications include a good understanding of programming languages such as Python or C#, knowledge of governance frameworks (Microsoft Purview) and cloud architecture (Azure cloud solutions), and the Microsoft Fabric certification DP-600 or DP-700. Any other relevant Azure data certification is considered an advantage, along with any Microsoft certification as an Azure Enterprise Data Architect or in Azure Data Fundamentals.
Posted 1 week ago
4.0 - 7.0 years
4 - 8 Lacs
chennai, tamil nadu, india
On-site
- Develop and maintain processes to collect, analyze, and report security-related data from various sources (e.g., security controls, vulnerability assessments, incident response).
- Design and implement automated reporting solutions using tools like Microsoft Fabric to monitor key business metrics.
- Conduct regular risk assessments to identify vulnerabilities and threats in cloud environments.
- Define and track key performance indicators (KPIs) to evaluate the effectiveness of monitoring programs.
- Monitor and analyze logs, event data, and alerts to detect anomalies and ensure compliance with security policies.
- Evaluate vulnerability scans and penetration tests to assess system security posture.
- Review security documentation, including system security plans and risk assessments.
- Support internal and external audits by compiling and presenting evidence of compliance.
- Create and maintain a comprehensive continuous monitoring plan based on NIST SP 800-137 and FedRAMP requirements.

Requirements:
- Programming and automation experience with one or more of Python, SQL, PowerShell, Bash.
- Proficiency in data visualization tools: Microsoft Fabric, Power BI.
- Familiarity with cloud platforms: Azure, AWS, or Google Cloud.
- Experience with containerization and infrastructure-as-code tools: Docker, Kubernetes, Terraform, GitHub.
- Understanding of ETL processes and data warehousing.
- Experience with vulnerability management tools: Qualys, ServiceNow.
- Familiarity with SIEM tools: Microsoft Sentinel, Splunk.
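A hedged sketch of the automated-reporting idea above: summarizing a vulnerability scan export into KPI rows that a Fabric or Power BI dashboard could consume. The CSV layout is invented for the example; real Qualys or ServiceNow exports differ:

    # Aggregate placeholder scan output into security KPIs.
    import pandas as pd

    scans = pd.read_csv("vuln_scan_export.csv")  # assumed columns: asset, severity, status

    open_vulns = scans[scans["status"] != "Fixed"]
    kpis = {
        "open_critical": int((open_vulns["severity"] == "Critical").sum()),
        "open_total": int(len(open_vulns)),
        "assets_affected": int(open_vulns["asset"].nunique()),
        "pct_fixed": round(100 * (scans["status"] == "Fixed").mean(), 1),
    }
    print(kpis)  # in practice this would land in a dataset behind a dashboard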
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a skilled and experienced Microsoft Fabric Engineer to join the data engineering team. Your main responsibilities will include designing, developing, and maintaining data solutions using Microsoft Fabric, working across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. In this role, you will need a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem.

Your key responsibilities will include designing and implementing scalable and secure data solutions, building and maintaining data pipelines using Dataflows Gen2 and Data Factory, working with Lakehouse architecture, and managing datasets in OneLake. You will also develop and optimize notebooks (PySpark or T-SQL) for data transformation and processing, collaborate with data analysts and business users to create interactive dashboards and reports using Power BI (within Fabric), leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics, monitor and optimize the performance of data pipelines and queries, and ensure that data quality, security, and governance practices are adhered to.

To excel in this role, you should have at least 3 years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack. You must be proficient in tools such as Data Factory (Fabric), Synapse Data Warehouse/SQL analytics endpoints, Power BI integration, and DAX, and have a solid understanding of data modeling, ETL/ELT processes, and real-time data streaming. Experience with KQL (Kusto Query Language) is a plus, and familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous.

Overall, as a Microsoft Fabric Engineer, you will play a crucial role in designing, developing, and maintaining data solutions using Microsoft Fabric, collaborating with various teams to ensure data quality and security, and staying current with Microsoft Fabric updates and best practices to recommend enhancements. Qualifications required for this role include proficiency in Microsoft Fabric, OneLake, Data Factory, Data Lake, and Data Mesh.
Posted 1 week ago
4.0 - 12.0 years
0 Lacs
maharashtra
On-site
As an Azure Data Engineer specializing in Microsoft Fabric (Data Lake), based in Mumbai, you should have a minimum of 4 years of experience in the field, with at least 2 years dedicated to working with Microsoft Fabric technologies. Your expertise in Azure services is key, specifically in Data Lake, Synapse Analytics, Data Factory, Azure Storage, and Azure SQL. Your responsibilities will involve data modeling, ETL/ELT processes, and data integration patterns, and experience with Power BI integration is essential for effective data visualization. Proficiency in SQL, Python, or PySpark for data transformations is required, as is a solid understanding of data governance, security, and compliance in cloud environments. Previous experience working in Agile/Scrum environments is a plus. Strong problem-solving skills and the ability to work both independently and collaboratively within a team are crucial for success in this position.
Posted 1 week ago
3.0 - 6.0 years
12 - 15 Lacs
gurugram
Work from Office
Job Purpose

This role is central to managing the airline's operational data, building scalable data pipelines, enabling organization-wide reporting, and driving automation using the Microsoft ecosystem.

Key Accountabilities (Functional Activities)
- Lead a team of experienced data engineers and analysts.
- Design and maintain end-to-end data pipelines using SQL, Microsoft Fabric, and Python to support operations and planning analytics.
- Develop and manage centralized databases and reusable data models for seamless integration across Power BI, Power Apps, and other analytics platforms.
- Build and optimize Power BI dashboards and data models, implementing Row-Level Security (RLS), performance tuning, and enterprise-wide report distribution.
- Collaborate with stakeholders to gather reporting requirements and deliver actionable insights via automated dashboards and self-service tools.
- Integrate Power Automate and Power Apps for workflow automation and operational process digitization.
- Maintain data accuracy, schedule refreshes, and monitor Power BI services and dataflows for uninterrupted delivery.
- Support adoption of the Microsoft Fabric ecosystem, including Dataflows, Lakehouses, Notebooks, and Pipelines, for modern data architecture.

Any other additional responsibility could be assigned to the role holder from time to time as a standalone project or regular work. The same would be suitably represented in the primary responsibilities and agreed between the incumbent, reporting officer and HR.
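As one concrete (and hedged) example of the refresh scheduling and monitoring mentioned above, the documented Power BI REST API can trigger a dataset refresh and read back its status from Python. The workspace/dataset IDs and token acquisition below are placeholders you would wire up to Azure AD in practice:

    # Trigger a Power BI dataset refresh and check the latest refresh status.
    import requests

    TOKEN = "<aad-access-token>"                      # e.g. from msal / service principal
    GROUP_ID, DATASET_ID = "<workspace-guid>", "<dataset-guid>"

    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
           f"/datasets/{DATASET_ID}/refreshes")
    headers = {"Authorization": f"Bearer {TOKEN}"}

    requests.post(url, headers=headers).raise_for_status()          # start a refresh
    latest = requests.get(url, headers=headers, params={"$top": 1}).json()
    print(latest["value"][0]["status"])                             # refresh history entry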
Posted 1 week ago
8.0 - 13.0 years
10 - 20 Lacs
chennai
Remote
Greetings from IT Resonance Inc. We're always looking to expand our team of talented professionals at IT Resonance Inc and are currently seeking qualified candidates who would be a good fit for the Microsoft Fabric & Power BI Developer role.

Position: Microsoft Fabric & Power BI Developer
Work Timing: 4:30 PM IST to 12:30 AM IST
Work Location: Remote
Job Type: Freelance/Contract
Experience: 8+ Years

Responsibilities:
1. Develop and manage data pipelines and Lakehouse/Warehouse models in Microsoft Fabric.
2. Build interactive Power BI dashboards and reports based on business needs.
3. Design and implement data models and semantic layers for reporting.
4. Apply a solid understanding of Business Intelligence concepts, with hands-on experience in Power BI for data visualization and reporting.
5. Apply an understanding of data warehousing concepts, architectures and models.
6. Work with cloud computing platforms (e.g., Azure/AWS/Google Cloud) and services related to data storage and processing.
7. Use Git and deployment pipelines for version control and release management.
8. Monitor and maintain Power BI Service reports, datasets, and workspaces; ensure data quality, performance tuning, and efficient refresh schedules.

Interested candidates can share profiles to swetha@itresonance.com / +91 8925526510
Posted 1 week ago
6.0 - 11.0 years
1 - 6 Lacs
bengaluru
Remote
ETL/ELT, Data Modelling, Synapse, ADF, Microsoft Fabric, Databricks, SQL
Posted 1 week ago
10.0 - 15.0 years
11 - 17 Lacs
bengaluru, karnataka, india
On-site
We are seeking a Senior QA Analyst with over a decade of experience in quality assurance, specializing in BI and data integration. The ideal candidate will be a senior professional responsible for defining and executing robust QA strategies across complex data pipelines and reporting systems. This role is crucial for ensuring data accuracy, integrity, and quality throughout the entire data lifecycle, from legacy systems to modern cloud platforms.

Key Responsibilities
- QA Strategy & Implementation: Define and implement comprehensive QA strategies for BI reporting, master data management (MDM) verification, and ERP migration data flows.
- Validation & Testing: Conduct a wide range of testing, including functional, regression, and performance testing, on integrated datasets from sources like Salesforce, SQL, and Oracle ERP. Validate data pipelines, transformations, and reporting layers across Microsoft Fabric, Azure Data Factory (ADF), and legacy systems like SSIS and SQL Server.
- Automation & Defect Management: Automate tests and create validation scripts for continuous quality checks within CI/CD pipelines. Lead defect analysis, debugging, and resolution tracking.
- BI & Reporting Accuracy: Ensure the accuracy of reports by validating Power BI dashboards and their components, including DAX logic, RLS, filters, and visuals.
- Collaboration & Governance: Work closely with business analysts and engineers to align QA coverage with business priorities. Monitor data governance adherence by validating data lineage and catalog accuracy.
- Risk & Documentation: Independently manage workloads, proactively communicate risks, and propose remediation strategies. Maintain comprehensive documentation of test coverage, issue tracking, and release quality.

Required Qualifications
- Experience: 10+ years of total QA experience, with a specialization in BI and data integration. Extensive hands-on experience validating data across Azure Data Services (ADF and Fabric).
- Technical Skills: Proven experience in testing and validating Power BI dashboards, including semantic modeling and DAX logic. Skilled in writing test scripts and automation for data pipelines using languages like Python and SQL, and tools like Azure DevOps. Familiarity with MDM structures, ERP migration workflows, and related reconciliation frameworks. Experience with legacy system modernization (SSIS, SQL Server) and cloud-native transitions.
- Communication: Strong analytical, documentation, and troubleshooting skills. Outstanding stakeholder engagement skills, with the ability to translate business risks into effective QA coverage.
- Certifications: Experience with Microsoft Purview and Profisee is a plus.
- Integrity: High integrity and accountability are required, along with the ability to pass rigorous background checks.
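One plausible form of the validation scripts described above is a source-to-target reconciliation check between a legacy SQL Server system and a Fabric warehouse SQL endpoint. The connection strings and the table under test are placeholders:

    # Compare a row count and a summed measure between source and target.
    import pyodbc

    LEGACY = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=legacy;DATABASE=erp;UID=qa;PWD=..."
    TARGET = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=fabric-endpoint;DATABASE=dw;UID=qa;PWD=..."

    CHECK = "SELECT COUNT(*), COALESCE(SUM(CAST(net_amount AS DECIMAL(18,2))), 0) FROM dbo.invoices"

    def profile(conn_str: str):
        with pyodbc.connect(conn_str) as cn:
            return cn.execute(CHECK).fetchone()

    src, tgt = profile(LEGACY), profile(TARGET)
    assert tuple(src) == tuple(tgt), f"row-count/amount mismatch: {tuple(src)} vs {tuple(tgt)}"
    print("reconciliation passed:", tuple(src))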
Posted 1 week ago
8.0 - 13.0 years
12 - 15 Lacs
hyderabad
Work from Office
Required Skills & Experience

1. Sales & Pre-Sales Experience
- Minimum 8 to 12 years of experience in technical sales, solution consulting, or pre-sales roles
- Demonstrated customer conversion success across enterprise accounts
- Skilled in inside sales enablement, lead qualification, and opportunity nurturing
- Experience conducting pre-sales discovery calls, demos, and solution walkthroughs with CXO-level stakeholders

2. Cross-Functional Collaboration
- Ability to work closely with account managers to shape customer strategy and drive pipeline velocity
- Partner with digital marketing teams to align campaigns with technical messaging and product positioning
- Collaborate with data research teams to tailor outreach based on industry-specific pain points and use cases

3. Communication & Enablement
- Exceptional presentation and storytelling skills for technical and business audiences
- Experience creating sales toolkits, pitch decks, and value proposition briefs
- Ability to translate complex technical concepts into customer-centric narratives

4. Technical Expertise
- Deep understanding of SAP architecture, integration strategies, and solution positioning
- Proven experience with Microsoft Fabric, including OneLake and Power BI integration
- Strong command of the Microsoft product stack: Azure, Power Platform, Dynamics 365, Purview, and Copilot
- Familiarity with data governance, AI/ML use cases, and cloud-native analytics across open Fabric, Databricks, and Snowflake

5. Global Exposure
- Proven engagement with clients or partners in the USA and UAE markets
- Understanding of regional compliance, procurement, and digital transformation trends
- A strong record of sales achievement and a customer success management mindset, year after year
Posted 1 week ago
7.0 - 15.0 years
25 - 35 Lacs
chennai, tamil nadu, india
On-site
Hi, greetings from Tasya Infra IT Solutions Pvt. Ltd. We are looking for Senior Data Engineers for Chennai.

Senior Data Engineer (MS Fabric, Power BI, ADF, API, SAP and HR Domain) - Chennai only, 5 days work from office

Responsibilities:
- Lead the design and delivery of advanced analytics solutions, from raw data to insight-driven stories.
- Collaborate with stakeholders to define KPIs, performance measures, and strategic questions.
- Own the development of predictive models and what-if scenarios that drive business foresight.
- Design and optimize robust data models, pipelines, and ETL processes using Power BI, SQL, and Azure tools.
- Guide and mentor junior analysts and developers, setting high standards for analytical reasoning and impact.
- Present insights confidently to senior leaders in clear, actionable, and visually compelling ways.
- Collaborate with cross-functional teams to connect analytics with broader digitalization strategies.

Requirements:
- 7+ years of hands-on experience in analytics, BI, or data engineering roles with proven business value delivery.
- Mastery of Power BI (DAX, data modeling, storytelling) and strong SQL skills.
- Deep understanding of data pipelines, ETL frameworks, and cloud platforms (Azure preferred).
- A strong predictive analytics mindset: comfort with trend analysis, forecasting, or even basic machine learning techniques.
- Ability to distill complexity into insights, with excellent storytelling and data visualization skills.
- Demonstrated self-leadership, strong work ethic, and ownership mentality.
- Excellent communication and interpersonal skills; able to translate data into business action.
- Experience with Microsoft Fabric, Azure Data Factory, Databricks, or similar tools.
- Previous success in HR analytics, financial analytics, or operational intelligence.
- Exposure to shared services, outsourcing, or people-intensive businesses.
- Certifications: PL-300, DP-900, AZ-900, or similar (good to have).
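To illustrate the "basic forecasting" the posting mentions, a minimal linear-trend sketch on synthetic monthly data (purely an example, not a prescribed approach):

    # Fit a linear trend to 24 months of synthetic cost history and
    # project the next quarter. All numbers are made up for illustration.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    months = np.arange(24).reshape(-1, 1)                  # 24 months of history
    cost = 100 + 2.5 * months.ravel() + np.random.default_rng(0).normal(0, 3, 24)

    model = LinearRegression().fit(months, cost)
    next_quarter = model.predict(np.arange(24, 27).reshape(-1, 1))
    print("forecast for months 25-27:", next_quarter.round(1))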
Posted 1 week ago
8.0 - 13.0 years
15 - 25 Lacs
pune, bengaluru, delhi / ncr
Hybrid
Role Snapshot
Title: Senior Microsoft Fabric Data Engineer
Experience: 8+ years in Data Engineering (minimum 4+ years on Azure, and/or 6 months to 1+ year with Microsoft Fabric)
Tech Focus: Microsoft Fabric, Azure Data Factory (ADF), Databricks (Python, PySpark, Spark SQL), Delta Lake, Power BI (DAX), Azure Storage, Lakehouse, Warehouse
Engagement: Client-facing, hands-on, design-to-delivery

Must-Have Skills (Strong, Hands-On)
- Microsoft Fabric (2024+): OneLake, Lakehouse, Warehouse, Pipelines, Dataflows Gen2, Notebooks, capacities, workspace & item security, RLS/OLS.
- Azure Data Factory (ADF): Reusable, parameterized pipelines; high-level orchestration; robust scheduling, logging, retries, and alerts.
- Databricks (5+ years on Azure):
  - Python, PySpark, Spark SQL: complex transformations, joins, window functions, UDFs/UDAFs.
  - Complex & nested notebooks; modular code with %run / dbutils.notebook.run.
  - Structured Streaming: watermarks, triggers, checkpointing, foreachBatch, schema evolution.
  - Delta Lake: Z-ORDER, OPTIMIZE/VACUUM, MERGE for SCD, Auto Optimize, compaction, time travel.
  - Performance tuning: partitioning, file sizing, broadcast hints, caching, Photon (where available), cluster sizing/pools.
- Medallion Architecture: Bronze/Silver/Gold patterns, SCD (Type 1/2), handling late-arriving dimensions.
- Azure Storage: ADLS Gen2 (hierarchical namespace), tiering (Hot/Cool/Archive), lifecycle & cost optimization, shortcuts into OneLake.
- Data Warehousing: Dimensional modeling, fact/aggregate design, query performance tuning in Fabric Warehouse & Lakehouse SQL endpoint.
- SQL: Excellent SQL development; advanced joins, windowing, CTEs, performance tuning/indexing where applicable.
- Power BI (DAX): Awareness of Power BI and DAX; RLS alignment with Warehouse/Lakehouse.
- Security & Compliance: RBAC, item-level permissions, credentials for data sources, RLS/OLS, secret management (Key Vault), PII handling.
- ETL/ELT Methodologies: Robust, testable pipelines; idempotency; error handling; data quality gates.
- Ways of Working: Agile delivery, client-facing communication, crisp demos, documentation, and best-practice advocacy.
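As a compact illustration of one bullet above, a Delta Lake MERGE used for an SCD Type 1 upsert on Databricks might look like the following sketch; table paths, keys, and columns are illustrative:

    # Upsert daily customer changes into a dimension table with Delta MERGE.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    updates = spark.read.format("delta").load("/lake/silver/customers_daily")
    target = DeltaTable.forPath(spark, "/lake/gold/dim_customer")

    (target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdate(set={"email": "s.email", "segment": "s.segment"})
       .whenNotMatchedInsertAll()
       .execute())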
Posted 1 week ago