6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As part of our GDS Consulting team, you will join the Digital & Emerging team delivering specifically to the Microsoft account. You will work on the latest Microsoft BI technologies and collaborate with other teams within Consulting services.

The opportunity
We're looking for people with expertise in Microsoft BI, Power BI, and Azure Data Factory to join our Data Visualization team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of our service offering.

Your Key Responsibilities
- Manage multiple client engagements.
- Understand and analyse business requirements by working with various stakeholders, and create the appropriate information architecture, taxonomy, and solution approach.
- Work independently on requirements gathering and on the cleansing, extraction, and loading of data.
- Design and build complete ETL/Azure Data Factory processes that move and transform data for ODS, staging, and data warehousing layers.
- Translate requirements from the business and analysts into technical code.
- Create reports, visualize data, and tune datasets for performance.
- Model data appropriately, then develop and deploy BI dashboards.
- Apply strong skills in SQL schema design, database schema design, stored procedures, functions, and T-SQL.
- Prototype visualizations, scorecards, and KPIs using Power BI.
- Good to have: technical knowledge of Databricks/Data Lake and Scala/Spark/SQL.
- Basic knowledge of Azure Machine Learning services.

Skills And Attributes For Success
- Collaborate with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
- Ability to manage senior stakeholders.
- Experience leading teams to execute high-quality deliverables within stipulated timelines.
- Skills in Azure Data Factory, Azure Synapse, Power BI, data modelling, DAX, Power Query, and Microsoft Fabric.
- One or more of the following databases: Oracle, SQL Server, Azure SQL.
- Good to have: SSAS or Azure Analysis Services.
- Excellent written and verbal communication skills.
- Ability to deliver technical demonstrations.
- Quick learner with a can-do attitude.
- Strong project management skills, inspiring teamwork and responsibility among engagement team members.

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 6-10 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.

Ideally, you'll also have
- The analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan.
You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
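For candidates wondering what the ETL work above looks like in practice, here is a minimal PySpark sketch of a staging-to-ODS cleanse-and-load step; the paths, table, and column names are illustrative assumptions, not details from the posting.

```python
# A minimal sketch of a staging-to-ODS cleanse-and-load step.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("staging_to_ods").getOrCreate()

# Read raw extracts landed by a Data Factory copy activity.
raw = spark.read.parquet("/mnt/staging/sales/")

clean = (
    raw.dropDuplicates(["order_id"])                       # de-duplicate on the business key
       .withColumn("order_date", F.to_date("order_date"))  # normalise types
       .filter(F.col("amount").isNotNull())                # drop unusable rows
       .withColumn("load_ts", F.current_timestamp())       # audit column
)

# Write to the ODS layer, partitioned for downstream warehouse loads.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/ods/sales/")
```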
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
Your journey at Crowe starts here with an opportunity to build a meaningful and rewarding career. At Crowe, you will enjoy real flexibility to balance work with life moments while being trusted to deliver results and make a significant impact. You will be embraced for who you are, your well-being will be cared for, and your career will be nurtured. Equitable access to opportunities for career growth and leadership is available to everyone. With a history spanning over 80 years, delivering excellent service through innovation is ingrained in our DNA across audit, tax, and consulting groups. We continuously invest in innovative ideas like AI-enabled insights and technology-powered solutions to enhance our services. Join us at Crowe and embark on a career where you can contribute to shaping the future of our industry.

As a Data Engineer at Crowe, you will play a crucial role in providing integration infrastructure for analytical support and solution development for the broader enterprise. Leveraging your expertise in API integration, pipelines and notebooks, programming languages (such as Python, Spark, and T-SQL), dimensional modeling, and advanced data engineering techniques, you will create and deliver robust solutions and data products. The ideal candidate will possess deep expertise in API integration and configuration, infrastructure development, and data modeling and analysis. You will be responsible for designing, developing, and maintaining the Enterprise Analytics Platform to facilitate data-driven decision-making across the organization. Success in this role hinges on a strong interest in and passion for data analytics, ETL/ELT best practices, critical thinking, and problem-solving, as well as excellent interpersonal, communication, listening, and presentation skills. The Data team at Crowe aims for an unparalleled client experience and will rely on you to promote our success and image firmwide.

Qualifications for this position include a bachelor's degree in Computer Science, Data Analytics, Data/Information Science, Information Systems, Mathematics, or a related field. You should have 3+ years of experience with SQL and data warehousing concepts supporting business intelligence, data analytics, and reporting, along with 2+ years of experience coding in Python, PySpark, and T-SQL (or other programming languages) using notebooks. Additionally, 2+ years of experience managing projects from inception to execution, 1+ years of experience with Microsoft Power BI (including DAX, Power Query, and the M language), and 1+ years of hands-on experience with Delta Lake or Apache Spark (Fabric or Databricks) are required. Hands-on experience or certification with Microsoft Fabric (DP-600 or DP-700 preferred) is also beneficial.

Candidates are expected to uphold Crowe's values of Care, Trust, Courage, and Stewardship, which define the organization's ethos. Ethical behavior and integrity are paramount for all individuals at Crowe at all times. Crowe offers a comprehensive benefits package to employees, recognizing that great people are what make a great firm. In an inclusive culture that values diversity, talent is nurtured, and employees meet regularly with Career Coaches who guide them in their career goals and aspirations. Crowe Horwath IT Services Private Ltd. is a wholly owned subsidiary of Crowe LLP (U.S.A.), a public accounting, consulting, and technology firm with a global presence.
As an independent member firm of Crowe Global, one of the largest global accounting networks, Crowe LLP is connected with over 200 independent accounting and advisory firms in more than 130 countries worldwide. Please note that Crowe does not accept unsolicited candidates, referrals, or resumes from any third-party entities. Any submissions without a pre-existing agreement will be considered the property of Crowe, free of charge.
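For a flavor of the Fabric/Databricks notebook work described above, here is a minimal PySpark-with-Delta sketch; the file path and table name are hypothetical.

```python
# Hedged sketch: landing a raw file as a Delta table, as a Fabric or
# Databricks notebook cell might. Path and table name are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the notebook runtime

orders = (
    spark.read.option("header", True).csv("Files/raw/orders.csv")
         .withColumn("ingest_date", F.current_date())
)

# Delta gives ACID writes and time travel over the lakehouse files.
orders.write.format("delta").mode("append").saveAsTable("bronze_orders")

# Downstream consumers (Power BI, SQL endpoints) read the same table.
spark.sql("SELECT COUNT(*) AS rows FROM bronze_orders").show()
```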
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Reporting Analyst/Power BI Analyst/Developer (SME) at EQRS Shared Services within the Reporting, BI Services department, you will play a key role in the setup of a newly centralized Reporting and Business Intelligence (BI) team within Retirement Solutions Shared Services. Your primary objective will be to collaborate with RS stakeholders to develop more automated and insightful reporting processes using tools such as SQL, Power BI, and Fabric.

Your responsibilities will include gathering business requirements in a BI context, designing BI solutions and data models that transform complex raw data into meaningful interactive insights, and developing and maintaining dashboards and interactive visual reports using Power BI. You will also work directly with stakeholders and team members in the UK and India to build strong, trusting relationships, ensure requirements are understood, and deliver solutions collaboratively. Additionally, you will develop new data access, ETL, and storage processes (including cloud-based), write and develop SQL scripts, work with complex SQL databases, develop new Microsoft Fabric solutions, work with complex Excel tools and spreadsheets, analyze and cleanse source system data, and handle multiple complex projects simultaneously.

To excel in this role, you must possess strong communication skills in English at a fluent business level, extensive experience designing, building, and maintaining Power BI dashboards, strong experience in SQL script writing and database management, strong Microsoft Excel experience, and ideally experience with Microsoft Fabric. You should also have a good knowledge of testing approaches and methods, and experience devising and running testing procedures. Your work will be instrumental in driving business and client decision-making through the standardization and optimization of all internal and client reports.

If you thrive in a collaborative environment, have a passion for data analysis, and enjoy working on complex projects to deliver high-quality solutions on time, then this role at EQRS Shared Services could be the perfect fit for you. Join us at EQ and be a part of our global growth story!
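As an illustration of the SQL-to-Power-BI workflow above, the sketch below shapes raw operational rows into KPI rows with plain SQL, using the Python stdlib's sqlite3 as a stand-in for the team's actual SQL estate; tables and metrics are invented.

```python
# Illustrative only: aggregating operational data into KPI rows that a
# Power BI dataset could consume. sqlite3 stands in for the real source.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cases (case_id INTEGER, team TEXT, closed_date TEXT, sla_met INTEGER);
    INSERT INTO cases VALUES
        (1, 'UK',    '2024-05-01', 1),
        (2, 'India', '2024-05-01', 0),
        (3, 'UK',    '2024-05-02', 1);
""")

kpis = pd.read_sql_query(
    """
    SELECT team,
           COUNT(*)                     AS cases_closed,
           ROUND(AVG(sla_met) * 100, 1) AS sla_pct
    FROM cases
    GROUP BY team
    """,
    conn,
)
print(kpis)
```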
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You have a great opportunity to join Ashra Technologies as a Microsoft Fabric professional with a minimum of 7 years of experience. The role requires hands-on experience with Microsoft Fabric components such as Lakehouse, Data Factory, and Synapse, along with strong expertise in PySpark and Python for large-scale data processing and transformation.

In this role, you will be expected to have deep knowledge of various Azure data services, including ADLS Gen2, Azure Databricks, Synapse, ADF, and Azure SQL, among others. Your responsibilities will include designing, implementing, and optimizing end-to-end data pipelines on the Azure platform. An understanding of Azure infrastructure setup, including networking, security, and access management, would be advantageous. While not mandatory, knowledge of the healthcare domain would be considered a plus.

This is a full-time position based out of Pune, Mumbai, Chennai, or Bangalore. If you are interested in this exciting opportunity, please share your resume with us at akshitha@ashratech.com or contact us at 8688322632. We look forward to potentially having you join our team at Ashra Technologies.
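As a hedged example of the PySpark-on-Azure work this role calls for, the following sketch joins a large fact set to a small reference table and aggregates it; the ADLS paths, schemas, and healthcare flavor are assumptions for illustration.

```python
# A hedged sketch of a PySpark pipeline step on Azure.
# ADLS paths and schemas are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("claims_enrichment").getOrCreate()

lake = "abfss://lake@account.dfs.core.windows.net"
claims = spark.read.parquet(f"{lake}/raw/claims/")
providers = spark.read.parquet(f"{lake}/ref/providers/")

# Broadcasting the small reference table avoids shuffling the large fact side.
enriched = (
    claims.join(broadcast(providers), "provider_id", "left")
          .withColumn("claim_year", F.year(F.to_date("claim_date")))
          .groupBy("provider_region", "claim_year")
          .agg(F.sum("claim_amount").alias("total_claims"))
)

enriched.write.mode("overwrite").parquet(f"{lake}/curated/claims_by_region/")
```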
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
Karnataka
On-site
You will be joining a fast-paced team in Bangalore, India, working on a full-time basis and reporting directly to the Director of Data Engineering. As a Technical Program Manager, you will take charge of a crucial project aimed at modernizing the enterprise data warehouse using Microsoft Azure and Microsoft Fabric. Your role will require a combination of agile leadership, technical expertise, and team management to ensure the successful completion of this significant initiative.

Your responsibilities will include leading the program from planning through execution and delivery of the data warehouse modernization project, ensuring alignment with organizational objectives and timelines. You will engage with stakeholders such as business partners, data engineers, and architects to gather requirements, manage expectations, and maintain visibility throughout the project lifecycle. Additionally, you will provide technical oversight, collaborating closely with technical teams to ensure that the architecture and implementation adhere to best practices for Azure and Microsoft Fabric. Identifying and addressing potential risks and impediments proactively to minimize their impact on delivery will also be part of your role. You will maintain clear and concise documentation of project progress, including status reports, dashboards, and metrics that keep partners and leadership updated on the project's status.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, or possess equivalent experience. You must have a minimum of 8 years of experience in program/project management roles, with at least 3 years in agile environments. Technical proficiency in data warehousing concepts and hands-on experience with Microsoft Azure services such as Azure Data Factory and Azure Synapse Analytics, as well as Microsoft Fabric, is essential. A Certified Scrum Master (CSM) or equivalent agile certification is preferred, as is proficiency in project management and agile tools such as Azure DevOps, Jira, or similar platforms. You should also demonstrate superb communication, leadership, and problem-solving skills, with the ability to work effectively in a multi-functional, multicultural environment.

Joining our team presents an opportunity to play a pivotal role in transforming our data infrastructure, enabling advanced analytics and business intelligence capabilities. You will collaborate with a dynamic team committed to innovation and excellence in a supportive and collaborative environment. At LSEG, we prioritize work-life balance and offer a flexible working culture to ensure that success at work does not come at the expense of personal sacrifices. Our inclusive team culture encourages learning and curiosity, fostering a sense of inclusion that empowers employees to embrace their differences. Additionally, we provide mentorship and career growth opportunities to help individuals develop into well-rounded professionals. As a leading global financial markets infrastructure and data provider, LSEG is dedicated to driving financial stability, empowering economies, and facilitating sustainable growth. Our values of Integrity, Partnership, Excellence, and Change guide our actions and decisions, shaping our organizational culture.
By working with us, you will be part of a dynamic and diverse workforce across 65 countries, where your individuality is valued and your contributions are encouraged to support sustainable economic growth and inclusive economic opportunities. LSEG offers a range of benefits and support tailored to employees, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.
Posted 2 weeks ago
1.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
Capgemini is looking for a highly motivated and detail-oriented Strategy Integration & Reporting Analyst to join a top-10 US insurance carrier. As a Strategy Integration & Reporting Analyst, your responsibilities will include assisting in strategic analysis and developing interactive reports, tools, and trackers using Power BI. Proficiency in creating data process flows in SQL, Python, Power Query, and Microsoft Fabric is beneficial. The ideal candidate should possess technical skills, attention to detail, and the ability to create well-structured and visually engaging dashboards. Key responsibilities of the role include data manipulation, data validation, maintaining and updating reports and analyses, and conducting ad hoc analysis and generating reports as needed.

The client, one of the largest insurers in the United States, offers a wide range of insurance and financial services products with a gross written premium exceeding US$25 billion (P&C). They serve over 10 million U.S. households and more than 19 million individual policies across all 50 states through the dedicated efforts of over 48,000 exclusive and independent agents and nearly 18,500 employees. Additionally, the client is part of one of the largest insurance groups globally.

Requirements:
- Total work experience: minimum required 1-3 years / preferred 4-6 years
- Work experience in this field: minimum required none / preferred 1-3 years
- English proficiency: minimum required intermediate / preferred fluent
- Required education: Bachelor's degree

Software / Tool Skills:
- Power BI (nice to have)
- Semantic model design (nice to have)
- DAX (nice to have)
- SQL (nice to have)
- Python (nice to have)

Benefits:
- Competitive compensation and benefits package
- Competitive salary and performance-based bonuses
- Comprehensive benefits package
- Career development and training opportunities
- Flexible work arrangements (remote and/or office-based)
- Dynamic and inclusive work culture within a globally renowned group
- Private health insurance
- Pension plan
- Paid time off
- Training & development

Note: Benefits may vary based on the employee's level.

About Capgemini:
Capgemini is a global leader in partnering with companies to transform and manage their businesses through technology. With over 340,000 team members in more than 50 countries, Capgemini is committed to unleashing human energy through technology for an inclusive and sustainable future. The company has a strong 55-year heritage and deep industry expertise, and is trusted by clients to address their business needs comprehensively. Capgemini operates in the evolving world of cloud, data, AI, connectivity, software, digital engineering, and platforms, with revenues of €22.5 billion in 2023.
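To make the data-validation responsibility above concrete, here is a small pandas sketch of pre-refresh checks; the columns and rules are hypothetical, not the client's actual model.

```python
# Sketch of a data-validation pass before a report refresh.
# Column names and rules are invented for illustration.
import pandas as pd

policies = pd.DataFrame({
    "policy_id": [101, 102, 102, 104],
    "premium":   [1200.0, 950.0, 950.0, None],
    "state":     ["TX", "CA", "CA", "NY"],
})

issues = []
if policies["policy_id"].duplicated().any():
    issues.append("duplicate policy_id values")
if policies["premium"].isna().any():
    issues.append("missing premium amounts")
if not policies["state"].str.fullmatch(r"[A-Z]{2}").all():
    issues.append("malformed state codes")

# Surface problems before they reach the dashboard, rather than after.
print("validation issues:", issues or "none")
```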
Posted 2 weeks ago
4.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

About Our Company/Team
At Oracle Finergy, we are committed to delivering innovative solutions to the Banking, Financial Services, and Insurance (BFSI) sector. Our team of experts leverages proven methodologies and cutting-edge technologies to address the complex financial needs of our clients. We pride ourselves on being a leading provider of end-to-end banking solutions, enhancing operational efficiency, and ensuring technology aligns with our clients' business goals. Our mission is to empower financial institutions to succeed in a rapidly changing world.

Job Summary
As a Microsoft Fabric Data Engineer/Developer, you will play a vital role in designing, developing, and implementing robust and scalable data solutions within the Microsoft Fabric ecosystem. You will collaborate closely with data architects, business stakeholders, and cross-functional teams to transform raw data into actionable insights, driving informed decision-making across the organization. If you are passionate about data engineering, possess a strong technical background, and excel in collaborative environments, we invite you to join our growing data team.

Career Level - IC2

Responsibilities

Microsoft Fabric Development: Design, develop, and deploy end-to-end data solutions using various components of Microsoft Fabric, including Lakehouse, Data Warehouse, Data Factory, and Data Engineering. Implement and optimize data pipelines for ingestion, transformation, and curation of data from diverse sources (e.g., Azure Data Lake Storage Gen2, on-premises databases, APIs, third-party systems). Develop and optimize data models within Microsoft Fabric, ensuring adherence to best practices for performance, scalability, and data quality. Utilize Power BI for data visualization and reporting, ensuring seamless integration with Fabric data assets.

Azure Data Services Integration: Demonstrate strong hands-on experience with core Microsoft Azure data services, including Azure Data Factory (for ETL/ELT orchestration), Azure Databricks (for advanced analytics and processing), and Azure Data Lake Storage Gen2. Integrate Microsoft Fabric solutions with existing Azure data services and other enterprise systems.

Data Architecture & Governance: Contribute to the design and implementation of robust, scalable, and secure data architectures within the Microsoft Fabric platform. Implement data quality, validation, and reconciliation processes to ensure data integrity and accuracy. Apply data governance best practices, including security, access controls (e.g., role-based access control), and compliance within Fabric and Azure Purview.

Documentation & Knowledge Sharing: Maintain comprehensive documentation for data architectures, pipelines, data models, and processes. Stay updated with the latest advancements in Microsoft Fabric, Azure data services, and data engineering best practices.

Qualifications & Skills

Mandatory:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 4-7 years of professional experience as a Data Engineer, Data Developer, or in a similar role.
- Hands-on experience with Microsoft Fabric, including its core components (Lakehouse, Data Warehouse, Data Factory, Data Engineering).
- Strong expertise in Microsoft Azure data services: Azure Data Factory (ADF) and Azure Data Lake Storage Gen2.
- Proven experience in designing, developing, and maintaining scalable data pipelines.
- Solid understanding of data warehousing concepts, dimensional modeling, and data lakehouse architectures.
- Proficiency in SQL for data manipulation and querying.
- Experience with version control systems (e.g., Git, Azure Repos).
- Strong analytical and problem-solving skills with meticulous attention to detail.
- Excellent communication skills (written and verbal) and the ability to collaborate effectively with cross-functional teams.

Good-to-Have:
- Certification in Microsoft Azure or Microsoft Fabric.
- Experience with cloud-based data platforms such as Amazon Web Services (AWS) or Google Cloud Platform (GCP).
- Knowledge of data governance frameworks and best practices.

Additional Notes
Ideal to have some background knowledge of Finance / Investment Banking / Fixed Income / OCIO business.

Self-Assessment Questions
To help you determine if this role is a good fit, please consider the following questions:
- Can you describe your experience with Microsoft Fabric and its core components, highlighting specific projects or accomplishments?
- How do you ensure data quality, validation, and reconciliation in your data pipelines, and can you provide an example from a previous project?
- Can you explain your approach to data governance, including security, access controls, and compliance, and how you've applied this in a previous role?
- How do you stay up-to-date with the latest advancements in Microsoft Fabric, Azure data services, and data engineering best practices?
- Can you provide an example of a complex data problem you've solved in the past, highlighting your analytical and problem-solving skills?

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing [HIDDEN TEXT] or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
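As an illustration of the data quality, validation, and reconciliation responsibility described above, here is a minimal PySpark reconciliation sketch; the table names and the notional column are assumptions.

```python
# A minimal reconciliation sketch: confirm the curated output still balances
# against the source extract. Names are illustrative, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

source = spark.read.parquet("Files/landing/trades/")
curated = spark.table("gold_trades")

src = source.agg(F.count("*").alias("rows"), F.sum("notional").alias("total")).first()
cur = curated.agg(F.count("*").alias("rows"), F.sum("notional").alias("total")).first()

# Row counts must match exactly; monetary totals within a small tolerance.
assert src["rows"] == cur["rows"], f"row count drift: {src['rows']} vs {cur['rows']}"
assert abs(src["total"] - cur["total"]) < 0.01, "notional totals do not reconcile"
print("reconciliation passed")
```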
Posted 2 weeks ago
6.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Withum is a place where talent thrives - where who you are matters. It's a place of endless opportunities for growth. A place where entrepreneurial energy plus inclusive teamwork equals exponential results. Withum empowers clients and our professional staff with innovative tools and solutions to address their accounting, tax, and overall business management and operational needs. As a US nationally ranked Top 25 firm, we recruit only the best and brightest people with a genuine passion for the business.

We are seeking an experienced Lead Consultant, Data Engineering, with a strong background in consulting services and hands-on skills in building modern, scalable data platforms and pipelines. This is a client-facing, delivery-focused role. Please note that this position is centered around external client delivery and is not part of an internal IT or product engineering team. This is a foundational hire: you will be responsible for delivering hands-on client work, supporting our proprietary data products, and building the team underneath you.

Withum's brand is a reflection of our people, our culture, and our strength. Withum has become synonymous with teamwork and client service excellence. The cornerstone of our success can truly be accredited to the dedicated professionals who work here every day, who are easy to work with, have a sense of purpose, care for their co-workers, and whose mission is to help our clients grow and thrive. But our commitment goes beyond our clients, as we continue to live the Withum Way, promoting personal and professional growth for all team members, clients, and surrounding communities.

How You'll Spend Your Time:
- Architect, implement, and optimize data transformation pipelines, data lakes, and cloud-native warehouses for mid- and upper-mid-market clients.
- Deliver hands-on engineering work across client environments, building fast, scalable, and well-documented pipelines that support both analytics and AI use cases.
- Lead technical design and execution using tools such as Tableau, Microsoft Fabric, Synapse, Power BI, Snowflake, and Databricks, with good hands-on familiarity with SQL databases.
- Optimize for sub-50GB datasets and local or lightweight cloud execution where appropriate, minimizing unnecessary reliance on cluster-based compute.
- Collaborate with subject-matter experts to understand business use cases prior to designing the data model.
- Operate as a client-facing consultant: conduct discovery, define solutions, and lead agile project delivery.
- Switch context rapidly across 2-3 active clients or service streams in a single day.
- Provide support for our proprietary data products as needed.
- Provide advisory and strategic input to clients on data modernization, AI enablement, and FP&A transformation efforts.
- Deliver workshops, demos, and consultative training to business and technical stakeholders.
- Implement coding modifications to pre-existing code/procedures in a manner that results in a validated case study (i.e., if done properly, the result will be xyz and the total amount will reconcile to abc).
- Take full ownership of hiring, onboarding, and mentoring future data engineers and analysts within the India practice.
- During bench time, contribute to building internal data products and tooling powering our own consulting operations (e.g., utilization dashboards, delivery intelligence, practice forecasting).
- Help define and scale delivery methodology, best practices, and reusable internal accelerators for future engagements.
- Communicate openly about conflicting deadlines to ensure prioritization aligns with client expectations, with ample time to reset client expectations as needed.
- Ensure code is properly commented to help explain the logic or purpose behind more complex sections.

Requirements:
- 6+ years of hands-on experience in data engineering roles, with at least 3+ years in a consulting or client delivery environment.
- Proven ability to context-switch, self-prioritize, and communicate clearly under pressure.
- Demonstrated experience owning full lifecycle delivery, from architecture through implementation and client handoff.
- Strong experience designing and implementing ETL/ELT pipelines, preferably in SQL-first tools (e.g., dbt core, SQLMesh, DuckDB).
- Experience with Microsoft SQL Server / SSIS for maintenance and development of ETL processes.
- Real-world experience with SQL databases, Databricks, Snowflake, and/or Synapse, and a healthy skepticism of when to use them.
- Deep understanding of data warehousing, data lakes, data modeling, and incremental processing.
- Proficiency in Python for ETL scripting, automation, and integration work.
- Experience with dbt core or a comparable tool such as SQLMesh, Dataform, etc., in production environments.
- Strong practices around data testing, version control, documentation, and team-based dev workflows.
- Working knowledge of Power BI, Tableau, Looker, or similar BI tools, enough to support downstream teams but not as your primary skillset.
- Experience building platforms for AI/ML workflows or supporting agentic architectures.
- Familiarity with Microsoft Fabric's Lakehouse implementation, Delta Lake, Iceberg, and Parquet.
- Background in DataOps, CI/CD for data pipelines, and metadata management.
- Microsoft certifications (e.g., Azure Data Engineer, Fabric Analytics Engineer) are a plus.

Website: www.withum.com

Withum will not discriminate against any employee or applicant for employment because of race, color, religion, sex, sexual orientation, gender identity, national origin, age, marital status, genetic information, disability or because he or she is a protected veteran.
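In the spirit of the SQL-first tooling and sub-50GB guidance above, here is a small DuckDB ELT sketch in Python; the file paths and table names are hypothetical.

```python
# A small SQL-first ELT sketch with DuckDB: lightweight local execution
# instead of cluster-based compute. Paths and names are invented.
import duckdb

con = duckdb.connect("analytics.duckdb")

# DuckDB reads CSV/Parquet in place, so "extract" is often just a query.
con.execute("""
    CREATE OR REPLACE TABLE stg_invoices AS
    SELECT *, CURRENT_TIMESTAMP AS loaded_at
    FROM read_csv_auto('data/invoices.csv')
""")

# Transform with plain SQL.
con.execute("""
    CREATE OR REPLACE TABLE fct_monthly_revenue AS
    SELECT date_trunc('month', invoice_date) AS month,
           customer_id,
           SUM(amount) AS revenue
    FROM stg_invoices
    GROUP BY 1, 2
""")

print(con.execute("SELECT * FROM fct_monthly_revenue LIMIT 5").fetchdf())
```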
Posted 2 weeks ago
6.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Remote
6+ years with data engineering, data modelling, Synapse, ADF, Microsoft Fabric, Databricks, SQL, ETL, Agile, medallion architecture, data profiling, anomaly detection, DevOps.
Posted 2 weeks ago
12.0 - 16.0 years
65 - 70 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
About the Role:
We are looking for a Principal Data Engineer to lead the design and delivery of scalable data solutions using Azure Data Factory and Azure Databricks. This is a consulting-focused role that requires strong technical expertise, stakeholder engagement, and architectural thinking. You will work closely with business, functional, and technical teams to define data strategies, design robust pipelines, and ensure smooth delivery in an Agile environment.

Responsibilities
- Collaborate with business and technology stakeholders to gather and understand data needs
- Translate functional requirements into scalable and maintainable data architecture
- Design and implement robust data pipelines
- Lead data modeling, transformation, and performance optimization efforts
- Ensure data quality, validation, and consistency
- Participate in Agile ceremonies, including sprint planning and backlog grooming
- Support CI/CD automation for data pipelines and integration workflows
- Mentor junior engineers and promote best practices in data engineering

Must Have
- 12+ years of IT experience, with at least 5 years in data architecture roles in modern, metadata-driven, cloud-based technologies, bringing a software engineering mindset
- Strong analytical and problem-solving skills; ability to determine data patterns and perform root cause analysis to resolve production issues
- Excellent communication skills, with experience leading client-facing discussions
- Strong hands-on experience with Azure Data Factory and Databricks, leveraging custom solutioning and design beyond drag-and-drop capabilities for big data workloads
- Demonstrated proficiency in SQL, Python, and Spark
- Experience with CI/CD pipelines, version control, and DevOps tools
- Experience applying dimensional and Data Vault methodologies
- Background in working with Agile methodologies and sprint-based delivery
- Ability to produce clear and comprehensive technical documentation

Nice to Have
- Experience with Azure Synapse and Power BI
- Experience with Microsoft Purview and/or Unity Catalog
- Understanding of Data Lakehouse and Data Mesh concepts
- Familiarity with enterprise data governance and quality frameworks
- Manufacturing experience within the operations domain

Location: Ahmedabad, Bengaluru, Chennai, Gurugram, Hyderabad, Mumbai, Pune
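One concrete instance of the dimensional methodology cited above is a Type 2 slowly changing dimension. The following is a simplified, hedged sketch using the delta-spark API; table and column names are assumptions, and a production load would insert only genuinely new or changed rows.

```python
# A simplified, hedged Type 2 SCD sketch with delta-spark.
# Assumes dim_customer already has effective_to and is_current columns.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

updates = (
    spark.read.parquet("/mnt/staging/customers/")
         .withColumn("effective_from", F.current_date())
         .withColumn("is_current", F.lit(True))
)

dim = DeltaTable.forName(spark, "dim_customer")

# Close off current rows whose tracked attribute changed...
(dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> u.address",
        set={"is_current": "false", "effective_to": "current_date()"},
    )
    .execute())

# ...then append the incoming versions as the new current rows
# (a real load would first filter out unchanged keys).
updates.write.format("delta").mode("append").saveAsTable("dim_customer")
```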
Posted 2 weeks ago
6.0 - 11.0 years
16 - 27 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Cloud Technical Lead - Data
Shift: 2:00 PM - 11:00 PM IST
Location: Delhi NCR, Hyderabad, Bangalore, Pune, Chennai; this is a hybrid work opportunity.

Azure Data Engineering (Microsoft Fabric)

About the Role
We are seeking an Azure Data Engineering Architect with mandatory Microsoft Fabric hands-on experience (minimum 1 year). The ideal candidate should have proven Fabric implementation exposure on client projects, a strong Azure Data Engineering background, and the ability to design and deliver enterprise-scale data solutions.

Key Responsibilities
- Lead end-to-end Microsoft Fabric implementations for enterprise clients.
- Build and maintain ETL/data pipelines using Azure Data Factory, Databricks, and Fabric Data Pipelines.
- Design, develop, and optimize large-scale data solutions on Azure (Fabric, Synapse, Data Lake, SQL DB).
- Implement data models and data warehousing solutions using Fabric Lakehouse, Synapse, and SQL.
- Collaborate with stakeholders to translate business needs into scalable Fabric-based data solutions.
- Ensure high-performance, secure, and compliant data solutions.
- Mentor junior engineers on Fabric, Databricks, and ADF best practices.
- Provide architectural input for enterprise cloud data platforms.

Required Skills & Experience
- Bachelor's degree in Computer Science, IT, or a related field.
- 5-8+ years of experience in data engineering, including 5+ years of hands-on experience with Azure Databricks, ADF, and Synapse.
- A minimum of 7 months to 1 year of mandatory hands-on experience with Microsoft Fabric, demonstrated through client project implementations.
- Strong experience in data modeling, data architecture, and database design.
- Proficiency in SQL, Python, and PySpark.
- Familiarity with data governance, security, and compliance practices, with hands-on experience in tools such as Microsoft Purview or Unity Catalog.
- Experience with Azure DevOps CI/CD for data solutions.
- Strong interpersonal and communication skills, with the ability to lead teams.

Insight at a Glance
- 14,000+ engaged teammates globally, with operations in 25 countries.
- 35+ industry and partner awards received in the past year.
- $9.2 billion in revenue.
- #20 on Fortune's World's Best Workplaces™ list.
- #14 on Forbes World's Best Employers in IT (2023).
- #23 on Forbes Best Employers for Women in IT (2023).
- $1.4M+ in total charitable contributions in 2023 by Insight globally.

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organizations through complex digital decisions.

What you can expect
We're legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
- Freedom to work from another location, even an international destination, for up to 30 consecutive calendar days per year.
- Medical insurance and health benefits.
- Professional development: learning platform and certificate reimbursement.
- Shift allowance.

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities.
Join us today; your ambitious journey starts here. Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation, or any other characteristic protected by law. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don't feel like your skills are a perfect match, we still want to hear from you! Today's talent leads tomorrow's success. Learn more about Insight: https://www.linkedin.com/company/insight/
Posted 2 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Hyderabad, Pune, Gurugram
Work from Office
GSPANN is hiring! Role: Senior Fabric Developer. Experience: 5+ years. Locations: Gurugram | Hyderabad | Pune. Send CVs to: heena.ruchwani@gspann.com. We're looking for experts in Microsoft Fabric, Power BI, SQL, and Python/PySpark. Candidates with Microsoft Fabric certifications (e.g., Fabric Analytics Engineer Associate) are preferred.
Posted 2 weeks ago
8.0 - 10.0 years
6 - 14 Lacs
Chennai, Tamil Nadu, India
Remote
Role Responsibilities:
- Design and develop scalable data pipelines using MS Fabric to support business intelligence and analytics needs.
- Build and optimize data models that facilitate effective data storage and retrieval.
- Manage ETL (Extract, Transform, Load) processes, ensuring efficient data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define comprehensive data requirements.
- Ensure data quality, integrity, and consistency across all data processes.
- Implement and enforce best practices for data management, storage, and processing.
- Conduct performance tuning for data storage systems and query execution to enhance efficiency.
- Create and maintain detailed documentation for data architecture, workflows, and processes.
- Troubleshoot data-related issues and implement timely and effective solutions.
- Monitor and optimize cloud-based data solutions for scalability and resource efficiency.
- Research and evaluate emerging data engineering tools and technologies for project incorporation.
- Assist in designing and enforcing data governance frameworks and policies.
- Provide technical guidance and mentorship to junior data engineers.
- Participate in code reviews to ensure adherence to coding standards and quality.
- Stay updated on industry trends and best practices in data engineering and analytics.

Qualifications:
- Minimum of 8 years of experience in data engineering or related roles.
- Strong expertise and hands-on experience with MS Fabric and its ecosystem.
- Proficiency in SQL and experience working with relational database management systems.
- Solid experience in data warehousing solutions and data modeling techniques.
- Hands-on experience with ETL tools and data integration processes.
- Familiarity with major cloud computing platforms such as Azure, AWS, and GCP.
- Working knowledge of Python or other programming languages commonly used in data engineering.
- Proven ability to communicate complex technical concepts to non-technical stakeholders clearly.
- Experience implementing data quality measures and data governance practices.
- Excellent problem-solving skills and a keen attention to detail.
- Ability to work independently in remote and distributed team environments.
- Experience with data visualization tools is advantageous.
- Strong analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or a related discipline.
- Familiarity with Agile methodologies and project management practices.
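As a hedged illustration of the performance-tuning responsibility above, the following sketch shows a few standard Spark moves; the paths, columns, and file counts are assumptions rather than project settings.

```python
# Illustrative performance-tuning moves for a Spark-backed pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning_demo").getOrCreate()

events = spark.read.parquet("/lake/bronze/events/")

# 1. Prune early: select and filter before any wide operation.
slim = (
    events.select("event_id", "user_id", "event_date", "payload_size")
          .filter(F.col("event_date") >= "2024-01-01")
)

# 2. Cache only what is reused across several actions.
slim.cache()
daily = slim.groupBy("event_date").count()
by_user = slim.groupBy("user_id").agg(F.sum("payload_size").alias("bytes"))

# 3. Right-size output files to avoid the small-files problem.
daily.coalesce(8).write.mode("overwrite").parquet("/lake/silver/daily_counts/")
by_user.coalesce(8).write.mode("overwrite").parquet("/lake/silver/user_bytes/")
```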
Posted 2 weeks ago
5.0 - 10.0 years
4 - 9 Lacs
Kolkata, Bengaluru
Hybrid
Job description for a Data Engineering role focused on Microsoft Fabric and Azure Databricks, with PySpark capabilities and a strong emphasis on Python, SQL, Data Lake, and Data Warehouse.

Job Title: Data Engineer - Microsoft Fabric, Azure (Databricks & ADF), PySpark
Experience: More than 8 years
Location: Kolkata / Bangalore
Employment Type: Full-Time

Job Summary:
We are looking for a skilled and experienced Data Engineer with more than 8 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Microsoft Fabric and Azure Databricks, along with strong PySpark, Python, and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential.

Key Responsibilities:
- Requirement gathering and analysis
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Microsoft Fabric & Databricks
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for any problem related to data engineering
- Work with a variety of sources like relational DBs, APIs, file systems, realtime streams, CDC, etc.
- Provide mentorship on debugging and problem-solving
- Strong knowledge of Databricks and Delta tables

Required Skills:
- More than 8 years of experience in data engineering or related roles
- Hands-on experience in Microsoft Fabric
- Hands-on experience in Azure Databricks
- Proficiency in PySpark for data processing and scripting
- Strong command of Python & SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas)
- Hands-on experience in performance tuning & optimization on Databricks & MS Fabric
- Ability to ensure alignment with overall system architecture and data flow
- Understanding of CI/CD practices in a data engineering context
- Excellent problem-solving and communication skills
- Exposure to BI tools like Power BI, Tableau, or Looker

Good to Have:
- Experience in Azure DevOps
- Knowledge of Scala or other distributed processing frameworks
- Familiarity with data security and compliance in the cloud
- Experience in leading a development team
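To illustrate the "realtime streams, CDC" source type mentioned above, here is a hedged sketch of a CDC-style upsert into a Delta table using the delta-spark Python API; the key, op codes, and paths are invented for the example.

```python
# A hedged sketch of a CDC-style upsert into a Delta table.
# Key, op codes, and paths are assumptions for illustration.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

changes = spark.read.json("abfss://landing@account.dfs.core.windows.net/cdc/accounts/")

target = DeltaTable.forName(spark, "silver_accounts")

(target.alias("t")
       .merge(changes.alias("c"), "t.account_id = c.account_id")
       .whenMatchedDelete(condition="c.op = 'D'")   # tombstones delete the row
       .whenMatchedUpdateAll(condition="c.op = 'U'")
       .whenNotMatchedInsertAll(condition="c.op = 'I'")
       .execute())
```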
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
You will be responsible for supporting the gathering, analysis, and design of requirements, extracting necessary report data from various sources, and managing reports, dashboards, and visualizations to effectively communicate business data and insights. Conducting comprehensive data analysis to identify trends, patterns, and insights crucial for strategic business decision-making will be a key aspect of your role. Collaborating with stakeholders in business requirements gathering sessions to understand their needs and specifications for reporting and analysis is essential.

Your duties will include administering and maintaining BI tools and platforms, managing user access, implementing security protocols, and optimizing performance. You will create impactful visualizations that transform complex data into clear, actionable insights for business users, and interpret data analysis findings to present them to business stakeholders in a clear and actionable manner. Additionally, you will provide comprehensive training and ongoing support to empower business users in effectively utilizing BI tools and reports for self-service analytics. Monitoring the performance of enterprise and client reports and optimizing queries and processes to enhance efficiency and operational performance will be crucial, as will maintaining documentation such as business requirements documents, data dictionaries, data catalogs, and data mapping documents. You may be required to perform additional tasks and duties as instructed by your manager or supervisor based on team and business needs, including administrative duties, project support, backup development roles, and other ad hoc tasks.

Required Competencies:
- Bachelor's degree in Computer Science, Information Technology, Mathematics, or a related field.
- Minimum of 5 years of hands-on experience developing BI solutions using Power BI.
- At least 3 years of experience in data warehousing and data modeling, including normalization and denormalization.
- Extensive experience performing ETL to extract data from various sources using SSIS, Data Factory, or Microsoft Fabric.
- Proficiency in T-SQL scripting and strong technical knowledge of databases.
- Expertise in data visualization techniques to create visually appealing and insightful dashboards and reports using Power BI.
- Strong skills in designing and developing data models that structure data for analysis and reporting.
- Solid knowledge of data warehousing concepts and architectures.
- Ability to analyze user needs and data, translating them into technical specifications for BI solutions.
- Knowledge of other data analysis languages, such as DAX, KQL, or their equivalents.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Faridabad, Haryana
On-site
The role involves providing business intelligence services, spearheading BI software development, and creating Power BI reports. You will manage Power BI applications, train end users, and collaborate with product teams and business analysts on report development.

Your responsibilities will include developing, maintaining, and optimizing Power BI reports using data from SAP BW, focusing on BEx queries and other BW objects. You will design reports and dashboards using data from Excel, data lakes, and Microsoft Fabric, and implement DAX queries to support complex data models within Power BI. Additionally, you will work with Azure Synapse, Azure Analysis Services (AAS), and other tools to integrate and analyze data. You will schedule report delivery, ensure data security and compliance, and manage Power BI workspaces. Collaborating with business users, you will gather requirements, design an intuitive UX, and provide ongoing support. Monitoring performance, staying updated with Power BI features, and continuously improving reporting capabilities are key aspects of the role.

Requirements include a Bachelor's degree in Computer Science or a related field, at least 4 years of experience developing Power BI solutions, and proficiency in the Microsoft BI stack, DAX, Azure Synapse, AAS, and scripting. You should have experience in SAP BW integration, SQL Server, data visualization, and UX design principles. Strong problem-solving skills, attention to detail, excellent communication, and the ability to lead data warehousing projects are essential. Certification in Power BI or related technologies is preferred.

Thank you for considering a career with Varel Energy Solutions.
Posted 2 weeks ago
2.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Calling all innovators - find your future at Fiserv. We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day - quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Data Engineer

About your role:
At Fiserv, we are dedicated to transforming financial services technology to benefit our clients. As a Data Engineer, you will play a key role in making sure that data movement is smooth and efficient. You will have a good understanding of all the data elements, schemas, and various data assets. The development of new pipelines to expand the value of our data assets is also an integral part of the role.

What you'll do:
- Ensure repeatability and robustness of data movement and storage
- Execute data purge strategies for the OLTP systems
- Execute effective error detection and mitigation strategies
- Keep data-related documents up to date

Experience you'll need to have:
- Deep knowledge of the Microsoft Azure environment, managed and non-managed SQL
- Deep expertise around Microsoft Fabric
- Proven experience developing and managing data pipelines
- Good oral and written communication skills
- An undergraduate (Bachelor's) degree, preferably in Computer Science; a Master's degree will be an added advantage
- 2+ years of experience post undergraduate/master's degree

Experience that would be great to have:
- Experience in the financial services industry

Thank you for considering employment with Fiserv. Please apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion:
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 2 weeks ago
3.0 - 6.0 years
8 - 13 Lacs
Kolkata, Bengaluru
Hybrid
We are looking for a skilled and experienced Data Engineer with 3-5 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Microsoft Fabric and Azure Databricks, along with PySpark, Python, and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential. Please note: only candidates who can join immediately will be considered.

Key Responsibilities:
- Requirement gathering and analysis
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Microsoft Fabric & Databricks
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for any problem related to data engineering
- Work with a variety of sources like relational DBs, APIs, file systems, realtime streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables

Preferred candidate profile:
- Hands-on experience in Microsoft Fabric
- Hands-on experience in Azure Databricks
- Proficiency in PySpark for data processing and scripting
- Strong command of Python & SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas)
- Understanding of CI/CD practices in a data engineering context
- Excellent problem-solving and communication skills
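One common answer to the incremental-ETL bullets above is a watermark-driven extract. Here is a hedged sketch with the control table and source as stand-ins (stdlib sqlite3); the table and column names are invented.

```python
# Sketch of a watermark-driven incremental extract. sqlite3 stands in
# for the real source system and ETL control table.
import sqlite3

src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (id INTEGER, updated_at TEXT);
    INSERT INTO orders VALUES (1, '2024-06-01'), (2, '2024-06-03'), (3, '2024-06-05');
    CREATE TABLE etl_watermark (table_name TEXT, last_loaded TEXT);
    INSERT INTO etl_watermark VALUES ('orders', '2024-06-02');
""")

wm = src.execute(
    "SELECT last_loaded FROM etl_watermark WHERE table_name = 'orders'"
).fetchone()[0]

# Pull only rows changed since the last run, then advance the watermark.
delta_rows = src.execute(
    "SELECT * FROM orders WHERE updated_at > ?", (wm,)
).fetchall()
new_wm = max(r[1] for r in delta_rows) if delta_rows else wm
src.execute(
    "UPDATE etl_watermark SET last_loaded = ? WHERE table_name = 'orders'", (new_wm,)
)
print(f"loaded {len(delta_rows)} rows; watermark now {new_wm}")
```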
Posted 2 weeks ago
2.0 - 5.0 years
4 - 6 Lacs
Vadodara
Work from Office
M3J seeks a Data Integration & Reporting Analyst to automate reports, cleanse data, and create KPI visualizations using Power BI and Excel. Develop data-driven apps for desktop, web, and mobile to drive growth. Join our team!

Required Candidate Profile: BS in CS, IE, Data Science, or equivalent. 3 yrs Power BI, DAX, Power Query. 5 yrs Excel. Strong SQL, data modeling, BI skills. Detail-oriented, analytical, proactive. Power BI cert & Fabric exp.

Perks and Benefits: Paid Time Off & Holidays
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Hyderabad, Telangana
On-site
The position of Microsoft Fabric Power BI Sr Developer is an offshore/India-based role that does not require in-office work. However, occasional visits to the Hyderabad office for knowledge transfer and relationship-building purposes are encouraged. The working schedule offers flexibility, with the majority of the team working from noon until 8:30 pm IST, or later, to accommodate meetings with US-based counterparts when necessary.

As a Microsoft Fabric Power BI Sr Developer, your responsibilities will include developing and designing comprehensive reports with millions of data points using Power BI. You will guide the team in utilizing the features of Microsoft Fabric for creating new reports and perform detailed data analysis to support decision-making processes. Efficient handling of large data sets to ensure data integrity and accuracy is crucial. Collaboration with stakeholders to understand data flow, identify improvement opportunities, and communicate findings and recommendations will be essential. Additionally, you will research and implement new solutions to enhance business intelligence capabilities, act as a subject matter expert on Power BI and Microsoft Fabric, and provide training and support to team members for seamless integration of Power BI reports within the Microsoft Fabric ecosystem.

The ideal candidate for this role should possess expertise as a Business Intelligence Developer with hands-on experience in Power BI, along with a minimum of 7-10 years of business data analysis experience, particularly in handling large data sets. Familiarity with Microsoft Fabric is essential, as Power BI is transitioning into Microsoft Fabric. Strong analytical, problem-solving, and data analysis skills are necessary, along with excellent communication abilities to facilitate collaboration between business and technical teams.

Nice-to-have skills for this position include knowledge of the US healthcare domain, Microsoft certifications such as Fabric Analytics Engineer Associate or Power BI Data Analyst Associate, familiarity with Agile methodologies, and a keen focus on innovation. Proficiency in English at the C2 level is required for effective communication in this role.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Fabric Professionals in the following areas:

**Position**: Data Analytics Lead
**Experience**: 8+ Years

**Responsibilities**:
- Build, manage, and foster a high-functioning team of data engineers and data analysts.
- Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
- Experience working with the manufacturing industry to build a centralized data platform for self-service reporting.
- Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
- Manage customer, partner, and internal data on the cloud and on-premises.
- Evaluate and understand current data technologies and trends, and promote a culture of learning.
- Build an end-to-end data strategy, from collecting requirements from the business to modeling the data and building reports and dashboards.

**Required Skills**:
- Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
- Accountability for the data group's activities, including architecting, developing, and maintaining a centralized data platform covering operational data, the data warehouse, the data lake, Data Factory pipelines, and data-related services.
- Experience designing and building operationally efficient pipelines using core Azure components such as Azure Data Factory, Azure Databricks, and PySpark.
- Strong understanding of data architecture, data modeling, and ETL processes.
- Proficiency in SQL and PySpark.
- Strong knowledge of building Power BI reports and dashboards.
- Excellent communication skills.
- Strong problem-solving and analytical skills.

**Required Technical/Functional Competencies**: Domain/Industry Knowledge, Requirement Gathering and Analysis, Product/Technology Knowledge, Architecture Tools and Frameworks, Architecture Concepts and Principles, Analytics Solution Design, Tools & Platform Knowledge.

**Accountability**: Takes responsibility for and ensures the accuracy of own work, as well as the work and deadlines of the team.

**Required Behavioral Competencies**: Collaboration, Agility, Customer Focus, Communication, Drives Results, Resolves Conflict.

**Certifications**: Mandatory.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
Posted 2 weeks ago
14.0 - 16.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Alpha Data Services - Securities Mastering SME - Vice President

We are seeking a highly skilled and motivated specialist to join our team. The ideal candidate will have a strong background in data modelling and a keen interest in ensuring the usability of models as deliverables. This role will involve working closely with various departments to develop and implement data models that support our business processes and enhance our service offerings.

State Street is making a multi-year strategic investment in Alpha to build platforms for data modelling and lineage that improve the usability of data sets for insight and analytics. As we look to increase our capabilities across Alpha Data Services (ADS), we need a strong manager to work with clients and prospects on new product development and on the execution and refinement of our multi-year strategy. This role reports to the EMEA lead for the ADS practice and is responsible for leading the product development programme, evolving the service offering (capabilities, services, and platform) starting with security and issuer data. The product offering spans front-office technology and traditional middle- and back-office asset and fund administration. The individual plays a key role in leading the development programme across, and in partnership with, the client-facing, product, global delivery (operations), technology, and Alpha teams. The role is a key voice in our overall private markets product and in helping to strengthen State Street's position in the industry.

Responsibilities
- In collaboration with the Alpha Data Services product domain leads, build and execute the data service product for State Street's Security Master solutions.
- Gather and track client feedback and requests and incorporate them into the product development roadmap.
- Manage the conception, development, and implementation of new products, capabilities, or services affecting security and issuer data end to end, partnering with other members of the Product team, Global Delivery, Technology, and the Alpha team.
- Oversee resources performing business analysis and requirements definition.
- Support annual planning and funding prioritization activities where required.
- Understand the firm's strategic plans, business, processes, and architecture, and apply that knowledge in the solution design and development phases.
- Act as subject matter expert for the security and issuer reference data products, and participate in and provide thought leadership for the product.

Skills
- Excellent written and verbal communication, influencing, and collaboration skills, including the ability to effectively present information and respond to questions from senior leaders at State Street, clients, and prospects.
- Demonstrated ability to lead the development life cycle and complex projects or client engagements, translating needs into requirements to drive outcomes.
- Comfortable discussing data operating models with external clients.
- Strong understanding of securities and issuer reference data.
- Ability to use Microsoft Fabric to integrate data on the desktop, and an understanding of and ability to draw data from Snowflake.
- Strong project management, analytical, and process transformation skills, including experience with agile development processes and tools.
- Ability to organize, prioritize, and balance key tasks and manage time effectively.
- Ability to lead as well as roll up your sleeves as needed.
- Comfortable facilitating project working groups including internal and external stakeholders.
- Self-motivated professional with the ability to work under pressure to meet deadlines and goals.

Education & Preferred Qualifications
- Minimum of 14 years' experience in financial services.
- Minimum of 10 years of alternatives experience in back, front, and/or middle office.
- Expertise in the use of securities reference data throughout an asset manager environment - order management, deal set-up, trading, setup, and middle-office platforms.
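As an illustration of the "draw data from Snowflake" skill above, a minimal sketch using the snowflake-connector-python package; the account, credentials, and security_master table are hypothetical:

```python
# A minimal sketch of pulling securities reference data from Snowflake.
# Account, credentials, schema, and table are hypothetical examples.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="REFDATA",
    schema="SECURITIES",
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT isin, issuer_name, asset_class FROM security_master WHERE active = TRUE"
    )
    # Inspect a handful of rows from the security master
    for isin, issuer_name, asset_class in cur.fetchmany(10):
        print(isin, issuer_name, asset_class)
finally:
    conn.close()
```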
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
surat, gujarat
On-site
The ideal candidate should possess expert-level proficiency in Power BI Desktop, Power BI Service, and Power BI Gateway, along with a deep understanding of DAX, Power Query, and data modeling techniques. Strong SQL skills are essential for data extraction, joins, and aggregations.

On the business intelligence side, you should have experience creating executive dashboards and operational reports. A keen eye for visual design and user experience, and the ability to tell a compelling story through data, are crucial skills for this role. Basic familiarity with Microsoft Fabric would be considered an additional advantage.

Collaboration and delivery are key aspects of this role, so the ability to work effectively in Agile/Scrum environments is necessary. You should also be proficient in requirement gathering, providing UAT support, and engaging in iterative development cycles.
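As a small, self-contained illustration of the SQL extraction, join, and aggregation skills mentioned above, the following sketch uses Python's built-in sqlite3 module; the tables and figures are invented for the example:

```python
# A self-contained demo of a join plus aggregation, the shape of query
# that typically feeds a Power BI dataset. Tables and values are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region_id INTEGER, amount REAL);
    CREATE TABLE regions (region_id INTEGER, region_name TEXT);
    INSERT INTO orders VALUES (1, 10, 250.0), (2, 10, 100.0), (3, 20, 75.5);
    INSERT INTO regions VALUES (10, 'North'), (20, 'South');
""")

# Join the fact table to its dimension and aggregate per region
rows = conn.execute("""
    SELECT r.region_name, COUNT(*) AS order_count, SUM(o.amount) AS total_amount
    FROM orders o
    JOIN regions r ON r.region_id = o.region_id
    GROUP BY r.region_name
""").fetchall()
print(rows)  # e.g. [('North', 2, 350.0), ('South', 1, 75.5)]
conn.close()
```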
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
Greetings from Ashra Technologies! We are currently looking to hire a skilled professional for the role of Azure Microsoft Fabric with at least 7 years of experience. This position is based in Pune, Mumbai, Chennai, or Bangalore.

Key Responsibilities & Requirements:
- Hands-on experience with Microsoft Fabric, specifically Lakehouse, Data Factory, and Synapse.
- Strong expertise in PySpark and Python for large-scale data processing and transformation.
- Deep knowledge of Azure data services such as ADLS Gen2, Azure Databricks, Synapse, ADF, and Azure SQL.
- Proven experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
- Familiarity with Azure infrastructure setup, including networking, security, and access management, would be beneficial.
- While not mandatory, domain knowledge in healthcare would be considered a plus.

If you meet the above requirements and are interested in this opportunity, please share your resume with us at akshitha@ashratech.com or contact us at 8688322632. Thank you!
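For illustration, the Fabric Lakehouse pipeline work described above might resemble this minimal PySpark sketch; the ADLS Gen2 path, column names, and Delta table are hypothetical:

```python
# A minimal sketch of landing raw files into a Lakehouse-style Delta table.
# The ADLS Gen2 path, columns, and table name are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fabric-lakehouse-ingest").getOrCreate()

# Land raw CSV files from ADLS Gen2 into a bronze DataFrame
bronze = spark.read.option("header", True).csv(
    "abfss://landing@example.dfs.core.windows.net/claims/"
)

# Basic cleansing before promoting to the silver layer
silver = (
    bronze.filter(F.col("claim_id").isNotNull())
          .withColumn("ingested_at", F.current_timestamp())
)

# Write as a Delta table (the default table format in Fabric Lakehouses);
# requires a Delta-enabled runtime such as Fabric or Databricks
silver.write.format("delta").mode("append").saveAsTable("silver_claims")
```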
Posted 2 weeks ago
1.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
Capgemini is looking for a dedicated and detail-oriented Strategy Integration & Reporting Analyst to join the team of a top-10 US insurance carrier. As a Strategy Integration & Reporting Analyst, your responsibilities will include assisting in strategic analysis and developing interactive reports, tools, and trackers using Power BI. Proficiency in creating data process flows in SQL, Python, Power Query, and Microsoft Fabric is advantageous. The ideal candidate should possess technical skills, keen attention to detail, and the ability to craft well-structured and visually engaging dashboards. Your role will involve tasks such as data manipulation, data validation, maintaining and updating reports and analyses, and conducting ad hoc analysis and report generation. The position offers an opportunity to work with one of the largest insurers in the United States, serving millions of households and offering a diverse range of insurance and financial services products.

Requirements:
- Total Work Experience: Minimum 1-3 years; Preferred 4-6 years
- Work Experience in This Field: No minimum required; Preferred 1-3 years
- English Proficiency: Intermediate level required; Fluent preferred
- Required Education: Bachelor's degree

Software / Tool Skills:
- Power BI (Nice to have)
- Semantic Model Design (Nice to have)
- DAX (Nice to have)
- SQL (Nice to have)
- Python (Nice to have)

Benefits: The position offers a competitive compensation and benefits package, including a competitive salary, performance-based bonuses, comprehensive benefits, career development opportunities, flexible work arrangements, private health insurance, a pension plan, paid time off, and training & development programs. The specific benefits may vary based on the employee's level within the organization.

About Capgemini: Capgemini is a global leader in technology transformation and business management, with a mission to unleash human energy through technology for an inclusive and sustainable future. With over 340,000 team members in more than 50 countries, Capgemini leverages its 55-year heritage and industry expertise to help clients address their business needs across various domains. The organization focuses on cloud, data, AI, connectivity, software, digital engineering, and platforms to drive innovation and growth. In 2023, Capgemini reported revenues of €22.5 billion.
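As an illustration of the data-validation work this role describes, a small pandas sketch; the frame and the rules are invented for the example:

```python
# A small sketch of flagging bad rows before they flow into a Power BI
# report; the DataFrame and validation rules are hypothetical examples.
import pandas as pd

policies = pd.DataFrame({
    "policy_id": [101, 102, 103, None],
    "premium":   [1200.0, -50.0, 980.0, 400.0],
    "state":     ["TX", "CA", "TX", "NY"],
})

# Flag rows that fail simple completeness and range checks
issues = policies[
    policies["policy_id"].isna() | (policies["premium"] <= 0)
]
print(f"{len(issues)} of {len(policies)} rows failed validation")
print(issues)
```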
Posted 2 weeks ago