Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Salesforce Data Cloud & Agentforce Solution Architect at NTT DATA in Hyderabad, Telangana (IN-TG), India, you will be responsible for designing, developing, and implementing AI-powered conversational experiences within the Salesforce platform. Your role will involve utilizing Agentforce capabilities to create automated customer interactions across various channels, requiring strong technical skills in Salesforce development and natural language processing (NLP) to build effective virtual agents. Your core responsibilities will include:
- Architecting and building data integration solutions using Salesforce Data Cloud to unify customer data from diverse sources
- Implementing data cleansing, matching, and enrichment processes to improve data quality
- Designing and managing data pipelines for efficient data ingestion, transformation, and loading
- Collaborating with cross-functional teams to understand business requirements and translate them into data solutions
- Monitoring data quality, identifying discrepancies, and taking corrective actions
- Establishing and enforcing data governance policies to maintain data consistency and compliance
To be successful in this role, you should have expertise in Salesforce Data Cloud features such as data matching, cleansing, enrichment, and data quality rules. You should also possess an understanding of data modeling concepts, proficiency in using Salesforce Data Cloud APIs and tools for data integration, knowledge of data warehousing concepts, and experience in implementing Salesforce Data Cloud for Customer 360 initiatives.
Additionally, you will be involved in:
- Designing and developing data integration solutions and managing data quality issues
- Collaborating with business stakeholders to define data requirements and KPIs
- Building and customizing Agentforce conversational flows and refining NLP models for accurate understanding of customer queries
- Monitoring Agentforce performance and integrating Agentforce with other Salesforce components
- Testing and deploying Agentforce interactions across various channels
Key skills to highlight on your resume include expertise in Salesforce administration and development, understanding of Salesforce architecture, deep knowledge of Agentforce features, familiarity with NLP techniques, proven ability in conversational design, skills in data analysis, and experience in designing, developing, and deploying solutions on the Salesforce Data Cloud platform.
At NTT DATA, a trusted global innovator of business and technology services, we are committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts, we provide services in business and technology consulting, data and artificial intelligence, industry solutions, application development, infrastructure management, and connectivity. Join us to be part of a global network dedicated to moving confidently and sustainably into the digital future.
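The data matching, cleansing, and deduplication work this role describes can be sketched in plain Python. This is purely illustrative: Salesforce Data Cloud expresses such logic through its own identity resolution and matching rules, not code like this, and the record fields and the 0.85 threshold below are invented for the example.

```python
from difflib import SequenceMatcher

def normalize(record):
    """Cleansing step: standardize fields before matching."""
    return {k: v.strip().lower() for k, v in record.items()}

def is_match(a, b, threshold=0.85):
    """Fuzzy-match two records on combined name + email similarity."""
    score = SequenceMatcher(None, a["name"] + a["email"],
                            b["name"] + b["email"]).ratio()
    return score >= threshold

def deduplicate(records):
    """Unification step: keep the first record of each matching cluster."""
    unified = []
    for rec in map(normalize, records):
        if not any(is_match(rec, kept) for kept in unified):
            unified.append(rec)
    return unified

records = [
    {"name": "Asha Rao ", "email": "ASHA@EXAMPLE.COM"},
    {"name": "asha rao", "email": "asha@example.com"},
    {"name": "Vikram Singh", "email": "vikram@example.com"},
]
print(len(deduplicate(records)))  # 2: the two Asha Rao records collapse into one
```

Real matching rules typically combine exact keys (normalized email, phone) with fuzzy comparisons like this one, rather than relying on string similarity alone.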
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
KLA is a global leader in diversified electronics for the semiconductor manufacturing ecosystem. Virtually every electronic device in the world is produced using our technologies. No laptop, smartphone, wearable device, voice-controlled gadget, flexible screen, VR device or smart car would have made it into your hands without us. KLA invents systems and solutions for the manufacturing of wafers and reticles, integrated circuits, packaging, printed circuit boards and flat panel displays. The innovative ideas and devices that are advancing humanity all begin with inspiration, research and development. KLA invests heavily in innovation, putting 15% of sales back into R&D. Our expert teams of physicists, engineers, data scientists and problem-solvers work together with the world's leading technology providers to accelerate the delivery of tomorrow's electronic devices. Life here is exciting and our teams thrive on tackling really hard problems. There is never a dull moment with us. The Information Technology (IT) group at KLA is involved in every aspect of the global business. IT's mission is to enable business growth and productivity by connecting people, process, and technology. It focuses not only on enhancing the technology that enables our business to thrive but also on how employees use and are empowered by technology. This integrated approach to customer service, creativity and technological excellence enables employee productivity, business analytics, and process excellence. As a Sr. Data Engineer on the Data Sciences and Analytics team, you will play a key role in KLA's data strategy principles and techniques. As part of the centralized analytics team, you will help analyze and surface key data insights into various business unit processes across the company. You will provide key performance indicators and dashboards to help business users and partners make business-critical decisions.
You will craft and develop analytical solutions by capturing business requirements and translating them into technical specifications, building data models and data visualizations.
Responsibilities:
- Design, develop and deploy Microsoft Fabric solutions, Power BI reports and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop data models and establish data connections to various data sources, applying expert knowledge of Microsoft Fabric architecture, deployment, and management.
- Optimize Power BI solutions for performance and scalability.
- Implement best practices for data visualization and user experience.
- Conduct code reviews and provide mentorship to junior developers.
- Manage permissions and workspaces in Power BI, ensuring a secure and efficient analytics platform.
- Conduct assessments and audits of existing Microsoft Fabric environments to identify areas for improvement.
- Stay current with the latest Fabric and Power BI features and updates.
- Troubleshoot and resolve issues related to Fabric objects, Power BI reports and data sources.
- Create detailed documentation, including design specifications, implementation plans, and user guides.
Minimum Qualifications:
- Doctorate (Academic) Degree and 0 years related work experience; Master's Level Degree and 3 years related work experience; or Bachelor's Level Degree and 5 years related work experience.
- Proven experience as a Power BI Developer, with a strong portfolio of Power BI projects.
- In-depth knowledge of Power BI, including DAX, Power Query, and data modeling.
- Experience with SQL and other data manipulation languages.
- In-depth knowledge of Microsoft Fabric and Power BI, including their components and capabilities.
- Strong understanding of Azure cloud computing, data integration, and data management.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Excellent technical problem-solving and performance optimization skills.
- Specialist in SQL and stored procedures, with data warehouse concepts.
- Experience performing ETL (Extract, Transform, Load) processes.
- Exceptional communication and interpersonal skills.
- Expert knowledge of cloud and big data concepts and tools: Azure, AWS, Data Lake, Snowflake, etc.
Nice to have:
- Extremely strong SQL skills.
- Foundational knowledge of metadata management, Master Data Management, data governance, and data analytics.
- Technical knowledge of Databricks/Data Lake/Spark/SQL.
- Experience configuring SSO (Single Sign-On), RBAC, and security roles on an analytics platform.
- SAP functional knowledge is a plus.
- Microsoft certifications related to Microsoft Fabric/Power BI or Azure/analytics are a plus.
- Good understanding of requirements and converting them into data warehouse solutions.
We offer a competitive, family friendly total rewards package. We design our programs to reflect our commitment to an inclusive environment, while ensuring we provide benefits that meet the diverse needs of our employees. KLA is proud to be an equal opportunity employer.
Be aware of potentially fraudulent job postings or suspicious recruiting activity by persons that are currently posing as KLA employees. KLA never asks for any financial compensation to be considered for an interview, to become an employee, or for equipment. Further, KLA does not work with any recruiters or third parties who charge such fees either directly or on behalf of KLA. Please ensure that you have searched KLA's Careers website for legitimate job postings. KLA follows a recruiting process that involves multiple interviews in person or on video conferencing with our hiring managers. If you are concerned that a communication, an interview, an offer of employment, or an employee is not legitimate, please send an email to talent.acquisition@kla.com to confirm the person you are communicating with is an employee.
We take your privacy very seriously and handle your information confidentially.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You are an experienced Databricks on AWS and PySpark Engineer looking to join our team. Your role will involve designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS and PySpark.
Your responsibilities will include:
- Designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS and PySpark
- Developing and optimizing data processing workflows using PySpark and Databricks
- Collaborating with data scientists and analysts to design and implement data models and architectures
- Ensuring data quality, security, and compliance with industry standards and regulations
- Troubleshooting and resolving data pipeline issues and optimizing performance
- Staying up-to-date with industry trends and emerging technologies in data engineering and big data
Requirements:
- 3+ years of experience in data engineering, with a focus on Databricks on AWS and PySpark
- Strong expertise in PySpark and Databricks, including data processing, data modeling, and data warehousing
- Experience with AWS services such as S3, Glue, and IAM
- Strong understanding of data engineering principles, including data pipelines, data governance, and data security
- Experience with data processing workflows and data pipeline management
Soft Skills:
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills
- Ability to work in a fast-paced, dynamic environment
- Ability to adapt to changing requirements and priorities
If you are a proactive and skilled professional with a passion for data engineering and a strong background in Databricks on AWS and PySpark, we encourage you to apply for this opportunity.
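A data-processing workflow of the kind this role describes, ingest, transform with a data-quality gate, then load, can be sketched in plain Python. This is a stdlib stand-in only: in the role itself this logic would live in PySpark DataFrame operations on Databricks, and the column names and quality rule below are invented.

```python
import csv
import io

RAW = """order_id,amount,country
1,120.50,IN
2,,US
3,89.99,IN
"""

def extract(text):
    """Ingest: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and drop rows failing the quality gate (missing amount)."""
    clean = []
    for row in rows:
        if not row["amount"]:  # data-quality check: reject incomplete records
            continue
        clean.append({**row, "amount": float(row["amount"])})
    return clean

def load(rows):
    """Load: here, simply aggregate revenue per country."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

print(load(transform(extract(RAW))))  # totals per country; the row with a missing amount is dropped
```

In PySpark the same shape would be roughly `spark.read.csv(...)`, a `filter`/`withColumn` stage, then `groupBy("country").sum("amount")`, with the quality gate often enforced by schema constraints instead of an explicit check.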
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
You should have more than 5 years of experience in database administration, including 4-5 years of hands-on experience. A Bachelor's or Master's degree in Computer Science or a related technical discipline is required.
Desired Skills and Experience:
- Proficiency in Oracle (Oracle 9i, 10g, and 11g) / MySQL database installation, upgrades, patching, and environment setup.
- Strong knowledge of Oracle / MySQL fundamentals.
- Experience in various types of backups and recoveries.
- Ability to troubleshoot any Oracle / MySQL issues.
- Extensive experience in creating high-performance normalized databases with correct indexes and partitioning.
- Handling large databases efficiently.
- Hands-on experience with Oracle stored procedures and functions.
- Design and implementation of database development, preferably Oracle / MySQL, and related technologies, applications, and processes.
- Proficiency in using software development tools.
- Familiarity with data architecture design, database design, data analysis, and data modeling.
- Proficiency in database and SQL performance tuning.
- Knowledge of PL/SQL, SQL*Plus, Discoverer, Oracle Enterprise Manager, and TOAD.
- Understanding of Oracle RAC (Real Application Clusters).
- Knowledge of system integration and performance analysis.
- Ability to work effectively in a fast-paced, multitasking, and results-focused environment.
- Strong interpersonal, analytical, and logical skills.
- Oracle / MySQL certification is preferred.
If you meet the above requirements and have the desired skills and experience, please apply for the position by sending your resume to hr@noduco.com.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
gautam buddha nagar, uttar pradesh
On-site
As a Reporting Analyst specializing in Power BI/Tableau, you will be responsible for managing all phases of the reporting life cycle. This includes efficiently handling business requests, from their initial logging to final delivery into production. Your role will involve tasks such as specification, prioritization, development, and quality control to ensure a seamless process.
Your strong analytical skills will be crucial in collecting, organizing, analyzing, and effectively disseminating significant amounts of information. Attention to detail and accuracy will be key as you work independently or as part of a team to design, develop, test, and implement dashboard reports. Collaboration with business and leadership teams is essential to ensure that the deliverables meet the requirements and quality standards. You will also be responsible for managing and coordinating software releases, overseeing the process from development and testing to deployment in production environments.
To excel in this role, you must have 1-3 years of experience in developing dashboards and reports using Tableau or Power BI. Additionally, a strong background in SQL and database programming is necessary, with proficiency in SQL Server, MySQL, Oracle, or other relational databases. Your technical expertise should encompass data models, database design and development, data mining, and segmentation techniques. Experience in performance tuning of reporting queries will be highly advantageous for this position.
Effective verbal and written communication skills are essential for successful collaboration with team members and stakeholders. If you are passionate about data analysis, reporting, and visualization, and possess the required technical skills and communication abilities, we encourage you to apply for this exciting opportunity.
Posted 1 week ago
21.0 - 25.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Title: Network Architect
Work Location: Bangalore, KA
Skills Required: Data Architecture and Modelling
Experience Range in Required Skills: 20 years and above
Job Description: VeloCloud SD-WAN, MPLS, BGP, global WAN routing, Cloud Connect (AWS, Azure). Design and implement Cisco Application Centric Infrastructure (ACI) solutions. Define ACI architecture and ensure seamless integration with existing network environments. Optimize network performance, scalability, and security. Collaborate with cross-functional teams to deliver efficient and secure network solutions. Strong expertise in Cisco ACI, SDN, and data center networking.
Posted 1 week ago
1.0 - 3.0 years
2 - 4 Lacs
Ahmedabad
Work from Office
Design, develop and maintain complex SQL queries, stored procedures, views, triggers and functions. Work on database design, normalization and schema creation to meet business needs. Optimize queries for performance, scalability, and maintainability.
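The kind of work listed above, normalized schema design, views, indexes, and query optimization, can be illustrated with a minimal self-contained sketch. It uses SQLite through Python's stdlib purely for portability; a production role would target SQL Server, Oracle, or MySQL syntax (which also add stored procedures and triggers, not shown here), and the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized schema: orders reference customers by key instead of repeating names.
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    amount REAL NOT NULL
);
-- Index the join/filter column so lookups avoid a full-table scan.
CREATE INDEX idx_orders_customer ON orders(customer_id);
-- A view encapsulating a reusable aggregate query.
CREATE VIEW customer_totals AS
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id;
""")

cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Mehta"), (2, "Iyer")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)])

print(cur.execute("SELECT * FROM customer_totals ORDER BY name").fetchall())
# [('Iyer', 75.0), ('Mehta', 150.0)]
```

The view keeps the aggregation logic in one place, and the index on `orders.customer_id` is the typical first step when tuning a join-heavy query.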
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Lead Consultant - Power BI Developer!
Responsibilities:
- Work within a team to identify, design and implement a reporting/dashboarding user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
- Gather query data from tables of the industry cognitive model/data lake and build data models with BI tools.
- Apply requisite business logic using data transformation and DAX.
- Understanding of Power BI data modelling and various in-built functions.
- Knowledge of report sharing through Workspace/App, access management, dataset scheduling and Enterprise Gateway.
- Understanding of static and dynamic row-level security.
- Ability to create wireframes based on user stories and business requirements.
- Basic understanding of ETL and data warehousing concepts.
- Conceptualize and develop industry-specific insights in the form of dashboards/reports/analytical web applications to deliver pilots/solutions following best practices.
Qualifications we seek in you!
Minimum Qualifications: Graduate
Why join Genpact?
- Lead AI-first transformation: build and scale AI solutions that redefine industries.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.
Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 week ago
12.0 - 15.0 years
5 - 9 Lacs
Vadodara
Remote
Location: Remote.
Experience: 12+ years.
Role Expectations:
- Design, develop, and maintain interactive dashboards and reports using Power BI.
- Create and optimize data models, including star/snowflake schemas.
- Develop complex DAX calculations and KPIs to drive business insights.
- Integrate data from various sources (SQL Server, Excel, SharePoint, APIs, etc.).
- Collaborate with business stakeholders to gather requirements and translate them into technical solutions.
- Perform data validation and troubleshooting to ensure accuracy and performance.
- Implement row-level security and user-based data access strategies.
- Provide guidance on Power BI governance, best practices, and self-service analytics models.
- Maintain data refresh schedules and monitor report performance.
- Work with cross-functional teams including data engineers, analysts, and business users.
Qualifications:
- 12+ years of hands-on Power BI development.
- Advanced DAX and data modeling expertise.
- Experience with Power Query/M language.
- Strong dashboard and visualization design skills.
- Power BI Service and Power BI Report Server knowledge.
- Row-level security and gateway configurations.
- Familiarity with the Power BI REST API and embedding techniques.
- Proficient in writing complex T-SQL queries, stored procedures, views, and functions.
- Data extraction, transformation, and integration using SQL.
- Experience with large datasets and performance tuning.
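Row-level security, which appears in both the expectations and qualifications above, boils down to filtering a dataset by the viewer's identity before any report sees it. In Power BI this is expressed as DAX role filters (for example, a rule comparing a region column against `USERPRINCIPALNAME()` via a security mapping table). The Python below is only a conceptual stand-in for that filtering principle; the table contents and user names are invented.

```python
# Security table mapping each user to the regions they may see.
# In Power BI this would be a mapping table joined to the fact table by a role filter.
USER_REGIONS = {
    "asha@example.com": {"North", "South"},
    "vikram@example.com": {"North"},
}

SALES = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "West",  "amount": 75},
]

def rows_for(user):
    """Apply the row-level filter: keep only rows in the user's allowed regions."""
    allowed = USER_REGIONS.get(user, set())  # unknown users see nothing
    return [row for row in SALES if row["region"] in allowed]

print(sum(r["amount"] for r in rows_for("vikram@example.com")))  # 100
print(sum(r["amount"] for r in rows_for("asha@example.com")))    # 350
```

The key design point, mirrored in dynamic RLS, is that the filter is applied centrally at the data layer, so every report and visual inherits it rather than re-implementing access checks.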
Posted 1 week ago
10.0 - 16.0 years
25 - 40 Lacs
Gurugram, Bengaluru, Delhi / NCR
Work from Office
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com
Position Title: Associate Director (Senior Architect - Data)
Department: IT
Location: Gurgaon/Bangalore
Job Summary
The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects.
Key Responsibilities
1. Strategy & Planning
- Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
- Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are aligned with regulatory compliance.
- Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects' goals.
- Ensure effective data management throughout the project lifecycle.
2. Acquisition & Deployment
- Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.).
- Liaise with vendors and service providers to select the products or services that best meet company goals.
3. Operational Management
- Assess and determine governance, stewardship, and frameworks for managing data across the organization.
- Develop and promote data management methodologies and standards.
- Document information products from business processes and create data entities.
- Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
- Create data normalization across all systems and databases to ensure there is a common definition of data entities across the enterprise.
- Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
- Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
- Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Collaborate with project managers and business unit leaders for all projects involving enterprise data.
- Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
- Act as a leader and advocate of data management, including coaching, training, and career development to staff.
- Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture.
- Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
- Identify and develop opportunities for data reuse, migration, or retirement.
4. Data Architecture Design
- Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes.
- Design and implement scalable, high-performance data solutions that meet business requirements.
5. Data Governance
- Establish and enforce data governance policies and procedures as agreed with stakeholders.
- Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems.
6. Data Migration
- Oversee the data migration process from legacy systems to the new systems being put in place.
- Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness.
7. Master Data Management
- Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes.
- Provide data management (create, update and delimit) methods to ensure master data is governed.
8. Stakeholder Collaboration
- Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements.
- Ensure the enterprise system meets the organization's data needs.
9. Training and Support
- Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems.
- Promote user adoption and proper use of data.
10. Data Quality Assurance
- Implement data quality assurance measures to identify and correct data issues.
- Ensure that Oracle Fusion and other enterprise systems contain reliable and up-to-date information.
11. Reporting and Analytics
- Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems.
- Enable data-driven decision-making through robust data analysis.
12. Continuous Improvement
- Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems.
- Leverage new technologies for enhanced data management to support evolving business needs.
Technology and Tools:
- Oracle Fusion Cloud
- Data modeling tools (e.g., ER/Studio, ERwin)
- ETL tools (e.g., Informatica, Talend, Azure Data Factory)
- Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue.
- Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached
- Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM)
- Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP)
- Hyperscalers / cloud platforms (e.g., AWS, Azure)
- Big data technologies such as Hadoop, HDFS, MapReduce, and Spark
- Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3) and Microsoft Azure services like Azure SQL Database and Cosmos DB, plus experience with Google Cloud Platform services such as BigQuery and Cloud Storage
- Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere):
  - SQL: strong SQL skills for querying and managing databases.
  - Python: proficiency in Python for data manipulation and analysis.
  - Java: knowledge of Java for building data-driven applications.
- Data security and protocols: understanding of data security protocols and compliance standards.
Key Competencies and Qualifications
Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree preferred.
Experience:
- 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design.
- Proven experience with data warehousing, data lakes, and big data technologies.
- Expertise in SQL and experience with NoSQL databases.
- Experience with cloud platforms (e.g., AWS, Azure) and related data services.
- Experience with Oracle Fusion or similar ERP systems is highly desirable.
Skills:
- Strong understanding of data governance and data security best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work effectively in a collaborative team environment.
- Leadership experience with a track record of mentoring and developing team members.
- Excellent documentation and presentation skills.
- Good knowledge of applicable data privacy practices and laws.
Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data - Specialty) are a plus.
Behavioral:
- A self-starter, an excellent planner and executor and, above all, a good team player.
- Excellent communication and interpersonal skills are a must.
- Must possess organizational skills, including multi-tasking capability, priority setting and meeting deadlines.
- Ability to build collaborative relationships and effectively leverage networks to mobilize resources.
- Initiative to learn the business domain is highly desirable.
- Enjoys a dynamic and constantly evolving environment and requirements.
Posted 1 week ago
5.0 - 10.0 years
15 - 20 Lacs
Bengaluru
Work from Office
About the Role We are seeking a Staff Software Engineer to lead Growth Data Platform initiatives for our client. This role is ideal for someone with strong hands-on experience, deep technical expertise, and a track record of delivering production-grade healthcare platform services on AWS. You'll work closely with MarTech, Growth Marketing, and Enrollment Marketing teams to design and develop scalable, compliant solutions. You will also mentor engineers, contribute to platform architecture, and champion CI/CD and DevOps best practices. Key Responsibilities Lead the design and architecture of scalable platform projects Streamline and maintain CI/CD pipelines for company applications Build and maintain high availability systems (target: 99.99% uptime) Collaborate across pods to tackle complex features and architecture upgrades Drive cost-effective AWS infrastructure automation and adoption Act as mentor and technical advisor, overseeing design and development practices Execute impactful changes to workflows and tools on a quarterly basis
Posted 1 week ago
12.0 - 15.0 years
5 - 9 Lacs
Pimpri-Chinchwad
Remote
Role Expectations : - Design, develop, and maintain interactive dashboards and reports using Power BI. - Create and optimize data models, including star/snowflake schemas. - Develop complex DAX calculations and KPIs to drive business insights. - Integrate data from various sources (SQL Server, Excel, SharePoint, APIs, etc.) - Collaborate with business stakeholders to gather requirements and translate them into technical solutions. - Perform data validation and troubleshooting to ensure accuracy and performance. - Implement row-level security and user-based data access strategies. - Provide guidance on Power BI governance, best practices, and self-service analytics models. - Maintain data refresh schedules and monitor report performance. - Work with cross-functional teams including data engineers, analysts, and business users. Qualifications : - 12+ years of hands-on Power BI development. - Advanced DAX and data modeling expertise. - Experience with Power Query/M Language. - Strong dashboard and visualization design skills. - Power BI Service and Power BI Report Server knowledge. - Row-level security and gateway configurations. - Familiarity with Power BI REST API and embedding techniques. - Proficient in writing complex T-SQL queries, stored procedures, views, and functions. - Data extraction, transformation, and integration using SQL. - Experience with large datasets and performance tuning.
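The row-level security responsibility above can be sketched outside any BI tool as a simple role-to-filter mapping. The roles, regions, and data here are hypothetical; in Power BI itself, RLS is configured with DAX table filters rather than Python, but the filtering semantics are the same:

```python
# Row-level security sketch (hypothetical roles and regions): each user sees
# only the rows their role permits, mirroring RLS filters in a BI model.
ROLE_REGIONS = {"emea_analyst": {"EMEA"}, "global_admin": {"EMEA", "APAC", "AMER"}}

sales = [{"region": "EMEA", "amount": 10}, {"region": "APAC", "amount": 20}]

def visible_rows(role, rows):
    allowed = ROLE_REGIONS.get(role, set())  # unknown roles see nothing
    return [r for r in rows if r["region"] in allowed]

print(len(visible_rows("emea_analyst", sales)))  # 1
print(len(visible_rows("global_admin", sales)))  # 2
```

Defaulting unknown roles to an empty set (deny by default) is the usual safe choice when designing user-based data access strategies.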
Posted 1 week ago
12.0 - 15.0 years
5 - 9 Lacs
Jaipur
Remote
Role Expectations : - Design, develop, and maintain interactive dashboards and reports using Power BI. - Create and optimize data models, including star/snowflake schemas. - Develop complex DAX calculations and KPIs to drive business insights. - Integrate data from various sources (SQL Server, Excel, SharePoint, APIs, etc.) - Collaborate with business stakeholders to gather requirements and translate them into technical solutions. - Perform data validation and troubleshooting to ensure accuracy and performance. - Implement row-level security and user-based data access strategies. - Provide guidance on Power BI governance, best practices, and self-service analytics models. - Maintain data refresh schedules and monitor report performance. - Work with cross-functional teams including data engineers, analysts, and business users. Qualifications : - 12+ years of hands-on Power BI development. - Advanced DAX and data modeling expertise. - Experience with Power Query/M Language. - Strong dashboard and visualization design skills. - Power BI Service and Power BI Report Server knowledge. - Row-level security and gateway configurations. - Familiarity with Power BI REST API and embedding techniques. - Proficient in writing complex T-SQL queries, stored procedures, views, and functions. - Data extraction, transformation, and integration using SQL. - Experience with large datasets and performance tuning.
Posted 1 week ago
5.0 - 10.0 years
8 - 18 Lacs
Pune, Delhi / NCR
Hybrid
Skills Required: Automation Testing, ETL, SQL Queries, Data Warehousing, Data Modeling 5 to 10 years of experience as a test engineer, especially in ETL Testing. Corporate Banking knowledge/experience would be an added advantage. Proven experience in test plan design Understanding of the software development lifecycle and the deliverables created during the development lifecycle Strong analytical skills, creative and critical thinking ability, and problem-solving skills Familiarity with relevant quality assurance industry-standard best practices and methodologies Dedication to customer satisfaction Excellent communication skills Problem-solving skills Excellent time management skills Experience in creating test solutions. Use of test management tools similar to JIRA. Understanding of executing test automation and interpreting results. Proficient in Windows technologies along with Office (Excel, PowerPoint) and Adobe. Proficient in all forms of functional testing across all browsers and devices. Knowledge of ETL tools.
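A source-to-target reconciliation check is one of the basic ETL test patterns this role calls for: compare row counts and an order-independent checksum between the extract and the load. The data and key/value layout below are hypothetical:

```python
# Hypothetical ETL reconciliation check: compare row counts and a per-key
# checksum between a "source" extract and the "target" load.
from hashlib import sha256

source = [("c1", 100), ("c2", 250), ("c3", 75)]
target = [("c1", 100), ("c2", 250), ("c3", 75)]

def checksum(rows):
    # Order-independent digest so the comparison tolerates load order.
    return sha256("".join(sorted(f"{k}:{v}" for k, v in rows)).encode()).hexdigest()

count_ok = len(source) == len(target)
sum_ok = checksum(source) == checksum(target)
print(count_ok and sum_ok)  # True
```

In practice the two row sets would come from SQL queries against the source and target systems; automating this comparison is what turns manual ETL verification into repeatable regression tests.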
Posted 1 week ago
12.0 - 15.0 years
5 - 9 Lacs
Thane
Work from Office
Location : Remote. Experience : 12+ years. Role Expectations : - Design, develop, and maintain interactive dashboards and reports using Power BI. - Create and optimize data models, including star/snowflake schemas. - Develop complex DAX calculations and KPIs to drive business insights. - Integrate data from various sources (SQL Server, Excel, SharePoint, APIs, etc.) - Collaborate with business stakeholders to gather requirements and translate them into technical solutions. - Perform data validation and troubleshooting to ensure accuracy and performance. - Implement row-level security and user-based data access strategies. - Provide guidance on Power BI governance, best practices, and self-service analytics models. - Maintain data refresh schedules and monitor report performance. - Work with cross-functional teams including data engineers, analysts, and business users. Qualifications : - 12+ years of hands-on Power BI development. - Advanced DAX and data modeling expertise. - Experience with Power Query/M Language. - Strong dashboard and visualization design skills. - Power BI Service and Power BI Report Server knowledge. - Row-level security and gateway configurations. - Familiarity with Power BI REST API and embedding techniques. - Proficient in writing complex T-SQL queries, stored procedures, views, and functions. - Data extraction, transformation, and integration using SQL. - Experience with large datasets and performance tuning.
Posted 1 week ago
12.0 - 15.0 years
5 - 9 Lacs
Mumbai
Work from Office
Location : Remote. Experience : 12+ years. Role Expectations : - Design, develop, and maintain interactive dashboards and reports using Power BI. - Create and optimize data models, including star/snowflake schemas. - Develop complex DAX calculations and KPIs to drive business insights. - Integrate data from various sources (SQL Server, Excel, SharePoint, APIs, etc.) - Collaborate with business stakeholders to gather requirements and translate them into technical solutions. - Perform data validation and troubleshooting to ensure accuracy and performance. - Implement row-level security and user-based data access strategies. - Provide guidance on Power BI governance, best practices, and self-service analytics models. - Maintain data refresh schedules and monitor report performance. - Work with cross-functional teams including data engineers, analysts, and business users. Qualifications : - 12+ years of hands-on Power BI development. - Advanced DAX and data modeling expertise. - Experience with Power Query/M Language. - Strong dashboard and visualization design skills. - Power BI Service and Power BI Report Server knowledge. - Row-level security and gateway configurations. - Familiarity with Power BI REST API and embedding techniques. - Proficient in writing complex T-SQL queries, stored procedures, views, and functions. - Data extraction, transformation, and integration using SQL. - Experience with large datasets and performance tuning.
Posted 1 week ago
12.0 - 15.0 years
5 - 9 Lacs
Patna
Remote
Role Expectations : - Design, develop, and maintain interactive dashboards and reports using Power BI. - Create and optimize data models, including star/snowflake schemas. - Develop complex DAX calculations and KPIs to drive business insights. - Integrate data from various sources (SQL Server, Excel, SharePoint, APIs, etc.) - Collaborate with business stakeholders to gather requirements and translate them into technical solutions. - Perform data validation and troubleshooting to ensure accuracy and performance. - Implement row-level security and user-based data access strategies. - Provide guidance on Power BI governance, best practices, and self-service analytics models. - Maintain data refresh schedules and monitor report performance. - Work with cross-functional teams including data engineers, analysts, and business users. Qualifications : - 12+ years of hands-on Power BI development. - Advanced DAX and data modeling expertise. - Experience with Power Query/M Language. - Strong dashboard and visualization design skills. - Power BI Service and Power BI Report Server knowledge. - Row-level security and gateway configurations. - Familiarity with Power BI REST API and embedding techniques. - Proficient in writing complex T-SQL queries, stored procedures, views, and functions. - Data extraction, transformation, and integration using SQL. - Experience with large datasets and performance tuning.
Posted 1 week ago
6.0 - 11.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Role: Tech Lead Data Engineer Location: Bengaluru What you'll do We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, and as a Tech Lead, Data Engineering, you'll be right at the center of it. This role combines technical excellence, hands-on problem-solving, and engineering leadership to help us build optimized, scalable, and cost-efficient data products and pipelines across our platforms. You will lead by example, guiding a team of data engineers to build and maintain robust data systems while collaborating with cross-functional teams to deliver high-impact business outcomes. This role goes beyond writing code: you'll be setting standards, making architectural decisions, mentoring team members, and driving technical innovation across the organization. Who are your stakeholders You'll work closely with: Data Engineering team members along with Principal Engineers and Data Architects Product Managers and Product Leads across global business units Engineering Managers and cross-team leads (DevOps, Data Science, Analytics) Internal data consumers like Analysts, Business Intelligence, and ML teams Platform teams responsible for cost optimization and infrastructure scalability What you'll bring 6+ years of experience in Data Engineering, with 2+ years in a technical or team leadership role Strong expertise in data architecture, data modeling, distributed computing, and modern data platforms Proficiency with big data ecosystems like Spark, Databricks, Kafka, Presto, and Delta Lake Strong coding skills in Python, Scala, or Java, and a solid understanding of software engineering principles Expertise in SQL/NoSQL, data & delta lakes, and cloud-native services (preferably AWS or GCP) Proven track record in building production-grade ETL pipelines and event-driven systems Experience leading POCs, evaluating tools, and setting engineering best practices across teams Familiarity with DataOps, DevOps, and CI/CD, including tools for monitoring, alerting, and observability A passion for mentoring and developing engineers, and a love for collaborative problem-solving We've highlighted some key skills, experience, and requirements for this role. But please don't worry if you don't meet every single one. Our talent team strives to find the best people. They might see something in your background that's a fit for this role, or another opportunity at MiQ. If you have a passion for the role, please still apply. What impact will you create As a Tech Lead, your mission is to set direction, not just velocity. You will: Drive the architectural vision for scalable, cost-optimized, and performant data systems Design, review, and guide the implementation of complex pipelines and data products Introduce and champion engineering best practices, design patterns, and quality standards Act as a force multiplier by mentoring engineers and helping them level up Represent the team in technical forums and work closely with leadership on strategic initiatives Bring a product mindset to engineering and advocate for the why as much as the how Keep your hands dirty: build things, break things (responsibly), and help your team learn from both What's in it for you At MiQ, you'll be empowered to influence technical strategy at a global scale. You'll be joining a diverse and dynamic team in our Center of Excellence, where innovation and experimentation are encouraged and celebrated. Your work will not only power our platform but shape the future of data-driven programmatic advertising. You'll be part of a culture that champions curiosity, autonomy, and continuous learning while having a lot of fun doing it. Values Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better.
Our values are there to be embraced by everyone, so that we naturally live and breathe them. Just like inclusivity, our values flow through everything we do, no matter how big or small. We do what we love - Passion We figure it out - Determination We anticipate the unexpected - Agility We always unite - Unite We dare to be unconventional - Courage Benefits Every region and office has specific perks and benefits, but every person joining MiQ can expect: A hybrid work environment New hire orientation with job-specific onboarding and training Internal and global mobility opportunities Competitive healthcare benefits Bonus and performance incentives Generous annual PTO, paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives. Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities. Apply today! Equal Opportunity Employer
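The production-grade, event-driven pipelines this role mentions typically require idempotent consumers: processing the same event twice must not double-count. A minimal sketch, with a hypothetical event shape and in-memory state standing in for Kafka plus a real store:

```python
# Idempotent event consumer sketch: deduplicate on a stable event id so
# duplicate deliveries (common with at-least-once messaging) are harmless.
processed_ids = set()
totals = {}

def handle(event):
    if event["id"] in processed_ids:   # already seen: skip, don't double-count
        return
    processed_ids.add(event["id"])
    totals[event["key"]] = totals.get(event["key"], 0) + event["value"]

for e in [{"id": 1, "key": "a", "value": 5},
          {"id": 1, "key": "a", "value": 5},   # duplicate delivery
          {"id": 2, "key": "a", "value": 3}]:
    handle(e)

print(totals)  # {'a': 8}
```

In a real pipeline the dedupe set and totals would live in a durable store and the id check plus update would be atomic; the design point, exactly-once effects on top of at-least-once delivery, is the same.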
Posted 1 week ago
8.0 - 13.0 years
40 - 45 Lacs
Hyderabad
Work from Office
About the job: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions. Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives. As part of the Digital M&S Foundations organization, the data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational and dimensional databases. These solutions support Manufacturing and Supply data and analytical products and other business interests. What you will be doing: Be responsible for the development of the conceptual, logical, and physical data models in line with the architecture and platforms strategy Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
The candidate must be able to work independently and collaboratively with the M&S teams Demonstrate strong expertise in one of the following functional business areas of M&S: Manufacturing, Quality, or Supply Chain Main Responsibilities Design and implement business data models in line with data foundations strategy and standards Work with business and application/solution teams to understand requirements, build data flows, and develop conceptual/logical/physical data models Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, and analytic models. Hands-on data modeling, design, configuration, and performance tuning Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Skills Bachelor's or master's degree in computer science, data engineering, or a related technical field, or equivalent experience. 8+ years of hands-on relational, dimensional, and/or analytic experience, including 5+ years of hands-on experience with data from core manufacturing and supply chain systems such as SAP, Quality Management, LIMS, MES, and Planning Hands-on programming experience in SQL Experience with data warehouses (Snowflake), data lakes (AWS-based), and enterprise big data platforms in a pharmaceutical company. Good knowledge of metadata management, data modeling, and related tools: Snowflake, Informatica, DBT Experience with Agile Good communication and presentation skills Why choose us Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks of gender-neutral parental leave. Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas. Pursue Progress. Discover Extraordinary. Progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas, and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
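The dimensional-modeling work this role describes often involves Type 2 slowly changing dimensions: when a tracked attribute changes, the current row is closed out and a new versioned row is inserted, preserving history. A minimal in-memory sketch with hypothetical columns and keys:

```python
# Type 2 slowly changing dimension sketch (hypothetical columns): close the
# current row and append a new version when a tracked attribute changes.
from datetime import date

dim = [{"key": 1, "site": "S1", "status": "active",
        "valid_from": date(2024, 1, 1), "valid_to": None, "current": True}]

def scd2_update(dim, site, new_status, as_of):
    for row in dim:
        if row["site"] == site and row["current"] and row["status"] != new_status:
            row["valid_to"], row["current"] = as_of, False   # close current row
            dim.append({"key": max(r["key"] for r in dim) + 1, "site": site,
                        "status": new_status, "valid_from": as_of,
                        "valid_to": None, "current": True})  # new version
            return

scd2_update(dim, "S1", "inactive", date(2025, 1, 1))
print(len(dim), dim[-1]["status"])  # 2 inactive
```

In a warehouse such as Snowflake this would be a MERGE against the dimension table; the validity-interval and current-flag columns are what let fact rows join to the attribute values that were true at load time.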
Posted 1 week ago
0.0 - 4.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Newmark is looking for an Analyst 1 - Gerald Eve to join our dynamic team and embark on a rewarding career journey. Collect, analyze, and interpret data from various sources to support business decisions and strategy development. Prepare detailed reports, dashboards, and visualizations that highlight trends, patterns, and actionable insights. Collaborate with cross-functional teams to understand data requirements and deliver accurate analytical solutions. Use statistical methods and data modeling techniques to solve business problems and improve processes. Validate data integrity and ensure accuracy in all analyses and reports. Monitor key performance indicators (KPIs) and provide regular updates to management with recommendations.
Posted 1 week ago
10.0 - 20.0 years
25 - 40 Lacs
Noida, Hyderabad/Secunderabad, Bangalore/Bengaluru
Hybrid
Dear candidate, We found your profile suitable for our current opening; please go through the below JD for a better understanding of the role. Job Description: Role: Technical Architect / Senior TA Exp: 10 - 15 years Employment Type: Full-time Mode of work: Hybrid Model (3 days WFO) Work Location: Hyderabad/Bangalore/Noida/Pune/Kolkata Role Overview: We are looking for a skilled Business Analyst with strong domain experience in Real Estate Investment Trusts (REITs), specifically in Mortgage-Backed Securities (MBS) and/or Capital Allocation. The ideal candidate will have exposure to data platforms or application development and be capable of translating business needs into actionable insights and technical requirements. Key Responsibilities: Understand and analyze business processes in the REIT domain (MBS, capital allocation). Collaborate with stakeholders to gather and document requirements. Define domain models and mappings for data platforms. Work closely with data engineering and application development teams. Support product and platform enhancements through data-driven insights. Participate in stakeholder meetings, L2 interviews, and managerial discussions. Required Skills: Strong domain knowledge in REITs: MBS and/or Capital Allocation. Experience as a Business Analyst or in a similar analytical role. Exposure to data platforms or application development projects. Ability to define domain models and mappings. Good understanding of SQL (basic knowledge is acceptable; writing skills are preferred). Excellent communication and documentation skills. Preferred Qualifications: Experience working in financial services, investment platforms, or real estate analytics. Familiarity with tools like Power BI, Tableau, Snowflake, or similar. Comfortable working in remote teams and cross-functional environments. How to Apply: Please share your updated resume highlighting relevant REIT domain experience, BA skills, and exposure to data platforms or app development.
Please check below link for organisation details, https://www.tavant.com/ If interested , please drop your resume to dasari.gowri@tavant.com Regards Dasari Krishna Gowri Associate Manager - HR www.tavant.com
Posted 1 week ago
4.0 - 10.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Responsibilities Requirement Analysis: Collaborate with clients and stakeholders to gather and understand business requirements. Translate business needs into technical specifications for MicroStrategy BI solutions. MicroStrategy Development: Design and develop MicroStrategy reports, dashboards, and interactive visualizations. Utilize MicroStrategy features to create efficient and user-friendly BI solutions. Data Modeling: Define and implement data models that support reporting and analytics requirements. Ensure data accuracy, integrity, and optimal performance within MicroStrategy. Performance Optimization: Optimize MicroStrategy reports and queries for improved performance. Identify and implement best practices to enhance overall system efficiency. Client Collaboration: Work closely with clients to demonstrate MicroStrategy capabilities and gather feedback. Provide training and support to end-users to ensure effective use of MicroStrategy solutions. Integration: Integrate MicroStrategy with various data sources and third-party applications as needed. Collaborate with IT teams to ensure seamless data flow between systems. Security Implementation: Design and implement security models within the MicroStrategy environment. Define user roles, access controls, and data security measures. Documentation: Create and maintain documentation for MicroStrategy solutions, configurations, and best practices. Ensure knowledge transfer and documentation for future reference. Technology Evaluation: Stay updated on the latest MicroStrategy features and updates. Evaluate and recommend new technologies to enhance BI capabilities. Qualifications: Bachelor's degree in Computer Science, Information Technology, or related field. Proven experience as a MicroStrategy Consultant with expertise in MicroStrategy architecture and development. Strong understanding of BI concepts, data modeling, and data warehousing. Proficient in SQL, with the ability to write complex queries for data analysis.
Excellent problem-solving and analytical skills. Strong communication and interpersonal skills for client interactions. Preferred Skills: 1. MicroStrategy certification is a plus. 2. Experience with other BI tools such as Tableau, Power BI, or QlikView. 3. Knowledge of data visualization best practices. 4. Familiarity with ETL processes and tools. Good to have: One of the following certifications to be considered: MicroStrategy Certified Master Analyst (MCMA) Certification, MicroStrategy Certified Specialist Developer (MCSD) Certification, MicroStrategy Certified Master Developer (MCSD) Certification, MicroStrategy Certified Developer (MCD) Certification.
Posted 1 week ago
3.0 - 7.0 years
15 - 16 Lacs
Chennai
Work from Office
Jun 23, 2025 Location: Chennai Designation: Senior Consultant Entity: Deloitte Touche Tohmatsu India LLP What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential. The Team Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning. Learn more about the Analytics and Information Management Practice. Primary Skills: Blend of skills in business processes and technology as well as strong hands-on experience in the Microsoft PowerApps platform. This position will be customer-facing and responsible for delivering business transformation & technology projects. Deep understanding and demonstrated hands-on experience with PowerApps (Canvas, Portal, SharePoint Form Apps, and Model-Driven Apps), Power Automate Cloud Flows & Desktop, Power BI, AI Builder, and Copilot Studio. Expertise in implementing Power Automate Flows and Power Automate Desktop (Automated, Instant, Business Process Flow, and UI Flows). Experience in technical documentation, including solution design architecture, design specifications, and technical standards.
Problem-solving mindset with the ability to analyze complex data-related challenges and devise effective solutions. Project management experience, including scope definition, timeline management, and resource allocation. Knowledge of AI agents, agent frameworks, and working with large language models (LLMs). Experience with Azure AI Foundry and Azure AI services. Secondary Skills: Proven experience in implementing and managing CI/CD pipelines in Azure (Azure DevOps, GitHub Actions, etc.). Strong expertise in Azure Data Factory and related Azure data services. Hands-on experience with Azure OpenAI services, Azure AI Foundry, and Semantic Kernel. Proficient in creating data pipelines for ETL/ELT processes. Solid understanding of relational databases, data modeling, and SQL optimization. Hands-on experience in creating solutions and custom connectors. Experience with cloud computing technologies, including Azure Key Vault, Service Principals, App Principals, and App Registration. Certifications: Power Platform PL-900, PL-400, PL-200, PL-100, PL-600 Azure AZ-900, DP-900, AZ-104, DP-300 Azure AI AI-900, AI-102 Your role as a leader At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and make an impact that matters. In addition to living our purpose, Senior Consultants across our organization: Develop high-performing people and teams through challenging and meaningful opportunities Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction.
Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make. How you will grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our purpose Deloitte is led by a purpose: To make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world. Recruiter tips We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.
Posted 1 week ago
7.0 - 10.0 years
5 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
This role is part of MUFG Retirement Solutions Data Technology domain and is responsible for delivering enterprise-wide data solutions. The individual will be responsible for designing and developing data solutions which support our internal and external customers. This role requires a mix of creativity, attention to detail, and a curiosity to find the best data solution for our customers. Key Accountabilities and main responsibilities Strategic Focus Contribute to the development of the data engineering capability. Apply software engineering best practices and database design to build scalable data pipelines, data integrations, and data models for use in applications and reports. Transform raw data into clean, accurate, and reusable datasets with a combination of technical expertise and business acumen to seamlessly integrate data, thus facilitating data-driven decision-making throughout the organization. Responsible for data curation: translating data needs from stakeholders into architecting, building, and maintaining efficient & reliable data models and pipelines. Determining business needs and drivers through stakeholder consultation and collaboration and translating them into actionable requirements for data products. Keep up to date with latest trends and best practices in data technology. Operational Management Design, develop, and implement data solutions using our MS stack tools. Executing against the Data and Analytics roadmap, including optimisation of the existing capability as well as introduction of new data capabilities. Utilize data modeling techniques to structure data for efficient analysis. Apply various transformations to ensure data accuracy, including removing inaccurate or corrupted data and aggregating data items. Maintain comprehensive data documentation to ensure a common understanding of data definitions and language across the team. Apply standard practices, including version control, unit testing, and continuous integration.
- Collaborate with stakeholders and cross-functional teams to identify business opportunities and enhance strategies with automated data solutions.
- Partner closely with other engineers and analysts to improve foundational data models and accelerate the productization of data insights.

Governance & Risk
- Ensure timely completion of activities by adhering to the agreed processes and quality standards.
- Adhere to data design best practices.
- Adhere to MUFG's standards, policies, and procedures.

The above list of key accountabilities is not exhaustive and may change from time to time based on business needs.

Experience & Personal Attributes

Experience
- 7-10 years of overall development experience, with advanced SQL skills and the ability to develop efficient queries of varying complexity.
- Minimum 3-5 years of data engineering experience in SQL database development and end-to-end ETL processes.
- Experience working with SSIS and other data transformation tools.
- Expertise in implementing data warehousing solutions.
- Experience developing modern data warehouse solutions using the Azure stack (Azure Data Lake, Azure Data Factory, Azure Databricks).
- Excellent understanding of modern data warehouse / Lambda architecture and data warehousing concepts.
- Proficient with a source code control system such as Git; strong T-SQL scripting and data modelling skills.
- Coding in Spark with Scala or Python is desirable.
- Strong relational and dimensional data modelling skills are desirable.

Personal Attributes
- Effective verbal and written communication skills, needed to communicate with global teams and stakeholders.
- Practical and simple problem-solving approach.
- Effective team player with collaborative skills and a proactive, learning attitude.
- Quality orientation with attention to detail.
- Commitment to continuous improvement.
- Excellent planning and organizational skills.
- Collaborate and share knowledge with other members of the team so that we are always evolving our collective skills and staying on the cutting edge.
- Effectively communicate complex technical concepts to non-technical stakeholders and collaborate closely to understand evolving business requirements.
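The cleansing and aggregation step this role describes (removing corrupted records, then aggregating data items) can be sketched in plain Python. This is a minimal illustration only, with hypothetical field names; the role itself works in SSIS, T-SQL, and the Azure stack rather than Python:

```python
from collections import defaultdict

def cleanse_and_aggregate(rows):
    """Drop corrupted records, then aggregate amounts per account.

    Each row is a dict like {"account": "A1", "amount": "120.50"}.
    Rows with a missing account key or a non-numeric amount are
    discarded (the 'remove inaccurate or corrupted data' step);
    surviving amounts are summed per account (the aggregation step).
    """
    totals = defaultdict(float)
    for row in rows:
        account = row.get("account")
        try:
            amount = float(row.get("amount", ""))
        except (TypeError, ValueError):
            continue  # corrupted amount: drop the record
        if not account:
            continue  # missing business key: drop the record
        totals[account] += amount
    return dict(totals)

raw = [
    {"account": "A1", "amount": "100.0"},
    {"account": "A1", "amount": "oops"},  # corrupted value
    {"account": "",   "amount": "50.0"},  # missing account
    {"account": "A2", "amount": "25.5"},
]
print(cleanse_and_aggregate(raw))  # {'A1': 100.0, 'A2': 25.5}
```

In a production pipeline the same logic would typically live in an SSIS data flow or a Databricks notebook, with rejected rows routed to an error table rather than silently dropped.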
Posted 1 week ago
4.0 - 9.0 years
6 - 10 Lacs
Mumbai
Work from Office
We are seeking a highly skilled Power BI Developer with over 4 years of experience in data visualization and reporting, specifically within the financial services domain. The ideal candidate should have a strong understanding of financial instruments and fund accounting systems, with the ability to transform complex financial data into meaningful insights using Power BI.

Required Skills & Qualifications:
- 4+ years of hands-on experience with Power BI, including DAX, Power Query, and data modeling.
- Strong understanding of financial instruments, including securities, cash market, equity, and futures & options.
- Experience working with fund accounting systems and concepts, especially Net Asset Value (NAV).
- Proficiency in SQL and data warehousing concepts.
- Ability to analyze and visualize large datasets in a clear and insightful manner.
- Excellent communication and stakeholder management skills.

Preferred:
- Experience with financial data sources (e.g., Bloomberg, Reuters).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with Azure or other cloud platforms.
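For context on the NAV concept this role calls out: fund accounting systems compute per-unit Net Asset Value as (total assets - total liabilities) / units outstanding. A generic sketch of that formula in Python (an illustration of the standard definition, not any specific employer's system):

```python
def net_asset_value(total_assets, total_liabilities, units_outstanding):
    """Per-unit NAV = (total assets - total liabilities) / units outstanding."""
    if units_outstanding <= 0:
        raise ValueError("units_outstanding must be positive")
    return (total_assets - total_liabilities) / units_outstanding

# A fund with 10.5M in assets, 0.5M in liabilities, and 1M units issued:
print(net_asset_value(10_500_000, 500_000, 1_000_000))  # 10.0
```

In a Power BI model the equivalent would usually be a DAX measure dividing net assets by units outstanding, refreshed alongside the fund accounting feed.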
Posted 1 week ago