
124 Aggregations Jobs - Page 4

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Amgen
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

About The Role
Role Description: We are seeking a seasoned Senior Engineering Manager (Data Engineering) to drive the development and maintenance of data pipelines built by data engineering teams, with deep domain expertise in HR/Finance data. This role will lead a team of data engineers who maintain the data pipelines and operational frameworks that support enterprise-wide data solutions. The ideal candidate will drive best practices in data engineering, cloud technologies, and Agile development, ensuring robust governance, data quality, and efficiency. The role requires technical expertise, operational excellence, and a deep understanding of data solutions to optimize data-driven decision-making.

Roles & Responsibilities:
- Lead and mentor a high-performing team of data engineers who develop and maintain complex data pipelines.
- Drive the development of data tools and frameworks for managing and accessing data efficiently across the organization.
- Oversee the implementation of performance monitoring protocols across data pipelines, ensuring real-time visibility, alerts, and automated recovery mechanisms.
- Coach engineers in building dashboards and aggregations to monitor pipeline health and detect inefficiencies, ensuring optimal performance and cost-effectiveness.
- Lead the implementation of self-healing solutions, reducing failure points and improving pipeline stability and efficiency across multiple product features (see the sketch after this posting).
- Oversee data governance strategies, ensuring compliance with security policies, regulations, and data accessibility best practices.
- Guide engineers in data modeling, metadata management, and access control, ensuring structured data handling across various business use cases.
- Collaborate with business leaders, product owners, and cross-functional teams to ensure alignment of data architecture with product requirements and business objectives.
- Prepare team members for stakeholder discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios.
- Drive Agile and Scaled Agile (SAFe) methodologies, managing sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery.
- Stay up to date with emerging data technologies, industry trends, and best practices, ensuring the organization leverages the latest innovations in data engineering and architecture.

Functional Skills:
Must-Have Skills:
- Experience managing a team of data engineers in biotech/pharma domain companies.
- Experience in designing and maintaining data pipelines and analytics solutions that extract, transform, and load data from multiple source systems.
- Demonstrated hands-on experience with cloud platforms (AWS) and the ability to architect cost-effective and scalable data solutions.
- Proficiency in Python, PySpark, and SQL.
- Experience with dimensional data modeling.
- Experience working with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Experience with AWS, GCP, or Azure cloud services.
- Understanding of the end-to-end project/product life cycle.
- Well versed in full-stack development, DataOps automation, logging frameworks, and pipeline orchestration tools.
- Strong analytical and problem-solving skills to address complex data challenges.
- Effective communication and interpersonal skills to collaborate with cross-functional teams.

Good-to-Have Skills:
- Data engineering management experience in Biotech/Life Sciences/Pharma.
- Experience using graph databases such as Stardog, MarkLogic, Neo4j, or AllegroGraph.

Education and Professional Certifications
- 12-15 years of experience in Computer Science, IT, or a related field.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile (SAFe) certification preferred.
- Project management certifications preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
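
The "self-healing solutions" and "automated recovery mechanisms" named above lend themselves to a small illustration. Below is a minimal, hypothetical Python sketch of a retry-with-backoff wrapper around a pipeline step with an alerting hook; the function names, delays, and alert channel are invented for illustration and are not Amgen's actual code.

```python
# Hypothetical sketch of a self-healing pipeline step: retry with
# exponential backoff, then alert when retries are exhausted.
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def alert_on_call(message):
    # Placeholder: a real pipeline would page via PagerDuty, Slack, etc.
    log.error("ALERT: %s", message)

def run_with_retries(step, max_attempts=3, base_delay=5):
    """Run a pipeline step; retry transient failures before alerting."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                alert_on_call(f"step {step.__name__} exhausted retries: {exc}")
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

def load_hr_extract():
    # Placeholder for the actual extract-transform-load logic.
    log.info("loading HR extract...")
    return "ok"

if __name__ == "__main__":
    run_with_retries(load_hr_extract)
```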

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Pune, Maharashtra

On-site


We are looking for a talented Power BI Data Visualization Engineer with 3-5 years of experience to join our dynamic Data, Analytics and Visualization team. In this role, you will be responsible for designing, developing, and maintaining impactful data visualizations and dashboards in Power BI. You will collaborate with business users, data analysts, and other stakeholders to translate data into actionable insights, empowering decision-makers to make data-driven decisions.

Qualifications and Skills:
· Bachelor’s or master’s degree in computer/data science.
· Power BI Dashboard Development - Design and develop visually appealing, interactive, and user-friendly dashboards and reports in Power BI to support business decision-making across various departments.
· Relevant years of hands-on experience in Business Intelligence and Data Analytics, with a focus on Power BI (DAX, Power Query, and Power BI Service) development.
· Data Modeling & Transformation - Work with data engineers and analysts to transform raw data into structured, analytical datasets. Design and implement data models in Power BI to optimize performance and ensure consistency in reports.
· Data Integration - Connect Power BI to a variety of data sources including SQL Server, Azure SQL, Excel, CSV files, and cloud-based systems (e.g., Azure Data Lake, Salesforce, etc.).
· Advanced Visualization - Create advanced visualizations using custom visuals and out-of-the-box Power BI features such as slicers, drilldowns, and custom DAX measures to facilitate deeper insights.
· Performance Optimization - Ensure the performance of Power BI reports and dashboards by optimizing queries, aggregations, and data models for faster load times and a better user experience.
· Collaboration with Stakeholders - Collaborate with business users, product owners, and senior management to understand reporting needs, gather requirements, and deliver tailored visualizations that meet business objectives.
· Data Governance & Security - Implement row-level security (RLS) and ensure that Power BI reports adhere to data governance, privacy, and compliance standards.
· Documentation - Create documentation for Power BI reports, data models, and processes. Conduct training and knowledge transfer sessions to empower business users with self-service BI capabilities.
· Continuous Improvement - Stay updated with the latest Power BI features, industry trends, and best practices to continuously enhance the quality and functionality of reports and dashboards.

Certifications and Good-to-Have Skills
- Power BI Data Analyst Associate (PL-300)
- Fabric Analytics Engineer Associate (DP-600)

Interested candidates can send an updated resume to karthik@busisol.net or WhatsApp 9791876677.

Job Type: Permanent
Pay: Up to ₹900,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift, Monday to Friday, Morning shift
Supplemental Pay: Performance bonus
Application Question(s):
- We are looking for immediate-joiner candidates only.
- What is your current CTC, expected CTC, and notice period?
Experience:
- Power BI: 4 years (Required)
- Azure: 4 years (Required)
- Microsoft SQL Server: 4 years (Required)
License/Certification: Power BI Data Analyst Associate (PL-300) (Required)
Location: Pune, Maharashtra (Required)
Work Location: In person
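
One recurring operational task behind roles like this is automating dataset refreshes. The sketch below triggers a refresh through the Power BI REST API's refreshes endpoint; the workspace and dataset IDs and the bearer token are placeholders you would obtain via Azure AD (e.g., MSAL or a service principal).

```python
# Minimal sketch: queue a Power BI dataset refresh via the REST API.
# IDs and token are placeholders, not real credentials.
import requests

ACCESS_TOKEN = "<AAD bearer token from MSAL or a service principal>"
GROUP_ID = "<workspace id>"
DATASET_ID = "<dataset id>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},  # email owners if the refresh fails
)
resp.raise_for_status()
print("refresh queued:", resp.status_code)  # 202 Accepted on success
```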

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Summary
Reporting experience migrating from BW on HANA to BW/4HANA:
- Analyze business requirements and convert them into technical design documents.
- Create Composite Providers in the BW Modelling Tool (BWMT) based on Calculation Views and ADSOs.
- Create Advanced DSOs and transformations in BWMT, and load data from data sources based on Calculation Views.
- Experience with AMDPs, applied as required to make the flows more optimized.
- Create process chains and schedule them for regular loads at the agreed frequency.
- Create Calculation Views based on requirements.
- Performance tuning; usage of aggregations, projections, unions, joins, etc.
- Transport HANA Calculation Views from Dev to QA, Pre-Prod, and Production systems using transaction SCTS_HTA.
- Monitor daily, weekly, and monthly data loads using process chains.
- Resolve tickets.
- Ensure data validation consistency in reporting.
- Knowledge of SOLMAN for transporting SAP-developed objects.

Posted 2 weeks ago

Apply

40.0 years

5 - 8 Lacs

Hyderābād

On-site

JOB ID: R-213122
LOCATION: India - Hyderabad
WORK LOCATION TYPE: On Site
DATE POSTED: Apr. 29, 2025
CATEGORY: Information Systems

Test Automation Engineer - Data Visualization

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

ABOUT THE ROLE
Let’s do this. Let’s change the world. We seek a highly motivated Test Engineer specializing in data visualization. In this role, you will manage and optimize our Power BI and Tableau environments. You will develop, execute, and maintain test scripts to ensure data accuracy and integrity during integration processes with platforms like Snowflake and Databricks. Your expertise in SQL queries will support validation of data transformations and integrations. Collaborating closely with cross-functional teams, you will understand data requirements and provide solutions, identify and troubleshoot issues, and document testing processes. Your efforts will ensure that visualization dashboards meet business requirements and provide accurate performance insights.

Roles & Responsibilities:
- Develop, execute, and maintain automated test scripts for data visualization tools like Power BI and Tableau.
- Ensure data accuracy and integrity during the integration process with Snowflake and Databricks.
- Create and manage SQL queries for validating data transformations and integrations.
- Collaborate with cross-functional teams to understand data requirements and provide solutions.
- Identify and troubleshoot issues related to data visualization and integration.
- Document testing processes, results, and recommendations for improvements.
- Verify that all Key Performance Indicators (KPIs) are correctly calculated and displayed.
- Cross-check KPIs against business logic, source data, and defined metrics (see the sketch after this posting).
- Validate consistency of KPI results across different reports and filters.
- Compare final dashboards and reports against approved wireframes or design mockups.
- Ensure all visual components (charts, labels, filters, layout) match design specifications.
- Provide feedback on gaps, misalignments, or inconsistencies.
- Perform functional, usability, and cross-device testing on dashboards (e.g., Power BI, Tableau).
- Validate filters, drill-down paths, sorting, and interactivity.
- Use SQL or other tools to validate data between source systems and dashboards.
- Confirm correctness of aggregations, joins, and transformation logic.
- Monitor dashboard load times and responsiveness under various data scenarios.
- Report and escalate performance bottlenecks.
- Create test plans, test cases, and validation checklists based on BI requirements.
- Document and track bugs/issues using tools like JIRA, Confluence, or TestRail.
- Work closely with data engineers, BI developers, and business analysts to align on expectations.

Must-Have Skills:
- Experience with AWS cloud services.
- 2+ years of experience in QA, with a focus on data or BI platforms.
- 3 to 5 years of overall experience in QA/test automation.
- Strong knowledge of data visualization tools (e.g., Tableau, Power BI, Qlik).
- Experience with SQL/NoSQL databases.
- Familiarity with business KPIs and performance metrics.
- Understanding of dashboard wireframing and UX design principles.
- Excellent problem-solving skills and attention to detail.
- Good knowledge of Python and Databricks.
- Familiarity with Agile/Scrum methodologies.
- Excellent problem-solving and analytical abilities.
- Superior communication and teamwork skills.
- Ability to work in a fast-paced, dynamic environment.
- Ability to learn new technologies quickly.
- 3-5 years of testing experience, including 2 years in data visualization testing.

Good-to-Have Skills:
- Familiarity with distributed systems, databases, and large-scale system architectures.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly, and to be organized and detail-oriented.
- Strong presentation and public speaking skills.

Basic Qualifications:
- Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
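
The KPI cross-check described above (dashboard value vs. source-system SQL) is easy to sketch as an automated test. The pytest example below is self-contained via SQLite; the table, KPI values, and the hard-coded "dashboard" lookup are invented stand-ins for a real Power BI/Tableau extract.

```python
# Self-contained pytest sketch: validate that a dashboard-level KPI
# matches an aggregate computed directly against the source system.
import sqlite3
import pytest

@pytest.fixture
def source_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 50.0)],
    )
    return conn

def kpi_from_dashboard(region):
    # Placeholder: in a real suite this would come from a dashboard
    # export or API extract rather than a hard-coded dict.
    return {"APAC": 200.0, "EMEA": 50.0}[region]

@pytest.mark.parametrize("region", ["APAC", "EMEA"])
def test_total_sales_kpi_matches_source(source_db, region):
    (expected,) = source_db.execute(
        "SELECT SUM(amount) FROM sales WHERE region = ?", (region,)
    ).fetchone()
    assert kpi_from_dashboard(region) == pytest.approx(expected)
```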

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Requirements:
- A bachelor’s degree in business or a related field, or an MBA.
- A minimum of 3-7 years of experience in business analysis or a related field.
- Exceptional analytical and conceptual thinking skills.
- The ability to influence stakeholders and work closely with them to determine acceptable solutions.
- Advanced technical BA skills and excellent documentation skills.
- Fundamental analytical and conceptual thinking skills.
- Experience creating detailed reports and giving presentations.
- Competency in Microsoft applications including Word, Excel, Visio, and Outlook.
- A track record of following through on commitments.
- Excellent planning, organizational, and time management skills.
- Experience working alongside top-performing individuals.
- A history of working on successful agile projects.

Skills:
- BA experience in the Banking domain is a MUST, preferably Conventional Debt.
- Participates in development of the functional design and user documentation by analyzing business process flows or client requests and identifying changes.
- Collects and defines business or functional requirements and translates them into functional design, test planning, and user documentation processes.
- Able to understand and document current legacy systems and their future transformations.
- Strong proficiency in SQL (writing complex queries, data manipulation, joins, sub-queries, and aggregations); see the sketch after this posting.
- In-depth understanding of data analysis, data reporting, data mapping, and business intelligence concepts.
- Expertise in requirements gathering, functional specifications, and technical documentation.
- Strong communication skills and the ability to engage with both business and technical teams.
- A good understanding of the Banking domain is required.
- Strong problem-solving abilities and the ability to translate business requirements into technical solutions.
- Proven experience working in Agile Scrum project environments.
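
The SQL proficiency this role asks for (joins, sub-queries, aggregations) can be made concrete with a tiny self-contained example, run here through Python's sqlite3 so it executes anywhere. The two-table banking schema and its data are invented for demonstration only.

```python
# Small illustration of the listed SQL skills: a join, a sub-query,
# and an aggregation over an invented loans/customers schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE loans (loan_id INTEGER, customer_id INTEGER, principal REAL);
CREATE TABLE customers (customer_id INTEGER, segment TEXT);
INSERT INTO loans VALUES (1, 10, 5000), (2, 10, 2500), (3, 11, 9000);
INSERT INTO customers VALUES (10, 'retail'), (11, 'corporate');
""")

# Total principal per segment, restricted (via the sub-query) to
# customers holding more than one loan, joined to the dimension table.
rows = conn.execute("""
SELECT c.segment, SUM(l.principal) AS total_principal
FROM loans l
JOIN customers c ON c.customer_id = l.customer_id
WHERE l.customer_id IN (
    SELECT customer_id FROM loans GROUP BY customer_id HAVING COUNT(*) > 1
)
GROUP BY c.segment
""").fetchall()
print(rows)  # [('retail', 7500.0)]
```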

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Uttar Pradesh, India

Remote


Sr Data Engineer
Uttar Pradesh, India | Data Science & Business Intelligence | Corporate | Remote
Requisition 48831 | McGraw Hill LLC

Job Description
Build the Future
At McGraw Hill we create best-in-class, next-generation learning platforms that are used by millions of students and educators worldwide from kindergarten through graduate school. Our goal is to accelerate student success through intuitive and effective learning tools and content that maximize a teacher’s time and a student’s learning experience. Our engineering team drives progress and helps build the future of learning. If you have the passion and technical expertise to thrive in an innovative and agile environment, we want to learn more about you.

What is this role about?
McGraw-Hill Education, the leading provider of digital and print educational materials, is looking for a Senior Data Engineer for our Data Analytics Group. The Senior Data Engineer in Data and Analytics is responsible for enhancing McGraw-Hill Education’s (MHE) business intelligence and data services capabilities, ensuring the delivery of actionable and timely insights to support financial, product, customer, user, and third-party data. This role also involves managing and monitoring the performance of the Data Platform, ensuring efficiency and reliability with hands-on data engineering, and designing and architecting dynamic reporting, analytics, and modeling solutions to drive success in the education domain. The ideal candidate will have a strong data engineering background, with expertise in Oracle Cloud Infrastructure (OCI) with Exadata, Informatica Intelligent Cloud Services (IICS) and/or Databricks, and AWS, along with advanced proficiency in SQL. Additionally, this role requires close collaboration with stakeholders to ensure the successful delivery of projects.

What You Will Be Doing
- The Senior Data Engineer must have prior hands-on experience developing and delivering data solutions with AWS and/or Oracle technologies.
- Strong knowledge working with data from financial and operational systems, such as Oracle ERP Sales and Oracle DB, and data modelling architecture with slowly changing dimensions (SCD).
- Experience running a cloud platform with an optimized solution architecture and the ability to meet the daily runbook SLA.
- Strong experience with version control software like Git and project management software like Jira with Agile/Kanban.
- Strong experience with data modelling concepts and modern data architecture, including cloud technologies.
- Ability to translate business requirements into technical requirements and deliveries.
- Design and develop parallel-processing ETL solutions for optimal resource usage and faster processing.
- Understand ETL specification documents for mapping requirements and create mappings using transformations such as Aggregator, Lookup, Router, Joiner, Union, Sorter, Normalizer, and Update Strategy.
- Create UNIX shell scripts as Informatica workflow wrappers and perform housekeeping activities like file cleanup and archiving (see the sketch after this posting).
- Experience in technical specification design, with proven experience designing and building integrations supporting standard data modelling objects (facts, dimensions, aggregations, star schemas, etc.).
- Ability to provide end-to-end technical guidance on the software development life cycle (requirements through implementation).
- Ability to create high-quality solution design documentation for end-to-end solutions.

What You Need To Be Considered
- Expertise in data warehousing and modern data lake concepts.
- 5+ years of experience in data engineering using tools such as: Informatica/IICS, Oracle DB and Oracle packages; AWS services; data platforms like Athena with Iceberg, Lambda, EMR, and Glue; Databricks; and scripting languages like Python, Scala, Java, or Node.
- 1+ years of experience in Unix shell scripting.
- 3+ years of experience working with clouds like OCI, AWS, and Azure on data technologies.

Preferred
- Experience in the publication and education domain.
- Prior experience or familiarity with Tableau/Alteryx.
- Experience working with financial data such as sales, revenue, COGS, and manufacturing.
- Experience with IBM Planning Analytics (TM1).

Why work for us?
At McGraw Hill, you will be empowered to make a real impact on a global scale. Every day your individual efforts contribute to the lives of millions. There has never been a better time to join McGraw Hill. In our culture of curiosity and innovation, you will be able to own your growth and develop as we do. The work you do at McGraw Hill will be work that matters. We are collectively designing content that will build the future of education. Play your part and experience a sense of fulfilment that will inspire you to even greater heights. If you are curious, open to new ideas and ready to make a difference, we want to talk to you. We have a collective passion for the work we do and a curiosity to find new solutions. If you share our determination, together we will drive learning forward.

McGraw Hill recruiters always use an “@mheducation.com” email address and/or our Applicant Tracking System, iCIMS. Any variation of this email domain should be considered suspicious. Additionally, McGraw Hill recruiters and authorized representatives will never request sensitive information by email.

McGraw Hill uses an automated employment decision tool (AEDT) to assist in the screening process by recommending candidates with “like skills” based on resume and job data. To request an alternative screening process, please select “Opt-Out” when asked to “Consent to use of Automated Employment Decision Tools” during the application.
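
The posting describes UNIX shell wrappers that archive and clean up workflow files. Below is a minimal Python equivalent of that housekeeping step, under assumed directory names and a 30-day retention window; paths and thresholds are illustrative only.

```python
# Hypothetical housekeeping sketch: compress logs older than the
# retention window into an archive directory, then delete the originals.
import gzip
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/data/informatica/logs")        # assumed location
ARCHIVE_DIR = Path("/data/informatica/archive")  # assumed location
RETENTION_DAYS = 30

def archive_old_logs():
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    cutoff = time.time() - RETENTION_DAYS * 86400
    for path in LOG_DIR.glob("*.log"):
        if path.stat().st_mtime < cutoff:
            target = ARCHIVE_DIR / (path.name + ".gz")
            with path.open("rb") as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)  # compress into the archive
            path.unlink()                     # then remove the original

if __name__ == "__main__":
    archive_old_logs()
```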

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site


About the Role
We seek an experienced Senior Node.js Developer to lead enterprise-grade backend systems with complex business workflows (non-eCommerce). You will architect scalable solutions, manage cross-functional teams (React.js/iOS), and own end-to-end delivery, from development to deployment and client communication.

Key Responsibilities
Technical Leadership:
- Architect and develop enterprise Node.js applications (BFSI, ERP, healthcare, or logistics domains).
- Design and optimize complex business workflows (multi-step approvals, real-time data processing, async operations).
- Manage deployment pipelines (CI/CD, containerization, cloud infrastructure).
Team Management:
- Lead developers (React.js frontend + iOS native), ensuring code quality and timely delivery.
- Mentor the team on best practices for scalability, security, and performance.
Client & Project Delivery:
- Interface directly with clients for requirements, updates, and troubleshooting (English fluency essential).
- Drive root-cause analysis for critical production issues.
Infrastructure & Ops:
- Oversee server management (AWS/Azure/GCP), monitoring (Prometheus/Grafana), and disaster recovery.
- Optimize MongoDB clusters (sharding, replication, indexing).

Required Skills
Core Expertise:
- 5-7 years with Node.js (Express/NestJS) + MongoDB (complex aggregations, transactions); see the sketch after this posting.
- Proven experience in enterprise applications.
Deployment Mastery:
- CI/CD (Jenkins/GitLab), Docker/Kubernetes, cloud services (AWS EC2/Lambda/S3).
Leadership & Communication:
- Managed teams of 3+ developers (frontend/mobile).
- Fluent English for client communication, documentation, and presentations.
System Design:
- Built systems with complex workflows (state machines, event-driven architectures, BPMN).
- Proficient in microservices, REST/GraphQL, and message brokers (Kafka/RabbitMQ).

Preferred Skills
- Basic knowledge of React.js and iOS native (Swift/Objective-C).
- Infrastructure-as-Code (Terraform/CloudFormation).
- Experience with TypeScript, Redis, or Elasticsearch.

Non-Negotiables
- Enterprise application background.
- Deployment ownership: CI/CD, cloud, and server management.
- English fluency for daily client communication.
- Valid passport + readiness to travel to Dubai.
- On-site work in Kochi + immediate joining (0-15 days).

What We Offer
- Competitive salary + performance bonuses.
- Global enterprise projects with scalable impact.
- Upskilling support in cloud infrastructure and leadership.
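
The "complex aggregations" requirement refers to MongoDB's multi-stage aggregation pipeline. The posting's stack is Node.js, but the pipeline stages are identical from any client; here is a minimal pymongo sketch with an invented collection and field names, assuming a local MongoDB instance.

```python
# Minimal MongoDB aggregation-pipeline sketch (pymongo): filter,
# group with a sum, sort, and limit. Names are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
orders = client["demo"]["orders"]

pipeline = [
    {"$match": {"status": "approved"}},                              # filter
    {"$group": {"_id": "$customer_id", "total": {"$sum": "$amount"}}},  # aggregate
    {"$sort": {"total": -1}},                                        # rank
    {"$limit": 10},                                                  # top 10
]
for doc in orders.aggregate(pipeline):
    print(doc["_id"], doc["total"])
```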

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com.

We’re looking for skilled Testers specializing in SQL, Python, and PySpark to join our FoundationX team. As part of this team, your role will involve maintaining operational, scalable data-driven systems that deliver business value.

Purpose And Scope:
As a Tester, you will play a crucial role in ensuring the quality and reliability of our pharmaceutical data systems. Your expertise in testing methodologies, data validation, and automation will be essential in maintaining high standards for our data solutions. This position is based in Bengaluru and will require some on-site work.

Essential Skills & Knowledge:
- SQL Proficiency: Strong skills in SQL for data querying, validation, and analysis. Experience with complex joins, subqueries, and performance optimization. Knowledge of database management systems (e.g., SQL Server, Oracle, MySQL).
- Python and PySpark: Proficiency in Python for test automation and data manipulation. Familiarity with PySpark for big data testing.
- Testing Methodologies: Knowledge of testing frameworks (e.g., pytest, unittest). Experience with test case design, execution, and defect management.
- Data Validation: Ability to validate data transformations, aggregations, and business rules (see the sketch after this posting). Understanding of data lineage and impact analysis.
- Business Intelligence Tools: Proficiency in designing and maintaining QLIK/Tableau applications. Strong expertise in creating interactive reports and dashboards using Power BI. Knowledge of DAX and Power Automate (MS Flow).
- Data Modelling and Integration: Ability to design and implement logical and physical data models. Familiarity with data warehousing concepts and ETL processes.
- Data Governance and Quality: Understanding of data governance principles and metadata management. Experience ensuring data quality and consistency.
- Testing Techniques: Familiarity with SQL, Python, or R for data validation and testing. Manual and automated testing (e.g., Selenium, JUnit).
- Test Management Tools: Experience with test management software (e.g., qTest, Zephyr, ALM).
- Data Analysis: Knowledge of statistical analysis and data visualization tools (e.g., Tableau, Power BI).
- Pharmaceutical Domain: Understanding of pharmaceutical data (clinical trials, drug development, patient records).
- Attention to Detail: Meticulous review of requirements and data quality.
- Collaboration and Communication: Ability to work with end users and technical team members. Self-starter who collaborates effectively.
- Experience and Agile Mindset: Experience with data warehouses, BI, and DWH systems. Analytical thinking and adherence to DevOps principles.

Preferred Skills and Knowledge:
- Experience working in the pharma industry.
- Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
- Certifications in BI tools or testing methodologies.
- Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.

Responsibilities:
- Development Ownership: Take ownership of testing key development projects related to the Data Warehouse and other MI systems. Collaborate with senior team members in a multi-skilled project team. Contribute to efficient administration of multi-server environments.
- Test Strategy and Planning: Collaborate with stakeholders, data scientists, analysts, and developers to understand project requirements and data pipelines. Develop comprehensive end-to-end test strategies and plans for data validation, transformation, and analytics.
- Data Validation and Quality Assurance: Execute manual and automated tests on data pipelines, ETL processes, and analytical models. Verify data accuracy, completeness, and consistency. Identify anomalies and data quality issues. Ensure compliance with industry standards (e.g., GxP, HIPAA).
- Regression Testing: Validate changes and enhancements to data pipelines and analytics tools. Verify data accuracy, consistency, and completeness in QLIK, Power BI, and Tableau reports. Monitor performance metrics and identify deviations.
- Test Case Design and Execution: Create detailed test cases based on business requirements and technical specifications. Execute test cases, record results, and report defects. Collaborate with development teams to resolve issues promptly.
- Documentation: Maintain comprehensive documentation of test scenarios, test data, and results. Document test procedures and best practices.
- Data Security and Privacy: Ensure data security and compliance with privacy regulations (e.g., GDPR). Validate access controls and encryption mechanisms.
- Collaboration and Communication: Work closely with cross-functional teams, including data engineers, data scientists, and business stakeholders.

Experience:
- 3+ years of experience as a Tester, Developer, or Data Analyst within a pharmaceutical or similarly regulated environment.
- Analytical mindset and logical thinking.
- Familiarity with Business Intelligence and Data Warehousing concepts.
- Web integration skills (Qlik Sense).
- Advanced SQL knowledge; understanding of stored procedures, triggers, and tuning.
- Experience with other BI tools (Tableau, D3.js) is a plus.

Education:
- Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).

Qualifications:
- Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service, and enabling strategic insights and decision-making. Other complex and highly regulated industry experience will be considered.
- Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
- Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
- Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.

Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines.

Category: FoundationX

Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
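
The data-validation skills listed above (transformations, aggregations, business rules) boil down to a few repeatable checks. A minimal pandas sketch follows; the trial-enrollment columns are invented for illustration and any real suite would read both frames from the pipeline under test.

```python
# Illustrative data-validation checks: completeness (no unexpected
# nulls), consistency between a raw feed and its aggregate, and a
# simple business rule. Column names are assumptions.
import pandas as pd

raw = pd.DataFrame(
    {"trial_id": ["T1", "T1", "T2"], "enrolled": [10, 5, 7]}
)
aggregated = raw.groupby("trial_id", as_index=False)["enrolled"].sum()

# Completeness: key columns must not contain nulls.
assert raw["trial_id"].notna().all(), "null trial_id in raw feed"

# Consistency: totals must survive the aggregation unchanged.
assert aggregated["enrolled"].sum() == raw["enrolled"].sum()

# Business rule: every trial in raw appears exactly once after grouping.
assert set(aggregated["trial_id"]) == set(raw["trial_id"])
print(aggregated)
```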

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position: Sr. Cloud Data Engineer (AWS - Big Data)
Location: Nagpur/Pune/Chennai/Bangalore

Purpose of the Position: This position requires candidates who are enthusiastic about specialized skills in AWS services and big data. As a member of the team, you will help our clients by building models that support their progress on their AWS cloud journey.

Key Result Areas and Activities:
- Share and Build Expertise: Develop and share expertise in the cloud solutioning domain, and actively mine the experience and expertise in the organization for sharing across teams and clients in the firm. Support the cloud COE initiatives.
- Nurture and Grow Talent: Provide support for recruitment, coaching and mentoring, and building the firm's cloud practice capacity.
- AWS Integration: Integrate various AWS services to create seamless workflows and data processing solutions.
- Data Pipeline Development: Design, build, and maintain scalable data pipelines using AWS services to support data processing and analytics.

Essential Skills:
- Knowledge of the following AWS services: S3, EC2, EMR, Serverless, Athena, AWS Glue, Lambda, Step Functions.
- Cloud databases: AWS Aurora, SingleStore, Redshift, Snowflake.
- Big data: Hadoop, Hive, Spark, YARN.
- Programming languages: Scala, Python, shell scripts, PySpark.
- Operating systems: any flavor of Linux, Windows.
- Strong SQL skills.
- Orchestration tools: Apache Airflow.
- Hands-on experience developing ETL workflows comprising complex transformations like SCD, deduplications, and aggregations (see the sketch after this posting).

Desirable Skills:
- Experience and technical knowledge in Databricks.
- Strong experience with event-stream-processing technologies such as Kafka and KDS.
- Knowledge of ETL tools such as Informatica and Talend.
- Experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR) will be a plus.

Qualifications:
- Overall 7-9 years of IT experience, including 3+ years on AWS-related projects.
- Bachelor's degree in computer science, engineering, or a related field (master's degree is a plus).
- Demonstrated continued learning through one or more technical certifications or related methods.

Qualities:
- Strong technical knowledge and experience.
- The capability to deep dive and research various technical fields.
- Self-motivated and focused on delivering outcomes for a fast-growing team and firm.
- Able to communicate persuasively through speaking, writing, and client presentations.
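
One of the transformations named above, deduplication, is commonly implemented in PySpark with a window function that keeps the latest record per business key. The sketch below is a compact illustration; the table and column names are assumptions, not a client schema.

```python
# PySpark deduplication sketch: rank rows per key by recency with a
# window function, keep only the newest record for each key.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dedup-demo").getOrCreate()

df = spark.createDataFrame(
    [("C1", "2024-01-01", "old"),
     ("C1", "2024-02-01", "new"),
     ("C2", "2024-01-15", "only")],
    ["customer_id", "updated_at", "payload"],
)

w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))  # 1 = most recent per key
      .filter(F.col("rn") == 1)
      .drop("rn")
)
latest.show()
```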

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
You will be based in Hyderabad, reporting to a manager. This is an individual contributor (non-managerial) role.
- Must have experience working on high-availability and load-balanced EPM infrastructure environments.
- Should possess analytical and development experience on the Oracle EPM Hyperion product suite, including Essbase, Planning, FDMEE, Financial Reports, Hyperion Financial Management, DRM, and Oracle Data Integrator.
- Familiarity with Oracle cloud products such as EDMCS, FCCS, and EPBCS, and migration from on-premise to EPM cloud.
- Support on-premise migration to EPM cloud and data integrations to cloud applications.
- Lead the design, development, testing, and implementation activities relating to project deliverables.
- Expected to lead process improvement initiatives.
- Perform periodic maintenance such as loading, clearing, and copying data in HFM.
- Provide application production support; analyze, design, develop, code, and implement programs to support Hyperion/Business Intelligence systems.
- Ability to translate requirements into a high-quality set of technical requirements.
- Experience with diverse source systems and relational databases in the EPM and BI space.
- Experience handling Windows and Unix batch scripting.
- Support the EPM applications and work on enhancements in a distributed global environment.
- Experience supporting the following Hyperion applications: Hyperion Financial Management, Hyperion Planning, Hyperion Financial Reports, Essbase, Hyperion Financial Data Management, and EPM cloud.
- Analyse and manage the ELT process using ODI (12c) to support the Actual, Plan, and Forecast processes for sales reporting, and ensure data is available to Essbase for month-end reporting.
- Define rule files to manage the metadata and data-load cube build processes using Hyperion Essbase.
- Manage partitions and customized aggregations on the multidimensional Essbase cubes.
- Experience handling the installation and configuration of Oracle EPM Hyperion products.
- Must have an excellent understanding of PSU and CPU patches in relation to the Hyperion and OAS products.
- Familiarity with MAXL for automating Essbase tasks and EPM Automate for cloud applications (see the sketch after this posting).
- Support EDMCS batch processing for import and export operations.
- Automate Windows batch scripts and Linux scripts for Hyperion applications.
- Expected to assist and mentor less experienced team members.
- Expected to support and streamline daily batch/scheduled jobs and automate them.

Qualifications
Oracle EPM, ODI, Oracle Planning, Essbase, HFM

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why.

Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work/life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here.
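
The MAXL automation mentioned above typically runs through essmsh, the MAXL shell that ships with Essbase. A hedged Python wrapper is sketched below; the script path is a placeholder, and a real deployment would also handle credentials and logging per site policy.

```python
# Hypothetical wrapper: run a MAXL script via essmsh (the Essbase MAXL
# shell) and fail loudly on a non-zero exit code. Path is a placeholder.
import subprocess
import sys

MAXL_SCRIPT = "/opt/hyperion/scripts/load_actuals.msh"  # assumed path

result = subprocess.run(
    ["essmsh", MAXL_SCRIPT],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    print(result.stderr, file=sys.stderr)
    sys.exit(f"MAXL script failed with exit code {result.returncode}")
```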

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Experience Range: 5+ years
Location: Noida or Bangalore

Job Description

Roles & Responsibilities
• Lead the development of interactive dashboards and reports using Power BI, ensuring high-quality, visually appealing, and user-friendly outputs.
• Integrate Power BI with various data sources (SQL Server, Azure, Salesforce, cloud-based platforms, etc.), ensuring seamless data flow.
• Ensure accurate data extraction, transformation, and loading (ETL) from multiple systems or data sources into Power BI.
• Optimize Power BI data models for performance, minimizing load times and improving report rendering.
• Implement query optimization techniques and ensure efficient use of memory and resources.
• Troubleshoot and resolve performance issues related to data loading and report execution.
• Collaborate with business stakeholders, including executives and department heads, to gather and understand business requirements.
• Translate business needs into detailed technical specifications and functional designs for BI solutions.
• Provide strategic guidance and insights based on data-driven analysis, aligning BI solutions with business goals.
• Mentor and provide technical guidance to junior Power BI developers, assisting with troubleshooting, performance tuning, and development best practices.
• Lead or coordinate BI project teams, ensuring that projects are completed on time and within budget.
• Develop advanced, interactive visualizations that facilitate business decision-making and provide actionable insights.
• Create interactive visualizations, KPIs, and custom features that enable users to explore data dynamically.
• Work closely with the IT team to ensure the proper configuration and deployment of Power BI reports and dashboards.
• Collaborate with database administrators and data engineers to ensure data quality, security, and integration with existing systems.
• Participate in data warehouse and cloud data architecture discussions to ensure BI tools are fully integrated into the data ecosystem.
• Manage the end-to-end lifecycle of BI projects, from requirements gathering and design to deployment and post-launch support.
• Create and maintain project documentation, including user guides, technical specifications, and deployment plans.
• Continuously monitor and evaluate new features and best practices within the Power BI platform.
• Stay up to date with emerging BI trends, tools, and technologies to ensure the organization remains at the forefront of data analytics.

Required Technical and Professional Expertise

Technical Skills:
• In-depth knowledge of Power BI Desktop, Power BI Service, and Power BI Mobile, including advanced features such as Power BI Embedded and Power BI Premium.
• Expertise in creating complex, enterprise-level dashboards and reports with interactive data visualizations.
• Extensive experience with data integration, including the ability to connect Power BI to various data sources (SQL Server, cloud-based data sources like Azure, AWS, Salesforce, etc.).
• Expertise in using Power Query for advanced data transformations, including merging and appending datasets, filtering data, and applying transformations at scale.
• Creation of data models in Power BI as per best practices, preferably a star schema (see the sketch after this posting).
• Advanced skills in DAX (Data Analysis Expressions) for building complex calculations, aggregations, time intelligence, and custom metrics.
• Ability to troubleshoot and optimize DAX formulas and data models for performance improvements.
• Strong SQL skills for writing complex queries, working with stored procedures, views, and functions to extract and manipulate data from relational databases.
• Ability to optimize SQL queries for performance, particularly when working with large datasets.
• Expertise in optimizing Power BI reports and dashboards for high performance, including query optimization, reducing data model size, and implementing efficient data refresh strategies.
• Knowledge of Power BI Performance Analyzer, Query Diagnostics, and advanced optimization techniques.
• Experience with Power BI administration, including managing workspaces, datasets, dataflows, and security policies.
• Expertise in implementing Row-Level Security (RLS) and other access control methods to ensure proper data governance.
• Experience with Power Automate for automating workflows, data refreshes, and notifications.
• Knowledge of integrating Power BI with other Microsoft 365 tools (Excel, SharePoint, Teams) and third-party systems.

Professional Skills:
• Expertise in engaging with stakeholders across the organization to gather detailed business requirements and translate them into scalable, robust BI solutions.
• Ability to understand the strategic needs of the business and recommend data-driven solutions that align with organizational goals.
• Proven ability to lead and mentor junior Power BI developers, providing guidance on best practices, troubleshooting, and optimization techniques.
• Strong leadership skills to manage the end-to-end BI project lifecycle and ensure timely delivery of projects.
• Strong communication skills to effectively present complex data findings and insights to non-technical stakeholders, including executives and business leaders.
• Ability to create clear, compelling presentations and reports that drive business decision-making.
• Advanced problem-solving skills to troubleshoot complex data issues, optimize reporting processes, and improve user experience.
• Ability to think critically about the business impact of data and recommend actionable insights.
• A proactive attitude toward staying current with the latest Power BI features, industry trends, and advancements in data analytics.
• Provide innovative ideas and contribute to the organization's products and assets.

Preferred Skills
• Familiarity with cloud-based BI solutions, particularly in the Azure ecosystem.
• Microsoft Power BI certification (PL-300, PL-600, or similar) is a plus.
• Experience leading a team of BI developers and mentoring junior professionals.
• Strong problem-solving skills, analytical thinking, and stakeholder management abilities.
• Clear, concise communication with stakeholders to understand requirements and explain solutions and recommendations.
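
The star-schema principle the posting names can be shown in miniature: a fact table joined to a conformed dimension, then aggregated the way a SUM measure would roll up by a dimension attribute. The tables and columns below are invented examples, sketched in pandas rather than DAX.

```python
# Tiny star-schema sketch: fact table + product dimension, aggregated
# like a SUM(amount) measure sliced by category. Names are invented.
import pandas as pd

fact_sales = pd.DataFrame(
    {"date_key": [20240101, 20240101, 20240102],
     "product_key": [1, 2, 1],
     "amount": [100.0, 40.0, 60.0]}
)
dim_product = pd.DataFrame(
    {"product_key": [1, 2], "category": ["Hardware", "Software"]}
)

# Join fact to dimension on the surrogate key, then aggregate.
report = (
    fact_sales.merge(dim_product, on="product_key")
              .groupby("category", as_index=False)["amount"].sum()
)
print(report)
```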

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana

On-site


Location: Gurgaon (100% onsite)

Detailed Job Description:
We are seeking an Analytics Developer with deep expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. The ideal candidate will focus on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports that drive strategic decision-making. This role involves close collaboration with both technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives.

KEY RESPONSIBILITIES:
- Leverage Databricks to develop and optimize scalable data pipelines for real-time and batch data processing.
- Design and implement Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models (see the sketch after this posting).
- Manage and optimize Databricks clusters for performance, cost efficiency, and scalability.
- Use Databricks SQL for advanced query development, data aggregation, and transformation.
- Incorporate Python and/or Scala within Databricks workflows to automate and enhance data engineering processes.
- Develop solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration.
- Create interactive and visually compelling Power BI dashboards and reports to enable self-service analytics.
- Leverage DAX (Data Analysis Expressions) for building calculated columns, measures, and complex aggregations.
- Design effective data models in Power BI using star schema and snowflake schema principles for optimal performance.
- Configure and manage Power BI workspaces, gateways, and permissions for secure data access.
- Implement row-level security (RLS) and data masking strategies in Power BI to ensure compliance with governance policies.
- Build real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources.
- Provide end-user training and support for Power BI adoption across the organization.
- Develop and maintain ETL/ELT workflows, ensuring high data quality and reliability.
- Implement data governance frameworks to maintain data lineage, security, and compliance with organizational policies.
- Optimize data flow across multiple environments, including data lakes, warehouses, and real-time processing systems.
- Collaborate with data governance teams to enforce standards for metadata management and audit trails.
- Work closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems.
- Troubleshoot and resolve technical challenges related to data integration, analytics performance, and reporting accuracy.
- Stay updated on the latest advancements in Databricks, Power BI, and data analytics technologies.
- Drive innovation by integrating AI/ML capabilities into analytics solutions using Databricks.
- Contribute to the enhancement of organizational analytics maturity through scalable and reusable architectures.

REQUIRED SKILLS:
- Self-Management: You need to possess the drive and ability to deliver on projects without constant supervision.
- Technical: This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
- Logic: You need to have the ability to work through and make logical sense of complicated and often abstract solutions and processes.
- Language: The customer has a global footprint, with offices and clients around the globe. The ability to read, write, and speak fluently in English is a must. Other languages could prove useful.
- Communication: Your daily job will regularly require communication with customer team members. The ability to clearly communicate, on a technical level, is essential to your job. This includes both verbal and written communication.

ESSENTIAL SKILLS AND QUALIFICATIONS:
- Bachelor’s degree in Computer Science, Data Science, or a related field (Master’s preferred).
- Certifications (preferred): Microsoft Certified: Azure Data Engineer Associate; Databricks Certified Data Engineer Professional; Microsoft Certified: Power BI Data Analyst Associate.
- 8+ years of experience in analytics, data integration, and reporting.
- 4+ years of hands-on experience with Databricks, including: proficiency in Databricks Notebooks for development and testing; advanced skills in Databricks SQL, Python, and/or Scala for data engineering; and expertise in cluster management, auto-scaling, and cost optimization.
- 4+ years of expertise with Power BI, including: advanced DAX for building measures and calculated fields; proficiency in Power Query for data transformation; and a deep understanding of Power BI architecture, workspaces, and row-level security.
- Strong knowledge of SQL for querying, aggregations, and optimization.
- Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
- Proficiency in Azure cloud platforms and their application to analytics solutions.
- Strong analytical thinking with the ability to translate data into actionable insights.
- Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
- Ability to manage multiple priorities in a fast-paced environment with high customer expectations.

Job Type: Full-time
Pay: Up to ₹3,000,000.00 per year
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s):
- What is your total work experience?
- How much experience do you have in Databricks?
- How much experience do you have in Power BI?
- What is your notice period?
Work Location: In person
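
A Databricks notebook batch step of the kind described above typically reads a raw table, aggregates it, and persists the result for BI consumption. The sketch below assumes the Spark session Databricks provides implicitly; the medallion-style table names (bronze/gold) and columns are placeholders, not any client's schema.

```python
# Notebook-style batch step sketch: read, aggregate with Spark SQL,
# and write back a table for downstream Power BI consumption.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # implicit inside Databricks

raw = spark.read.table("bronze.transactions")  # assumed source table
raw.createOrReplaceTempView("transactions")

daily = spark.sql("""
    SELECT txn_date, channel,
           SUM(amount) AS total_amount,
           COUNT(*)    AS txn_count
    FROM transactions
    GROUP BY txn_date, channel
""")

# Persist the aggregate; overwrite keeps the nightly run idempotent.
daily.write.mode("overwrite").saveAsTable("gold.daily_channel_totals")
```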

Posted 3 weeks ago

Apply

2.0 - 3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
As a Data Analyst in the Arium group, part of the Verisk Extreme Event Solutions division, you will support the development of the insurance industry’s leading liability catastrophe modeling solution for the global insurance and reinsurance market. The Arium models quantify the systemic and emerging liability risks that can lead to significant aggregations and clash risks across portfolios, ranging from climate change to recessions to chemical exposures, and across major commercial liability lines, including commercial general liability and professional/management liability lines. This role is part of the Arium research and analytics team and will be based in Hyderabad.

Day-to-Day Responsibilities:
- Own all Arium View of Risk data and data preparation processes.
- Own the update process for historical event parameterization.
- Collaborate with the research team to make all necessary data adjustments to maintain a robust database.
- Support all necessary data preparation and pre-processing for specific modeling projects and initiatives.
- Streamline data update and cleaning processes, including refactoring and optimizing legacy code to improve performance.
- Support ad-hoc data analysis and queries from various Arium teams.

Qualifications:
- Bachelor's degree or equivalent in statistics, computer science, information technology, business analytics, or another related major.
- Minimum of 2-3 years of experience preferred.
- Proficiency in Excel.
- Proficiency in either Python or R.
- Proficiency in SQL and database management.
- Experience performing data preprocessing and cleaning.
- Dashboarding experience would be a plus.
- Proficiency working with JSON data formats (see the sketch after this posting).
- Demonstrated ability to write clean, maintainable, and well-documented code.
- Insurance experience would be a plus.
- Ability to work independently and as part of a team.
- Strong interpersonal, oral, and written communication skills, including presentation skills.

About Us
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, the fourth consecutive year in the UK, Spain, and India, and the second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.

Verisk Businesses
- Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision.
- Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences.
- Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient.
- Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events.
- Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance.
- Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement.
- Life Insurance Solutions — offers end-to-end, data-insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group.
- Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger.

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18, except in countries with a higher age limit subject to applicable law. https://www.verisk.com/company/careers/

Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume.

Verisk Employee Privacy Notice
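
The JSON-handling and preprocessing skills this role asks for often come down to flattening nested records into a tabular frame. A small pandas illustration follows; the event structure and loss fields are invented for the example, not Arium's actual schema.

```python
# Flatten nested JSON event records into a tidy frame for analysis.
import pandas as pd

events = [
    {"event_id": "E1", "peril": "recession",
     "losses": {"gross": 1.2, "net": 0.9}},
    {"event_id": "E2", "peril": "chemical",
     "losses": {"gross": 3.4, "net": 2.1}},
]

flat = pd.json_normalize(events)  # nested keys become dotted columns
flat = flat.rename(columns={"losses.gross": "gross", "losses.net": "net"})
print(flat[["event_id", "peril", "gross", "net"]])
```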

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

PLEASE FILL THE GOOGLE FORM TO APPLY: https://forms.gle/F6q6Tbc66DeBPZEYA

Data Analyst - Contract to Hire

Role: Transform raw data into actionable insights through statistical analysis, automated reporting, and interactive applications.

Technical Requirements:
R Programming (primary) or Python for statistical analysis and modeling.
Data Manipulation: reshaping datasets (pivot operations, melting/casting), joins, aggregations, and complex transformations.
Data Structures: proficiency with lists, data frames, matrices, and nested data handling.
SQL: query optimization, joins, window functions, and database performance tuning.
Linux/Bash: shell scripting, file system operations, process management.
Cron Jobs: scheduling and maintaining automated data pipelines.
R Shiny: building interactive dashboards and web applications.
React: Single Page Application development for data visualization frontends.
REST APIs: integration, authentication, and data exchange protocols.

Contract Terms:
6-month initial contract.
Full-time conversion possible after 3 months.
Monthly stipend with performance-based increases.
100% remote work.

Experience: Previous data analysis projects demonstrating end-to-end pipeline development.

Please fill the form if you wish to apply: https://forms.gle/F6q6Tbc66DeBPZEYA
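The reshaping skills named above (pivoting, melting/casting, aggregations) can be shown in a short pandas sketch; the columns and values are invented for the example:

    import pandas as pd

    # Wide -> long ("melting"), then long -> wide summary ("casting").
    wide = pd.DataFrame({
        "region": ["north", "south"],
        "q1_sales": [120, 95],
        "q2_sales": [140, 110],
    })
    long = wide.melt(id_vars="region", var_name="quarter", value_name="sales")
    summary = long.pivot_table(index="quarter", values="sales", aggfunc="sum")
    print(summary)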

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

This role will be instrumental in building and maintaining robust, scalable, and reliable data pipelines using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink. The ideal candidate will have a strong understanding of data streaming concepts, experience with real-time data processing, and a passion for building high-performance data solutions. This role requires excellent analytical skills, attention to detail, and the ability to work collaboratively in a fast-paced environment.

Essential Responsibilities:
Design and develop data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink.
Build and configure Kafka connectors to ingest data from various sources (databases, APIs, message queues, etc.) into Kafka.
Develop Flink applications for complex event processing, stream enrichment, and real-time analytics.
Develop and optimize ksqlDB queries for real-time data transformations, aggregations, and filtering.
Implement data quality checks and monitoring to ensure data accuracy and reliability throughout the pipeline.
Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement optimizations.
Automate data pipeline deployment, monitoring, and maintenance tasks.
Stay up to date with the latest advancements in data streaming technologies and best practices.
Contribute to the development of data engineering standards and best practices within the organization.
Participate in code reviews and contribute to a collaborative and supportive team environment.
Work closely with other architects and tech leads in India and the US to create POCs and MVPs.
Provide regular updates on tasks, status, and risks to the project manager.

Required:
Bachelor's degree or higher from a reputed university.
8 to 10 years of total experience, with the majority related to ETL/ELT, big data, Kafka, etc.
Proficiency in developing Flink applications for stream processing and real-time analytics.
Strong understanding of data streaming concepts and architectures.
Extensive experience with Confluent Kafka, including Kafka brokers, producers, consumers, and Schema Registry.
Hands-on experience with ksqlDB for real-time data transformations and stream processing.
Experience with Kafka Connect and building custom connectors.
Extensive experience in implementing large-scale data ingestion and curation solutions.
Good hands-on experience with a big data technology stack on any cloud platform.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and as part of a team.

Good to have:
Experience in Google Cloud.
Healthcare industry experience.
Experience in Agile.
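For a flavor of the consumer-side work this role involves, a minimal sketch using the confluent-kafka-python client; the broker address, group id, and topic name are placeholders, not values from the posting:

    from confluent_kafka import Consumer

    conf = {
        "bootstrap.servers": "localhost:9092",  # assumption: local broker
        "group.id": "demo-consumer",
        "auto.offset.reset": "earliest",
    }
    consumer = Consumer(conf)
    consumer.subscribe(["orders"])  # hypothetical topic name

    try:
        while True:
            msg = consumer.poll(timeout=1.0)  # block up to 1s for a record
            if msg is None:
                continue
            if msg.error():
                print(f"consumer error: {msg.error()}")
                continue
            print(msg.key(), msg.value())
    finally:
        consumer.close()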

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Job Description

Job Summary:

ESSENTIAL SKILLS AND QUALIFICATIONS:
Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred).
Certifications (preferred):
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Professional
Microsoft Certified: Power BI Data Analyst Associate
8+ years of experience in analytics, data integration, and reporting.
4+ years of hands-on experience with Databricks, including:
Proficiency in Databricks Notebooks for development and testing.
Advanced skills in Databricks SQL, Python, and/or Scala for data engineering.
Expertise in cluster management, auto-scaling, and cost optimization.
4+ years of expertise with Power BI, including:
Advanced DAX for building measures and calculated fields.
Proficiency in Power Query for data transformation.
Deep understanding of Power BI architecture, workspaces, and row-level security.
Strong knowledge of SQL for querying, aggregations, and optimization.
Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
Proficiency in Azure cloud platforms and their application to analytics solutions.
Strong analytical thinking with the ability to translate data into actionable insights.
Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment with high customer expectations.
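A minimal sketch of the Databricks-style engineering named above, using PySpark to compute simple aggregations; on Databricks a spark session is provided by the platform, but it is created explicitly here so the snippet is self-contained, and the data is invented:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("demo-agg").getOrCreate()

    orders = spark.createDataFrame(
        [("2024-01-01", "emea", 120.0), ("2024-01-01", "apac", 80.0),
         ("2024-01-02", "emea", 95.0)],
        ["order_date", "region", "amount"],
    )

    # Daily totals and counts per region, the kind of measure a
    # downstream Power BI model might consume.
    daily = (
        orders.groupBy("order_date", "region")
              .agg(F.sum("amount").alias("total_amount"),
                   F.count("*").alias("order_count"))
    )
    daily.show()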

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Greater Bengaluru Area

On-site

Linkedin logo

What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world-changing projects, you will do more and become more than you ever thought possible.

Position Summary

We are seeking a highly skilled Senior Data Engineer Developer with 5+ years of experience to join our talented team in Bangalore. In this role, you will be responsible for designing, implementing, and optimizing data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies. Additionally, you will bring strong domain expertise in operations organizations, with a focus on supply chain and manufacturing functions. If you're a seasoned data engineer with a proven track record of delivering impactful data solutions in operations contexts, we want to hear from you.

Responsibilities:
Lead the design, development, and optimization of data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies.
Apply strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing, to understand data requirements and deliver tailored solutions.
Utilize big data processing frameworks such as Apache Spark to process and analyze large volumes of operational data efficiently.
Implement data transformations, aggregations, and business logic to support analytics, reporting, and operational decision-making.
Leverage cloud-based data platforms such as Snowflake to store and manage structured and semi-structured operational data at scale.
Utilize dbt (Data Build Tool) for data modeling, transformation, and documentation to ensure data consistency, quality, and integrity.
Monitor and optimize data pipelines and ETL processes for performance, scalability, and reliability in operations contexts.
Conduct data profiling, cleansing, and validation to ensure data quality and integrity across different operational data sets.
Collaborate closely with cross-functional teams, including operations stakeholders, data scientists, and business analysts, to understand operational challenges and deliver actionable insights.
Stay updated on emerging technologies and best practices in data engineering and operations management, contributing to continuous improvement and innovation within the organization.

All listed requirements are deemed essential functions of this position; however, business conditions may require reasonable accommodations for additional tasks and responsibilities.

Preferred Experience/Education/Skills:
Bachelor's degree in Computer Science, Engineering, Operations Management, or a related field.
5+ years of experience in data engineering, with proficiency in Python, Spark, SQL, Snowflake, dbt, and other relevant technologies.
Strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing.
Strong domain expertise in life sciences manufacturing equipment, with a deep understanding of industry-specific challenges, processes, and technologies.
Experience with big data processing frameworks such as Apache Spark and cloud-based data platforms such as Snowflake.
Hands-on experience with data modeling, ETL development, and data integration in operations contexts.
Familiarity with dbt (Data Build Tool) for managing data transformation and modeling workflows.
Familiarity with reporting and visualization tools like Tableau, Power BI, etc.
Good understanding of advanced data engineering and data science practices and technologies like PySpark, SageMaker, Cloudera, MLflow, etc.
Experience with SAP, SAP HANA, and Teamcenter applications is a plus.
Excellent problem-solving skills, analytical thinking, and attention to detail.
Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and operations stakeholders.
Eagerness to learn and adapt to new technologies and tools in a fast-paced environment.

Illumina believes that everyone has the ability to make an impact, and we are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information.
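For a flavor of the Spark-based transformations and aggregations the role describes, a hedged PySpark sketch computing a running shipment total per site with a window function; the table and column names are hypothetical, not from the posting:

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("supply-chain-demo").getOrCreate()

    shipments = spark.createDataFrame(
        [("plant_a", "2024-01-01", 500), ("plant_a", "2024-01-02", 450),
         ("plant_b", "2024-01-01", 300)],
        ["site", "ship_date", "units"],
    )

    # Cumulative shipped units per site, ordered by date.
    w = Window.partitionBy("site").orderBy("ship_date")
    trended = shipments.withColumn("running_units", F.sum("units").over(w))
    trended.show()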

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description

You thrive on diversity and creativity, and we welcome individuals who share our vision of making a lasting impact. Your unique combination of design thinking and experience will help us achieve new heights.

As a Data Engineer II at JPMorgan Chase within the Commercial & Investment Bank Payments Technology team, you are part of an agile team that works to enhance, design, and deliver the data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As an emerging member of a data engineering team, you execute data solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job Responsibilities:
Collaborate with all of JPMorgan’s lines of business and functions to deliver software solutions.
Experiment, architect, develop, and productionize efficient data pipelines, data services, and data platforms contributing to the business.
Design and implement highly scalable, efficient, and reliable data processing pipelines, and perform analysis and insights to drive and optimize business results.
Organize, update, and maintain gathered data that will aid in making the data actionable.
Demonstrate basic knowledge of the data system components to determine the controls needed to ensure secure data access.
Add to a team culture of diversity, equity, inclusion, and respect.

Required Qualifications, Capabilities, and Skills:
Formal training or certification on large-scale technology program concepts and 2+ years of applied experience in data technologies.
Experienced programming skills with Java and Python.
Experience across the data lifecycle, building data frameworks, and working with data lakes.
Experience with batch and real-time data processing with Spark or Flink.
Basic knowledge of the data lifecycle and data management functions.
Advanced SQL skills (e.g., joins and aggregations).
Working understanding of NoSQL databases.
Significant experience with statistical data analysis and the ability to determine appropriate tools to perform analysis.
Basic knowledge of data system components to determine controls needed.
Good knowledge of and experience with infrastructure as code.

Preferred Qualifications, Capabilities, and Skills:
Cloud computing: Amazon Web Services, Docker, Kubernetes, Terraform.
Experience in big data technologies: Hadoop, Hive, Spark, Kafka.
Experience in distributed system design and development.

About Us

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses, and many of the world’s most prominent corporate, institutional, and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years, and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing, and asset management.

We recognize that our people are our strength, and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team

J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services, and payments. Corporations, governments, and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk, and extends liquidity in markets around the world.
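The posting highlights advanced SQL (joins and aggregations); a self-contained illustration using Python's built-in sqlite3 module, with hypothetical tables and values:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE payments (id INTEGER, client_id INTEGER, amount REAL);
        CREATE TABLE clients  (id INTEGER, name TEXT);
        INSERT INTO clients  VALUES (1, 'acme'), (2, 'globex');
        INSERT INTO payments VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0);
    """)

    # Join the fact table to its dimension and aggregate per client.
    rows = conn.execute("""
        SELECT c.name, COUNT(*) AS n_payments, SUM(p.amount) AS total
        FROM payments p
        JOIN clients  c ON c.id = p.client_id
        GROUP BY c.name
    """).fetchall()
    print(rows)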

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Kerala, India

On-site

Linkedin logo

Job Summary:
We are seeking a passionate and experienced Data Analyst Mentor to guide and mentor aspiring data analysts in developing core analytical and technical skills. You will play a key role in shaping the next generation of data professionals by delivering structured training, providing practical insights, and ensuring learners are industry-ready.

Key Responsibilities:
Conduct hands-on training sessions in Excel, Power BI, MySQL, Python (NumPy, Pandas, Matplotlib, Seaborn), and PySpark.
Design and deliver engaging learning modules, exercises, and real-world projects.
Provide mentorship to students, assisting with doubts, career guidance, and project reviews.
Assess student performance through assignments, quizzes, and project evaluations.
Stay up to date with industry trends and continuously improve training content.
Collaborate with curriculum developers and instructional designers for content refinement.
Participate in webinars, bootcamps, and hackathons as needed.
Track student progress and provide timely feedback.

Required Skills & Qualifications:
Bachelor's/Master’s degree in Computer Science, Data Science, Engineering, Statistics, or a related field.
2+ years of professional experience in data analysis or data science roles.
Proficiency in:
Excel (formulas, pivot tables, dashboards)
Power BI (data modeling, DAX, visualizations)
MySQL (queries, joins, aggregations)
Python (especially with libraries like NumPy, Pandas, Matplotlib, Seaborn)
PySpark (RDDs, DataFrames, Spark SQL)
Strong understanding of data wrangling, visualization, and storytelling.
Excellent communication and presentation skills.
Prior teaching, mentoring, or training experience is a strong plus.

Preferred Qualifications:
Certification in Data Analytics, Power BI, or related technologies.
Experience mentoring in bootcamps or online education platforms.
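A typical classroom exercise for the Python topics above (Pandas group-bys and transforms) might look like the following sketch; the student names and scores are invented:

    import pandas as pd

    scores = pd.DataFrame({
        "student": ["asha", "ben", "asha", "ben"],
        "module":  ["sql", "sql", "python", "python"],
        "score":   [78, 85, 90, 70],
    })

    # Per-module summary statistics, then a z-score per student within module.
    by_module = scores.groupby("module")["score"].agg(["mean", "std"])
    scores["z"] = scores.groupby("module")["score"].transform(
        lambda s: (s - s.mean()) / s.std()
    )
    print(by_module)
    print(scores)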

Posted 3 weeks ago

Apply

40.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Job Description

Preferred skills and qualifications:
7 or more years of experience in building and delivering analytical applications.
BE/BTech or higher degree in Computer Science or Computer Engineering.
Excellent knowledge of OBIEE/OAC RPD development.
Experience in OBIEE/OAC reports and dashboards and/or data visualization project development.
Exposure to data warehouse design and star schema modeling.

Career Level - IC3

Responsibilities:
Build and manage the OAC RPD models to optimize data accessibility and performance.
Create dynamic, user-friendly OAC dashboards and visualizations that turn data into insights.
Set up alerts and automate data flows to keep business teams informed in real time.
Collaborate on data warehouse design and data pipelines, helping shape how we store and manage data.
Use ETL tools (Oracle Data Integrator) to move and transform data, ensuring it’s ready for analysis.
Have a good broader business understanding and mindset, while technically leading efforts in specific areas.
Make effective judgments and choices on design and roadmap based on various considerations.
Work with business and technical teams to understand needs and deliver effective OAC solutions.
Troubleshoot and solve any issues that come up with data flows, dashboards, or OAC configurations.
Stay current with OAC updates and best practices, continually improving our analytics capabilities.
Excellent knowledge of repository development at all three layers: the Physical Layer, the Business Model and Mapping Layer, and the Presentation Layer.
Knowledge of creating logical columns, hierarchies (level-based, parent-child), measures, aggregation rules, aggregations, and time series in the Business Model, and catalog folders in the Presentation Layer.
Good knowledge of security setup for reports and dashboards, and of OBIEE administrative activities.
Experience with Oracle SQL and database performance tuning by implementing views and materialized views and creating indexes.
Strong interpersonal and team leadership skills.

About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity.

We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
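As a hedged illustration of the Oracle SQL tuning mentioned (pre-aggregating with materialized views), a sketch using the python-oracledb driver; the connection details, table, and column names are placeholders, and the refresh mode shown is one option among several that would depend on the warehouse design:

    import oracledb  # assumption: python-oracledb client is installed

    # Placeholder credentials/DSN; not from the posting.
    conn = oracledb.connect(user="rpd_dev", password="changeme",
                            dsn="localhost/XEPDB1")
    cur = conn.cursor()

    # Pre-aggregate a hypothetical fact table so aggregate navigation
    # (e.g., from an OAC/OBIEE RPD) can hit a smaller summary object.
    cur.execute("""
        CREATE MATERIALIZED VIEW sales_by_month
        BUILD IMMEDIATE
        REFRESH COMPLETE ON DEMAND
        AS SELECT product_id,
                  TRUNC(sale_date, 'MM') AS sale_month,
                  SUM(amount)            AS total_amount,
                  COUNT(*)               AS txn_count
           FROM   sales
           GROUP  BY product_id, TRUNC(sale_date, 'MM')
    """)
    conn.commit()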

Posted 3 weeks ago

Apply

0 years

0 Lacs

Delhi, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities:
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (star/snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (advanced SQL, views, joins, indexes)
DAX & MDX
Data modeling & OLAP concepts

Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Skills: business intelligence, data visualization, data analysis, SQL proficiency, data modeling and OLAP concepts, SSAS (Tabular and Multidimensional), SQL Server (advanced SQL, views, joins, indexes), DAX, MDX, ETL (SSIS or equivalent), Power BI or similar BI/reporting tools, performance tuning and troubleshooting in SSAS and SQL, version control (TFS/Git) and deployment best practices
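The star-schema modeling this role centres on can be sketched outside SSAS itself; below is a pandas stand-in for the fact-to-dimension join and aggregation a Tabular cube would materialize (the tables and keys are invented, and this is an analogy, not the SSAS engine):

    import pandas as pd

    # Star schema in miniature: one fact table joined to a dimension
    # table on a surrogate key, then rolled up by a dimension attribute.
    dim_product = pd.DataFrame({"product_key": [1, 2],
                                "category": ["widgets", "gadgets"]})
    fact_sales = pd.DataFrame({"product_key": [1, 1, 2],
                               "amount": [100.0, 150.0, 80.0]})

    cube_slice = (
        fact_sales.merge(dim_product, on="product_key")
                  .groupby("category", as_index=False)["amount"].sum()
    )
    print(cube_slice)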

Posted 3 weeks ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities:
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (star/snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (advanced SQL, views, joins, indexes)
DAX & MDX
Data modeling & OLAP concepts

Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Skills: business intelligence, data visualization, data analysis, SQL proficiency, data modeling and OLAP concepts, SSAS (Tabular and Multidimensional), SQL Server (advanced SQL, views, joins, indexes), DAX, MDX, ETL (SSIS or equivalent), Power BI or similar BI/reporting tools, performance tuning and troubleshooting in SSAS and SQL, version control (TFS/Git) and deployment best practices

Posted 3 weeks ago

Apply

0 years

0 Lacs

Faridabad, Haryana, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities:
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (star/snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (advanced SQL, views, joins, indexes)
DAX & MDX
Data modeling & OLAP concepts

Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Skills: business intelligence, data visualization, data analysis, SQL proficiency, data modeling and OLAP concepts, SSAS (Tabular and Multidimensional), SQL Server (advanced SQL, views, joins, indexes), DAX, MDX, ETL (SSIS or equivalent), Power BI or similar BI/reporting tools, performance tuning and troubleshooting in SSAS and SQL, version control (TFS/Git) and deployment best practices

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies