Home
Jobs

458 OLAP Jobs - Page 16

JobPe aggregates listings for easy access; you apply directly on the original job portal.

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities: Design, develop, and maintain SSAS models (Tabular and/or Multidimensional). Build and optimize MDX or DAX queries for advanced reporting needs. Create and manage data models (star/snowflake schemas) supporting business KPIs. Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools). Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes. Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions. Maintain data quality and consistency across data sources and reporting layers. Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Required Skills: SSAS (Tabular & Multidimensional); SQL Server (advanced SQL, views, joins, indexes); DAX & MDX; data modeling & OLAP concepts.

Secondary Skills: ETL tools (SSIS or equivalent); Power BI or similar BI/reporting tools; performance tuning & troubleshooting in SSAS and SQL; version control (TFS/Git) and deployment best practices.

Skills: business intelligence, data visualization, SQL proficiency, data modeling & OLAP concepts, MDX, DAX, data analysis, performance tuning, SSAS (Tabular & Multidimensional), data modeling, ETL tools (SSIS or equivalent), version control (TFS/Git), deployment best practices, SQL Server (advanced SQL, views, joins, indexes), ETL, performance tuning & troubleshooting in SSAS and SQL, Power BI or similar BI/reporting tools.
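For context on the skills above: the star/snowflake modelling and DAX-style measures the role asks for reduce to joining a fact table to its dimensions and aggregating along dimension attributes. A minimal pandas sketch with hypothetical product and sales data (names are illustrative, not from the posting):

```python
# Star-schema sketch: one dimension table, one fact table, and the pandas
# equivalent of a simple "total sales by category" measure.
import pandas as pd

# Dimension: one row per product, keyed by a surrogate key.
dim_product = pd.DataFrame({
    "product_key": [1, 2, 3],
    "product_name": ["Widget", "Gadget", "Gizmo"],
    "category": ["Hardware", "Hardware", "Accessories"],
})

# Fact: one row per sales transaction, referencing the dimension by key.
fact_sales = pd.DataFrame({
    "product_key": [1, 1, 2, 3, 3, 3],
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-02",
                                  "2024-02-10", "2024-03-01", "2024-03-15"]),
    "quantity": [10, 4, 7, 2, 5, 1],
    "amount": [100.0, 40.0, 210.0, 18.0, 45.0, 9.0],
})

# Roughly what a "Total Sales" measure sliced by Category would return in a cube.
report = (fact_sales.merge(dim_product, on="product_key")
                    .groupby("category", as_index=False)["amount"].sum()
                    .rename(columns={"amount": "total_sales"}))
print(report)
```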

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

At Personify Health, we value and celebrate diversity and are committed to creating an inclusive environment for all employees. We believe in creating teams made up of individuals with various backgrounds, experiences, and perspectives. Why? Because diversity inspires innovation and collaboration and challenges us to produce better solutions. But more than this, diversity is our strength, and a catalyst in our ability to #changelivesforgood.

Job Summary: As a Business Intelligence developer, you'll understand the Schema layer to build complex BI reports and dashboards with a keen focus on the healthcare and wellbeing industry. Your SQL skills will play a significant role in data manipulation and delivery, and your experience with MicroStrategy will be vital for creating BI tools and reports. This role will help migrate and build new analytics products based on MicroStrategy to support teams with their internal and external reporting for Health Comp data.

Essential Functions/Responsibilities/Duties: Work closely with the Senior Business Intelligence engineer and BI architect to understand the schema objects and build BI reports and dashboards. Participate in sprint refinement, planning, and kick-off to understand the Agile process and sprint priorities. Develop the necessary transformations and aggregate tables required for reporting/dashboard needs. Understand the Schema layer in MicroStrategy and business requirements. Develop complex reports and dashboards in MicroStrategy. Investigate and troubleshoot issues with dashboards and reports. Proactively research new technologies and propose improvements to processes and the tech stack. Create test cases and scenarios to validate the dashboards and maintain data accuracy.

Education And Experience: 3 years of experience in Business Intelligence and data warehousing. 3+ years of experience in MicroStrategy reports and dashboard development. 2 years of experience in SQL. Bachelor's or Master's degree in IT, Computer Science, or ECE. Nice to have: any MicroStrategy certifications.

Required Knowledge, Skills, And Abilities: Good at writing complex SQL, including aggregate functions, subqueries and complex date calculations, and able to teach these concepts to others. Detail oriented and able to examine data and code for quality and accuracy. Self-starter, taking initiative when inefficiencies or opportunities are seen. Good understanding of modern relational and non-relational models and the differences between them. Good understanding of data warehouse concepts, snowflake & star schema architecture and SCD concepts. Good understanding of MicroStrategy Schema objects. Develop public objects such as metrics, filters, prompts, derived objects, custom groups and consolidations in MicroStrategy. Develop complex reports and dashboards using OLAP and MTDI cubes. Create complex dashboards with data blending. Understand VLDB settings and report optimization. Understand security filters and connection mappings in MSTR.

Work Environment: At Personify Health, we value and celebrate diversity and are committed to creating an inclusive environment for all employees. We believe in creating teams made up of individuals with various backgrounds, experiences, and perspectives. Diversity inspires innovation and collaboration and challenges us to produce better solutions. But more than this, diversity is our strength and a catalyst in our ability to change lives for the good.
Physical Requirements: Constantly operates a computer and other office productivity machinery, such as a copy machine, printer, and calculator.
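The "complex SQL, including aggregate functions, subqueries and complex date calculations" called out above can be illustrated with a small, self-contained example; the claims table, column names, and threshold below are hypothetical, run here against an in-memory SQLite database:

```python
# Sketch of the kind of "complex SQL" the listing refers to: an aggregate
# with a subquery and a date calculation, over hypothetical claims data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (member_id INTEGER, claim_date TEXT, paid_amount REAL);
    INSERT INTO claims VALUES
        (1, '2024-01-10', 120.0), (1, '2024-02-15', 80.0),
        (2, '2024-01-20', 300.0), (2, '2024-03-05', 50.0),
        (3, '2024-02-28', 40.0);
""")

# Members whose total paid amount exceeds the average member total,
# plus the number of days between their first and last claim.
sql = """
SELECT member_id,
       SUM(paid_amount) AS total_paid,
       CAST(julianday(MAX(claim_date)) - julianday(MIN(claim_date)) AS INTEGER)
           AS days_between_claims
FROM claims
GROUP BY member_id
HAVING SUM(paid_amount) > (
    SELECT AVG(member_total)
    FROM (SELECT SUM(paid_amount) AS member_total FROM claims GROUP BY member_id)
)
ORDER BY total_paid DESC;
"""
for row in conn.execute(sql):
    print(row)
```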

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Location: Hyderabad / Bangalore / Pune Experience: 7-16 Years Work Mode: Hybrid Mandatory Skills: ERstudio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling ,OLAP, OLTP, Erwin, Data Architect and Supplychain (preferred) Job Description We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models to enhance data quality, performance, and scalability. You will collaborate with cross-functional teams including data analysts, architects, and business stakeholders to ensure that the data models align with business requirements and drive efficient data management. Key Responsibilities Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability. Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs. Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions. Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra. Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows. Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models. Stay up-to-date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark. Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade. Drive the adoption of best practices and standards for data modeling within the organization. Skills And Qualifications Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models. Expertise in Azure and Databricks for building data solutions. Proficiency in ER/Studio, Hackolade, and other data modeling tools. Strong understanding of data modeling principles and techniques (e.g., ERD, UML). Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid understanding of data warehousing, ETL processes, and data integration. Familiarity with big data technologies such as Hadoop and Spark is an advantage. Industry Knowledge: A background in supply chain is preferred but not mandatory. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to interact with both technical and non-technical stakeholders. Ability to work well in a collaborative, fast-paced environment. Education B.Tech in any branch or specialization Skills: data visualization,oltp,models,databricks,spark,data modeller,supplychain,oracle,skills,databases,hadoop,dimensional modelling,erstudio,nosql,data warehouse,data models,data modeling,modeling,data architect,er studio,data modelling,data,etl,erwin,mysql,olap,postgresql Show more Show less
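One concrete pattern behind the dimensional-modelling skills this posting lists is the Type 2 slowly changing dimension, where history is preserved by expiring the old row and inserting a new version. A small pandas sketch with hypothetical customer data (not taken from the posting):

```python
# SCD Type 2 update sketch: expire changed dimension rows, append new versions.
import pandas as pd

# Current dimension rows; is_current marks the active version of each customer.
# "2099-12-31" is used here as an open-ended validity sentinel.
dim_customer = pd.DataFrame({
    "customer_id": [101, 102],
    "city": ["Pune", "Mumbai"],
    "valid_from": pd.to_datetime(["2023-01-01", "2023-01-01"]),
    "valid_to": pd.to_datetime(["2099-12-31", "2099-12-31"]),
    "is_current": [True, True],
})

# Incoming snapshot from the source system: customer 102 has moved.
incoming = pd.DataFrame({"customer_id": [101, 102], "city": ["Pune", "Hyderabad"]})
load_date = pd.Timestamp("2024-06-01")

merged = dim_customer[dim_customer["is_current"]].merge(
    incoming, on="customer_id", suffixes=("_old", "_new"))
changed_ids = merged.loc[merged["city_old"] != merged["city_new"], "customer_id"]

# Expire the old versions of changed rows...
expired_mask = dim_customer["customer_id"].isin(changed_ids) & dim_customer["is_current"]
dim_customer.loc[expired_mask, "valid_to"] = load_date
dim_customer.loc[expired_mask, "is_current"] = False

# ...and append the new versions.
new_rows = incoming[incoming["customer_id"].isin(changed_ids)].assign(
    valid_from=load_date, valid_to=pd.Timestamp("2099-12-31"), is_current=True)
dim_customer = pd.concat([dim_customer, new_rows], ignore_index=True)
print(dim_customer)
```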

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

This role is for one of Weekday's clients Salary range: Rs 3500000 - Rs 6000000 (ie INR 35-60 LPA) Min Experience: 9 years Location: Bengaluru JobType: full-time Requirements About the Role: We are seeking a highly experienced and innovative Senior Data Engineer to join our growing data team. As a Data Engineer, you will be responsible for designing, developing, and maintaining robust data pipelines and scalable architectures that support our business intelligence, analytics, and data-driven decision-making initiatives. You will work closely with data scientists, analysts, product managers, and other stakeholders to ensure that our data infrastructure is efficient, reliable, and aligned with business goals. This is a strategic role ideal for someone who thrives in a fast-paced environment, has deep experience in data engineering best practices, and is passionate about leveraging data to drive impact at scale. Key Responsibilities: Data Architecture & Pipeline Development: Design and build highly scalable, efficient, and secure data pipelines for batch and real-time data processing. Develop and maintain ETL/ELT processes to extract, transform, and load data from multiple sources into data warehouses and data lakes. Data Modeling & Warehousing: Create and maintain optimized data models that support advanced analytics and reporting. Design and implement data warehousing solutions using modern data storage technologies. Data Quality & Governance: Ensure high levels of data availability, quality, and integrity through the implementation of robust data validation, monitoring, and governance practices. Partner with compliance and data governance teams to enforce data security and privacy policies. Collaboration & Cross-functional Partnership: Work closely with data scientists, analysts, and business teams to understand data requirements and provide reliable data solutions. Collaborate with DevOps and infrastructure teams to automate deployment and manage cloud-based data environments. Tooling & Performance Optimization: Implement monitoring tools and optimize performance of data pipelines and database systems. Stay updated with the latest trends in data engineering, and evaluate new tools and technologies for adoption. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. 9+ years of hands-on experience in Data Engineering, with a deep understanding of scalable data pipeline architecture. Proficient in at least one programming language such as Python, Java, or Scala. Strong experience with ETL/ELT frameworks, data orchestration tools (e.g., Apache Airflow, DBT), and workflow management. Solid experience working with cloud data platforms (e.g., AWS, Azure, GCP) and data storage solutions (e.g., Snowflake, Redshift, BigQuery). Expertise in SQL and data modeling for both OLAP and OLTP systems. Familiarity with distributed systems, streaming technologies (Kafka, Spark), and containerization (Docker, Kubernetes) is a plus. Preferred Skills: Experience in a fast-paced startup or enterprise data team. Exposure to big data technologies and real-time data processing. Strong analytical thinking and problem-solving skills. Excellent communication and project management abilities. Show more Show less
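As a point of reference for the orchestration tools named above (e.g., Apache Airflow), a batch ELT pipeline is typically expressed as a DAG of extract, transform, and load tasks. A minimal sketch, assuming Airflow 2.x is installed; task names and the task logic are placeholders:

```python
# Minimal Apache Airflow DAG sketch for a daily batch ELT pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw records from a source system (placeholder).
    print("extracting source data")

def transform():
    # Apply business transformations (placeholder).
    print("transforming data")

def load():
    # Load curated data into the warehouse (placeholder).
    print("loading into warehouse")

with DAG(
    dag_id="daily_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```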

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Responsibilities Job Description A minimum of 2 years of working as a MDM consultant or directly with clients leveraging popular tools like Informatica, Reltio etc. A minimum of 2 years in a role that had taken ownership of the data assets for the organization to provide users with high-quality data that is accessible in a consistent manner. A minimum of 2 years facilitating Data cleansing and enrichment through data de-duplication and construction A minimum of 2 years in a role that captures the current state of the system, encompassing processes such as data discovery, profiling, inventories. A minimum of 2 years in a role that defined processes include data classification, business glossary creation and business rule definition. A minimum of 2 years in a role that applied processes with aim to operationalize and ensure compliance with policies and include automating rules, workflows, collaboration etc. Experience in a role that led measurement and monitoring to determine the value generated and include impact analysis, data lineage, proactive monitoring, operational dashboards and business value. 
Experience performing: Master Data Management; Metadata Management; Data Management and Integration; Systems Development Lifecycle (SDLC); Data Modeling Techniques and Methodologies; Database Management; Database Technical Design and Build; Extract, Transform & Load (ETL) Tools; Cloud Data Architecture; Data Architecture Principles; Online Analytical Processing (OLAP); Data Processes; Data Architecture Estimation.

Mandatory Skill Sets: Master Data Management, ETL, Database Management
Preferred Skill Sets: Master Data Management, ETL, Database Management
Years of Experience Required: 2-4 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study Required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study Preferred: (not specified)
Certifications: (not specified)
Required Skills: Database Management, Extract Transform Load (ETL), Master Data Management
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date: (not specified)
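The de-duplication step mentioned under the MDM responsibilities can be sketched with nothing more than the Python standard library: compare candidate records with a fuzzy similarity score and flag likely duplicates for steward review. The records, fields, and threshold below are illustrative and not tied to any specific MDM tool such as Informatica or Reltio:

```python
# Toy MDM de-duplication: flag candidate duplicate records by fuzzy name match.
from difflib import SequenceMatcher
from itertools import combinations

records = [
    {"id": 1, "name": "Acme Pharmaceuticals Ltd", "city": "Hyderabad"},
    {"id": 2, "name": "ACME Pharmaceuticals Limited", "city": "Hyderabad"},
    {"id": 3, "name": "Bright Labs Pvt Ltd", "city": "Pune"},
]

def similarity(a: str, b: str) -> float:
    # Case-insensitive similarity ratio between two strings (0.0 to 1.0).
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

candidate_pairs = [
    (r1["id"], r2["id"], round(similarity(r1["name"], r2["name"]), 2))
    for r1, r2 in combinations(records, 2)
    if r1["city"] == r2["city"] and similarity(r1["name"], r2["name"]) > 0.8
]

# Pairs above the threshold would be routed to a data steward for review/merge.
print(candidate_pairs)
```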

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description for Lead Data Engineer QA. Rank – Manager. Location – Bengaluru/Chennai/Kerala/Kolkata.

Objectives and Purpose: The Lead Data Engineer QA will be responsible for testing business intelligence and data warehouse solutions, on both on-premises and cloud platforms. We are seeking an innovative and talented individual who can create test plans, protocols, and procedures for new software. In addition, you will support the build of large-scale data architectures that provide information to downstream systems and business users.

Your Key Responsibilities: Design and execute manual and automatic test cases, including validating alignment with ELT data integrity and compliance. Support QA test case design, including identifying opportunities for test automation and developing scripts for automated processes as needed. Follow quality standards, conduct continuous monitoring and improvement, and manage test cases, test data, and defect processes using a risk-based approach as needed. Ensure all software releases meet regulatory standards, including requirements for validation, documentation, and traceability, with particular emphasis on data privacy and adherence to infrastructure security best practices. Proactively foster strong partnerships across teams and stakeholders to ensure alignment with quality requirements and address any challenges. Implement observability within testing processes to proactively identify, track, and resolve quality issues, contributing to sustained high-quality performance. Establish methodology to test the effectiveness of BI and DWH projects, ELT reports, integration, and manual and automation functionality. Work closely with the product team to monitor data quality, integrity, and security throughout the product lifecycle, implementing data quality checks to ensure accuracy, completeness, and consistency. Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity. Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations, using AWS-native technologies, to support continuing increases in data source, volume, and complexity. Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment. Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes. Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives. Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling. Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth. Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions.
To qualify for the role, you must have the following:

Essential Skillsets: Bachelor’s degree in Engineering, Computer Science, Data Warehousing, or a related field. 10+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development. Understanding of the project and test lifecycle, including exposure to CMMi and process improvement frameworks. Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling and the development and optimization of ETL pipelines. Proven track record of designing and implementing complex data solutions. Understanding of business intelligence concepts, ETL processing, dashboards, and analytics. Testing experience in data quality, ETL, OLAP, or reports. Knowledge of data transformation projects, including database design concepts and white-box testing. Experience in cloud-based data solutions – AWS/Azure.

Demonstrated understanding and experience using: cloud-based data solutions (AWS, IICS, Databricks); GxP and regulatory and risk compliance; AWS cloud infrastructure testing; Python data processing; SQL scripting; test processes (e.g., ELT testing, SDLC); Power BI/Tableau; scripting (e.g., Perl and shell); data engineering programming languages (i.e., Python); distributed data technologies (e.g., PySpark); test management and defect management tools (e.g., HP ALM); cloud platform deployment and tools (e.g., Kubernetes); DevOps and continuous integration; Databricks/ETL; database architecture and administration.

Uses the principles of continuous integration and delivery to automate the deployment of code changes to higher environments, fostering enhanced code quality, test coverage, and automation of resilient test cases. Possesses high proficiency in programming languages (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architecture/pipelines that fit business goals. Strong organizational skills with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions. Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners. Strong problem-solving and troubleshooting skills. Ability to work in a fast-paced environment and adapt to changing business priorities.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
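The data-quality checks this QA role describes (accuracy, completeness, consistency) often start as simple assertions over source and target extracts. A minimal sketch with hypothetical order data; real suites would typically run such checks in a test framework such as pytest:

```python
# Sketch of automated ETL quality checks: row-count reconciliation,
# completeness (null ratio), and uniqueness, over hypothetical extracts.
source_rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.0},
    {"order_id": 3, "amount": None},
]
target_rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.0},
    {"order_id": 3, "amount": None},
]

def check_row_count(src, tgt):
    assert len(src) == len(tgt), f"row count mismatch: {len(src)} vs {len(tgt)}"

def check_completeness(rows, column, max_null_ratio=0.0):
    nulls = sum(1 for r in rows if r[column] is None)
    ratio = nulls / len(rows)
    assert ratio <= max_null_ratio, f"{column}: {ratio:.0%} nulls exceeds threshold"

def check_uniqueness(rows, column):
    values = [r[column] for r in rows]
    assert len(values) == len(set(values)), f"duplicate values found in {column}"

check_row_count(source_rows, target_rows)
check_uniqueness(target_rows, "order_id")
try:
    check_completeness(target_rows, "amount")   # fails: one amount is NULL
except AssertionError as err:
    print("defect to log:", err)
```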

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

Source: LinkedIn

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description for Lead Data Engineer QA. Rank – Manager. Location – Bengaluru/Chennai/Kerala/Kolkata.

Objectives and Purpose: The Lead Data Engineer QA will be responsible for testing business intelligence and data warehouse solutions, on both on-premises and cloud platforms. We are seeking an innovative and talented individual who can create test plans, protocols, and procedures for new software. In addition, you will support the build of large-scale data architectures that provide information to downstream systems and business users.

Your Key Responsibilities: Design and execute manual and automatic test cases, including validating alignment with ELT data integrity and compliance. Support QA test case design, including identifying opportunities for test automation and developing scripts for automated processes as needed. Follow quality standards, conduct continuous monitoring and improvement, and manage test cases, test data, and defect processes using a risk-based approach as needed. Ensure all software releases meet regulatory standards, including requirements for validation, documentation, and traceability, with particular emphasis on data privacy and adherence to infrastructure security best practices. Proactively foster strong partnerships across teams and stakeholders to ensure alignment with quality requirements and address any challenges. Implement observability within testing processes to proactively identify, track, and resolve quality issues, contributing to sustained high-quality performance. Establish methodology to test the effectiveness of BI and DWH projects, ELT reports, integration, and manual and automation functionality. Work closely with the product team to monitor data quality, integrity, and security throughout the product lifecycle, implementing data quality checks to ensure accuracy, completeness, and consistency. Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity. Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations, using AWS-native technologies, to support continuing increases in data source, volume, and complexity. Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment. Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes. Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives. Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling. Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth. Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions.
To qualify for the role, you must have the following:

Essential Skillsets: Bachelor’s degree in Engineering, Computer Science, Data Warehousing, or a related field. 10+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development. Understanding of the project and test lifecycle, including exposure to CMMi and process improvement frameworks. Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling and the development and optimization of ETL pipelines. Proven track record of designing and implementing complex data solutions. Understanding of business intelligence concepts, ETL processing, dashboards, and analytics. Testing experience in data quality, ETL, OLAP, or reports. Knowledge of data transformation projects, including database design concepts and white-box testing. Experience in cloud-based data solutions – AWS/Azure.

Demonstrated understanding and experience using: cloud-based data solutions (AWS, IICS, Databricks); GxP and regulatory and risk compliance; AWS cloud infrastructure testing; Python data processing; SQL scripting; test processes (e.g., ELT testing, SDLC); Power BI/Tableau; scripting (e.g., Perl and shell); data engineering programming languages (i.e., Python); distributed data technologies (e.g., PySpark); test management and defect management tools (e.g., HP ALM); cloud platform deployment and tools (e.g., Kubernetes); DevOps and continuous integration; Databricks/ETL; database architecture and administration.

Uses the principles of continuous integration and delivery to automate the deployment of code changes to higher environments, fostering enhanced code quality, test coverage, and automation of resilient test cases. Possesses high proficiency in programming languages (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architecture/pipelines that fit business goals. Strong organizational skills with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions. Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners. Strong problem-solving and troubleshooting skills. Ability to work in a fast-paced environment and adapt to changing business priorities.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

Role: Azure Synapse Data Engineer. Experience: 8–10 years. Must-have skills: Experience in designing and hands-on development of Azure-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Strong experience in common data warehouse modelling principles, including Kimball and Inmon. Experience in additional modern data platform technologies such as Microsoft Fabric is a distinct advantage. Knowledge of Azure Databricks, Azure IoT, Azure HDInsight + Spark, Azure Stream Analytics, and Power BI is desirable. Knowledge of the Microsoft BI stack (SSRS, SSAS – Tabular with DAX and OLAP with MDX, SSIS) is desirable. Knowledge of C# and hands-on PowerShell scripting is desirable. Working knowledge of Python and PySpark is desirable. Experience developing security models is good to have.
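A classic building block of the Kimball-style warehouse modelling mentioned above is a conformed date dimension. A minimal pandas sketch; the column names follow common convention and are illustrative rather than prescribed:

```python
# Generate a simple Kimball-style date dimension for one calendar year.
import pandas as pd

dates = pd.date_range("2024-01-01", "2024-12-31", freq="D")
dim_date = pd.DataFrame({
    "date_key": dates.strftime("%Y%m%d").astype(int),  # surrogate key, e.g. 20240101
    "date": dates,
    "year": dates.year,
    "quarter": dates.quarter,
    "month": dates.month,
    "month_name": dates.strftime("%B"),
    "day_of_week": dates.day_name(),
    "is_weekend": dates.dayofweek >= 5,
})
print(dim_date.head())
```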

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

Source: LinkedIn

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people . At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. ZS Master Data Management Team has an extensive track record of completing over 1000 global projects and partnering with 15 of the top 20 Global Pharma organizations. They specialize in various MDM domains, offering end-to-end project implementation, change management, and data stewardship support. Their services encompass MDM strategy consulting, implementation for key entities (e.g., HCP, HCO, Employee, Payer, Product, Patient, Affiliations), and operational support including KTLO and Data Stewardship. With 50+ MDM implementations and Change Management programs annually for Life Sciences clients, the team has developed valuable assets like MDM libraries and pre-built accelerators. Strategic partnerships with leading platform vendors (Reltio, Informatica, Veeva, Semarchy etc) and collaborations with 18+ data vendors and technology providers further enhance their capabilities. You as Business Technology Solutions Manager will take ownership of one or more client delivery at a cross office level encompassing the area of digital experience transformation. The successful candidate will work closely with ZS Technology leadership and be responsible for building and managing client relationships, generating new business engagements, and providing thought leadership in the Digital Area. What You’ll Do Lead the delivery process - right from discovery/ POC to managing operations, across 3-4 client engagements helping to deliver world-class MDM solutions Ownership to ensure the proposed design/ architecture, deliverables meets the client expectation and solves the business problem with high degree of quality; Partner with Senior Leadership team and assist in project management responsibility i.e. Project planning, staffing management, people growth, etc.; Develop and implement master data management strategies and processes to maintain high-quality master data across the organization. Design and manage data governance frameworks, including data quality standards, policies, and procedures. 
Maintain an outlook for continuous improvement and innovation, and provide necessary mentorship and guidance to the team; Liaise with staffing partners and HR business partners for team building/planning; Lead efforts to build a point of view on new technology or problem solving and drive innovation to build firm intellectual capital; Actively lead unstructured problem solving to design and build complex solutions and tune them to meet expected performance and functional requirements; Stay current with industry trends and emerging technologies in master data management and data governance.

What You’ll Bring: Bachelor's/Master's degree with specialization in Computer Science, MIS, IT or other computer-related disciplines; 10-14 years of relevant consulting-industry experience (preferably Healthcare and Life Sciences) working on medium-to-large-scale MDM solution delivery engagements; 5+ years of hands-on experience designing and implementing MDM services and capabilities using tools such as Informatica MDM, Reltio, etc.; Strong understanding of data management principles, including data modeling, data quality, and metadata management; Strong understanding of various cloud-based data management (ETL) platforms such as AWS, Azure, Snowflake, etc.; Experience in designing and driving delivery of mid-to-large-scale solutions on cloud platforms; Experience with ETL design and development, and OLAP tools to support business applications.

Additional Skills: Ability to manage a virtual global team environment that contributes to the overall timely delivery of multiple projects; Knowledge of current data modeling and data warehouse concepts, issues, practices, methodologies, and trends in the Business Intelligence domain; Experience with analyzing and troubleshooting the interaction between databases, operating systems, and applications; Significant supervisory, coaching and hands-on project management skills; Willingness to travel to other global offices as needed to work with clients or other internal project teams.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring.
If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To complete your application: candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at: www.zs.com

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

India

On-site

Source: LinkedIn

Role: Lead Data Engineer

About the Role: We’re seeking an experienced Data Engineer to join a cross-functional team and help build and migrate our data solutions onto Google Cloud Platform. You’ll design, deploy, and operationalize data products that power analytics, ML/AI initiatives, and business insights.

Key Responsibilities: Build & migrate data products – develop new data pipelines and models on GCP using BigQuery and dbt; refactor and migrate existing solutions to leverage GCP best practices. Data integration & orchestration – sync data from diverse sources via Cloud Functions (Python); automate infrastructure and workflows using Terraform and Cloud Workflows. Collaboration & architecture – work closely with Solution Architects and Business Experts to define optimal data models; partner with Data Scientists and Software Engineers to deliver scalable, reliable solutions. Optional (nice to have): experience building and maintaining OLAP cubes; experience developing Power BI reports.

Tech Stack: Cloud: GCP (BigQuery, Cloud Functions, Cloud Workflows). Data Modeling: dbt. Scripting: Python. Infrastructure as Code: Terraform. Version Control: Git.

Who You Are: Experienced – 3+ years as a Data Engineer, preferably with GCP. Independent & collaborative – comfortable driving work on your own and within a diverse team. Problem solver – strong analytical skills and a pragmatic mindset. Communicator – able to translate complex technical details into clear, actionable plans.

Required skills: dbt, GCP, BigQuery. Languages: English (proficient).

Ready for your next career move? Explore opportunities at Co-Workertech.com.
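For context on the BigQuery/dbt stack above: a dbt model is essentially a version-controlled SELECT statement, and the same kind of query can be issued from Python with the official BigQuery client. A minimal sketch, assuming the google-cloud-bigquery package and application-default credentials are available; the project, dataset, and table names are made up:

```python
# Query a curated BigQuery table from Python and print aggregated results.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT region, SUM(net_sales) AS total_sales
    FROM `my-analytics-project.curated.fct_sales`
    WHERE sale_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY total_sales DESC
"""

# result() blocks until the query job finishes and yields rows.
for row in client.query(sql).result():
    print(row["region"], row["total_sales"])
```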

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Source: LinkedIn

Company Description: Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description: When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have the most seamless experience possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to do that by providing brands and retailers the visibility they need to make that belief a reality. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it’s 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store. We are looking for a lead-level software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.

What You Will Do: Think like our customers – you will work with product and engineering leaders to define data solutions that support customers’ business practices. Design, develop, and extend our data pipeline services and architecture to implement your solutions – you will be collaborating on some of the most important and complex parts of our system that form the foundation for the business value our organization provides. Foster team growth – provide mentorship to junior team members and evangelize expertise to those on other teams. Improve the quality of our solutions – help to build enduring trust within our organization and amongst our customers by ensuring high quality standards of the data we manage. Own your work – you will take responsibility to shepherd your projects from idea through delivery into production. Bring new ideas to the table – some of our best innovations originate within the team.

Technologies We Use: Languages: SQL, Python. Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka. Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL. Others: Tableau (as a business intelligence solution).

Qualifications: Bachelor's/Master’s degree in Computer Science or a relevant technical degree. 10+ years of professional software engineering experience. Strong proficiency with data languages such as Python and SQL. Strong proficiency working with data processing technologies such as Spark, Flink, and Airflow. Strong proficiency working with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Snowflake, etc.).
Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc. Solid understanding of Docker and Kubernetes Solid understanding of ETL/ELT and OLTP/OLAP concepts Solid understanding of columnar/row-oriented data structures (e.g. Parquet, ORC, Avro, etc.) Solid understanding of Apache Iceberg, or other open table formats Proven ability to transform raw unstructured/semi-structured data into structured data in accordance to business requirements Solid understanding of AWS, Linux and infrastructure concepts Proven ability to diagnose and address data abnormalities in systems Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs Experience building data warehouses using conformed dimensional models Experience building data lakes and/or leveraging data lake solutions (e.g. Trino, Dremio, Druid, etc.) Experience working with business intelligence solutions (e.g. Tableau, etc.) Experience working with ML/Agentic AI pipelines (e.g. , Langchain, LlamaIndex, etc.) Understands Domain Driven Design concepts and accompanying Microservice Architecture Passion for data, analytics, or machine learning. Focus on value: shipping software that matters to the company and the customer Bonus Points Experience working with vector databases Experience working within a retail or ecommerce environment. Proficiency in other programming languages such as Scala, Java, Golang, etc. Experience working with Apache Arrow and/or other in-memory columnar data technologies Supervisory Responsibility Provide mentorship to team members on adopted patterns and best practices. Organize and lead agile ceremonies such as daily stand-ups, planning, etc Additional Information EEO STATEMENT Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits Discrimination, Harassment, and Retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by the state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics. Show more Show less
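The columnar formats listed above (Parquet, ORC, Avro) matter for OLAP workloads because analytical scans can read only the columns they need. A small PyArrow sketch, assuming the pyarrow package is installed; the data and file name are illustrative:

```python
# Write a small Parquet file and read back only the columns a query needs.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "product_id": [1, 2, 3, 4],
    "site": ["web", "store", "web", "store"],
    "price": [19.99, 18.49, 20.10, 18.99],
})
pq.write_table(table, "prices.parquet")

# Columnar layout: the scan touches just these two columns rather than whole
# rows, which is the core OLAP-friendly property of formats like Parquet.
subset = pq.read_table("prices.parquet", columns=["site", "price"])
print(subset.to_pydict())
```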

Posted 3 weeks ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking an experienced SQL Developer with expertise in SQL Server Analysis Services (SSAS) and AWS to join our growing team. The successful candidate will be responsible for designing, developing, and maintaining SQL Server-based OLAP cubes and SSAS models for business intelligence purposes. You will work with multiple data sources, ensuring data integration, optimization, and performance of the reporting models. This role offers an exciting opportunity to work in a hybrid work environment, collaborate with cross-functional teams, and apply your skills in SSAS, SQL, and AWS to design scalable and high-performance data solutions. Key Responsibilities : - Design, develop, and maintain SQL Server Analysis Services (SSAS) models, including both Multidimensional and Tabular models, to support business intelligence and reporting solutions. - Create and manage OLAP cubes that are optimized for fast query performance and used for analytical reporting and decision-making. - Develop and implement multidimensional and tabular data models for various business needs, ensuring the models are flexible, scalable, and optimized for reporting. - Work on performance tuning and optimization of SSAS solutions, ensuring efficient query processing and high performance even with large data sets. - Integrate data from various sources, including SQL Server databases, flat files, and cloud-based storage, into SSAS models for seamless and accurate reporting. - Integrate and manage data from AWS services (e.g., S3, Redshift, etc.) into the SQL Server database and SSAS models for hybrid cloud and on-premise data solutions. - Leverage SQL Server PolyBase to access and integrate data from external data sources like AWS S3, Azure Blob Storage, or other systems for data processing. - Ensure data integrity, consistency, and accuracy within the data models and reporting systems. Work closely with data governance teams to maintain high-quality data standards. - Work in an agile team environment with BI developers, data engineers, and business analysts to align data models and solutions with business requirements. - Provide support for production systems, troubleshoot issues with SSAS models, queries, and reporting solutions, and implement fixes when necessary. - Maintain clear and detailed technical documentation for SSAS model designs, ETL processes, and best practices for data integration. Required Skills & Experience : - 5+ years of experience as a SQL Developer with strong hands-on expertise in SSAS. - In-depth experience in creating and managing SSAS models, including multidimensional (OLAP) and tabular models. - Proficiency in SQL Server (T-SQL, SSIS, SSRS) for data integration, data transformation, and reporting. - Strong understanding of SSAS performance tuning, query optimization, and processing. - Experience with AWS services, particularly AWS S3, AWS Redshift, and their integration with SQL Server-based solutions. - Knowledge of SQL Server PolyBase for data integration and access from external data sources. - Experience in business intelligence solutions and creating reports using tools like Power BI or SSRS. - Familiarity with cloud data integration, ensuring seamless integration between on-premise SQL Server databases and cloud-based storage (AWS). - Strong problem-solving skills and the ability to troubleshoot and resolve issues in data models and data warehouses. - Excellent communication skills, both verbal and written, with the ability to effectively collaborate with cross-functional teams. 

Posted 3 weeks ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Position: Python Lead. Total Experience Required: 6+ years. Relevant Experience Required: around 5 years. Mandatory skills required: strong Python coding and development. Good-to-have skills: cloud, SQL, data analysis. Location: Pune – Kharadi – work from office, 3 days/week.

About The Role: We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities: - Lead the development of high-quality, scalable, and robust Python applications. - Collaborate with cross-functional teams to define, design, and ship new features. - Ensure the performance, quality, and responsiveness of applications. - Develop RESTful applications using frameworks like Flask, Django, or FastAPI. - Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions. - Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills: - Proven experience with cloud platforms (e.g., AWS). - Strong proficiency in Python, PySpark, R, and familiarity with additional programming languages such as C++, Rust, or Java. - Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL. - Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure). - Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.

Good to Have Skills: - Experience with modern data solutions via Azure. - Knowledge of principles summarized in the Microsoft Cloud Adoption Framework. - Additional expertise in SQL and data analysis.

Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!
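The RESTful Python work described above can be illustrated with a minimal FastAPI service; the endpoint, model, and metric names are placeholders, and the in-memory dictionary stands in for a real database or lakehouse query. Run with uvicorn, assuming fastapi and uvicorn are installed:

```python
# Minimal FastAPI service exposing a single read endpoint for named metrics.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="demo-data-service")

class Metric(BaseModel):
    name: str
    value: float

# Placeholder in-memory store standing in for a real data source.
METRICS = {"forecast_accuracy": 0.87, "on_time_delivery": 0.94}

@app.get("/metrics/{name}", response_model=Metric)
def read_metric(name: str) -> Metric:
    # Unknown metric names fall back to 0.0 in this toy example.
    return Metric(name=name, value=METRICS.get(name, 0.0))

# Start locally with: uvicorn app:app --reload
```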

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Key Responsibilities Key responsibilities would include: Own the Supply Chain Inventory Optimization Reporting and Analytics deliverables for regions or businesses Focus on bringing value to the Output data and improving the input / coverage Focused towards the big picture of Analytics and ensuring the project moves up the value chain Develop the engagement; focus on building the Framework to help SC teams take strategic decisions. Proactive reporting & analytics methodology. Build business models, research both internal and external sources, and extract key insights from data analysis to inform key focus areas and development priorities for the organization. Work with the HP Inc Supply Operations Strategy & Development Organization to support large projects across $57 billion HP Inc. business WW. Ability to build/trouble shoot reporting and analytics that provide actionable business insights, validate hypothesis, carry out enhancements based on business/stakeholder feedback. Collaborate with BI team, IT, regions and global partners to ensure data integrity and gather and analyze updated data for various finance supply chain operations specific analytics initiatives Other Responsibilities Developing creative solutions in an unstructured environment Work with different stakeholders in the PPS Supply Chain teams to continuously improve the level of reporting and analysis Analyze and Implement business logic in scorecards/ dashboards Maintain high quality reporting and analysis Help team meet Operational Excellence and Technical development standards and create Process maps, process docs for the engagement. Skills Requirement Technical Skills: In Depth knowledge and experience in : VBA, Excel, SQL Server, SSIS, Visual Basic.NET, Windows Server admin Strong competency in VBA programming in MS Excel Very Good with MS Office. Ability to represent data through XL/PowerPoint / PDF Software product development experience in front end technologies VB.NET. Experience with PowerShell. Database skills – maintenance, design and development like SQL language, SQL server, SSIS, OLAP Cubes etc.) Strong with Installing, configuring and supporting Microsoft technology solutions on windows 2003/2008 platform hosting web and database applications, while maintaining security, backups, monitoring and performing routine server maintenance. Other Skills: Supply chain domain knowledge in topics like – Supply and Demand Management, Inventory Management, Inventory Optimization Independent judgment to perform planning analysis, raise issues and concerns on the plan to propose action plans and solutions without coaching Experience in interacting with Senior Supply Chain business and function leaders and the planners. Successful project management expertise. Must be strong at multi-tasking and ideally have experience with virtual cross functional teams Process mapping , documentation (for process and training) Strong analytical skills, and very detail-oriented. Good English & Communication skills needed Proactive & Result driven approach very important (with focus to optimize workflows and “get things done”) Person needs to be a strong team player Person needs to be agile, quick learner, motivated & assertive. 
Ability to work flexible hours - method, organization & flexibility Intercultural & international awareness / experience Enthusiastic, motivated, positive, "CAN DO" attitude, Results focused Experience & Background 8-10 Years work experience in Supply Chain BI Solutions design and development Bachelors/Master’s degree in Engineering, Systems, Computer Applications (MCA) At least 4-5 years of experience in VBA, Excel, SQL Server, Visual Basic.NET, Windows Server admin Must have a proven record of delivering technical solutions (working with multiple data sources, desktop automation projects) HP is a proven leader in personal systems and printing, delivering innovations that empower people to create, interact, and inspire like never before. We leverage our strong financial position to extend our leadership in traditional markets and invest in exciting new technologies. HP has an impressive portfolio and strong innovation pipeline across areas such as blended reality technology, 3D printing, multi-function printing, Ink in the office, notebooks and mobile workstations. Roles And Responsibilities: This is an opportunity to join the Global Analytics organization of HP Inc and responsible for supply chain planning and analytics of Personal Systems business unit. Global Analytics hosts planning activities for WW Operations and provide analytics and operational expertise to drive fact-based strategic and tactical supply planning, manage and execute forecast improvement initiatives by leveraging advanced analytics through simulation, modelling, optimization techniques & predictive analytics. Key metrics include Forecast Accuracy and bias, days of inventory (DOI), and availability. The team work closely with various business units, regions, functions and manufacturing partners across multiple domains. Skills Requirement: Supply and Demand Planning process with focus on improving the forecast accuracy/availability with optimal inventory. Proven expertise in transforming the end to end value chain in a complex environment. Proficiency in SAP APO/IBP/Ariba or any other planning solutions is a must. Capability to design analytics solutions, simulation, modelling to address business problems. Exposure to advanced analytics/optimization techniques and experience in analytics tools like R and Python will be an added advantage. Ability to understand complex data structures and cloud based systems/services. Strong business acumen, a high degree of ownership and integrity, and a high attention to detail. Ability to build relationships with external entities with controls on timelines, cost and returns. Effectively and creatively tell stories and create visualizations to describe and communicate data insights. Skills on a visualization tools like Power BI/Tableau would be preferable. Academics And Experience: Bachelor's or Master's in Operations / Operations Research / Computer Science / Statistics / Mathematics 4 -8 years of work experience in Supply Chain Management/ Operations planning function in Hi Tech or related industry with significant hands-on experience in multitasking/ cross functional environment. About the engagement The Supply Chain Business Intelligence team within the Supply Chain Strategy and Development Organization is focused at simplifying how information moves back and forth across various supply chain functions through collaboration and business processes. 
It is about an ecosystem of infrastructure, management solutions and workflows enabling information to work better for SC teams across businesses and regions. The objective is to assess and understand the information and insights required, and how the reporting and analytics team can help accelerate business collaboration and increase efficiency. The Supply Chain BI Inventory Optimization Solution Lead will join the Supply Chain Analytics team within HP and work with HP’s businesses on business intelligence Inventory Optimization reporting and analytical solutions development. The candidate will apply subject matter knowledge to develop and execute business-critical reports, dashboards and scorecards, and will also be involved in supply chain solution building as an expert on the frontend GUI / backend database to enable critical supply chain analytics delivery. Drive supply chain planning related report development and execution - help HP businesses understand and generate dashboards, scorecards/reports, and drill-down/deep-dive analyses of data related to units, orders, revenue, shipments, backlog, forecast, etc. from various internal databases. Develop new solutions & capabilities in the supply chain domain based on white spaces & business requirements, working closely with various delivery team members.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

1 - 4 Lacs

Mumbai

Work from Office

Naukri logo

- SSA Architects is one of India's largest full-service architectural firms, based in Mumbai, India. For over twenty years, we have provided cutting-edge, world-class services. With a fast-expanding global presence and a 275+ team that is innovative, dynamic and young, we have become the one-stop solution provider of choice for dozens of clients. - Our core strengths are our integrated service offerings and multidisciplinary capabilities. From evolving that first out-of-the-box idea to tightening the last bolt, we offer the entire spectrum of architectural services, from Architecture and Interior Design right through to Project Management. - The role will allow you to hone and develop your professional, technical, and managerial skills, while providing immediate growth opportunities. The scope shall include, but not be limited to, the following: 1. Coordination with all project stakeholders, including the Client. 2. Liaison work with relevant authorities & Govt. departments for permissions, clearances from statutory authorities, No Objection Certificates (NOCs), etc. 3. Maintain the checklists required for various permissions/NOCs, approvals, etc.; track the status of correspondence with authorities and do the necessary follow-up. 4. Check layout plans and drawings required for obtaining various approvals, NOCs and other statutory permissions. 5. Attend meetings with project consultants, the client, statutory bodies, planning authorities and others as and when required. 6. Prepare meeting notes as and when required. 7. Assist in drafting letters, reply to queries raised at various levels of the approval process and guide the team in obtaining approvals/permissions. 8. Visit various statutory bodies, ward offices, planning authorities, etc. as and when required to obtain information and approvals. 9. Assist seniors in preparing presentations for project stakeholders, statutory bodies, etc. as and when required. 10. Assist seniors in project-related meetings/presentations. 11. Assist with or perform any other task/duty needed to complete the work assigned to SSA. 12. Ensure that the data compiled/prepared is in the format required by the client/project. 13. Assist in or participate in the compilation of data/drawings/documents required by the Client for various statutory permissions/NOCs as per project requirements. 14. All tasks required as per the scope of work, responsibility matrix and Client requirements for the project. Educational Qualification: BE Civil/B.Arch with a minimum of 5 to 7 years of working experience in liaison work. Other Requirements : - Good communication skills (verbal and written - English and Marathi) - Ability to independently draft letters / take and prepare meeting notes - Team player - Ability to multitask. This job opening was posted a long time back and may no longer be active, though it has not been removed by the recruiter. Please use your discretion.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

6 - 11 Lacs

Pune

Work from Office

Naukri logo

About The Role : We are seeking a highly skilled and experienced MSTR (MicroStrategy) Developer to join our Business Intelligence team. In this role, you will be responsible for the design, development, implementation, and maintenance of robust and scalable BI solutions using the MicroStrategy platform. Your primary focus will be on leveraging your deep understanding of MicroStrategy architecture and strong SQL skills to deliver insightful and actionable data to our stakeholders. This is an excellent opportunity to contribute to critical business decisions by providing high-quality BI solutions. Responsibilities - Design, develop, and deploy MicroStrategy objects including reports, dashboards, cubes (Intelligent Cubes, OLAP Cubes), documents, and visualizations. - Utilize various MicroStrategy features and functionalities such as Freeform SQL, Query Builder, MDX connectivity, and data blending. - Optimize MicroStrategy schema objects (attributes, facts, hierarchies) for performance and usability. - Implement security models within MicroStrategy, including user and group management, object-level security, and data-level security. - Perform performance tuning and optimization of MicroStrategy reports and dashboards. - Participate in the administration and maintenance of the MicroStrategy environment, including metadata management, project configuration, and user support. - Troubleshoot and resolve issues related to MicroStrategy reports, dashboards, and the overall platform. - Write complex and efficient SQL queries to extract, transform, and load data from various data sources. - Understand database schema design and data modeling principles. - Optimize SQL queries for performance within the MicroStrategy environment. - Work with different database platforms (e.g., Oracle, SQL Server, Teradata, Snowflake) and understand their specific SQL dialects. - Develop and maintain database views and stored procedures to support MicroStrategy development. - Collaborate with business analysts and end-users to understand their reporting and analytical requirements. - Translate business requirements into technical specifications for MicroStrategy development. - Participate in the design and prototyping of BI solutions. - Develop and execute unit tests and integration tests for MicroStrategy objects. - Participate in user acceptance testing (UAT) and provide support to end-users during the testing phase. - Ensure the accuracy and reliability of data presented in MicroStrategy reports and dashboards. - Create and maintain technical documentation for MicroStrategy solutions, including design documents, user guides, and deployment instructions. - Provide training and support to end-users on how to effectively use MicroStrategy reports and dashboards. - Adhere to MicroStrategy best practices and development standards. - Stay updated with the latest MicroStrategy features and functionalities. - Proactively identify opportunities to improve existing MicroStrategy solutions and processes. Required Skills and Expertise - Strong proficiency in MicroStrategy development (5+ years of hands-on experience is essential). This includes a deep understanding of the MicroStrategy architecture, object creation, report development, dashboard design, and administration. - Excellent SQL skills (5+ years of experience writing complex queries, optimizing performance, and working with various database systems). - Experience in data modeling and understanding of dimensional modeling concepts (e.g., star schema, snowflake schema). 
- Solid understanding of BI concepts, data warehousing principles, and ETL processes. - Experience in performance tuning and optimization of MicroStrategy reports and SQL queries. - Ability to gather and analyze business requirements and translate them into technical specifications. - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills, with the ability to work effectively with both technical and business stakeholders. - Experience with version control systems (e.g., Git). - Ability to work independently and as part of a team.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role : As a developer focused on SSAS and OLAP, you will play a critical role in our data warehousing and business intelligence initiatives. You will work closely with data engineers, business analysts, and other stakeholders to ensure the delivery of accurate and timely data insights. Your expertise in SSAS development, performance optimization, and data integration will be essential to your success. Responsibilities : - Design, develop, and maintain SQL Server Analysis Services (SSAS) models (multidimensional and tabular). - Create and manage OLAP cubes to support business intelligence reporting and analytics. - Implement best practices for data modeling and cube design. - Optimize the performance of SSAS solutions for efficient query processing and data retrieval. - Tune SSAS models and cubes to ensure optimal performance. - Identify and resolve performance bottlenecks. - Integrate data from various sources (relational databases, flat files, APIs) into SQL Server databases and SSAS models. - Develop and implement ETL (Extract, Transform, Load) processes for data integration. - Ensure data quality and consistency across integrated data sources. - Support the development of business intelligence reports and dashboards. - Collaborate with business analysts to understand reporting requirements and translate them into SSAS solutions. - Provide technical support and troubleshooting for SSAS-related issues. - Preferably have knowledge of AWS S3 and SQL Server PolyBase for data integration and cloud-based data warehousing. - Integrate data from AWS S3 into SSAS models using PolyBase or other appropriate methods. Required Skills & Qualifications : Experience : - 5-8 years of experience as a SQL Developer with a focus on SSAS and OLAP. - Proven experience in designing and developing multidimensional and tabular SSAS models. Technical Skills : - Strong expertise in SQL Server Analysis Services (SSAS) and OLAP cube development. - Proficiency in writing MDX and DAX queries. - Experience with data modeling and database design. - Strong understanding of ETL processes and data integration techniques. - Experience with SQL Server databases and related tools. - Preferably knowledge of AWS S3 and SQL Server PolyBase.
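
As a hedged illustration of the MDX and DAX skills this role asks for, the snippet below simply holds example queries as strings. The cube, table, and measure names follow the familiar Adventure Works samples and are assumptions, not details from the posting; running the MDX would require an ADOMD-style client against an SSAS instance.

```python
# Illustrative MDX (multidimensional) and DAX (tabular) snippets, kept as
# plain strings so this script runs anywhere.

mdx_sales_by_year = """
SELECT
  { [Measures].[Internet Sales Amount] } ON COLUMNS,
  NON EMPTY [Date].[Calendar Year].[Calendar Year].MEMBERS ON ROWS
FROM [Adventure Works]
"""

# A base measure plus a year-to-date variant built with DAX time intelligence.
dax_measures = """
Total Sales := SUM ( FactInternetSales[SalesAmount] )
Sales YTD   := TOTALYTD ( [Total Sales], 'Date'[Date] )
"""

print(mdx_sales_by_year)
print(dax_measures)
```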

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Key Responsibilities Key responsibilities would include: Own the Supply Chain Inventory Optimization Reporting and Analytics deliverables for regions or businesses Focus on bringing value to the Output data and improving the input / coverage Focused towards the big picture of Analytics and ensuring the project moves up the value chain Develop the engagement; focus on building the Framework to help SC teams take strategic decisions. Proactive reporting & analytics methodology. Build business models, research both internal and external sources, and extract key insights from data analysis to inform key focus areas and development priorities for the organization. Work with the HP Inc Supply Operations Strategy & Development Organization to support large projects across $57 billion HP Inc. business WW. Ability to build/trouble shoot reporting and analytics that provide actionable business insights, validate hypothesis, carry out enhancements based on business/stakeholder feedback. Collaborate with BI team, IT, regions and global partners to ensure data integrity and gather and analyze updated data for various finance supply chain operations specific analytics initiatives Other Responsibilities Developing creative solutions in an unstructured environment Work with different stakeholders in the PPS Supply Chain teams to continuously improve the level of reporting and analysis Analyze and Implement business logic in scorecards/ dashboards Maintain high quality reporting and analysis Help team meet Operational Excellence and Technical development standards and create Process maps, process docs for the engagement. Skills Requirement Technical Skills: In Depth knowledge and experience in : VBA, Excel, SQL Server, SSIS, Visual Basic.NET, Windows Server admin Strong competency in VBA programming in MS Excel Very Good with MS Office. Ability to represent data through XL/PowerPoint / PDF Software product development experience in front end technologies VB.NET. Experience with PowerShell. Database skills – maintenance, design and development like SQL language, SQL server, SSIS, OLAP Cubes etc.) Strong with Installing, configuring and supporting Microsoft technology solutions on windows 2003/2008 platform hosting web and database applications, while maintaining security, backups, monitoring and performing routine server maintenance. Other Skills: Supply chain domain knowledge in topics like – Supply and Demand Management, Inventory Management, Inventory Optimization Independent judgment to perform planning analysis, raise issues and concerns on the plan to propose action plans and solutions without coaching Experience in interacting with Senior Supply Chain business and function leaders and the planners. Successful project management expertise. Must be strong at multi-tasking and ideally have experience with virtual cross functional teams Process mapping , documentation (for process and training) Strong analytical skills, and very detail-oriented. Good English & Communication skills needed Proactive & Result driven approach very important (with focus to optimize workflows and “get things done”) Person needs to be a strong team player Person needs to be agile, quick learner, motivated & assertive. 
Ability to work flexible hours - method, organization & flexibility Intercultural & international awareness / experience Enthusiastic, motivated, positive, "CAN DO" attitude, Results focused Experience & Background 8-10 Years work experience in Supply Chain BI Solutions design and development Bachelors/Master’s degree in Engineering, Systems, Computer Applications (MCA) At least 4-5 years of experience in VBA, Excel, SQL Server, Visual Basic.NET, Windows Server admin Must have a proven record of delivering technical solutions (working with multiple data sources, desktop automation projects) HP is a proven leader in personal systems and printing, delivering innovations that empower people to create, interact, and inspire like never before. We leverage our strong financial position to extend our leadership in traditional markets and invest in exciting new technologies. HP has an impressive portfolio and strong innovation pipeline across areas such as blended reality technology, 3D printing, multi-function printing, Ink in the office, notebooks and mobile workstations. Roles And Responsibilities: This is an opportunity to join the Global Analytics organization of HP Inc and responsible for supply chain planning and analytics of Personal Systems business unit. Global Analytics hosts planning activities for WW Operations and provide analytics and operational expertise to drive fact-based strategic and tactical supply planning, manage and execute forecast improvement initiatives by leveraging advanced analytics through simulation, modelling, optimization techniques & predictive analytics. Key metrics include Forecast Accuracy and bias, days of inventory (DOI), and availability. The team work closely with various business units, regions, functions and manufacturing partners across multiple domains. Skills Requirement: Supply and Demand Planning process with focus on improving the forecast accuracy/availability with optimal inventory. Proven expertise in transforming the end to end value chain in a complex environment. Proficiency in SAP APO/IBP/Ariba or any other planning solutions is a must. Capability to design analytics solutions, simulation, modelling to address business problems. Exposure to advanced analytics/optimization techniques and experience in analytics tools like R and Python will be an added advantage. Ability to understand complex data structures and cloud based systems/services. Strong business acumen, a high degree of ownership and integrity, and a high attention to detail. Ability to build relationships with external entities with controls on timelines, cost and returns. Effectively and creatively tell stories and create visualizations to describe and communicate data insights. Skills on a visualization tools like Power BI/Tableau would be preferable. Academics And Experience: Bachelor's or Master's in Operations / Operations Research / Computer Science / Statistics / Mathematics 4 -8 years of work experience in Supply Chain Management/ Operations planning function in Hi Tech or related industry with significant hands-on experience in multitasking/ cross functional environment. About the engagement The Supply Chain Business Intelligence team within the Supply Chain Strategy and Development Organization is focused at simplifying how information moves back and forth across various supply chain functions through collaboration and business processes. 
It is about an ecosystem of infrastructure, management solutions and workflows enabling information to work better for SC teams across businesses and regions. The objective is to assess and understand the information and insights required, and how the reporting and analytics team can help accelerate business collaboration and increase efficiency. The Supply Chain BI Inventory Optimization Solution Lead will join the Supply Chain Analytics team within HP and work with HP’s businesses on business intelligence Inventory Optimization reporting and analytical solutions development. The candidate will apply subject matter knowledge to develop and execute business-critical reports, dashboards and scorecards, and will also be involved in supply chain solution building as an expert on the frontend GUI / backend database to enable critical supply chain analytics delivery. Drive supply chain planning related report development and execution - help HP businesses understand and generate dashboards, scorecards/reports, and drill-down/deep-dive analyses of data related to units, orders, revenue, shipments, backlog, forecast, etc. from various internal databases. Develop new solutions & capabilities in the supply chain domain based on white spaces & business requirements, working closely with various delivery team members.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About GSPANN GSPANN is a global IT services and consultancy provider headquartered in Milpitas, California (U.S.A.). With five delivery centers across the globe, GSPANN provides digital solutions that support the customer buying journeys of B2B and B2C brands worldwide. With a strong focus on innovation and client satisfaction, GSPANN delivers cutting-edge solutions that drive business success and operational excellence. GSPANN helps retail, finance, manufacturing, and high-technology brands deliver competitive customer experiences and increased revenues through our solution delivery, technologies, practices, and operations for each client. For more information, visit www.gspann.com JD for your reference: We are looking for a passionate Data Modeler to build, optimize and maintain conceptual and logical/physical database models. The candidate will turn data into information, information into insight and insight into business decisions. Job Position - Data Modeler Experience - 5+ years Location - Hyderabad, Gurugram Skills - Data Modeling, Data Analysis, Cloud and SQL Responsibilities: Design and develop conceptual, logical, and physical data models for databases, data warehouses, and data lakes. Translate business requirements into data structures that fit both OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) environments. Requirements: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. 3+ years of experience as a Data Modeler or in a related role. Proficiency in data modeling tools (Erwin, ER/Studio, SQL Developer Data Modeler). Strong experience with SQL and database technologies (Oracle, SQL Server, MySQL, PostgreSQL). Familiarity with ETL tools (Informatica, Talend, Apache NiFi) and data integration techniques. Knowledge of data warehousing concepts and data lake architecture. Understanding of Big Data technologies (Hadoop, Spark) is a plus. Experience with cloud platforms like AWS, GCP, or Azure. Why Choose GSPANN? At GSPANN, we don’t just serve our clients—we co-create. The GSPANNians are passionate technologists who thrive on solving the toughest business challenges, delivering trailblazing innovations for marquee clients. This collaborative spirit fuels a culture where every individual is encouraged to sharpen their skills, feed their curiosity, and take ownership to learn, experiment, and succeed. We believe in celebrating each other’s successes—big or small—and giving back to the communities we call home. If you’re ready to push boundaries and be part of a close-knit team that’s shaping the future of tech, we invite you to carry forward the baton of innovation with us. Let’s Co-Create the Future—Together. Discover Your Inner Technologist Explore and expand the boundaries of tech innovation without the fear of failure. Accelerate Your Learning Shape your career while scripting the future of tech. Seize the ample learning opportunities to grow at a rapid pace. Feel Included At GSPANN, everyone is welcome. Age, gender, culture, and nationality do not matter here, what matters is YOU. Inspire and Be Inspired When you work with the experts, you raise your game. At GSPANN, you’re in the company of marquee clients and extremely talented colleagues. Enjoy Life We love to celebrate milestones and victories, big or small. Ever so often, we come together as one large GSPANN family. Give Back Together, we serve communities.
We take steps, small and large, to do good for the environment, weaving sustainability and social change into our endeavors. We invite you to carry forward the baton of innovation in technology with us. Let’s Co-Create GSPANN | Consulting Services, Technology Services, and IT Services Provider GSPANN provides consulting, technology, and IT services to e-commerce, high-technology, manufacturing, and financial services businesses.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Introduction A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks. Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system design. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred Education Master's Degree Required Technical And Professional Expertise Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written and interpersonal skills for interacting at all levels of the organization. Ability to communicate complex business problems and technical solutions.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

Job Title: ======= Senior MS BI Developer Onsite Location: ============= Dubai, UAE Doha , Qatar Riyadh, Saudi Onsite Monthly Salary: ============== 10k AED - 15k AED - Full tax free salary , Depending on Experience Gulf work permit will be sponsored by our client Project duration: ============= 2 Years, Extendable Desired Experience Level Needed: =========================== 5 - 10 Years Qualification: ========== B.Tech / M.Tech / MCA / M.Sc or equivalent Experience Needed: =============== Over all: 5 or more Years of total IT experience Solid 3+ Years experience or MS - BI Developer with Microsoft Stack / MS - DWH Engineer Job Responsibilities: ================ - Design and develop DWH data flows - Able to build SCD -1 / SCD - 2 / SCD -3 dimensions - Build Cubes - Maintain SSAS / DWH data - Design Microsoft DWH & its ETL packages - Able to code T-SQL - Able to create Orchestrations - Able to design batch jobs / Orchestrations runs - Familiarity with data models - Able to develop MDM (Master Data Management) Experience: ================ - Experience as DWH developer with Microsoft DWH data flows and cubes - Exposure and experience with Azure services including Azure Data Factory - Sound knowledge of BI practices and visualization tools such as PowerBI / SSRS/ QlikView - Collecting / gathering data from various multiple source systems - Creating automated data pipelines - Configuring Azure resources and services Skills: ================ - Microsoft SSIS / SSAS / SSRS - Informatica - Azure Data Factory - Spark - SQL Nice to have: ========== - Any on site experience is added advantage, but not mandatory - Microsoft certifications are added advantage Business Vertical: ============== - Banking / Investment Banking - Capital Markets - Securities / Stock Market Trading - Bonds / Forex Trading - Credit Risk - Payments Cards Industry (VISA/ Master Card/ Amex) Job Code: ====== MSBI_DEVP_0525 No.of positions: ============ 05 Email: ===== spectrumconsulting1977@gmail.com if you are interested, please email your CV as ATTACHMENT with job ref. code [ MSBI_DEVP_0525 ] as subject
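
The posting above asks for building SCD-1/SCD-2/SCD-3 dimensions. As a hedged sketch of the Type 2 logic only, here is a small pandas version; in the actual Microsoft stack this would more likely be an SSIS package or a T-SQL MERGE, and the column names and sample data are invented.

```python
# Slowly Changing Dimension Type 2 sketch: expire the old version of changed
# rows and append a new current version.
from datetime import date
import pandas as pd

# Current dimension rows: one open (current) version per customer.
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city":        ["Pune", "Mumbai"],
    "valid_from":  [date(2023, 1, 1)] * 2,
    "valid_to":    [None, None],
    "is_current":  [True, True],
})

# Incoming snapshot from the source system.
incoming = pd.DataFrame({"customer_id": [1, 2], "city": ["Pune", "Dubai"]})
today = date(2025, 1, 1)

current = dim[dim["is_current"]].merge(incoming, on="customer_id", suffixes=("", "_new"))
changed = current[current["city"] != current["city_new"]]

# 1. Expire the old version of changed rows.
dim.loc[dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"],
        ["valid_to", "is_current"]] = [today, False]

# 2. Append a new current version carrying the changed attribute.
new_rows = changed.assign(city=changed["city_new"], valid_from=today,
                          valid_to=None, is_current=True)[dim.columns]
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim)
```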

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Job Title : Azure Synapse Developer Position Type : Permanent Experience : 5+ Years Location : Hyderabad (Work From Office / Hybrid) Shift Timings : 2 PM to 11 PM Mode of Interview : 3 rounds (Virtual/In-person) Notice Period : Immediate to 15 days Job Description : We are looking for an experienced Azure Synapse Developer to join our growing team. The ideal candidate should have a strong background in Azure Synapse Analytics, SSRS, and Azure Data Factory (ADF), with a solid understanding of data modeling, data movement, and integration. As an Azure Synapse Developer, you will work closely with cross-functional teams to design, implement, and manage data pipelines, ensuring the smooth flow of data across platforms. The candidate must have a deep understanding of SQL and ETL processes, and ideally, some exposure to Power BI for reporting and dashboard creation. Key Responsibilities : - Develop and maintain Azure Synapse Analytics solutions, ensuring scalability, security, and performance. - Design and implement data models for efficient storage and retrieval of data in Azure Synapse. - Utilize Azure Data Factory (ADF) for ETL processes, orchestrating data movement, and integrating data from various sources. - Leverage SSIS/SSRS/SSAS to build, deploy, and maintain data integration and reporting solutions. - Write and optimize SQL queries for data manipulation, extraction, and reporting. - Collaborate with business analysts and other stakeholders to understand reporting needs and create actionable insights. - Perform performance tuning on SQL queries, pipelines, and Synapse workloads to ensure high performance. - Provide support for troubleshooting and resolving data integration and performance issues. - Assist in setting up automated data processes and create reusable templates for data integration. - Stay updated on Azure Synapse features and tools, recommending improvements to the data platform as appropriate. Required Skills & Qualifications : - 5+ years of experience as a Data Engineer or Azure Synapse Developer. - Strong proficiency in Azure Synapse Analytics (Data Warehouse, Data Lake, and Analytics). - Solid understanding and experience in data modeling for large-scale data architectures. - Expertise in SQL for writing complex queries, optimizing performance, and managing large datasets. - Hands-on experience with Azure Data Factory (ADF) for data integration, ETL processes, and pipeline creation. - SSRS (SQL Server Reporting Services) and SSIS (SQL Server Integration Services) expertise. - Power BI knowledge (basic to intermediate) for reporting and data visualization. - Familiarity with SSAS (SQL Server Analysis Services) and OLAP concepts is a plus. - Experience in troubleshooting and optimizing complex data processing tasks. - Strong communication and collaboration skills to work effectively in a team-oriented environment. - Ability to quickly adapt to new tools and technologies in the Azure ecosystem.
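
For the data-movement side of this role, a common pattern is a watermark-based incremental load. The sketch below is an assumption-laden illustration, not the employer's pipeline: server, database, table, and column names are invented, and in Azure Data Factory the same idea would usually be expressed as a Lookup plus Copy activity rather than a script.

```python
# Hypothetical incremental-load step: pull only rows changed since the last
# stored watermark, then advance the watermark.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=staging;"
    "UID=etl_user;PWD=<secret>"
)
cur = conn.cursor()

# 1. Read the last successful watermark for this table.
cur.execute("SELECT last_loaded_at FROM etl.watermarks WHERE table_name = ?",
            "sales_orders")
last_loaded_at = cur.fetchone()[0]

# 2. Extract only new or changed rows from the source.
cur.execute(
    "SELECT order_id, customer_id, order_amount, modified_at "
    "FROM dbo.sales_orders WHERE modified_at > ?",
    last_loaded_at,
)
rows = cur.fetchall()  # ...load these into the target here...

# 3. Advance the watermark once the load succeeds.
cur.execute(
    "UPDATE etl.watermarks SET last_loaded_at = SYSUTCDATETIME() WHERE table_name = ?",
    "sales_orders",
)
conn.commit()
```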

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

We have a great opportunity for the role of Data Modeler for OLTP and OLAP systems. Relevant Exp : 4+ Yrs Hands-on data modelling for OLTP and OLAP systems In-depth knowledge of conceptual, logical and physical data modelling Strong understanding of indexing, partitioning and data sharding, with practical experience of having implemented them Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction Should have working experience on at least one data modelling tool, preferably DBSchema or Erwin Good understanding of GCP databases like AlloyDB, CloudSQL and BigQuery Functional knowledge of the mutual fund industry will be a plus Should be willing to work from Chennai; office presence is mandatory - Immediate joiners to 15 days notice preferred - Job location - Bangalore and Chennai Ask me about our referral award of INR 50000 Check global opportunities at iitjobs.com
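
Since the posting stresses indexing and near-real-time query performance, here is a tiny, self-contained illustration of why an index on the filter column matters. SQLite is used only so it runs anywhere; the table, columns, and data are made up, and partitioning/sharding are not shown.

```python
# Compare the query plan for the same aggregate query before and after
# adding an index on the filter column.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, fund_code TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO trades (fund_code, amount) VALUES (?, ?)",
    [(f"FUND{i % 50:03d}", i * 1.5) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM trades WHERE fund_code = 'FUND007'"

# Without an index: the plan shows a full table scan.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# With an index: the plan switches to an index search.
cur.execute("CREATE INDEX idx_trades_fund_code ON trades (fund_code)")
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```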

Posted 4 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

About Company : Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, is listed on NASDAQ, and operates in over 60 countries, serving clients across various industries including financial services, healthcare, manufacturing, retail, and telecommunications. The company has consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, in cities including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida. Job Title : Java Backend Developer (Contract to Hire) Experience : 6 to 10 Years Job Location : Bengaluru / Chennai Interview Mode : Face-to-Face Interview Date : 24th May 2025 Venue : [Will be shared shortly] Notice Period : Immediate joiners. Standard Job Requirements 6+ years of experience in application development using Java and advanced technologies and tools Strong understanding of fundamental architecture and design principles, object-orientation principles, and coding standards Ability to design and build smart, scalable, and resilient solutions with tight deadlines, both high- and low-level Strong analytical and problem-solving skills Strong verbal and written communication skills Good knowledge of DevOps and CI-CD Understanding of source control, versioning, branching, etc. Experienced in Agile methodology and Waterfall models Strong experience in application delivery, including production support Very good presentation and documentation skills Ability to learn and adapt to new technologies and frameworks Awareness of release management Strong team player who can collaborate effectively with relevant stakeholders Recommend future technology capabilities and architecture design considering business objectives, technology strategy, trends and regulatory requirements. Technical Competence Must Have Strong programming and hands-on skills in Java 8 or above (preferably Java 17) Good hands-on experience with Java Collections and Streams Good hands-on knowledge of data structures and algorithms Good experience in developing vulnerability-free Spring Framework applications Good knowledge of Spring DI/Blueprints, Spring Boot, etc. Good knowledge of design patterns and principles Good knowledge of OR frameworks like Hibernate, JPA, etc. Good knowledge of API building (Web Services, SOAP/REST) Good knowledge of unit testing and code coverage using JUnit/Mockito Good knowledge of code quality tools like SonarQube, security scans, etc. Good knowledge of containerized platforms like Kubernetes, OpenShift, EKS (AWS) Good knowledge of Enterprise Application Integration patterns (synchronous, asynchronous) Good knowledge of multi-threading and multi-processing implementations Experience in RDBMS (Oracle, PostgreSQL, MySQL) Knowledge of SQL queries Ability to work in a quick-paced, dynamic environment adapting agile methodologies Ability to work with minimal guidance and/or high-level design input Knowledge of Microservices-based development and implementation Knowledge of CI-CD patterns with related tools like Azure DevOps, GIT, Bitbucket, etc.
Knowledge of JSON libraries like Jackson/GSON Knowledge of basic Unix commands Possess good documentation and presentation skills Able to articulate ideas, designs, and suggestions Mentoring fellow team members, conducting code reviews Good to Have Hands-on skills in J2EE specifications like JAX-RS, JAX-WS Experience working with and supporting OLTP and OLAP systems Good knowledge of Spring Batch, Spring Security Good knowledge of the Linux operating system (preferably RHEL) Good knowledge of NoSQL offerings (Cassandra, MongoDB, GraphDB, etc.) Knowledge of testing methodologies like performance testing, smoke testing, stress testing, endurance testing, etc. Knowledge of Python, Groovy Knowledge of middleware technologies like Kafka, Solace, etc. Knowledge of DSE DataStax or Neo4j Knowledge of cloud environments (AWS / Azure, etc.) Knowledge of IMDG (Hazelcast, Ignite) Knowledge of rule engines like Drools, OpenL Tablets, Easy Rules, etc. Experience in presenting solutions to architecture forums and following the principles and standards in implementation Domain: Good to Have Experience in application development for Client Due Diligence (CDD), Onboarding, FATCA & CRS, AML, KYC, and Screening Good knowledge of cloud-native application development and of cloud computing services Training, Qualifications and Certifications Training/qualifications and certifications in some of the functional and/or technical domains mentioned will be an added advantage

Posted 4 weeks ago

Apply

5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Title: MSBI + PowerBI Developer Work Location: Bangalore Experience: 5+ Years Notice Period: Immediate Mandatory Skills: T-SQL - Should be strong in writing SQL queries Should be well versed in SQL concepts like indexes and temp tables Strong in Power BI Service features such as subscriptions and report access configuration Should be able to answer scenario-based questions 5+ years of total IT experience, with a minimum of 3+ years in Power BI Technical Skills Understand business requirements in the MSBI context and design data models to transform raw data into meaningful insights. Awareness of star and snowflake schemas in DWH and DWH concepts Should be familiar and experienced with T-SQL Good knowledge of and experience in SSIS (ETL) Creation of dashboards & visual interactive reports using Power BI Extensive experience in both the Power BI Service & Power BI on-premises environments Create relationships between data sources and develop data models accordingly Experience implementing Tabular models and row-level data security Experience in writing and optimizing DAX queries Experience in Power Automate flows Performance tuning and optimization of Power BI reports Good understanding of data warehouse concepts Knowledge of Microsoft Azure analytics is a plus. General Skills ∙ Good interpersonal skills and ability to manage multiple tasks with enthusiasm ∙ Interact with clients to understand the requirements ∙ Up-to-date knowledge of the best practices and advancements in Power BI ∙ Should have an analytical and problem-solving mindset and approach Technical Skills ∙ Understand business requirements in the BI context and design data models to transform raw data into meaningful insights ∙ Good knowledge of all variants of Power BI (Power BI Embedded, Power BI Service, Power BI Report Server) ∙ Strong SQL skills and SQL performance tuning ∙ Provide expertise in data modeling and database design and provide recommendations ∙ Suggest best practices in implementing data models, ETL packages, OLAP cubes, and reports ∙ Experience working with both DirectQuery and import modes ∙ Expertise in implementing static & dynamic row-level security ∙ Knowledge of integrating Power BI reports into external web applications ∙ Should have experience setting up data gateways & data preparation ∙ Creation of dashboards & visual interactive reports using Power BI ∙ Experience working with third-party custom visuals like Zebra BI, etc. ∙ Create relationships between data sources and develop data models accordingly ∙ Good knowledge of the various DAX functions and ability to write complex DAX queries ∙ Awareness of star and snowflake schemas in DWH and DWH concepts ∙ Knowledge of Tabular models ∙ Familiarity with creating T-SQL objects, scripts, views, and stored procedures
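
The role above asks for static and dynamic row-level security (RLS). As a rough analogy only, the pandas sketch below shows the idea of filtering rows per user; in Power BI itself, RLS is defined as a DAX filter on a role (for example, a static rule such as [region] = "North"), and the user names, regions, and mapping here are invented.

```python
# Row-level security analogy: a per-user rule decides which rows a viewer sees.
import pandas as pd

sales = pd.DataFrame({
    "region": ["North", "South", "North", "West"],
    "amount": [120, 80, 200, 150],
})

# Security mapping: which regions each user is allowed to see.
user_regions = {
    "asha@example.com": ["North"],
    "ravi@example.com": ["South", "West"],
}

def rows_visible_to(user: str) -> pd.DataFrame:
    """Return only the rows the given user is authorised to view."""
    allowed = user_regions.get(user, [])
    return sales[sales["region"].isin(allowed)]

print(rows_visible_to("asha@example.com"))
```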

Posted 4 weeks ago

Apply

Exploring OLAP Jobs in India

With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.

Average Salary Range

The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.

Career Path

Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.

Related Skills

In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
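
To make the SQL, data-modeling, and data-warehousing skills above concrete, here is a minimal, self-contained star-schema sketch: one fact table, one dimension, and the kind of aggregate query an OLAP report would issue. SQLite is used only so the example runs anywhere, and the schema and data are invented for illustration.

```python
# Tiny star schema (fact + dimension) with a typical OLAP-style rollup query.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT, product_name TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, sale_date TEXT, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Bikes', 'Road-150'), (2, 'Accessories', 'Helmet');
    INSERT INTO fact_sales VALUES (1, '2024-01-05', 3500), (1, '2024-02-11', 3500), (2, '2024-01-09', 35);
""")

# Measure aggregated by a dimension attribute: total sales per category.
for row in cur.execute("""
    SELECT p.category, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY total_sales DESC
"""):
    print(row)
```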

Interview Questions

  • What is OLAP and how does it differ from OLTP? (basic)
  • Explain the difference between a star schema and a snowflake schema. (medium)
  • How do you optimize OLAP queries for performance? (advanced)
  • What is the role of aggregation functions in OLAP databases? (basic)
  • Can you explain the concept of drill-down in OLAP? (medium) (see the pandas sketch after this list)
  • How do you handle slowly changing dimensions in OLAP databases? (advanced)
  • What are the advantages of using a multidimensional database over a relational database for OLAP purposes? (medium)
  • Describe your experience with OLAP tools such as Microsoft Analysis Services or Oracle OLAP. (basic)
  • How do you ensure data consistency in an OLAP environment? (medium)
  • What are some common challenges faced when working with OLAP databases? (advanced)
  • Explain the concept of data cubes in OLAP. (basic)
  • How do you approach designing a data warehouse for OLAP purposes? (medium)
  • Can you discuss the importance of indexing in OLAP databases? (advanced)
  • How do you handle missing or incomplete data in OLAP analysis? (medium)
  • What are the key components of an OLAP system architecture? (basic)
  • How do you troubleshoot performance issues in OLAP queries? (advanced)
  • Have you worked with real-time OLAP systems? If so, can you explain the challenges involved? (medium)
  • What are the limitations of OLAP compared to other data analysis techniques? (advanced)
  • How do you ensure data security in an OLAP environment? (medium)
  • Have you implemented any data mining algorithms in OLAP systems? If so, can you provide an example? (advanced)
  • How do you approach designing dimensions and measures in an OLAP cube? (medium)
  • What are some best practices for OLAP database design? (advanced)
  • How do you handle concurrent user access in an OLAP environment? (medium)
  • Can you explain the concept of data slicing and dicing in OLAP analysis? (basic) (see the pandas sketch after this list)
  • What are your thoughts on the future of OLAP technologies in the era of big data and AI? (advanced)
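
As a hedged illustration of three of the concepts asked about above (slicing, dicing, and drill-down), the sketch below uses a plain pandas DataFrame as a stand-in for an OLAP cube. The columns and values are made up; a real cube would precompute these aggregations.

```python
# Slice, dice, and drill-down demonstrated on a small DataFrame.
import pandas as pd

sales = pd.DataFrame({
    "year":    [2023, 2023, 2023, 2024, 2024, 2024],
    "quarter": ["Q1", "Q1", "Q2", "Q1", "Q2", "Q2"],
    "region":  ["North", "South", "North", "North", "South", "South"],
    "amount":  [100, 80, 120, 110, 90, 95],
})

# Slice: fix one dimension to a single value (region = 'North').
north_slice = sales[sales["region"] == "North"]

# Dice: restrict several dimensions at once (a sub-cube).
diced = sales[(sales["region"] == "North") & (sales["year"] == 2024)]

# Drill-down: move from a coarse level (year) to a finer one (year + quarter).
by_year = sales.groupby("year")["amount"].sum()
by_year_quarter = sales.groupby(["year", "quarter"])["amount"].sum()

print(by_year)
print(by_year_quarter)
```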

Closing Remark

As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies