
458 OLAP Jobs - Page 14

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

Career Category: Information Systems

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Own development of complex ETL/ELT data pipelines to process large-scale datasets
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
- Explore and implement new tools and technologies to enhance the ETL platform and the performance of the pipelines
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
- Learn the biotech/pharma domain and build highly efficient data pipelines to migrate and deploy complex data across systems
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions

Must-Have Skills:
- Experience in Data Engineering with a focus on Databricks, AWS, Python, SQL, and Scaled Agile methodologies
- Strong understanding of data processing and transformation with big data frameworks (Databricks, Apache Spark, Delta Lake, and distributed computing concepts); a small sketch follows below
- Strong, demonstrable understanding of AWS services
- Ability to quickly learn, adapt and apply new technologies
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills
- Experience with the Scaled Agile Framework (SAFe), Agile delivery, and DevOps practices

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry
- Exposure to APIs and full-stack development
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications:
- Any degree and 2-5 years of experience
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly and be organized and detail-oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
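For readers unfamiliar with the Databricks/Delta Lake stack named above, here is a minimal sketch of the kind of ETL/ELT step such a role covers. The paths, table, and column names are hypothetical, and the Delta write assumes an environment (such as Databricks) where Delta Lake is configured.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ELT sketch: read raw data, apply quality checks, write a curated
# Delta table. Paths and names are illustrative, not any employer's schema.
spark = SparkSession.builder.appName("curate-events").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical path

curated = (
    raw
    .filter(F.col("event_id").isNotNull())        # basic integrity check
    .dropDuplicates(["event_id"])                  # enforce uniqueness
    .withColumn("ingested_at", F.current_timestamp())
)

# Delta Lake provides ACID writes and time travel on top of the data lake.
curated.write.format("delta").mode("append").saveAsTable("curated.events")
```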

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Pune

Work from Office

Source: Naukri

What's the role all about?
As a BI Developer, you'll be a key contributor to developing Reports in a multi-region, multi-tenant SaaS product. You'll collaborate with the core R&D team to build high-performance Reports that serve the use cases of several applications in the suite.

How will you make an impact?
- Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
- Ensure that architectural concepts are consistently implemented across the product.
- Act as a product expert within R&D, understanding the product's requirements and its market positioning.
- Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery.
- Design and build Reports for given requirements.
- Create design documents and test cases for the reports.
- Develop SQL to address ad-hoc report requirements and conduct analyses.
- Create visualizations and reports as per the requirements.
- Execute unit, functional, and performance testing and document the results.
- Conduct peer reviews and ensure quality is met at all stages.

Have you got what it takes?
- Bachelor's/Master's of Engineering degree in Computer Science, Electronic Engineering or equivalent from a reputed institute
- 2-4 years of BI report development experience
- Expertise in SQL and cloud-based databases; able to work with any DB to write SQL for any business need
- Experience in BI tools like Tableau, Power BI, MicroStrategy, etc.
- Experience working in an enterprise data warehouse / data lake system
- Strong knowledge of analytical databases and schemas
- Development experience building solutions that leverage SQL and NoSQL databases; experience/knowledge of Snowflake an advantage
- In-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (Extract, Transform, Load) framework
- Experience in functional testing, performance testing, etc.
- Experience with public cloud infrastructure and technologies such as AWS/Azure/GCP
- Experience with Continuous Integration and Delivery practices using industry-standard tools such as Jenkins
- Experience working in an Agile development environment and using work-item management tools like JIRA

What's in it for you? Enjoy NICE-FLEX!

Reporting into: Tech Manager
Role Type: Individual Contributor
About NICE

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 18 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Source: Naukri

Responsibilities:
• Develop, deploy, and manage OLAP cubes and tabular models.
• Collaborate with data teams to design and implement effective data solutions.
• Troubleshoot and resolve issues related to SSAS and data models.
• Monitor system performance and optimize queries for efficiency.
• Implement data security measures and backup procedures.
• Stay updated with the latest SSAS and BI technologies and best practices.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 7+ years of experience working with SSAS (SQL Server Analysis Services).
• Strong understanding of data warehousing, ETL processes, OLAP concepts and data modelling concepts.
• Proficiency in the SQL, MDX, and DAX query languages (illustrated below).
• Experience with data visualization tools like Power BI.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration abilities.
• Experience with Agile ways of working.

Skills:
• SSAS (SQL Server Analysis Services)
• SQL
• MDX/DAX
• Data Warehousing
• ETL Processes
• Performance Tuning
• Data Analysis
• Data Security
• Data Modelling
• Plus: knowledge of Power BI or another reporting tool
• Plus: prior experience working for ING
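To make the MDX/DAX requirement concrete, here is an illustrative pair of queries against a hypothetical "Sales" SSAS database, one multidimensional (MDX) and one tabular (DAX). They are shown as Python strings only, since executing them requires an ADOMD/XMLA client; the cube, dimension, and measure names are invented.

```python
# Illustrative MDX against a hypothetical multidimensional "Sales" cube:
# sales amount by calendar year, filtered to one product category.
mdx_query = """
SELECT [Measures].[Sales Amount] ON COLUMNS,
       NON EMPTY [Date].[Calendar Year].MEMBERS ON ROWS
FROM [Sales]
WHERE ([Product].[Category].&[Bikes])
"""

# Equivalent-in-spirit DAX for a tabular model: yearly sales for one category.
dax_query = """
EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Calendar Year],
    FILTER('Product', 'Product'[Category] = "Bikes"),
    "Sales Amount", SUM(Sales[Amount])
)
"""

print(mdx_query)
print(dax_query)
```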

Posted 3 weeks ago

Apply

0.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

" FERM T enables eCommerce brands to transform clicks into conversions with highly-personalized , 1:1 dynamic shopping experiences. Weve raised $30M+ to date and are backed by Bain Capital Ventures, Greylock, QED, and other top angels and commerce investors. Located in SF, Austin, NYC, and Bangalore, were looking to expand our 70+ person team to build the future of eCommerce! After announcing our $17M Series A, FERM T is one of the fastest growing companies at this stage in the US. FERM T is the leading AI-native funnel management platform built for e-commerce marketers. We empower brands to create and manage delightful customer experiences across multiple channels in minutes. Our platform helps businesses transform their digital presence through intelligent, data-driven funnel creation that strengthens customer acquisition and drives measurable results. With FERM T, e-commerce teams can rapidly built, test and iterate on their customer journey while maintaining brand consistency across every touchpoint. About the Role: As a Senior Software Engineer on our Data Platform team, youll have a transformative impact on FERM Ts ability to deliver powerful data insights that drive business decisions. Youll architect, build, and scale our data infrastructure at a critical inflection point in our growth journey, as we expand our agency accelerator program and onboard larger enterprise clients. Your expertise will power both customer-facing analytics and internal reporting capabilities that form the backbone of our decision-making process. This role sits at the intersection of data engineering and business impact - youll work closely with teams across the organization to understand their data needs and translate them into robust, scalable data pipelines and OLAP solutions. Youll have significant autonomy to shape our data architecture, implement best practices for data governance, and mentor other engineers as we build a world-class data platform. If youre energized by transforming complex data challenges into elegant solutions that drive real business outcomes, this is an exceptional opportunity to leave your mark on a rapidly growing company. Responsibilities: Own the data platform at Fermat, powering key customer-facing and internal dashboards Build and maintain the OLAP stack and data pipelines that support reporting products Take responsibility for data QA, testing, and debugging, answering critical questions Influence company-wide architecture and technology decisions, setting trends Tackle challenging technical problems and expand your engineering skillset. 
Lead mission-critical projects, delivering end-to-end data ingestion capabilities with high quality Collaborate closely with Sales, Product Management, and Operations teams Handle customer escalations and resolve data-related issues Mentor and guide other engineers on the team, fostering growth and development Review and provide feedback on technical specifications Requirements: Proficient experience building software products, with a focus on data platforms Proven experience architecting and developing robust data platforms Expertise in dbt labs and writing complex data transformations in SQL and other programming languages Strong knowledge of data warehousing concepts, including building custom ETL integrations, snapshots, indexing, and partitioning Excellent cross-functional collaborator, able to explain technical concepts to non-technical partners Startup experience (with companies Strong written and verbal communication skills, with the ability to discuss and debate strategic engineering and product decisions Track record of building scalable, error-tolerant, and easily debuggable products Confident in making informed technology choices and advocating for the right tools for the job Driven to deliver secure, well-tested, and high-performing features and improvements Thrive in a product-focused startup environment, passionate about enhancing customer experience. Knowledge of Data Security and Governance highly desired Tech stack: Golang Fivetran BigQuery Nextjs React Typescript Postgres Google cloud GraphQL on Hasura

Posted 3 weeks ago

Apply

5.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking an experienced SQL Developer with expertise in SQL Server Analysis Services (SSAS) and AWS to join our growing team. The successful candidate will be responsible for designing, developing, and maintaining SQL Server-based OLAP cubes and SSAS models for business intelligence purposes. You will work with multiple data sources, ensuring data integration, optimization, and performance of the reporting models. This role offers an exciting opportunity to work in a hybrid work environment, collaborate with cross-functional teams, and

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

Pune

Work from Office

Source: Naukri

We are seeking a highly skilled and experienced MSTR (MicroStrategy) Developer to join our Business Intelligence team. In this role, you will be responsible for the design, development, implementation, and maintenance of robust and scalable BI solutions using the MicroStrategy platform. Your primary focus will be on leveraging your deep understanding of MicroStrategy architecture and strong SQL skills to deliver insightful and actionable data to our stakeholders. This is an excellent opportunity to contribute to critical business decisions by providing high-quality BI solutions.

Responsibilities:
- Design, develop, and deploy MicroStrategy objects including reports, dashboards, cubes (Intelligent Cubes, OLAP Cubes), documents, and visualizations.
- Utilize various MicroStrategy features and functionalities such as Freeform SQL, Query Builder, MDX connectivity, and data blending.
- Optimize MicroStrategy schema objects (attributes, facts, hierarchies) for performance and usability.
- Implement security models within MicroStrategy, including user and group management, object-level security, and data-level security.
- Perform performance tuning and optimization of MicroStrategy reports and dashboards.
- Participate in the administration and maintenance of the MicroStrategy environment, including metadata management, project configuration, and user support.
- Troubleshoot and resolve issues related to MicroStrategy reports, dashboards, and the overall platform.
- Write complex and efficient SQL queries to extract, transform, and load data from various data sources.
- Understand database schema design and data modeling principles.
- Optimize SQL queries for performance within the MicroStrategy environment.
- Work with different database platforms (e.g., Oracle, SQL Server, Teradata, Snowflake) and understand their specific SQL dialects.
- Develop and maintain database views and stored procedures to support MicroStrategy development.
- Collaborate with business analysts and end-users to understand their reporting and analytical requirements.
- Translate business requirements into technical specifications for MicroStrategy development.
- Participate in the design and prototyping of BI solutions.
- Develop and execute unit tests and integration tests for MicroStrategy objects.
- Participate in user acceptance testing (UAT) and provide support to end-users during the testing phase.
- Ensure the accuracy and reliability of data presented in MicroStrategy reports and dashboards.
- Create and maintain technical documentation for MicroStrategy solutions, including design documents, user guides, and deployment instructions.
- Provide training and support to end-users on how to effectively use MicroStrategy reports and dashboards.
- Adhere to MicroStrategy best practices and development standards.
- Stay updated with the latest MicroStrategy features and functionalities.
- Proactively identify opportunities to improve existing MicroStrategy solutions and processes.

Required Skills and Expertise:
- Strong proficiency in MicroStrategy development (5+ years of hands-on experience is essential), including a deep understanding of the MicroStrategy architecture, object creation, report development, dashboard design, and administration.
- Excellent SQL skills (5+ years of experience writing complex queries, optimizing performance, and working with various database systems).
- Experience in data modeling and understanding of dimensional modeling concepts such as star and snowflake schemas (see the example below).
- Solid understanding of BI concepts, data warehousing principles, and ETL processes.
- Experience in performance tuning and optimization of MicroStrategy reports and SQL queries.
- Ability to gather and analyze business requirements and translate them into technical specifications.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills, with the ability to work effectively with both technical and business stakeholders.
- Experience with version control systems (e.g., Git).
- Ability to work independently and as part of a team.
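As a concrete illustration of the dimensional-modeling skill above, here is a self-contained star-schema rollup: one fact table joined to two dimensions. The schema and data are invented; a BI tool like MicroStrategy would generate similar SQL from its attribute and fact definitions. SQLite is used only so the sketch runs anywhere.

```python
import sqlite3

# Tiny star schema: fact_sales plus date and product dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    date_id INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL
);
INSERT INTO dim_date VALUES (1, 2024), (2, 2025);
INSERT INTO dim_product VALUES (10, 'Bikes'), (11, 'Helmets');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (2, 10, 150.0), (2, 11, 40.0);
""")

# Typical BI rollup: revenue by year and category.
for row in con.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
"""):
    print(row)
```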

Posted 3 weeks ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

As a SaaS Developer focused on SSAS and OLAP, you will play a critical role in our data warehousing and business intelligence initiatives. You will work closely with data engineers, business analysts, and other stakeholders to ensure the delivery of accurate and timely data insights. Your expertise in SSAS development, performance optimization, and data integration will be essential to your success.

Responsibilities:
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models (multidimensional and tabular).
- Create and manage OLAP cubes to support business intelligence reporting and analytics.
- Implement best practices for data modeling and cube design.
- Optimize the performance of SSAS solutions for efficient query processing and data retrieval.
- Tune SSAS models and cubes to ensure optimal performance.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources (relational databases, flat files, APIs) into SQL Server databases and SSAS models.
- Develop and implement ETL (Extract, Transform, Load) processes for data integration.
- Ensure data quality and consistency across integrated data sources.
- Support the development of business intelligence reports and dashboards.
- Collaborate with business analysts to understand reporting requirements and translate them into SSAS solutions.
- Provide technical support and troubleshooting for SSAS-related issues.
- Preferably have knowledge of AWS S3 and SQL Server PolyBase for data integration and cloud-based data warehousing.
- Integrate data from AWS S3 into SSAS models using PolyBase or other appropriate methods.

Required Skills & Qualifications:

Experience:
- 5-8 years of experience as a SQL Developer with a focus on SSAS and OLAP.
- Proven experience in designing and developing multidimensional and tabular SSAS models.

Technical Skills:
- Strong expertise in SQL Server Analysis Services (SSAS) and OLAP cube development.
- Proficiency in writing MDX and DAX queries.
- Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Experience with SQL Server databases and related tools.
- Preferably knowledge of AWS S3 and SQL Server PolyBase.

Posted 3 weeks ago

Apply

20.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description: Over the past 20 years Amazon has earned the trust of over 300 million customers worldwide by providing unprecedented convenience, selection and value on Amazon.com. By deploying Amazon Pay's products and services, merchants make it easy for these millions of customers to safely purchase from their third-party sites using the information already stored in their Amazon account. In this role, you will lead Data Engineering efforts to drive automation for the Amazon Pay organization. You will be part of the data engineering team that will envision, build and deliver high-performance, fault-tolerant data pipelines. As a Data Engineer, you will be working with cross-functional partners from Science, Product, SDEs, Operations and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions.

Key job responsibilities
- Design, implement, and support a platform providing ad-hoc access to large data sets
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
- Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, Redshift, and OLAP technologies
- Model data and metadata for ad-hoc and pre-built reporting
- Interface with business customers, gathering requirements and delivering complete reporting solutions
- Build robust and scalable data integration (ETL) pipelines using SQL, Python and Spark
- Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers

Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, Datastage, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2992057

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Your contributions to the organisation's growth:
- Maintain and develop data platforms based on Microsoft Fabric for Business Intelligence and Databricks for real-time data analytics (a streaming sketch follows below).
- Design, implement and maintain standardized production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud.
- Develop an enterprise-scale cloud-based Data Lake for business intelligence solutions.
- Translate business and customer needs into data collection, preparation and processing requirements.
- Optimize the performance of algorithms developed by Data Scientists.
- General administration and monitoring of the data platforms.

Competencies:
- Working with structured and unstructured data.
- Experienced in various database technologies (RDBMS, OLAP, time series, etc.).
- Solid programming skills (Python, SQL; Scala is a plus).
- Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark).
- Proficient in Power BI.
- Experienced working with APIs.
- Proficient in security best practices.
- Data-centered Azure know-how is a plus (Storage, Networking, Security, Billing).

Expertise you bring along:
- Bachelor's or Master's degree in business informatics, computer science, or equivalent.
- A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable.
- Extensive experience in handling large data sets.
- At least 5 years of experience working as a data engineer, preferably in an industrial company.
- Analytical problem-solving skills and the ability to assimilate complex information.
- Programming experience in modern data-oriented languages (SQL, Python).
- Experience with Apache Spark and DevOps.
- Proven ability to synthesize complex data, and advanced technical skills in data modelling, data mining, database design and performance tuning.
- English language proficiency.

Special requirements:
- High-quality mindset paired with strong customer orientation, critical thinking, and attention to detail.
- Understanding of data processing at scale.
- Influence without authority.
- Willingness to acquire additional system/technical knowledge as needed.
- Problem solver.
- Experience working in an international organization and in multi-cultural teams.
- Proactive, creative and innovative.
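To make the real-time analytics requirement concrete, here is a minimal Spark Structured Streaming sketch: Kafka in, Delta table out. Broker, topic, and paths are hypothetical, and it assumes a Spark environment (Databricks or Fabric) where the Kafka connector and Delta Lake are available.

```python
from pyspark.sql import SparkSession, functions as F

# Sketch of a streaming ingest path: Kafka -> parsed columns -> Delta table.
spark = SparkSession.builder.appName("rt-ingest").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "machine-telemetry")          # hypothetical topic
    .load()
)

# Kafka delivers key/value as bytes; cast to strings for downstream parsing.
parsed = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

# Checkpointing gives exactly-once recovery on restart.
(parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/lake/_checkpoints/telemetry")
    .outputMode("append")
    .start("/lake/telemetry"))
```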

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

At NICE, we don’t limit our challenges. We challenge our limits. Always. We’re ambitious. We’re game changers. And we play to win. We set the highest standards and execute beyond them. And if you’re like us, we can offer you the ultimate career opportunity that will light a fire within you.

What’s the role all about?
As a BI Developer, you’ll be a key contributor to developing Reports in a multi-region, multi-tenant SaaS product. You’ll collaborate with the core R&D team to build high-performance Reports that serve the use cases of several applications in the suite.

How will you make an impact?
- Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
- Ensure that architectural concepts are consistently implemented across the product.
- Act as a product expert within R&D, understanding the product’s requirements and its market positioning.
- Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery.
- Design and build Reports for given requirements.
- Create design documents and test cases for the reports.
- Develop SQL to address ad-hoc report requirements and conduct analyses.
- Create visualizations and reports as per the requirements.
- Execute unit, functional, and performance testing and document the results.
- Conduct peer reviews and ensure quality is met at all stages.

Have you got what it takes?
- Bachelor's/Master's of Engineering degree in Computer Science, Electronic Engineering or equivalent from a reputed institute
- 2-4 years of BI report development experience
- Expertise in SQL and cloud-based databases; able to work with any DB to write SQL for any business need
- Experience in BI tools like Tableau, Power BI, MicroStrategy, etc.
- Experience working in an enterprise data warehouse / data lake system
- Strong knowledge of analytical databases and schemas
- Development experience building solutions that leverage SQL and NoSQL databases; experience/knowledge of Snowflake an advantage
- In-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (Extract, Transform, Load) framework
- Experience in functional testing, performance testing, etc.
- Experience with public cloud infrastructure and technologies such as AWS/Azure/GCP
- Experience with Continuous Integration and Delivery practices using industry-standard tools such as Jenkins
- Experience working in an Agile development environment and using work-item management tools like JIRA

What’s in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Tech Manager
Role Type: Individual Contributor

About NICE
NICE Ltd. (NASDAQ: NICE) software products are used by 25,000+ global businesses, including 85 of the Fortune 100 corporations, to deliver extraordinary customer experiences, fight financial crime and ensure public safety. Every day, NICE software manages more than 120 million customer interactions and monitors 3+ billion financial transactions. Known as an innovation powerhouse that excels in AI, cloud and digital, NICE is consistently recognized as the market leader in its domains, with over 8,500 employees across 30+ countries. NICE is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, age, sex, marital status, ancestry, neurotype, physical or mental disability, veteran status, gender identity, sexual orientation or any other category protected by law.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Title: Data Modeler
Experience: 5+ Years
Location: Hyderabad (WFO)

Roles and Responsibilities:
- Experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeler, working with data analysts, data architects and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in Azure, Databricks, data warehousing, and ERWIN is required, as is a supply chain background.
- Strong knowledge of data modelling principles and techniques (e.g., ERD, UML).
- Proficiency with data modelling tools (e.g., ER/Studio, Erwin, IBM Data Architect).
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Solid understanding of data warehousing, ETL processes, and data integration.
- Able to create and maintain Source-to-Target Mapping (STTM) documents, bus matrix documents, etc.
- Hands-on experience in OLTP and OLAP database modelling (see the sketch after this list).

Additional:
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a collaborative, fast-paced environment.
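For readers new to the OLTP/OLAP distinction this role centers on, here is a minimal sketch of the same orders domain modelled both ways: normalized for transactional integrity, then reshaped into a star schema for analytics. The domain and column names are invented; SQLite is used only so the DDL runs anywhere.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- OLTP side: normalized, one row per business event, FKs enforce integrity.
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    ordered_at TEXT NOT NULL
);

-- OLAP side: denormalized star schema fed from OLTP via ETL; a
-- source-to-target mapping (STTM) document would record each column's lineage.
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_orders (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    order_date TEXT,
    order_count INTEGER
);
""")
print("OLTP and OLAP schemas created")
```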

Posted 3 weeks ago

Apply

2.0 - 4.0 years

6 - 11 Lacs

Pune

Hybrid

Source: Naukri

What’s the role all about?
As a BI Developer, you’ll be a key contributor to developing Reports in a multi-region, multi-tenant SaaS product. You’ll collaborate with the core R&D team to build high-performance Reports that serve the use cases of several applications in the suite.

How will you make an impact?
- Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
- Ensure that architectural concepts are consistently implemented across the product.
- Act as a product expert within R&D, understanding the product’s requirements and its market positioning.
- Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery.
- Design and build Reports for given requirements.
- Create design documents and test cases for the reports.
- Develop SQL to address ad-hoc report requirements and conduct analyses.
- Create visualizations and reports as per the requirements.
- Execute unit, functional, and performance testing and document the results.
- Conduct peer reviews and ensure quality is met at all stages.

Have you got what it takes?
- Bachelor's/Master's of Engineering degree in Computer Science, Electronic Engineering or equivalent from a reputed institute
- 2-4 years of BI report development experience
- Expertise in SQL and cloud-based databases; able to work with any DB to write SQL for any business need
- Experience in BI tools like Tableau, Power BI, MicroStrategy, etc.
- Experience working in an enterprise data warehouse / data lake system
- Strong knowledge of analytical databases and schemas
- Development experience building solutions that leverage SQL and NoSQL databases; experience/knowledge of Snowflake an advantage
- In-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (Extract, Transform, Load) framework
- Experience in functional testing, performance testing, etc.
- Experience with public cloud infrastructure and technologies such as AWS/Azure/GCP
- Experience with Continuous Integration and Delivery practices using industry-standard tools such as Jenkins
- Experience working in an Agile development environment and using work-item management tools like JIRA

What’s in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

TCS HIRING!!
Role: AWS Data Architect
Location: Hyderabad
Years of experience: 8+ years

Data Architect - Must have:
- Relational SQL / caching expertise: deep knowledge of Amazon Aurora PostgreSQL, ElastiCache, etc.
- Data modeling: experience in OLTP and OLAP schemas, normalization, denormalization, indexing, and partitioning.
- Schema design & migration: defining best practices for schema evolution when migrating from SQL Server to PostgreSQL.
- Data governance: designing data lifecycle policies, archival strategies, and regulatory compliance frameworks.
- AWS Glue & AWS DMS: leading data migration strategies to Aurora PostgreSQL (a small orchestration sketch follows below).
- ETL & data pipelines: expertise in Extract, Transform, Load (ETL) workflows, Glue job features, and event-driven architectures.
- Data transformation & mapping: PostgreSQL PL/pgSQL migration/transformation expertise while ensuring data integrity.
- Cross-platform data integration: connecting cloud and on-premises / other cloud data sources.
- AWS data services: strong experience in S3, Glue, Lambda, Redshift, Athena, and Kinesis.
- Infrastructure as Code (IaC): using Terraform, CloudFormation, or AWS CDK for database provisioning.
- Security & compliance: implementing IAM, encryption (AWS KMS), access control policies, and compliance frameworks (e.g., GDPR, PII).
- Query tuning & indexing strategies: optimizing queries for high performance.
- Capacity planning & scaling: ensuring high availability, failover mechanisms, and auto-scaling strategies.
- Data partitioning & storage optimization: designing cost-efficient hot/cold data storage policies.
- Experience setting up AWS architecture as per project requirements.

Good to have:
- Data warehousing: expertise in Amazon Redshift, Snowflake, or BigQuery.
- Big data processing: familiarity with Apache Spark, EMR, Hadoop, or Kinesis.
- Data lakes & analytics: experience in AWS Lake Formation, Glue Catalog, and Athena.
- Machine learning pipelines: understanding of SageMaker, Bedrock, etc. for AI-driven analytics.
- CI/CD for data pipelines: knowledge of AWS CodePipeline, Jenkins, or GitHub Actions.
- Serverless data architectures: experience with event-driven systems (SNS, SQS, Step Functions).
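As a concrete taste of the Glue orchestration named above, here is a minimal boto3 sketch that kicks off a Glue ETL job and checks its status. The job name, argument keys, and region are hypothetical; the Glue job and the necessary IAM permissions are assumed to already exist.

```python
import boto3

# Drive an existing AWS Glue ETL job from Python.
glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(
    JobName="orders-to-aurora",                  # hypothetical Glue job
    Arguments={"--target_schema": "reporting"},  # passed to the job script
)

# Poll once for the run's state; a real pipeline would loop or use events.
status = glue.get_job_run(JobName="orders-to-aurora", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])  # e.g. RUNNING, SUCCEEDED, FAILED
```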

Posted 3 weeks ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Position: DAML Head - Solution Architect / Technical Delivery
Experience: 15+ Years
Location: Noida

Job Summary: The DAML Head - Solution Architect / Technical Delivery will be responsible for leading the design and delivery of advanced data management and analytics solutions. This role involves overseeing the creation of modern data warehouses, business intelligence systems, and cutting-edge analytics platforms, with a strong emphasis on AI/ML and Generative AI technologies. The ideal candidate will possess significant experience in Big Data, program management, and senior-level stakeholder engagement.

Key Responsibilities
- Architectural Leadership: Lead the architectural design and development of multi-tenant modern data warehouses, business intelligence systems, and analytics platforms, including AI/ML and Generative AI components. Ensure the platform's security, data isolation, quality, integrity, extensibility, adaptability, scalability, availability, and understandability.
- Big Data and AI/ML Delivery: Oversee the delivery of Big Data and AI/ML projects, ensuring alignment with architectural standards and business objectives.
- Solution Development: Architect scalable, performance-oriented solutions using Big Data technologies and traditional ETL tools, incorporating AI/ML and Generative AI technologies where applicable. Manage logical and physical data models for data warehousing (DW) and OLAP systems.
- Technology Evaluation: Lead the evaluation and selection of technology products to achieve strategic business intelligence and data management goals, including AI/ML and Generative AI technologies.
- Stakeholder Engagement: Facilitate high-level discussions with stakeholders, including CXOs and tech leaders within customer and AWS ecosystems, to refine software requirements and provide guidance on technical components, frameworks, and interfaces.
- Program and People Management: Demonstrate strong program management skills, overseeing multiple projects and ensuring timely delivery. Manage and mentor team members, including junior developers and team leads, ensuring adherence to best practices and fostering professional growth.
- Documentation and Communication: Develop and maintain comprehensive technical design documentation related to data warehouse architecture and systems. Communicate effectively with cross-functional teams to resolve issues, manage changes in scope, and ensure successful project execution.
- Infrastructure Planning: Assess data volumes and customer reporting SLAs, and provide recommendations for infrastructure sizing and orchestration solutions.

Skill Requirements
- Experience: Minimum of 15 years in data management and analytics roles, with substantial experience in Big Data solutions architecture. At least 10 years of experience in Big Data delivery roles, including hands-on experience with AI/ML and Generative AI technologies.
- Technical Expertise: Proficiency with Hadoop distributions (e.g., Hortonworks, Cloudera) and related technologies (e.g., Kafka, Spark, Cloud Data Flow, Pig, Hive, Sqoop, Oozie). Experience with RDBMS (e.g., MySQL, Oracle), NoSQL databases, and ETL/ELT tools (e.g., Informatica PowerCenter, Sqoop). Experience with large-scale cluster installation and deployment.
- Analytical and Management Skills: Strong analytical and problem-solving skills, with the ability to develop multiple solution options. Proven program management capabilities, with a track record of managing complex projects and leading cross-functional teams.
- Knowledge: Deep understanding of Data Engineering, Data Management, Data Science, and AI/ML principles, including Generative AI. Familiarity with design and architectural patterns, as well as cloud-based deployment models. Knowledge of Big Data security concepts and tools, including Kerberos, Ranger, and Knox.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Experience: 5+ Years
Location: Gurgaon (Hybrid)
Budget: 15-18 LPA

Roles and Responsibilities
- Formulate automated reports and dashboards using Power BI and other reporting tools.
- Understand business requirements to set functional specifications for reporting applications.
- Be familiar with the tools and systems on the MS SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Exhibit a foundational understanding of database concepts such as relational database architecture, multidimensional database design, and more.
- Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI.
- Develop technical specifications from business needs, and choose a deadline for work completion.
- Make charts and data documentation that includes descriptions of the techniques, parameters, models, and relationships.
- Use Power BI Desktop to create dashboards, KPI scorecards, and visual reports.
- Establish row-level security on data and understand Power BI's application security layer models.
- Examine, comprehend, and study business needs as they relate to business intelligence.
- Design and map data models to transform raw data into insightful information.
- Create dynamic and eye-catching dashboards and reports using Power BI.
- Make necessary tactical and technological adjustments to enhance current business intelligence systems.
- Integrate data, alter data, and connect to data sources for business intelligence.

Requirements and Skills
- Extremely good communication skills, to effectively explain requirements between internal teams and client teams.
- Exceptional analytical thinking skills for converting data into illuminating dashboards and reports.
- BS in computer science or information systems, along with work experience in a related field.
- Knowledge of data warehousing, data gateways, and data preparation projects.
- Working knowledge of the Power BI, SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence Stack.
- Articulating, representing, and analyzing solutions with the team while documenting, creating, and modeling them.
- Familiarity with the tools and technologies used by the Microsoft SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Knowledge of executing DAX queries on Power BI Desktop.
- Comprehensive understanding of data modeling, administration, and visualization.
- Capacity to perform in an atmosphere where agility and continual development are prioritized.
- Detailed knowledge and understanding of database management systems, OLAP, and the ETL (Extract, Transform, Load) framework.
- Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI).
- Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS).

NOTE: Staffing & Recruitment Companies are advised not to contact us.

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

About Rippling:
Rippling gives businesses one place to run HR, IT, and Finance. It brings together all of the workforce systems that are normally scattered across a company, like payroll, expenses, benefits, and computers. For the first time ever, you can manage and automate every part of the employee lifecycle in a single system. Take onboarding, for example. With Rippling, you can hire a new employee anywhere in the world and set up their payroll, corporate card, computer, benefits, and even third-party apps like Slack and Microsoft 365—all within 90 seconds. Based in San Francisco, CA, Rippling has raised $1.2B from the world’s top investors—including Kleiner Perkins, Founders Fund, Sequoia, Greenoaks, and Bedrock—and was named one of America's best startup employers by Forbes. We prioritize candidate safety. Please be aware that official communication will only be sent from @Rippling.com addresses.

About the Role:
The Data Platform team builds the blocks that other teams at Rippling use to create advanced HR applications at lightning-fast speed. At the core of our technological aspirations lies this team, a group dedicated to pushing the boundaries of what's possible with data. We architect high-performance, scalable systems that power the next generation of data products, ranging from reports, analytics, customizable workflows, and search to many new products and capabilities that help customers manage and get unprecedented value from their business data. This is a unique opportunity to work on both product and platform layers at the same time. We obsess over the scalability and extensibility of platform solutions, ensuring that solutions will meet the needs across the breadth of Rippling's product suite, along with the applications of tomorrow. You won't just be crafting features; you'll be shaping the future of business data management.

What You'll Do:
- Develop high-quality software with attention to detail using tech stacks like Python, MongoDB, CDC, and Kafka (see the consumer sketch below)
- Leverage big data technologies like Apache Presto, Apache Pinot, Flink, and Airflow
- Build the OLAP stack and data pipelines in support of Reporting products
- Build custom programming languages within the Rippling Platform
- Create data platforms, data lakes, and data ingestion systems that work at scale
- Lead mission-critical projects and deliver data ingestion capabilities end-to-end with high quality
- Have clear ownership of one or many products, APIs, or platform spaces
- Build and grow your engineering skills in different challenging areas and solve hard technical problems
- Influence architecture, technology selections, and trends of the whole company

Qualifications:
- 7+ years of experience in software development, preferably in fast-paced, dynamic environments.
- Solid understanding of CS fundamentals, architectures, and design patterns.
- Proven track record in building large-scale applications, APIs, and developer tools.
- Excellent at cross-functional collaboration, able to articulate technical concepts to non-technical partners.
- You thrive in a product-focused environment and are passionate about making an impact on customer experience.

Bonus Points: contributions to open-source projects (Apache Iceberg, Parquet, Spark, Hive, Flink, Delta Lake, Presto, Trino, Avro)
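For readers unfamiliar with the Python/Kafka/CDC combination listed above, here is a minimal consumer sketch using the kafka-python package. The topic, brokers, and event payload shape are hypothetical; a real pipeline would route these change events into the downstream OLAP store (e.g. Pinot) rather than print them.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Consume change-data-capture (CDC) events from a hypothetical topic.
consumer = KafkaConsumer(
    "employees.cdc",                        # hypothetical CDC topic
    bootstrap_servers=["localhost:9092"],
    group_id="olap-ingest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    change = message.value
    # In a real pipeline: route inserts/updates/deletes to the OLAP store.
    print(change.get("op"), change.get("after"))
```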

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
Skill: Data Modeler

Key Responsibilities
- Hands-on data modelling for OLTP and OLAP systems
- In-depth knowledge of conceptual, logical and physical data modelling
- Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same (a BigQuery illustration follows below)
- Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction
- Working experience with at least one data modelling tool, preferably DBSchema or Erwin
- Good understanding of GCP databases like AlloyDB, CloudSQL and BigQuery
- Functional knowledge of the mutual fund industry will be a plus
- Should be willing to work from Chennai; office presence is mandatory

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 21,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
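To make the partitioning requirement concrete on the GCP stack named above, here is a small sketch using the google-cloud-bigquery client to create a date-partitioned, clustered table. The dataset, table, and columns are invented (a mutual-fund NAV history, matching the domain hint above), and it assumes a GCP project with credentials already configured.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

# Partitioning prunes scans for date-bounded queries; clustering co-locates
# rows for per-fund lookups. Both choices are core physical-modelling calls.
client.query("""
CREATE TABLE IF NOT EXISTS funds.nav_history (
    fund_id STRING,
    nav NUMERIC,
    as_of_date DATE
)
PARTITION BY as_of_date
CLUSTER BY fund_id
""").result()
print("partitioned table created")
```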

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Key Accountabilities
JOB DESCRIPTION
The Azure Data Support Engineer focuses on data-related tasks in Azure:
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs.
- Monitor real-time and batch processes to ensure data accuracy.
- Monitor Azure pipelines and troubleshoot where required.
- Enhance existing pipelines and Databricks notebooks as and when required.
- Be involved in the development stages of new pipelines as and when required.
- Troubleshoot pipelines and real-time replication jobs, ensuring minimum data lag.
- Be available to work on a shift basis to cover monitoring during weekends (one weekend out of three).
- Act as an ambassador for DP World at all times when working: promoting and demonstrating positive behaviours in harmony with DP World's principles, values and culture; ensuring the highest level of safety is applied in all activities; understanding and following DP World's Code of Conduct and Ethics policies.
- Perform other related duties as assigned.

JOB CONTEXT
- Responsible for monitoring and enhancing existing data pipelines using the Microsoft stack, and for enhancement of existing data platforms.
- Experience with cloud platforms such as Azure, AWS, Google Cloud, etc.
- Experience with Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Azure Databricks and Azure SQL Data Warehouse.
- Good understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, driver and worker nodes, stages, executors and tasks.
- Good understanding of Big Data Hadoop and YARN architecture, along with the various Hadoop daemons such as JobTracker, TaskTracker, NameNode, DataNode, Resource/Cluster Manager, and Kafka (distributed stream processing).
- Experience in database design and development with Business Intelligence using SQL Server 2014/2016, Integration Services (SSIS), DTS packages, SQL Server Analysis Services (SSAS), DAX, OLAP cubes, star schema and snowflake schema.
- Monitoring of pipelines in ADF and experience with Azure SQL, Blob storage, and Azure SQL Data Warehouse.
- Experience in a support environment working with real-time data replication will be a plus.

Qualification
QUALIFICATIONS, EXPERIENCE AND SKILLS
- Bachelor's/Master's in Computer Science/IT or equivalent.
- Azure certifications will be an added advantage (certification in AZ-900 and/or AZ-204, AZ-303, AZ-304 or AZ-400, DP-200 & DP-201). ITIL certification a plus.
- Experience: 5-8 years.

Must-Have Skills
- Azure Data Lake, Data Factory, Azure Databricks
- Azure SQL Database, Azure SQL Data Warehouse
- Hadoop ecosystem
- Azure analytics services
- Programming: Python, R, Spark SQL

Good-to-Have Skills
- MSBI (SSIS, SSAS, SSRS), Oracle, SQL, PL/SQL
- Data visualization, Power BI
- Data migration

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Information
Date Opened: 05/08/2025
Job Type: Full time
Industry: Software Product
Location: Chennai, Tamil Nadu, India (600017)

Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando's Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest-growing technology companies by Deloitte.

Role
As the Senior Lead for AI and Data Warehouse at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights.

Key Responsibilities
- Lead the development of scalable, high-performance data pipelines using PySpark or other big data ETL pipeline technologies.
- Drive data modeling efforts for analytics, dashboards, and knowledge graphs.
- Oversee the implementation of parquet-based data lakes (see the sketch below).
- Work on OLAP databases, ensuring optimal data structures for reporting and querying.
- Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries.
- Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs.
- Mentor and lead a team of engineers, building out the data and AI services organization.

Requirements
- 8-10 years of experience in big data and AI technologies, with expertise in PySpark or similar big data ETL pipeline technologies.
- Strong proficiency in SQL and OLAP database technologies.
- First-hand experience with data modeling for analytics, dashboards, and knowledge graphs.
- Proven experience with parquet-based data lake implementations.
- Expertise in building highly scalable, high-volume data pipelines.
- Experience with modular, reusable, low-code-based implementations.
- Involvement in large-scale enterprise big data implementations.
- Initiative-taker with strong motivation and the ability to lead a growing team.

Preferred
- Experience leading a team or building out a new department.
- Experience with cloud-based data platforms and AI services.
- Familiarity with supply chain technology or fulfilment platforms is a plus.

Join us at Pando and lead the transformation of our AI and data services, delivering innovative solutions for global enterprises!

Locations: Chennai, India | Posted on: 05/08/2025
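As a small illustration of the parquet-based data-lake pattern named above, here is a PySpark sketch that lands events partitioned by date so downstream OLAP-style queries can prune directories. Paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Write fulfillment-style events into a partitioned parquet lake.
spark = SparkSession.builder.appName("lake-writer").getOrCreate()

events = spark.read.json("s3a://example-lake/raw/shipments/")  # hypothetical

(events
    .withColumn("event_date", F.to_date("event_ts"))  # derive partition key
    .repartition("event_date")                        # one writer per date
    .write.mode("append")
    .partitionBy("event_date")                        # directory-level pruning
    .parquet("s3a://example-lake/curated/shipments/"))
```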

Posted 3 weeks ago

Apply

4.0 - 9.0 years

10 - 12 Lacs

Pune

Work from Office

Source: Naukri

A Snapshot of Your Day
Your role as a Senior JEDOX Developer is to work daily with global business users who submit tickets via SharePoint or a mailbox. You will coordinate with the appropriate IT development and middleware teams to find a solution that meets the agreed Operational Level Agreement and fix issues within the agreed Service Level Agreement. Besides that, you will take part in the monthly closing process, where you will coordinate with end users regarding the data entered in the system and verify the same. You will also join the sprint development meetings to understand and keep up with ongoing developments. Work closely with collaborators and senior management, expand your network, and prepare yourself for future global roles at Siemens Energy.

Your opportunities for personal growth:
- Collaborate with people from different countries, cultures, and backgrounds.
- Work without supervision.
- Work innovatively.

How You'll Make an Impact / Responsibilities
- Lead the design, development, and implementation of data pipelines and ETL workflows.
- Manage and optimize workflows to ensure reliable data processing and job scheduling.
- Design and implement data solutions in the database.
- Be creative and proactive with report design and development using little to no documented requirements.
- Collaborate with cross-functional teams to gather requirements and translate them into scalable data architecture and process designs, fostering a culture of continuous improvement and innovation.
- Ensure data quality and integrity by implementing standard processes in data governance and validation.
- Monitor performance, troubleshoot issues, and optimize data systems for efficiency and scalability.
- Stay abreast of industry trends and emerging technologies to ensure continuous improvement of data engineering practices.

What You Bring / Skills, Capabilities
- An experienced (6+ years) IT professional with a graduation in Engineering or another equivalent qualification (MCA).
- 4+ years of relevant work experience in developing and maintaining ETL workflows.
- 4+ years of relevant work experience in data analytics and reporting tools like Power BI, Tableau, SAC.
- 4+ years of relevant work experience in Snowflake or any cloud database, with proven knowledge of writing complex SQL queries.
- Good to have: experience working with EPM tools like JEDOX, ANAPLAN, TM1.
- Good to have: experience with multidimensional database concepts like OLAP, cubes, dimensions, etc.
- Good to have: experience developing Power Automate workflows.
- Good to have: experience with Excel formulas like PIVOT and VLOOKUP.
- Ability to learn new software and technologies quickly and adapt to an ambitious, fast-paced environment.
- Experience collaborating directly with business users and relevant collaborators.

About the Team: Who is Siemens Energy?
At Siemens Energy, we are more than just an energy technology company. We meet the growing energy demand across 90+ countries while ensuring our climate is protected. With more than 94,000 dedicated employees, we not only generate electricity for over 16% of the global community, but we're also using our technology to help protect people and the environment. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation.
Discover the ways you can contribute to Siemens Energy.

Posted 3 weeks ago

Apply

2.0 - 3.0 years

6 - 10 Lacs

Vadodara

Work from Office

Naukri logo

Job Description

When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have the most seamless experience possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to achieve it by giving brands and retailers the visibility they need to make that belief a reality.

Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data that they can use to their advantage, while being uniquely positioned to do this across both online and in-store channels.

We are looking for a lead-level software engineer to lead a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.

What You Will Do

  • Think like our customers: work with product and engineering leaders to define data solutions that support customers' business practices.
  • Design, develop, and extend our data pipeline services and architecture to implement your solutions; you will collaborate on some of the most important and complex parts of our system, which form the foundation for the business value our organization provides.
  • Foster team growth: provide mentorship to junior team members and share expertise with those on other teams.
  • Improve the quality of our solutions: help build enduring trust within our organization and among our customers by ensuring high quality standards for the data we manage.
  • Own your work: take responsibility for shepherding your projects from idea through delivery into production.
  • Bring new ideas to the table: some of our best innovations originate within the team.

Technologies We Use

  • Languages: SQL, Python
  • Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka
  • Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
  • Others: Tableau (as a business intelligence solution)

Qualifications

  • Bachelor's/Master's degree in Computer Science or a relevant technical degree
  • 10+ years of professional software engineering experience
  • Strong proficiency with data languages such as Python and SQL
  • Strong proficiency with data processing technologies such as Spark, Flink, and Airflow
  • Strong proficiency with RDBMS/NoSQL/big data solutions (Postgres, MongoDB, Snowflake, etc.)
  • Solid understanding of streaming solutions such as Kafka, Pulsar, and Kinesis/Firehose
  • Solid understanding of Docker and Kubernetes
  • Solid understanding of ETL/ELT and OLTP/OLAP concepts
  • Solid understanding of columnar/row-oriented data formats (e.g., Parquet, ORC, Avro)
  • Solid understanding of Apache Iceberg or other open table formats
  • Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
  • Solid understanding of AWS, Linux, and infrastructure concepts
  • Proven ability to diagnose and address data abnormalities in systems
  • Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
  • Experience building data warehouses using conformed dimensional models
  • Experience building data lakes and/or leveraging data lake solutions (e.g., Trino, Dremio, Druid)
  • Experience working with business intelligence solutions (e.g., Tableau)
  • Experience working with ML/agentic AI pipelines (e.g., LangChain, LlamaIndex)
  • Understanding of Domain-Driven Design concepts and the accompanying microservice architecture
  • Passion for data, analytics, or machine learning
  • Focus on value: shipping software that matters to the company and the customer

Bonus Points

  • Experience working with vector databases
  • Experience working within a retail or ecommerce environment
  • Proficiency in other programming languages such as Scala, Java, or Golang
  • Experience working with Apache Arrow and/or other in-memory columnar data technologies

Supervisory Responsibility

Provide mentorship to team members on adopted patterns and best practices. Organize and lead agile ceremonies such as daily stand-ups, planning, etc.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Pune

Work from Office

Naukri logo

What's the role all about?

As a Senior BI Developer, you'll be a key contributor to developing reports in a multi-region, multi-tenant SaaS product. You'll collaborate with the core R&D team to build high-performance reports that serve the use cases of several applications in the suite.

How will you make an impact?

  • Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
  • Ensure that architectural concepts are consistently implemented across the product.
  • Act as a product expert within R&D, understanding the product's requirements and its market positioning.
  • Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery.

Have you got what it takes?

  • Bachelor's/Master's degree in Computer Science, Electronic Engineering, or equivalent from a reputed institute
  • 4-7 years of BI report development experience
  • Expertise in SQL and any cloud-based database; ability to work with any database and write SQL for any business need
  • Expertise in BI tools such as Tableau, Power BI, or MicroStrategy
  • Experience working with enterprise data warehouse/data lake systems
  • Strong knowledge of analytical databases and schemas
  • Development experience building solutions that leverage SQL and NoSQL databases; experience with or knowledge of Snowflake is an advantage
  • In-depth understanding of database management systems, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks
  • Experience with functional testing, performance testing, etc.
  • Experience with performance test script generation tools such as JMeter or Gatling
  • Experience automating the testing process for E2E and regression cases
  • Experience with Java/web services is an added advantage
  • Experience with public cloud infrastructure and technologies such as AWS, Azure, or GCP

What's in it for you? Enjoy NICE-FLEX!

Requisition ID: 6632
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description

  • Total 6+ years of experience
  • 3 years of experience with leading automation tools and white-box testing (Java APIs), e.g. JUnit
  • 2 years of software development in Java 2EE
  • Experience with other automation tools, e.g. Selenium, Mercury tools, or a self-created test-harness tool
  • 4-year college degree in Computer Science or a related field, i.e. BE or MCA
  • Good understanding of XML, XSL/XSLT, RDBMS, and Unix platforms
  • Experience with multidimensional (OLAP) technology, data warehousing, and financial software is desirable
  • Motivation to learn leading-edge technology and test complex software

Career Level - IC3

Responsibilities

The responsibilities mirror the qualifications listed above.

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description

Design, develop, troubleshoot, and debug software programs for databases, applications, tools, networks, etc. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with developing, designing, and debugging software applications or operating systems. Work is non-routine and very complex, involving the application of advanced technical/business skills in the area of specialization. You will be a leading contributor individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to the functional area; 7 years of software engineering or related experience.

Overview of Product – Oracle Analytics

Be part of an energetic and challenging team building an enterprise analytics platform that allows users to quickly gain insights on their most valuable asset: data. Oracle Analytics is an industry-leading product that empowers entire organizations with a full range of business analytics tools, enterprise-ready reporting, and engaging, easy-to-use self-service data visualizations. Our customers are business users who demand a software product that allows easy, fast navigation through the full spectrum of data scale, from simple spreadsheets to enormous volumes of information in enterprise-class data warehouses.

Oracle Analytics is a comprehensive solution that meets the breadth of all analytics needs: get the right data to the right people at the right time, with analytics for everyone in your organization. With built-in security and governance, you can easily share insights and collaborate with your colleagues. By leveraging the cloud, you can scale up or down to suit your needs. The Oracle Analytics Cloud offering is a leading cloud service at Oracle built on Oracle Cloud Infrastructure. It runs on a Generation 2 offering and provides consistent high performance and unmatched governance and security controls.

Self-service analytics drives business agility with faster time to insights: you no longer need help from IT to access, prepare, analyze, and collaborate on all your data. Easily create data visualizations with automated chart recommendations and optimize insights by collaborating with colleagues on analyses. Augmented analytics with embedded machine learning throughout the platform drives smarter and better insights. Always on and always working in the background, machine learning continuously learns from the data it takes in, becoming smarter and more accurate over time. Uncover deeper patterns and predict trends for impactful, unbiased recommendations.

On this team we develop, deploy, and support the Oracle Analytics platform, helping our customers succeed in their journey to drive business value. You will work with experts in their fields, explore the latest technologies, and be challenged while creating features that will be delivered to our customers. You will be asked to be creative, and hopefully have some fun along the way. Members of our team take on challenges across all aspects of the product. https://www.oracle.com/solutions/business-analytics

Career Level - IC4

Responsibilities

As a member of the development team, you will design, code, debug, and deliver innovative analytic features involving C++ development, with extensive exposure to highly scalable, distributed, multithreaded applications. You will work closely with your peer developers located across the world, including Mexico, India, and the USA. Key responsibilities include:

  • Design, develop, test, and deliver new features on a world-class analytics platform suitable for deployment to both the Oracle Cloud and on-premises environments
  • Lead the creation of formal design specifications and the coding of complex systems
  • Work closely with Product Management on product requirements and functionality
  • Build software applications following established coding standards
  • Communicate continually with the project teams, explaining progress on the development effort
  • Contribute to continuous improvement by suggesting improvements to the user interface or software architecture, or recommending new technologies
  • Ensure quality of work through development standards and QA procedures
  • Perform maintenance and enhancements on existing software

Key Qualifications:

  • BS/MS in Computer Science or a related major
  • Exceptional analytic and problem-solving skills
  • Extensive experience in using, building, and debugging multithreaded applications
  • Ability to design large, scalable systems for enterprise customers
  • Solid understanding of concurrency, multithreading, and memory management
  • Experience in C++ programming, including templates, the STL, and object-oriented patterns
  • Interest or experience in database kernel development
  • Understanding of SQL and relational data processing concepts such as joins and indexing strategies
  • Experience with Java, Python, or other scripting languages
  • Experience in distributed and scalable server-side software development
  • Knowledge of developing, implementing, and optimizing software algorithms
  • Solid knowledge of data structures and operating systems
  • Basic understanding of Agile/Scrum development methodologies
  • Hands-on experience using source control tools such as Git
  • Strong written and verbal English communication skills
  • Self-motivated and passionate about developing high-quality software
  • Strong team player

Other Qualifications:

  • Knowledge of Business Intelligence or Analytics
  • Familiarity with SQL query optimization and execution
  • Experience with big data technologies (such as Hadoop and Spark)
  • Interest or experience in OLAP, data warehousing, or multidimensional databases
  • Familiarity with cloud services such as OCI, AWS, or Azure
  • Knowledge of Terraform/Python

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Manager, Business Analyst – C1

Employment Type: Permanent
Location: Chennai

Responsible Functions

  • Gen AI: Expertise in leveraging advanced AI technologies to analyze business processes, identify automation and optimization opportunities, and drive data-driven decision-making. Ability to collaborate with stakeholders to translate business needs into AI solutions, ensuring seamless integration and maximizing operational efficiency, productivity, and innovation.
  • Product Vision & Strategy: Perform market analysis to understand the market landscape, including competitor analysis, trends, and customer needs, to help define and communicate a product vision and strategy aligned with company objectives.
  • Stakeholder Engagement: Interact with diverse stakeholders to conduct JAD sessions, and use a variety of techniques to elicit, elaborate, analyze, and validate client requirements. Interact with the business team to conduct product demonstrations and to evaluate, prioritize, and build new features and functions.
  • Requirements Management: Analyze and develop the Business Requirements Document (BRD) and Functional Specification Document (FSD) for client/business reference. Translate business requirements into user stories, prioritize the backlog, and conduct Scrum ceremonies for development consumption.
  • Functional Solution Development: Responsible for end-to-end functional solutioning. Analyze the business problem and validate the key business requirements to create a complete picture of the workflows and technical requirements fulfilled by existing and proposed software. Identify, define, and evaluate potential product solutions, including off-the-shelf and open-source components and system architecture, to ensure they meet business requirements.
  • Communication & Collaboration: Be a strong interface between business and internal stakeholders. Collaborate with the development team (including architecture, coding, and testing teams) to produce and maintain additional product and project deliverables such as technical designs, testing and program specifications, additional test scenarios, and project plans. Proactively manage expectations regarding roadblocks in the critical path to help ensure successful delivery of the solution.
  • Business Value: Comprehend the business value of the solution being developed, and assess its fit within the overall architecture, its risks, and its technical feasibility. Drive business metrics that help optimize the business, and dive deep into data for insights as required.
  • Team Management: Manage a small team of Business Analysts, define clear goals, and be accountable for the functional solutions delivered by the team. Participate in recruitment and in building a strong BA team.
  • RFP Support: Participate in Request for Information/Proposal handling and support with responses and solutions to questions or information requested.
  • Client/Business Training: Work with technical writers to create training material and conduct product/platform training sessions with diverse stakeholders.

Essential Functions

  • Multi-disciplinary technologist who enjoys designing, executing, and selling healthcare solutions, and being on the front line of client communications and selling strategies
  • Deep understanding of the US healthcare value chain and key impact drivers (payer and/or provider)
  • Knowledgeable about how data management and data science are used to solve organizational problems in the healthcare context
  • Hands-on experience in two or more areas of the data and analytics technical domains: enterprise cloud data warehousing, integration, preparation, and visualization, along with artificial intelligence, machine learning, data science, data modeling, data management, and data governance
  • Strong problem-solving and analytical skills: the ability to break down a vague business problem into structured data analysis approaches, and the ability to work with incomplete information and take judgment-driven decisions based on experience
  • Experience ramping up analytics programs with new clients, including integrating with the work of other teams to ensure the analytics approach is aligned with operations, as well as engaging in consultative selling

Primary Internal Interactions

  • Review with the Product Manager and AVP for improvements in the product development lifecycle
  • Assessment meetings with VPs and above for additional product development features
  • Manage a small team of business analysts to lead the requirements effort for product development

Primary External Interactions

  • Communicate with onshore stakeholders and executive team members
  • Help the Product Management Group set the product roadmap and help identify future sellable product features
  • Interact with clients to better understand expectations and streamline solutions; if required, act as a bridge between the client and the technology teams

Technical Skills

Required Skills

  • SME in US healthcare with deep knowledge of the claims and payments lifecycle, with at least 8 years of experience working with various US healthcare payer clients

Skills - Must Have

  • Excellent understanding of the software development lifecycle and methodologies such as Agile Scrum and Waterfall
  • Strong experience in requirements elicitation techniques, functional documentation, stakeholder management, business solution validation, and user walkthroughs
  • Strong documentation skills for creating BRDs, FSDs, process flows, and user stories
  • Strong presentation skills
  • Good knowledge of SQL
  • Knowledge of tools such as Azure DevOps, Jira, Visio, and Draw.io
  • Experience in AI or Gen AI projects

Skills - Nice To Have

  • Development experience of 2 or more years
  • Experience with big data tools, including but not limited to Python, Spark + Python, Hive, HBase, Sqoop, CouchDB, MongoDB, MS SQL, Cassandra, and Kafka
  • Knowledge of data analysis tools (online analytical processing (OLAP), ETL frameworks)
  • Knowledge of enterprise modeling tools and data integration platforms (Erwin, Embarcadero, Informatica, Talend, SSIS, DataStage, Pentaho)
  • Knowledge of enterprise business intelligence platforms (Tableau, Power BI, Business Objects, MicroStrategy, Cognos)
  • Knowledge of enterprise data warehousing platforms (Oracle, Microsoft, DB2, Snowflake, AWS, Azure, Google Cloud Platform)

Process Specific Skills

  • Delivery Domain: Software Development (SDLC & Agile certifications)
  • Business Domain: US Healthcare & Payer Analytics, Payment Integrity, Fraud, Waste & Abuse, Claims Management

Soft Skills

  • Understanding of the healthcare business vertical and its business terms
  • Good analytical skills
  • Strong communication skills, oral and written
  • Ability to work with various stakeholders across different geographical locations
  • Able to function as an individual contributor as well, if required
  • Strong aptitude to learn and implement healthcare solutions
  • Good leadership skills

Working Hours

General shift: 12 PM to 9 PM; will be required to extend as per project release needs.

Education Requirements

Master's or bachelor's degree from top-tier colleges with good grades, preferably in a relevant field such as Mathematics, Statistics, or Computer Science, or equivalent experience.

Posted 3 weeks ago

Apply

Exploring OLAP Jobs in India

With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.

Average Salary Range

The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.

Career Path

Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.

Related Skills

In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
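
To make these expectations concrete, here is a minimal extract-transform-load (ETL) sketch in Python using only the standard library's sqlite3 module. The source records, field names, and the sales table are all invented for illustration; real pipelines would typically run on the orchestration and processing tools named in the listings above, such as Airflow, Spark, or Snowflake.

```python
# Minimal ETL sketch (illustrative only): the raw records, field names,
# and the "sales" table are assumptions, not taken from any listing above.
import sqlite3

# Extract: pretend these rows came from an API response or a CSV export.
raw_orders = [
    {"order_id": 1, "amount": "1,250.00", "region": " North "},
    {"order_id": 2, "amount": "980.50", "region": "south"},
]

# Transform: normalize numeric strings and clean inconsistent labels.
clean = [
    (r["order_id"],
     float(r["amount"].replace(",", "")),
     r["region"].strip().title())
    for r in raw_orders
]

# Load: write the cleaned rows into a warehouse-style table and verify.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (order_id INTEGER, amount REAL, region TEXT)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
print(con.execute("SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())
# [('North', 1250.0), ('South', 980.5)]
```

The same extract-transform-load shape scales up: the extract step becomes an API or file reader, the transform step becomes Spark or SQL jobs, and the load step targets a warehouse such as Snowflake or Redshift.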

Interview Questions

  • What is OLAP and how does it differ from OLTP? (basic)
  • Explain the difference between a star schema and a snowflake schema. (medium; a minimal star schema appears in the SQL sketch after this list)
  • How do you optimize OLAP queries for performance? (advanced)
  • What is the role of aggregation functions in OLAP databases? (basic)
  • Can you explain the concept of drill-down in OLAP? (medium)
  • How do you handle slowly changing dimensions in OLAP databases? (advanced)
  • What are the advantages of using a multidimensional database over a relational database for OLAP purposes? (medium)
  • Describe your experience with OLAP tools such as Microsoft Analysis Services or Oracle OLAP. (basic)
  • How do you ensure data consistency in an OLAP environment? (medium)
  • What are some common challenges faced when working with OLAP databases? (advanced)
  • Explain the concept of data cubes in OLAP. (basic)
  • How do you approach designing a data warehouse for OLAP purposes? (medium)
  • Can you discuss the importance of indexing in OLAP databases? (advanced)
  • How do you handle missing or incomplete data in OLAP analysis? (medium)
  • What are the key components of an OLAP system architecture? (basic)
  • How do you troubleshoot performance issues in OLAP queries? (advanced)
  • Have you worked with real-time OLAP systems? If so, can you explain the challenges involved? (medium)
  • What are the limitations of OLAP compared to other data analysis techniques? (advanced)
  • How do you ensure data security in an OLAP environment? (medium)
  • Have you implemented any data mining algorithms in OLAP systems? If so, can you provide an example? (advanced)
  • How do you approach designing dimensions and measures in an OLAP cube? (medium)
  • What are some best practices for OLAP database design? (advanced)
  • How do you handle concurrent user access in an OLAP environment? (medium)
  • Can you explain the concept of data slicing and dicing in OLAP analysis? (basic)
  • What are your thoughts on the future of OLAP technologies in the era of big data and AI? (advanced)
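
Several of the questions above (star schemas, drill-down, slicing and dicing) can be grounded with one small example. The sketch below uses Python's built-in sqlite3 module; the fact and dimension tables and all values are invented for illustration, and dedicated OLAP engines such as Microsoft Analysis Services or Oracle OLAP express the same ideas through cubes rather than hand-written SQL.

```python
# A star schema in miniature: one fact table joined to one dimension table,
# then a drill-down (year -> year+month) and a slice (one region).
# Schema and data are assumptions made up for this illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date(date_key),
                         region TEXT, amount REAL);
INSERT INTO dim_date VALUES (1, 2023, 1), (2, 2023, 2), (3, 2024, 1);
INSERT INTO fact_sales VALUES (1, 'North', 100), (2, 'North', 150),
                              (2, 'South', 80),  (3, 'South', 120);
""")

# Summary level: total sales per year.
print(con.execute("""
    SELECT d.year, SUM(f.amount) FROM fact_sales f
    JOIN dim_date d USING (date_key) GROUP BY d.year""").fetchall())

# Drill-down: the same measure at a finer grain (year, month).
print(con.execute("""
    SELECT d.year, d.month, SUM(f.amount) FROM fact_sales f
    JOIN dim_date d USING (date_key) GROUP BY d.year, d.month""").fetchall())

# Slice: fix one member of the region dimension, then re-aggregate.
print(con.execute("""
    SELECT d.year, SUM(f.amount) FROM fact_sales f
    JOIN dim_date d USING (date_key)
    WHERE f.region = 'South' GROUP BY d.year""").fetchall())
```

Drilling down keeps the measure (SUM(amount)) fixed while adding a dimension attribute to the GROUP BY; slicing fixes one dimension member with a WHERE filter. A snowflake schema would further normalize dim_date into separate year and month tables, trading the star schema's simpler joins for reduced redundancy.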

Closing Remark

As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
