Home
Jobs

458 OLAP Jobs - Page 15

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Senior Business Analyst / Lead Business Analyst – B1/B2

Employment Type: Permanent
Location: Chennai

Responsible Functions

- Product Vision & Strategy: Help with inputs on product features through market analysis to understand the market landscape, including competitor solutions, trends and customer needs.
- Stakeholder Engagement: Interact with diversified stakeholders to conduct JAD sessions and use a variety of techniques to elicit, document, analyze and validate client requirements. Interface with the Business team to conduct product demonstrations and to evaluate, prioritize and build new features and functions.
- Requirements Management: Analyze and develop the business requirement document (BRD) for client/business reference. Translate business requirements into user stories to create and prioritize the backlog, sprints, DoD and releases using Jira for development consumption. Perform requirements reviews with external and internal stakeholders and resolve issues while suggesting corrective actions.
- Functional Solution Development: Responsible for the end-to-end functional solution. Analyze the business problem and validate the key business requirements to create a complete picture of workflows and technical requirements fulfilled by existing and proposed software. Identify, define and evaluate potential product solutions, including off-the-shelf and open-source components, and system architecture to ensure they meet business requirements.
- Communication & Collaboration: Act as a liaison between business users and technical solutions/support groups to ensure proper communication between diversified teams. Collaborate with the development team (including architecture, coding and testing teams) to produce and maintain additional product and project deliverables: technical design, testing and program specifications, additional test scenarios and the project plan. Proactively manage expectations regarding roadblocks in the critical path to help ensure successful delivery of the solution.
- Business Value: Comprehend the fundamental solution being developed/deployed – its business value, a blueprint of how it fits with the overall architecture, its risks, and more. Drive business metrics that will help optimize the business, and deep dive into data for insights as required.
- Team Mentoring: Train and mentor juniors in the team on a need basis.

Essential Functions

- Technologist who enjoys executing and selling healthcare solutions and being on the front line of client communications.
- Good understanding of the US Healthcare value chain and key impact drivers (Payer and/or Provider).
- Knowledgeable and cognizant of how data management and data science are used to solve organizational problems in the healthcare context.
- Hands-on experience in at least one of the data and analytics technical domains: enterprise cloud data warehousing, integration, preparation, and visualization, along with artificial intelligence, machine learning, data science, data modeling, data management, and data governance.
- Strong problem-solving and analytical skills: the ability to break down a vague business problem into structured data analysis approaches, to work with incomplete information, and to take judgment-driven decisions based on experience.

Primary Internal Interactions

- Reviews with the Product Manager and AVP for improvements in the product development lifecycle.
- Assessment meetings with VP and above for additional product development features.
- Train and mentor juniors in the team on a need basis.

Primary External Interactions

- Communicate with onshore stakeholders and Executive Team members.
- Help the Product Management Group set the product roadmap and help identify future sellable product features.
- Client interactions to better understand expectations and streamline solutions. If required, act as a bridge between the client and the technology teams.

Skills

Required
- Good knowledge of US Healthcare, with at least 3 years of experience working with various US Healthcare Payer clients.

Must Have
- Good understanding of the Software Development Life Cycle and methodologies like Agile Scrum, Waterfall, etc.
- Strong experience in requirements elicitation techniques, functional documentation, stakeholder management, business solution validation and user walkthroughs.
- Strong documentation skills to create BRDs, FSDs, process flows and user stories.
- Strong presentation skills.
- Basic knowledge of SQL.
- Knowledge of tools like Jira, Visio, Draw.io, etc.

Nice To Have
- Development experience of 1 or 2 years.
- Experience with big data tools, including but not limited to Python, Spark + Python, Hive, HBase, Sqoop, CouchDB, MongoDB, MS SQL, Cassandra, Kafka.
- Knowledge of data analysis tools (online analytical processing (OLAP), ETL frameworks).
- Knowledge of enterprise modeling tools and data integration platforms (Erwin, Embarcadero, Informatica, Talend, SSIS, DataStage, Pentaho).
- Knowledge of enterprise business intelligence platforms (Tableau, Power BI, Business Objects, MicroStrategy, Cognos).
- Knowledge of enterprise data warehousing platforms (Oracle, Microsoft, DB2, Snowflake, AWS, Azure, Google Cloud Platform).

Process Specific Skills
- Delivery Domain: Software Development – SDLC and Agile certifications.
- Business Domain: US Healthcare Insurance and Payer Analytics; Fraud, Waste & Abuse; Payer Management; Code Classification Management.

Soft Skills
- Understanding of the healthcare business vertical and its business terms.
- Good analytical skills.
- Strong communication skills, oral and written.
- Ability to work with various stakeholders across different geographic locations.
- Able to function as an individual contributor if required.
- Strong aptitude to learn and implement healthcare solutions.
- Ability to work independently.

Working Hours
- General shift: 12 PM to 9 PM; extension will be required as per project release needs.

Education Requirements
- Master's or Bachelor's degree from top-tier colleges with good grades, preferably in a relevant field such as Mathematics, Statistics or Computer Science, or equivalent experience.

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Madhya Pradesh, India

Remote

Source: LinkedIn

As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations. Since 2011, our mission hasn’t changed — we’re here to stop breaches, and we’ve redefined modern security with the world’s most advanced AI-native platform. We work on large-scale distributed systems, processing almost 3 trillion events per day. We have 3.44 PB of RAM deployed across our fleet of C* servers - and this traffic is growing daily. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward. We’re also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We’re always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.

About The Role

The charter of the Data + ML Platform team is to harness all the data that is ingested and cataloged within the Data LakeHouse for exploration, insights, model development, ML engineering and insights activation. This team is situated within the larger Data Platform group, which serves as one of the core pillars of our company. We process data at a truly immense scale. Our processing is composed of various facets, including threat events collected via telemetry data, associated metadata, IT asset information, contextual information about threat exposure based on additional processing, and more. These facets comprise the overall data platform, which is currently over 200 PB and maintained in a hyperscale Data Lakehouse built and owned by the Data Platform team. The ingestion mechanisms include both batch and near-real-time streams that form the core Threat Analytics Platform used for insights, threat hunting, incident investigations and more.

As an engineer on this team, you will play an integral role as we build out our ML Experimentation Platform from the ground up. You will collaborate closely with Data Platform software engineers, data scientists and threat analysts to design, implement, and maintain scalable ML pipelines used for data preparation, cataloging, feature engineering, model training, and model serving that influence critical business decisions. You’ll be a key contributor in a production-focused culture that bridges the gap between model development and operational success. Future plans include generative AI investments for use cases such as modeling attack paths for IT assets.

What You’ll Do

- Help design, build, and facilitate adoption of a modern Data+ML platform
- Modularize complex ML code into standardized and repeatable components
- Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
- Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
- Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
- Review code changes from data scientists and champion software development best practices
- Leverage cloud services like Kubernetes, blob storage, and queues in our cloud-first environment

What You’ll Need

- B.S. in Computer Science, Data Science, Statistics, Applied Mathematics, or a related field and 7+ years of related experience; or M.S. with 5+ years of experience; or Ph.D. with 6+ years of experience.
- 3+ years of experience developing and deploying machine learning solutions to production.
- Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised/unsupervised approaches: how, why, and when labeled data is created and used.
- 3+ years of experience with ML platform tools like Jupyter Notebooks, NVIDIA Workbench, MLflow, Ray, Vertex AI, etc.
- Experience building data platform products or features with (one of) Apache Spark, Flink or comparable tools in GCP. Experience with Iceberg is highly desirable.
- Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
- Production experience with infrastructure-as-code tools such as Terraform, FluxCD
- Expert-level experience with Python; Java/Scala exposure is recommended.
- Ability to write Python interfaces that provide standardized and simplified access to internal CrowdStrike tools for data scientists
- Expert-level experience with CI/CD frameworks such as GitHub Actions
- Expert-level experience with containerization frameworks
- Strong analytical and problem-solving skills, capable of working in a dynamic environment
- Exceptional interpersonal and communication skills; able to work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes

Experience With The Following Is Desirable

- Go
- Iceberg
- Pinot or other time-series/OLAP-style databases
- Jenkins
- Parquet
- Protocol Buffers/gRPC

VJ1

Benefits Of Working At CrowdStrike

- Remote-friendly and flexible work culture
- Market leader in compensation and equity awards
- Comprehensive physical and mental wellness programs
- Competitive vacation and holidays for recharge
- Paid parental and adoption leaves
- Professional development opportunities for all employees regardless of level or role
- Employee Resource Groups, geographic neighbourhood groups and volunteer opportunities to build connections
- Vibrant office culture with world-class amenities
- Great Place to Work Certified™ across the globe

CrowdStrike is proud to be an equal opportunity employer. We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed. We support veterans and individuals with disabilities through our affirmative action program.

CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment. The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law. We base all employment decisions, including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations and social/recreational programs, on valid job requirements.

If you need assistance accessing or reviewing the information on this website, or need help submitting an application for employment or requesting an accommodation, please contact us at recruiting@crowdstrike.com for further assistance.
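For readers unfamiliar with the workflow orchestration this posting mentions, here is a minimal sketch of what an ML pipeline defined in Apache Airflow 2.x (one of the orchestrators named above) can look like. The task names, schedule, and feature path are hypothetical illustrations, not CrowdStrike's actual pipeline.

```python
# A minimal orchestrated ML pipeline sketch using Airflow's TaskFlow API.
# Task names and the feature path are hypothetical, for illustration only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def ml_experiment_pipeline():
    @task
    def prepare_features() -> str:
        # Pull raw telemetry from the lakehouse and write a feature table.
        return "s3://example-bucket/features/daily"  # hypothetical path

    @task
    def train_model(feature_path: str) -> str:
        # Fit a model on the prepared features; return a model artifact URI.
        return f"{feature_path}/model.pkl"

    @task
    def evaluate(model_uri: str) -> None:
        # Gate deployment on evaluation metrics before serving.
        print(f"evaluating {model_uri}")

    # Chaining the tasks defines the DAG's dependency graph.
    evaluate(train_model(prepare_features()))


ml_experiment_pipeline()
```

The appeal of this pattern for a self-service platform is that each task is a plain Python function, so repeatable components can be packaged and reused across many experiment pipelines.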

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Job Description

You are a strategic thinker passionate about driving solutions in data visualization. You have found the right team.

As a Data Visualization Associate within our Databricks team, you will be responsible for designing, developing, and optimizing data models to support data integration, transformation, and analytics. We value your expertise in handling data from various sources and your commitment to ensuring scalable, efficient, and high-quality data solutions.

Job Responsibilities

- Design and implement data models (conceptual, logical, and physical) to support business requirements; hands-on experience with the Erwin tool is an added advantage.
- Work with structured and unstructured data from multiple sources and integrate it into Databricks.
- Develop ETL/ELT pipelines to extract, transform, and load data efficiently.
- Optimize data storage, processing, and performance in Databricks.
- Collaborate with data engineers, analysts, and business stakeholders to understand data needs.
- Ensure data governance, quality, and compliance with industry standards.
- Create and maintain documentation for data models, pipelines, and architectures.
- Troubleshoot and optimize queries and workflows for performance improvement.
- Create/modify queries at the consumption level for end users.

Required Qualifications, Capabilities, And Skills

- 5+ years of experience in data modeling/data engineering.
- Strong expertise in Databricks, Delta Lake, Apache Spark, and advanced queries.
- Experience with SQL and Python for data manipulation.
- Knowledge of ETL/ELT processes and data pipeline development.
- Hands-on experience with data warehousing, relational databases, and NoSQL.
- Familiarity with data governance, security, and compliance best practices.
- Strong problem-solving skills and the ability to work in an agile environment.

Preferred Qualifications, Capabilities, And Skills

- Experience working with large-scale data systems and streaming data.
- Knowledge of business intelligence (BI) tools and reporting frameworks.
- Experience in the finance domain (P&A, Markets, etc.) is preferable.
- Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
- Experience with OLAP tools (TM1, Essbase, Atoti, etc.) is a plus.

About Us

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years, and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.

We recognize that our people are our strength, and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team

J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.
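As a rough illustration of the ETL/ELT pipeline work described in this posting, here is a minimal PySpark sketch that reads raw data, applies a transformation, and writes a Delta Lake table. The paths, table and column names are invented for the example, and the Delta format assumes a Databricks runtime or a Spark session configured with the open-source delta-spark package.

```python
# Minimal ETL sketch in PySpark with a Delta Lake sink (assumes a Databricks
# runtime or delta-spark configured session). Names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades_etl").getOrCreate()

# Extract: raw trades land as CSV in cloud storage.
raw = spark.read.option("header", True).csv("/mnt/raw/trades/")

# Transform: normalize types, derive a business date, drop duplicates.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("trade_date", F.to_date("trade_ts"))
       .dropDuplicates(["trade_id"])
)

# Load: write a partitioned Delta table for downstream analytics consumers.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("trade_date")
      .saveAsTable("analytics.trades_clean"))
```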

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Apply with an updated CV to hr@bitstringit.com

Main Tasks

- Maintain and develop data platforms based on Microsoft Fabric for business intelligence and Databricks for real-time data analytics.
- Design, implement and maintain standardized, production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud sources.
- Develop an enterprise-scale, cloud-based data lake for business intelligence solutions.
- Translate business and customer needs into data collection, preparation and processing requirements.
- Optimize the performance of algorithms developed by data scientists.
- General administration and monitoring of the data platforms.

Competencies

- Working with structured and unstructured data.
- Experienced in various database technologies (RDBMS, OLAP, time series, etc.).
- Solid programming skills (Python, SQL; Scala is a plus).
- Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark).
- Proficient in Power BI.
- Experienced working with APIs.
- Proficient in security best practices.
- Data-centric Azure know-how is a plus (Storage, Networking, Security, Billing).

Education / Experience / Language

- Bachelor's or Master's degree in business informatics, computer science, or equivalent.
- A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable.
- Extensive experience in handling large data sets.
- At least 5 years of experience working as a data engineer, preferably in an industrial company.
- Analytical problem-solving skills and the ability to assimilate complex information.
- Programming experience in modern data-oriented languages (SQL, Python).
- Experience with Apache Spark and DevOps.
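Since the posting pairs API experience with Databricks/Spark, a minimal sketch of pulling records from a REST API into a Spark DataFrame might look like the following; the endpoint, JSON envelope, and output path are made up for illustration.

```python
# Sketch: ingest a REST API payload into a Spark DataFrame for downstream
# processing. Endpoint, schema, and paths are hypothetical illustrations.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("api_ingest").getOrCreate()

# Fetch a page of records from an (invented) source-system API.
resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
records = resp.json()["items"]  # assumes a JSON envelope with an "items" list

# Let Spark infer a schema from the JSON records, then land them as parquet
# in the "bronze" (raw) zone of the data lake.
df = spark.createDataFrame(records)
df.write.mode("append").parquet("/lake/bronze/orders/")
```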

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Duration of Contract: 6 Months
Location: PAN India
Experience Required: 7–8 Years
Prerequisite Skills: SSAS, Advanced SQL

Job Summary

We are looking for a highly skilled SSAS Developer with solid experience in building OLAP and Tabular models using SQL Server Analysis Services (SSAS). The ideal candidate should also possess in-depth knowledge of ETL processes using tools such as SSIS, Informatica, or Azure Data Factory, and should be capable of developing scalable and efficient data integration solutions.

Key Responsibilities

- Design, develop, and maintain SSAS OLAP cubes and Tabular models.
- Collaborate with business analysts and data architects to gather and understand business requirements.
- Develop advanced DAX and MDX queries for analytics and reporting.
- Optimize SSAS models for performance, scalability, and efficiency.
- Create and manage ETL pipelines for data extraction, transformation, and loading.
- Integrate data from various relational and non-relational sources.
- Follow best practices in data modeling, version control, and deployment automation.
- Troubleshoot issues related to data performance, integrity, and availability.
- Work with BI tools like Power BI, Excel, or Tableau for dashboard creation and data visualization.

Required Skills

- Strong hands-on experience with SSAS Tabular and Multidimensional models.
- Advanced skills in DAX, MDX, and SQL.
- Proficiency in ETL tools like SSIS, Informatica, or Azure Data Factory.
- Solid understanding of dimensional data modeling and schema designs (star/snowflake).
- Experience with CI/CD pipelines and source control systems (e.g., Git).
- Familiarity with data warehousing concepts and data governance practices.
- Strong problem-solving abilities and attention to detail.

Preferred Qualifications

- Experience with cloud data platforms: Azure Synapse, Snowflake, or AWS Redshift.
- Knowledge of Power BI or other front-end BI tools.
- Familiarity with Agile/Scrum methodologies.
- Bachelor's degree in Computer Science, Information Systems, or a related field.

Mandatory Technical Skills

- T-SQL
- MS SQL Server
- Dimensional Data Modeling
- Azure SQL
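To make the dimensional-modeling requirement concrete, here is a small sketch that runs a star-schema aggregation (a fact table joined to its dimensions) against SQL Server from Python via pyodbc. The server, database, and table names are hypothetical; this query shape is what an SSAS cube or Tabular model typically sits on top of.

```python
# Sketch: a star-schema aggregation over SQL Server via pyodbc.
# Connection details and table/column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=SalesDW;Trusted_Connection=yes;"
)

# Classic star-schema query: fact table joined to date and product dimensions.
query = """
SELECT d.CalendarYear,
       p.Category,
       SUM(f.SalesAmount) AS TotalSales
FROM   FactSales  AS f
JOIN   DimDate    AS d ON f.DateKey    = d.DateKey
JOIN   DimProduct AS p ON f.ProductKey = p.ProductKey
GROUP BY d.CalendarYear, p.Category
ORDER BY d.CalendarYear, TotalSales DESC;
"""

for year, category, total in conn.cursor().execute(query):
    print(year, category, total)
```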

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Job Information

- Date Opened: 05/21/2025
- Job Type: Full time
- Industry: Technology
- Work Experience: 5+ years
- City: Kolkata
- State/Province: West Bengal
- Country: India
- Zip/Postal Code: 700026

Job Description

- Understand BI/reporting-related requirements.
- Design and develop reporting or dashboards.
- Work closely with DBAs to understand data flow and apply optimized data structures to BI/reporting for performance tuning.
- Develop and implement programs and scripting to support BI/reporting and help front-end developers.
- Suggest, share, and implement best practices among team members.
- Develop, build, and deploy BI solutions or reporting tools.
- Maintain, support, evaluate, and improve existing BI/reporting and analytics platforms.
- Create tools to store data (e.g., OLAP cubes).
- Conduct unit testing and troubleshooting.
- Develop and execute database queries and conduct analyses.
- Create visualizations and reports for requested projects.
- Map the various BI/reporting databases and documentation used in the organization.
- Support customers to resolve any issues.

Requirements

- Proven 6+ years of experience as a BI/report developer.
- Proven ability to take initiative and be innovative.
- A clear understanding of data integration tools, OLAP, ETL/ELT processes, warehouse architecture/design (e.g., dimensional modeling), and data mining.
- Microservices architecture is a must.
- Familiarity with cloud-based BI/reporting environments (e.g., Amazon Web Services, Microsoft Power BI, Snowflake, or similar services).
- Knowledge of Amazon products (like QuickSight), the Hadoop platform, and Apache technologies.
- Hands-on knowledge of C#, Scala, R, Python, DAX, or GraphQL is preferable.
- Familiarity with reporting technologies on different platforms (e.g., SQL queries, SSRS, SSIS, MongoDB, MySQL, PGDB).
- An analytical mind with a problem-solving aptitude.
- BSc/BA in Computer Science, Engineering, or a relevant field.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…

When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms, and the dreamers creating what’s next. From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Your New Role

The Business Intelligence senior analyst will support the ongoing design and development of dashboards, reports, and other analytics studies or needs. To be successful in the role you’ll need to be intellectually curious, detail-oriented, open to new ideas, and possess data skills and a strong aptitude for quantitative methods. The role requires strong SQL skills and wide experience using BI visualization tools like Tableau and Power BI.

Your Role Accountabilities

- With the support of other analysis and technical teams, collect and analyze stakeholders’ requirements.
- Develop interactive and user-friendly dashboards and reports, partnering with UI/UX designers.
- Be experienced in BI tools like Power BI, Tableau, Looker, MicroStrategy and Business Objects, and be capable of and eager to learn new tools.
- Be able to quickly shape data into reporting and analytics solutions.
- Work with the data and visualization platform team on reporting tool updates, understanding how new features can benefit our stakeholders in the future, and adapting existing dashboards and reports.
- Have knowledge of database fundamentals such as multidimensional database design, relational database design, and more.

Qualifications & Experiences

- 5+ years of experience working with BI tools or in any data-specific role, with sound knowledge of database management, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing (OLAP).
- Skills in BI tools and BI systems such as Power BI, SAP BO, Tableau, Looker, MicroStrategy, etc.: creating data-rich dashboards, implementing row-level security (RLS) in Power BI, writing DAX expressions, and developing custom BI products with scripting and programming languages such as R, Python, etc.
- In-depth understanding of and experience with BI stacks.
- The ability to drill down on data and visualize it in the best possible way through charts, reports, or dashboards.
- Self-motivated and eager to learn.
- Ability to communicate with business as well as technical teams.
- Strong client management skills.
- Ability to learn and quickly respond to a rapidly changing business environment.
- An analytical and problem-solving mindset and approach.

Not Required But Preferred Experience

- BA/BS or MA/MS in a design-related field, or equivalent experience (relevant degree subjects include computer science, digital design, graphic design, web design, web technology).
- Understanding of software development architecture and technical aspects.

How We Get Things Done…

This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day-to-day. We hope they resonate with you, and we look forward to discussing them during your interview.

Championing Inclusion at WBD

Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law.

If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
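The "quickly shape data into reporting solutions" accountability in this posting is the kind of task a short pandas sketch can illustrate: pivoting raw viewing events into a dashboard-ready matrix. The dataset and column names are invented for the example.

```python
# Sketch: shaping raw event data into a dashboard-ready table with pandas.
# The sample data and column names are invented for illustration.
import pandas as pd

events = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC", "AMER"],
    "title":   ["A", "B", "A", "A", "B"],
    "minutes": [120, 45, 200, 80, 60],
})

# Pivot to one row per title and one column per region: the shape most BI
# tools (Tableau, Power BI, Looker) expect for a simple matrix visual.
summary = events.pivot_table(
    index="title", columns="region", values="minutes",
    aggfunc="sum", fill_value=0,
)
print(summary)
```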

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Software Developer - Expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS)
Location: Pune

We are seeking a highly skilled and experienced Specialist Power BI Developer to join our team, specifically focused on our client inquiry management project. As a Specialist Power BI Developer, you will be responsible for formulating and delivering automated reports and dashboards using Power BI and other reporting tools, with a specific focus on inquiry reporting metrics such as MTTR, average aging, platform adoption, etc. You will work closely with business stakeholders to understand their requirements related to inquiry management and translate them into functional specifications for reporting applications. You will have expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS), and a strong understanding of database concepts and data modeling.

Responsibilities

- Formulate automated reports and dashboards using Power BI and other reporting tools, with a focus on inquiry reporting metrics.
- Understand the specific business requirements related to inquiry management and set functional specifications for reporting applications.
- Utilize expertise in SQL queries, Power BI, and SSIS to gather and analyze inquiry data for reporting purposes.
- Develop technical specifications from business needs and establish deadlines for work completion.
- Design data models that transform raw inquiry data into insightful knowledge, grounded in the business requirements behind the inquiry reporting metrics.
- Create dynamic and eye-catching dashboards and reports using Power BI, highlighting key metrics and trends related to inquiries.
- Implement row-level security on data and understand Power BI's application security layer models, ensuring data privacy and confidentiality related to inquiries.
- Collaborate with cross-functional teams to integrate, alter, and connect inquiry data sources for business intelligence purposes.
- Make necessary tactical and technological adjustments to enhance the current inquiry management reporting systems.
- Troubleshoot and resolve issues related to data quality and inquiry reporting.
- Communicate effectively with internal and client teams to explain requirements and deliver solutions related to inquiry reporting metrics.
- Stay up to date with industry trends and advancements in Power BI and business intelligence for effective inquiry reporting.

Requirements

- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Power BI Developer or in a similar role, with a specific focus on reporting related to inquiries or customer service metrics.
- Expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS).
- Excellent communication skills to effectively articulate requirements and collaborate with internal and client teams.
- Strong analytical thinking skills for converting inquiry data into illuminating reports and insights.
- Knowledge of data warehousing, data gateway, and data preparation projects.
- Familiarity with the Microsoft SQL Server BI stack, including SSIS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Detailed knowledge and understanding of database management systems, OLAP, and the ETL framework.
- Proficiency in Microsoft Excel and other data analysis tools.
- Ability to gather and analyze business requirements specific to inquiries and translate them into technical specifications.
- Strong attention to detail and ability to QA and validate data for accuracy.
- Ability to manage multiple projects and deadlines simultaneously.
- Knowledge of Agile development methodologies is a plus.
- Ability to learn and adapt to new technologies and tools quickly.

Thank You For Considering Employment With Fiserv. Please:

- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion

Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies

Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts

Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
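As a concrete (hypothetical) illustration of the inquiry metrics this posting names, MTTR and average aging reduce to simple timestamp arithmetic once the inquiry extract is in hand. A pandas sketch, with invented column names and sample data:

```python
# Sketch: computing MTTR and average aging over an inquiry extract with
# pandas. Column names and the sample data are hypothetical.
import pandas as pd

inquiries = pd.DataFrame({
    "opened_at":   pd.to_datetime(["2025-05-01", "2025-05-03", "2025-05-05"]),
    "resolved_at": pd.to_datetime(["2025-05-02", "2025-05-07", pd.NaT]),
})

# MTTR: mean time from open to resolution, over resolved inquiries only.
resolved = inquiries.dropna(subset=["resolved_at"])
mttr = (resolved["resolved_at"] - resolved["opened_at"]).mean()

# Average aging: how long still-open inquiries have been waiting, measured
# against a reporting date (fixed here so the example is reproducible).
open_items = inquiries[inquiries["resolved_at"].isna()]
avg_aging = (pd.Timestamp("2025-05-10") - open_items["opened_at"]).mean()

print(f"MTTR: {mttr}, average aging of open inquiries: {avg_aging}")
```

In a Power BI solution these same calculations would typically live in DAX measures; the pandas version just makes the arithmetic explicit.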

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Through our dedicated associates, Conduent delivers mission-critical services and solutions on behalf of Fortune 100 companies and over 500 governments, creating exceptional outcomes for our clients and the millions of people who count on them. You have an opportunity to personally thrive, make a difference and be part of a culture where individuality is noticed and valued every day.

Job Overview

We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and is expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development and maintenance of business intelligence and analytics solutions.

Responsibilities

- Develop reports, dashboards, and advanced visualizations.
- Work closely with product managers, business analysts, clients, etc. to understand needs and requirements and develop the visualizations needed.
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
- Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation.
- Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies.
- Collaborate in design reviews and code reviews to ensure standards are met; recommend new standards for visualizations.
- Build and reuse templates, components and web services across multiple dashboards.
- Support presentations to customers and partners.
- Advise on new technology trends and possible adoption to maintain competitive advantage.
- Mentor associates.

Experience Needed

- 8+ years of related experience is required.
- A Bachelor's or Master's degree in Computer Science or a related technical discipline is required.
- Highly skilled in data visualization tools like Power BI, Tableau, QlikView, etc.
- Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets.
- Strong SQL coding experience, with performance optimization experience for data queries.
- Understands different data models such as normalized, de-normalized, star, and snowflake models.
- Has worked in big data environments, cloud data stores, and with different RDBMS and OLAP solutions.
- Experience in the design, development, and deployment of BI systems; candidates with ETL experience preferred.
- Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery.
- Has a strong technical background and stays evergreen with technology and industry developments.

Additional Requirements

- Demonstrated ability to have successfully completed multiple, complex technical projects.
- Prior experience with application delivery using an onshore/offshore model.
- Experience with business processes across multiple master data domains in a services-based company.
- Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality.
- Demonstrates high standards of professional behavior in dealings with clients, colleagues and staff.
- Strong written communication skills; effective and persuasive in both written and oral communication.
- Experience with gathering end-user requirements and writing technical documentation.
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.
- May require occasional travel.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by submitting their request through this form that must be downloaded: click here to access or download the form. Complete the form and then email it as an attachment to FTADAAA@conduent.com. You may also click here to access Conduent's ADAAA Accommodation Policy.

At Conduent we value the health and safety of our associates, their families and our community. For US applicants, while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Hyderabad, Bengaluru

Work from Office

Source: Naukri

sRide Carpool is looking for an Azure SQL Developer DBA to join our dynamic team and embark on a rewarding career journey. The Azure SQL Developer DBA is responsible for designing, implementing, and managing databases in the Azure environment. This role involves optimizing database performance, ensuring data integrity, managing security, and collaborating with development teams to support application deployments and enhancements.

Key Responsibilities:

Database Design and Architecture:
- Design and implement Azure SQL databases based on business requirements.
- Optimize database schemas for performance, scalability, and maintainability.
- Define data retention and archiving strategies.

Database Deployment and Management:
- Deploy and manage Azure SQL databases using best practices.
- Configure and maintain database instances, including provisioning, scaling, and patching.
- Monitor database performance and troubleshoot issues.

Performance Optimization:
- Identify and resolve performance bottlenecks by tuning queries, indexes, and database configurations.
- Monitor and analyze query execution plans to optimize performance.
- Implement caching strategies to improve application performance.

Data Security and Compliance:
- Implement security measures to safeguard sensitive data.
- Configure and manage user roles, permissions, and access controls.
- Ensure compliance with data protection regulations (e.g., GDPR, HIPAA).

Backup and Recovery:
- Implement and maintain database backup and recovery processes.
- Develop disaster recovery plans and perform regular backups and testing.

High Availability and Scalability:
- Configure and manage high availability solutions such as failover groups, availability sets, or geo-replication.
- Monitor and scale databases based on usage patterns and growth.

Collaboration and Support:
- Work closely with development teams to understand application requirements and optimize database interactions.
- Provide technical guidance and support to developers for database-related tasks.
- Assist in troubleshooting production incidents related to databases.

Automation and Scripting:
- Develop scripts and automation tools for database provisioning, configuration, and maintenance.
- Implement Infrastructure as Code (IaC) principles for database deployment and management.

Documentation:
- Maintain documentation for database design, configuration, and processes.
- Create operational runbooks and guidelines for database-related tasks.
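The automation-and-scripting responsibility above often starts with small maintenance checks. Here is a hedged sketch of querying SQL Server's index-fragmentation DMV from Python; the connection string is a placeholder (Azure SQL would need an appropriate authentication method), and the thresholds are the commonly cited rules of thumb, not fixed requirements.

```python
# Sketch: an automated index-fragmentation check using pyodbc and the
# sys.dm_db_index_physical_stats DMV. Connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=appdb;"
    "UID=dba_user;PWD=***"  # placeholder credentials
)

query = """
SELECT i.name, s.avg_fragmentation_in_percent
FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN   sys.indexes AS i
       ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE  i.name IS NOT NULL
  AND  s.avg_fragmentation_in_percent > 30;
"""

# Indexes past ~30% fragmentation are common candidates for a REBUILD;
# 5-30% usually only warrants a REORGANIZE.
for name, frag in conn.cursor().execute(query):
    print(f"{name}: {frag:.1f}% fragmented - consider REBUILD")
```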

Posted 3 weeks ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Chennai

Work from Office

Source: Naukri

Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando's Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest-growing technology companies by Deloitte.

Role

As the Senior Lead for AI and Data Warehouse at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights.

Key Responsibilities

- Lead the development of scalable, high-performance data pipelines using PySpark or other big data ETL pipeline technologies.
- Drive data modeling efforts for analytics, dashboards, and knowledge graphs.
- Oversee the implementation of parquet-based data lakes.
- Work on OLAP databases, ensuring optimal data structures for reporting and querying.
- Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries.
- Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs.
- Mentor and lead a team of engineers, building out the data and AI services organization.

Requirements

- 8-10 years of experience in big data and AI technologies, with expertise in PySpark or similar big data ETL pipeline technologies.
- Strong proficiency in SQL and OLAP database technologies.
- Firsthand experience with data modeling for analytics, dashboards, and knowledge graphs.
- Proven experience with parquet-based data lake implementations.
- Expertise in building highly scalable, high-volume data pipelines.
- Experience with modular, reusable, low-code-based implementations.
- Involvement in large-scale enterprise big data implementations.
- Initiative-taker with strong motivation and the ability to lead a growing team.

Preferred

- Experience leading a team or building out a new department.
- Experience with cloud-based data platforms and AI services.
- Familiarity with supply chain technology or fulfillment platforms is a plus.

Join us at Pando and lead the transformation of our AI and data services, delivering innovative solutions for global enterprises!

Job Opening: Technical Lead - AI & Data Warehouse | Full time | Tamil Nadu, India, 600017 | Date Opened: 2025-05-08
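At the storage layer, the "parquet-based data lake" this posting emphasizes comes down to columnar files partitioned by query-friendly keys, so that OLAP-style queries prune to only the partitions they need. A minimal PySpark sketch, with invented paths and column names:

```python
# Sketch: writing and querying a partitioned, parquet-based data lake layout
# with PySpark. Paths and column names are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fulfillment_lake").getOrCreate()

shipments = spark.read.json("/landing/shipments/")

# Partition by business date so analytical queries can skip whole
# directories instead of scanning the full history.
(shipments
    .withColumn("ship_date", F.to_date("shipped_ts"))
    .write.mode("append")
    .partitionBy("ship_date")
    .parquet("/lake/silver/shipments/"))

# Reading back with a partition filter: Spark prunes to matching
# ship_date directories, which is what keeps queries cheap at scale.
recent = spark.read.parquet("/lake/silver/shipments/").where(
    F.col("ship_date") >= "2025-05-01"
)
print(recent.count())
```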

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description

- Total 6+ years of experience, including 3 years of experience with the Selenium automation tool and 2 years of software development in Java (J2EE).
- Experience with other automation tools (Mercury tools or a self-created test-harness tool) and white-box testing of Java APIs, e.g., JUnit.
- 4-year college degree in Computer Science or a related field, i.e., BE or MCA.
- Good understanding of XML, XSL/XSLT, RDBMS and the Linux platform.
- Experience in multidimensional (OLAP) technology, data warehousing and financial software would be desirable.
- Motivated individual interested in learning leading-edge technology and testing complex software.

Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
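The posting's core toolchain is Selenium with Java. For a flavor of what that automation looks like, here is a minimal equivalent sketch using Selenium's Python bindings; the URL and element locators are hypothetical.

```python
# Sketch: a minimal Selenium UI check, shown with Selenium's Python bindings
# (the role itself uses Java). URL and element locators are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()

    # Explicitly wait for the post-login page instead of sleeping a fixed
    # time: this is what keeps UI tests stable.
    banner = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, ".dashboard-title"))
    )
    assert "Dashboard" in banner.text
finally:
    driver.quit()
```

The Java API mirrors this almost one-for-one (WebDriver, By, WebDriverWait, ExpectedConditions), which is why the pattern transfers directly to the stack named in the posting.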

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Through our dedicated associates, Conduent delivers mission-critical services and solutions on behalf of Fortune 100 companies and over 500 governments, creating exceptional outcomes for our clients and the millions of people who count on them. You have an opportunity to personally thrive, make a difference and be part of a culture where individuality is noticed and valued every day.

Job Overview

We are looking for a Data Engineer who will be part of our Analytics Practice and is expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is the acquisition, transformation, loading and processing of data from a multitude of disparate data sources, including structured and unstructured data, for advanced analytics and machine learning in a big data environment.

Responsibilities

- Engineer a modern data pipeline to collect, organize, and process data from disparate sources.
- Perform data management tasks, such as conducting data profiling, assessing data quality, and writing SQL queries to extract and integrate data.
- Develop efficient data collection systems and sound strategies for getting quality data from different sources.
- Consume and analyze data from the data pool to support inference, prediction and recommendation of actionable insights to support business growth.
- Design and develop ETL processes using tools and scripting.
- Troubleshoot and debug ETL processes.
- Performance tuning and optimization of the ETL processes.
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
- Collaborate in design reviews and code reviews to ensure standards are met; recommend new standards for visualizations.
- Learn and develop new ETL techniques as required to keep up with contemporary technologies.
- Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies.
- Support presentations to customers and partners.
- Advise on new technology trends and possible adoption to maintain competitive advantage.

Experience Needed

- 8+ years of related experience is required.
- A BS or Master's degree in Computer Science or a related technical discipline is required.
- ETL experience with data integration to support data marts, extracts and reporting.
- Experience connecting to varied data sources.
- Excellent SQL coding experience with performance optimization for data queries.
- Understands different data models such as normalized, de-normalized, star, and snowflake models.
- Has worked with transactional, temporal, time series, and structured and unstructured data.
- Experience with Azure Data Factory and Azure Synapse Analytics.
- Has worked in big data environments, cloud data stores, and with different RDBMS and OLAP solutions.
- Experience in cloud-based ETL development processes.
- Experience in deployment and maintenance of ETL jobs.
- Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery.
- Has a strong technical background and stays evergreen with technology and industry developments.
- At least 3 years of demonstrated success in software engineering, release engineering, and/or configuration management.
- Highly skilled in scripting languages like PowerShell.
- Substantial experience in the implementation and execution of CI/CD processes.

Additional Requirements

- Demonstrated ability to have successfully completed multiple, complex technical projects.
- Prior experience with application delivery using an onshore/offshore model.
- Experience with business processes across multiple master data domains in a services-based company.
- Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality.
- Demonstrates high standards of professional behavior in dealings with clients, colleagues and staff.
- Is able to make sound and far-reaching decisions alone on major issues and to take full responsibility for them on a technical basis.
- Strong written communication skills; effective and persuasive in both written and oral communication.
- Experience with gathering end-user requirements and writing technical documentation.
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by submitting their request through this form that must be downloaded: click here to access or download the form. Complete the form and then email it as an attachment to FTADAAA@conduent.com. You may also click here to access Conduent's ADAAA Accommodation Policy.

At Conduent we value the health and safety of our associates, their families and our community. For US applicants, while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.
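The data-profiling task this posting lists is easy to make concrete: a quick first pass over an extract usually reports row counts, null rates, and distinct counts per column. A small pandas sketch, with a placeholder input path:

```python
# Sketch: quick data profiling of an extract with pandas: row count, null
# rates, and distinct counts per column. The input path is a placeholder.
import pandas as pd

df = pd.read_csv("extract.csv")  # placeholder extract

profile = pd.DataFrame({
    "dtype":    df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})

print(f"rows: {len(df)}")
# Columns with the highest null rates are the first data-quality suspects.
print(profile.sort_values("null_pct", ascending=False))
```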

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Description Design, develop, troubleshoot and debug software programs for databases, applications, tools, networks etc.As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with the developing, designing and debugging of software applications or operating systems.Work is non-routine and very complex, involving the application of advanced technical/business skills in area of specialization. Leading contributor individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to functional area. 7 years of software engineering or related experience.ResponsibilitiesOverview of Product – Oracle AnalyticsBe part of an energetic and challenging team building an enterprise Analytic platform that will allow users to quickly gain insights on their most valuable asset; data. Oracle Analytics is an industry-leading product that empowers entire organizations with a full range of business analytics tools, enterprise ready reporting and engaging, and easy-to-use self-service data visualizations. Our customers are business users that demand a software product that allows easy, fast navigation through the full spectrum of data scale from simple spreadsheets to analyzing enormous volumes of information in enterprise class data warehouses.Oracle Analytics is a comprehensive solution to meet the breadth of all analytics needs. Get the right data, to the right people, at the right time with analytics for everyone in your organization. With built-in security and governance, you can easily share insights and collaborate with your colleagues. By leveraging the cloud, you can scale up or down to suit your needs. The Oracle Analytics Cloud offering is a leading cloud service at Oracle built on Oracle Cloud Infrastructure. It runs with a Generation 2 offering and provides consistent high performance and unmatched governance and security controls.Self-service analytics drive business agility with faster time to insights. You no longer need help from IT to access, prepare, analyze, and collaborate on all your data. Easily create data visualizations with automated chart recommendations and optimize insights by collaborating with colleagues on analyses.Augmented analytics with embedded machine learning throughout the platform drive smarter and better insights. Always on—and always working in the background, machine learning is continuously learning from the data it takes in, making it smarter and more accurate as time goes by. Uncover deeper patterns and predict trends for impactful, unbiased recommendations.On the team we develop, deploy, and support the Oracle Analytics platform helping our customers succeed in their journey to drive business value. You will be working with experts in their field, exploring the latest technologies, you will be challenged while creating features that will be delivered to our customers, asked to be creative, and hopefully have some fun along the way. Members of our team are tasked to take on challenges along all aspect of our product.https://www.oracle.com/solutions/business-analytics Career Level - IC4 Responsibilities As a member of the development team, you will design, code, debug, and deliver innovative analytic features that involve in C++ development with extensive exposure on highly scalable, distributed, multithreaded applications. 
You will work closely with your peer developers located across the world, including Mexico, India, and the USA. Key responsibilities include:

  • Design, develop, test, and deliver new features on a world-class analytics platform suitable for deployment to both Oracle Cloud and on-premise environments
  • Lead the creation of formal design specifications and the coding of complex systems
  • Work closely with Product Management on product requirements and functionality
  • Build software applications following established coding standards
  • Communicate continually with the project teams, explaining progress on the development effort
  • Contribute to continuous improvement by suggesting improvements to the user interface or software architecture, or recommending new technologies
  • Ensure quality of work through development standards and QA procedures
  • Perform maintenance and enhancements on existing software

Key Qualifications:

  • BS/MS in Computer Science or a related major
  • Exceptional analytic and problem-solving skills
  • Extensive experience in using, building, and debugging multithreaded applications
  • Ability to design large, scalable systems for enterprise customers
  • Solid understanding of concurrency, multithreading, and memory management
  • Experienced in C++ programming, including templates, the STL, and object-oriented patterns
  • Interest or experience in database kernel development
  • Understanding of SQL and relational data processing concepts such as joins and indexing strategies
  • Experience with Java, Python, or other scripting languages
  • Experienced in distributed and scalable server-side software development
  • Knowledge of developing, implementing, and optimizing software algorithms
  • Solid knowledge of data structures and operating systems
  • Basic understanding of Agile/Scrum development methodologies
  • Hands-on experience using source control tools such as Git
  • Strong written and verbal English communication skills
  • Self-motivated and passionate about developing high-quality software
  • Strong team player

Other Qualifications:

  • Knowledge of Business Intelligence or Analytics
  • Familiarity with SQL query optimization and execution
  • Experience with Big Data technologies (such as Hadoop and Spark)
  • Interest or experience in OLAP, data warehousing, or multidimensional databases
  • Familiarity with cloud services such as OCI, AWS, or Azure
  • Knowledge of Terraform/Python

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Senior SQL Developer
Experience: 10-15 years
Location: Bangalore

  • Experience: Minimum of 10+ years in database development and management roles.
  • SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts.
  • AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters.
  • PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques.
  • Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments.
  • Cloud Proficiency: Strong experience with AWS services such as ECS, S3, KMS, Lambda, Glue, and IAM.
  • Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems.
  • Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation.

Preferred Qualifications

  • Leadership: Prior experience in leading database or data engineering teams.
  • Data Visualization: Familiarity with reporting and visualization tools such as Tableau, Power BI, or Looker.
  • DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git).
  • Certifications: Any relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) will be a plus.
  • Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows will be a significant advantage.

Soft Skills

  • Strong problem-solving and analytical capabilities.
  • Exceptional communication skills for collaboration with technical and non-technical stakeholders.
  • A results-driven mindset with the ability to work independently or lead within a team.

Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or equivalent; 10+ years of experience.
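The tuning skills this role asks for (query planning, indexing strategies) can be demonstrated concretely. Below is a minimal sketch using Python's standard-library sqlite3 rather than Redshift or PostgreSQL, so it runs anywhere; the table and index names are hypothetical. It shows how adding an index changes the plan the engine chooses for the same query.

```python
import sqlite3

# Hypothetical sales table, small enough to build in memory.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("south", 10.0), ("north", 25.0), ("south", 5.0)],
)

query = "SELECT region, SUM(amount) FROM sales WHERE region = ? GROUP BY region"

# Without an index, the engine scans the whole table.
print(cur.execute(f"EXPLAIN QUERY PLAN {query}", ("south",)).fetchall())

cur.execute("CREATE INDEX idx_sales_region ON sales(region)")

# With the index, the plan switches to an index search on region.
print(cur.execute(f"EXPLAIN QUERY PLAN {query}", ("south",)).fetchall())
conn.close()
```

On PostgreSQL the equivalent check would be EXPLAIN (or EXPLAIN ANALYZE); the principle of comparing plans before and after adding an index is the same.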

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Introduction

A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat.

Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities

  • Develop, test, and support future-ready data solutions for customers across industry verticals
  • Develop, test, and support end-to-end batch and near real-time data flows/pipelines
  • Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies
  • Communicate risks and ensure they are understood

Preferred Education: Master's Degree

Required Technical and Professional Expertise

  • Minimum of 5+ years of related experience
  • Experience in modeling and business system design
  • Good hands-on experience with DataStage and cloud-based ETL services
  • Strong expertise in writing T-SQL code
  • Well versed in data warehouse schemas and OLAP techniques

Preferred Technical and Professional Experience

  • Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate
  • Strong team player/leader
  • Ability to lead data transformation projects with multiple junior data engineers
  • Strong oral, written, and interpersonal skills for interacting with all levels of the organization
  • Ability to clearly communicate complex business problems and technical solutions

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)

Job Description

We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models that enhance data quality, performance, and scalability. You will collaborate with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure that the data models align with business requirements and drive efficient data management.

Key Responsibilities

  • Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
  • Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
  • Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
  • Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
  • Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows.
  • Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
  • Stay up to date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
  • Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
  • Drive the adoption of best practices and standards for data modeling within the organization.

Skills and Qualifications

  • Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
  • Expertise in Azure and Databricks for building data solutions.
  • Proficiency in ER/Studio, Hackolade, and other data modeling tools.
  • Strong understanding of data modeling principles and techniques (e.g., ERD, UML).
  • Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Solid understanding of data warehousing, ETL processes, and data integration.
  • Familiarity with big data technologies such as Hadoop and Spark is an advantage.
  • Industry knowledge: a background in supply chain is preferred but not mandatory.
  • Excellent analytical and problem-solving skills.
  • Strong communication skills, with the ability to interact with both technical and non-technical stakeholders.
  • Ability to work well in a collaborative, fast-paced environment.

Education: B.Tech in any branch or specialization
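One recurring design topic for data modelers (and a common interview question) is handling slowly changing dimensions. Below is a minimal Type 2 SCD sketch in Python using the standard-library sqlite3; all table and column names are hypothetical, and a production warehouse would typically implement this in ETL tooling rather than row-at-a-time code.

```python
import sqlite3

# Hypothetical customer dimension with validity dates for versioning.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        sk INTEGER PRIMARY KEY,   -- surrogate key
        customer_id TEXT,         -- natural/business key
        city TEXT,
        valid_from TEXT,
        valid_to TEXT,            -- NULL means the row is current
        is_current INTEGER
    )
""")

def apply_scd2(customer_id: str, city: str, as_of: str) -> None:
    """Close the current row if the tracked attribute changed, then insert the new version."""
    row = cur.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row and row[1] == city:
        return  # no change, keep the current version
    if row:
        cur.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 WHERE sk = ?",
            (as_of, row[0]),
        )
    cur.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)",
        (customer_id, city, as_of),
    )

apply_scd2("C-1", "Pune", "2024-01-01")
apply_scd2("C-1", "Hyderabad", "2024-06-01")  # attribute change creates a new version
print(cur.execute("SELECT * FROM dim_customer").fetchall())
conn.close()
```

The point of Type 2 handling is that history is preserved: facts recorded before the change keep joining to the old dimension row via its surrogate key.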

Posted 3 weeks ago

Apply

0 years

0 Lacs

Delhi, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities

  • Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
  • Build and optimize MDX or DAX queries for advanced reporting needs.
  • Create and manage data models (star/snowflake schemas) supporting business KPIs.
  • Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
  • Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
  • Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
  • Maintain data quality and consistency across data sources and reporting layers.
  • Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Skills

  • SSAS – Tabular & Multidimensional
  • SQL Server (advanced SQL, views, joins, indexes)
  • DAX & MDX
  • Data modeling & OLAP concepts

Secondary Skills

  • ETL tools (SSIS or equivalent)
  • Power BI or similar BI/reporting tools
  • Performance tuning & troubleshooting in SSAS and SQL
  • Version control (TFS/Git) and deployment best practices
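SSAS models are queried with DAX or MDX, which cannot run outside the SSAS engine; purely as a hypothetical illustration of the roll-up a cube performs, here is a Python/pandas sketch (assuming pandas is installed) of a measure aggregated over two dimensions with an "All" total, similar to an OLAP grand-total member.

```python
import pandas as pd

# Hypothetical fact data; in SSAS this would live in a cube's fact table.
fact_sales = pd.DataFrame({
    "year":    [2023, 2023, 2024, 2024],
    "region":  ["North", "South", "North", "South"],
    "product": ["A", "B", "A", "B"],
    "amount":  [100.0, 80.0, 120.0, 95.0],
})

# A cube-style aggregation: the amount measure rolled up over the
# region and year dimensions, with margins acting like an "All" member.
cube = fact_sales.pivot_table(
    values="amount",
    index="region",
    columns="year",
    aggfunc="sum",
    margins=True,
    margins_name="All",
)
print(cube)
```

In a real SSAS Tabular model the equivalent measure would be a DAX expression over the fact table, with the engine handling the roll-up across hierarchy levels.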

Posted 3 weeks ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities

  • Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
  • Build and optimize MDX or DAX queries for advanced reporting needs.
  • Create and manage data models (star/snowflake schemas) supporting business KPIs.
  • Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
  • Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
  • Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
  • Maintain data quality and consistency across data sources and reporting layers.
  • Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Skills

  • SSAS – Tabular & Multidimensional
  • SQL Server (advanced SQL, views, joins, indexes)
  • DAX & MDX
  • Data modeling & OLAP concepts

Secondary Skills

  • ETL tools (SSIS or equivalent)
  • Power BI or similar BI/reporting tools
  • Performance tuning & troubleshooting in SSAS and SQL
  • Version control (TFS/Git) and deployment best practices

Posted 3 weeks ago

Apply

0 years

0 Lacs

Faridabad, Haryana, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities

  • Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
  • Build and optimize MDX or DAX queries for advanced reporting needs.
  • Create and manage data models (star/snowflake schemas) supporting business KPIs.
  • Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
  • Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
  • Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
  • Maintain data quality and consistency across data sources and reporting layers.
  • Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Skills

  • SSAS – Tabular & Multidimensional
  • SQL Server (advanced SQL, views, joins, indexes)
  • DAX & MDX
  • Data modeling & OLAP concepts

Secondary Skills

  • ETL tools (SSIS or equivalent)
  • Power BI or similar BI/reporting tools
  • Performance tuning & troubleshooting in SSAS and SQL
  • Version control (TFS/Git) and deployment best practices

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)

Job Description

We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models that enhance data quality, performance, and scalability. You will collaborate with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure that the data models align with business requirements and drive efficient data management.

Key Responsibilities

  • Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
  • Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
  • Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
  • Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
  • Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows.
  • Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
  • Stay up to date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
  • Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
  • Drive the adoption of best practices and standards for data modeling within the organization.

Skills and Qualifications

  • Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
  • Expertise in Azure and Databricks for building data solutions.
  • Proficiency in ER/Studio, Hackolade, and other data modeling tools.
  • Strong understanding of data modeling principles and techniques (e.g., ERD, UML).
  • Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Solid understanding of data warehousing, ETL processes, and data integration.
  • Familiarity with big data technologies such as Hadoop and Spark is an advantage.
  • Industry knowledge: a background in supply chain is preferred but not mandatory.
  • Excellent analytical and problem-solving skills.
  • Strong communication skills, with the ability to interact with both technical and non-technical stakeholders.
  • Ability to work well in a collaborative, fast-paced environment.

Education: B.Tech in any branch or specialization

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities

  • Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
  • Build and optimize MDX or DAX queries for advanced reporting needs.
  • Create and manage data models (star/snowflake schemas) supporting business KPIs.
  • Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
  • Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
  • Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
  • Maintain data quality and consistency across data sources and reporting layers.
  • Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary Skills

  • SSAS – Tabular & Multidimensional
  • SQL Server (advanced SQL, views, joins, indexes)
  • DAX & MDX
  • Data modeling & OLAP concepts

Secondary Skills

  • ETL tools (SSIS or equivalent)
  • Power BI or similar BI/reporting tools
  • Performance tuning & troubleshooting in SSAS and SQL
  • Version control (TFS/Git) and deployment best practices

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)

Job Description

We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models that enhance data quality, performance, and scalability. You will collaborate with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure that the data models align with business requirements and drive efficient data management.

Key Responsibilities

  • Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
  • Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
  • Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
  • Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
  • Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows.
  • Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
  • Stay up to date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
  • Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
  • Drive the adoption of best practices and standards for data modeling within the organization.

Skills and Qualifications

  • Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
  • Expertise in Azure and Databricks for building data solutions.
  • Proficiency in ER/Studio, Hackolade, and other data modeling tools.
  • Strong understanding of data modeling principles and techniques (e.g., ERD, UML).
  • Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Solid understanding of data warehousing, ETL processes, and data integration.
  • Familiarity with big data technologies such as Hadoop and Spark is an advantage.
  • Industry knowledge: a background in supply chain is preferred but not mandatory.
  • Excellent analytical and problem-solving skills.
  • Strong communication skills, with the ability to interact with both technical and non-technical stakeholders.
  • Ability to work well in a collaborative, fast-paced environment.

Education: B.Tech in any branch or specialization

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Company Description

Intellekt AI collaborates with companies to gain a deep understanding of their specific industrial landscapes and designs innovative AI solutions that revolutionize their business operations. Our versatile skill set and comprehensive industry knowledge enable us to consistently deliver successful outcomes tailored to unique requirements. Notable achievements include developing an automated plane-parking system for airports in the US and Canada, developing AI solutions for brain tumor detection and chest X-ray abnormality detection, and creating a profitable ML-based trading strategy for a leading hedge fund.

Role Description

This is a full-time remote role for a Senior Data Engineer. The Senior Data Engineer will be responsible for designing, developing, and maintaining robust data pipelines, modeling data to support business intelligence and analytics needs, executing Extract, Transform, Load (ETL) processes, and managing data warehousing solutions.

Qualifications

  • Data engineering, data modeling, and data warehousing skills
  • Strong experience in designing data warehouses from transactional databases
  • Experience with ETL processes from OLTP to OLAP using tools like Airflow
  • Proficiency in SQL and knowledge of database management systems
  • Strong programming skills in languages such as Python
  • Experience with cloud platforms such as AWS, Google Cloud, or Azure
  • Experience with streaming data, NoSQL databases, and unstructured data is a big plus
  • Excellent problem-solving and communication skills
  • Bachelor's degree in Computer Science, Data Science, Engineering, or a related field

Tip: If you have experience designing a production-grade analytical database from a transactional database, highlight that in your resume and/or application, and you will be given preference.
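Since this role calls out Airflow for OLTP-to-OLAP ETL, here is a minimal, hypothetical DAG sketch (assuming Airflow 2.4+ for the `schedule` argument name); the DAG ID, task names, and callables are placeholders for illustration, not anything specified by the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# All function and table names below are illustrative placeholders.
def extract_oltp_rows():
    """Pull the day's changed rows from the transactional (OLTP) source."""
    print("extracting changed rows from the OLTP database")

def transform_to_star_schema():
    """Reshape OLTP rows into fact/dimension records for the warehouse."""
    print("conforming dimensions and building fact rows")

def load_into_olap():
    """Load the transformed records into the analytical (OLAP) store."""
    print("loading fact and dimension tables")

with DAG(
    dag_id="oltp_to_olap_nightly",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_oltp_rows)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_star_schema)
    load = PythonOperator(task_id="load", python_callable=load_into_olap)

    # Linear dependency: each nightly run extracts, then transforms, then loads.
    extract >> transform >> load
```

In practice each callable would be replaced by operators or hooks for the actual source and warehouse systems; the structure of the DAG stays the same.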

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Location: Hyderabad / Bangalore / Pune Experience: 7-16 Years Work Mode: Hybrid Mandatory Skills: ERstudio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling ,OLAP, OLTP, Erwin, Data Architect and Supplychain (preferred) Job Description We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models to enhance data quality, performance, and scalability. You will collaborate with cross-functional teams including data analysts, architects, and business stakeholders to ensure that the data models align with business requirements and drive efficient data management. Key Responsibilities Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability. Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs. Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions. Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra. Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows. Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models. Stay up-to-date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark. Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade. Drive the adoption of best practices and standards for data modeling within the organization. Skills And Qualifications Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models. Expertise in Azure and Databricks for building data solutions. Proficiency in ER/Studio, Hackolade, and other data modeling tools. Strong understanding of data modeling principles and techniques (e.g., ERD, UML). Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid understanding of data warehousing, ETL processes, and data integration. Familiarity with big data technologies such as Hadoop and Spark is an advantage. Industry Knowledge: A background in supply chain is preferred but not mandatory. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to interact with both technical and non-technical stakeholders. Ability to work well in a collaborative, fast-paced environment. Education B.Tech in any branch or specialization Skills: data visualization,oltp,models,databricks,spark,data modeller,supplychain,oracle,skills,databases,hadoop,dimensional modelling,erstudio,nosql,data warehouse,data models,data modeling,modeling,data architect,er studio,data modelling,data,etl,erwin,mysql,olap,postgresql Show more Show less

Posted 3 weeks ago

Apply

Exploring OLAP Jobs in India

With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.

Average Salary Range

The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.

Career Path

Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.

Related Skills

In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.

Interview Questions

  • What is OLAP and how does it differ from OLTP? (basic)
  • Explain the difference between a star schema and a snowflake schema (see the sketch after this list). (medium)
  • How do you optimize OLAP queries for performance? (advanced)
  • What is the role of aggregation functions in OLAP databases? (basic)
  • Can you explain the concept of drill-down in OLAP? (medium)
  • How do you handle slowly changing dimensions in OLAP databases? (advanced)
  • What are the advantages of using a multidimensional database over a relational database for OLAP purposes? (medium)
  • Describe your experience with OLAP tools such as Microsoft Analysis Services or Oracle OLAP. (basic)
  • How do you ensure data consistency in an OLAP environment? (medium)
  • What are some common challenges faced when working with OLAP databases? (advanced)
  • Explain the concept of data cubes in OLAP. (basic)
  • How do you approach designing a data warehouse for OLAP purposes? (medium)
  • Can you discuss the importance of indexing in OLAP databases? (advanced)
  • How do you handle missing or incomplete data in OLAP analysis? (medium)
  • What are the key components of an OLAP system architecture? (basic)
  • How do you troubleshoot performance issues in OLAP queries? (advanced)
  • Have you worked with real-time OLAP systems? If so, can you explain the challenges involved? (medium)
  • What are the limitations of OLAP compared to other data analysis techniques? (advanced)
  • How do you ensure data security in an OLAP environment? (medium)
  • Have you implemented any data mining algorithms in OLAP systems? If so, can you provide an example? (advanced)
  • How do you approach designing dimensions and measures in an OLAP cube? (medium)
  • What are some best practices for OLAP database design? (advanced)
  • How do you handle concurrent user access in an OLAP environment? (medium)
  • Can you explain the concept of data slicing and dicing in OLAP analysis? (basic)
  • What are your thoughts on the future of OLAP technologies in the era of big data and AI? (advanced)
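For the star-versus-snowflake question flagged in the list above, a concrete sketch helps. The Python below uses the standard-library sqlite3 to hold two hypothetical DDL variants of the same product dimension: one denormalized (star) and one normalized into a hierarchy (snowflake).

```python
import sqlite3

# Hypothetical retail tables, kept in memory for illustration.
conn = sqlite3.connect(":memory:")

# Star schema: each dimension is a single denormalized table joined
# directly to the fact table.
conn.executescript("""
    CREATE TABLE dim_product_star (
        product_sk INTEGER PRIMARY KEY,
        product_name TEXT,
        category_name TEXT              -- category kept inline (denormalized)
    );
    CREATE TABLE fact_sales (
        product_sk INTEGER REFERENCES dim_product_star(product_sk),
        amount REAL
    );
""")

# Snowflake schema: the same dimension normalized into a hierarchy,
# trading an extra join at query time for less repeated data.
conn.executescript("""
    CREATE TABLE dim_category (
        category_sk INTEGER PRIMARY KEY,
        category_name TEXT
    );
    CREATE TABLE dim_product_snowflake (
        product_sk INTEGER PRIMARY KEY,
        product_name TEXT,
        category_sk INTEGER REFERENCES dim_category(category_sk)
    );
""")
conn.close()
```

The practical trade-off: the star variant needs fewer joins at query time, while the snowflake variant avoids repeating category attributes across every product row.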

Closing Remark

As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!
