
458 OLAP Jobs - Page 18

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2 - 5 years

3 - 5 Lacs

Hyderabad

Work from Office


ABOUT AMGEN

Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

ABOUT THE ROLE

Role Description: We are looking for an Associate Data Engineer with deep expertise in building data pipelines for scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
- Own development of complex ETL/ELT data pipelines to process large-scale datasets
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
- Explore and implement new tools and technologies to enhance the ETL platform and the performance of the pipelines
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
- Understand the biotech/pharma domain and build highly efficient data pipelines to migrate and deploy complex data across systems
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
- Collaborate and communicate effectively with product and other cross-functional teams to understand business requirements and translate them into technical solutions

Must-Have Skills:
- Experience in data engineering with a focus on Databricks, AWS, Python, SQL, and Scaled Agile methodologies
- Strong understanding of data processing and transformation in big data frameworks (Databricks, Apache Spark, Delta Lake, and distributed computing concepts)
- Strong, demonstrable understanding of AWS services
- Ability to quickly learn, adapt and apply new technologies
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills
- Experience with the Scaled Agile Framework (SAFe), Agile delivery, and DevOps practices

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry
- Exposure to APIs and full-stack development
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications:
- Any degree and 2-5 years of experience
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly, be organized and detail oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
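The "rigorous quality checks" called out in roles like the one above often boil down to a row-level validation gate run before data is loaded downstream. The following is a minimal, illustrative pure-Python sketch of that idea; the record fields and rules (`patient_id`, `dose_mg`) are invented for the example, and a real pipeline would express the same logic in Spark/Databricks.

```python
# Minimal sketch of a row-level data quality gate, of the kind an
# ETL/ELT pipeline applies before loading records downstream.
# Record shape and rules are hypothetical, for illustration only.

def validate_records(records, rules):
    """Split records into (valid, rejected) based on per-field rules.

    Each rejected entry is a (record, failed_fields) pair so the
    pipeline can route it to a quarantine table with a reason.
    """
    valid, rejected = [], []
    for rec in records:
        failed = [field for field, check in rules.items()
                  if not check(rec.get(field))]
        if failed:
            rejected.append((rec, failed))
        else:
            valid.append(rec)
    return valid, rejected

# Hypothetical rules: a non-empty id and a positive numeric dose.
rules = {
    "patient_id": lambda v: isinstance(v, str) and v.strip() != "",
    "dose_mg": lambda v: isinstance(v, (int, float)) and v > 0,
}
records = [
    {"patient_id": "P001", "dose_mg": 70},
    {"patient_id": "", "dose_mg": 70},      # fails patient_id rule
    {"patient_id": "P003", "dose_mg": -5},  # fails dose_mg rule
]
good, bad = validate_records(records, rules)
```

Keeping the rules as data (a dict of callables) rather than hard-coded branches is one small step toward the metadata-driven architecture the role description mentions.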

Posted 1 month ago


3 - 5 years

4 - 8 Lacs

Gurugram

Work from Office


AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation. At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.

Data Engineer (Internally known as a Sr. Associate Technical Consultant)

AHEAD is looking for a Technical Consultant Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. The Data Engineer will be responsible for working on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools/architectures as well as design and integration of existing transactional processing systems. As a Data Engineer, you will implement data pipelines to enable analytics and machine learning on rich datasets.
Responsibilities:
- Build, operationalize and monitor data processing systems
- Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms that leverage a cloud-native toolset
- Implement custom applications using tools such as Kinesis, Lambda and other cloud-native tools as required to address streaming use cases
- Engineer and support data structures, including but not limited to SQL and NoSQL databases
- Engineer and maintain ELT processes for loading data lakes (Snowflake, Cloud Storage, Hadoop)
- Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
- Respond to customer/team inquiries and assist in troubleshooting and resolving challenges
- Work with other scrum team members to estimate and deliver work inside of a sprint
- Research data questions, identify root causes, and interact closely with business users and technical resources

Qualifications:
- 3+ years of professional technical experience
- 3+ years of hands-on data warehousing
- 3+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, Snowflake
- 2+ years of experience with programming languages such as Python
- 3+ years of experience working in cloud environments (Azure)
- 2 years of experience in Redshift
- Strong client-facing communication and facilitation skills

Key Skills: Python, Azure Cloud, Redshift, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Data Engineering, Snowflake, SQL/RDBMS, OLAP

Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between.
We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning. USA Employment Benefits include - Medical, Dental, and Vision Insurance - 401(k) - Paid company holidays - Paid time off - Paid parental and caregiver leave - Plus more! See https://www.aheadbenefits.com/ for additional benefit details. The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
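The streaming use cases mentioned in this role typically pair a Kinesis stream with a small Lambda consumer. Below is a hedged sketch of such a handler: the base64-encoded `record["kinesis"]["data"]` field is the real shape of a Kinesis Lambda event, but the JSON payload (an `"amount"` field) is a hypothetical example, not anything from this posting.

```python
import base64
import json

# Sketch of an AWS Lambda handler consuming a Kinesis stream.
# Kinesis events deliver each record's data base64-encoded under
# record["kinesis"]["data"]; the payload schema here is made up.

def handler(event, context):
    total = 0.0
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        total += payload.get("amount", 0.0)
    return {"records": len(event["Records"]), "total_amount": total}

def _encode(payload):
    """Build a fake Kinesis record for local testing (no AWS needed)."""
    data = base64.b64encode(json.dumps(payload).encode()).decode()
    return {"kinesis": {"data": data}}

fake_event = {"Records": [_encode({"amount": 9.5}),
                          _encode({"amount": 0.5})]}
result = handler(fake_event, None)
```

Testing the handler locally against a hand-built event like this, before wiring up the stream trigger, is a common way to keep the business logic unit-testable.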

Posted 1 month ago


3 - 5 years

4 - 9 Lacs

Gurugram

Work from Office


AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation. At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.

Data Engineer (Internally known as a Technical Consultant)

AHEAD is looking for a Technical Consultant Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. The Data Engineer will be responsible for working on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools/architectures as well as design and integration of existing transactional processing systems. As a Data Engineer, you will implement data pipelines to enable analytics and machine learning on rich datasets.
Responsibilities:
- Build, operationalize and monitor data processing systems
- Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms that leverage a cloud-native toolset
- Implement custom applications using tools such as Kinesis, Lambda and other cloud-native tools as required to address streaming use cases
- Engineer and support data structures, including but not limited to SQL and NoSQL databases
- Engineer and maintain ELT processes for loading data lakes (Snowflake, Cloud Storage, Hadoop)
- Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
- Respond to customer/team inquiries and assist in troubleshooting and resolving challenges
- Work with other scrum team members to estimate and deliver work inside of a sprint
- Research data questions, identify root causes, and interact closely with business users and technical resources

Qualifications:
- 3+ years of professional technical experience
- 3+ years of hands-on data warehousing
- 3+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, Snowflake
- 2+ years of experience with programming languages such as Python
- 3+ years of experience working in cloud environments (Azure)
- 2 years of experience in Redshift
- Strong client-facing communication and facilitation skills

Key Skills: Python, Azure Cloud, Redshift, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Data Engineering, Snowflake, SQL/RDBMS, OLAP

Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between.
We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning. USA Employment Benefits include - Medical, Dental, and Vision Insurance - 401(k) - Paid company holidays - Paid time off - Paid parental and caregiver leave - Plus more! See https://www.aheadbenefits.com/ for additional benefit details. The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.

Posted 1 month ago


6 - 10 years

12 - 22 Lacs

Coimbatore

Work from Office


Looking for a Database Developer

Posted 1 month ago


5 - 8 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description

Service Line: Digital Products and Services
Rank: 65/66
Type of employment: Permanent
Overall Years of Experience: 5-7 years
Relevant Years of Experience: 2-4 years

Job Summary

We are looking for a data visualiser in Power BI who is motivated to combine the arts of analytics and design. Responsibilities will include translating design wireframes into Power BI dashboards that combine data from SharePoint Lists or Excel sheets.
- Uses standard applications, like Excel and Power BI, to provide reports required in team sites, workflow trackers, etc. created in SharePoint
- Understands and anticipates the customer’s needs to meet or exceed expectations
- Works effectively in a team environment

Essential Technical Skills/Tools
- Essential: Power BI, SharePoint, HTML5, CSS3
- Desirable: MS Excel, Access, SharePoint Modern Pages; design skills - Adobe Photoshop, Illustrator, XD

Essential Roles and Responsibilities
- Design the database architecture required for dashboards
- Possess in-depth knowledge of Power BI and its functionalities
- Translate business needs into technical specifications
- Design, build, and deploy BI solutions (e.g., reporting tools)
- Maintain and support data analytics platforms (e.g., MicroStrategy)
- Create tools to store data (e.g., OLAP cubes)
- Conduct unit testing and troubleshooting
- Evaluate and improve existing BI systems
- Collaborate with teams to integrate systems
- Develop and execute database queries and conduct analyses
- Create visualizations and reports for requested projects
- Develop and update technical documentation

Desirable
- Excellent communication skills, both oral and written
- Ability to work with all levels in the organization
- Ability to communicate effectively with the team and end users
- Good understanding of data analytics principles and ensuring that the application adheres to them
- Ability to manage competing priorities while working collaboratively with customers and stakeholders
- Self-motivated, with the ability to thrive in a dynamic team environment, work across organizational departments, and instill confidence with the client through work quality, time management, organizational skills, and responsiveness
- Experience with user interface design and prototyping

Skill Set Requirement Matrix

Technical Competency - Requirement:
- Power BI: Essential
- SharePoint Designer: Essential
- Adobe - Photoshop, Illustrator, XD: Desirable
- SP Workflow creation: Desirable

Operational Competency - Requirement (rated on the proficiency levels N/A, Fundamental Awareness, Novice, Intermediate, Advanced, Expert):
- Communication skills: Essential
- Result Oriented: Essential
- Listening Skills: Essential
- Customer Focus: Essential
- Time Management: Desirable
- Planning: Desirable

Competency Proficiency Scale

The Proficiency Scale is an instrument used to measure one’s ability to demonstrate a competency on the job. The scale captures a wide range of ability levels and organises them into five steps, from ‘Fundamental Awareness’ to ‘Expert’. In combination with the Proficiency Map for a specific occupation, an individual can compare their current level of proficiency to top performers in the same occupation. This scale serves as the guide to understanding the expected proficiency level of top performers at each grade level.

- N/A (Not Applicable): You are not required to apply or demonstrate this competency. This competency is not applicable to your position.
- Fundamental Awareness (basic knowledge): You have a common knowledge or an understanding of basic techniques and concepts. Focus is on learning.
- Novice (limited experience): You have the level of experience gained in a classroom and/or experimental scenarios or as a trainee on the job. You are expected to need help when performing this skill. Focus is on developing through on-the-job experience. You understand and discuss terminology, concepts, principles, and issues related to this competency, and you utilise the full range of reference and resource materials in this competency.
- Intermediate (practical application): You are able to successfully complete tasks in this competency as requested. Help from an expert may be required from time to time, but you can usually perform the skill independently. Focus is on applying and enhancing knowledge or skill. You have applied this competency to situations occasionally while needing minimal guidance to perform successfully, and you understand and can discuss the application and implications of changes to processes, policies, and procedures in this area.
- Advanced (applied theory): You can perform the actions associated with this skill without assistance. You are recognised within your immediate organisation as ‘a person to ask’ when difficult questions arise regarding this skill. Focus is on broad organizational or professional issues. You have consistently provided practical and relevant ideas and perspectives on process or practice improvements which may easily be implemented. You are capable of coaching others in the application of this competency by translating its complex nuances into easy-to-understand terms, you participate in senior-level discussions regarding this competency, and you assist in the development of reference and resource materials in this competency.
- Expert: You are known as an expert in this area and can provide guidance, troubleshoot and answer questions related to this area of expertise and the field where the skill is used. Focus is strategic. You have demonstrated consistent excellence in applying this competency across multiple projects and organizations, you are considered the ‘go to’ person in this area of expertise, you create new applications for and/or lead the development of reference and resource materials for this competency, and you are able to diagram or explain the relevant process elements and issues in relation to organisational issues and trends in sufficient detail during discussions and presentations to foster a greater understanding among internal and external colleagues and constituents.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago


5 - 8 years

0 Lacs

Kochi, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description

Service Line: Digital Products and Services
Rank: 65/66
Type of employment: Permanent
Overall Years of Experience: 5-7 years
Relevant Years of Experience: 2-4 years

Job Summary

We are looking for a data visualiser in Power BI who is motivated to combine the arts of analytics and design. Responsibilities will include translating design wireframes into Power BI dashboards that combine data from SharePoint Lists or Excel sheets.
- Uses standard applications, like Excel and Power BI, to provide reports required in team sites, workflow trackers, etc. created in SharePoint
- Understands and anticipates the customer’s needs to meet or exceed expectations
- Works effectively in a team environment

Essential Technical Skills/Tools
- Essential: Power BI, SharePoint, HTML5, CSS3
- Desirable: MS Excel, Access, SharePoint Modern Pages; design skills - Adobe Photoshop, Illustrator, XD

Essential Roles and Responsibilities
- Design the database architecture required for dashboards
- Possess in-depth knowledge of Power BI and its functionalities
- Translate business needs into technical specifications
- Design, build, and deploy BI solutions (e.g., reporting tools)
- Maintain and support data analytics platforms (e.g., MicroStrategy)
- Create tools to store data (e.g., OLAP cubes)
- Conduct unit testing and troubleshooting
- Evaluate and improve existing BI systems
- Collaborate with teams to integrate systems
- Develop and execute database queries and conduct analyses
- Create visualizations and reports for requested projects
- Develop and update technical documentation

Desirable
- Excellent communication skills, both oral and written
- Ability to work with all levels in the organization
- Ability to communicate effectively with the team and end users
- Good understanding of data analytics principles and ensuring that the application adheres to them
- Ability to manage competing priorities while working collaboratively with customers and stakeholders
- Self-motivated, with the ability to thrive in a dynamic team environment, work across organizational departments, and instill confidence with the client through work quality, time management, organizational skills, and responsiveness
- Experience with user interface design and prototyping

Skill Set Requirement Matrix

Technical Competency - Requirement:
- Power BI: Essential
- SharePoint Designer: Essential
- Adobe - Photoshop, Illustrator, XD: Desirable
- SP Workflow creation: Desirable

Operational Competency - Requirement (rated on the proficiency levels N/A, Fundamental Awareness, Novice, Intermediate, Advanced, Expert):
- Communication skills: Essential
- Result Oriented: Essential
- Listening Skills: Essential
- Customer Focus: Essential
- Time Management: Desirable
- Planning: Desirable

Competency Proficiency Scale

The Proficiency Scale is an instrument used to measure one’s ability to demonstrate a competency on the job. The scale captures a wide range of ability levels and organises them into five steps, from ‘Fundamental Awareness’ to ‘Expert’. In combination with the Proficiency Map for a specific occupation, an individual can compare their current level of proficiency to top performers in the same occupation. This scale serves as the guide to understanding the expected proficiency level of top performers at each grade level.

- N/A (Not Applicable): You are not required to apply or demonstrate this competency. This competency is not applicable to your position.
- Fundamental Awareness (basic knowledge): You have a common knowledge or an understanding of basic techniques and concepts. Focus is on learning.
- Novice (limited experience): You have the level of experience gained in a classroom and/or experimental scenarios or as a trainee on the job. You are expected to need help when performing this skill. Focus is on developing through on-the-job experience. You understand and discuss terminology, concepts, principles, and issues related to this competency, and you utilise the full range of reference and resource materials in this competency.
- Intermediate (practical application): You are able to successfully complete tasks in this competency as requested. Help from an expert may be required from time to time, but you can usually perform the skill independently. Focus is on applying and enhancing knowledge or skill. You have applied this competency to situations occasionally while needing minimal guidance to perform successfully, and you understand and can discuss the application and implications of changes to processes, policies, and procedures in this area.
- Advanced (applied theory): You can perform the actions associated with this skill without assistance. You are recognised within your immediate organisation as ‘a person to ask’ when difficult questions arise regarding this skill. Focus is on broad organizational or professional issues. You have consistently provided practical and relevant ideas and perspectives on process or practice improvements which may easily be implemented. You are capable of coaching others in the application of this competency by translating its complex nuances into easy-to-understand terms, you participate in senior-level discussions regarding this competency, and you assist in the development of reference and resource materials in this competency.
- Expert: You are known as an expert in this area and can provide guidance, troubleshoot and answer questions related to this area of expertise and the field where the skill is used. Focus is strategic. You have demonstrated consistent excellence in applying this competency across multiple projects and organizations, you are considered the ‘go to’ person in this area of expertise, you create new applications for and/or lead the development of reference and resource materials for this competency, and you are able to diagram or explain the relevant process elements and issues in relation to organisational issues and trends in sufficient detail during discussions and presentations to foster a greater understanding among internal and external colleagues and constituents.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago


10 - 14 years

15 - 22 Lacs

Gurugram

Work from Office


ZS's Master Data Management Team has an extensive track record of completing over 1,000 global projects and partnering with 15 of the top 20 global pharma organizations. They specialize in various MDM domains, offering end-to-end project implementation, change management, and data stewardship support. Their services encompass MDM strategy consulting, implementation for key entities (e.g., HCP, HCO, Employee, Payer, Product, Patient, Affiliations), and operational support including KTLO and data stewardship. With 50+ MDM implementations and change management programs annually for life sciences clients, the team has developed valuable assets like MDM libraries and pre-built accelerators. Strategic partnerships with leading platform vendors (Reltio, Informatica, Veeva, Semarchy, etc.) and collaborations with 18+ data vendors and technology providers further enhance their capabilities.

As Business Technology Solutions Manager, you will take ownership of one or more client deliveries at a cross-office level in the area of digital experience transformation. The successful candidate will work closely with ZS Technology leadership and be responsible for building and managing client relationships, generating new business engagements, and providing thought leadership in the digital area.

What You'll Do
- Lead the delivery process, from discovery/POC to managing operations, across 3-4 client engagements, helping to deliver world-class MDM solutions
- Take ownership of ensuring the proposed design/architecture and deliverables meet client expectations and solve the business problem with a high degree of quality
- Partner with the senior leadership team and assist in project management responsibilities, i.e. project planning, staffing management, people growth, etc.
- Develop and implement master data management strategies and processes to maintain high-quality master data across the organization
- Design and manage data governance frameworks, including data quality standards, policies, and procedures
- Maintain an outlook for continuous improvement and innovation, and provide necessary mentorship and guidance to the team
- Liaise with the staffing partner and HR business partners for team building/planning
- Lead efforts in building a POV on new technology or problem solving, and innovate to build firm intellectual capital
- Actively lead unstructured problem solving to design and build complex solutions, tuned to meet expected performance and functional requirements
- Stay current with industry trends and emerging technologies in master data management and data governance

What You'll Bring
- Bachelor's/Master's degree with specialization in Computer Science, MIS, IT or other computer-related disciplines
- 10-14 years of relevant consulting-industry experience (preferably healthcare and life sciences) working on medium-to-large scale MDM solution delivery engagements
- 5+ years of hands-on experience designing and implementing MDM services using tools such as Informatica MDM, Reltio, etc.
- Strong understanding of data management principles, including data modeling, data quality, and metadata management
- Strong understanding of various cloud-based data management (ETL) platforms such as AWS, Azure, Snowflake, etc.
- Experience in designing and driving delivery of mid-to-large-scale solutions on cloud platforms
- Experience with ETL design and development, and OLAP tools to support business applications

Additional Skills
- Ability to manage a virtual global team environment that contributes to the overall timely delivery of multiple projects
- Knowledge of current data modeling and data warehouse concepts, issues, practices, methodologies, and trends in the Business Intelligence domain
- Experience with analyzing and troubleshooting the interaction between databases, operating systems, and applications
- Significant supervisory, coaching and hands-on project management skills
- Willingness to travel to other global offices as needed to work with clients or other internal project teams
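At the heart of the MDM work this role describes is entity matching: deciding when two records (say, two HCP entries from different data vendors) refer to the same real-world entity. The sketch below illustrates the idea with stdlib `difflib` similarity; the threshold, record fields, and names are invented, and production platforms (Informatica MDM, Reltio) use far richer, rule-driven match engines.

```python
from difflib import SequenceMatcher

# Toy illustration of the fuzzy-matching step in MDM deduplication.
# Threshold and data are hypothetical, for illustration only.

def similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_duplicates(records, threshold=0.85):
    """Return index pairs of records whose names look like one entity."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i]["name"], records[j]["name"]) >= threshold:
                pairs.append((i, j))
    return pairs

records = [
    {"name": "Dr. John A. Smith"},   # same person, punctuation differs
    {"name": "Dr John A Smith"},
    {"name": "Jane Doe"},
]
dupes = find_duplicates(records)
```

Matched pairs would then feed a survivorship step that merges them into a single "golden record", which is where the data stewardship support mentioned above comes in.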

Posted 1 month ago


5 years

0 Lacs

Pune, Maharashtra, India

Hybrid


Role: SQL Developer with SSAS + OLAP + AWS
Location: Bangalore/Pune/Hyderabad/Noida/Gurugram
Type: Full-time
Experience: 5 years & above
Notice: Immediate

JD:
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models
- Create and manage OLAP cubes to support business intelligence reporting
- Develop and implement multidimensional and tabular data models
- Optimize the performance of SSAS solutions for efficient query processing
- Integrate data from various sources into SQL Server databases and SSAS models
- Knowledge of AWS S3 and SQL Server PolyBase preferred
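Conceptually, an OLAP cube like the SSAS cubes this JD describes precomputes the same measure at every combination of dimensions (every grouping set), so rollups and drill-downs are instant. The following is a small conceptual sketch of that idea in plain Python; the dimensions (`region`, `year`) and data are made up, and SSAS of course does this with far more sophisticated storage and aggregation design.

```python
from itertools import combinations
from collections import defaultdict

# Conceptual sketch of what an OLAP cube materializes: a measure
# aggregated at every subset of dimensions. Data is hypothetical.

def build_cube(rows, dimensions, measure):
    """Aggregate `measure` over every combination of `dimensions`."""
    cube = {}
    for r in range(len(dimensions) + 1):
        for dims in combinations(dimensions, r):
            agg = defaultdict(float)
            for row in rows:
                key = tuple(row[d] for d in dims)
                agg[key] += row[measure]
            cube[dims] = dict(agg)
    return cube

rows = [
    {"region": "North", "year": 2023, "sales": 100.0},
    {"region": "North", "year": 2024, "sales": 150.0},
    {"region": "South", "year": 2024, "sales": 200.0},
]
cube = build_cube(rows, ["region", "year"], "sales")
# cube[()] holds the grand total; cube[("region",)] the per-region
# rollup; cube[("region", "year")] the finest-grained cells.
```

With n dimensions this materializes 2^n grouping sets, which is exactly the size/latency trade-off cube designers tune when they choose which aggregations to precompute.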

Posted 1 month ago

Apply

10 - 14 years

12 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Skill required: Delivery - Actionable Insights Designation: I&F Decision Sci Practitioner Assoc Mgr Qualifications: Any Graduation Years of Experience: 10 to 14 years What would you do? Data & AI: We are seeking a highly skilled Insights Consultant in Marketing Analytics to join our team. The ideal candidate will have strong marketing domain knowledge, expertise in OLAP and SQL, and a knack for deriving insights and storytelling. This role involves performing pre- and post-campaign analysis, analyzing campaign effectiveness, and providing recurring insights and analysis across various marketing channels. What are we looking for? Insights Consultant, Marketing Analytics; end-to-end analysis and insights; benchmarking and success metrics; analytics measurement plans; proven experience in marketing analytics, with a strong understanding of OLAP and SQL; expertise in deriving insights and storytelling from data; strong analytical and problem-solving skills; excellent communication and collaboration skills; ability to provide actionable insights and prescriptive recommendations. Roles and Responsibilities: In this role you are required to analyze and solve moderately complex problems. You will typically create new solutions, leveraging and, where needed, adapting existing methods and procedures. The role requires understanding of the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with the direct supervisor or team leads. Generally, you will interact with peers and/or management levels at a client and/or within Accenture, and should require minimal guidance when determining methods and procedures on new assignments. Decisions often impact the team in which you reside and occasionally impact other teams. You would manage small to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Qualifications: Any Graduation
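The pre/post campaign analysis mentioned above usually starts with a simple comparison of a key metric before and after launch. A minimal sketch, with invented daily conversion numbers (a real analysis would control for seasonality, use a holdout group, and test for significance):

```python
# Illustrative pre/post campaign comparison; all metric values are made up.
pre_period = [120, 135, 128, 140]   # daily conversions before launch
post_period = [150, 162, 158, 170]  # daily conversions after launch

pre_avg = sum(pre_period) / len(pre_period)
post_avg = sum(post_period) / len(post_period)
lift_pct = (post_avg - pre_avg) / pre_avg * 100  # relative lift vs. baseline

print(f"pre avg: {pre_avg:.1f}, post avg: {post_avg:.1f}, lift: {lift_pct:.1f}%")
```

In practice this calculation would run as OLAP/SQL aggregations over a campaign fact table rather than Python lists; the arithmetic is the same.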

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Snowflake Data Warehouse. Good to have skills: Snowflake Schema. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve creating innovative solutions to address business needs and ensuring applications are tailored to meet specific requirements. Roles & Responsibilities: Implement Snowflake cloud data warehouse and cloud-related architecture, migrating from various sources to Snowflake. Work with Snowflake capabilities such as Snowpipe, stages, SnowSQL, streams, and tasks. Implement Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, and zero-copy clone. In-depth knowledge and experience in data migration from RDBMS to the Snowflake cloud data warehouse. Deploy Snowflake features such as data sharing, events, and lakehouse patterns. Implement incremental extraction loads, both batched and streaming. Must have: Snowflake certification. Professional & Technical Skills: Must-Have Skills: Proficiency in Snowflake Data Warehouse. Good-To-Have Skills: Experience with Snowflake Schema. Strong understanding of data warehousing concepts. Experience in ETL processes and data modeling. Knowledge of SQL and database management. Ability to troubleshoot and debug applications. Additional Information: The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. This position is based at our Hyderabad office. A 15-year full-time education is required. Qualifications: 15 years full time education
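The "incremental extraction loads" this posting asks for typically track a high-water mark (e.g. a last-modified timestamp) so each batch pulls only new or changed rows rather than the full table. A minimal, Snowflake-agnostic sketch using Python's sqlite3 with a hypothetical orders table:

```python
import sqlite3

# Hypothetical source table; a real pipeline would read from an RDBMS
# and land the rows in Snowflake (e.g. via COPY INTO or Snowpipe).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?)", [
    (1, "2024-01-01"), (2, "2024-01-05"), (3, "2024-01-09"),
])

def extract_incremental(conn, watermark):
    """Pull only rows changed since the last successful load."""
    rows = conn.execute(
        "SELECT id, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # Advance the watermark only after the batch loads successfully.
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

rows, wm = extract_incremental(con, "2024-01-03")
print(rows, wm)  # only orders 2 and 3 are newer than the watermark
```

The same pattern covers streaming loads if the watermark is replaced by a change-stream offset; the key design point is persisting the watermark transactionally with the load so a failed batch is re-extracted.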

Posted 1 month ago

Apply

8 - 13 years

6 - 11 Lacs

Gurugram

Work from Office

Naukri logo

AHEAD is looking for a Senior Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Senior Data Engineer will be responsible for strategic planning and hands-on engineering of Big Data and cloud environments that support our clients' advanced analytics, data science, and other data platform initiatives. This consultant will design, build, and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. They will be expected to be hands-on technically, but also present to leadership and lead projects. The Senior Data Engineer will have responsibility for working on a variety of data projects. This includes orchestrating pipelines using modern Data Engineering tools/architectures as well as design and integration of existing transactional processing systems. As a Senior Data Engineer, you will design and implement data pipelines to enable analytics and machine learning on rich datasets. Roles and Responsibilities: A Data Engineer should be able to design, build, operationalize, secure, and monitor data processing systems. Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and leveraging cloud-native toolsets. Implement custom applications using tools such as Kinesis, Lambda, and other cloud-native tools as required to address streaming use cases. Engineer and support data structures including, but not limited to, SQL and NoSQL databases. Engineer and maintain ELT processes for loading data lakes (Snowflake, Cloud Storage, Hadoop). Engineer APIs for returning data from these structures to the Enterprise Applications. Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions. Respond to customer/team inquiries and assist in troubleshooting and resolving challenges. Work with other scrum team members to estimate and 
deliver work inside of a sprint. Research data questions, identify root causes, and interact closely with business users and technical resources. Qualifications: 8+ years of professional technical experience; 4+ years of hands-on Data Architecture and Data Modelling; 4+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, Snowflake; 4+ years with programming languages such as Python; 2+ years of experience working in cloud environments (AWS and/or Azure); strong client-facing communication and facilitation skills. Key Skills: Python, Cloud, Linux, Windows, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Snowflake, SQL/RDBMS, OLAP, Data Engineering

Posted 1 month ago

Apply

1 - 4 years

4 - 8 Lacs

Lucknow

Work from Office

Naukri logo

Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software. Support the entire application lifecycle (concept, design, develop, test, release, and support). Special interest in SQL/query and/or MDX/XMLA/OLAP data sources. Responsible for end-to-end product development of Java-based services. Work with other developers to implement best practices, introduce new tools, and improve processes. Stay up to date with new technology trends. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 6+ years of software development in a professional environment. Proficiency in Java. Knowledge and experience in relational databases, OLAP and/or query planning is a plus. Knowledge and experience creating applications on cloud platforms (Kubernetes, Red Hat OCP). Exposure to agile development and continuous integration/continuous delivery (CI/CD) environments with tools such as GitHub, JIRA, Jenkins, etc. Other tools: ssh clients, Docker. Excellent interpersonal and communication skills with the ability to effectively articulate technical challenges and devise solutions. Ability to work independently in a large matrix environment. Preferred technical and professional experience: The position requires a back-end developer with strong Java skills. Experience integrating applications with relational databases and/or OLAP data sources would be an asset. Strong working knowledge of SQL and/or MDX/XMLA would also be an asset. Ability to adapt to and learn new technologies. Exposure to the Business Intelligence domain would be an advantage.

Posted 1 month ago

Apply

5 - 10 years

4 - 7 Lacs

Lucknow

Work from Office

Naukri logo

Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software. Support the entire application lifecycle (concept, design, develop, test, release, and support). Special interest in SQL/query and/or MDX/XMLA/OLAP data sources. Responsible for end-to-end product development of Java-based services. Work with other developers to implement best practices, introduce new tools, and improve processes. Stay up to date with new technology trends. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 15+ years of software development in a professional environment. Proficiency in Java. Experience integrating services with relational databases and/or OLAP data sources. Knowledge and experience in relational databases, OLAP and/or query planning. Strong working knowledge of SQL and/or MDX/XMLA. Knowledge and experience creating applications on cloud platforms (Kubernetes, Red Hat OCP). Exposure to agile development and continuous integration/continuous delivery (CI/CD) environments with tools such as GitHub, JIRA, Jenkins, etc. Other tools: ssh clients, Docker. Excellent interpersonal and communication skills with the ability to effectively articulate technical challenges and devise solutions. Ability to work independently in a large matrix environment. Preferred technical and professional experience: The position requires a back-end developer with strong Java skills. Experienced integrating Business Intelligence tools with relational data sources. Experienced integrating Business Intelligence tools with OLAP technologies such as SAP/BW and SAP/BW4HANA. Experienced defining relational or OLAP test assets (test suites, automated tests) to ensure high code coverage and tight integration with Business Intelligence tools. Full lifecycle of SAP/BW and BW4HANA assets: cube upgrades, server, and server support (administering, maintaining, and upgrading using current SAP tooling).

Posted 1 month ago

Apply

4 - 5 years

6 - 7 Lacs

Chennai

Work from Office

Naukri logo

Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore. WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500. Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow. Why we're hiring: WPP is at the forefront of the marketing and advertising industry's largest transformation. Our Global CIO is leading a significant evolution of our Enterprise Technology capabilities, bringing together over 2,500 technology professionals into an integrated global team. This team will play a crucial role in enabling the ongoing transformation of our agencies and functions. GroupM is the world's leading media investment company, responsible for more than $63B in annual media investment through agencies Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the results-driven programmatic audience company Xaxis and data and technology company Choreograph. GroupM's portfolio includes Data & Technology, Investment and Services, all united in a vision to shape the next era of media where advertising works better for people. By leveraging all the benefits of scale, the company innovates, differentiates and generates sustained value for our clients wherever they do business. 
The GroupM IT team in WPP IT is the technology solutions partner for the GroupM group of agencies and is accountable for co-ordinating and assuring end-to-end change delivery, managing the GroupM IT technology life cycle and innovation pipeline. This role will work as part of the Business Platform Team for EMEA. You will be part of a new Data team in Chennai that will support our existing and future BI setup for EMEA markets. You will be responsible for delivering the solutions formulated by product owners and key stakeholders for different EMEA markets. In collaboration with the Data development team, you will update the architecture and data models to accommodate new data needs and changes in source systems. What you'll be doing: Responsible for planning, deploying and managing the quality assurance and testing for the shared services team. Develop and establish quality assurance deliverables and a testing framework, including creating and maintaining test plans and test cases in accordance with changing requirements throughout the development of the application software and data services. Create and manage the automated test framework (where applicable), such as preparing the test environment and maintaining functional testing scripts in accordance with changing requirements. Check data source locations and formats, perform a data count and verify that the columns and data types meet requirements, review keys and remove duplicate data. Identify key ETL mapping scenarios and create SQL queries that simulate them. Each ETL test case should be able to replicate one or more ETL mappings and compare them against real output. Confirm that all data was loaded, and that invalid data was rejected and loaded to the appropriate location (rejects table, etc.). Perform thorough testing of datasets, reports, and dashboards developed in Power BI to ensure data accuracy and completeness. Validate data transformation processes and ETL pipelines that feed into Power BI. 
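The count-and-reconcile checks described above (every source row either loaded or rejected, and no rejected row leaking into the target) can be written as a few SQL assertions. A minimal sketch using Python's sqlite3 with hypothetical source, target, and rejects tables standing in for a real ETL run:

```python
import sqlite3

# Hypothetical tables representing one ETL run: row 2 has a
# non-numeric amount and should land in the rejects table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE source (id INTEGER, amount TEXT);
    CREATE TABLE target (id INTEGER, amount REAL);
    CREATE TABLE rejects (id INTEGER, reason TEXT);
    INSERT INTO source VALUES (1, '10.5'), (2, 'bad'), (3, '7.0');
    INSERT INTO target VALUES (1, 10.5), (3, 7.0);
    INSERT INTO rejects VALUES (2, 'amount not numeric');
""")

src = con.execute("SELECT COUNT(*) FROM source").fetchone()[0]
tgt = con.execute("SELECT COUNT(*) FROM target").fetchone()[0]
rej = con.execute("SELECT COUNT(*) FROM rejects").fetchone()[0]

# Every source row must be accounted for: loaded or rejected.
assert src == tgt + rej, f"row count mismatch: {src} != {tgt} + {rej}"

# No rejected id should also appear in the target.
leaked = con.execute(
    "SELECT COUNT(*) FROM target WHERE id IN (SELECT id FROM rejects)"
).fetchone()[0]
assert leaked == 0, "rejected rows leaked into target"
print("reconciliation passed")
```

In a real test suite the same queries would run against the actual source system and warehouse, with one such reconciliation per ETL mapping.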
Work with the deployment team on setting up continuous integration, build and release automation. Work with the development team (i.e. the business analysts and testers) for user story elaboration, test automation, bug fixes, etc. Ensure defects are tracked in the product backlog, conduct impact analysis of defect fixes discovered during manual or automated tests, validate that completed stories pass the acceptance criteria, raise defects as needed, and work with the development team to improve the expected behavior and test cases. Ensure that only tested stories are demonstrated during the Sprint review(s). What you'll need: 4-5 years in the capacity of a QA Lead. Solid backend experience writing PL/SQL queries. Experience in data modelling. Understanding of data warehouse, data migration, and ETL concepts. Awareness of the concepts related to data warehousing, OLTP and OLAP. Experience in Power BI dashboard and report testing. Excellent analytical skills. Experience with automated testing tools is a plus. In-depth understanding of development methodologies (Agile, Waterfall, etc.). Working with one or more agile development teams. Familiar with Agile tools (e.g. JIRA/Azure DevOps). Continuous QA process improvement. Who you are: You're open: We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are accepting of new ideas, new partnerships, new ways of working. You're optimistic: We believe in the power of creativity, technology, and talent to create brighter futures for our people, our clients, and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected. You're extraordinary: We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we provide extraordinary every day. What we'll give you: Passionate, inspired people - We promote a culture of people that do extraordinary work. 
Scale and opportunity - We offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry. Challenging and stimulating work - Unique work and the opportunity to join a group of creative problem solvers. Are you up for the challenge? #LI-Onsite We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we've adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process. WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers.

Posted 1 month ago

Apply

2 - 6 years

18 - 25 Lacs

Pune

Work from Office

Naukri logo

Senior Associate, Full Stack Engineer. At BNY, our culture empowers you to grow and succeed. As a leading global financial services company at the center of the world's financial system, we touch nearly 20% of the world's investible assets. Every day around the globe, our 50,000+ employees bring the power of their perspective to the table to create solutions with our clients that benefit businesses, communities and people everywhere. We continue to be a leader in the industry, awarded as a top home for innovators and for creating an inclusive workplace. Through our unique ideas and talents, together we help make money work for the world. This is what #LifeAtBNY is all about. We're seeking a future team member for the role of Senior Associate, Full Stack Engineer to join our Compliance Engineering team. This role is in Pune, MH (hybrid). In this role, you'll make an impact in the following ways: Overall 2-6 years of experience with ETL, databases, data warehouses, etc. Need to have in-depth technical knowledge as a Pentaho ETL developer and should feel comfortable working with large internal and external data sets. Experience in OLAP and OLTP, data warehousing, and data model concepts. Good experience with Vertica, Oracle, Denodo, and similar databases. Experienced in design, development, and implementation of large-scale projects in the financial industry using data warehousing ETL tools (Pentaho). Experience creating ETL transformations and jobs using the Pentaho Kettle Spoon designer and Pentaho Data Integration designer, and scheduling them. Proficient in writing SQL statements, complex stored procedures, dynamic SQL queries, batches, scripts, functions, triggers, views, and cursors, and in query optimization. Excellent data analysis skills. Working knowledge of source control tools such as GitLab. Good analytical skills. Good knowledge of PDI architecture. Good experience with Splunk is a plus. 
To be successful in this role, we're seeking the following: Graduates of bachelor's degree programs in business or a related discipline, or equivalent work experience. Relevant domain expertise in the alternative investment services domain or the capital markets and financial services domain is required. At BNY, our culture speaks for itself. Here are a few of our awards: America's Most Innovative Companies, Fortune, 2024; World's Most Admired Companies, Fortune, 2024; Human Rights Campaign Foundation Corporate Equality Index, 100% score, 2023-2024; Best Places to Work for Disability Inclusion, Disability:IN, 100% score, 2023-2024; Most Just Companies, Just Capital and CNBC, 2024; Dow Jones Sustainability Indices, top-performing company for sustainability, 2024; Bloomberg's Gender Equality Index (GEI), 2023. Our Benefits and Rewards: BNY offers highly competitive compensation, benefits, and wellbeing programs rooted in a strong culture of excellence and our pay-for-performance philosophy. We provide access to flexible global resources and tools for your life's journey. Focus on your health, foster your personal resilience, and reach your financial goals as a valued member of our team, along with generous paid leave, including paid volunteer time, that can support you and your family through moments that matter. BNY is an Equal Employment Opportunity/Affirmative Action Employer - Underrepresented racial and ethnic groups/Females/Individuals with Disabilities/Protected Veterans.

Posted 1 month ago

Apply

3 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description This position works with an existing team of Business Intelligence Developers within the Verisk Claims Solutions group that services both internal and external customers. Following existing patterns, the successful candidate will develop and deliver timely business intelligence solutions that meet specific technical requirements. Responsibilities Duties and Responsibilities: Work with Product Owners and customers to understand requirements, validate solutions and troubleshoot issues. Work closely with the QA and Data Engineering teams to develop BI solutions, and have the desire and ability to make connections across different disciplines to support the entire project team. Analyze and understand user behavior, and log and resolve software bugs in a timely manner. Develop high-quality reporting worksheets/data source field lists and data visualizations for dashboards on various Business Intelligence tools such as ThoughtSpot and Power BI. Write SQL scripts for reports and row-level security creation. Maintain, support and enhance our data-analytics-as-a-service platform capabilities. Work with BI Platform Developers on upgrades and onboarding new features. Work with vendor support teams to resolve technical issues. Proactively research and understand the latest BI trends. Requirements QUALIFICATIONS: BS degree in Information Technology or a related discipline with 5+ years of relevant experience, or MS degree in Information Technology or a related discipline with 3+ years of relevant experience. Strong programming, visualization, and analytical skills. Proven ability to solve problems analytically and creatively. Ability to design unit test solutions to ensure high quality and accuracy and set new standards. Strong communication (written and oral) skills. Proficient in creating reports and dashboards in BI tools (e.g. Power BI, ThoughtSpot, Tableau) for internal and external use. Proficient in generating reports and analysis using SQL and databases (e.g. 
Snowflake, MS SQL Server, Postgres, BigQuery, Hive). Understanding of database management systems, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks. Proven ability using scripting languages (e.g. Python, JavaScript, Scala) to connect to and work with APIs. Familiarity with cloud services (e.g. AWS/GCP/Azure) is not required but nice to have. Familiarity with version control (Git) is not required but nice to have. About Us For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, the fourth consecutive year in the UK, Spain, and India, and the second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we've been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World's Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We're 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations. 
Verisk Businesses Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events. Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement Life Insurance Solutions – offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group. Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. 
https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice

Posted 1 month ago

Apply

8 - 10 years

0 Lacs

Jamshedpur, Jharkhand, India

Remote

Linkedin logo

Experience: 8.00+ years. Salary: USD 60000.00 / year (based on experience). Expected Notice Period: 15 Days. Shift: (GMT-05:00) America/Atikokan (EST). Opportunity Type: Remote. Placement Type: Full Time Contract for 12 Months (40 hrs a week/160 hrs a month). (*Note: This is a requirement for one of Uplers' clients - Andela.) What do you need for this opportunity? Must-have skills required: AWS (Amazon Web Services), MariaDB, Azure (Microsoft Azure), Azure Certified, Computer & Network Security, Computer Hardware, Cybersecurity, Oracle Database, MongoDB, MySQL, PostgreSQL. Andela is looking for: Contract Duration: 12 months. REQUIRED skills: MUST BE AZURE CERTIFIED. 8-10 years of Sr. Postgres DBA experience. Must have successfully migrated on-prem into Azure. Strong Postgres knowledge along with infrastructure knowledge within Azure. Past experience with database migrations and toolsets into Azure. Strong clustering and HA experience. Troubleshoot complex database issues in an accurate and timely manner. Maintain database disaster recovery procedures to ensure continuous availability and speedy recovery. Ensure databases are deployed according to GBS standards and business requirements. Identify and resolve database issues related to performance and capacity. Ensure database management and maintenance tasks are performed effectively. Ensure ticket SLA expectations are met. Stay updated with new database technologies and analyse them for adoption within the existing infrastructure. Able to switch between OLTP and OLAP environments. Bachelor's degree in engineering and/or 8-10+ years of related DBA experience; expert-level experience in at least one database technology platform (multi-platform preferred): MySQL, SQL Server, Oracle, Postgres, Cosmos DB, MongoDB, MariaDB. Multiple-platform experience a plus. The applicant will need to have a deep understanding of integrated security and be able to participate in troubleshooting activities. 
The ability to discuss database-related topics with both technical and business audiences. Troubleshoot, investigate, offer and execute resolutions to database issues. Monitor and report on database storage utilization. Experience writing and interpreting code in Postgres systems, with the ability to understand what others have developed. Monitor, tune and manage scheduled tasks, backup jobs, recovery processes, alerts, and database storage needs in line with firm change control procedures. Perform fault diagnosis, troubleshoot and correct problems at the database and application performance level. Work well in a team environment within the database administration team as well as with other Technical Service Group teams and other departments within Wolters Kluwer. Provide regular reports on the performance and stability of the database environment, proactively identifying coming needs to ensure continued reliable service. Document, develop, test and implement updates to database systems. Enjoy constantly learning new technologies and contributing to the knowledge of others. Work outside of regular business hours as required for project or operational work. Experience: Experienced database professional with at least 8-10+ years of experience in database administration. Experience with all aspects of setup, maintenance, troubleshooting, monitoring, and security. Self-motivated, with the proven ability to work independently. Takes ownership of, and proactively seeks to improve on, existing systems. Technical Skillsets: Solid database administration. Building cloud model servers end to end a plus. On-premises to cloud DB migrations. Data security, backup & recovery tools. Experience working with Windows Server, including Active Directory. Excellent written and verbal communication. Flexible, team player, "get-it-done" personality. Ability to organize and plan work independently. Ability to work in a rapidly changing environment. 
Management of database environments in cloud solutions. Soft Skills: Past experience supervising staff preferred but not required. Ability to work independently. Team oriented; places the success of the team over their own. Mentors and guides other DBAs when there are improvement opportunities. Drives their own development. How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & upload your updated Resume. Step 3: Increase your chances of getting shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 month ago

Apply

8 - 10 years

0 Lacs

Ranchi, Jharkhand, India

Remote

Linkedin logo

Experience: 8.00+ years
Salary: USD 60000.00 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT-05:00) America/Atikokan (EST)
Opportunity Type: Remote
Placement Type: Full-Time Contract for 12 Months (40 hrs a week/160 hrs a month)

(*Note: This is a requirement for one of Uplers' clients - Andela)

What do you need for this opportunity? Must-have skills required: AWS (Amazon Web Services), MariaDB, Azure (Microsoft Azure), Azure Certified, Computer & Network Security, Computer Hardware, Cybersecurity, Oracle Database, MongoDB, MySQL, PostgreSQL

Andela is looking for:
Contract Duration: 12 months

Required skills:
  • MUST BE AZURE CERTIFIED
  • 8-10 years of Sr. Postgres DBA experience
  • Must have successfully migrated on-prem databases into Azure
  • Strong Postgres knowledge along with infrastructure knowledge within Azure
  • Past experience with database migrations and toolsets into Azure
  • Strong clustering and HA experience
  • Troubleshoot complex database issues in an accurate and timely manner
  • Maintain database disaster recovery procedures to ensure continuous availability and speedy recovery
  • Ensure databases are deployed according to GBS standards and business requirements
  • Identify and resolve database issues related to performance and capacity
  • Ensure database management and maintenance tasks are performed effectively
  • Ensure ticket SLA expectations are met
  • Stay updated with new database technologies and analyse them for adoption into the existing infrastructure
  • Able to switch between OLTP and OLAP environments
  • Bachelor's degree in engineering and/or 8-10+ years of related DBA experience; expert-level experience in at least one database technology platform, multi-platform preferred: MySQL, SQL Server, Oracle, Postgres, Cosmos DB, MongoDB, MariaDB. Multiple-platform experience a plus.

The applicant will need to have a deep understanding of integrated security and be able to participate in troubleshooting activities. 
The ability to discuss database-related topics with both technical and business audiences. Troubleshoot, investigate, and offer and execute resolutions to database issues. Monitor and report on database storage utilization. Experience writing and interpreting code in Postgres systems, with the ability to understand what others have developed. Monitor, tune and manage scheduled tasks, backup jobs, recovery processes, alerts, and database storage needs in line with firm change control procedures. Perform fault diagnosis, troubleshoot and correct problems at the database and application performance level. Work well in a team environment within the database administration team as well as with other Technical Service Group teams and other departments within Wolters Kluwer. Provide regular reports on performance and stability of the database environment, identifying upcoming needs proactively to ensure continued reliable service. Document, develop, test and implement updates to database systems. Enjoy constantly learning new technologies and contributing to the knowledge of others. Work outside of regular business hours as required for project or operational work.

Experience: An experienced database professional with at least 8-10+ years of experience in database administration. Experience with all aspects of setup, maintenance, troubleshooting, monitoring, and security. Self-motivated, with the proven ability to work independently. Takes ownership of and proactively seeks to improve on existing systems.

Technical Skillsets: Solid database administration. Building cloud model servers end to end a plus. On-premises to cloud DB migrations. Data security, backup & recovery tools. Experience working with Windows Server, including Active Directory. Excellent written and verbal communication. Flexible, team player, “get-it-done” personality. Ability to organize and plan work independently. Ability to work in a rapidly changing environment. Management of database environments in cloud solutions.

Soft Skills: Past experience supervising staff preferred but not required. Ability to work independently. Team oriented; places the success of the team over their own. Mentors and guides other DBAs when there are improvement opportunities. Drives their own development.

How to apply for this opportunity?
Step 1: Click On Apply! And Register or Login on our portal.
Step 2: Complete the Screening Form & Upload updated Resume.
Step 3: Increase your chances to get shortlisted & meet the client for the Interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Pune, Maharashtra, India

Hybrid

Linkedin logo

Infosys Consulting, the management consulting business of Infosys, is looking to hire a Data Engineer. You should have between 3 and 10 years of Data Engineering experience. Successful candidates will be part of a specialized customer-facing Advanced Analytics team in our rapidly expanding Automation & Artificial Intelligence Consulting Practice. They will work closely with our AI Consultants and SMEs to define and deliver best-in-class solutions. You will be entrepreneurial, inspired by execution excellence, possess strong design skills for creating client deliverables and bid documents, be highly analytical, emotionally intelligent, a team player, and satisfied only when your work has a meaningful impact. You will help our clients identify and qualify opportunities to leverage automation to unlock business value and maintain a competitive edge. Infosys Consulting offers an exceptional opportunity for personal growth; alongside client engagements, you will be recognized for your personal contribution to leading and developing new opportunities, developing thought leadership and service offerings in the domains of artificial intelligence and automation. 
Responsibilities:
  • You will need to be well spoken and have an easy time establishing productive, long-lasting working relationships with a large variety of stakeholders
  • Take the lead on data pipeline design with strong analytical skills and a keen eye for detail to really understand and tackle the challenges businesses are facing
  • You will be confronted with a large variety of Data Engineering tools and other new technologies, as well as with a wide variety of IT, compliance, and security related issues
  • Design and develop world-class technology solutions to solve business problems across multiple client engagements
  • Collaborate with other teams to understand business requirements, client infrastructure, platforms and overall strategy to ensure seamless transitions
  • Work closely with the AI and Automation team to build world-class solutions and to define AI strategy
  • You will possess strong logical structuring and problem-solving skills with an expert-level understanding of databases and have an inherent desire to turn data into actions
  • Strong verbal, written and presentation skills

Requirements:
  • 3-10 years of strong Python or Java data engineering experience
  • Experience in developing data pipelines that process large volumes of data using Python, PySpark, Pandas etc., on AWS
  • Experience in developing ETL, OLAP-based and analytical applications
  • Experience in ingesting batch and streaming data from various data sources
  • Strong experience in writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server etc.)
  • Ability to quickly learn and develop expertise in existing highly complex applications and architectures
  • Exposure to AWS platform's data services (AWS Lambda, Glue, Athena, Redshift, Kinesis etc.)
  • Experience in Airflow DAGs, AWS EMR, S3, IAM and other services
  • Experience working on test cases using pytest/unittest or any other framework
  • Snowflake or Redshift data warehouses
  • Experience with DevOps and CI/CD tools
  • Familiarity with REST APIs
  • Clear and precise communication skills
  • Experience with CI/CD pipelines, branching strategies, and Git for code management
  • Comfortable working in Agile projects
  • Bachelor's degree in computer science, information technology, or a similar field

Locations Offered: Bengaluru, Hyderabad, Chennai, Noida, Gurgaon, Chandigarh, Pune, Navi Mumbai, Indore, Kolkata, Bhubaneswar, Mysuru, Trivandrum

Posted 1 month ago

Apply

4 - 6 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Linkedin logo

Job Description: Database Administrator (DBA)

We are a fast-growing healthtech company building an end-to-end hospital management platform serving public and private hospitals across India. Our cloud-native platform is built on Spring Boot (Java), Angular, and GCP (Google Cloud Platform), with a strong focus on performance, scalability, and compliance (HIPAA, NDHM). We are looking for an experienced and hands-on Database Administrator (DBA) who can work closely with our engineering team to design, optimize, and scale database infrastructure on GCP/AWS and related services.

Location: Ahmedabad (On-site)
Experience: 4-6 Years

Key Responsibilities:
  • Database Optimization & Performance Tuning: Analyze query performance; optimize indexes, partitions, and schema structure.
  • Database Design: Design scalable, normalized, and efficient relational schemas.
  • Multitenancy & Scalability: Architect and implement robust multi-tenant database strategies.
  • Google Cloud Platform (GCP): Work with Google Cloud SQL and Cloud Spanner, and understand IAM, backups, and high availability.
  • Data Security & Compliance: Ensure encryption, RBAC, and support for HIPAA compliance.
  • Collaboration & Development Support: Support Java developers in optimized DB interactions using JPA/Hibernate.

Required Skills & Qualifications:
  • 5+ years of experience as a production DBA (MySQL/PostgreSQL)
  • Strong expertise in query optimization, indexing, and partitioning
  • Deep understanding of multi-tenant DB design patterns
  • Hands-on with GCP/AWS managed database services
  • Experience in the healthcare domain or transaction-heavy enterprise systems
  • Understanding of schema migration tools (e.g., Liquibase)
  • Basic knowledge of Java backend systems and how ORM works
  • Excellent problem-solving and communication skills
  • Experience with data archiving strategies and OLAP vs OLTP
  • Exposure to Power BI or BI reporting tools

Nice to Have:
  • Familiarity with Spring Boot/JPA/Hibernate interactions with RDBMS
  • Familiarity with ElasticSearch or NoSQL for read-heavy use cases
  • FHIR/HL7 healthcare database knowledge

Why Join Us?
  • Build the infrastructure of a fast-scaling healthcare platform impacting millions of lives.
  • Work directly with the VP of Engineering and Core Architecture team.
  • Opportunity to contribute to public health systems, government hospitals, and health-tech innovation.
  • Autonomy, ownership, and a learning-first culture.

If interested, please share your resume on aarohi.patel@artemhealthtech.com with the below mentioned details: Total Exp / Rel. Exp / Current Company / Current Designation / Current Location / Current Salary / Expected Salary / Official Notice Period / How early you can join / Any Offer in Hand (Mention Package and Location) / Reason for Change

Posted 1 month ago

Apply

7 - 10 years

0 Lacs

Chennai

Work from Office

Naukri logo

Job Title: Data Architect - Oracle Fusion Data Warehouse (DWH)
Location: Chennai
Job Type: Full-Time Office Role
Department: Analytics
Reports To: BI Lead

We are seeking a highly experienced Data Architect with deep expertise in Oracle Fusion and Enterprise Data Warehousing to lead the design, development, and implementation of our enterprise data architecture. The ideal candidate will have a strong understanding of Oracle Fusion Applications (ERP, HCM, SCM), Oracle Autonomous Data Warehouse (ADW), and best practices for building scalable and secure enterprise data platforms.

Key Responsibilities:
  • Lead architecture design and implementation of the enterprise data warehouse using Oracle Fusion data sources.
  • Design and maintain conceptual, logical, and physical data models.
  • Define data integration strategies for extracting data from Oracle Fusion SaaS applications into the enterprise DWH.
  • Collaborate with data engineers, business analysts, and application owners to understand business requirements and translate them into scalable data architecture solutions.
  • Guide the development of ETL/ELT pipelines using tools like Oracle Data Integrator (ODI), Oracle Integration Cloud (OIC), or custom solutions.
  • Ensure data quality, integrity, security, and governance across all data layers.
  • Establish and enforce standards and best practices for data modeling, metadata management, and master data management (MDM).
  • Optimize data warehouse performance, cost, and scalability (especially in ADW/OCI environments).
  • Provide architectural direction during Oracle Fusion implementation/migration projects.

Required Qualifications:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 8+ years of experience in enterprise data architecture, with 3+ years focused on Oracle Fusion SaaS and DWH integrations.
  • Proven expertise in Oracle Fusion Applications (ERP, HCM, SCM) and Oracle Autonomous Data Warehouse (ADW). 
Experience with tools such as ODI, OIC, GoldenGate, or Talend. Strong knowledge of data modeling (star/snowflake), data warehousing, and performance tuning. Familiarity with REST/SOAP APIs and FBDI/AOR data extraction mechanisms in Fusion. Solid understanding of data governance, security, and privacy practices. Excellent problem-solving and communication skills. Preferred Skills: Experience with reporting tools such as Oracle Analytics Cloud (OAC), Power BI, or Tableau. Knowledge of cloud architectures (OCI, AWS, Azure) and hybrid environments. Certification in Oracle Fusion or related cloud technologies is a plus. Experience with DevOps/CI-CD pipelines for data workloads.

Posted 1 month ago

Apply

6 - 9 years

20 - 25 Lacs

Bengaluru

Hybrid

Naukri logo

Company Description Epsilon is the leader in outcome-based marketing. We enable marketing that's built on proof, not promises. Through Epsilon PeopleCloud, the marketing platform for personalizing consumer journeys with performance transparency, Epsilon helps marketers anticipate, activate and prove measurable business outcomes. Powered by CORE ID, the most accurate and stable identity management platform representing 200+ million people, Epsilon's award-winning data and technology is rooted in privacy by design and underpinned by powerful AI. With more than 50 years of experience in personalization and performance working with the world's top brands, agencies and publishers, Epsilon is a trusted partner leading CRM, digital media, loyalty and email programs. Positioned at the core of Publicis Groupe, Epsilon is a global company with over 8,000 employees in over 40 offices around the world. For more information, visit https://www.epsilon.com/apac (APAC). Follow us on Twitter at @EpsilonMktg. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice. https://www.epsilon.com/apac/youniverse Wondering what it's like to work with Epsilon? Check out this video that captures the spirit of our resilient minds, our values and our great culture. Job Description The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develop and build products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story. Candidate will be the Senior Software Engineer for Business Intelligence team in the Product Engineering group. 
The Business Intelligence team partners with internal and external clients and technology providers to develop, implement, and manage state-of-the-art data analytics, business intelligence and data visualization solutions for our marketing products. The Sr. Software Engineer will be an individual with strong technical expertise in business intelligence and analytics solutions/tools and will work on the BI strategy in terms of toolset selection, report and visualization best practices, team training, and environment efficiency.

Why we are looking for you:
  • You are an individual with a combination of technical leadership and architectural design skills.
  • You have a solid foundation in business intelligence and analytics solutions/tools.
  • You have experience in product engineering and software development using Tableau, SAP Business Objects, and Kibana dashboard development.
  • You have experience in data integration tools like Databricks.
  • You excel at collaborating successfully with different stakeholders (ERP, CRM, Data Hub and business stakeholders).
  • You have strong experience building reusable database components using SQL queries.
  • You enjoy new challenges and are solution oriented.
  • You like mentoring people and enabling collaboration of the highest order.

What you will enjoy in this role:
  • As part of the Epsilon Product Engineering team, the pace of the work matches the fast-evolving demands of Fortune 500 clients across the globe.
  • As part of an innovative team that's not afraid to take risks, your ideas will come to life in digital marketing products that support more than 50% of automotive dealers in the US.
  • An open and transparent environment that values innovation and efficiency.
  • Exposure to all the different Epsilon products, where reporting plays a key role in efficient decision-making for end users.

What you will do:
  • Work on our BI strategy in terms of toolset selection, report and visualization best practices, team training, and environment efficiency. 
  • Analyze requirements and design data analytics and enterprise reporting solutions in various frameworks (such as Tableau, SAP Business Objects, and others) as part of enterprise, multi-tier, customer-facing applications.
  • Develop data analytics solutions and enterprise reporting solutions hands-on in frameworks such as Tableau, SAP Business Objects, and Kibana; scripting skills in Python are good to have.
  • Build data integration & aggregation pipelines using Databricks.
  • Provide estimates for BI solutions to be developed and deployed.
  • Develop and support cloud infrastructure for BI solutions including automation, process definition and support documentation as required.
  • Work in an agile environment and align with agile/scrum methodology for development work.
  • Follow Data Management processes and procedures and provide input to the creation of data definitions, business rules and data access methods.
  • Collaborate with database administrators and data warehouse architects on data access patterns to optimize data visualization and processing.
  • Assess and come up with infrastructure designs for BI solutions catering to system availability and fault tolerance needs.
  • Establish best practices for workloads on multi-tenant deployments.
  • Document solutions and train implementation and operational support teams.
  • Assess gaps in solutions and make recommendations on how to solve the problem.
  • Understand the priorities of various projects and help steer organizational tradeoffs to focus on the most important initiatives.
  • Show initiative and take responsibility for decisions that impact project and team goals.

Qualifications:
  • BE/B.Tech/MCA only, no correspondence course
  • 7+ years of overall technical hands-on experience; supervisory experience is good to have
  • Experience in developing BI solutions in enterprise reporting frameworks
  • Experience in designing the semantic layer in reporting frameworks and developing reporting models on an OLTP or OLAP environment. 
  • Experience working with large data sets, both structured and unstructured, data warehouses and data lakes.
  • Strong knowledge of multitenancy concepts; object, folder and user group templates; and user access models in BI reporting tool frameworks, including single sign-on integrations with identity and access management systems such as Okta.
  • Experience in performing periodic sizing and establishing monitoring, backup and restore procedures catering to MTTR and MTBF expectations.
  • Working knowledge of OLTP and relational database concepts, data warehouse concepts/best practices and data modeling.
  • Experience in documenting technical designs and procedures and reusable artifacts, and providing technical guidance as needed.
  • Familiarity with cloud stacks (AWS, Azure), cloud deployments and tools.
  • Ability to work on multiple assignments concurrently.

Posted 1 month ago

Apply

8 - 10 years

25 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

Number of Openings*: 1
ECMS Request no in sourcing stage*: 525266
Duration of contract*: 12 Months
Total Yrs. of Experience*: 8-10 Yrs.
Detailed JD (Roles and Responsibilities)*: Manage and maintain NoSQL database systems to ensure optimal performance. Monitor database health and troubleshoot performance issues. Implement and maintain database security measures to protect sensitive data. Collaborate with development teams to design efficient data models. Perform database backups and develop disaster recovery plans. Design, manage, and optimize relational databases. Configure, deploy, and support SQL Server databases. Ensure data security and integrity while managing SQL databases. Analyze and translate business needs into data models. Develop conceptual, logical, and physical data models. Create and enforce database development standards. Validate and reconcile data models to ensure accuracy. Maintain and update existing data models.
Mandatory skills*: Knowledge of OLTP and OLAP data modeling, NoSQL DB; MongoDB preferred.
Desired skills*: Should be good at SQL and PL/SQL; experience in MySQL is a bonus. Must have the interpersonal skills to work with the client and understand the data model of insurance systems.
Domain*: Insurance
Approx. vendor billing rate excluding service tax*: 7588 INR/Day
Precise Work Location* (e.g. Bangalore Infosys SEZ or STP): No constraint; Mumbai, Bengaluru, Pune preferred
BG Check (Before OR After onboarding): Pre-Onboarding
Any client prerequisite BGV Agency*: NA
Is there any working in shifts apart from standard daylight (to avoid confusion post onboarding)*: IST only

Posted 1 month ago

Apply

2 - 6 years

5 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

AWS Data Engineer: As an AWS Data Engineer, you will contribute to our client's work and will have the below responsibilities: Work with the technical development team and team lead to understand desired application capabilities. The candidate will need to develop applications following application development lifecycles and continuous integration/deployment practices. Work to integrate open-source components into data-analytic solutions. Willingness to continuously learn and share learnings with others. Required: 5+ years of directly applicable experience with key focus on Glue and Python; AWS; data pipeline creation. Develop code using Python, such as: developing data pipelines from various external data sources to internal data; use of Glue for extracting data from the design database; developing Python APIs as needed. Minimum 3 years of hands-on experience in Amazon Web Services including EC2, VPC, S3, EBS, ELB, CloudFront, IAM, RDS, CloudWatch. Able to interpret business requirements and analyze, design and develop applications on AWS Cloud and ETL technologies. Able to design and architect serverless applications using AWS Lambda, EMR, and DynamoDB. Ability to leverage AWS data migration tools and technologies including Storage Gateway, Database Migration and Import/Export services. Understands relational database design, stored procedures, triggers, user-defined functions, SQL jobs. Familiar with CI/CD tools, e.g., Jenkins, UCD, for automated application deployments. Understanding of OLAP, OLTP, Star Schema, Snowflake Schema, Logical/Physical/Dimensional Data Modeling. Ability to extract data from multiple operational sources and load into staging, data warehouses, data marts etc. using SCD (Type 1/Type 2/Type 3/Hybrid) loads. Familiar with Software Development Life Cycle (SDLC) stages in Waterfall and Agile environments. 
Nice to have: Familiar with the use of source control management tools for branching, merging, labeling/tagging and integration, such as Git and SVN. Experience working with UNIX/Linux environments. Hands-on experience with IDEs such as Jupyter Notebook. Education & Certification: University degree or diploma and applicable years of experience. Job Segment: Developer, Open Source, Data Warehouse, Cloud, Database, Technology

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About Amgen Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. About The Role Role Description: Let’s do this. Let’s change the world. We are looking for a highly motivated, expert Data Engineer who can own the design and development of complex data pipelines, solutions and frameworks. The ideal candidate will be responsible for designing, developing, and maintaining data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management. 
Roles & Responsibilities:
  • Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets
  • Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems
  • Design and implement solutions to enable unified data access, governance, and interoperability across hybrid cloud environments
  • Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB etc.), APIs, logs, event streams, images, PDFs, and third-party platforms
  • Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
  • Expertise in data quality, data validation and verification frameworks
  • Innovate, explore and implement new tools and technologies to enhance efficient data processing
  • Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
  • Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
  • Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories
  • Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
  • Collaborate and communicate effectively with the product teams and with cross-functional teams to understand business requirements and translate them into technical solutions

Must-Have Skills:
  • Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
  • Proficiency in workflow orchestration and performance tuning on big data processing
  • Strong understanding of AWS services
  • Ability to quickly learn, adapt and apply new technologies
  • Strong problem-solving and analytical skills
  • Excellent communication and teamwork skills
  • Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices

Good-to-Have Skills:
  • Data Engineering experience in the biotechnology or pharma industry
  • Experience in writing APIs to make the data available to consumers
  • Experience with SQL/NoSQL databases and vector databases for large language models
  • Experience with data modeling and performance tuning for both OLAP and OLTP databases
  • Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven etc.), automated unit testing, and DevOps

Education and Professional Certifications:
  • Minimum 5 to 8 years of Computer Science, IT or related field experience
  • AWS Certified Data Engineer preferred
  • Databricks Certificate preferred
  • Scaled Agile SAFe certification preferred

Soft Skills:
  • Excellent analytical and troubleshooting skills
  • Strong verbal and written communication skills
  • Ability to work effectively with global, virtual teams
  • High degree of initiative and self-motivation
  • Ability to manage multiple priorities successfully
  • Team-oriented, with a focus on achieving team goals
  • Ability to learn quickly, be organized and detail oriented
  • Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 1 month ago

Apply

Exploring OLAP Jobs in India

With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.
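The contrast with transactional (OLTP) systems, which OLAP workloads complement, can be seen in miniature with plain SQL. The sketch below is illustrative only, using Python's built-in sqlite3 module and an invented `orders` table: an OLTP-style query touches one row by key, while an OLAP-style query aggregates across many rows for analysis.

```python
import sqlite3

# A tiny, hypothetical transactional table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "acme", 100), (2, "acme", 250), (3, "globex", 75)])

# OLTP-style: fetch a single record for a transaction
one = conn.execute("SELECT amount FROM orders WHERE id = 2").fetchone()[0]

# OLAP-style: aggregate across many records for reporting
per_customer = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer ORDER BY customer").fetchall()

print(one)           # 250
print(per_customer)  # [('acme', 350), ('globex', 75)]
```

Real OLAP platforms run such aggregations over billions of rows, which is why they rely on dedicated storage layouts and pre-computed aggregates rather than a row-store like this.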

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.

Average Salary Range

The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.

Career Path

Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.

Related Skills

In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
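To illustrate the ETL side of that skill set, here is a minimal, hypothetical extract-transform-load sketch using only Python's standard library; the CSV data, `sales` table and column names are invented for illustration.

```python
import csv
import io
import sqlite3

# Extract: parse rows from a CSV source (inlined here for self-containment)
raw = "region,amount\nSouth,1200\nNorth,800\nSouth,400\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast the amount field from text to integer
records = [(r["region"], int(r["amount"])) for r in rows]

# Load: insert the cleaned records into a target table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", records)

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 2400
```

Production ETL tools (Informatica, Glue, ODI, Airflow-orchestrated PySpark jobs) follow this same extract/transform/load shape, just at scale and with scheduling, monitoring and error handling around it.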

Interview Questions

  • What is OLAP and how does it differ from OLTP? (basic)
  • Explain the difference between a star schema and a snowflake schema. (medium)
  • How do you optimize OLAP queries for performance? (advanced)
  • What is the role of aggregation functions in OLAP databases? (basic)
  • Can you explain the concept of drill-down in OLAP? (medium)
  • How do you handle slowly changing dimensions in OLAP databases? (advanced)
  • What are the advantages of using a multidimensional database over a relational database for OLAP purposes? (medium)
  • Describe your experience with OLAP tools such as Microsoft Analysis Services or Oracle OLAP. (basic)
  • How do you ensure data consistency in an OLAP environment? (medium)
  • What are some common challenges faced when working with OLAP databases? (advanced)
  • Explain the concept of data cubes in OLAP. (basic)
  • How do you approach designing a data warehouse for OLAP purposes? (medium)
  • Can you discuss the importance of indexing in OLAP databases? (advanced)
  • How do you handle missing or incomplete data in OLAP analysis? (medium)
  • What are the key components of an OLAP system architecture? (basic)
  • How do you troubleshoot performance issues in OLAP queries? (advanced)
  • Have you worked with real-time OLAP systems? If so, can you explain the challenges involved? (medium)
  • What are the limitations of OLAP compared to other data analysis techniques? (advanced)
  • How do you ensure data security in an OLAP environment? (medium)
  • Have you implemented any data mining algorithms in OLAP systems? If so, can you provide an example? (advanced)
  • How do you approach designing dimensions and measures in an OLAP cube? (medium)
  • What are some best practices for OLAP database design? (advanced)
  • How do you handle concurrent user access in an OLAP environment? (medium)
  • Can you explain the concept of data slicing and dicing in OLAP analysis? (basic)
  • What are your thoughts on the future of OLAP technologies in the era of big data and AI? (advanced)
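Several of the questions above (data cubes, roll-up, slicing and dicing) can be illustrated with a toy in-memory example; the dimension values and figures below are made up for the sketch, and real engines materialize these aggregates rather than recomputing them per query.

```python
from collections import defaultdict

# Each fact: (region, product, quarter, sales amount).
facts = [
    ("North", "Books", "Q1", 100),
    ("North", "Toys",  "Q1", 80),
    ("South", "Books", "Q1", 60),
    ("South", "Books", "Q2", 90),
    ("North", "Books", "Q2", 120),
]

def roll_up(rows, dims):
    """Aggregate the sales measure over the requested subset of dimensions."""
    index = {"region": 0, "product": 1, "quarter": 2}
    totals = defaultdict(int)
    for *keys, amount in rows:
        totals[tuple(keys[index[d]] for d in dims)] += amount
    return dict(totals)

# Roll-up: collapse region and product, keep only the quarter dimension.
print(roll_up(facts, ["quarter"]))   # {('Q1',): 240, ('Q2',): 210}

# Slice: fix one dimension (quarter = Q1), then view totals by region.
q1_slice = [f for f in facts if f[2] == "Q1"]
print(roll_up(q1_slice, ["region"]))  # {('North',): 180, ('South',): 60}
```

Drill-down is the inverse of the roll-up shown here: adding a dimension back (e.g. `["quarter", "region"]`) splits each total into finer-grained cells.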

Closing Remark

As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!

