Jobs
Interviews

954 OLAP Jobs - Page 35

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Duration of Contract: 6 Months
Location: PAN India
Experience Required: 7–8 Years
Prerequisite Skills: SSAS, Advanced SQL

Job Summary
We are looking for a highly skilled SSAS Developer with solid experience building OLAP and Tabular models using SQL Server Analysis Services (SSAS). The ideal candidate should also possess in-depth knowledge of ETL processes using tools such as SSIS, Informatica, or Azure Data Factory, and should be capable of developing scalable and efficient data integration solutions.

Key Responsibilities
- Design, develop, and maintain SSAS OLAP cubes and Tabular models.
- Collaborate with business analysts and data architects to gather and understand business requirements.
- Develop advanced DAX and MDX queries for analytics and reporting.
- Optimize SSAS models for performance, scalability, and efficiency.
- Create and manage ETL pipelines for data extraction, transformation, and loading.
- Integrate data from various relational and non-relational sources.
- Follow best practices in data modeling, version control, and deployment automation.
- Troubleshoot issues related to data performance, integrity, and availability.
- Work with BI tools like Power BI, Excel, or Tableau for dashboard creation and data visualization.

Required Skills
- Strong hands-on experience with SSAS Tabular and Multidimensional models.
- Advanced skills in DAX, MDX, and SQL.
- Proficiency in ETL tools like SSIS, Informatica, or Azure Data Factory.
- Solid understanding of dimensional data modeling and schema designs (star/snowflake).
- Experience with CI/CD pipelines and source control systems (e.g., Git).
- Familiarity with data warehousing concepts and data governance practices.
- Strong problem-solving abilities and attention to detail.

Preferred Qualifications
- Experience with cloud data platforms: Azure Synapse, Snowflake, or AWS Redshift.
- Knowledge of Power BI or other front-end BI tools.
- Familiarity with Agile/Scrum methodologies.
- Bachelor's degree in Computer Science, Information Systems, or a related field.

Mandatory Technical Skills: T-SQL, MS SQL Server, Dimensional Data Modeling, Azure SQL
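The star-schema rollups that an SSAS cube computes can be illustrated with a minimal Python sketch. This is only a conceptual model, not SSAS itself; the table contents, key names, and hierarchy levels here are hypothetical:

```python
from collections import defaultdict

# Hypothetical star schema: a fact table referencing a date dimension
# whose hierarchy (month -> quarter) is what a cube rolls up along.
dim_date = {
    1: {"month": "2024-01", "quarter": "Q1"},
    2: {"month": "2024-02", "quarter": "Q1"},
    3: {"month": "2024-04", "quarter": "Q2"},
}
fact_sales = [
    {"date_key": 1, "amount": 100.0},
    {"date_key": 2, "amount": 250.0},
    {"date_key": 3, "amount": 75.0},
]

def rollup(facts, dim, level):
    """Aggregate a measure up one level of a dimension hierarchy,
    the way an OLAP cube pre-aggregates a measure group."""
    totals = defaultdict(float)
    for row in facts:
        totals[dim[row["date_key"]][level]] += row["amount"]
    return dict(totals)

print(rollup(fact_sales, dim_date, "quarter"))  # {'Q1': 350.0, 'Q2': 75.0}
```

Switching the `level` argument to `"month"` drills down to the finer grain, which is the essential cube operation that DAX and MDX queries express declaratively.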

Posted 2 months ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Information: Date Opened: 05/21/2025 | Job Type: Full time | Industry: Technology | Work Experience: 5+ years | City: Kolkata | State/Province: West Bengal | Country: India | Zip/Postal Code: 700026

Job Description
- Understand BI/reporting-related requirements.
- Design and develop reporting and dashboards.
- Work closely with DBAs to understand data flow and apply optimized data structures to BI/reporting for performance tuning.
- Develop and implement programs and scripting to support BI/reporting and help front-end developers.
- Suggest, share, and implement best practices among team members.
- Develop, build, and deploy BI solutions and reporting tools.
- Maintain, support, evaluate, and improve existing BI/reporting and analytics platforms.
- Create tools to store data (e.g., OLAP cubes).
- Conduct unit testing and troubleshooting.
- Develop and execute database queries and conduct analyses.
- Create visualizations and reports for requested projects.
- Map the various BI/reporting databases and documentation used in the organization.
- Support customers to resolve any issues.

Requirements
- Proven 6+ years of experience as a BI/report developer.
- Proven ability to take initiative and be innovative.
- A clear understanding of data integration tools, OLAP, ETL/ELT processes, warehouse architecture/design (e.g., dimensional modeling), and data mining.
- Microservices architecture is a must.
- Familiarity with cloud-based BI/reporting environments (e.g., Amazon Web Services, Microsoft Power BI, Snowflake, or similar services).
- Knowledge of Amazon products (like QuickSight), the Hadoop platform, and Apache technologies.
- Hands-on knowledge of C#, Scala, R, Python, DAX, or GraphQL is preferable.
- Familiarity with reporting technologies on different platforms (e.g., SQL queries, SSRS, SSIS, MongoDB, MySQL, PGDB).
- An analytical mind with a problem-solving aptitude.
- BSc/BA in Computer Science, Engineering, or a relevant field.
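The "develop and execute database queries and conduct analyses" duty is, at its core, aggregating a measure per dimension value for a report. A minimal sketch using Python's built-in sqlite3 (the schema and sample rows are invented for illustration):

```python
import sqlite3

# In-memory database standing in for a reporting source.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("East", 120.0), ("West", 80.0), ("East", 40.0)],
)

# A typical report-backing query: total a measure per dimension value.
report = con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(report)  # [('East', 160.0), ('West', 80.0)]
```

In a real BI stack the same GROUP BY shape would run against the warehouse, with the result bound to a chart or dashboard tile.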

Posted 2 months ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms, and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Your New Role
The Business Intelligence senior analyst will support the ongoing design and development of dashboards, reports, and other analytics studies or needs. To be successful in the role you’ll need to be intellectually curious, detail-oriented, open to new ideas, and possess data skills and a strong aptitude for quantitative methods. The role requires strong SQL skills and wide experience using BI visualization tools like Tableau and Power BI.

Your Role Accountabilities
- With the support of other analysis and technical teams, collect and analyze stakeholders’ requirements.
- Develop interactive and user-friendly dashboards and reports, partnering with UI/UX designers.
- Be experienced in BI tools like Power BI, Tableau, Looker, MicroStrategy, and BusinessObjects, and be capable of and eager to learn new tools.
- Quickly shape data into reporting and analytics solutions.
- Work with the data and visualization platform team on reporting-tool updates, understanding how new features can benefit our stakeholders, and adapting existing dashboards and reports.
- Have knowledge of database fundamentals such as multidimensional database design, relational database design, and more.

Qualifications & Experiences
- 5+ years of experience working with BI tools or in any data-specific role, with sound knowledge of database management, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing (OLAP).
- Skills in BI tools and BI systems such as Power BI, SAP BO, Tableau, Looker, MicroStrategy, etc.: creating data-rich dashboards, implementing row-level security (RLS) in Power BI, writing DAX expressions, and developing custom BI products with scripting and programming languages such as R, Python, etc.
- In-depth understanding of, and experience with, BI stacks.
- The ability to drill down on data and visualize it in the best possible way through charts, reports, or dashboards.
- Self-motivated and eager to learn.
- Ability to communicate with business as well as technical teams.
- Strong client management skills.
- Ability to learn and quickly respond to a rapidly changing business environment.
- An analytical and problem-solving mindset and approach.

Not Required But Preferred Experience
- BA/BS or MA/MS in a design-related field, or equivalent experience (relevant degree subjects include computer science, digital design, graphic design, web design, and web technology).
- Understanding of software development architecture and technical aspects.

How We Get Things Done…
This last bit is probably the most important!
Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
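The row-level security (RLS) requirement listed above boils down to filtering a dataset by the viewer's role before anything is rendered. In Power BI this is declared on the model with DAX, but the effect can be sketched in plain Python (the role names, regions, and mapping are hypothetical):

```python
# Hypothetical role-to-region mapping; in Power BI this would be an RLS
# role whose DAX filter restricts the Region column.
ROLE_FILTERS = {
    "emea_analyst": {"EMEA"},
    "global_admin": {"EMEA", "AMER", "APAC"},
}
rows = [
    {"region": "EMEA", "revenue": 10},
    {"region": "AMER", "revenue": 20},
    {"region": "APAC", "revenue": 30},
]

def visible_rows(role, data):
    """Return only the rows a role is allowed to see; unknown roles see nothing."""
    allowed = ROLE_FILTERS.get(role, set())
    return [r for r in data if r["region"] in allowed]

print([r["region"] for r in visible_rows("emea_analyst", rows)])  # ['EMEA']
```

The key design point mirrored here is deny-by-default: a role with no filter defined sees an empty result rather than the full table.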

Posted 2 months ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Software Developer - Expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS) - Pune Location

We are seeking a highly skilled and experienced Specialist Power BI Developer to join our team, specifically focused on our client inquiry management project. As a Specialist Power BI Developer, you will be responsible for formulating and delivering automated reports and dashboards using Power BI and other reporting tools, with a specific focus on inquiry reporting metrics such as MTTR, average aging, platform adoption, etc. You will work closely with business stakeholders to understand their requirements related to inquiry management and translate them into functional specifications for reporting applications. You will have expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS), and a strong understanding of database concepts and data modeling.

Responsibilities
- Formulate automated reports and dashboards using Power BI and other reporting tools, with a focus on inquiry reporting metrics.
- Understand the specific business requirements related to inquiry management and set functional specifications for reporting applications.
- Utilize expertise in SQL queries, Power BI, and SSIS to gather and analyze data related to inquiries for reporting purposes.
- Develop technical specifications from business needs and establish deadlines for work completion.
- Design data models that transform raw inquiry data into insightful knowledge by understanding business requirements in the context of inquiry reporting metrics.
- Create dynamic and eye-catching dashboards and reports using Power BI, highlighting key metrics and trends related to inquiries.
- Implement row-level security on data and understand Power BI’s application security layer models, ensuring data privacy and confidentiality related to inquiries.
- Collaborate with cross-functional teams to integrate, alter, and connect data sources related to inquiries for business intelligence purposes.
- Make necessary tactical and technological adjustments to enhance the current inquiry management reporting systems.
- Troubleshoot and resolve issues related to data quality and reporting, specifically focused on inquiries.
- Communicate effectively with internal teams and client teams to explain requirements and deliver solutions related to inquiry reporting metrics.
- Stay up to date with industry trends and advancements in Power BI and business intelligence for effective inquiry reporting.

Requirements
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Power BI Developer or in a similar role, with a specific focus on reporting related to inquiries or customer service metrics.
- Expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS).
- Excellent communication skills to effectively articulate requirements and collaborate with internal and client teams.
- Strong analytical thinking skills for converting inquiry data into illuminating reports and insights.
- Knowledge of data warehousing, data gateways, and data preparation projects.
- Familiarity with the Microsoft SQL Server BI stack, including SSIS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Detailed knowledge and understanding of database management systems, OLAP, and the ETL framework.
- Proficiency in Microsoft Excel and other data analysis tools.
- Ability to gather and analyze business requirements specific to inquiries and translate them into technical specifications.
- Strong attention to detail and ability to QA and validate data for accuracy.
- Ability to manage multiple projects and deadlines simultaneously.
- Knowledge of Agile development methodologies is a plus.
- Ability to learn and adapt to new technologies and tools quickly.

Thank you for considering employment with Fiserv. Please apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies
Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts
Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
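Inquiry metrics like MTTR reduce to simple aggregations over open/close timestamps. A minimal Python sketch (the record layout and sample timestamps are hypothetical; in the posting's stack this would be a DAX measure or SQL query):

```python
from datetime import datetime

# Hypothetical closed-inquiry records with open and close timestamps.
inquiries = [
    {"opened": datetime(2025, 1, 1, 9), "closed": datetime(2025, 1, 1, 13)},  # 4 h
    {"opened": datetime(2025, 1, 2, 9), "closed": datetime(2025, 1, 2, 11)},  # 2 h
]

def mttr_hours(records):
    """Mean time to resolve, in hours, over closed inquiries."""
    durations = [
        (r["closed"] - r["opened"]).total_seconds() / 3600 for r in records
    ]
    return sum(durations) / len(durations)

print(mttr_hours(inquiries))  # 3.0
```

Average aging is the same computation with `closed` replaced by "now" and restricted to still-open inquiries.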

Posted 2 months ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Through our dedicated associates, Conduent delivers mission-critical services and solutions on behalf of Fortune 100 companies and over 500 governments - creating exceptional outcomes for our clients and the millions of people who count on them. You have an opportunity to personally thrive, make a difference and be part of a culture where individuality is noticed and valued every day.

Job Overview
We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and is expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development, and maintenance of business intelligence and analytics solutions.

Responsibilities
- Develop reports, dashboards, and advanced visualizations.
- Work closely with product managers, business analysts, clients, etc. to understand requirements and develop the visualizations needed.
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
- Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation.
- Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Collaborate in design reviews and code reviews to ensure standards are met; recommend new standards for visualizations.
- Build and reuse templates/components/web services across multiple dashboards.
- Support presentations to customers and partners.
- Advise on new technology trends and possible adoption to maintain competitive advantage.
- Mentor associates.

Experience Needed
- 8+ years of related experience is required.
- A Bachelor's or Master's degree in Computer Science or a related technical discipline is required.
- Highly skilled in data visualization tools like Power BI, Tableau, QlikView, etc.
- Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets.
- Strong SQL coding experience, with performance-optimization experience for data queries.
- Understands different data models: normalized, de-normalized, star, and snowflake models.
- Has worked in big data environments, cloud data stores, different RDBMSs, and OLAP solutions.
- Experience in design, development, and deployment of BI systems; candidates with ETL experience preferred.
- Familiar with the principles and practices involved in development and maintenance of software solutions and architectures and in service delivery.
- Has a strong technical background and stays current with technology and industry developments.

Additional Requirements
- Demonstrated ability to have successfully completed multiple, complex technical projects.
- Prior experience with application delivery using an onshore/offshore model.
- Experience with business processes across multiple master data domains in a services-based company.
- Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality.
- Demonstrates high standards of professional behavior in dealings with clients, colleagues, and staff.
- Strong written communication skills; effective and persuasive in both written and oral communication.
- Experience gathering end-user requirements and writing technical documentation.
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.
- May require occasional travel.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by submitting their request through this form that must be downloaded: click here to access or download the form. Complete the form and then email it as an attachment to FTADAAA@conduent.com. You may also click here to access Conduent's ADAAA Accommodation Policy. At Conduent we value the health and safety of our associates, their families and our community. For US applicants while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.

Posted 2 months ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Hyderabad, Bengaluru

Work from Office

sRide Carpool is looking for an Azure SQL Developer DBA to join our dynamic team and embark on a rewarding career journey. The Azure SQL Developer DBA is responsible for designing, implementing, and managing databases in the Azure environment. This role involves optimizing database performance, ensuring data integrity, managing security, and collaborating with development teams to support application deployments and enhancements.

Key Responsibilities

Database Design and Architecture:
- Design and implement Azure SQL databases based on business requirements.
- Optimize database schemas for performance, scalability, and maintainability.
- Define data retention and archiving strategies.

Database Deployment and Management:
- Deploy and manage Azure SQL databases using best practices.
- Configure and maintain database instances, including provisioning, scaling, and patching.
- Monitor database performance and troubleshoot issues.

Performance Optimization:
- Identify and resolve performance bottlenecks by tuning queries, indexes, and database configurations.
- Monitor and analyze query execution plans to optimize performance.
- Implement caching strategies to improve application performance.

Data Security and Compliance:
- Implement security measures to safeguard sensitive data.
- Configure and manage user roles, permissions, and access controls.
- Ensure compliance with data protection regulations (e.g., GDPR, HIPAA).

Backup and Recovery:
- Implement and maintain database backup and recovery processes.
- Develop disaster recovery plans and perform regular backups and testing.

High Availability and Scalability:
- Configure and manage high-availability solutions such as failover groups, availability sets, or geo-replication.
- Monitor and scale databases based on usage patterns and growth.

Collaboration and Support:
- Work closely with development teams to understand application requirements and optimize database interactions.
- Provide technical guidance and support to developers for database-related tasks.
- Assist in troubleshooting production incidents related to databases.

Automation and Scripting:
- Develop scripts and automation tools for database provisioning, configuration, and maintenance.
- Implement Infrastructure as Code (IaC) principles for database deployment and management.

Documentation:
- Maintain documentation for database design, configuration, and processes.
- Create operational runbooks and guidelines for database-related tasks.
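The "analyze query execution plans" responsibility can be illustrated with SQLite's `EXPLAIN QUERY PLAN` from Python's standard library. SQLite stands in for Azure SQL here purely for a runnable sketch, and the table and index names are made up; the workflow (inspect plan, add index, confirm the planner switches from a scan to an index search) is the same idea a DBA applies to SQL Server execution plans:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id INTEGER, payload TEXT)")

def plan(sql):
    """Return the planner's detail strings for a statement."""
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

# Without an index, the equality lookup scans the whole table...
before = plan("SELECT * FROM events WHERE user_id = 42")
con.execute("CREATE INDEX idx_events_user ON events(user_id)")
# ...after indexing, the planner switches to an index search.
after = plan("SELECT * FROM events WHERE user_id = 42")
print(before, after)
```

The exact detail wording varies by SQLite version, but `before` reports a table scan while `after` names `idx_events_user`.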

Posted 2 months ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Chennai

Work from Office

Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando's Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF) and as one of the fastest-growing technology companies by Deloitte.

Role
As the Senior Lead for AI and Data Warehouse at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights.

Key Responsibilities
- Lead the development of scalable, high-performance data pipelines using PySpark or other big data ETL pipeline technologies.
- Drive data modeling efforts for analytics, dashboards, and knowledge graphs.
- Oversee the implementation of Parquet-based data lakes.
- Work on OLAP databases, ensuring optimal data structures for reporting and querying.
- Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries.
- Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs.
- Mentor and lead a team of engineers, building out the data and AI services organization.

Requirements
- 8-10 years of experience in big data and AI technologies, with expertise in PySpark or similar big data ETL pipeline technologies.
- Strong proficiency in SQL and OLAP database technologies.
- Firsthand experience with data modeling for analytics, dashboards, and knowledge graphs.
- Proven experience with Parquet-based data lake implementations.
- Expertise in building highly scalable, high-volume data pipelines.
- Experience with modular, reusable, low-code-based implementations.
- Involvement in large-scale enterprise big data implementations.
- Initiative-taker with strong motivation and the ability to lead a growing team.

Preferred
- Experience leading a team or building out a new department.
- Experience with cloud-based data platforms and AI services.
- Familiarity with supply chain technology or fulfillment platforms is a plus.

Join us at Pando and lead the transformation of our AI and data services, delivering innovative solutions for global enterprises!

Job Type: Full time | Job Opening: Technical Lead - AI & Data Warehouse | Chennai, Tamil Nadu, India 600017 | Date Opened: 2025-05-08
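The "modular, reusable" pipeline design the role emphasizes can be sketched in plain Python: each stage is a small function over records, and a pipeline is just their composition. The stage names and fields are illustrative; a production version would express the same stages as PySpark transformations over a Parquet-backed DataFrame:

```python
from functools import reduce

# Each stage is a small, reusable function from records to records.
def drop_nulls(rows):
    """Filter out records with any missing value."""
    return [r for r in rows if all(v is not None for v in r.values())]

def add_margin(rows):
    """Derive a new column from existing ones."""
    return [{**r, "margin": r["revenue"] - r["cost"]} for r in rows]

def build_pipeline(*stages):
    """Compose stages left-to-right into a single callable."""
    return lambda rows: reduce(lambda acc, stage: stage(acc), stages, rows)

pipeline = build_pipeline(drop_nulls, add_margin)
data = [
    {"revenue": 100, "cost": 60},
    {"revenue": None, "cost": 10},  # dropped by the first stage
]
print(pipeline(data))  # [{'revenue': 100, 'cost': 60, 'margin': 40}]
```

Because stages share one signature, teams can publish them as a library and assemble new pipelines declaratively, which is the essence of the low-code reuse the posting describes.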

Posted 2 months ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
- Total 6+ years of experience, including 3 years of experience with the Selenium automation tool and 2 years of software development in Java 2EE.
- Experience in other automation tools (Mercury tools or a self-created test-harness tool) and white-box testing (Java APIs), i.e., JUnit.
- 4-year college degree in Computer Science or a related field, i.e., BE or MCA.
- Good understanding of XML, XSL/XSLT, RDBMS, and the Linux platform.
- Experience in multidimensional (OLAP) technology, data warehousing, and financial software would be desirable.
- A motivated individual interested in learning leading-edge technology and testing complex software.

Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
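The white-box testing pattern the posting asks for (JUnit against Java APIs) carries over directly to Python's unittest. A minimal sketch; the function under test and its behavior are invented for illustration:

```python
import unittest

def parse_price(text):
    """Function under test (hypothetical): parse '1,234.50' into a float."""
    return float(text.replace(",", ""))

class ParsePriceTest(unittest.TestCase):
    # White-box tests target known internal branches: the thousands-separator
    # handling and the error path for non-numeric input.
    def test_thousands_separator(self):
        self.assertEqual(parse_price("1,234.50"), 1234.50)

    def test_invalid_input_raises(self):
        with self.assertRaises(ValueError):
            parse_price("n/a")

# Run the suite programmatically, as a test harness would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ParsePriceTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun)  # 2
```

The structure (one test class per unit, one method per branch, assertion helpers for expected values and expected exceptions) is the same discipline JUnit enforces in Java.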

Posted 2 months ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Overview
We are looking for a Data Engineer who will be part of our Analytics Practice and is expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is the acquisition, transformation, loading, and processing of data from a multitude of disparate data sources, including structured and unstructured data, for advanced analytics and machine learning in a big data environment.

Responsibilities
- Engineer a modern data pipeline to collect, organize, and process data from disparate sources.
- Perform data management tasks, such as conducting data profiling, assessing data quality, and writing SQL queries to extract and integrate data.
- Develop efficient data collection systems and sound strategies for getting quality data from different sources.
- Consume and analyze data from the data pool to support inference, prediction, and recommendation of actionable insights to support business growth.
- Design and develop ETL processes using tools and scripting.
- Troubleshoot and debug ETL processes.
- Performance-tune and optimize the ETL processes.
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
- Collaborate in design reviews and code reviews to ensure standards are met; recommend new standards for visualizations.
- Learn and develop new ETL techniques as required to keep up with contemporary technologies.
- Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Support presentations to customers and partners.
- Advise on new technology trends and possible adoption to maintain competitive advantage.

Experience Needed
- 8+ years of related experience is required.
- A BS or Master's degree in Computer Science or a related technical discipline is required.
- ETL experience with data integration to support data marts, extracts, and reporting.
- Experience connecting to varied data sources.
- Excellent SQL coding experience with performance optimization for data queries.
- Understands different data models: normalized, de-normalized, star, and snowflake models.
- Has worked with transactional, temporal, time-series, and structured and unstructured data.
- Experience with Azure Data Factory and Azure Synapse Analytics.
- Has worked in big data environments, cloud data stores, different RDBMSs, and OLAP solutions.
- Experience in cloud-based ETL development processes.
- Experience in deployment and maintenance of ETL jobs.
- Familiar with the principles and practices involved in development and maintenance of software solutions and architectures and in service delivery.
- Has a strong technical background and stays current with technology and industry developments.
- At least 3 years of demonstrated success in software engineering, release engineering, and/or configuration management.
- Highly skilled in scripting languages like PowerShell.
- Substantial experience in the implementation and execution of CI/CD processes.
Additional Requirements Demonstrated ability to have successfully completed multiple, complex technical projects Prior experience with application delivery using an Onshore/Offshore model Experience with business processes across multiple Master data domains in a services based company Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality. Demonstrates high standards of professional behavior in dealings with clients, colleagues and staff. Is able to make sound and far reaching decisions alone on major issues and to take full responsibility for them on a technical basis. Strong written communication skills. Is effective and persuasive in both written and oral communication. Experience with gathering end user requirements and writing technical documentation Time management and multitasking skills to effectively meet deadlines under time-to-market pressure Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by submitting their request through this form that must be downloaded: click here to access or download the form. Complete the form and then email it as an attachment to FTADAAA@conduent.com. You may also click here to access Conduent's ADAAA Accommodation Policy. At Conduent we value the health and safety of our associates, their families and our community. 
For US applicants: while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.

Posted 2 months ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description
Design, develop, troubleshoot, and debug software programs for databases, applications, tools, networks, etc. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with the developing, designing, and debugging of software applications or operating systems. Work is non-routine and very complex, involving the application of advanced technical/business skills in the area of specialization. Leading contributor individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to the functional area. 7 years of software engineering or related experience.
Overview of Product – Oracle Analytics
Be part of an energetic and challenging team building an enterprise analytics platform that allows users to quickly gain insights on their most valuable asset: data. Oracle Analytics is an industry-leading product that empowers entire organizations with a full range of business analytics tools, enterprise-ready reporting, and engaging, easy-to-use self-service data visualizations. Our customers are business users who demand a software product that allows easy, fast navigation through the full spectrum of data scale, from simple spreadsheets to analyzing enormous volumes of information in enterprise-class data warehouses. Oracle Analytics is a comprehensive solution that meets the breadth of all analytics needs. Get the right data to the right people at the right time, with analytics for everyone in your organization. With built-in security and governance, you can easily share insights and collaborate with your colleagues. By leveraging the cloud, you can scale up or down to suit your needs. The Oracle Analytics Cloud offering is a leading cloud service at Oracle, built on Oracle Cloud Infrastructure.
It runs on a Generation 2 offering and provides consistently high performance and unmatched governance and security controls. Self-service analytics drive business agility with faster time to insights. You no longer need help from IT to access, prepare, analyze, and collaborate on all your data. Easily create data visualizations with automated chart recommendations, and optimize insights by collaborating with colleagues on analyses. Augmented analytics with embedded machine learning throughout the platform drive smarter and better insights. Always on, and always working in the background, machine learning continuously learns from the data it takes in, making it smarter and more accurate as time goes by. Uncover deeper patterns and predict trends for impactful, unbiased recommendations. On the team we develop, deploy, and support the Oracle Analytics platform, helping our customers succeed in their journey to drive business value. You will work with experts in their field and explore the latest technologies; you will be challenged while creating features that will be delivered to our customers, asked to be creative, and hopefully have some fun along the way. Members of our team are tasked to take on challenges across all aspects of our product. https://www.oracle.com/solutions/business-analytics Career Level - IC4
Responsibilities
As a member of the development team, you will design, code, debug, and deliver innovative analytic features involving C++ development, with extensive exposure to highly scalable, distributed, multithreaded applications. You will work closely with your peer developers located across the world, including Mexico, India, and the USA.
Key responsibilities include:
Design, develop, test, and deliver new features on a world-class analytics platform suitable for deployment to both the Oracle Cloud and on-premise environments
Lead the creation of formal design specifications and coding of complex systems
Work closely with Product Management on product requirements and functionality
Build software applications following established coding standards
Communicate continually with the project teams, explaining progress on the development effort
Contribute to continuous improvement by suggesting improvements to the user interface or software architecture, or recommending new technologies
Ensure quality of work through development standards and QA procedures
Perform maintenance and enhancements on existing software
Key Qualifications:
BS/MS in Computer Science or related major
Exceptional analytic and problem-solving skills
Extensive experience in using, building, and debugging multithreaded applications
Ability to design large, scalable systems for enterprise customers
Solid understanding of concurrency, multithreading, and memory management
Experienced in C++ programming, including templates, the STL, and object-oriented patterns
Interest or experience in database kernel development
Understanding of SQL and relational data processing concepts like joins and indexing strategies
Experience with Java, Python, or other scripting languages
Experienced in distributed and scalable server-side software development
Knowledge in developing, implementing, and optimizing software algorithms
Solid knowledge of data structures and operating systems
Basic understanding of Agile/Scrum development methodologies
Hands-on experience using source control tools such as Git
Strong written and verbal English communication skills
Self-motivated and passionate about developing high-quality software
Strong team player
Other Qualifications:
Knowledge of Business Intelligence or Analytics
Familiarity with SQL query optimization and execution
Experience with big data technologies (such as Hadoop, Spark)
Interest in or experience with OLAP, data warehousing, or multidimensional databases
Familiarity with cloud services such as OCI, AWS, or Azure
Knowledge of Terraform/Python
About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
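The qualifications above call out "relational data processing concepts like joins and indexing strategies." As a purely illustrative sketch (not from the posting; the tables and column names are invented), a hash join shows both ideas at once: build an index on one input, then probe it instead of rescanning.

```python
# Illustrative hash join: the equality-join strategy a database engine often
# uses. Build a hash index on the smaller input, then probe it row by row.
# All table and column names below are invented for this example.

def hash_join(left, right, key):
    """Join two lists of dicts on `key`, indexing the smaller side."""
    build, probe = (left, right) if len(left) <= len(right) else (right, left)
    index = {}                           # key value -> rows with that value
    for row in build:
        index.setdefault(row[key], []).append(row)
    joined = []
    for row in probe:                    # O(1) lookups instead of a nested scan
        for match in index.get(row[key], []):
            joined.append({**match, **row})
    return joined

customers = [{"cust_id": 1, "name": "Ada"}, {"cust_id": 2, "name": "Grace"}]
orders = [{"cust_id": 1, "amount": 50}, {"cust_id": 2, "amount": 75}]
result = hash_join(customers, orders, "cust_id")
```

The same build-then-probe trade-off is what an index on a join column buys a SQL engine: lookup cost replaces repeated scans.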

Posted 2 months ago

Apply

10.0 - 15.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Job Title: Senior SQL Developer
Experience: 10-15 Years
Location: Bangalore
Experience: Minimum of 10+ years in database development and management roles.
SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts.
AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters.
PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques.
Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments.
Cloud Proficiency: Strong experience with AWS services like ECS, S3, KMS, Lambda, Glue, and IAM.
Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems.
Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation.
Preferred Qualifications
Leadership: Prior experience in leading database or data engineering teams.
Data Visualization: Familiarity with reporting and visualization tools like Tableau, Power BI, or Looker.
DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git).
Certifications: Any relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) will be a plus.
Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows will be a significant advantage.
Soft Skills
Strong problem-solving and analytical capabilities.
Exceptional communication skills for collaboration with technical and non-technical stakeholders.
A results-driven mindset with the ability to work independently or lead within a team.
Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or equivalent. 10+ years of experience

Posted 2 months ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Introduction
A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.
Your Role And Responsibilities
Develop, test, and support future-ready data solutions for customers across industry verticals
Develop, test, and support end-to-end batch and near real-time data flows/pipelines
Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, and information management and associated technologies
Communicate risks and ensure understanding of these risks.
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Minimum of 5+ years of related experience required
Experience in modeling and business system designs
Good hands-on experience with DataStage and cloud-based ETL services
Great expertise in writing T-SQL code
Well versed in data warehouse schemas and OLAP techniques
Preferred Technical And Professional Experience
Ability to manage and make decisions about competing priorities and resources
Ability to delegate where appropriate
Must be a strong team player/leader
Ability to lead data transformation projects with multiple junior data engineers
Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization
Ability to clearly communicate complex business problems and technical solutions

Posted 2 months ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 Years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)
Job Description
We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models to enhance data quality, performance, and scalability. You will collaborate with cross-functional teams including data analysts, architects, and business stakeholders to ensure that the data models align with business requirements and drive efficient data management.
Key Responsibilities
Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows.
Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
Stay up-to-date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
Drive the adoption of best practices and standards for data modeling within the organization.
Skills And Qualifications
Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
Expertise in Azure and Databricks for building data solutions. Proficiency in ER/Studio, Hackolade, and other data modeling tools. Strong understanding of data modeling principles and techniques (e.g., ERD, UML). Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid understanding of data warehousing, ETL processes, and data integration. Familiarity with big data technologies such as Hadoop and Spark is an advantage. Industry Knowledge: A background in supply chain is preferred but not mandatory. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to interact with both technical and non-technical stakeholders. Ability to work well in a collaborative, fast-paced environment. Education: B.Tech in any branch or specialization
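The "Dimensional Modelling, OLAP, OLTP" requirement above can be made concrete with a minimal sketch. This is an illustration only, not material from the posting: the supply-chain domain, tables, and column names are all invented. A star schema keeps additive measures in a fact table and descriptive attributes in dimension tables, so OLAP-style rollups become a key lookup plus an aggregation.

```python
# Hedged sketch of a star schema: one fact table holding surrogate keys and
# additive measures, with descriptive attributes pushed out to dimensions.
# The shipment domain and every name here are invented for illustration.

dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gasket", "category": "Hardware"},
}
dim_date = {
    20240101: {"year": 2024, "month": 1, "day": 1},
}
# Fact rows: only keys into the dimensions, plus measures that sum cleanly.
fact_shipments = [
    {"product_key": 1, "date_key": 20240101, "qty": 10, "cost": 125.0},
    {"product_key": 2, "date_key": 20240101, "qty": 5, "cost": 60.0},
]

def total_qty_by_category(facts, products):
    """A typical OLAP rollup: aggregate a measure by a dimension attribute."""
    totals = {}
    for f in facts:
        cat = products[f["product_key"]]["category"]
        totals[cat] = totals.get(cat, 0) + f["qty"]
    return totals

rollup = total_qty_by_category(fact_shipments, dim_product)
```

A snowflake schema differs only in that the dimension tables themselves are further normalized (e.g., `category` split into its own table keyed from `dim_product`).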

Posted 2 months ago

Apply

0 years

0 Lacs

Delhi, India

On-site

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.
Key Responsibilities
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (Star/Snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.
Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (Advanced SQL, Views, Joins, Indexes)
DAX & MDX
Data Modeling & OLAP concepts
Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Posted 2 months ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.
Key Responsibilities
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (Star/Snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.
Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (Advanced SQL, Views, Joins, Indexes)
DAX & MDX
Data Modeling & OLAP concepts
Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Posted 2 months ago

Apply

0 years

0 Lacs

Faridabad, Haryana, India

On-site

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.
Key Responsibilities
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (Star/Snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.
Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (Advanced SQL, Views, Joins, Indexes)
DAX & MDX
Data Modeling & OLAP concepts
Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Posted 2 months ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 Years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)
Job Description
We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models to enhance data quality, performance, and scalability. You will collaborate with cross-functional teams including data analysts, architects, and business stakeholders to ensure that the data models align with business requirements and drive efficient data management.
Key Responsibilities
Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows.
Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
Stay up-to-date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
Drive the adoption of best practices and standards for data modeling within the organization.
Skills And Qualifications
Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
Expertise in Azure and Databricks for building data solutions. Proficiency in ER/Studio, Hackolade, and other data modeling tools. Strong understanding of data modeling principles and techniques (e.g., ERD, UML). Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid understanding of data warehousing, ETL processes, and data integration. Familiarity with big data technologies such as Hadoop and Spark is an advantage. Industry Knowledge: A background in supply chain is preferred but not mandatory. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to interact with both technical and non-technical stakeholders. Ability to work well in a collaborative, fast-paced environment. Education: B.Tech in any branch or specialization

Posted 2 months ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.
Key Responsibilities
Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (Star/Snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement RLS/OLS and manage report security and governance in SSAS and Power BI.
Primary Required Skills:
SSAS – Tabular & Multidimensional
SQL Server (Advanced SQL, Views, Joins, Indexes)
DAX & MDX
Data Modeling & OLAP concepts
Secondary Skills:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git), deployment best practices

Posted 2 months ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 Years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)
Job Description
We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models to enhance data quality, performance, and scalability. You will collaborate with cross-functional teams including data analysts, architects, and business stakeholders to ensure that the data models align with business requirements and drive efficient data management.
Key Responsibilities
Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
Contribute to and enhance the ETL processes and data integration pipelines to ensure smooth data flows.
Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
Stay up-to-date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
Drive the adoption of best practices and standards for data modeling within the organization.
Skills And Qualifications
Minimum of 6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
Expertise in Azure and Databricks for building data solutions. Proficiency in ER/Studio, Hackolade, and other data modeling tools. Strong understanding of data modeling principles and techniques (e.g., ERD, UML). Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid understanding of data warehousing, ETL processes, and data integration. Familiarity with big data technologies such as Hadoop and Spark is an advantage. Industry Knowledge: A background in supply chain is preferred but not mandatory. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to interact with both technical and non-technical stakeholders. Ability to work well in a collaborative, fast-paced environment. Education: B.Tech in any branch or specialization

Posted 2 months ago

Apply

0 years

0 Lacs

India

Remote

Company Description
Intellekt AI collaborates with companies to gain a deep understanding of their specific industrial landscapes and designs innovative AI solutions that revolutionize their business operations. Our versatile skill set and comprehensive industry knowledge enable us to consistently deliver successful outcomes tailored to unique requirements. Notable achievements include developing an automated plane-parking system for airports in the US and Canada, developing AI solutions for brain tumor detection and chest X-ray abnormality detection, and creating a profitable ML-based trading strategy for a leading hedge fund.
Role Description
This is a full-time remote role for a Senior Data Engineer. The Senior Data Engineer will be responsible for designing, developing, and maintaining robust data pipelines, modeling data to support business intelligence and analytics needs, executing Extract Transform Load (ETL) processes, and managing data warehousing solutions.
Qualifications
Data engineering, data modeling, and data warehousing skills
Strong experience in designing a data warehouse from a transactional database
Experience in Extract Transform Load (ETL) processes from OLTP to OLAP using tools like Airflow
Proficiency in SQL and knowledge of database management systems
Strong programming skills in languages such as Python
Experience with cloud platforms such as AWS, Google Cloud, or Azure
Experience with streaming data, NoSQL databases, and unstructured data is a big plus
Excellent problem-solving and communication skills
Bachelor's degree in Computer Science, Data Science, Engineering, or related field
Tip: If you have experience with designing a production-grade analytical database from a transactional database, highlight that in your resume and/or application, and you will be given preference.
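The OLTP-to-OLAP ETL work described above can be sketched minimally. This is an invented example, not code from the role: the table shapes and names are assumptions, and a real pipeline would run such a transform inside an orchestrator task (e.g., an Airflow operator) rather than inline.

```python
# Minimal sketch (invented schema): the "T" of an OLTP-to-OLAP ETL step.
# Normalized transactional rows are denormalized into wide, analysis-ready
# fact rows with dimension attributes joined in and measures precomputed.

oltp_orders = [
    {"order_id": 1, "customer_id": 10, "product_id": 100, "qty": 2, "unit_price": 9.5},
    {"order_id": 2, "customer_id": 10, "product_id": 101, "qty": 1, "unit_price": 4.0},
]
oltp_customers = {10: {"region": "EMEA"}}
oltp_products = {100: {"category": "Books"}, 101: {"category": "Music"}}

def transform(orders, customers, products):
    """Extracted rows in, denormalized fact rows out."""
    facts = []
    for o in orders:
        facts.append({
            "order_id": o["order_id"],
            "region": customers[o["customer_id"]]["region"],    # join in dimension attrs
            "category": products[o["product_id"]]["category"],
            "revenue": o["qty"] * o["unit_price"],              # precompute the measure
        })
    return facts

facts = transform(oltp_orders, oltp_customers, oltp_products)
```

Denormalizing at load time is the standard trade-off here: analytical queries avoid runtime joins at the cost of some redundancy in the warehouse.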

Posted 2 months ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 Years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)

Job Description

We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models that enhance data quality, performance, and scalability. You will collaborate with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure that the data models align with business requirements and drive efficient data management.

Key Responsibilities

Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
Contribute to and enhance ETL processes and data integration pipelines to ensure smooth data flows.
Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
Stay up to date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
Drive the adoption of best practices and standards for data modeling within the organization.

Skills And Qualifications

6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
Expertise in Azure and Databricks for building data solutions.
Proficiency in ER/Studio, Hackolade, and other data modeling tools.
Strong understanding of data modeling principles and techniques (e.g., ERD, UML).
Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
Solid understanding of data warehousing, ETL processes, and data integration.
Familiarity with big data technologies such as Hadoop and Spark is an advantage.
Industry Knowledge: A background in supply chain is preferred but not mandatory.
Excellent analytical and problem-solving skills.
Strong communication skills, with the ability to interact with both technical and non-technical stakeholders.
Ability to work well in a collaborative, fast-paced environment.

Education

B.Tech in any branch or specialization

Skills: data visualization, OLTP, Databricks, Spark, data modelling, supply chain, Oracle, databases, Hadoop, dimensional modelling, ER/Studio, NoSQL, data warehouse, data models, data architect, ETL, Erwin, MySQL, OLAP, PostgreSQL
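Dimensional modelling work of the kind this posting lists usually involves slowly changing dimensions. A minimal, hypothetical sketch of Type 2 SCD versioning in plain Python follows (field names are invented and this mirrors no specific tool's API): instead of overwriting a changed attribute, the current row is expired and a new versioned row is inserted.

```python
from datetime import date

def scd2_upsert(dim_rows, natural_key, new_attrs, today):
    """Apply one source record to a Type 2 SCD dimension held as a list of dicts."""
    current = next((r for r in dim_rows
                    if r["natural_key"] == natural_key and r["is_current"]), None)
    if current and current["attrs"] == new_attrs:
        return dim_rows  # no attribute change, nothing to version
    if current:
        current["is_current"] = False   # expire the old version
        current["valid_to"] = today
    dim_rows.append({"natural_key": natural_key, "attrs": new_attrs,
                     "valid_from": today, "valid_to": None, "is_current": True})
    return dim_rows

dim = []
scd2_upsert(dim, "C001", {"segment": "retail"}, date(2024, 1, 1))
scd2_upsert(dim, "C001", {"segment": "enterprise"}, date(2024, 6, 1))
print(len(dim), dim[0]["is_current"], dim[1]["is_current"])  # 2 False True
```

The same expire-and-insert pattern is what ETL tools implement under names like "SCD Type 2 transformation"; history is preserved because every version of the customer survives with its validity window.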

Posted 2 months ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities

Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
Build and optimize MDX or DAX queries for advanced reporting needs.
Create and manage data models (Star/Snowflake schemas) supporting business KPIs.
Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
Maintain data quality and consistency across data sources and reporting layers.
Implement row-level and object-level security (RLS/OLS) and manage report security and governance in SSAS and Power BI.

Required Skills

Primary:
SSAS - Tabular & Multidimensional
SQL Server (advanced SQL, views, joins, indexes)
DAX & MDX
Data modeling & OLAP concepts

Secondary:
ETL tools (SSIS or equivalent)
Power BI or similar BI/reporting tools
Performance tuning & troubleshooting in SSAS and SQL
Version control (TFS/Git) and deployment best practices

Skills: business intelligence, data visualization, SQL, data modeling & OLAP concepts, DAX & MDX, data analysis, performance tuning & troubleshooting in SSAS and SQL, SSAS (Tabular & Multidimensional), ETL (SSIS or equivalent), version control (TFS/Git), deployment best practices, SQL Server (advanced SQL, views, joins, indexes), Power BI or similar BI/reporting tools
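To make the DAX/MDX requirement concrete, here is a hedged, illustrative pair of queries. They target an invented SSAS model (a "Sales" cube with a Date dimension) and are held as Python strings only so the snippet is self-contained; in practice they would be authored and run in SSMS, Visual Studio, or Power BI, not Python.

```python
# Illustrative only: model, table, and measure names below are invented.

# DAX (Tabular): a year-to-date measure over an assumed Sales[Amount] column.
dax_measure = """
Total Sales YTD :=
TOTALYTD ( SUM ( Sales[Amount] ), 'Date'[Date] )
"""

# MDX (Multidimensional): the same idea against a cube, slicing a measure by year.
mdx_query = """
SELECT
    [Measures].[Sales Amount] ON COLUMNS,
    [Date].[Calendar Year].MEMBERS ON ROWS
FROM [Sales]
"""

print("TOTALYTD" in dax_measure, "ON ROWS" in mdx_query)  # True True
```

The contrast is the one the posting implies: DAX defines reusable measures inside a Tabular model, while MDX addresses a Multidimensional cube by axes (COLUMNS/ROWS) and hierarchies.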

Posted 2 months ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At Personify Health, we value and celebrate diversity and are committed to creating an inclusive environment for all employees. We believe in creating teams made up of individuals with various backgrounds, experiences, and perspectives. Why? Because diversity inspires innovation and collaboration and challenges us to produce better solutions. But more than this, diversity is our strength, and a catalyst in our ability to #changelivesforgood.

Job Summary

As a Business Intelligence Developer, you'll understand the schema layer to build complex BI reports and dashboards, with a keen focus on the healthcare and wellbeing industry. Your SQL skills will play a significant role in data manipulation and delivery, and your experience with MicroStrategy will be vital for creating BI tools and reports. This role will help migrate and build new analytics products based on MicroStrategy to support teams with their internal and external reporting for Health Comp data.

Essential Functions/Responsibilities/Duties

Work closely with the Senior Business Intelligence Engineer and BI Architect to understand the schema objects and build BI reports and dashboards.
Participate in sprint refinement, planning, and kick-off to understand the Agile process and sprint priorities.
Develop the transformations and aggregate tables required for reporting and dashboard needs.
Understand the schema layer in MicroStrategy and the business requirements.
Develop complex reports and dashboards in MicroStrategy.
Investigate and troubleshoot issues with dashboards and reports.
Proactively research new technologies and propose improvements to processes and the tech stack.
Create test cases and scenarios to validate dashboards and maintain data accuracy.

Education And Experience

3 years of experience in Business Intelligence and data warehousing
3+ years of experience in MicroStrategy report and dashboard development
2 years of experience in SQL
Bachelor's or Master's degree in IT, Computer Science, or ECE
Nice to have: any MicroStrategy certifications

Required Knowledge, Skills, And Abilities

Strong at writing complex SQL, including aggregate functions, subqueries, and complex date calculations, and able to teach these concepts to others.
Detail-oriented and able to examine data and code for quality and accuracy.
Self-starter: takes initiative when inefficiencies or opportunities are seen.
Good understanding of modern relational and non-relational models and the differences between them.
Good understanding of data warehouse concepts, snowflake and star schema architecture, and SCD concepts.
Good understanding of MicroStrategy schema objects.
Develop public objects such as metrics, filters, prompts, derived objects, custom groups, and consolidations in MicroStrategy.
Develop complex reports and dashboards using OLAP and MTDI cubes.
Create complex dashboards with data blending.
Understand VLDB settings and report optimization.
Understand security filters and connection mappings in MicroStrategy.

Physical Requirements

Constantly operates a computer and other office productivity machinery, such as a copy machine, printer, and calculator.
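The SQL expectations above (aggregate functions, subqueries, date calculations) can be sketched in a small, self-contained example. SQLite and the invented `claims` table stand in for the real warehouse; the query finds each member's highest-spend month by aggregating over a subquery that itself aggregates by month.

```python
import sqlite3

# Throwaway table with invented columns, standing in for a claims warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member TEXT, paid REAL, service_date TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)", [
    ("m1", 100.0, "2024-03-10"),
    ("m1",  50.0, "2024-04-02"),
    ("m2", 200.0, "2024-03-15"),
])

# Inner subquery: per-member monthly totals, with strftime doing the date
# bucketing. Outer query: aggregate again to get each member's peak month.
rows = conn.execute("""
    SELECT member, MAX(month_total)
    FROM (
        SELECT member,
               strftime('%Y-%m', service_date) AS month,
               SUM(paid) AS month_total
        FROM claims
        GROUP BY member, month
    )
    GROUP BY member
    ORDER BY member
""").fetchall()
print(rows)  # [('m1', 100.0), ('m2', 200.0)]
```

The same shape (aggregate over an aggregated subquery, plus a date-truncation function) carries over to warehouse SQL dialects, though the date function names differ by engine.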

Posted 2 months ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Location: Hyderabad / Bangalore / Pune
Experience: 7-16 Years
Work Mode: Hybrid
Mandatory Skills: ER/Studio, Data Warehouse, Data Modelling, Databricks, ETL, PostgreSQL, MySQL, Oracle, NoSQL, Hadoop, Spark, Dimensional Modelling, OLAP, OLTP, Erwin, Data Architect, and Supply Chain (preferred)

Job Description

We are looking for a talented and experienced Senior Data Modeler to join our growing team. As a Senior Data Modeler, you will be responsible for designing, implementing, and maintaining data models that enhance data quality, performance, and scalability. You will collaborate with cross-functional teams, including data analysts, architects, and business stakeholders, to ensure that the data models align with business requirements and drive efficient data management.

Key Responsibilities

Design, implement, and maintain data models that support business requirements, ensuring high data quality, performance, and scalability.
Collaborate with data analysts, data architects, and business stakeholders to align data models with business needs.
Leverage expertise in Azure, Databricks, and data warehousing to create and manage data solutions.
Manage and optimize relational and NoSQL databases such as Teradata, SQL Server, Oracle, MySQL, MongoDB, and Cassandra.
Contribute to and enhance ETL processes and data integration pipelines to ensure smooth data flows.
Apply data modeling principles and techniques, such as ERD and UML, to design and implement effective data models.
Stay up to date with industry trends and emerging technologies, such as big data technologies like Hadoop and Spark.
Develop and maintain data models using data modeling tools such as ER/Studio and Hackolade.
Drive the adoption of best practices and standards for data modeling within the organization.

Skills And Qualifications

6+ years of experience in data modeling, with a proven track record of implementing scalable and efficient data models.
Expertise in Azure and Databricks for building data solutions.
Proficiency in ER/Studio, Hackolade, and other data modeling tools.
Strong understanding of data modeling principles and techniques (e.g., ERD, UML).
Experience with relational databases (e.g., Teradata, SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
Solid understanding of data warehousing, ETL processes, and data integration.
Familiarity with big data technologies such as Hadoop and Spark is an advantage.
Industry Knowledge: A background in supply chain is preferred but not mandatory.
Excellent analytical and problem-solving skills.
Strong communication skills, with the ability to interact with both technical and non-technical stakeholders.
Ability to work well in a collaborative, fast-paced environment.

Education

B.Tech in any branch or specialization

Skills: data visualization, OLTP, Databricks, Spark, data modelling, supply chain, Oracle, databases, Hadoop, dimensional modelling, ER/Studio, NoSQL, data warehouse, data models, data architect, ETL, Erwin, MySQL, OLAP, PostgreSQL

Posted 2 months ago

Apply

9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

This role is for one of Weekday's clients.
Salary range: Rs 3,500,000 - Rs 6,000,000 (i.e., INR 35-60 LPA)
Min Experience: 9 years
Location: Bengaluru
Job Type: full-time

About the Role

We are seeking a highly experienced and innovative Senior Data Engineer to join our growing data team. As a Data Engineer, you will be responsible for designing, developing, and maintaining robust data pipelines and scalable architectures that support our business intelligence, analytics, and data-driven decision-making initiatives. You will work closely with data scientists, analysts, product managers, and other stakeholders to ensure that our data infrastructure is efficient, reliable, and aligned with business goals. This is a strategic role, ideal for someone who thrives in a fast-paced environment, has deep experience in data engineering best practices, and is passionate about leveraging data to drive impact at scale.

Key Responsibilities

Data Architecture & Pipeline Development: Design and build highly scalable, efficient, and secure data pipelines for batch and real-time data processing. Develop and maintain ETL/ELT processes to extract, transform, and load data from multiple sources into data warehouses and data lakes.
Data Modeling & Warehousing: Create and maintain optimized data models that support advanced analytics and reporting. Design and implement data warehousing solutions using modern data storage technologies.
Data Quality & Governance: Ensure high levels of data availability, quality, and integrity through robust data validation, monitoring, and governance practices. Partner with compliance and data governance teams to enforce data security and privacy policies.
Collaboration & Cross-functional Partnership: Work closely with data scientists, analysts, and business teams to understand data requirements and provide reliable data solutions. Collaborate with DevOps and infrastructure teams to automate deployment and manage cloud-based data environments.
Tooling & Performance Optimization: Implement monitoring tools and optimize the performance of data pipelines and database systems. Stay updated on the latest trends in data engineering and evaluate new tools and technologies for adoption.

Required Qualifications

Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
9+ years of hands-on experience in data engineering, with a deep understanding of scalable data pipeline architecture.
Proficiency in at least one programming language such as Python, Java, or Scala.
Strong experience with ETL/ELT frameworks, data orchestration tools (e.g., Apache Airflow, dbt), and workflow management.
Solid experience with cloud data platforms (e.g., AWS, Azure, GCP) and data storage solutions (e.g., Snowflake, Redshift, BigQuery).
Expertise in SQL and data modeling for both OLAP and OLTP systems.
Familiarity with distributed systems, streaming technologies (Kafka, Spark), and containerization (Docker, Kubernetes) is a plus.

Preferred Skills

Experience in a fast-paced startup or enterprise data team.
Exposure to big data technologies and real-time data processing.
Strong analytical thinking and problem-solving skills.
Excellent communication and project management abilities.
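The data quality and validation responsibilities this posting describes can be sketched as a minimal batch-check function. This is a hypothetical illustration (column names and the threshold are invented): each check appends a failure message, and an empty result means the batch passes and may proceed down the pipeline.

```python
def validate_batch(rows, required=("id", "amount"), max_amount=1_000_000):
    """Return a list of data-quality failures for a batch of row dicts."""
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: required columns must be present and non-null.
        for col in required:
            if row.get(col) is None:
                failures.append(f"row {i}: missing {col}")
        # Uniqueness: primary-key duplicates within the batch.
        if row.get("id") in seen_ids:
            failures.append(f"row {i}: duplicate id {row['id']}")
        seen_ids.add(row.get("id"))
        # Range check: amounts outside a sane window are flagged.
        amt = row.get("amount")
        if amt is not None and not (0 <= amt <= max_amount):
            failures.append(f"row {i}: amount {amt} out of range")
    return failures

good = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
bad = [{"id": 1, "amount": -5.0}, {"id": 1, "amount": None}]
print(validate_batch(good))       # []
print(len(validate_batch(bad)))   # 3
```

Frameworks such as Great Expectations or dbt tests formalize the same idea; the design point is that validation runs as its own pipeline stage and fails loudly before bad data reaches the warehouse.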

Posted 2 months ago

Apply