This GigaOm Research Reprint Expires September 24, 2026
[Cover graphic: "Data, Analytics & AI" radar chart and analyst photo, above the heading "SEMANTIC LAYERS AND METRICS STORES".]
September 25, 2025

GigaOm Radar for Semantic Layers and Metrics Stores v1

Andrew J. Brust

Analyst at GigaOm

1. Executive Summary

Semantic layers present a complete and consistent representation of an organization’s data. They capture data’s business meaning and make it understandable in everyday terms. Metrics stores are subcomponents of semantic layers. They contain the definitions of business metrics and help maintain consistency in analysis and reporting.

Semantic layers help organizations reduce confusion, complexity, and duplication of work. They provide standardization, which increases efficiency, promotes knowledge sharing and interdepartmental collaboration, and improves trust in data analysis. They remove technical barriers to working with data and encourage business users to become involved in data analysis.

Semantic layers impact any user involved in performing data analysis or interpreting the results of insights generated through analytics and reporting. Without the help of a semantic layer solution, different teams may define key terms like “sales revenue” in their own way, leading to inconsistent reports for management. Semantic layers store the official, agreed-upon definitions of business metrics used across all BI reports, spreadsheets, and departmental dashboards. With semantic layers, everyone from business users to executives can rest assured that key metrics, complex financial ratios, or simple time windows such as a “day” or “week” remain consistent regardless of how they are used.
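To make this concrete, a single metrics store entry for "sales revenue" might look like the following vendor-neutral, YAML-style sketch; every field name here is illustrative rather than any specific product's schema:

```yaml
# Illustrative, vendor-neutral metric definition (all names are hypothetical)
metric:
  name: sales_revenue
  description: Net revenue recognized from completed sales
  source_table: fact_orders          # underlying system of record
  expression: SUM(order_amount - discounts - refunds)
  time_grain: day                    # "day" defined once, consistently, for all consumers
  dimensions: [region, product_line, sales_channel]
  owner: finance-analytics           # team accountable for the definition
```

Because every dashboard, report, and spreadsheet resolves "sales revenue" through this single definition, a change made here propagates consistently everywhere the metric is used.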

Business Imperative

The management of an organization must make confident, solid decisions about the direction and future of the business, and those decisions must be founded on data-backed evidence. Semantic layers help organizations achieve these fundamental business objectives by making sure analytics results are thorough, meaningful, deterministic, and consistent. Semantic layers abstract the complexities of individual systems of record and simplify access to data in underlying data sources. They present a high-level, comprehensive picture of an organization’s data that is needed for analysis, ensuring that management can trust the calculations seen in analytics reports and dashboards and make decisions with confidence. 

Today, semantic layers go beyond analytics use cases and are becoming critical in helping organizations adopt and integrate generative AI (GenAI) across their business. GenAI applications and tools rely on enrichment with context to function appropriately and usefully. One of the semantic layer’s core functions is to capture context: the business meaning of data. It maps technical data to business concepts by holding the logic of the business metrics calculations, outlining how tables are related, and storing the metadata for columns. Enriching technical data with meaningful context is key to giving GenAI models the semantic meaning of data that’s needed for them to perform effectively. Semantic layers help businesses future-proof their data estates and position them to gain and maintain the competitive edge that can come only from properly harnessing their data.

This report has evolved from a Sonar (an evaluation of a cutting-edge technology) into a Radar (an evaluation of an established technology category). It showcases the further maturation of previously up-and-coming vendors along with the ways incumbents have invested in enhancing their platforms to address customer needs.

This is our first year evaluating the semantic layer and metrics store space in the context of our Key Criteria and Radar reports. This report builds on our previous GigaOm Sonar report and considers how the market has evolved over the last year. 

This GigaOm Radar report examines seven of the top semantic layer and metrics store solutions and compares offerings against the capabilities (table stakes, key features, and emerging features) and nonfunctional requirements (business criteria) outlined in the companion Key Criteria report. Together, these reports provide an overview of the market, identify leading semantic layer and metrics store offerings, and help decision-makers evaluate these solutions so they can make more informed investment decisions. Note that the features and capabilities described here reflect the offerings as observed during the research phase of this report (August 2025).

GIGAOM KEY CRITERIA AND RADAR REPORTS

The GigaOm Key Criteria report provides a detailed decision framework for IT and executive leadership assessing enterprise technologies. Each report defines relevant functional and nonfunctional aspects of solutions in a sector. The Key Criteria report informs the GigaOm Radar report, which provides a forward-looking assessment of vendor solutions in the sector.

2. Market Categories and User Segments

To help prospective customers find the best fit for their use case and business requirements, we assess how well semantic layer and metrics store solutions are designed to serve specific market categories and user segments (Table 1).

For this report, we recognize the following market categories:

  • Small-to-medium business (SMB): In this category, we assess solutions on their ability to meet the needs of organizations ranging from small businesses to medium-sized companies. Also assessed are departmental use cases in large enterprises where ease of use and deployment are more important than extensive management functionality, data mobility, and feature set.

  • Large enterprise: Here, offerings are assessed on their ability to support large and business-critical projects. Optimal solutions in this category have a strong focus on flexibility, performance, data services, and features to improve security and data protection. Scalability is another big differentiator, as is the ability to deploy the same service in different environments.

  • Specialized: Here, solutions are assessed on their ability to support more niche, specialized scenarios. Optimal solutions will be designed for specific workloads and use cases such as big data analytics and high-performance computing (HPC).

In addition, we recognize the following user segments:

  • Business user: Business users are typically beginners in the realm of data and analytics. While these employees may occasionally need to use analytical tools to perform self-service exploration and analysis, they rely on others to handle the technical aspects of configuring and provisioning them.

  • Business analyst: These users have some knowledge of data analysis tasks and are familiar with using self-service tools to perform analytics. They evaluate data from the perspective of deriving business insights and making recommendations for improvements, such as better performance or reduced costs.

  • Data analyst: These users review data to look for trends and patterns that can benefit organizations at the corporate level. While not as technical as data engineers, data analysts possess knowledge of data preparation, visualization, and analysis that can be applied to inform organizational strategy.

  • Data engineer: Data engineers are very well versed technically and apply their specialized knowledge to helping prepare, organize, and model data, transforming it into actionable information for the organizations they support. 

Table 1. Vendor Positioning: Market Categories and User Segments

Vendor Positioning: Market Categories and User Segments

Target market: Small-to-Medium Business (SMB), Large Enterprise, Specialized
User segment: Business User, Business Analyst, Data Analyst, Data Engineer
Vendors assessed: AtScale, Cube, dbt Labs, Google Cloud, Microsoft, Oracle, SAP

Source: GigaOm 2026

Table 1 components are evaluated in a binary yes/no manner and do not factor into a vendor’s designation as a Leader, Challenger, or Entrant on the Radar chart (Figure 1). 

“Target market” reflects which use cases each solution is recommended for, not simply whether that group can use it. For example, if an SMB could use a solution but doing so would be cost-prohibitive, that solution would be rated “no” for SMBs.

3. Decision Criteria Comparison

All solutions included in this Radar report meet the following table stakes—capabilities widely adopted and well implemented in the sector:

  • Data source connectivity

  • Semantic model openness

  • Security and access controls

  • Client application integration

  • Analytics optimizations

Tables 2, 3, and 4 summarize how each vendor in this research performs in the areas we consider differentiating and critical in this sector. The objective is to give the reader a snapshot of the technical capabilities of available solutions, define the perimeter of the relevant market space, and gauge the potential impact on the business.

  • Key features differentiate solutions, highlighting the primary criteria to be considered when evaluating a semantic layer and metrics store solution.

  • Emerging features show how well each vendor implements capabilities that are not yet mainstream but are expected to become more widespread and compelling within the next 12 to 18 months. 

  • Business criteria provide insight into the nonfunctional requirements that factor into a purchase decision and determine a solution’s impact on an organization.

These decision criteria are summarized below. More detailed descriptions can be found in the corresponding report, “GigaOm Key Criteria for Evaluating Semantic Layer and Metrics Store Solutions.”

Key Features

  • Modeling language support: Many semantic layer platforms provide support for creating, interacting with, and modifying semantic models using code. For those that do, some vendors support a modeling language or have developed their own modeling language for this capability.

  • Federation and virtualization: This key feature describes the use of federation and/or virtualization capabilities, if any, by semantic layer platforms. Such capabilities are used to present a consolidated view of an organization’s data across all of its data sources, without physical movement of the data.

  • Native API support: Some semantic layer offerings support native APIs (implemented using REST, GraphQL, SQL, or other technologies) for querying data. This support allows additional applications, tools, and programming languages to connect with and use the semantic layer platform.

  • Query language compatibility: Query language compatibility describes the semantic layer’s native support for BI-specific query languages, such as multidimensional expressions (MDX) and data analysis expressions (DAX). This compatibility allows semantic layers to work as seamlessly as possible with client applications. 

  • Data product support: Data product support describes the ability, if any, to expose and share entire semantic models, or components of them, as “products” across departments and teams in an organization. This capability enables and enriches a domain-oriented architecture (originally popularized by the “data mesh” framework): it gives individual business teams autonomy, establishes a “product support” commitment from each team to the others, and ensures data is shared and available across domains.

  • Advanced data modeling capabilities: The semantic model is the core of the semantic layer and contains the definitions of business logic mapped to the underlying technical data, including the upfront definitions of measures, dimensions, and hierarchies. Advanced data modeling capabilities include support for multiple fact tables, complex hierarchies, and the ability to define intricate many-to-many relationships.
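As a vendor-neutral sketch of what such a model can express, the following YAML outlines two fact tables, a dimensional hierarchy, and a many-to-many relationship resolved through a bridge table; all names and field conventions are illustrative, not any specific product's schema:

```yaml
# Vendor-neutral sketch: multiple fact tables, a hierarchy, and a
# many-to-many relationship resolved through a bridge table.
model:
  name: retail_analytics
  facts:
    - name: orders          # grain: one row per order line
      table: fact_orders
    - name: inventory       # grain: one row per product per day
      table: fact_inventory
  dimensions:
    - name: product
      table: dim_product
      hierarchy: [category, subcategory, product_name]
    - name: customer
      table: dim_customer
  relationships:
    - from: fact_orders
      to: dim_product
      type: many_to_one
    - from: dim_customer
      to: dim_segment
      through: bridge_customer_segment   # customers can belong to many segments
      type: many_to_many
```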

Table 2. Key Features Comparison 

Key Features Comparison
Legend: ★★★★★ Exceptional · ★★★★ Superior · ★★★ Capable · ★★ Limited · ★ Poor · – Not Applicable

| KEY FEATURES | Average Score | Modeling Language Support | Federation and Virtualization | Native API Support | Query Language Compatibility | Data Product Support | Advanced Data Modeling Capabilities |
|---|---|---|---|---|---|---|---|
| AtScale | 5.0 | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ |
| Cube | 4.2 | ★★★★★ | ★★★★ | ★★★★ | ★★★ | ★★★★ | ★★★★★ |
| dbt Labs | 3.5 | ★★★★★ | ★★ | ★★★★ | ★★★ | ★★★★ | ★★★ |
| Google Cloud | 3.7 | ★★★★★ | ★★★ | ★★★★ | ★★★ | ★★★ | ★★★★ |
| Microsoft | 5.0 | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ |
| Oracle | 3.5 | ★★★ | ★★★★ | ★★★ | ★★★★ | ★★★ | ★★★★ |
| SAP | 3.7 | ★★★ | ★★★★ | ★★★ | ★★★ | ★★★★★ | ★★★★ |

Source: GigaOm 2026

Emerging Features

  • GenAI enablement: Semantic layer offerings perform unique and crucial functions in helping customers leverage GenAI across their organization. Semantic layers capture the business meaning of data, enriching the semantic understanding a large language model (LLM) has of text input, improving GenAI-created output, and offering the potential to benefit many use cases and workloads.

  • Support for DataOps approach: Some semantic layer offerings support DataOps principles in the management of the code for their data modeling languages. This includes support for version control, separate environments for test, staging, and production, and support for continuous integration and continuous deployment (CI/CD).

  • GenAI and agentic features: This criterion encompasses the implementation of features in the semantic layer that make use of or are powered by GenAI, and features that are designed to make use of agentic AI. These could include in-platform GenAI-powered copilot assistants, interfaces for asking questions of data in natural language, and/or natural-language-to-SQL (or other programming language) interfaces.

Table 3. Emerging Features Comparison 

Emerging Features Comparison
Legend: ★★★★★ Exceptional · ★★★★ Superior · ★★★ Capable · ★★ Limited · ★ Poor · – Not Applicable

| EMERGING FEATURES | Average Score | GenAI Enablement | Support for DataOps Approach | GenAI and Agentic Features |
|---|---|---|---|---|
| AtScale | 3.7 | ★★★★ | ★★★★ | ★★★ |
| Cube | 3.3 | ★★★ | ★★★★ | ★★★ |
| dbt Labs | 3.7 | ★★★ | ★★★★ | ★★★★ |
| Google Cloud | 2.3 | ★★★ | ★★★★ | – |
| Microsoft | 4.0 | ★★★★ | ★★★★ | ★★★★ |
| Oracle | 2.3 | ★★★★ | ★★★ | – |
| SAP | 1.0 | ★★★ | – | – |

Source: GigaOm 2026

Business Criteria

  • Ecosystem and integrations: This criterion refers to the nature of the platform’s openness to integrate with third-party platforms from an ecosystem encompassing a wide variety of other analytics and data management categories. Because semantic layer platforms interface with all of an organization’s data sources and analytics applications, they can provide key information to other tools such as metadata management, lineage, monitoring, and observability solutions.

  • Analytics workload diversity: With their standardization of definitions and business logic, semantic layers can benefit all of an organization’s diverse analytics workloads. While the core of such workloads is business intelligence and reporting for management decision-making, support for others, such as machine learning and data science, or GenAI, is also relevant.

  • Hybrid enablement: Hybrid enablement refers to the semantic layer’s ability to unify data across both on-premises and cloud environments. This interoperability across environments helps organizations comply with data sovereignty and residency requirements by ensuring data remains physically stored within specific regional or national boundaries, on-premises and in the cloud.

  • Code and developer orientation: This criterion refers to the degree to which the platform supports a more technical, code-first approach to building and interacting with the semantic model. This developer support is achieved through the presence of multiple platform elements, such as modeling languages, native APIs, a code-first interface, and DataOps support.

  • Ease of use: Some platforms enable a visual, low- or no-code approach to data modeling, an approach that encourages less-technical users to be involved in creating the semantic model. Platforms that take such an approach often provide a graphical UI for representing and building the semantic model, allowing users to drag and drop components onto a visual design canvas and iteratively refine their model there.

Table 4. Business Criteria Comparison 

Business Criteria Comparison
Legend: ★★★★★ Exceptional · ★★★★ Superior · ★★★ Capable · ★★ Limited · ★ Poor · – Not Applicable

| BUSINESS CRITERIA | Average Score | Ecosystem and Integrations | Analytics Workload Diversity | Hybrid Enablement | Code and Developer Orientation | Ease of Use |
|---|---|---|---|---|---|---|
| AtScale | 5.0 | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ |
| Cube | 4.6 | ★★★★★ | ★★★★★ | ★★★★ | ★★★★★ | ★★★★ |
| dbt Labs | 4.2 | ★★★★★ | ★★★★★ | ★★★ | ★★★★★ | ★★★ |
| Google Cloud | 3.8 | ★★★ | ★★★★ | ★★★★ | ★★★★★ | ★★★ |
| Microsoft | 4.6 | ★★★★ | ★★★★★ | ★★★★ | ★★★★★ | ★★★★★ |
| Oracle | 3.8 | ★★★ | ★★★★ | ★★★★ | ★★★★ | ★★★★ |
| SAP | 4.0 | ★★★★ | ★★★★ | ★★★★ | ★★★★ | ★★★★ |

Source: GigaOm 2026

4. GigaOm Radar

The GigaOm Radar plots vendor solutions across a series of concentric rings with those set closer to the center judged to be of higher overall value. The chart characterizes each vendor on two axes—balancing Maturity versus Innovation and Feature Play versus Platform Play—while providing an arrowhead that projects each solution’s evolution over the coming 12 to 18 months.

[Figure: GigaOm Radar chart for Semantic Layers and Metrics Stores, September 2025. AtScale is positioned closest to the center; Cube, Microsoft, dbt Labs, Oracle, SAP, and Google Cloud are also plotted.]

Chart axis legend:

  • Maturity: Emphasis on stability and continuity; may be slower to innovate.

  • Innovation: Flexible and responsive to market; may invite disruption.

  • Feature Play: Offers specific functionality and use case support; may lack broad capability.

  • Platform Play: Offers broad functionality and use case support; may heighten complexity.

Figure 1. GigaOm Radar for Semantic Layers and Metrics Stores

As you can see in Figure 1, the majority of vendors are positioned on the Feature Play side of the Radar because they offer a semantic layer as part of their larger approach to analytics. Major cloud providers are also shown here, as well as other significant industry names. By including semantic layers as part of their data and analytics strategy, these major players show they recognize the value semantic layers bring to enterprise analytics. 

The two pure-play semantic layer offerings fall on the Platform Play half of the Radar, both within the Leaders circle and on the Innovation half. This placement suggests that, while most vendors choose not to build pure semantic layer offerings, the vendors for which a semantic layer is the sole or flagship offering primarily, though not exclusively, drive development and innovation in this market.

The vendors are otherwise nearly evenly divided between the Maturity and Innovation halves of the Radar. This balance reflects that, instead of embracing a completely new technology, the industry has, in many ways, rediscovered the benefits of a technology that has been around all along.

All semantic layer vendors have forward-looking roadmaps, recognizing the intersection of this technology with the larger industry trends of GenAI and agentic AI. But the two Outperformers are so designated because of their particularly ambitious roadmaps and their continual dedication to innovation and the development of new features.

In reviewing solutions, it’s important to keep in mind that there are no universal “best” or “worst” offerings; every solution has aspects that might make it a better or worse fit for specific customer requirements. Prospective customers should consider their current and future needs when comparing solutions and vendor roadmaps.

INSIDE THE GIGAOM RADAR

To create the GigaOm Radar graphic, key features, emerging features, and business criteria are scored and weighted. Key features and business criteria receive the highest weighting and have the most impact on vendor positioning on the Radar graphic. Emerging features receive a lower weighting and have a lower impact on vendor positioning on the Radar graphic. The resulting chart is a forward-looking perspective on all the vendors in this report, based on their products’ technical capabilities and roadmaps.

Note that the Radar is technology-focused, and business considerations such as vendor market share, customer share, spend, recency or longevity in the market, and so on are not considered in our evaluations. As such, these factors do not impact scoring and positioning on the Radar graphic.

For more information, please visit our Methodology.

5. Solution Insights

AtScale: AtScale Universal Semantic Layer

Solution Overview
AtScale’s Universal Semantic Layer blends semantic modeling, data virtualization and federation, open protocols, query acceleration, and centralized data access governance into a single platform. Consistent with its expertise as a pioneer of modern semantic layer technology, AtScale delivers a pure-play semantic layer offering that’s suited for the volumes of data, complex analytics, and AI enablement needs of large enterprises. 

AtScale also makes its capabilities accessible for smaller businesses with several features and options designed for semantic model composability, openness, and flexibility. Some examples of these features are its open source, object-oriented semantic modeling language (SML), an open source library of prebuilt semantic models for popular applications and industry-specific use cases, and a free developer edition for development and proofs of concept. 

Although AtScale is not exclusive in the semantic layer space, it was a pioneer in the development of modern semantic layer platforms. It continues to influence the ongoing development of the technology and how it is applied.

AtScale is positioned as a Leader and Fast Mover in the Innovation/Platform Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
AtScale scored well on a number of decision criteria, including:

  • Modeling language support: AtScale sponsors and continues to develop SML, an object-oriented, YAML-based modeling language for defining semantic model objects and models with code. SML supports modularization and composability of semantic definitions and is described by AtScale as Git native, enabling version history, pull requests, and CI/CD.

  • Query language compatibility: AtScale’s new release (in August 2025) enhanced its support for the DAX (Data Analysis Expressions) language, adding compatibility with DAX Level 1600. According to AtScale, this enhancement allows its solution to “achieve full parity with the modeling features of Power BI, including model inheritance, live connections, and client-side measures.”

  • GenAI enablement: AtScale provides capabilities that allow it to serve as a semantic foundation for GenAI workloads. It supplies the business context for LLMs to produce refined and tailored output, and it provides the governance needed for models to be deployed effectively across organizations. Specific capabilities include support for the Model Context Protocol (MCP), GenAI application development through AtScale’s AI-Link Python SDK and REST, and compatibility with LLMs from OpenAI and Anthropic as well as retrieval-augmented generation (RAG) frameworks like LangChain.
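To give a flavor of the code-first SML style mentioned under modeling language support, here is a short sketch of an SML-like model definition. This is illustrative only: the object and field names below (`object_type`, `unique_name`, and the dataset and metric names) follow SML's general object-oriented, YAML-based conventions but are not guaranteed to match the exact SML schema.

```yaml
# Illustrative SML-style model (approximate syntax; consult the SML
# specification for the exact schema; all names are hypothetical)
object_type: model
unique_name: internet_sales
label: Internet Sales

relationships:                        # how fact and dimension datasets join
  - from:
      dataset: fact_internet_sales
      join_columns: [customer_key]
    to:
      dimension: dim_customer

metrics:
  - unique_name: total_sales_amount   # defined once, reused across models
  - unique_name: order_count
```

Because definitions like this are plain YAML files, they can be modularized, reused across models, and managed through pull requests and CI/CD, which is the "Git native" workflow AtScale describes.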

Opportunities
AtScale has room for improvement in a couple of decision criteria, including: 

  • GenAI and agentic features: AtScale doesn’t yet possess a copilot modeling assistant like some of the other offerings covered in this report do; as of the research phase of this report, this feature remained in development. AtScale says that, when fully developed, the modeling assistant will let users modify and refine models with natural language prompts. Significant features in this area that AtScale does possess include MCP support and one-click modeling for automatic creation of semantic models.

  • Ease of use: According to AtScale, capabilities for automated metric suggestions are currently under development. The introduction of such capabilities into the platform would allow AtScale to proactively suggest KPIs and metrics to be created, based on user behavior and query history.

Purchase Considerations
AtScale is designed to be open and interoperable, deeply integrating with many other solutions: data catalogs, enterprise BI applications, ML tools and platforms, governance and access control offerings, and GenAI frameworks. It supports a strong diversity of analytics workloads, including business intelligence and reporting, predictive analytics and ML, compliance and governance, and GenAI and agentic workloads. It offers deployment options that allow hybrid data access across both cloud and on-premises environments, enabling customers to meet data sovereignty and residency requirements. 

AtScale is designed for significant ease of use, with both visual and code-first modeling interfaces, predefined model templates, and GenAI assistance such as its one-click modeling capability.

The AtScale pricing model is based on usage, specifically on what it terms monthly deployed objects (MDOs)—semantic objects that are actively deployed and used in a production environment. Such objects can include semantic models, measures and calculated metrics, dimensions, hierarchies, and cubes in legacy configurations. The MDO total includes only the objects that are published and active in a given month, such as those that actively power dashboards and reports. Test and development environments and inactive or unpublished models are not included in the billing. AtScale states that this pricing model is designed to improve billing transparency, help customers more easily predict costs, and scale as the logical “semantic footprint” expands rather than as infrastructure does.

Use Cases
AtScale supports a wide variety of enterprise analytics use cases and AI workloads. These include standardizing metrics definitions and accelerating and optimizing business intelligence workloads. AtScale supports aspects of predictive analytics and ML, specifically feature engineering. The semantic layer platform allows users to create centralized, reusable semantic features for training ML models. Through support for the MCP, AtScale makes a governed semantic model available to AI agents and LLMs. This exposure has the potential to enrich custom GenAI chatbots and automations with business context and improve the results of natural language querying.

Cube: Cube Cloud

Solution Overview
Cube was created as a developer-friendly semantic layer platform that can serve as a foundation for building embedded analytics applications. Cube’s native API support and its code-first, developer-oriented platform made it especially suited for this use case (although it now boasts a visual modeling interface too). This approach and its support for embedded analytics were the two main factors that initially propelled its popularity, garnering community support and interest. Though the solution has grown beyond these origins, they remain core areas where it excels and where customers rely on it to provide value.

Cube Cloud is a fully managed, cloud-based offering that builds on and enhances the Cube Core open source semantic layer project. The core components of the platform remain data modeling, caching, access controls, and APIs. Beyond the functionality of the open source project, the managed offering provides a management console, development environment, platform monitoring and observability, and support for additional data sources.  

Cube has built out its offering with a visual modeling interface to complement its code-first modeling capabilities. It has added additional integrations and advanced data modeling capabilities and is in the process of laying the foundation for an overhaul and transformation of its offering with a number of agentic AI features. While not yet generally available as of the research phase of this report (August 2025), these features are described as providing a foundation for users to interact with their data in natural language.  

These features are augmented with GenAI assistance that has been enriched by context stored in the semantic model and that is tailored to different user personas (for example, data analysts, data scientists, and data engineers). This assistance is described as taking the form of different AI agents that help with tasks usually undertaken by specific personas. For example, the AI Data Analyst agent potentially generates visualizations that users can update or interact with via natural language. The AI Data Engineering agent is intended to provide assistance creating and optimizing semantic models. For example, it could generate code for semantic model objects from database tables, and it could allow the user to update the semantic model through natural language.

Cube is positioned as a Leader and Outperformer in the Innovation/Platform Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
Cube scored well on a number of decision criteria, including:

  • Modeling language support: In Cube, semantic models and their components (such as measures, dimensions, and joins) are defined declaratively using YAML syntax and stored in files in a Git repository. For more complex data modeling, such as creating dynamic data models, Cube supports modeling in JavaScript. Dynamic models can also be created through a combination of Jinja and Python. 

  • Advanced data modeling capabilities: Cube has the ability to capture complex business logic through advanced data modeling capabilities. Some of these capabilities include support for multiple fact tables, modeling many-to-many relationships through bridge tables, and capturing parent-child relationships through polymorphic cubes. 

  • Support for DataOps approach: Cube is well integrated with the major Git providers, maintaining all semantic layer models (stored as YAML and JavaScript files) in Git repositories. Version control, pull requests, branching, CI/CD, and other best practices are enabled. Cube provides separate environments for development, staging, and production.
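As a brief illustration of this declarative style, the sketch below shows a minimal Cube data model in YAML; the table and column names (`public.orders`, `amount`, `status`, `created_at`) are hypothetical:

```yaml
# Minimal Cube data model sketch (table and column names are hypothetical)
cubes:
  - name: orders
    sql_table: public.orders

    measures:
      - name: count
        type: count
      - name: total_amount
        sql: amount
        type: sum

    dimensions:
      - name: status
        sql: status
        type: string
      - name: created_at
        sql: created_at
        type: time
```

Files like this live in a Git repository alongside the rest of the project, which is what enables the version control, branching, and CI/CD practices described above.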

Cube is classified as an Outperformer because of its ambitious roadmap and rapid pace of development. This designation also reflects the expectation that this pace will be maintained. 

Opportunities
Cube has room for improvement in a couple of decision criteria, including:

  • GenAI enablement: While Cube had significant GenAI developments in the works as of the research phase of this report (August 2025), they were not yet generally available. Through existing capabilities, such as Cube’s native API support and advanced data modeling, users can still leverage the semantic layer to enrich LLMs with the business context and complex logic captured in the model. 

  • GenAI and agentic features: While Cube has a number of relevant GenAI and agentic features in development, these were not yet generally available as of this writing. Cube has introduced a Cube Copilot capability that it says is designed to provide GenAI assistance for data modeling; during our research, however, Cube indicated that users need to contact sales to enable this feature. 

Purchase Considerations
Cube offers a usage-based pricing model, with pricing based on consumption units of compute power. The price of the consumption unit varies according to the tier of the Cube Cloud plan. These are designed to suit many variations of team sizes and workload needs, as follows:

  • Free: Provides a developer instance only with a daily limit of 1,000 queries.

  • Starter: Designed for small production deployments.

  • Premium: Designed for small-scale production, does not limit queries, and has pay-as-you-go or contract-based options.

  • Enterprise and Enterprise Premier: Designed for potentially large-scale production deployments that might require dedicated resources, additional security, potential additional integrations with enterprise data platforms (like Microsoft Fabric), external monitoring and audit logging services, and/or custom integrations.

Use Cases
Cube supports use cases such as standardizing and streamlining metrics used in business intelligence and reporting, modernizing legacy online analytical processing (OLAP) systems, and assisting with analytics of real-time data through connectors to sources such as Apache Flink, RisingWave, and ksqlDB. Cube continues to distinguish itself through its developer-friendly semantic layer platform, which can serve as a foundation for building embedded analytics applications.

dbt Labs: dbt Semantic Layer

Solution Overview
The dbt Semantic Layer is powered by the capabilities of MetricFlow, a SQL query-generation tool that dbt Labs acquired with its purchase of Transform Data in 2023. MetricFlow enables users to define measures and dimensions in YAML files; it then uses the information in these configurations to generate SQL queries based on user requests and executes them in the specified destination platform. The dbt Semantic Layer helps centralize metrics definitions and ensure consistent access by downstream tools. This approach to defining measures and dimensions and creating a semantic model aligns with dbt Labs’ overall philosophy of collaborative development of code, version control, and CI/CD.
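As a hedged illustration of this define-in-YAML pattern, a MetricFlow configuration pairing a semantic model with a metric might look like the sketch below (the `orders` model and its fields are hypothetical; dbt Labs’ documentation defines the authoritative spec):

```yaml
semantic_models:
  - name: orders
    model: ref('orders')          # the dbt model this semantic model wraps
    defaults:
      agg_time_dimension: ordered_at
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum
        expr: amount

metrics:
  - name: order_total
    label: Order Total
    type: simple
    type_params:
      measure: order_total
```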

Since the Sonar version of this report, dbt Labs has built out its semantic layer offering as part of a larger enhancement of its managed solution. Other components include the dbt Catalog for metadata management, lineage, and monitoring and observability; dbt Canvas, a graphical interface for creating and visualizing dbt models; and dbt Copilot, which provides GenAI assistance in the dbt Studio IDE. Combined with the dbt Semantic Layer, these enhancements highlight the vendor’s intention to build out its offering into a broader data platform.

dbt Labs is positioned as a Challenger and Fast Mover in the Innovation/Feature Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
dbt Labs scored well on a number of decision criteria, including:

  • Native API support: Metrics defined in the dbt Semantic Layer are consumable through multiple interfaces, including a GraphQL API, a JDBC driver (JDBC API), and a Python SDK.

  • Support for DataOps approach: dbt Labs applies software engineering best practices to the development and maintenance of data engineering code, and this extends to the dbt Semantic Layer as well, where users can define measures and dimensions in code using YAML specifications. These definitions are stored in a Git repository, enabling version control, collaborative development of the modeling code, and CI/CD.

  • GenAI and agentic features: The dbt Copilot (introduced in March 2025) provides multiple GenAI assistance capabilities to the dbt Studio IDE, including auto-generating an initial draft of the semantic model in the dbt Semantic Layer, asking natural language questions about business metrics, and generating dbt models (as SQL code) based on natural language prompts.

Opportunities
dbt Labs has room for improvement in a couple of decision criteria, including:

  • Query language compatibility: As of this writing, the dbt Semantic Layer didn’t appear to provide native support for DAX or MDX. Instead, it translates metric logic defined in YAML files into SQL at query time, with the vendor describing its SQL-generation capabilities as bypassing the need for MDX.

  • Federation and virtualization: This solution’s approach focuses on helping users to create standardized definitions of metrics and business logic within a single analytics environment rather than federating across disparate backend database systems. This is consistent with the original function of the dbt open source solution (dbt Core), which orchestrates and runs code in the specified destination associated with a project (that is, a data warehouse or lakehouse). While the vendor says models can be built across dbt projects if they work with the same platform (for example, BigQuery, Databricks, or Redshift), primarily only one backend database is associated with a dbt project.

Purchase Considerations
The dbt Semantic Layer is a feature of the dbt managed offering, which is available in several options to suit teams of different sizes and requirements. The starter plan level includes access to the base metrics definition capabilities of the dbt Semantic Layer, with a limit of 5,000 queried metrics (how the solution measures usage for billing purposes). At the Enterprise and Enterprise Premier tiers, customers gain access to the full set of the dbt Semantic Layer’s features, including caching and fine-grained access controls, with a limit of 20,000 queried metrics.

Use Cases
dbt Labs describes a number of key use cases for the dbt Semantic Layer: standardizing metrics across reporting and BI tools, enabling embedded analytics with APIs and SDKs, and ensuring consistent definitions of metrics used in reporting and BI visualizations. In the financial services industry, for example, the dbt Semantic Layer’s capabilities to map complex financial logic help streamline and automate accounting procedures, reducing the complexity and manual labor required of internal finance teams. In B2B and e-commerce, the dbt Semantic Layer helps ensure consistent metrics can be used in internal dashboards, and it can also help streamline the creation of customer-facing reports.

Google Cloud: Looker

Solution Overview
Google’s Looker is a comprehensive BI and analytics platform with a data modeling layer that allows semantic models to be developed in code, exposed, and independently queried. Looker is notable for LookML, a declarative modeling language that helps users standardize definitions of business logic via a reusable data model and abstracts some of the challenges of data modeling.

Measures and dimensions are defined in LookML, standardizing definitions of these across an organization and constructing the semantic data models. Metrics defined within Looker’s modeling layer can be accessed by many integrated tools, including Google Sheets, Power BI, Tableau, ThoughtSpot, and Mode. Additionally, any third-party application that supports Java Database Connectivity (JDBC) can access LookML models through Looker’s Open SQL Interface. Going in the other direction, Looker’s SQL generator translates the LookML code into SQL and runs it against the underlying database(s); results can then be visualized for further presentation and analysis. Looker also integrates with major Git-based platforms so the code for data models can be collaboratively developed and version controlled.
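As a rough sketch of what this looks like in practice (the view and field names are hypothetical, and Looker’s LookML reference is authoritative), a simple view defining dimensions and a measure might read:

```lookml
view: orders {
  sql_table_name: public.orders ;;

  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  dimension_group: created {
    type: time
    timeframes: [date, week, month, year]
    sql: ${TABLE}.created_at ;;
  }

  measure: total_amount {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```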

Looker was brought into the Google Cloud technology portfolio in 2020. In addition to integrating Looker with other elements of this suite of products, Google Cloud has significantly invested in integrating Gemini GenAI-powered features into Looker. These developments have been in the works to power what is described as “conversational analytics.” For Looker’s data modeling capabilities specifically, this includes a natural-language-to-LookML code assistant. While the specific features are not yet generally available (according to the product’s documentation), they’re worth mentioning for visibility into the overarching strategy and roadmap for this solution.

Google Cloud is positioned as a Challenger and Fast Mover in the Maturity/Feature Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
Google Cloud scored well on a number of decision criteria, including:

  • Modeling language support: Through the LookML modeling language, users can write LookML code to create semantic data models. They can define such components as dimensions, aggregates, calculations, and data relationships. Looker’s SQL generator translates the LookML code into SQL to be run against the underlying database(s) encapsulated in the model. The different LookML parameters provide information about how the SQL query should be structured. LookML code for models is stored in collections of files called projects, typically in Git-based repositories. This structure also enables collaborative development and version control.

  • Native API support: The Looker REST API is described as a JSON-oriented REST API. It supports a number of use cases, including developing customer-facing analytics, powering internal dashboards and operational software, and facilitating mobile app integrations. Users can make calls to the API manually via HTTPS requests or use one of Looker’s API SDKs, which handle tasks such as authentication and parameter and response serialization. The SDKs are available in Ruby, Python, TypeScript, JavaScript, and, optionally, in the customer’s language of choice through a Looker SDK codegen project. Additionally, Looker provides an API Explorer, a web application extension that offers:

    • Comprehensive documentation of API methods and types.

    • Generated code for API calls, API responses, and SDK functions, along with the ability to execute API calls directly.

    • Features that help troubleshoot access issues.

  • Support for DataOps approach: The code for the semantic data models defined in the LookML language is stored and maintained in projects. Projects can be configured to connect to customer-managed Git repositories or those hosted on major Git-based platforms such as GitHub and GitLab. Benefits of this approach include version control and collaborative creation and development of LookML code. As of August 2025, a feature called Looker Continuous Integration appears to be in preview. Once generally available, Google Cloud says this feature will allow users to run tests on LookML projects and use a number of validators (including a SQL validator, assert validator, content validator, and LookML validator) to detect issues before code is deployed to production.
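To illustrate how the REST API and SDKs described above might be exercised programmatically, the sketch below builds the JSON body for an inline query. The `ecommerce` model and `orders` explore are hypothetical, the field names follow Looker’s Query type as documented in its API reference, and in a real application the official Python SDK (`looker_sdk`) would handle authentication and submission:

```python
import json

def build_inline_query(model: str, view: str, fields: list[str],
                       limit: int = 500) -> str:
    """Assemble the JSON body for POST /api/4.0/queries/run/{result_format}.

    Sketch of the request shape only; Looker's SDKs handle authentication
    and submission in real applications.
    """
    body = {
        "model": model,       # LookML model name
        "view": view,         # explore to query
        "fields": fields,     # fully qualified view.field names
        "limit": str(limit),  # Looker expects the row limit as a string
    }
    return json.dumps(body)

payload = build_inline_query("ecommerce", "orders",
                             ["orders.status", "orders.total_amount"])
print(payload)
```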

Opportunities
Google Cloud has room for improvement in a few decision criteria, including:

  • Query language compatibility: According to Google Cloud’s literature, Looker is fundamentally “a tool that generates SQL queries and submits them against a database connection.” Users define relationships between tables and columns in their data model through LookML code, and Looker translates this information into SQL queries. The parameters in LookML control different aspects of the way Looker generates SQL, including the structure, content, or behavior of the query. Looker doesn’t natively support languages like MDX; all data modeling, manipulation, and calculations are performed through LookML, and the resultant SQL queries are generated by Looker.

  • GenAI enablement: Looker does not appear to support the Model Context Protocol (MCP). Adding this support would strengthen its ability to allow AI models to connect to, interact with, and be enriched by context from the semantic data model, helping customers better leverage GenAI across their organizations. Regarding current capabilities, Looker’s ability to capture complex business logic in the semantic model has the potential to help produce higher-quality GenAI-created output, and its strong native API support, described above, can help customers leverage this benefit of a semantic layer in the development of GenAI applications.

  • GenAI and agentic features: While Google Cloud has many Gemini GenAI features in the works, none appear to be generally available; Gemini in Looker was listed as being in preview and subject to “pre-GA terms.” One specific capability in this set of GenAI features is the LookML code assistant, which suggests LookML code based on the user’s natural language prompts. Because these features are still in preview, their usefulness for customers remains limited until they are generally available.

Purchase Considerations
The pricing for the Google Cloud hosted, managed offering, Looker (Google Cloud core), has two components: platform pricing and user pricing. Platform pricing refers to the cost of running an instance of the product. User pricing refers to the charges for licensing individual users to access the platform; these vary based on user permissions and assigned roles.

Looker (Google Cloud core) is offered in three platform editions (standard, enterprise, and embed), each designed for organizations and teams with varying requirements. 

  • The standard product tier is designed for businesses or teams with fewer than 50 users. Per month, it allows up to 1,000 query-related API calls; for example, API calls that request data from a client database or API calls that run SQL queries. It also allows up to 1,000 administrative-related API calls, such as calls that manage permissions and users or manage the administration of the instance. 

  • The enterprise tier, designed for internal BI and analytics use cases, includes additional features beyond those of the standard edition, such as enhanced security features, unlimited users, a “private label” option that allows the blending of corporate branding into Looker, and a higher limit of API calls (100,000 query-related API calls and 10,000 admin-related calls per month). 

  • The embed tier is designed to support external analytics and custom applications. It provides all the features of the enterprise edition, and it also includes custom themes and the highest number of API calls (500,000 query-related API calls and 100,000 admin-related calls per month).

Use Cases
From a broader, high-level perspective, customers can use Looker’s data modeling capabilities to create a central semantic model for their organization, capturing complex business logic and ensuring standardization. Looker also has the potential to help customers improve the outputs of their GenAI implementations across their organization, enriching their GenAI models and applications with custom business logic by way of the semantic model. From an industry-specific perspective, online retailers and e-commerce platforms leverage LookML’s capabilities for creating a version-controlled semantic model to consolidate definitions of business logic. Benefits include faster creation of new dashboards and improved trustworthiness of analytics results. In the area of supply-chain management and logistics, standardization of metrics and business logic can improve reliability of shipment monitoring and contribute to scheduling optimization.

Microsoft: Microsoft Fabric*

Solution Overview
Microsoft’s efforts in semantic modeling stretch back to the early days of BI software. While the company did not promote its own semantic modeling capabilities in a high-profile manner, it assiduously developed and refined them over decades. The development of Microsoft’s BI and semantic layer capabilities encompasses a progression from the original SQL Server Analysis Services (SSAS) Multidimensional Mode to the SSAS Tabular Mode and on to Power BI. This progression culminates in Microsoft Fabric, the end-to-end platform that unifies, enhances, and abstracts multiple previously separate cloud services.

Currently, in Microsoft Fabric, when a user creates a lakehouse or warehouse endpoint, the platform automatically creates a Power BI semantic model as a virtual layer on top. Models can include partitions, security roles, calculations, KPIs, perspectives, multi-language translations, and display folders that let client tools visually organize objects in the model. Users can edit or customize the generated semantic model. They can also create new ones from scratch, an option that lets users have greater autonomy over their semantic models.

This latter approach appears to have generated significant customer appeal and support. As of the August 2025 research phase of this report, Microsoft is poised to make what it describes as a major “strategic shift” in this area: disabling the automatic generation of default semantic models, and decoupling existing ones from parent items to become independent semantic models. According to Microsoft’s literature surrounding this change, the underlying intent is to better support enterprise data modeling needs, specifically through clear ownership over model creation, stronger governance, and improved customizability.

Microsoft is positioned as a Leader and Outperformer in the Innovation/Feature Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
Microsoft scored well on a number of decision criteria, including:

  • Modeling language support: Microsoft’s Tabular Model Definition Language (TMDL) provides users with a modular scripting language for defining objects in a semantic model. With TMDL, modelers create YAML-like files for each object in the model, enabling a more composable approach to building the data model. TMDL files are organized hierarchically in a folder structure, making them suitable for Git-based version control and CI/CD and fostering collaborative editing of the model. A feature called “TMDL view,” in preview as of this writing, will provide a code editor for scripting and modifying semantic model objects.

  • Data product support: Support for composite modeling differentiates Microsoft from many of the other vendors in this report. Composite modeling enables capturing domain-specific business logic in addition to enterprise-wide logic. An enterprise-wide semantic model, defined by a central IT team or center of excellence, functions as an organization-wide, foundational standard. What Microsoft terms a BI semantic model (models that are specific to a business domain or subject area) can extend the enterprise model, enrich it with data from departmental or external sources, and capture the logic for domain-specific business concepts.

  • GenAI and agentic features: Copilot in Power BI in Fabric provides a number of features specifically related to semantic modeling. These include assistance with writing and explaining DAX queries, allowing users to ask questions or add measure definitions to the semantic model through natural language, and generating descriptions of model measures. These capabilities can significantly assist users, especially less technical ones, in the modeling process. Fabric data agents, an upcoming feature in preview, will let users customize and configure agents to perform tasks such as answering questions or querying data in natural language. 
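As an illustration of the composable, file-per-object approach described above, a single TMDL file for a hypothetical fact table might look like the following sketch (the table and column names are assumptions; Microsoft’s TMDL documentation defines the exact grammar):

```tmdl
/// Sales fact table with a standardized revenue measure
table Sales

    measure 'Total Sales' = SUM(Sales[Amount])
        formatString: #,0.00

    column Amount
        dataType: double
        sourceColumn: Amount

    column OrderDate
        dataType: dateTime
        sourceColumn: OrderDate
```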

Microsoft is classified as an Outperformer because it continues to develop and release new features at a quick pace. This designation also reflects its future-looking roadmap and the anticipation of continued rapid innovation. 

Opportunities
Microsoft has room for improvement in a couple of decision criteria, including:

  • Ecosystem and integrations: While customers not previously invested in Microsoft infrastructure could certainly use this solution as a discrete semantic layer offering, connecting it to third-party applications and data sources, doing so wouldn’t fully realize the benefits provided by the broader capabilities of Power BI and the unified Fabric platform. Fabric’s end-to-end integration of multiple services amplifies the solution’s benefits for customers with existing investments in Microsoft infrastructure, or for those interested in adopting the full data and analytics platform.

  • Hybrid enablement: Microsoft Fabric is a purely cloud-based platform, which limits its ability to enable hybrid cloud and on-premises scenarios. However, it can include data that is stored in external storage systems in queries via the shortcuts feature and via mirroring. Also, through the data integration module of Fabric, Data Factory, an on-premises data gateway allows access to on-premises data sources.

Purchase Considerations
In Microsoft Fabric, users access all the services through a unified interface and pay for them as a single SaaS offering. This single-platform approach simplifies both billing and the analytics experience.

Microsoft offers a unified platform for multiple analytic workloads, which reduces tool sprawl, simplifies integration, and lowers costs. The Fabric offering also supports both technical and business users. Code-first options such as ASSL, TMSL, and Semantic Link give developers flexibility, while visual editors and Copilot’s GenAI assistance make modeling easier for nontechnical users. This dual approach helps maximize adoption across the enterprise.

The Metrics Hub in Power BI strengthens consistency and efficiency by allowing teams to create, curate, and reuse metric sets. These can be shared across reports, dashboards, and notebooks, reducing duplication of work and ensuring KPIs stay aligned across departments.

Another purchase driver is Microsoft’s investment in AI-ready capabilities. Features like synonyms and linguistic relationships improve the experience of Q&A in Power BI; AI instructions and verified answers increase control and trust in Copilot outputs; and schema simplification, plus AI-prep markers, reduce ambiguity. Together, these tools give buyers confidence that their models will work reliably with AI.

Use Cases
Microsoft Fabric is distinguished by its function as a comprehensive data and analytics platform that encompasses multiple services in one place. It supports the standardization and streamlining of mapping technical data to complex business logic, a core use case of the semantic layer solutions covered in this report. Other diverse analytics workloads supported by the broader Microsoft Fabric platform include exploratory and ad hoc analytics, data preparation and integration, data science and ML, and real-time data processing and analytics.

Oracle: Oracle Analytic Views*

Solution Overview
Oracle Analytic Views is a feature within the Oracle Database and is included with any edition of this database solution. Users can model data in Oracle Analytic Views using SQL Data Definition Language (DDL) commands to create relevant objects, including measures, dimensions, and hierarchies. The Analytic Views feature abstracts away the complexities of the data in underlying source system(s), which can include Oracle Database, external object stores, external tables, and data stored in non-Oracle data sources by means of Oracle Heterogeneous Data Services.

Oracle Analytic Views can also store the logic for navigation, joins, aggregations, and calculations. This is key to what Oracle describes as one of the signature benefits of this solution: by defining those rules in the model, users don’t need to include them in SQL queries. According to Oracle, enabling simpler SQL queries to access more complex data structures streamlines application development. It also ensures deterministic calculation results, since calculation rules are defined in the database rather than in SQL queries.
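The pattern Oracle describes can be sketched roughly as follows; the object names are hypothetical, the attribute dimension (`time_attr_dim`) and hierarchy (`time_hier`) are assumed to have been created beforehand, and Oracle’s SQL reference for CREATE ANALYTIC VIEW is the authoritative source:

```sql
-- Define the aggregation and calculation logic once, in the analytic view
CREATE OR REPLACE ANALYTIC VIEW sales_av
USING sales_fact
DIMENSION BY (
  time_attr_dim
    KEY month_id REFERENCES month_id
    HIERARCHIES (time_hier DEFAULT)
)
MEASURES (
  sales FACT sales_amount,
  sales_prior_period AS (LAG(sales) OVER (HIERARCHY time_hier OFFSET 1))
)
DEFAULT MEASURE sales;

-- Queries stay simple because the calculation rules live in the view
SELECT time_hier.member_name, sales, sales_prior_period
FROM   sales_av HIERARCHIES (time_hier)
WHERE  time_hier.level_name = 'MONTH';
```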

Oracle is positioned as a Challenger and Fast Mover in the Maturity/Feature Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
Oracle scored well on a number of decision criteria, including:

  • Query language compatibility: Oracle Analytic Views are defined using SQL DDL and can be queried using hierarchical SQL as well as standard “GROUP BY” SQL queries. Hierarchical SQL contains Oracle-specific extensions to SQL that provide the ability to more easily “navigate” the Analytic Views hierarchies without having to write complex SQL statements. Oracle says Analytic Views can also be queried using OLE DB for OLAP and the MDX query language.

  • Federation and virtualization: Oracle Analytic Views create a virtual, logical layer over data in Oracle Database, as well as potentially over data in external object stores and external tables. They abstract away the complexities of the underlying data sources. When queried through a client analytics application, such as Oracle Analytics Cloud or Tableau, Analytic Views passes the query to the SQL generator. The SQL generator translates the logic of the query to generate optimized SQL to return the results from wherever they reside.

  • Advanced data modeling capabilities: In Oracle Analytic Views, complex business logic is embedded in the semantic model, including complex hierarchies such as parent-child relationships, multiple hierarchies for the same dimension, and time-based dimensions that enable time-series analysis. 

Opportunities
Oracle has room for improvement in a couple of decision criteria, including:

  • Modeling language support: Oracle Analytic Views doesn’t include support for a YAML-based modeling language that would enable collaborative, version-controlled development of modeling code. Instead, users define Oracle Analytic Views in SQL DDL statements and also query them through SQL. The objects in the Analytic View model can be organized and stored in a hierarchy of folders and versioned like other objects in the database. Oracle focuses on using SQL to capture complex business logic in the model, simplifying the SQL required for queries.

  • GenAI and agentic features: Oracle does not appear to provide full support for querying Analytic View metadata. However, it plans to expand the natural language-to-SQL capabilities of Select AI (a suite of GenAI features in Oracle Database) to enable this functionality.

Purchase Considerations
Oracle Analytic Views are a built-in feature of the Oracle Database, automatically available with every instance of the Oracle Database, and not priced separately. This offering integrates tightly with other Oracle solutions, especially Oracle Analytics Cloud for consumption and visualization and Oracle APEX for application development. Using Oracle Analytic Views with other Oracle solutions lets users take advantage of the fullest enhancements this solution has to offer.

Use Cases
Oracle Analytic Views supports the core semantic layer use case of capturing complex business logic and standardizing the definitions of metrics for use in business intelligence and reporting across an organization. The models created in Analytic Views can be used to support analytics in internal or customer-facing applications, and the optimizations inherent in Analytic Views can contribute to the performance of these applications. The recently added Model Context Protocol (MCP) support in Oracle Database opens up the potential to enrich and refine the output produced by GenAI with the business context stored in the semantic layer.

SAP: SAP Datasphere

Solution Overview
The SAP Datasphere platform forms the data foundation of SAP Business Data Cloud (SAP’s fully managed SaaS solution designed to help customers derive meaning from all their SAP and non-SAP data, and to support the development of AI and analytics applications). SAP Datasphere unifies capabilities for data integration, data federation and virtualization, semantic modeling, data cataloging, and data warehousing into a single comprehensive platform. Its modeling approach includes tailored experiences for both nontechnical and technical personas, including low-code and no-code modeling interfaces as well as a SQL editor for code-first modeling. SAP Datasphere is based on the company’s powerful HANA database, a columnar, in-memory relational database system that is described as supporting both online transaction processing (OLTP) and online analytical processing (OLAP) workloads from a single platform.

SAP is positioned as a Challenger and Fast Mover in the Maturity/Feature Play quadrant of the semantic layers and metrics stores Radar chart.

Strengths
SAP scored well on a number of decision criteria, including:

  • Federation and virtualization: Data federation and virtualization are key elements of the unified SAP Datasphere platform, which allows users to access and query data from both SAP and non-SAP systems without having to physically move the data. Capabilities, including virtual tables, enable users to combine data from different sources without having to extract and move it to a central repository.

  • Data product support: In addition to its data cataloging and data integration capabilities, SAP Datasphere’s “spaces” feature is notable here. Since this feature provides governed virtual workspaces for individual teams or specific use cases, it enables a domain-oriented environment for modeling.

  • Advanced data modeling capabilities: Data modeling capabilities in SAP support advanced constructs such as multiple fact tables and many-to-many relationships. SAP Datasphere also provides the ability to support nested hierarchies through a “hierarchy with directory” capability (similar to the parent-child relationships and dimensions in other solutions like Microsoft Fabric and Power BI and Oracle Analytic Views). These nested hierarchies are described as combining different dimensions into a single hierarchy.

Opportunities
SAP has room for improvement in a couple of decision criteria, including:

  • Modeling language support: A modeling language (such as the YAML-based ones that are used by some of the other offerings in this report) isn’t part of SAP’s overall approach. Instead, SAP Datasphere focuses on providing SQL-based semantic modeling and a graphical interface.

  • Support for DataOps approach: SAP’s overall approach to data modeling doesn’t currently include the code-first, version-controlled, CI/CD-enabled workflows for semantic modeling encompassed by this emerging decision criterion. Instead, SAP Datasphere focuses on providing an integrated, end-to-end data analytics platform that still retains strong features allowing both technical and business users to be involved in data modeling.

Purchase Considerations
SAP Datasphere is a core component of the SAP Business Data Cloud (BDC) solution. Through SAP Analytics Cloud, the core analytics and planning component of BDC, business users can natively consume data enriched by the semantic models in SAP Datasphere. SAP describes these products as together ensuring consistent business semantics across data integration, modeling, analytics, and planning. SAP Datasphere is also tightly integrated with other SAP applications and platforms, including SAP S/4HANA, the vendor’s high-profile enterprise resource planning (ERP) platform. These integrations, together with SAP Datasphere’s unified support for multiple analytic capabilities and workloads, make SAP Datasphere ideal for use with other elements of SAP’s technology portfolio. 

Use Cases
SAP Datasphere supports the core semantic layer use case of modeling complex business logic for reuse and standardization across an organization. Beyond this, the broader data platform can support a variety of other use cases and workloads, including data preparation and integration, data cataloging, data warehousing, and business application development.

6. Analyst’s Outlook

Semantic layers have existed as components of the enterprise technology stack for decades. In many ways, the industry has rediscovered the benefits of a long-established technology rather than having to invent a new one. Today’s semantic layers benefit from technologies and features that bring them squarely into the modern era.

Code-first interfaces, domain-specific languages, and support for DataOps principles bring software engineering best practices (such as version control) to data modeling. Natural language querying interfaces, code generation assistants, and other generative AI-powered features expand the data modeling process to include business users and other less technical personas. 

Perhaps most significantly, modern semantic layers benefit from powerful improvements in query engine technology relative to 1990s-vintage OLAP engines. The modern semantic layer solution represents the best of both worlds: it possesses the advantages of the classic dimensional modeling approach without the baggage of the original query engine technology that accompanied it. This context is an important starting place for those seeking to initiate or enhance their semantic layer and metrics store investments.

Decision-makers should begin their buying journey by evaluating their own organizational requirements in addition to assessing the offerings available in the market. They should:

  • Consider which data sources and client applications the semantic layer must connect with.

  • Determine the proportion of technical users in the organization and whether a code-first or more visual modeling approach would best serve them.

  • Identify the specific use cases the solution is expected to support.

  • Account for data sovereignty and residency requirements.

Addressing these factors is essential for determining the semantic layer solution that will best benefit a particular organization.

Some semantic layer vendors have free developer options, and some have made portions of their code publicly available. These options provide a good potential starting point for customers to “try out” the capabilities of a semantic layer offering without needing to make a formal purchase commitment. When adopted with a proper understanding of the intended use case and goals, semantic layer and metrics store solutions can be relatively seamless and noninvasive to implement. They have the potential to transform organizations’ analytics and reporting experience and reduce their time to insight. 

Forward View

In recent years, the shortcomings and pain points of the self-service BI approach have highlighted the importance of semantic layers and the crucial functions they perform in foundational aspects of business data analytics. Today, businesses are realizing the importance of semantic layers in helping them adopt and integrate generative AI across their organizations. The semantic layer’s ability to enrich technical data with meaningful context is key to achieving high-quality GenAI model output and effective performance of GenAI applications. As businesses continue to innovate and seek ways to remain competitive, their demand for solutions that help them leverage generative AI will continue to grow.

To learn about related topics in this space, check out the following GigaOm Radar report:

7. Methodology

*Vendors marked with an asterisk did not participate in our research process for the Radar report, and their capsules and scoring were compiled via desk research.

For more information about our research process for Radar reports, please visit our Methodology.

8. About Andrew J. Brust

Andrew Brust has held developer, CTO, analyst, research director, and market strategist positions at organizations ranging from the City of New York and Cap Gemini to GigaOm and Datameer. He has worked with small, medium, and Fortune 1000 clients in numerous industries and with software companies ranging from small ISVs to large clients like Microsoft. The understanding of technology and the way customers use it that resulted from this experience makes his market and product analyses relevant, credible, and empathetic.

Andrew has tracked the Big Data and Analytics industry since its inception, as GigaOm’s Research Director and as ZDNet’s original blogger for Big Data and Analytics. Andrew co-chairs Visual Studio Live!, one of the nation’s longest-running developer conferences, and currently covers data and analytics for The New Stack and VentureBeat. As a seasoned technical author and speaker in the database field, Andrew understands today’s market in the context of its extensive enterprise underpinnings.

9. About GigaOm

GigaOm provides technical, operational, and business advice for IT’s strategic digital enterprise and business initiatives. Enterprise business leaders, CIOs, and technology organizations partner with GigaOm for practical, actionable, strategic, and visionary advice for modernizing and transforming their business. GigaOm’s advice empowers enterprises to successfully compete in an increasingly complicated business atmosphere that requires a solid understanding of constantly changing customer demands.

GigaOm works directly with enterprises both inside and outside of the IT organization to apply proven research and methodologies designed to avoid pitfalls and roadblocks while balancing risk and innovation. Research methodologies include but are not limited to adoption and benchmarking surveys, use cases, interviews, ROI/TCO, market landscapes, strategic trends, and technical benchmarks. Our analysts possess 20+ years of experience advising a spectrum of clients from early adopters to mainstream enterprises.

GigaOm’s perspective is that of the unbiased enterprise practitioner. Through this perspective, GigaOm connects with engaged and loyal subscribers on a deep and meaningful level.