AtScale
AtScale is the rare semantic layer vendor whose origin story is genuinely instructive about how the category evolved. It started in 2013 as a virtual OLAP cube engine for Hadoop and quietly transformed itself, over the next decade, into one of the leading enterprise semantic layers — particularly in financial services, Fortune 500s, and Microsoft-stack shops where Power BI and Excel are still the default analytics tools.
Where Cube is the developer-led, open-source, modern-data-stack semantic layer, AtScale is the enterprise OLAP semantic layer. Different buyers, different selling motion, different vibe. Both are bets on the same long-term thesis: the semantic layer should outlive any one BI tool.
AtScale was founded in 2013 by Dave Mariani, who had previously been the VP of Engineering responsible for building Yahoo's analytics platform — one of the largest OLAP cubes ever deployed (at the time, billions of rows queried via MDX and Excel pivot tables). When Yahoo migrated that workload to Hadoop, Mariani watched the analysts revolt. They had been used to sub-second cube queries from Excel. Hadoop was orders of magnitude slower for the same questions. The data team kept telling them to learn Hive. The analysts kept refusing.
Mariani's insight: the problem was not Hadoop, it was the missing semantic layer. Hadoop had the data and the compute, but it had no equivalent of Microsoft Analysis Services — no place where dimensions, measures, and joins were defined once for everyone, no way to query through the familiar tools (Excel, Tableau, Power BI). He started AtScale to put a virtual OLAP layer on top of Hadoop that spoke MDX, the legacy cube query language, so existing Excel users could keep using their pivot tables against new big data backends.
This is the origin of the AtScale design that still defines the product: a semantic layer that pretends to be an OLAP cube to whatever tool is asking. AtScale connects to a modern warehouse on one side, exposes itself as a Microsoft Analysis Services cube on the other, and translates queries between them in real time. Excel users do not have to relearn anything. Power BI users do not have to import data. Tableau users get a "live connection" that feels like a database. Underneath, AtScale is hitting Snowflake or Databricks or BigQuery and caching the results.
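The translation idea can be sketched in miniature. The model and function below are purely illustrative (not AtScale's actual API or model format): a tool asks for dimensions and measures by name, and the semantic layer compiles that request into warehouse SQL, so the caller never touches the underlying tables.

```python
# Toy sketch of cube-to-SQL translation (hypothetical model, not AtScale's API).
# The calling tool requests dimensions and measures; the layer emits warehouse SQL.

MODEL = {
    "fact_table": "sales",
    "dimensions": {"region": "sales.region", "month": "sales.order_month"},
    "measures": {"revenue": "SUM(sales.amount)", "units": "SUM(sales.qty)"},
}

def compile_query(dimensions, measures, model=MODEL):
    """Compile a cube-style request into a SQL string against the fact table."""
    select = [f"{model['dimensions'][d]} AS {d}" for d in dimensions]
    select += [f"{model['measures'][m]} AS {m}" for m in measures]
    group_by = [model["dimensions"][d] for d in dimensions]
    sql = f"SELECT {', '.join(select)} FROM {model['fact_table']}"
    if group_by:
        sql += f" GROUP BY {', '.join(group_by)}"
    return sql

print(compile_query(["region"], ["revenue"]))
# SELECT sales.region AS region, SUM(sales.amount) AS revenue FROM sales GROUP BY sales.region
```

The real product does this for MDX, DAX, and SQL dialects simultaneously, but the shape of the problem is the same: one governed model, many query languages compiled against it.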
AtScale's core capability is virtualization: it makes a modern cloud warehouse look like a multi-dimensional cube to any tool that speaks MDX, DAX, SQL, JDBC, or REST. Concretely, you do four things in AtScale:
1. Model your business logic. Define dimensions (date, customer, product, region), measures (revenue, units sold, churn rate), hierarchies (year > quarter > month), and relationships between dimension tables and fact tables. Modeling is done in a UI or imported from existing models.
2. Connect to a backend warehouse. Snowflake, Databricks, BigQuery, Redshift, Synapse, Iceberg, Hadoop, Teradata. AtScale supports an unusually broad list of backends, which matters for big enterprises that have data spread across several.
3. Expose the model as an OLAP cube. Excel sees an Analysis Services cube. Power BI sees a Tabular model. Tableau sees a SQL database. Each tool gets its native interface, but they all hit the same governed definitions.
4. Run an aggregation engine. AtScale's secret sauce is autonomous performance: it watches which queries get asked and automatically materializes pre-aggregations in the warehouse so those queries are faster the next time they run. This is the modern descendant of OLAP cube pre-computation, except the cube lives inside Snowflake instead of on a separate appliance.
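The autonomous-aggregation step above can be sketched as a simple loop: count which dimension combinations get queried, materialize the hot ones, and route later queries to the pre-aggregated table. All names here are hypothetical, and AtScale's real engine is far more sophisticated (it considers hierarchies, costs, and staleness), but the core feedback loop looks like this:

```python
from collections import Counter

# Minimal sketch of aggregation-aware routing (illustrative only, not AtScale's engine).
query_log = Counter()   # how often each dimension combination has been asked
materialized = set()    # combinations for which a pre-aggregation exists
HOT_THRESHOLD = 3       # materialize after this many hits (hypothetical policy)

def route(dimensions):
    """Return the table a query over these dimensions should hit."""
    key = tuple(sorted(dimensions))
    query_log[key] += 1
    if key in materialized:
        return f"agg_{'_'.join(key)}"      # serve from the existing pre-aggregation
    if query_log[key] >= HOT_THRESHOLD:
        materialized.add(key)              # would run: CREATE TABLE agg_... AS SELECT ...
        return f"agg_{'_'.join(key)}"
    return "sales"                         # fall through to the raw fact table

for _ in range(3):
    table = route(["region", "month"])
print(table)  # the third identical query is served from the aggregate table
```

The point of the sketch is the division of labor: analysts keep asking the same questions through Excel or Power BI, and the layer quietly moves those questions from raw fact scans onto small aggregate tables.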
AtScale is the enterprise pick, and that is both its strength and its ceiling.
The strengths are real. AtScale was selling semantic layer software to Morgan Stanley, JPMorgan, and Wells Fargo when most of the modern data stack vendors did not exist yet. The product is mature, the support organization is enterprise-grade, the security features are CIO-approved, and the MDX/DAX compatibility is years ahead of what newer competitors can offer. For a Fortune 500 with thousands of Excel users, AtScale is often the only viable option to centralize metrics without breaking the existing analyst workflow.
The ceiling is also real. AtScale is a closed-source, GUI-first, sales-led product in a category where the momentum is around open-source, code-first, developer-led alternatives. Every modern data team that adopts dbt and Cube finds AtScale's modeling experience clunky by comparison. The pricing is enterprise (six and seven figures), the sales cycle is long, and the brand is invisible to anyone under 35 who learned data engineering in the dbt era.
The AtScale strategy in the AI era has been to lean hard into being the semantic layer for AI agents on Microsoft and enterprise stacks. They have shipped MCP servers, integrations with Microsoft Fabric, and an "AI Link" product that exposes their semantic models to LLMs. This is a defensible niche — Microsoft enterprise customers want Microsoft-friendly semantic layers, and the modern data stack vendors do not really speak that language.
The honest prediction: AtScale will continue to do well in regulated enterprises (financial services, insurance, healthcare) where Excel and Power BI dominate, and will probably never break out into the wider modern data stack market. Whether it ends up acquired (by Microsoft, IBM, SAP) or stays independent depends on how much patience its investors have.
A typical AtScale buyer is a CDO or VP of Analytics at a large regulated enterprise who needs governed, consistent metrics across thousands of business users, the majority of whom live in Excel and Power BI and are not switching.
TextQL Ana can consume AtScale's semantic models as a metric source. When customers have already invested years into modeling their business in AtScale, Ana queries through that model rather than asking the warehouse directly, which preserves the customer's governance and definitions. The two products are complementary: AtScale defines the canonical metrics, Ana lets users ask for them in natural language.