
GAQ REST API

What is the GAQ REST API?

The GAQ (GetAggregatesQuery) REST API is an experimental API that allows applications to execute queries against Atoti cubes and retrieve results in Apache Arrow format. Unlike the Continuous GAQ API, this is a request-response API that returns a single snapshot of query results.

Experimental API

This API is experimental and may change in future versions without notice.

Why use the GAQ REST API?

The GAQ REST API provides:

  • Efficient data format: Results are delivered in Apache Arrow format for optimal performance
  • Direct query execution: Execute GetAggregatesQuery operations programmatically
  • Flexible filtering: Support for coordinates and filter conditions
  • Context control: Configure query execution parameters like timeouts and result limits

Prerequisites

Before using this API:

  • Authentication: All endpoints require authenticated users
  • GAQ familiarity: Understanding of GetAggregatesQuery concepts (measures, levels, coordinates)
  • Apache Arrow: Client-side Arrow deserialization capability

Overview

The GAQ REST API provides a single endpoint for executing queries and receiving results as Apache Arrow streams. Each request executes independently and returns a complete result set.

Key features

  • Apache Arrow format: Results are streamed in application/vnd.apache.arrow.stream format
  • Synchronous execution: Request-response pattern with complete results
  • Flexible querying: Support for measures, coordinates, filters, and context values
  • Configurable timeouts: Control query execution time limits

API endpoint

For detailed endpoint specifications, request/response schemas, and examples, see the GAQ Query REST API in the OpenAPI documentation.

Execute GAQ query

Executes a GAQ query on the specified cube and returns the result as an Apache Arrow stream.

Endpoint: POST /activeviam/pivot/rest/v10/cube/{cubeName}/queries/gaq

Response: Streaming HTTP response body in Apache Arrow format (application/vnd.apache.arrow.stream)
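As a sketch, the request can be built with Python's standard library. The endpoint path is taken from this page; the request-body field names (`measures`, `coordinates`), the base URL, and the cube name are illustrative assumptions, not the documented schema — see the OpenAPI documentation for the actual request format.

```python
import json
import urllib.request

# Hypothetical request body -- field names are assumptions, not the documented schema.
body = {
    "measures": ["Value"],
    "coordinates": {"Currency": None},  # null coordinate: one row per currency
}

def build_gaq_request(base_url: str, cube_name: str) -> urllib.request.Request:
    """Build a POST request for the GAQ endpoint (path from this page)."""
    url = f"{base_url}/activeviam/pivot/rest/v10/cube/{cube_name}/queries/gaq"
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/vnd.apache.arrow.stream",
        },
        method="POST",
    )

req = build_gaq_request("https://localhost:9090", "MyCube")
# Sending it requires a running server and authenticated credentials:
# with urllib.request.urlopen(req) as resp:
#     arrow_bytes = resp.read()
```

The `Accept` header requests the Arrow stream content type; the actual send is left commented out because it needs a live, authenticated server.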

Result format

Results are returned in Apache Arrow IPC (Inter-Process Communication) streaming format. The response contains a single Arrow record batch.

Schema structure

The Arrow schema contains:

  • Level columns: Dimension members (UTF-8 or integer types) corresponding to the queried coordinates
  • Measure columns: Aggregated values (floating-point types) for the requested measures

Example result structure

Arrow Record Batch:

Currency (UTF-8)    Value (Float64)
"USD"               44.0
"EUR"               52.0

Reading results

To read the Arrow stream:

  1. Deserialize the HTTP response body as an Arrow IPC stream
  2. Read the schema to understand column types and names
  3. Extract data from the record batch vectors
  4. Level columns contain the dimension member values
  5. Measure columns contain the aggregated results

For more information about Apache Arrow, see the official documentation.

Coordinates vs filters

Understanding the difference between coordinates and filters:

Coordinates

  • Define the structure of the result set
  • Determine which dimension members appear as rows in the output
  • Use null value for a level to get all members (wildcard expansion)
  • Example: "Currency": null returns one row per currency

Filters

  • Define the scope of data to aggregate
  • Restrict which data is included in calculations
  • Do not change the structure of the output
  • Example: Filter on trader ID restricts aggregation to specific traders
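The distinction can be illustrated with a hypothetical request body (field names are assumptions; consult the OpenAPI documentation for the actual schema). The null coordinate on Currency shapes the output into one row per currency, while the TraderId filter only narrows which data is aggregated:

```json
{
  "measures": ["Value"],
  "coordinates": {
    "Currency": null
  },
  "filters": {
    "TraderId": ["T42"]
  }
}
```

This would return one row per currency, where each aggregated value covers only trader T42's data.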

Authentication

All endpoints require authentication. The API uses the current authenticated user's credentials to execute queries.

Differences from Continuous GAQ REST API

Feature       GAQ REST API        Continuous GAQ REST API
Pattern       Request-response    Streaming with updates
Connection    Single request      Persistent connection
Results       Complete snapshot   Initial + incremental updates
Use case      One-time queries    Real-time monitoring

Use the GAQ REST API for:

  • One-time data retrieval
  • Batch processing
  • Simple query scenarios
  • Testing and development

Use the Continuous GAQ REST API for:

  • Real-time data monitoring
  • Dashboard applications
  • Continuous data streaming
  • Production applications requiring live updates