Adding cube hierarchies
This page describes how to expose a datastore field as a cube level. It features an example in which we expose a calculated column 'calculated' from a new store as a level. For a full walk-through of how this store was added to the cube, see Adding New Data Loading or Unloading Topics.
tip
MR 3.1.0 introduced new extension points for adding customizations through Spring Beans. The example below assumes a new field has been added to the store, fed through the addition of a new column calculator. For documentation on the new mechanism, including how to implement such a column calculator, see Configuring sources.
The techniques employed are generic examples that can be extended, adapted, and repeated for any use case.
Step 1 - (Optional) Modify the schema selection description
If the default Atoti Market Risk behavior is not appropriate for the fields added in the datastore, schema selection customization might be required.
For details about the available customization mechanism, please refer to Configuring schema selections using Spring Beans.
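For instance, here is a minimal sketch of a selection that reaches all fields of the new store so that the 'calculated' field becomes available to the cube. The bean name, the base store name, and the way the selection is picked up are hypothetical; the exact bean type and qualifier to override are described in Configuring schema selections using Spring Beans.

```java
// Sketch only: a schema selection that includes all fields reachable from the new store,
// so that the "calculated" field is available to the cube. "MyNewStore" and the bean name
// are hypothetical; see "Configuring schema selections using Spring Beans" for the exact
// extension point and qualifier to override.
@Bean
public ISelectionDescription mySchemaSelection(IDatastoreSchemaDescription schemaDescription) {
    return StartBuilding.selection(schemaDescription)
            .fromBaseStore("MyNewStore") // hypothetical store name
            .withAllReachableFields()
            .build();
}
```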
Step 2 - Create the level
The field in the schema selection can now be turned into a level in the cube.
Dimensions, hierarchies, and levels can be added to the cube
configuration by creating Spring Beans with type HierarchyBuilderConsumer and one of the following qualifiers:
| Cube | Qualifier |
|---|---|
| Var-ES Cube | varDimensions |
| Sensitivity Cube | sensiDimensions, sensiCommonDimensions |
| PL Cube | aPnlDimension |
| Var-ES Summary Cube | varSummaryDimensions |
| Sensitivity Summary Cube | sensiSummaryDimensions |
| PL Summary Cube | aPnlSummaryDimension |
| Market Data Cube | mdDimensions |
For example:
@Qualifier(SP_QUALIFIER__VAR_DIMENSIONS)
@Order(50)
@Bean
public HierarchyBuilderConsumer myHierarchy() {
    // Adds a "Sign" dimension and hierarchy whose single level exposes the "calculated" field.
    return hierarchyBuilder -> hierarchyBuilder.toDimension("Sign", builder -> builder
            .withHierarchy("Sign")
            .withLevel("calculated"));
}
Note: The order has been set to 50 so that this dimension is added after the existing dimensions (order=20) and before the epoch dimension (order=99).
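The same pattern applies to the other cubes by switching the qualifier from the table above. For instance, here is a minimal sketch that exposes the same field in the Sensitivity Cube, assuming the 'calculated' field is also part of that cube's schema selection:

```java
// Sketch only: exposes the "calculated" field in the Sensitivity Cube, assuming the field
// is part of that cube's schema selection. The "sensiDimensions" qualifier comes from the
// table above; your project may expose a constant for it.
@Qualifier("sensiDimensions")
@Order(50)
@Bean
public HierarchyBuilderConsumer mySensiHierarchy() {
    return hierarchyBuilder -> hierarchyBuilder.toDimension("Sign", builder -> builder
            .withHierarchy("Sign")
            .withLevel("calculated"));
}
```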
When using DirectQuery
When using DirectQuery, perform the steps listed above. There are no additional requirements.
warning
The DirectQuery and in-memory Atoti servers must be configured identically. There must not be any hierarchies in the DirectQuery data node that do not exist in the in-memory one.
Copper hierarchies case
The Copper hierarchies are analysis hierarchies built from a Copper join to a side store. You can also amend Copper hierarchies; they work in much the same way as the Copper metrics.
The Copper join
The Copper join is exposed as a bean of type CopperStore, with a qualifier. You can override it by creating a new bean with the same qualifier and the @Primary annotation.
For instance, if you want to override the Copper join from the Var-ES Cube to the Scenario store:
@Primary
@Qualifier(SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR)
@Bean
public CopperStore myScenarioStoreForVaR() {
    return Copper.store(SCENARIO_STORE_NAME)
            .joinToCube(UnlinkedCopperStore.JoinType.LEFT)
            [...];
}
The Copper hierarchy
The Copper hierarchies are exposed as beans of type Publishable<CopperHierarchy>, with a qualifier. You can override one by creating a new bean with the same qualifier and the @Primary annotation.
For instance, if you want to override the Var-ES Cube Liquidity Horizons Copper hierarchy:
@Primary
@VarCopperContextBean
@Qualifier(SP_QUALIFIER__LIQUIDITY_HORIZONS_HIERARCHY_VAR)
public Publishable<CopperHierarchy> liquidityHorizonHierarchyVaR(
        @Qualifier(SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR) CopperStore scenarioStoreForVaR) {
    return Copper
            .newHierarchy(RISK_DIMENSION, LIQUIDITY_HORIZONS_HIERARCHY)
            // Build the level from the injected Copper join to the Scenario store.
            .fromStore(scenarioStoreForVaR)
            .withLevel(LIQUIDITY_HORIZON_LEVEL, FieldPath.of(StoreFieldConstants.LIQUIDITY_HORIZON));
}
You can also add a new Copper hierarchy by creating a new bean of type Publishable<CopperHierarchy> with a new qualifier.
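For instance, here is a minimal sketch that adds a new Copper hierarchy to the Var-ES Cube from the same Scenario store join. The qualifier, the hierarchy name, and the CATEGORY field are hypothetical and must be adapted to fields that actually exist in the joined store:

```java
// Sketch only: adds a new Copper hierarchy to the Var-ES Cube. The qualifier, the
// "Scenario Category" hierarchy name, and the "CATEGORY" field are hypothetical examples.
@VarCopperContextBean
@Qualifier("myScenarioCategoryHierarchyVaR")
public Publishable<CopperHierarchy> scenarioCategoryHierarchyVaR(
        @Qualifier(SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR) CopperStore scenarioStoreForVaR) {
    return Copper
            .newHierarchy(RISK_DIMENSION, "Scenario Category")
            .fromStore(scenarioStoreForVaR)
            .withLevel("Scenario Category", FieldPath.of("CATEGORY"));
}
```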
The Copper hierarchy qualifiers
The following table lists the Copper hierarchy qualifiers that are used in the different cubes. You can override any of them by creating a new bean with the same qualifier and the @Primary annotation.
| Store name | Cube name | Hierarchy | Qualifier | CopperStore qualifier |
|---|---|---|---|---|
| Scenarios | Sensitivity Cube | Scenarios@Risk | SP_QUALIFIER__SCENARIO_ANALYSIS_HIERARCHY_SENSI | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_SENSI |
| Scenarios | Sensitivity Summary Cube | Scenarios@Risk | SP_QUALIFIER__SCENARIO_ANALYSIS_HIERARCHY_SENSI_SUMMARY | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_SENSI_SUMMARY |
| Scenarios | Var-ES Cube | Scenarios@Risk | SP_QUALIFIER__SCENARIO_ANALYSIS_HIERARCHY_VAR | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR |
| Scenarios | Var-ES Summary Cube | Scenarios@Risk | SP_QUALIFIER__SCENARIO_ANALYSIS_HIERARCHY_VAR_SUMMARY | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR_SUMMARY |
| Scenarios | Sensitivity Cube | Liquidity Horizons@Risk | SP_QUALIFIER__LIQUIDITY_HORIZONS_SENSI | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_SENSI |
| Scenarios | Sensitivity Summary Cube | Liquidity Horizons@Risk | SP_QUALIFIER__LIQUIDITY_HORIZONS_SENSI_SUMMARY | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_SENSI_SUMMARY |
| Scenarios | Var-ES Cube | Liquidity Horizons@Risk | SP_QUALIFIER__LIQUIDITY_HORIZONS_HIERARCHY_VAR | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR |
| Scenarios | Var-ES Summary Cube | Liquidity Horizons@Risk | SP_QUALIFIER__LIQUIDITY_HORIZONS_HIERARCHY_VAR_SUMMARY | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR_SUMMARY |
| Scenarios | Sensitivity Cube | Scenario Sets@Risk | SP_QUALIFIER__SCENARIO_SET_HIERARCHY_SENSI | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_SENSI |
| Scenarios | Sensitivity Summary Cube | Scenario Sets@Risk | SP_QUALIFIER__SCENARIO_SET_HIERARCHY_SENSI_SUMMARY | SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_SENSI_SUMMARY |
The left-join Copper metrics
The left-join Copper metrics are handled in exactly the same way as the other Copper metrics. The only difference is that you need to inject the qualified CopperStore bean that you want to use for the left-join, as in the sketch below.
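Here is a minimal sketch of a metric reading a field from the left-joined Scenario store in the Var-ES Cube. The measure name and the SHOCK field are hypothetical, and the lookup-measure call assumes the standard Copper API; see Configuring measures using Spring Beans for the exact way measures are published in your version.

```java
// Sketch only: a measure exposing a field of the left-joined Scenario store in the
// Var-ES Cube. "SHOCK" and the measure name are hypothetical; adapt them to real fields.
@VarCopperContextBean
@Qualifier("myScenarioShockMeasure")
public Publishable<CopperMeasure> scenarioShockMeasure(
        @Qualifier(SP_QUALIFIER__SCENARIO_STORE_NAME_JOIN_VAR) CopperStore scenarioStoreForVaR) {
    return Copper
            .newLookupMeasure(scenarioStoreForVaR.field("SHOCK"))
            .as("Scenario Shock");
}
```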
Suggested further reading
- Configuring measures using Spring Beans
- Configuring schema selections using Spring Beans
- Configuring sources using Spring Beans
- Adding a new KPI
- Adding data loading or unloading topics