Configuring custom adjustment types
The Market Risk Application supports cube-level, fact-level, and roll-over adjustments for the Sensitivities, PnL, and VaR-ES cubes.
As an example of how to configure a new custom adjustment, this page explains how PnL adjustments have been defined within the mr-application module, in the
mr-application/src/main/java/com/activeviam/mr/application/signoff/adjustments/ directory.
> **Note:** PnL adjustments are scalar adjustments.
Configure Adjustment Execution
All adjustment processes are defined within the AdjustmentExecutionConfig.java class. Other library classes are mentioned where relevant.
Fact-level adjustments
Fact-level adjustments are modifications of the underlying data held in the datastore. They operate directly on datastore rows, using Execution objects containing functional
components.
These functional components are defined in the ExecutionFunctionalComponents.java class.
The Execution object
| Field | Description |
|---|---|
| appliesOnAdjustments | Determines whether the current adjustment applies to rows created by a previous adjustment. |
| inputParser | A BiFunction that takes the request and definition DTOs and matches the required definition inputs to the input provided in the request. The values of the request inputs are parsed using the IParser objects matching the type found in the definition. |
| inputRetriever | A TriFunction that takes a datastore row, the store format and a list of store fields and returns an object representing the value to be adjusted. |
| sourceTagger | A List of TriFunction objects that take an execution ID and the initial values of the Source and Input type fields and return a map with the new values of those fields. |
| valueField | The datastore field that holds the values that will be adjusted by the execution. |
| expectedInputs | The List of expected inputs required by the execution. |
| inputConverter | A BiFunction that takes the result of the inputParser and the expectedInputs and generates the request inputs to be used in the adjustment steps. |
| steps | A list of BiFunction objects that use the results of the inputRetriever and inputConverter to create the adjusted values that will be written to the datastore. |
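The examples below pass these fields positionally to the Execution constructor. As a rough orientation, here is a minimal sketch of such a class, assuming it is a plain value holder; the generic parameters and functional signatures are illustrative assumptions, not the actual Sign-off API definitions.

```java
import java.util.List;
import java.util.Map;
import java.util.function.BiFunction;

// Hypothetical three-argument function, standing in for the library's TriFunction.
@FunctionalInterface
interface TriFunction<A, B, C, R> {
    R apply(A a, B b, C c);
}

// Sketch only: the component order mirrors the constructor calls in the examples below.
record Execution<I, R, O>(
        boolean appliesOnAdjustments,                            // apply to rows created by earlier adjustments?
        BiFunction<Object, Object, Map<String, I>> inputParser,  // (request DTO, definition DTO) -> parsed inputs
        TriFunction<Object[], Object, List<String>, R> inputRetriever, // (row, store format, fields) -> value to adjust
        List<TriFunction<String, String, String, Map<String, String>>> sourceTagger, // (execution ID, Source, Input type) -> new tags
        String valueField,                                       // field holding the values to adjust
        List<String> expectedInputs,                             // inputs required by the execution
        BiFunction<Map<String, I>, List<String>, O> inputConverter, // (parsed inputs, expected inputs) -> step input
        List<BiFunction<R, O, R>> steps) {                       // (retrieved value, converted input) -> adjusted value
}
```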
As PnL adjustments are scalar (i.e. a fact is a single value, as opposed to a vector), the scalar functions are used:
- `inputParser` is `parseInput()`
- `inputRetriever` is `scalarInputRetriever()` - given the base tuple, the store format and a list of fields, it returns the double value held by the first field in the list
- `inputConverter` is `scalarInputConverter()` - given a map of values and the expected input, it returns a single `double`-typed value
For the PnL datastore:
- the `valueField` parameter is `StoreFieldNames.DAILY`
- the `expectedInputs` parameter only contains `StoreFieldNames.DAILY`
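To make the scalar behavior concrete, here is a minimal sketch of what the two scalar components could look like, reusing the hypothetical TriFunction interface from the sketch above; the real implementations live in ExecutionFunctionalComponents.java, and modelling the datastore row as a Map is an assumption made for illustration.

```java
import java.util.List;
import java.util.Map;
import java.util.function.BiFunction;

// Sketches only: the actual components are provided by ExecutionFunctionalComponents.
public final class ScalarComponentsSketch {

    // Given the base tuple, the store format and a list of fields,
    // returns the double value held by the first field in the list.
    static TriFunction<Map<String, Object>, Object, List<String>, Double> scalarInputRetriever() {
        return (tuple, storeFormat, fields) -> (Double) tuple.get(fields.get(0));
    }

    // Given a map of parsed values and the expected inputs,
    // returns the single double-typed value consumed by the steps.
    static BiFunction<Map<String, Double>, List<String>, Double> scalarInputConverter() {
        return (values, expectedInputs) -> values.get(expectedInputs.get(0));
    }
}
```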
Add On Execution
- `appliesOnAdjustments` is `false`
- `sourceTagger` contains:
  - `userInputSourceTagging()` - tags the row as direct user input
- `steps` contains:
  - `doubleInputReplacer()` - replaces the initial value with the request input value
```java
private final Execution<Double, Double, Double> pnlAddOnExecution = new Execution<>(
    false,
    executionFunctionalComponents.parseInput(),
    executionFunctionalComponents.scalarInputRetriever(),
    List.of(executionFunctionalComponents.userInputSourceTagging()),
    StoreFieldNames.DAILY,
    List.of(StoreFieldNames.DAILY),
    executionFunctionalComponents.scalarInputConverter(),
    List.of(executionFunctionalComponents.doubleInputReplacer())
);
```
Scaling Execution
- `appliesOnAdjustments` is `true`
- `sourceTagger` contains:
  - `inverseTagging()` - tags a row as the inversion of the initial row
  - `scaleTagging()` - tags the row as a scaling of the initial row
- `steps` contains:
  - `doubleInverter()` - replaces the initial value with its inverse (`initialValue * -1.0`)
  - `doubleScaler()` - replaces the initial value with the scaled value (`initialValue * scalingFactor`)
```java
private final Execution<Double, Double, Double> pnlScalingExecution = new Execution<>(
    true,
    executionFunctionalComponents.parseInput(),
    executionFunctionalComponents.scalarInputRetriever(),
    List.of(executionFunctionalComponents.inverseTagging(), executionFunctionalComponents.scaleTagging()),
    StoreFieldNames.DAILY,
    List.of(StoreFieldNames.DAILY),
    executionFunctionalComponents.scalarInputConverter(),
    List.of(executionFunctionalComponents.doubleInverter(), executionFunctionalComponents.doubleScaler())
);
```
Override Execution
- `appliesOnAdjustments` is `true`
- `sourceTagger` contains:
  - `inverseTagging()` - tags a row as the inversion of the initial row
  - `userInputSourceTagging()` - tags the row as direct user input
- `steps` contains:
  - `doubleInverter()` - replaces the initial value with its inverse (`initialValue * -1.0`)
  - `doubleInputReplacer()` - replaces the initial value with the request input value
```java
private final Execution<Double, Double, Double> pnlOverrideExecution = new Execution<>(
    true,
    executionFunctionalComponents.parseInput(),
    executionFunctionalComponents.scalarInputRetriever(),
    List.of(executionFunctionalComponents.inverseTagging(), executionFunctionalComponents.userInputSourceTagging()),
    StoreFieldNames.DAILY,
    List.of(StoreFieldNames.DAILY),
    executionFunctionalComponents.scalarInputConverter(),
    List.of(executionFunctionalComponents.doubleInverter(), executionFunctionalComponents.doubleInputReplacer())
);
```
Cube-level adjustments
Cube-level adjustments rely on the concept of a 'location digest', a string representation of a location in the cube that is more robust to changes in the cube configuration than the native `toString()` value of a location object. A digest is a string of the form `dimensionName@hierarchyName=…|dimensionName@hierarchyName=…`, in which hierarchies whose path is `AllMember` are excluded. The location digest is independent of the order in which hierarchies are defined in the cube configuration: the hierarchy digests are sorted alphabetically within the location digest.
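For example, a location expressed on hypothetical Desk and AsOfDate hierarchies could produce the digest `Booking@Desk=DeskA|Time@AsOfDate=2024-01-31`. The sketch below shows how such a digest can be assembled, assuming the expressed paths are available as strings; the actual computation is provided by the Sign-off API library, and all dimension, hierarchy and member names here are illustrative.

```java
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class DigestSketch {

    static String digest(Map<String, String> pathsByHierarchy) {
        // TreeMap sorts the "dimensionName@hierarchyName" keys alphabetically,
        // making the digest independent of the cube's hierarchy ordering.
        return new TreeMap<>(pathsByHierarchy).entrySet().stream()
                .filter(e -> !"AllMember".equals(e.getValue())) // drop hierarchies at their top level
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining("|"));
    }

    public static void main(String[] args) {
        // Prints: Booking@Desk=DeskA|Time@AsOfDate=2024-01-31
        System.out.println(digest(Map.of(
                "Time@AsOfDate", "2024-01-31",
                "Booking@Desk", "DeskA",
                "Currency@Currency", "AllMember")));
    }
}
```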
Cube-level adjustments work by submitting entries into the SignOffDigestStore store to specify:
- the location digest
- the measure to override
- the value used for the override
- the currency in which that value is expressed
The executor used for cube-level adjustments does not take any argument into account.
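A minimal sketch of submitting such an entry is shown below, assuming the standard datastore transaction API is available; the store layout, field order and values are hypothetical assumptions, not the actual SignOffDigestStore schema.

```java
// Sketch only: field order and values are illustrative.
final ITransactionManager tm = datastore.getTransactionManager();
tm.startTransaction();
tm.add(
    "SignOffDigestStore",
    "Booking@Desk=DeskA|Time@AsOfDate=2024-01-31", // location digest
    "PnL.SUM",                                     // measure to override
    1_250_000.0,                                   // value used for the override
    "EUR");                                        // currency of that value
tm.commitTransaction();
```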
Roll-over adjustments
Roll-over adjustments operate on the datastore, replacing the rows corresponding to the current as-of date with the approved rows from the input as-of date.
The executors use the rollOver() method with the following arguments:
| Argument | Description |
|---|---|
| inverters | A Map of inverter Function objects for the datastore fields that should be inverted by the roll-over. |
| inverseTagging | Tagging TriFunction to generate the source and input tags for an inverted datastore row. |
| rollOverTagging | Tagging TriFunction to generate the source and input tags for a rolled-over row. |
For the PnL roll-over adjustment, the inverters are as follows:
```java
private final Map<String, Function<Object, Object>> pnlInverters = Map.of(
    StoreFieldNames.DAILY, executionFunctionalComponents.inPlaceDoubleOrArrayInverter(),
    StoreFieldNames.MONTHLY, executionFunctionalComponents.inPlaceDoubleOrArrayInverter(),
    StoreFieldNames.YEARLY, executionFunctionalComponents.inPlaceDoubleOrArrayInverter(),
    StoreFieldNames.LIFETIME, executionFunctionalComponents.inPlaceDoubleOrArrayInverter()
);
```
The `inPlaceDoubleOrArrayInverter()` function returns `input * -1.0` for `double` inputs and `((IVector) input).scale(-1.0)` for `IVector` inputs.
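An illustrative sketch of an inverter with this behavior is shown below; the actual implementation is provided by ExecutionFunctionalComponents.java.

```java
import java.util.function.Function;
import com.qfs.vector.IVector;

// Sketch only: mirrors the described behaviour of inPlaceDoubleOrArrayInverter().
static Function<Object, Object> doubleOrArrayInverterSketch() {
    return input -> {
        if (input instanceof IVector vector) {
            vector.scale(-1.0);  // invert the vector in place
            return vector;
        }
        return (Double) input * -1.0; // scalar case
    };
}
```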
Include Defined Executions in Executors Map
Now that we have defined our adjustment executions, we need to add them to the executors `Map`, which is accessed by the services defined in the Sign-off API library. We include our executors in the bean with the qualifier `SP_QUALIFIER__EXECUTORS`. Note that our executors are included in both profiles.
```java
@Bean
@Qualifier(SP_QUALIFIER__EXECUTORS)
public Map<String, BiConsumer<AdjustmentRequestDTO, String>> executors() {
    executionFunctionalComponents.setStatusService(statusService);
    Map<String, BiConsumer<AdjustmentRequestDTO, String>> executors = new HashMap<>();
    Boolean onBranch = Boolean.parseBoolean(performOnBranch);
    ...
    executors.put(PNL_ADD_ON, execution(onBranch, pnlAddOnExecution));
    executors.put(PNL_OVERRIDE, execution(onBranch, pnlOverrideExecution));
    executors.put(PNL_SCALING, execution(onBranch, pnlScalingExecution));
    executors.put(PNL_ROLL_OVER, rollOver(
        pnlInverters,
        executionFunctionalComponents.inverseTagging(),
        executionFunctionalComponents.rollOverTagging()));
    executors.put(PNL_CUBE_LEVEL, cubeLevelAdjustment());
    ...
    return executors;
}
```
Define required dimensions for sign-off adjustments
Navigate to `AdjustmentPivotConfig.java`. We add the SignOff Source dimension to the appropriate schema, in this case the PnL schema.
```java
@Bean
@Qualifier("aPnlDimension")
@Order(75)
public ICanStartBuildingDimensions.DimensionsAdder signOffPnlDimensionAdder() {
    return AdjustmentPivotConfig::getSignOffSourceDimension;
}
```
Add source tagging fields
Navigate to `SignOffDatastoreCustomisations`.
```java
public static void loadCustomisations(IDatastoreConfigurator configurator) {
    ...
    addTagging(configurator, PnLDatastoreDescriptionConfig.SCHEMA, StoreNames.PNL_STORE_NAME);
    addTagging(configurator, PnLDatastoreDescriptionConfig.SCALAR_SCHEMA, StoreNames.PNL_STORE_NAME);
    addTaggingAfterField(configurator, PnLFlatDatastoreDescriptionConfig.SCHEMA, StoreNames.PNL_BASE_STORE, StoreFieldNames.INSTRUMENT_SUB_TYPE);
    addTaggingAfterField(configurator, PnLFlatDatastoreDescriptionConfig.SCALAR_SCHEMA, StoreNames.PNL_BASE_STORE, StoreFieldNames.INSTRUMENT_SUB_TYPE);
    addTaggingAfterField(configurator, PnLAggregatedDatastoreDescriptionConfig.SCHEMA, StoreNames.PNL_BASE_STORE, StoreFieldNames.INSTRUMENT_SUB_TYPE);
    addTaggingAfterField(configurator, PnLAggregatedDatastoreDescriptionConfig.SCALAR_SCHEMA, StoreNames.PNL_BASE_STORE, StoreFieldNames.INSTRUMENT_SUB_TYPE);
    ...
}
```
Define supported adjustments
Navigate to `SupportedAdjustmentsConfig` and define a `SupportedAdjustmentDTO` bean for each adjustment type.
Add-on
```java
@Bean
public SupportedAdjustmentDTO pnlAddOn() {
    return new SupportedAdjustmentDTO(
        ADD_ON_NAME,
        PNL_ADD_ON,
        true,
        PnLCubeConfig.CUBE_NAME,
        Set.of(StoreNames.PNL_STORE_NAME),
        Set.of(
            new TypedFieldDTO(AS_OF_LEVEL, StoreFieldNames.AS_OF_DATE, LOCAL_DATE),
            new TypedFieldDTO(TRADE_LEVEL, StoreFieldNames.TRADE_ID, STRING),
            new TypedFieldDTO(TYPE_LEVEL, StoreFieldNames.TYPE, STRING),
            new TypedFieldDTO(RISK_FACTOR_LEVEL, StoreFieldNames.RISK_FACTOR, STRING),
            new TypedFieldDTO(CURRENCY_LEVEL, StoreFieldNames.VALUE_CCY, STRING)
        ),
        Set.of(DTD_PNL_NATIVE),
        Set.of(new TypedFieldDTO(StoreFieldNames.DAILY, DOUBLE))
    );
}
```
Scaling
```java
@Bean
public SupportedAdjustmentDTO pnlScaling() {
    return new SupportedAdjustmentDTO(
        SCALING_NAME,
        PNL_SCALING,
        true,
        PnLCubeConfig.CUBE_NAME,
        Set.of(StoreNames.PNL_STORE_NAME),
        Set.of(
            new TypedFieldDTO(AS_OF_LEVEL, StoreFieldNames.AS_OF_DATE, LOCAL_DATE),
            new TypedFieldDTO(TRADE_LEVEL, StoreFieldNames.TRADE_ID, STRING),
            new TypedFieldDTO(TYPE_LEVEL, StoreFieldNames.TYPE, STRING),
            new TypedFieldDTO(RISK_FACTOR_LEVEL, StoreFieldNames.RISK_FACTOR, STRING),
            new TypedFieldDTO(CURRENCY_LEVEL, StoreFieldNames.VALUE_CCY, STRING)
        ),
        Set.of(DTD_PNL_NATIVE),
        Set.of(new TypedFieldDTO(StoreFieldNames.DAILY, DOUBLE))
    );
}
```
Override
```java
@Bean
public SupportedAdjustmentDTO pnlOverride() {
    return new SupportedAdjustmentDTO(
        OVERRIDE_NAME,
        PNL_OVERRIDE,
        true,
        PnLCubeConfig.CUBE_NAME,
        Set.of(StoreNames.PNL_STORE_NAME),
        Set.of(
            new TypedFieldDTO(AS_OF_LEVEL, StoreFieldNames.AS_OF_DATE, LOCAL_DATE),
            new TypedFieldDTO(TRADE_LEVEL, StoreFieldNames.TRADE_ID, STRING),
            new TypedFieldDTO(TYPE_LEVEL, StoreFieldNames.TYPE, STRING),
            new TypedFieldDTO(RISK_FACTOR_LEVEL, StoreFieldNames.RISK_FACTOR, STRING),
            new TypedFieldDTO(CURRENCY_LEVEL, StoreFieldNames.VALUE_CCY, STRING)
        ),
        Set.of(DTD_PNL_NATIVE),
        Set.of(new TypedFieldDTO(StoreFieldNames.DAILY, DOUBLE))
    );
}
```
Roll-over
```java
@Bean
public SupportedAdjustmentDTO pnlRollOver() {
    return new SupportedAdjustmentDTO(
        ROLL_OVER_NAME,
        PNL_ROLL_OVER,
        false,
        PnLCubeConfig.CUBE_NAME,
        Set.of(StoreNames.PNL_STORE_NAME),
        Set.of(
            new TypedFieldDTO(AS_OF_LEVEL, StoreFieldNames.AS_OF_DATE, LOCAL_DATE),
            new TypedFieldDTO(DESK_LEVEL, StoreFieldNames.DESK, STRING),
            new TypedFieldDTO(BOOK_LEVEL, StoreFieldNames.BOOK, STRING, true),
            new TypedFieldDTO(TRADE_LEVEL, StoreFieldNames.TRADE_ID, STRING, true),
            new TypedFieldDTO(SIGNOFF_STATUS_LEVEL, ADJUSTMENT_LEVEL_STATUS, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_TASK_LEVEL, ADJUSTMENT_LEVEL_TASK, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_SOURCE_LEVEL, SOURCE_FIELD, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_INPUT_TYPE_LEVEL, INPUT_FIELD, LEVEL_PATH, true)
        ),
        null,
        Set.of(new TypedFieldDTO(StoreFieldNames.AS_OF_DATE, LOCAL_DATE))
    );
}
```
Cube-level
```java
@Bean
public SupportedAdjustmentDTO pnlCubeLevelAdjustment() {
    return new SupportedAdjustmentDTO(
        CUBE_LEVEL_NAME,
        PNL_CUBE_LEVEL,
        false,
        PnLCubeConfig.CUBE_NAME,
        null,
        Set.of(
            new TypedFieldDTO(AS_OF_LEVEL, StoreFieldNames.AS_OF_DATE, LEVEL_PATH),
            new TypedFieldDTO(DESK_LEVEL, StoreFieldNames.DESK, LEVEL_PATH),
            new TypedFieldDTO(BOOK_LEVEL, StoreFieldNames.BOOK, LEVEL_PATH, true),
            new TypedFieldDTO(TRADE_LEVEL, StoreFieldNames.TRADE_ID, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_STATUS_LEVEL, ADJUSTMENT_LEVEL_STATUS, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_TASK_LEVEL, ADJUSTMENT_LEVEL_TASK, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_SOURCE_LEVEL, SOURCE_FIELD, LEVEL_PATH, true),
            new TypedFieldDTO(SIGNOFF_INPUT_TYPE_LEVEL, INPUT_FIELD, LEVEL_PATH, true)
        ),
        null,
        Set.of(
            new TypedFieldDTO(CURRENCY, STRING),
            new TypedFieldDTO(StoreFieldNames.SENSITIVITY_VALUES, DOUBLE))
    );
}
```