Testing custom adjustment types

The reference implementation of Atoti Common Library supports fact-level and roll-over adjustments.

As an example of how to configure a new custom adjustment, this page walks through how the PnL adjustments are defined in the starter module, in the frtb-starter/src/main/java/com/activeviam/frtb/starter/signoff/adjustments/ directory.

Configure Adjustment Execution

All adjustment processes are defined within the AdjustmentExecutionConfig.java class. Other library classes are mentioned where relevant.

Fact-level adjustments

Fact-level adjustments are modifications of the underlying data held in the datastore. They operate directly on datastore rows, using Execution objects containing functional components.

These functional components are defined in the ExecutionFunctionalComponents.java class.

The Execution object

| Field | Description |
| --- | --- |
| appliesOnAdjustments | Determines whether the current adjustment applies to rows created by a previous adjustment. |
| inputParser | An AdjustmentInputParser that takes the request and definition DTOs and matches the required definition inputs to the inputs provided in the request. The values of the request inputs are parsed using the IParser objects matching the types found in the definition. |
| inputRetriever | An AdjustmentInputRetriever that takes a datastore row, the store format and a list of store fields, and returns an object representing the value to be adjusted. |
| sourceTagger | A List of AdjustmentSourceTagger objects that take an execution ID and the initial values of the Source and Input type fields, and return a map with the new values of those fields. |
| valueField | The datastore field that holds the values to be adjusted by the execution. |
| expectedInputs | The List of expected inputs required by the execution. |
| inputConverter | An AdjustmentInputConverter that takes the results of the inputParser and the expectedInputs and generates the request inputs to be used in the adjustment steps. |
| steps | A list of AdjustmentStep objects that use the results of the inputRetriever and inputConverter to create the adjusted values that are written to the datastore. |
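To make the constructor-argument order concrete, here is a deliberately simplified sketch of an Execution-like record. It is hypothetical: the real Execution class ships with the library and uses the richer component types named in the table above (AdjustmentInputParser, AdjustmentSourceTagger, and so on); this version only mirrors the order and intent of each component.

```java
import java.util.List;
import java.util.function.Function;

public class ExecutionSketch {

    // Hypothetical stand-in for the library's Execution class,
    // listing the components in the same order as the table above.
    record Execution<V>(
            boolean appliesOnAdjustments,       // apply to rows created by earlier adjustments?
            Function<String, V> inputParser,    // stands in for AdjustmentInputParser
            Function<Object, V> inputRetriever, // stands in for AdjustmentInputRetriever
            List<String> sourceTaggers,         // stands in for the AdjustmentSourceTagger list
            String valueField,                  // datastore field holding the value to adjust
            List<String> expectedInputs,        // inputs the execution requires
            Function<V, V> inputConverter,      // stands in for AdjustmentInputConverter
            List<Function<V, V>> steps) {}      // stands in for the AdjustmentStep list

    public static void main(String[] args) {
        // An add-on-style execution: does not apply on prior adjustments,
        // parses the user input as a double, and replaces the initial value.
        Execution<Double> addOn = new Execution<>(
                false,
                Double::parseDouble,
                row -> (Double) row,
                List.of("USER_INPUT"),
                "DAILY",
                List.of("DAILY"),
                v -> v,
                List.of(v -> v));
        System.out.println(addOn.appliesOnAdjustments()); // false
    }
}
```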

Add On Execution

  • appliesOnAdjustments is false
  • sourceTagger contains:
    • userInputSourceTagging() - tags the row as direct user input
  • steps contains:
    • doubleInputReplacer() - replaces the initial value with the request input value
    private final Execution<Double, Double, Double> pnlAddOnExecution = new Execution<>(
            false,
            executionFunctionalComponents.parseInput(),
            executionFunctionalComponents.scalarInputRetriever(),
            List.of(executionFunctionalComponents.userInputSourceTagging()),
            StoreFieldNames.DAILY,
            List.of(StoreFieldNames.DAILY),
            executionFunctionalComponents.scalarInputConverter(),
            List.of(executionFunctionalComponents.doubleInputReplacer())
    );

Scaling Execution

  • appliesOnAdjustments is true
  • sourceTagger contains:
    • inverseTagging() - tags a row as the inversion of the initial row
    • scaleTagging() - tags the row as a scaling of the initial row
  • steps contains:
    • doubleInverter() - the execution replaces the initial value with its inverse (initialValue * -1.0)
    • doubleScaler() - the execution replaces the initial value with the scaled value (initialValue * scalingFactor)
    private final Execution<Double, Double, Double> pnlScalingExecution = new Execution<>(
            true,
            executionFunctionalComponents.parseInput(),
            executionFunctionalComponents.scalarInputRetriever(),
            List.of(executionFunctionalComponents.inverseTagging(), executionFunctionalComponents.scaleTagging()),
            StoreFieldNames.DAILY,
            List.of(StoreFieldNames.DAILY),
            executionFunctionalComponents.scalarInputConverter(),
            List.of(executionFunctionalComponents.doubleInverter(), executionFunctionalComponents.doubleScaler())
    );

Override Execution

  • appliesOnAdjustments is true
  • sourceTagger contains:
    • inverseTagging() - tags a row as the inversion of the initial row
    • userInputSourceTagging() - tags the row as direct user input
  • steps contains:
    • doubleInverter() - the execution replaces the initial value with its inverse (initialValue * -1.0)
    • doubleInputReplacer() - replaces the initial value with the request input value
    private final Execution<Double, Double, Double> pnlOverrideExecution = new Execution<>(
            true,
            executionFunctionalComponents.parseInput(),
            executionFunctionalComponents.scalarInputRetriever(),
            List.of(executionFunctionalComponents.inverseTagging(), executionFunctionalComponents.userInputSourceTagging()),
            StoreFieldNames.DAILY,
            List.of(StoreFieldNames.DAILY),
            executionFunctionalComponents.scalarInputConverter(),
            List.of(executionFunctionalComponents.doubleInverter(), executionFunctionalComponents.doubleInputReplacer())
    );

Roll-over adjustments

Roll-over adjustments operate on the datastore, replacing the rows corresponding to the current as-of date with the approved rows from the input as-of date.

The executors use the rollOver() method with the following arguments:

| Argument | Description |
| --- | --- |
| inverters | A Map of inverter Function objects for the datastore fields that should be inverted by the roll-over. |
| inverseTagging | The AdjustmentSourceTagger that generates the source and input tags for an inverted datastore row. |
| rollOverTagging | The AdjustmentSourceTagger that generates the source and input tags for a rolled-over row. |

For the PnL roll-over adjustment, the inverters are as follows:

    private final Map<String, Function<Object, Object>> pnlInverters = Map.of(
            StoreFieldNames.DAILY, executionFunctionalComponents.inPlaceDoubleOrArrayInverter(),
            MONTHLY, executionFunctionalComponents.inPlaceDoubleOrArrayInverter(),
            YEARLY, executionFunctionalComponents.inPlaceDoubleOrArrayInverter(),
            LIFETIME, executionFunctionalComponents.inPlaceDoubleOrArrayInverter()
    );

The inPlaceDoubleOrArrayInverter() function returns input * -1.0 for double inputs and ((IVector) input).scale(-1.0) for IVector inputs.

Include Defined Executions in Executors Map

Now that we have defined our adjustment executions, we need to add them to our executors Map, which is accessed by the services defined in the Sign-off API library.

We include our executors in the bean with the qualifier SP_QUALIFIER__EXECUTORS. Note that our executors are included in both profiles.

    @Bean
    @Qualifier(SP_QUALIFIER__EXECUTORS)
    public Map<String, BiConsumer<AdjustmentRequestDTO, String>> executors() {
        executionFunctionalComponents.setStatusService(statusService);
        Map<String, BiConsumer<AdjustmentRequestDTO, String>> executors = new HashMap<>();
        ...
        executors.put(IMA_SCALING, execution(imaScalingExecution));
        executors.put(DRC_SCALING, execution(drcScalingExecution));
        executors.put(DRC_PV_OVERRIDE, execution(drcPVOverrideExecution));
        executors.put(DRC_PV_ADDON, execution(drcPVAddonExecution));
        executors.put(PNL_ROLL_OVER, rollOver(
            pnlInverters,
            executionFunctionalComponents.noOp(),
            executionFunctionalComponents.inverseTagging(),
            executionFunctionalComponents.rollOverTagging()));
        ...
        return executors;
    }

Define required dimensions for sign-off adjustments

Navigate to PLSignOffAnalysisConfig.java

We add the sign-off source dimension to the appropriate cube, in this case the PLCube.

    @Qualifier("DimensionsPLCube")
    @Bean("plSignOffDimension")
    @Order(90)
    @Override
    public @NonNull DimensionsAdder signOffDimension() {
        return super.signOffDimension();
    }

Or, as an alternative:

    @Qualifier("DimensionsPLCube")
    @Bean("plSignOffDimension")
    @Order(90)
    @Override
    public @NonNull HierarchyBuilderConsumer signOffHierarchies() {
        return super.signOffHierarchies();
    }

Add source tagging fields

This will add extra fields to describe the adjustments and handle the sign-off. Navigate to PLSignOffAnalysisConfig.java

    @Qualifier(SP_QUALIFIER__CUSTOMISATIONS)
    @Bean("plSignOffCustomisations")
    public Consumer<IDatastoreConfigurator> signOffCustomisations() {
        return signOffCustomisations(null);
    }

Define the sign-off task store

This store will hold the sign-off task for the sign-off task hierarchy and task filtering. It is derived from the base cube store and the levels required for filtering. The dedicated Datastore Schema Description Post-Processor introspects the cube description and creates a sign-off store compatible with the required levels.

Navigate to PLSignOffAnalysisConfig.java

    @Bean("plDatastoreSchemaDescriptionPostProcessor")
    @Override
    public IDatastoreSchemaDescriptionPostProcessor datastoreSchemaDescriptionPostProcessor(IActivePivotManagerDescription managerDescription) {
        return super.datastoreSchemaDescriptionPostProcessor(managerDescription);
    }

Define supported adjustments

Navigate to SupportedAdjustmentsConfig

Define a SupportedAdjustmentDTO bean.

Add-on

    @Bean
    public SupportedAdjustmentDTO drcAddonPresentValue() {
        return new SupportedAdjustmentDTO(
                ADD_ON_NAME,
                DRC_PV_ADDON,
                true,
                SACubeConfigurer.CUBE_NAME,
                Set.of(SA_SENSITIVITIES_STORE_NAME),
                SA_SENSITIVITIES_STORE_FILTER_LEVELS,
                SA_SENSITIVITIES_STORE_DRC_MEASURES,
                Set.of(new InputTypedFieldDTO(PRESENT_VALUE, DOUBLE))
        );
    }

Scaling

    @Bean
    public SupportedAdjustmentDTO drcScaling() {
        return new SupportedAdjustmentDTO(
                SCALING_NAME,
                DRC_SCALING,
                true,
                SACubeConfigurer.CUBE_NAME,
                Set.of(SA_SENSITIVITIES_STORE_NAME),
                SA_SENSITIVITIES_STORE_FILTER_LEVELS,
                SA_SENSITIVITIES_STORE_DRC_MEASURES,
                Set.of(new InputTypedFieldDTO(PRESENT_VALUE, DOUBLE))
        );
    }

Override

    @Bean
    public SupportedAdjustmentDTO drcOverridePresentValue() {
        return new SupportedAdjustmentDTO(
                OVERRIDE_NAME,
                DRC_PV_OVERRIDE,
                true,
                SACubeConfigurer.CUBE_NAME,
                Set.of(SA_SENSITIVITIES_STORE_NAME),
                SA_SENSITIVITIES_STORE_FILTER_LEVELS,
                SA_SENSITIVITIES_STORE_DRC_MEASURES,
                Set.of(new InputTypedFieldDTO(PRESENT_VALUE, DOUBLE))
        );
    }

Roll-over

    @Bean
    public SupportedAdjustmentDTO pnlRollOver() {
        return new SupportedAdjustmentDTO(
                ROLL_OVER_NAME,
                PNL_ROLL_OVER,
                false,
                PnlCubeConfig.CUBE_NAME,
                Set.of(StoreNames.PNL_STORE_NAME),
                Set.of(
                        new LevelTypedFieldDTO(AS_OF_LEVEL, StoreFieldNames.AS_OF_DATE, LOCAL_DATE),
                        new LevelTypedFieldDTO(DESK_LEVEL, StoreFieldNames.DESK, STRING),
                        new LevelTypedFieldDTO(BOOK_LEVEL, StoreFieldNames.BOOK, STRING, true),
                        new LevelTypedFieldDTO(TRADE_LEVEL, StoreFieldNames.TRADE_ID, STRING, true),
                        new LevelTypedFieldDTO(SIGNOFF_STATUS_LEVEL, ADJUSTMENT_LEVEL_STATUS, LEVEL_PATH, true),
                        new LevelTypedFieldDTO(SIGNOFF_TASK_LEVEL, ADJUSTMENT_LEVEL_TASK, LEVEL_PATH, true),
                        new LevelTypedFieldDTO(SIGNOFF_SOURCE_LEVEL, SOURCE_FIELD, LEVEL_PATH, true),
                        new LevelTypedFieldDTO(SIGNOFF_INPUT_TYPE_LEVEL, INPUT_FIELD, LEVEL_PATH, true)
                ),
                null,
                Set.of(new TypedFieldDTO(StoreFieldNames.AS_OF_DATE, LOCAL_DATE))
        );
    }