Adding and Populating a New Store
This page provides a description of how to load a new file into a new store within Atoti FRTB.
The techniques employed are generic examples that can be extended, adapted, and repeated for any use case. In all cases, we make minimal changes to the Reference Implementation by externalizing our customizations into configuration classes.
Step 1 - Data Preparation
In this step, we want to bring any custom data into the appropriate folder.
For this example, we will be adding our files to `/sample-data/data/`, which is picked up by the `csv-source.dataset` property.
We will also place our file `CustomProjection.csv` into the dated subdirectory `2018-09-26` and set up our DLC configuration so that we can load these files by dated subdirectory.

The file below will be placed into `/sample-data/data/2018-09-26`:
| AsOfDate | RiskClass | CustomProjected |
|---|---|---|
| 2018-09-26 | Commodity | -6640.633693 |
| 2018-09-26 | GIRR | 14020.37649 |
| 2018-09-26 | CSR Sec CTP | 8386.767854 |
| 2018-09-26 | CSR Sec non-CTP | 19218.3336 |
| 2018-09-26 | Equity | 8274.302903 |
| 2018-09-26 | FX | 2150.845785 |
| 2018-09-26 | CSR non-Sec | -25353.39868 |
| 2018-09-26 | DRC Sec non-CTP | 11319.08548 |
| 2018-09-26 | DRC non-Sec | 17134.37517 |
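To make the expected file format concrete, here is a minimal sketch (plain Java, no Atoti dependencies) of parsing one row of this file. It assumes comma-separated values in the column order shown above; in the application, parsing is handled by the DLC according to the target store's field types, so this class is purely illustrative:

```java
import java.time.LocalDate;

public class CustomProjectionRowDemo {

    // Hypothetical row holder mirroring the CustomProjectionStore fields.
    record CustomProjectionRow(LocalDate asOfDate, String riskClass, double customProjected) {}

    // Assumes comma-separated values: AsOfDate,RiskClass,CustomProjected
    static CustomProjectionRow parse(String csvLine) {
        String[] cols = csvLine.split(",", -1);
        return new CustomProjectionRow(
                LocalDate.parse(cols[0]),      // e.g. 2018-09-26
                cols[1],                       // e.g. Commodity
                Double.parseDouble(cols[2]));  // e.g. -6640.633693
    }

    public static void main(String[] args) {
        var row = parse("2018-09-26,Commodity,-6640.633693");
        System.out.println(row.riskClass() + " -> " + row.customProjected());
    }
}
```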
Step 2 - Datastore Modifications
For this step, we will be using the Datastore Helper.

To add a new store, we need to define a new `IStoreDescription` and add it via a `DatastoreConfiguratorConsumer`. Optionally, we can also update our references by defining an `IReferenceDescription` and adding it via the `DatastoreConfiguratorConsumer`.

In the example below, we define the `IStoreDescription` and `IReferenceDescription`, and update the datastore schema via a `DatastoreConfiguratorConsumer`, all in a new Spring configuration class, `CustomProjectionStoreConfiguration`:
```java
@Configuration
public static class CustomProjectionStoreConfiguration {

    /**
     * Define our custom store.
     */
    @Bean
    public IStoreDescription customProjectionStoreDescription() {
        return StartBuilding.store()
                .withStoreName("CustomProjectionStore")
                .withField("CustomProjection_AsOfDate", DatastoreConfig.STORE_DATE_FIELD_FORMAT).asKeyField()
                .withField("CustomProjection_RiskClass", ILiteralType.STRING).asKeyField()
                .withField("CustomProjected", ILiteralType.DOUBLE)
                .build();
    }

    /**
     * Define a reference from the SA Cube's base store to our CustomProjectionStore store.
     */
    @Bean
    public IReferenceDescription customProjectionReferenceDescription() {
        var joinName = DatastoreConfig.ConcreteDatastoreConfig
                .joinName(
                        SADatastoreConfig.SA_SENSITIVITIES_STORE_NAME,
                        "CustomProjectionStore"
                );
        return StartBuilding.reference()
                .fromStore(SADatastoreConfig.SA_SENSITIVITIES_STORE_NAME)
                .toStore("CustomProjectionStore")
                .withName(joinName)
                .withMapping(SADatastoreConfig.AS_OF_DATE, "CustomProjection_AsOfDate")
                .withMapping(SADatastoreConfig.SA_SENSITIVITIES_STORE_RISK_CLASS, "CustomProjection_RiskClass")
                .build();
    }

    /**
     * Add our custom Store and Reference into the overall Datastore Schema.
     */
    @Qualifier(SP_QUALIFIER__CUSTOMISATIONS)
    @Bean
    public DatastoreConfiguratorConsumer datastoreConfiguratorConsumer(
            IStoreDescription customStoreDescription,
            IReferenceDescription customStoreReference) {
        // Add our custom Store & Reference into the datastore schema:
        return datastoreConfigurator -> datastoreConfigurator
                .addStore(customStoreDescription)
                .addReference(customStoreReference);
    }
}
```
This configuration class must be added into `FRTBConfig`'s list of Spring configuration dependencies inside the `@Import` annotation.

For more information on potential datastore modifications, see the Datastore Helper documentation.
Step 3 - DLC Configuration
For this step, we will be using the Data Load Controller.
Define a Custom Source
To load custom data, we can create a custom Source Description, or use an existing one.
In the example below, we create a custom Source Description which is configured to accept an `AsOfDate` scope. The scope value determines which dated subdirectory we load from. We explicitly include the topic which we will add in the next section, so that our source does not implicitly pick up all existing CSV topics.
```java
@Configuration
public static class CustomSourceConfig {

    /**
     * Define a Source to load our custom topics through.
     */
    @Bean
    LocalCsvSourceDescription customLocalCsvSourceDescription(CsvSourceDatasetConfigurationProperties csvSourceDataset) {
        return LocalCsvSourceDescription.builder("CustomSource", csvSourceDataset.getDataset() + AS_OF_DATE_SUBDIR)
                // Provide all custom topic names:
                .topicsToInclude(Set.of("CustomProjectionStore"))
                .build();
    }
}
```
This configuration class must be added into `FRTBConfig`'s list of Spring configuration dependencies inside the `@Import` annotation.
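The `AS_OF_DATE_SUBDIR` suffix means the source resolves its load directory per `AsOfDate` scope value. Purely as an illustration (the real resolution happens inside the DLC, and the constant name is taken from the configuration above), the effective load directory for a given scope can be sketched in plain Java as:

```java
import java.nio.file.Path;

public class DatedSubdirDemo {

    // Stand-in for the csv-source.dataset property value used in this example:
    static final String DATASET = "/sample-data/data";

    // Sketch: the dated subdirectory is appended to the dataset root,
    // so each AsOfDate scope value selects a different directory.
    static Path resolveLoadDir(String asOfDate) {
        return Path.of(DATASET, asOfDate);
    }

    public static void main(String[] args) {
        // Loading with scope AsOfDate=2018-09-26 reads from the dated subdirectory:
        System.out.println(resolveLoadDir("2018-09-26"));
    }
}
```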
Define a Custom Topic Description
Next, we need to define the Topic Description to load `CustomProjection.csv` into `CustomProjectionStore`.

The following topic configuration example configures a topic called "CustomProjectionStore", which loads files that match the file pattern `{**/,}CustomProjection*.csv`.
This configuration does not specify a target for the data, nor the data format:

- The target is implicitly the Atoti table named "CustomProjectionStore".
- The format (column order and how input fields are parsed) comes implicitly from the columns and types of the targeted Atoti table, in this case "CustomProjectionStore".

note

If you do not name your topic the same as your store, you will have to explicitly specify the target in the channel of the topic.
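The file pattern can be tried out locally with the JDK's own `PathMatcher`, which uses a similar glob syntax. A quick, self-contained sketch (no Atoti dependencies; assumes the DLC matches relative paths the same way) showing which paths the pattern accepts:

```java
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;

public class TopicGlobDemo {
    public static void main(String[] args) {
        // {**/,} makes the leading directory part optional, so the pattern matches
        // files both in the source root and inside dated subdirectories.
        PathMatcher matcher = FileSystems.getDefault()
                .getPathMatcher("glob:{**/,}CustomProjection*.csv");

        System.out.println(matcher.matches(Path.of("CustomProjection.csv")));            // matches
        System.out.println(matcher.matches(Path.of("2018-09-26/CustomProjection.csv"))); // matches
        System.out.println(matcher.matches(Path.of("2018-09-26/OtherFile.csv")));        // does not match
    }
}
```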
```java
@Configuration
public static class CustomTopicsConfig {

    /**
     * Define the topic to load our Custom Projection file.
     */
    @Bean
    @Qualifier
    public CsvTopicDescription customProjectionTopicDescription() {
        // Create the Topic Description:
        return CsvTopicDescription.builder("CustomProjectionStore", "glob:{**/,}CustomProjection*.csv")
                .build();
    }
}
```
This configuration class must be added into `FRTBConfig`'s list of Spring configuration dependencies inside the `@Import` annotation.
Step 4 - Executing Our Load
We can now load our Topic via the DLC. We can add a configuration to load our topic at application startup by defining a Spring `ApplicationRunner` bean that executes either before or after the application's default initial data loading phase.

When the application starts, two `ApplicationRunner` beans perform the initial load of Configuration data and Dated data. Their order is defined via the `StartupSpringBeanOrder` integer constants:

- `StartupSpringBeanOrder.INITIAL_CONFIGURATION_DATA_LOAD`: Default value is `10`.
- `StartupSpringBeanOrder.INITIAL_DATA_LOAD`: Default value is `20`.
We can load before or after the `INITIAL_CONFIGURATION_DATA_LOAD` or `INITIAL_DATA_LOAD` phase by subtracting `1` from, or adding `1` to, either ordering value.
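Assuming the default values above hold, the resulting startup ordering can be sketched as follows (lower `@Order` values run first; the constant values here are copied from the defaults listed above):

```java
public class StartupOrderDemo {

    // Default values of the StartupSpringBeanOrder constants described above:
    static final int INITIAL_CONFIGURATION_DATA_LOAD = 10;
    static final int INITIAL_DATA_LOAD = 20;

    public static void main(String[] args) {
        int beforeConfigLoad = INITIAL_CONFIGURATION_DATA_LOAD - 1; // runs before both default loads
        int betweenLoads     = INITIAL_CONFIGURATION_DATA_LOAD + 1; // runs between the two default loads
        int afterDataLoad    = INITIAL_DATA_LOAD + 1;               // runs after the default dated data load

        System.out.println(beforeConfigLoad + " < " + INITIAL_CONFIGURATION_DATA_LOAD
                + " < " + betweenLoads + " < " + INITIAL_DATA_LOAD + " < " + afterDataLoad);
    }
}
```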
Here, we will define a configuration to load after the default initial data loading phase by adding `1` to the `INITIAL_DATA_LOAD` bean order:
```java
@Configuration
public static class LoadCustomTopicOnInitialLoadConfig {

    @Autowired IDataLoadControllerService dataLoadControllerService;

    @Bean
    // Run before (-) / after (+) the default initial dated data load
    @Order(StartupSpringBeanOrder.INITIAL_DATA_LOAD + 1)
    @ConditionalOnInMemoryDatabase
    public ApplicationRunner customTopicInitialLoad() {
        return args -> {
            // Execute loading of our topic:
            var requestResult = dataLoadControllerService.execute(
                    DlcLoadRequest.builder()
                            .topics("CustomProjectionStore")
                            .scope(DlcScope.of(Map.of(SCOPE__AS_OF_DATE, "2018-09-26")))
                            .build()
            );
        };
    }
}
```
This configuration class must be added into `FrtbApplicationConfig`'s list of Spring configuration dependencies inside the `@Import` annotation.
Step 5 - Ensure FrtbApplicationConfig picks up our modifications
For our customizations to be picked up, we must include all added Spring `@Configuration` classes in `FrtbApplicationConfig`, located in `/frtb-application/src/main/java/com/activeviam/frtb/application/config/`.
When using DirectQuery
When using DirectQuery, we must still perform Step 2 (that is, create the datastore definition we want to populate) and configure our datastore correctly. We must then modify our remote database to include the new Table / Fields, ensuring that we follow the naming convention of our Name Mapper.
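The exact convention depends on the Name Mapper configured for the application, so check its documentation before creating the remote table. Purely as a hypothetical illustration of the kind of mapping involved (this is not the actual FRTB Name Mapper), a mapper converting camel-case in-memory names to upper snake-case database names could look like:

```java
public class NameMapperDemo {

    // Hypothetical mapper, for illustration only: inserts '_' before each
    // upper-case letter that follows a lower-case letter or digit,
    // then upper-cases the whole name.
    static String toUpperSnakeCase(String name) {
        return name.replaceAll("([a-z0-9])([A-Z])", "$1_$2").toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(toUpperSnakeCase("CustomProjectionStore")); // CUSTOM_PROJECTION_STORE
        System.out.println(toUpperSnakeCase("CustomProjected"));       // CUSTOM_PROJECTED
    }
}
```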
Our remote database only needs to contain the new fields and/or tables. When the application starts, the datastore description is converted into a database description, so the remote database must already contain the modifications before we start the application.
Adding a Field with DirectQuery
When adding a new field, we must add it to the datastore description as outlined in Step 2 (that is, create the datastore definition we want to populate) and also ensure the field exists in our database in the correct table.
Then when the application starts, the datastore description (containing the custom field) will be converted into a DirectQuery database description and the field can be used for Hierarchies or treated the same as other DirectQuery fields.
note
When using custom fields with DirectQuery, remember that datastore queries and getByKey queries are not recommended, due to their performance overhead.
Adding a new Table with DirectQuery
When adding a new table, all we need to do is complete Step 2 (that is, create the datastore definition we want to populate) and ensure we have defined our table on our remote database. The Migrator will then automatically convert our in-memory store description into a remote DirectQuery table description. This is done when we create our remote DirectQuery schema inside the `DirectQueryApplicationConfig.schema()` method.
warning
The DirectQuery and in-memory Atoti servers must be configured identically. There must not be any hierarchies in the DirectQuery data node that do not exist in the in-memory one.
Suggested further reading
Enriching File Fields by Adding Column Calculators
Add and Load a New Column to Existing File