# Adding data loading or unloading topics
This page describes how to load a new file into a new store within Atoti Market Risk. The techniques shown here are generic examples that can be extended, adapted, and repeated for any use case you encounter. In all cases, we make minimal changes to the MR application.
## Step 1 - Move data to the relevant directory
For this example, we place our custom data in the appropriate folder inside `/mr-application/src/main/resources/data`, within the `/2023-09-26` folder, and name our file `Custom.csv`. Its contents look like this:
| AsOfDate | RiskClass | CustomProjected |
|---|---|---|
| 2018-09-26 | Commodity | -6640.633693 |
| 2018-09-26 | GIRR | 14020.37649 |
| 2018-09-26 | CSR Sec CTP | 8386.767854 |
| 2018-09-26 | CSR Sec non-CTP | 19218.3336 |
| 2018-09-26 | Commodity | -2460.048584 |
| 2018-09-26 | Equity | 8274.302903 |
| 2018-09-26 | FX | 2150.845785 |
| 2018-09-26 | GIRR | 17537.8908 |
| 2018-09-26 | CSR non-Sec | -25353.39868 |
| 2018-09-26 | DRC Sec non-CTP | 11319.08548 |
| 2018-09-26 | FX | 25977.18728 |
| 2018-09-26 | Commodity | 11714.89133 |
| 2018-09-26 | Equity | -19844.11309 |
| 2018-09-26 | FX | 8906.302165 |
| 2018-09-26 | GIRR | 19617.16455 |
| 2018-09-26 | DRC non-Sec | 17134.37517 |
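
On disk, the same records live in a plain CSV file. A minimal sample of `Custom.csv`, assuming a comma-separated layout with a header row matching the table above:

```csv
AsOfDate,RiskClass,CustomProjected
2018-09-26,Commodity,-6640.633693
2018-09-26,GIRR,14020.37649
2018-09-26,CSR Sec CTP,8386.767854
```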
## Step 2 - Define the datastore and references
To add a new datastore, we need to create a bean with the signature `@Qualifier(SP_QUALIFIER__CUSTOMISATIONS) Consumer<IDatastoreConfigurator> customisation()`, which is used by the `addModifications()` method inside the `DatastoreConfiguratorSetup` class.
`mr-application/src/main/java/com/activeviam/mr/application/datastore/DatastoreConfiguratorSetup.java`
```java
// 1. Define all the fields you wish to store in your cube for that particular store.
@Bean
@Qualifier(SP_QUALIFIER__CUSTOMISATIONS)
public Consumer<IDatastoreConfigurator> myCustomisations() {
    return Customisations::loadCustomisations;
}

public static void loadCustomisations(IDatastoreConfigurator configurator) {
    // ... apply store and field customisations on the configurator here
}
```
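
For illustration, `loadCustomisations` could register the `Custom` store with the three fields from our file. The sketch below is a hedged assumption: the builder methods (`storeBuilder`, `withStoreName`, `withField`, `asKeyField`, `addStore`) are modeled on the reference-builder style shown next and may differ in your configurator version, and `CUSTOM_STORE` is a constant we introduce here.

```java
public static final String CUSTOM_STORE = "Custom";

// Hypothetical sketch - verify the builder method names against your
// IDatastoreConfigurator version before using.
public static void loadCustomisations(IDatastoreConfigurator configurator) {
    IStoreDescription storeDescription = configurator.storeBuilder(FRTBConstants.FRTB_SCHEMA)
        .withStoreName(CUSTOM_STORE)
        .withField("AsOfDate", ILiteralType.LOCAL_DATE).asKeyField()
        .withField("RiskClass", ILiteralType.STRING).asKeyField()
        .withField("CustomProjected", ILiteralType.DOUBLE)
        .build();
    configurator.addStore(FRTBConstants.FRTB_SCHEMA, storeDescription);
}
```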
Here we can also add a reference, if needed. For example, the following joins the trade base store to our new store on the risk class and as-of date fields:
```java
IReferenceDescription referenceDescription = configurator.referenceBuilder(FRTBConstants.FRTB_SCHEMA)
    .fromStore(SADatastoreConfig.TRADE_BASE_STORE_NAME)
    .toStore(CUSTOM_STORE)
    .withName(SADatastoreConfig.TRADE_BASE_STORE_NAME + "To" + CUSTOM_STORE)
    .withMapping(SADatastoreConfig.TRADE_BASE_STORE_RISK_CLASS, SADatastoreConfig.TRADE_BASE_STORE_RISK_CLASS)
    .withMapping(SADatastoreConfig.AS_OF_DATE, SADatastoreConfig.AS_OF_DATE)
    .build();
configurator.addReference(FRTBConstants.FRTB_SCHEMA, referenceDescription);
```
See `DatastoreHelper` for more information.
## Step 3 - Create channel parameters for our data loading topic
The channel parameters for our store include the store name, topic name, and file pattern. Ensure that a file pattern property is defined within `mr-application/src/main/resources/mr.properties`:

```properties
custom.file-pattern=**/*Custom*.csv
```

This glob matches any file whose name contains `Custom` and has the `.csv` extension, in any subdirectory of the data folder.
To add the channel parameters, we extend the `SourcePatternsConfig` class (here `CUSTOM_STORE` is the store-name constant defined in Step 2):
```java
public class ExtendedSourceConfig extends SourcePatternsConfig {

    public static final String FILE_PATTERN_PROP = "custom.file-pattern";
    public static final String TOPIC_CUSTOM = "Custom_TOPIC";

    @Bean(name = "sensiPatterns")
    @ConditionalOnVectorizedSensitivity
    @Override
    public TopicToStoreAndFilePatternHolder sensiFilePatterns() {
        // Start from the default sensitivity patterns, then register our custom topic.
        final TopicToStoreAndFilePatternHolder patterns = super.sensiFilePatterns();
        patterns.add(TOPIC_CUSTOM, CUSTOM_STORE, FILE_PATTERN_PROP);
        return patterns;
    }
}
```
## Step 4 - Ensure that MarketRiskConfig takes our extended class
Navigate to the application config class `MarketRiskConfig` in `mr-application/src/main/java/com/activeviam/mr/application/main/MarketRiskConfig.java` and replace `SourcePatternsConfig` with `ExtendedSourceConfig`:
```java
@Configuration
@Import(value = {
    ...
    // SourcePatternsConfig.class,
    ExtendedSourceConfig.class,
    ...
})
```
## Step 5 - Ensure that our topic is included within InitialDataLoadConfig
`InitialDataLoadConfig` orchestrates the execution of our topic.
```java
public Void initialDataLoad() throws IOException {
    // ...
    // In this case it makes sense to include a scope for our files.
    for (final LocalDate date : initialLoadAsOfDates()) {
        final Map<String, Object> fetchScope = new HashMap<>();
        fetchScope.put(DataLoadControllerConfig.SCOPE__AS_OF_DATE, date.toString());
        controller.execute(new DataLoadControllerRequest(FRTBLoadDataTxControllerTask.PLUGIN_KEY,
                Arrays.asList(ExtendedSourceConfig.TOPIC_CUSTOM), fetchScope));
        // ...
    }
    return null;
}
```
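
For unloading topics, the same controller pattern applies: submit a `DataLoadControllerRequest` built with an unload task plugin key instead of the load one, reusing `controller` and `date` from the snippet above. The sketch below is an assumption; `FRTBUnloadDataTxControllerTask` is a hypothetical name, so check which unload task plugin key your MR version provides.

```java
// Hypothetical sketch: the unload task plugin key name is an assumption.
final Map<String, Object> unloadScope = new HashMap<>();
unloadScope.put(DataLoadControllerConfig.SCOPE__AS_OF_DATE, date.toString());
controller.execute(new DataLoadControllerRequest(FRTBUnloadDataTxControllerTask.PLUGIN_KEY,
        Arrays.asList(ExtendedSourceConfig.TOPIC_CUSTOM), unloadScope));
```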
## When using DirectQuery
When using DirectQuery, we must still perform Step 2 and configure our datastore correctly. We must then modify our remote database to include the new table/fields, making sure we follow the naming convention of our Name Mapper.

The database only needs to contain the new fields and/or stores. When the application starts, the datastore description is converted into a database description, so the remote database must contain the modifications before the application starts.
### Adding a Field with DirectQuery
When adding a new field, we must add it to the datastore description as outlined in Step 2, and also ensure that the field exists in the correct table of our database. When the application starts, the datastore description (containing the custom field) is converted into a DirectQuery database description, and the field can then be used for hierarchies or treated the same as any other DirectQuery field.
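
For instance, appending a custom field to an existing store could look like the following. This is a sketch under assumptions: `appendField` and `FieldDescription` are illustrative names, so confirm the exact field-customization API of your `IDatastoreConfigurator` version.

```java
// Hypothetical sketch: appendField and FieldDescription are assumed names;
// verify against your IDatastoreConfigurator version.
configurator.appendField(
        FRTBConstants.FRTB_SCHEMA,
        SADatastoreConfig.TRADE_BASE_STORE_NAME,
        new FieldDescription("CustomProjected", ILiteralType.DOUBLE));
```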
> **Note:** When using custom fields with DirectQuery, datastore queries and `getByKey` queries are not recommended due to the performance overhead.
### Adding a new Table with DirectQuery
Stores that are part of the datastore star schema are automatically added as tables used with DirectQuery. If the remote table needs to be cached in memory, see [Registering the Table to Cache](/dev/dev-direct-query/caching-remote-table.html#registering-the-table-to-cache).
> **Warning:** The DirectQuery and in-memory Atoti servers must be configured identically. There must not be any hierarchies in the DirectQuery data node that do not exist in the in-memory one.
## Suggested Further Reading
- Adding cube hierarchies
- Configuring measures using Spring Beans
- Configuring schema selections using Spring Beans
- Configuring sources using Spring Beans
- Adding a new KPI