Connecting to a Database

This section explains how to connect to a natively supported Database.

Overall Sequence

To connect to a remote Database using DirectQuery you need to complete these steps:

  1. Configure your remote Database in Atoti Market Risk's expected Database format
  2. Set the Atoti Market Risk configuration properties
  3. Deploy Atoti Market Risk in Horizontal Distribution or in a Single JVM

Database Schema

Your remote Database must be configured in the same format as Atoti Market Risk's in-memory datastores, meaning the same Tables and Field Types must be replicated. You can do this by defining the Tables as outlined in the Database documentation, or by exposing Views that present your existing Tables in the expected format (see the sketch after the note below).

note

All of the Database Tables must be present either as actual Tables or Views on your connected Database.
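
If your existing Tables do not match the expected format, Views can adapt them. The sketch below is purely illustrative: the source table SOURCE_TRADE_PNL, the target name TradePnls and the column mappings are all hypothetical — the actual Table and Field names must come from the Database documentation.

-- Hypothetical mapping from an existing table to an expected Table
CREATE VIEW MR_SCHEMA.TradePnls AS
SELECT
    AS_OF     AS AsOfDate,   -- rename existing columns to the expected Field names
    TRADE_REF AS TradeId,
    PNL_VALUE AS PnL
FROM SOURCE_TRADE_PNL;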

Required DirectQuery Properties

The following properties must be configured to get started with DirectQuery.

Disable Other Cubes

For this preview implementation, only VaR is supported when running with DirectQuery. To disable the other cubes (Market data, PnL, Sensitivities and Summary), configure the following properties either in the application.yaml file or as a command-line argument with the prefix -D as shown:
In application.yaml:

mr.enable.cubes.<cube>: false

As a command-line argument:

-Dmr.enable.cubes.<cube>=false
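
For example, assuming hypothetical cube keys matching the cube names above (see Activating and de-activating cubes for the actual keys):

-Dmr.enable.cubes.market-data=false
-Dmr.enable.cubes.pnl=false
-Dmr.enable.cubes.sensitivities=false
-Dmr.enable.cubes.summary=false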

Database Configuration

Properties for enabling and connecting to a remote database can be defined in either YAML or standard properties format, in .yaml and .properties files or on the command line.

starter:
  deployment:
    type: direct-query
# MSSQL Database Connection Parameters
directquery:
  withAdjustments: false
  database:
    type: mssql
    mssql:
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      port: YOUR_PORT
      hostName: YOUR_HOST
      database: YOUR_DATABASE
      schema: YOUR_SCHEMA
      encrypt: true/false
      trustServerCertificate: true/false
      connectRetryCount: 5

The above properties populate the MSSQLSpringProperties Java class.
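
The same connection can be expressed in .properties form using the full property names listed below, for example:

starter.deployment.type=direct-query
directquery.database.type=mssql
directquery.database.mssql.username=YOUR_USERNAME
directquery.database.mssql.password=YOUR_PASSWORD
directquery.database.mssql.port=YOUR_PORT
directquery.database.mssql.hostName=YOUR_HOST
directquery.database.mssql.database=YOUR_DATABASE
directquery.database.mssql.schema=YOUR_SCHEMA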

username

Full property name: directquery.database.mssql.username

Sets the user name that will be used when connecting to SQL Server.

password

Full property name: directquery.database.mssql.password

Sets the password that will be used when connecting to SQL Server.

port

Full property name: directquery.database.mssql.port

Sets the TCP/IP port number used when opening a socket connection to SQL Server.

hostName

Full property name: directquery.database.mssql.hostName

The hostname of the machine where the MSSQL database is running. For example, when running locally you can specify “localhost”.

database

Full property name: directquery.database.mssql.database

The Microsoft SQL Database you will be connecting to.

schema

Full property name: directquery.database.mssql.schema

The Schema within the specified Database to use. This Schema should match Atoti Market Risk’s expected Database Structure either by having the same Table structure or through the use of views.

encrypt

Full property name: directquery.database.mssql.encrypt

Set to true if Secure Sockets Layer (SSL) encryption is enabled between the client and SQL Server, false otherwise.

trustServerCertificate

Full property name: directquery.database.mssql.trustServerCertificate

Set to true if the SSL certificate should be automatically trusted when the communication layer is encrypted using SSL, false otherwise.

connectRetryCount

Full property name: directquery.database.mssql.connectRetryCount

The maximum number of attempts to re-establish a broken connection.

starter:
  deployment:
    type: direct-query
# Snowflake Database Connection Parameters
directquery:
  database:
    type: snowflake
    snowflake:
      connectionString: YOUR_JDBC_SNOWFLAKE_CONNECTION_STRING
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      warehouse: YOUR_WAREHOUSE
      feedingWarehouse: YOUR_FEEDING_WAREHOUSE
      database: YOUR_DATABASE
      schema: YOUR_SCHEMA
      role: YOUR_ROLE
      arrayAggWrapperFunctionName: YOUR_SPECIFIC_ARRAY_AGGREGATE_FUNCTION
      additionalOptions:
        anSFSessionProperty: YOUR_EXTRA_PROPERTY 

The above properties populate the SnowflakeSpringProperties Java class.

connectionString

Full property name: directquery.database.snowflake.connectionString

This is the JDBC Driver connection string that points to the Snowflake Database. For details on how to find your JDBC connection string, see the JDBC Driver Snowflake documentation.
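
For example, a Snowflake JDBC connection string typically has the form:

jdbc:snowflake://<account_identifier>.snowflakecomputing.com/

where <account_identifier> is your Snowflake account identifier.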

warehouse

Full property name: directquery.database.snowflake.warehouse

The warehouse where queries and aggregations take place. You can view the available warehouses with the SHOW WAREHOUSES command.

feedingWarehouse

Full property name: directquery.database.snowflake.feedingWarehouse

A dedicated warehouse used during the feeding phase.

database

Full property name: directquery.database.snowflake.database

The Snowflake Database you will be connecting to.

schema

Full property name: directquery.database.snowflake.schema

The Schema within the specified Database to use. This Schema should match Atoti Market Risk’s expected Database Structure either by having the same Table structure or through the use of views.

role

Full property name: directquery.database.snowflake.role

The Role to use when connecting to the specified Database. This Role must be allowed to use the specified Warehouse and to query the specified Schema.

arrayAggWrapperFunctionName

Full property name: directquery.database.snowflake.arrayAggWrapperFunctionName

Optional name of the user-defined function that can be used to wrap the ARRAY_AGG operations and improve performance. This function must be defined in Snowflake and be accessible to the role running the queries.

additionalOptions

Full property name: directquery.database.snowflake.additionalOptions

Any connection option defined in the Snowflake documentation, represented as key/value pairs.
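
For example, to pass the tracing connection parameter from the Snowflake JDBC documentation (purely illustrative — any documented option can be supplied this way):

directquery:
  database:
    snowflake:
      additionalOptions:
        tracing: INFO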

Deployment Options

We provide two main options to run with DirectQuery:

  • Operate with some data loaded in-memory and the rest available through DirectQuery: run in Horizontal Distribution with an in-memory Database.
  • Run purely on DirectQuery remote data: run in a Single JVM on DirectQuery only.

Horizontal Distribution

In Horizontal Distribution you have access to in-memory tools such as What-If, data updates and Sign-Off for the data loaded in-memory, as well as access to a large number of historical dates through DirectQuery.

When running in Horizontal Distribution you need to run three Nodes:

  • Query node
  • In-memory data node
  • DirectQuery data node

mr.data-load.initial-business-dates

For both the combined query/in-memory node and the DirectQuery data node, set the mr.data-load.initial-business-dates property.

Given that the data nodes are distributed by AsOfDate, no two data nodes can contain the same Partition: an AsOfDate present in one node cannot be present in another. To prevent overlapping dates, the directQueryDatesToIncludeFilter bean sets which dates to include in the DirectQuery data node.

note

This property must be provided with the same values to both data nodes.
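
A minimal sketch, assuming the property accepts a comma-separated list of ISO dates (the values shown are illustrative):

mr:
  data-load:
    initial-business-dates: 2023-09-26, 2023-09-27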

JVM With Query Node And In-Memory Data Node

This JVM consists of two distributed nodes: a query node and an in-memory data node.

Start the JVM by specifying the following parameters to the Atoti Market Risk application, either in a .properties file or through command-line arguments (add -D before each property).

See Activating and de-activating cubes for details about the enabling/disabling of cubes.

# Enable / Disable cubes
mr:
  enable:
    cubes:
      <cube>: false
      common: true

# Set MR to run this JVM with a query node and in-memory data node
starter:
  deployment:
    type: in-memory
    transport: netty
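
The same configuration expressed as command-line arguments:

-Dmr.enable.cubes.<cube>=false
-Dmr.enable.cubes.common=true
-Dstarter.deployment.type=in-memory
-Dstarter.deployment.transport=netty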

JVM With DirectQuery Data Node

Once the JVM with the query node and in-memory data node is running, you can start a second JVM containing the DirectQuery data node, with the following configuration properties set either in a .properties file or through command-line arguments (add -D before each property).

# Set MR to run as a data node and to connect to the first JVM
starter:
  deployment:
    type: direct-query
    transport: netty

# Disable cubes 
mr:
  enable:
    cubes:
      common: false
      <cube>: false 
  # Use a different Content Server for this JVM
  application:
    content-service:
      db:
        url: jdbc:h2:mem:content_service;DB_CLOSE_DELAY=-1
        hibernate:
          hbm2ddl:
            auto: create
  
# Use a different port from the first JVM
server:
  port: 10011

# Enable and configure the MSSQL DirectQuery Database
directquery:
  database:
    type: mssql
    mssql:
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      port: PORT_MSSQL_DATABASE_IS_ON
      hostName: HOST_OF_THE_MSSQL_DATABASE
      database: YOUR_DATABASE_NAME
      schema: YOUR_DATABASE_SCHEMA
      encrypt: true/false
      trustServerCertificate: true/false
      connectRetryCount: integer

The equivalent configuration for connecting to a Snowflake Database:

# Set MR to run as a data node and to connect to the first JVM
starter:
  deployment:
    type: direct-query
    transport: netty

# Disable cubes
mr:
  enable:
    cubes:
      common: false
      <cube>: false
  # Use a different Content Server for this JVM
  application:
    content-service:
      db:
        url: jdbc:h2:mem:content_service;DB_CLOSE_DELAY=-1
        hibernate:
          hbm2ddl:
            auto: create
            
# Use a different port from the first JVM
server:
  port: 10011

# Enable and configure the Snowflake DirectQuery Database
directquery:
  database:
    type: snowflake
    snowflake:
      connectionString: YOUR_JDBC_SNOWFLAKE_CONNECTION_STRING
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      role: ROLE_TO_USE
      warehouse: YOUR_WAREHOUSE
      database: YOUR_DATABASE
      schema: YOUR_SCHEMA

Single JVM

You can run a single JVM consisting of only DirectQuery.

When running in a single JVM with DirectQuery only, the DirectQuery data is available in the VaR/ES cube.

Properties

To run the Single JVM node, you only need to configure the Database properties, disable the other cubes, and run the application as normal. There are no distributed nodes to configure when running in a single JVM.

See Activating and de-activating cubes for details about the enabling/disabling of cubes.

Here is an example of the properties to use:

starter:
  deployment:
    type: direct-query
    transport: local

# Disable cubes
mr:
  enable:
    cubes:
      <cube>: false
      
# Enable and configure the MSSQL DirectQuery Database
directquery:
  database:
    type: mssql
    mssql:
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      port: PORT_MSSQL_DATABASE_IS_ON
      hostName: HOST_OF_THE_MSSQL_DATABASE
      database: YOUR_DATABASE_NAME
      schema: YOUR_DATABASE_SCHEMA
      encrypt: true/false
      trustServerCertificate: true/false
      connectRetryCount: integer

The equivalent configuration for connecting to a Snowflake Database:

starter:
  deployment:
    type: direct-query
    transport: local
    
# Disable cubes
mr:
  enable:
    cubes:
      <cube>: false

# Enable and configure the Snowflake DirectQuery Database
directquery:
  database:
    type: snowflake
    snowflake:
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      role: YOUR_ROLE
      connectionString: YOUR_CONNECTION_STRING
      warehouse: YOUR_WAREHOUSE
      database: YOUR_DATABASE_NAME
      schema: YOUR_SCHEMA_NAME
      arrayAggWrapperFunctionName: YOUR_ARRAY_AGGREGATOR_FUNCTION

Single JVM Limitations

By running DirectQuery with a single JVM, you run purely on DirectQuery. This comes at the cost of query performance for Trade-level queries, and in-memory-only tools such as What-If and Sign-Off are not available.

note

Running both an in-memory data node and DirectQuery data node under a single JVM is not currently supported.

Reference Database

Atoti Market Risk ships with an example MSSQL database that can be used for testing. This database runs inside a Docker container and is located in the directory mr-directquery/src/test/resources/databases/mssql/.

Atoti Market Risk also ships with the SQL and data needed to initialize a Snowflake database for testing. The SQL script and reference data are located in the directory mr-directquery/src/test/resources/databases/snowflake/.
Atoti Market Risk ships with the SQL and data needed to initialize a Snowflake database that can be used for testing. The SQL scrip and reference data are located within the directory: mr-directquery/src/test/resources/databases/snowflake/.