Snowflake Database

This section explains how to connect to a Snowflake Database.

Overall Sequence

To connect to a remote Database using DirectQuery, you need to complete these steps:

  1. Configure your remote Database in Atoti FRTB's expected Database format
  2. Set the Atoti FRTB configuration properties
  3. Deploy Atoti FRTB in Horizontal Distribution or in a single JVM

Database Schema

Your remote Database must be configured in the same format as Atoti FRTB's in-memory datastores: the same Tables and Field Types must be replicated. You can do this either by defining the Tables as outlined in the Database documentation or by exposing Views on your Database that match the expected format.

note

All of the Database Tables must be present either as actual Tables or Views on your connected Database.
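
For example, if your Snowflake data is stored in a layout that differs from the expected format, you can expose a View per expected Table. The statement below is only an illustrative sketch: the table, view, and column names are hypothetical, and the actual Tables and Field Types to match are listed in the Database documentation.

-- Hypothetical example: expose an existing table as a View matching one of
-- the expected Atoti FRTB Tables. All names below are placeholders.
CREATE OR REPLACE VIEW YOUR_SCHEMA.TRADE_ATTRIBUTES AS
SELECT
    SOURCE_TRADE_ID AS TradeId,
    COB_DATE        AS AsOfDate,
    DESK_NAME       AS Desk
FROM YOUR_SCHEMA.SOURCE_TRADES;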

Required DirectQuery Properties

The following properties must be configured to get started with DirectQuery.

Maven Profile

The frtb-directquery module is not included on the classpath by default. Enable the direct-query Maven profile when building the JAR file to add the frtb-directquery module to the classpath.
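
For example, assuming a standard Maven build from the project root, the profile can be activated at build time:

# Build the JAR with the direct-query Maven profile enabled
mvn clean package -P direct-query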

Disable IMA

For this preview implementation, only SA is supported when running with DirectQuery. To disable IMA, add the following property either to the frtb.properties file or as a command-line argument with the prefix -D as shown:
frtb.properties:
ima.disable=true
Command-line argument:
-Dima.disable=true

Snowflake Configuration

The application.yaml file contains specific properties for enabling and connecting to a remote Database.
# Snowflake Database Connection Parameters
directquery:
  enabled: true
  database:
    type: snowflake
    snowflake:
      connectionString: YOUR_JDBC_SNOWFLAKE_CONNECTION_STRING
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      role: ROLE_TO_USE
      warehouse: YOUR_WAREHOUSE
      database: YOUR_DATABASE
      schema: YOUR_SCHEMA

The above properties populate the SnowflakeSpringProperties Java class.

connectionString

Full property name: directquery.database.snowflake.connectionString

This is the JDBC Driver connection string that points to the Snowflake Database. For details on how to find your JDBC connection string, see the Snowflake JDBC Driver documentation.
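
As an indication, a Snowflake JDBC connection string typically has the following form, where the account identifier is specific to your Snowflake account:

jdbc:snowflake://<account_identifier>.snowflakecomputing.com/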

role

Full property name: directquery.database.snowflake.role

The role you want to grant Atoti FRTB when it runs queries on the remote database.

warehouse

Full property name: directquery.database.snowflake.warehouse

The warehouse where queries and aggregations will run. You can list the available warehouses with the SHOW WAREHOUSES command.
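
For example, you can check from a Snowflake worksheet that the warehouse you intend to configure exists and is visible to the chosen role (the warehouse name below is a placeholder):

SHOW WAREHOUSES LIKE 'YOUR_WAREHOUSE';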

database

Full property name: directquery.database.snowflake.database

The Snowflake Database to connect to.

schema

Full property name: directquery.database.snowflake.schema

The Schema within the specified Database to use. This Schema should match Atoti FRTB's expected Database structure, either by having the same Table structure or through the use of Views.
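
As a quick check, you can list the Tables and Views exposed in the configured Schema and compare them against the expected Database structure (the names below are placeholders):

SHOW TABLES IN SCHEMA YOUR_DATABASE.YOUR_SCHEMA;
SHOW VIEWS IN SCHEMA YOUR_DATABASE.YOUR_SCHEMA;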

Deployment Options

We provide two main options to run with DirectQuery:

  • Operate with some data loaded in-memory and the rest available through DirectQuery: run Atoti FRTB in Horizontal Distribution with an in-memory Database.
  • Run purely on DirectQuery remote data: run Atoti FRTB in a single JVM on DirectQuery only.

Dates to Include Filter Configuration

Because the data nodes are distributed by AsOfDate, no two data nodes can contain the same Partition: an AsOfDate loaded in one node must not be present in another. To prevent any overlap, the directQueryDatesToIncludeFilter bean sets which dates are included in the DirectQuery data node.

note

The filter must be configured with the same values on both data nodes.

Horizontal Distribution

In Horizontal Distribution you keep access to the in-memory tools, such as What-If, data updates, and Sign-Off, for the data loaded in-memory, while DirectQuery gives you access to a large number of historical dates.

When running in Horizontal Distribution you need to run three Nodes:

  • Query node
  • In-memory data node
  • DirectQuery data node

JVM With Query Node

This JVM consists of the StandardisedApproachCube query node and the CombinedCube query node.

Start the JVM by specifying the following parameters to the Atoti FRTB application, either in a .properties file or through command-line arguments (add -D before each property).

# Disable IMA
ima.disable=true

# Set Atoti FRTB to run as a query node
spring.profiles.active=dist-query-node
activeviam.distribution.gossip.router.enable=true
activeviam.distribution.gossip.router.port=16484

# DirectQuery should not be enabled on the query node
directquery.enabled=false

JVM With In-Memory Data Node

This JVM consists of the in-memory data node only.

Start the JVM by specifying the following parameters to the Atoti FRTB application, either in a .properties file or through command-line arguments (add -D before each property).

# Disable IMA
ima.disable=true

# Set Atoti FRTB to run as a data node
spring.profiles.active=dist-data-node

# Use a different port than the other JVMs
server.port=8081

# Use an in-memory Content Server for this JVM
content-service.db.url=jdbc:h2:mem:content_service;DB_CLOSE_DELAY=-1
content-service.db.hibernate.hbm2ddl.auto=create

# DirectQuery should not be enabled on the in-memory data node
directquery.enabled=false

JVM With DirectQuery Data Node

Once the JVMs with the query node and the in-memory data node are running, you can start a third JVM with the DirectQuery data node, using the following configuration properties either in a .properties file or through command-line arguments (add -D before each property).

# Disable IMA
ima.disable=true

# Set Atoti FRTB to run as a data node and to connect to the query node
spring.profiles.active=dist-data-node

# Use a different port than the other JVMs
server.port=8082

# Use an in-memory Content Server for this JVM
content-service.db.url=jdbc:h2:mem:content_service;DB_CLOSE_DELAY=-1
content-service.db.hibernate.hbm2ddl.auto=create

# Enable and configure the Snowflake DirectQuery Database
directquery.enabled=true
directquery.database.type=snowflake
directquery.database.snowflake.username=database_username
directquery.database.snowflake.password=database_password
directquery.database.snowflake.role=role_app_will_use
directquery.database.snowflake.connectionString=connection_string
directquery.database.snowflake.warehouse=snowflake_warehouse
directquery.database.snowflake.database=snowflake_database
directquery.database.snowflake.schema=snowflake_database_schema

Single JVM

You can run a single JVM consisting of only DirectQuery.

By running in a single JVM with DirectQuery only, you can now see the DirectQuery data in the StandardisedApproachCube.

Properties

To run the single JVM node, you only need to configure the Snowflake properties, disable IMA, and run the application as normal. There are no distributed nodes to configure when running in a single JVM.

Here is an example of the properties to use:

# Disable IMA in order to use DirectQuery
ima.disable=true

# Enable and configure the Snowflake DirectQuery Database
directquery.enabled=true
directquery.database.type=snowflake
directquery.database.snowflake.username=database_username
directquery.database.snowflake.password=database_password
directquery.database.snowflake.role=role_app_will_use
directquery.database.snowflake.connectionString=connection_string
directquery.database.snowflake.warehouse=snowflake_warehouse
directquery.database.snowflake.database=snowflake_database
directquery.database.snowflake.schema=snowflake_database_schema

Single JVM Limitations

Running with a single JVM means running purely on DirectQuery. This comes at the cost of query performance for Trade-level queries, and you do not have access to in-memory-only tools such as What-If and Sign-Off.

note

Running both an in-memory data node and DirectQuery data node under a single JVM is not currently supported.

Reference Snowflake Database Schema

Atoti FRTB ships with the SQL and data needed to initialize a Snowflake database that can be used for testing. The SQL script and reference data are located in the directory frtb-directquery/src/test/resources/databases/snowflake/.