Snowflake Database
This section explains how to connect to a Snowflake Database.
Overall Sequence
To connect to a remote Database using DirectQuery you need to complete these steps:
- Configure your remote Database in Atoti Market Risk's expected Database format
- Set the Atoti Market Risk configuration properties
- Deploy Atoti Market Risk in Horizontal Distribution
Database Schema
Your remote Database must be configured in the same format as Atoti Market Risk's in-memory datastores. This means the same Tables and Field Types need to be replicated. You can do this by defining the Tables as outlined in the Database documentation, or by exposing Views on your Database in the expected format.
note
All of the Database Tables must be present either as actual Tables or Views on your connected Database.
Required DirectQuery Properties
The following properties must be configured to get started with DirectQuery. DirectQuery only supports scalar mode for the Sensitivities Cube. In addition, the Summary Cubes can't be activated with DirectQuery, so they must be disabled:
mr:
  enable:
    data-model:
      # Set the sensitivity cube to scalar mode
      scalar-sensitivities: true
    cubes:
      # Disable all summary cubes
      var-summary: false
      pnl-summary: false
      sensi-summary: false
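The same configuration can be written in the .properties format, using Spring Boot's standard relaxed binding of the nested YAML keys:
# Set the sensitivity cube to scalar mode
mr.enable.data-model.scalar-sensitivities=true
# Disable all summary cubes
mr.enable.cubes.var-summary=false
mr.enable.cubes.pnl-summary=false
mr.enable.cubes.sensi-summary=false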
Snowflake Configuration
Properties for enabling and connecting to a remote database can be defined using either the YAML or standard property formats. These properties can be used in .yaml and .properties files, or on the command line.
# Snowflake Database Connection Parameters
directquery:
  enabled: true
  database:
    type: snowflake
    snowflake:
      connectionString: YOUR_JDBC_SNOWFLAKE_CONNECTION_STRING
      username: YOUR_USERNAME
      password: YOUR_PASSWORD
      warehouse: YOUR_WAREHOUSE
      feedingWarehouse: YOUR_FEEDING_WAREHOUSE
      database: YOUR_DATABASE
      schema: YOUR_SCHEMA
      role: YOUR_ROLE
      arrayAggWrapperFunctionName: YOUR_SPECIFIC_ARRAY_AGGREGATE_FUNCTION
      additionalOptions:
        anSFSessionProperty: YOUR_EXTRA_PROPERTY
The properties above populate the SnowflakeSpringProperties Java class.
connectionString
Full property name: directquery.database.snowflake.connectionString
This is the JDBC Driver connection string that points to the Snowflake Database. For details on how to find your JDBC connection string, see the JDBC Driver Snowflake documentation.
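As an illustrative sketch (the account identifier below is a placeholder, not a real account), a Snowflake JDBC connection string generally takes the form:
directquery.database.snowflake.connectionString=jdbc:snowflake://myaccount.snowflakecomputing.com/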
warehouse
Full property name: directquery.database.snowflake.warehouse
The warehouse is where queries and aggregations take place. You can list the available warehouses with the SHOW WAREHOUSES command.
feedingWarehouse
Full property name: directquery.database.snowflake.feedingWarehouse
A special warehouse used during the feeding phase.
database
Full property name: directquery.database.snowflake.database
The Snowflake Database you will be connecting to.
schema
Full property name: directquery.database.snowflake.schema
The Schema within the specified Database to use. This Schema should match Atoti Market Risk’s expected Database Structure either by having the same Table structure or through the use of views.
role
Full property name: directquery.database.snowflake.role
The Role used when accessing the specified Database. This Role must allow querying the specified Schema and using the specified Warehouse.
arrayAggWrapperFunctionName
Full property name: directquery.database.snowflake.arrayAggWrapperFunctionName
Optional name of the user-defined function that can be used to wrap the ARRAY_AGG operations and improve performance. This function must be defined in Snowflake and be accessible to the role running the queries.
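For example, in the .properties format (MY_ARRAY_AGG_WRAPPER is a hypothetical function name; it must exist in Snowflake and be accessible to the querying Role):
directquery.database.snowflake.arrayAggWrapperFunctionName=MY_ARRAY_AGG_WRAPPER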
additionalOptions
Full property name: directquery.database.snowflake.additionalOptions
Any connection option defined in the Snowflake documentation, represented as key/value pairs.
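For example, in the .properties format, the session property from the YAML example above (anSFSessionProperty is purely illustrative) would be passed as:
directquery.database.snowflake.additionalOptions.anSFSessionProperty=YOUR_EXTRA_PROPERTY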
Deployment Options
We provide two main options to run with DirectQuery:
| Option | How to run Atoti Market Risk |
|---|---|
| Operate with some data loaded in-memory and the rest available through DirectQuery | In Horizontal Distribution with an in-memory Database |
| Run purely on DirectQuery remote data | In a single JVM on DirectQuery only |
Horizontal Distribution
In Horizontal Distribution you have access to in-memory tools such as What-If, data updates, and Sign-Off for the data loaded in-memory, as well as access to a large number of historical dates.
When running in Horizontal Distribution you need to run three Nodes:
- Query node
- In-memory data node
- DirectQuery data node
mr.data-load.initial-business-dates
For both the combined in-memory/query node and the DirectQuery data node, set the mr.data-load.initial-business-dates property.
Because the data nodes are distributed by AsOfDate, no two data nodes can contain the same Partition: an AsOfDate present in one node cannot be present in another. To prevent overlaps, the directQueryDatesToIncludeFilter bean sets which dates are included in the DirectQuery data node.
note
This property must be provided with the same values to both data nodes.
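As a sketch, assuming the property accepts a comma-separated list of ISO dates (the dates below are purely illustrative), both data nodes would be started with the same value:
mr.data-load.initial-business-dates=2023-09-26,2023-09-27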
JVM With Query Node And In-Memory Data Node
This JVM consists of two distributed nodes: a query node and an in-memory data node.
Start the JVM by specifying the following parameters to the Atoti Market Risk application, either in a .properties file or through command-line arguments (add -D before each property).
See Activating and de-activating cubes for details about the enabling/disabling of cubes.
# Disable cubes
mr.enable.cubes.<cube>: false
# Set MR to run this JVM with a query node and in-memory data node
spring.profiles.active=dist-data-node,dist-query-node
JVM With DirectQuery Data Node
Once the JVM with the query node and in-memory data node is running, you can start a second JVM with the DirectQuery data node using the following configuration properties, either in a .properties file or through command-line arguments (add -D before each property).
# Set MR to run as a data node and to connect to the first JVM
spring.profiles.active=dist-data-node
# Use a different port from the first JVM
server.port=10011
# Use a different Content Server for this JVM
mr.application.content-service.db.url=jdbc:h2:mem:content_service;DB_CLOSE_DELAY=-1
mr.application.content-service.db.hibernate.hbm2ddl.auto=create
# Disable cubes
mr.enable.cubes.<cube>: false
# Enable and configure the Snowflake DirectQuery Database
directquery.enabled=true
directquery.database.type=snowflake
directquery.database.snowflake.username=database_username
directquery.database.snowflake.password=database_password
directquery.database.snowflake.connectionString=connection_string
directquery.database.snowflake.warehouse=snowflake_warehouse
directquery.database.snowflake.database=snowflake_database
directquery.database.snowflake.schema=snowflake_database_schema
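These properties can equally be supplied on the command line with -D prefixes. A minimal sketch, assuming the application jar is named mr-application.jar (the jar name is an assumption for illustration):
java -Dspring.profiles.active=dist-data-node -Dserver.port=10011 -Ddirectquery.enabled=true -Ddirectquery.database.type=snowflake -jar mr-application.jar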
Single JVM
You can run a single JVM consisting of only DirectQuery.
When running in a single JVM with DirectQuery only, the DirectQuery data is available in the VaR/ES cube.
Properties
To run the single-JVM node, you only need to configure the Snowflake properties, disable the relevant cubes, and run the application as normal. There are no distributed nodes to configure when running in a single JVM.
Here is an example of the properties to use:
See Activating and de-activating cubes for details about the enabling/disabling of cubes.
# Disable cubes
mr.enable.cubes.<cube>: false
# Enable and configure the Snowflake DirectQuery Database
directquery.enabled=true
directquery.database.type=snowflake
directquery.database.snowflake.username=database_username
directquery.database.snowflake.password=database_password
directquery.database.snowflake.connectionString=connection_string
directquery.database.snowflake.warehouse=snowflake_warehouse
directquery.database.snowflake.database=snowflake_database
directquery.database.snowflake.schema=snowflake_database_schema
Single JVM Limitations
By running DirectQuery with a single JVM, you will be running purely on DirectQuery. This comes at the cost of query performance for Trade-level queries, and in-memory-only tools such as What-If and Sign-Off are not available.
note
Running both an in-memory data node and DirectQuery data node under a single JVM is not currently supported.
Reference Snowflake Database Schema
Atoti Market Risk ships with the SQL and data needed to initialize a Snowflake database that can be used for testing. The SQL script and reference data are located in the directory: mr-directquery/src/test/resources/databases/snowflake/.