Configuring the DLC

All required source dependencies must be included in your project's pom. Check Data Connectors Requirements to ensure your project meets the minimum ActivePivot version requirements, and include one dependency for each source you wish to add. All supported sources are listed in Data Connectors Module Structure.

Here is an example of importing Data Connectors for the CSV and JDBC sources:

    <!-- Data Connectors CSV Source -->
    <dependency>
        <groupId>com.activeviam.io</groupId>
        <artifactId>data-connectors-csv</artifactId>
        <version>${dataconnectors.version}</version>
    </dependency>
    <!-- Data Connectors JDBC Source -->
    <dependency>
        <groupId>com.activeviam.io</groupId>
        <artifactId>data-connectors-jdbc</artifactId>
        <version>${dataconnectors.version}</version>
    </dependency>
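
The `${dataconnectors.version}` property referenced above is assumed to be declared in the `<properties>` section of your pom; the version number below is a placeholder, so substitute the release your project uses:

    <properties>
        <!-- Placeholder: set to the Data Connectors release used by your project -->
        <dataconnectors.version>x.y.z</dataconnectors.version>
    </properties>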

If you do not have the Data Connectors jar file, please contact your project team; it needs to be imported into your Maven repository.

Next, once the required Spring configurations for each data source type (CSV, JDBC, or Kafka/RabbitMQ) are in place (examples further below), they need to be registered with the DataLoadController. The following snippets show how to do that:

Create DataLoadController Bean

    @Bean(destroyMethod = "close")
    public IDataLoadController dataLoadController() {

        final IDataLoadController controller =
                new DataLoadController(datastoreConfig.datastore(), null);
        registerJdbcSourceAndChannels(controller);

        return controller;
    }

Register data source and channels

    private void registerJdbcSourceAndChannels(
            final IDataLoadController controller) {

        final TupleMessageChannelFactory channelFactory = jdbcSourceConfig.jdbcMessageChannelFactory();

        controller.registerSource(jdbcSourceConfig.jdbcSource());
        controller.registerChannel(
                channelFactory.createChannel(
                        JdbcSourceConfig.JDBC_TOPIC__PRODUCT,
                        DatastoreNames.PRODUCT_STORE_NAME),
                Arrays.asList(DatastoreNames.PRODUCT_STORE_NAME),
                new ProductScopeToRemoveWhereConditionConverter());
    }

Remove-where condition for unload

    public class ProductScopeToRemoveWhereConditionConverter
            implements IDataLoadController.IScopeToRemoveWhereConditionConverter {

        @Override
        public ICondition apply(
                final String store,
                final Map<String, Object> scope) {

            final LocalDate cobDate = LocalDate
                    .parse((String) scope.get(DataLoadControllerRestService.SCOPE_KEY__COB_DATE));
            ICondition cobCond = BaseConditions.Equal(COB_DATE, cobDate);
            ICondition productCond = BaseConditions.Not(BaseConditions.Equal(BASE_STORE_PRODUCT_NAME, "N/A"));

            return BaseConditions.And(cobCond, productCond);
        }
    }

Register topic alias

You may group several topics together under an alias, so that the client / data orchestrator can request a single topic alias instead of listing every topic name in the load request. This is useful when submitting a single request to load all the topics.

For example, you may configure an alias “ALL” that groups all the topics together.

    protected void registerTopicAliases(IDataLoadController dataLoadController) {

        dataLoadController.registerTopicAlias(
                TOPIC__ALL,
                Arrays.asList(
                        TOPIC__FX_DATA,
                        TOPIC__ORGANISATION_DATA,
                        TOPIC__TRADE_BOOKING_DATA,
                        TOPIC__TRADES));
    }

To find out how to create a CSV, JDBC, or Messaging source, please refer to the sections below.

The relevant Spring Configuration classes need to be imported in the project’s ApplicationConfig (or the appropriate file) that is the entry point for the Spring “Java Config” of the entire application.
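
As a sketch, such an entry point might look like the following. The set of imported configuration classes is illustrative: `DatastoreConfig` and `JdbcSourceConfig` appear in the snippets above, while `DataLoadControllerConfig` is a hypothetical class holding the `dataLoadController()` bean shown earlier.

    /*
     * Illustrative ApplicationConfig showing how the Data Connectors Spring
     * configuration classes could be imported. Replace the class list with
     * your project's actual configuration classes.
     */
    @Configuration
    @Import(value = {
            DatastoreConfig.class,
            JdbcSourceConfig.class,
            DataLoadControllerConfig.class
    })
    public class ApplicationConfig {
        // Beans declared here or in the imported configuration classes
        // are assembled by Spring into the application context.
    }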
