# 0.9.0 (Sep 13, 2024)
## Added
- `atoti-directquery-jdbc` to connect to an external database through JDBC.
- `data_model_transaction()`. Batching measure creation with a data model transaction has the same performance as using `Measures.update()` without being limited to independent measures:

  ```diff
  - m.update({"foo": 13, "bar": 42})
  - m.update({"foo + 1": m["foo"] + 1, "bar + 1": m["bar"] + 1})
  + with session.data_model_transaction():
  +     m["foo"] = 13
  +     m["foo + 1"] = m["foo"] + 1
  +     m["bar"] = 42
  +     m["bar + 1"] = m["bar"] + 1
  ```

  Data model transactions also replace the private API relying on `atoti.MeasureMetadata`:

  ```diff
  - m["foo"] = (13, tt.MeasureMetadata(visible=True))
  - m["bar"] = (42, tt.MeasureMetadata(description="The answer"))
  + with session.data_model_transaction():
  +     m["foo"] = 13
  +     m["foo"].visible = True
  +     m["bar"] = 42
  +     m["bar"].description = "The answer"
  ```
- `atoti_directquery_redshift.ConnectionConfig.connection_pool_size`.
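  For instance, a minimal sketch of sizing the pool when connecting to Redshift; the URL is a placeholder, and passing `connection_pool_size` as a keyword argument (like the other config attributes shown in these notes) is an assumption:

  ```python
  import atoti as tt
  from atoti_directquery_redshift import ConnectionConfig

  session = tt.Session.start()
  connection_config = ConnectionConfig(
      url="jdbc:redshift://example.redshift.amazonaws.com:5439/db?user=...&password=...",  # placeholder
      connection_pool_size=8,  # new in 0.9.0
  )
  external_database = session.connect_to_external_database(connection_config)
  ```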
### User interface
- Filters tool in the sidebar of the JupyterLab extension to see default filters.
## Changed
### Packaging
- The `atoti` package and most of its plugins (e.g. `atoti-aws`, the `atoti-directquery-*` plugins, `atoti-kafka`, etc.) have been split into `atoti-client-*` and `atoti-server-*` packages. The `atoti-client-*` packages contain the Python code composing the API while the `atoti-server-*` packages mostly contain the JARs implementing the corresponding features. The advanced installation section explains the goal of this split.
- The `atoti` package still exists but has become empty: it is only there to provide a convenient way to install both client and server packages. For instance:
  - `pip install atoti` will install both `atoti-client` and `atoti-server`. It will actually also install `jdk4py`. Note: `jdk4py` is not a dependency of `atoti-server`, so that projects willing to use another JDK can avoid installing `jdk4py` by depending on `atoti-client` and `atoti-server` directly.
  - `pip install "atoti[aws]"` will install `atoti-client`, `atoti-aws-client`, `atoti-aws-server`, and `atoti-server`.
  - `pip install "atoti[jupyterlab]"` will install `atoti-client`, `atoti-server`, and `atoti-jupyterlab` (no client/server split for this package because it only contains frontend assets).
- Because Conda does not support “extras”, the installation of Atoti plugins with this package manager is more complex. For instance, the command to install `atoti` and its AWS plugin is `conda install atoti atoti-client-aws atoti-server-aws`.
### Session start and configuration
- `atoti.Session.__init__()` has been replaced with `atoti.Session.start()` for symmetry with `atoti.Session.connect()` (the Removed section below gives more details about the latter method):

  ```diff
  - session = tt.Session()
  + session = tt.Session.start()
  ```
- The top-level config parameters have been grouped into a `SessionConfig` dataclass, providing better error reporting and allowing code reuse:

  ```diff
  - session = tt.Session(port=1337)
  + session = tt.Session.start(tt.SessionConfig(port=1337))
  ```
- `atoti.Session.__init__()`'s authentication parameter has been replaced with `atoti.SessionConfig.security`:

  ```diff
    config = tt.OidcConfig(...)
  - tt.Session(authentication=config)
  + tt.Session.start(tt.SessionConfig(security=tt.SecurityConfig(sso=config)))
  ```
- `atoti.UserContentStorageConfig` has been moved to `atoti_jdbc.UserContentStorageConfig`:

  ```diff
  - config = tt.UserContentStorageConfig(url=url)
  - tt.Session(user_content_storage=config)
  + from atoti_jdbc import UserContentStorageConfig
  + config = UserContentStorageConfig(url)
  + tt.Session.start(tt.SessionConfig(user_content_storage=config))
  ```

  This makes it obvious that storing user content in an external database requires `atoti-jdbc` to be installed.
### DirectQuery
- The DirectQuery `*ConnectionInfo` and `*TableOptions` classes have been renamed `ConnectionConfig` and `TableConfig`:

  ```diff
  - from atoti_directquery_clickhouse import ClickhouseConnectionInfo
  + from atoti_directquery_clickhouse import ConnectionConfig
  ```

  ```diff
  - from atoti_directquery_clickhouse import ClickhouseTableOptions
  + from atoti_directquery_clickhouse import TableConfig
  ```
- The `cache` attribute controlling whether DirectQuery connections should use caching has been moved from the connection instance to the connection config:

  ```diff
  - from atoti_directquery_snowflake import SnowflakeConnectionInfo
  + from atoti_directquery_snowflake import ConnectionConfig
  - connection_config = SnowflakeConnectionInfo(url=...)
  + connection_config = ConnectionConfig(url=..., cache=True)
    external_database = session.connect_to_external_database(connection_config)
  - external_database.cache = True
  ```
- The `DatabricksConnectionInfo.heavy_load_url` attribute has been renamed `feeding_url`.
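  A minimal sketch under the new naming; the `atoti_directquery_databricks` module path, the placeholder URLs, and passing `feeding_url` as a keyword argument are assumptions:

  ```python
  from atoti_directquery_databricks import ConnectionConfig  # assumed module path

  connection_config = ConnectionConfig(
      url="jdbc:databricks://...",          # placeholder
      feeding_url="jdbc:databricks://...",  # placeholder; was heavy_load_url before 0.9.0
  )
  ```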
### Other
- Upgraded the `jdk4py` dependency to 21.0.4, which adds support for Linux Arm64.
- `atoti_aws.AwsKeyPair`, `atoti_aws.AwsKmsConfig`, and `atoti_azure.AzureKeyPair` have been renamed `KeyPair`, `KmsConfig`, and `KeyPair` respectively.
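  A sketch of the corresponding import change (the alias is only needed when both plugins are used in the same module):

  ```python
  # Before 0.9.0:
  # from atoti_aws import AwsKeyPair, AwsKmsConfig
  # from atoti_azure import AzureKeyPair

  # From 0.9.0 on:
  from atoti_aws import KeyPair, KmsConfig
  from atoti_azure import KeyPair as AzureKeyPair
  ```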
- The `atoti-sql` package has been renamed `atoti-jdbc`.
- `atoti.Session.explain_mdx_query()` and `atoti.Cube.explain_query()` have been replaced with an explain parameter to `atoti.Session.query_mdx()` and `atoti.Cube.query()`:

  ```diff
  - session.explain_mdx_query(mdx)
  + session.query_mdx(mdx, explain=True)
  ```

- `atoti.Table.keys` returns a `tuple` instead of a `list`. It communicates that keys cannot be changed once the table exists.
- `create_cube()`'s base_table parameter has been renamed fact_table.
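  A minimal sketch of this rename and of the `atoti.Table.keys` change above; the table name, columns, and data types are only illustrative:

  ```python
  import atoti as tt

  session = tt.Session.start()
  table = session.create_table(
      "Trades",
      data_types={"Id": "String", "Price": "double"},
      keys={"Id"},
  )

  assert isinstance(table.keys, tuple)  # keys can no longer be changed once the table exists

  cube = session.create_cube(table)  # the parameter is now named fact_table (was base_table)
  ```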
### User interface
- Upgraded Atoti UI and Admin UI to 5.2.0.
## Deprecated
- `atoti.Session.port`. Use `atoti.Session.url` instead:

  ```diff
  - url = f"http://localhost:{session.port}"
  + url = session.url
  ```

  ```diff
  - port = session.port
  + from urllib.parse import urlparse
  + port = urlparse(session.url).port
  ```
- `atoti.Session.start_transaction()`. Use `atoti.tables.Tables.data_transaction()` instead:

  ```diff
  - with session.start_transaction(): ...
  + with session.tables.data_transaction(): ...
  ```
- `atoti.Session.security.basic`. Use `basic_authentication` instead:

  ```diff
  - session.security.basic.credentials
  + session.security.basic_authentication.credentials
  ```
- `atoti.Table.columns` and `atoti.ExternalTable.columns`. Use `list(table)` to list column names and `for column_name in table: ...` to iterate over column names:

  ```diff
  - column_names = table.columns
  + column_names = list(table)
  ```
- `atoti.Hierarchy.levels`. Iterate on the `Hierarchy` instead:

  ```diff
  - level_names = list(h["Geography"].levels)
  + level_names = list(h["Geography"])
  ```

  ```diff
  - h["Date parts"] = {**h["Date parts"].levels, "Date": table["Date"]}
  + h["Date parts"] = {**h["Date parts"], "Date": table["Date"]}
  ```
## Removed
- Support for Java 17, 18, 19, and 20.
- The `atoti-query` package and its `QuerySession` class. Use `atoti.Session.connect()` instead:

  ```diff
  - pip install atoti-query
  + pip install atoti-client
  ```

  ```diff
  - from atoti_query import QuerySession
  + import atoti as tt
  - existing_session = QuerySession(url)
  + existing_session = tt.Session.connect(url)
    existing_session.query_mdx(...)
  ```
- `atoti.Table.__len__()`. It was ambiguous because it could be interpreted as counting either rows or columns. Instead, use `atoti.Table.row_count` to count rows and `len(list(table))` to count columns:

  ```diff
  - row_count = len(table)
  + row_count = table.row_count
  ```
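  A minimal sketch of both replacements, assuming `table` is an existing `atoti.Table`:

  ```python
  row_count = table.row_count      # rows: replaces len(table)
  column_count = len(list(table))  # columns: iterating a table yields its column names
  ```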
### Previously deprecated
- Support for Python 3.9.
- Support for pandas 1.
- Support for JupyterLab 3.
- `atoti.UserServiceClient`.
- `atoti.Cube.schema`.
- Support for passing an `int` to `load_kafka()`'s batch_duration parameter.
- Variadic constructor of `OriginScope`.
- Support for calling `atoti.Session.link()`.
- `atoti.Session.visualize()` and `atoti.QuerySession.visualize()`.
- Support for passing a `Set` to:
  - `atoti.Session()`'s extra_jars or java_options parameters
  - `atoti.Cube.query()`'s levels parameter
  - `atoti.Cube.create_parameter_simulation()`'s levels parameter
  - `atoti.Cube.create_parameter_hierarchy_from_members()`'s members parameter
- Support for passing a `Sequence` to:
  - DirectQuery `TableConfig` classes' clustering_columns attribute
- `atoti.Session.security.users`.
- `atoti.Session.security.basic.users` and `atoti.Session.security.basic.create_user()`.
- `atoti.Session.security.roles` and `atoti.Session.security.create_role()`.
- In-place mutation of `atoti.Security.security.ldap.role_mapping` and `atoti.Security.security.oidc.role_mapping`'s values.
- In-place mutation of `atoti.Security.security.individual_roles`' values.