Configuration

The configuration system consists of two main classes:

  • Config: Defines the configuration model.

  • ConfigLoader: Manages loading and accessing configuration.

Config

class openchatbi.config_loader.Config(*, organization: str = 'The Company', dialect: str = 'presto', default_llm: BaseChatModel | MagicMock, embedding_model: BaseModel | MagicMock, text2sql_llm: BaseChatModel | MagicMock | None = None, bi_config: dict[str, Any] = {}, data_warehouse_config: dict[str, Any] = {}, catalog_store: Any = None, mcp_servers: list[dict[str, Any]] = [], report_directory: str = './data', python_executor: str = 'local', visualization_mode: str | None = 'rule', context_config: dict[str, Any] = {}, timeseries_forecasting_service_url: str = 'http://localhost:8765')[source]

Configuration model for the OpenChatBI application.
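Because only the two model fields lack defaults, a Config can be constructed with just default_llm and embedding_model. A minimal sketch (values are illustrative; MagicMock stand-ins take the place of real LangChain models, which the type annotations explicitly allow):

    from unittest.mock import MagicMock

    from openchatbi.config_loader import Config

    # Only the two model fields are required; everything else falls back
    # to the defaults shown in the signature above.
    config = Config(
        organization="Acme Analytics",  # illustrative, not a shipped default
        default_llm=MagicMock(),        # stand-in for a BaseChatModel
        embedding_model=MagicMock(),    # stand-in for an embedding model
    )
    print(config.dialect)  # "presto" (field default)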

organization

Organization name. Defaults to “The Company”.

Type: str

dialect

SQL dialect to use. Defaults to “presto”.

Type: str

default_llm

Default language model for general tasks.

Type: BaseChatModel

embedding_model

Language model for embedding generation.

Type: BaseModel

text2sql_llm

Language model specifically for text-to-SQL tasks.

Type: Optional[BaseChatModel]

bi_config

BI configuration loaded from YAML file. Defaults to empty dict.

Type: Dict[str, Any]

data_warehouse_config

Data warehouse configuration. Defaults to empty dict.

Type: Dict[str, Any]

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}

Configuration for the model; it should be a dictionary conforming to pydantic's ConfigDict.

visualization_mode: str | None

context_config: dict[str, Any]

timeseries_forecasting_service_url: str

classmethod from_dict(config: dict[str, Any]) → Config[source]

Creates a Config instance from a dictionary.

Parameters: config (Dict[str, Any]) – Dictionary containing configuration values.

Returns: A new Config instance with the provided values.

Return type: Config
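A hedged sketch of from_dict, assuming the dictionary keys map directly onto the field names in the constructor signature (MagicMock stand-ins again take the place of real models):

    from unittest.mock import MagicMock

    from openchatbi.config_loader import Config

    raw = {
        "organization": "Acme Analytics",  # illustrative value
        "dialect": "presto",
        "default_llm": MagicMock(),        # stand-in for a BaseChatModel
        "embedding_model": MagicMock(),    # stand-in for an embedding model
        "bi_config": {},                   # hypothetical BI payload
    }
    config = Config.from_dict(raw)
    assert isinstance(config, Config)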

ConfigLoader

class openchatbi.config_loader.ConfigLoader[source]

Bases: object

Singleton class to load and manage configuration settings for OpenChatBI.

This class provides methods to load, get, and set configuration parameters for the application, including LLM models, SQL dialect, and other settings.

llm_configs = ['default_llm', 'embedding_model', 'text2sql_llm']

get() → Config[source]

Get the current configuration.

Returns: The current configuration instance.

Return type: Config

Raises: ValueError – If the configuration has not been loaded.
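A short sketch of get(), assuming ConfigLoader can be instantiated directly (as a singleton it may hand back a shared instance):

    from openchatbi.config_loader import ConfigLoader

    loader = ConfigLoader()
    try:
        config = loader.get()
        print(config.dialect)
    except ValueError:
        # get() raises ValueError until load() or set() has been called
        print("Configuration not loaded yet")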

load(config_file: str = None) → None[source]

Load configuration from a YAML file.

Parameters: config_file (str, optional) – Path to the configuration file. Uses the CONFIG_FILE environment variable or ‘openchatbi/config.yaml’ if not provided.

Raises:
  • ImportError – If pyyaml is not installed.

  • FileNotFoundError – If the configuration file cannot be found.
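A sketch of the typical load-then-get flow; the YAML path below is illustrative, and when it is omitted load() falls back to the CONFIG_FILE environment variable or ‘openchatbi/config.yaml’:

    from openchatbi.config_loader import ConfigLoader

    loader = ConfigLoader()
    loader.load("path/to/config.yaml")  # hypothetical path
    config = loader.get()
    print(config.organization, config.dialect)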

load_bi_config(bi_config_file: str) → dict[str, Any][source]

Load BI configuration from a YAML file.

Parameters: bi_config_file (str) – Path to the BI configuration file. Defaults to ‘example/bi.yaml’.

Returns: The loaded BI configuration as a dictionary.

Return type: Dict[str, Any]

Raises:
  • ImportError – If pyyaml is not installed.

  • FileNotFoundError – If the BI configuration file cannot be found.
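A sketch of loading the BI-specific YAML on its own; the path mirrors the documented default and is illustrative:

    from openchatbi.config_loader import ConfigLoader

    loader = ConfigLoader()
    bi_config = loader.load_bi_config("example/bi.yaml")
    print(sorted(bi_config.keys()))  # whatever keys the YAML defines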

set(config: dict[str, Any]) → None[source]

Set the configuration from a dictionary.

Parameters: config (Dict[str, Any]) – Dictionary containing configuration values.
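A sketch of setting configuration programmatically, e.g. in tests. This assumes set() accepts the same keys as Config.from_dict; whether the LLM entries must be pre-built model instances or raw YAML-style sub-configs (cf. llm_configs) depends on the implementation, so the MagicMock stand-ins here are an assumption:

    from unittest.mock import MagicMock

    from openchatbi.config_loader import ConfigLoader

    loader = ConfigLoader()
    loader.set({
        "default_llm": MagicMock(),        # assumed: pre-built model instance
        "embedding_model": MagicMock(),    # assumed: pre-built model instance
        "organization": "Acme Analytics",  # illustrative
    })
    print(loader.get().organization)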