
Deployment and Maintenance / Configuration and Data Files

This article describes where DataFlux Func stores the configuration and data files it requires.

The configuration and data files required to run DataFlux Func are saved on the host machine by default and can be viewed there directly.

By default, the installation directory is `/usr/local/dataflux-func`.

1. Installation Directory Configuration

After the initial installation of DataFlux Func, the installation script records the installation directory so that subsequent upgrades are installed to the same path. This record is stored at the following location:

| Environment | Location |
| --- | --- |
| Host | `/etc/dataflux-func` |

2. Docker Stack Configuration

By default, the Docker Stack configuration file storage location is as follows:

| Environment | Location |
| --- | --- |
| Host | `{installation directory}/docker-stack.yaml` |
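After editing `docker-stack.yaml`, the stack must be redeployed for the changes to take effect. A minimal sketch is shown below; the stack name `dataflux-func` and the installation path are assumptions, so check your actual stack name with `docker stack ls` first:

```
# Redeploy the stack so that changes to docker-stack.yaml take effect
# (stack name and path are illustrative -- verify with `docker stack ls`)
sudo docker stack deploy dataflux-func -c /usr/local/dataflux-func/docker-stack.yaml
```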

3. DataFlux Func Configuration

By default, the configuration file storage location is as follows:

| Environment | Location |
| --- | --- |
| Container | `/data/user-config.yaml` |
| Host | `{installation directory}/data/user-config.yaml` |
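As a rough illustration, `user-config.yaml` holds connection settings for the backing services. The fragment below is a sketch only: the exact key names and values vary between DataFlux Func versions and deployments, so consult the file shipped with your installation rather than copying this verbatim:

```yaml
# Illustrative fragment -- key names may differ between versions
MYSQL_HOST    : mysql
MYSQL_PORT    : 3306
MYSQL_USER    : root
MYSQL_PASSWORD: '********'
MYSQL_DATABASE: dataflux_func

REDIS_HOST: redis
REDIS_PORT: 6379
```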

4. DataFlux Func Logs

By default, the log file storage location is as follows:

| Environment | Location |
| --- | --- |
| Container | `/data/logs/` |
| Host | `{installation directory}/data/logs/` |

By default, log files are automatically rotated and compressed according to the logrotate configuration. The logrotate configuration file location is as follows:

| Environment | Location |
| --- | --- |
| Host | `/etc/logrotate.d/dataflux-func` |
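For orientation, a logrotate policy for this kind of setup might look like the following sketch. The log file name, rotation count, and options here are illustrative assumptions, not the shipped defaults; inspect `/etc/logrotate.d/dataflux-func` on your host for the actual configuration:

```
/usr/local/dataflux-func/data/logs/*.log {
    daily
    rotate 7
    compress
    delaycompress
    missingok
    notifempty
    copytruncate
}
```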

5. DataFlux Func Database Backup

By default, DataFlux Func automatically backs up its database using mysqldump. Backups are stored at the following location:

| Environment | Location |
| --- | --- |
| Container | `/data/sqldump/` |
| Host | `{installation directory}/data/sqldump/` |

6. DataFlux Func Resource Directory

By default, files and data generated during the operation of DataFlux Func are saved in the resource directory. The storage location is as follows:

| Environment | Location |
| --- | --- |
| Container | `/data/resources/` |
| Host | `{installation directory}/data/resources/` |

The resource directory may contain the following subdirectories:

| Host Location | Description |
| --- | --- |
| `{installation directory}/data/resources/extra-python-packages/` | Python packages installed via pip |
| `{installation directory}/data/resources/user-python-packages/` | Locally used Python packages; users can upload Python packages and import them in scripts |
| `{installation directory}/data/resources/pre-run-scripts/` | Pre-run scripts; users can upload Bash scripts that run automatically when DataFlux Func restarts (refer to Script Development / Pre-run Scripts) |
| `{installation directory}/data/resources/script-market/` | Local temporary files for the Script Market |
| `{installation directory}/data/resources/.downloads/` | Temporary directory for downloads |
| `{installation directory}/data/resources/.uploads/` | Temporary directory for uploads |

Developers and users can also store other required resource files in the resource directory for use in scripts.
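As a minimal sketch of how a script might address such files, paths can be joined against the container mount point `/data/resources`. The helper below is hypothetical, not part of the DataFlux Func API; the environment-variable override is only there to make local testing easier:

```python
import os

# Inside the container the resource directory is mounted at /data/resources;
# allow overriding the base path (e.g. for local testing).
RESOURCE_DIR = os.environ.get('RESOURCE_DIR', '/data/resources')

def resource_path(*parts):
    """Build an absolute path to a file inside the resource directory."""
    return os.path.join(RESOURCE_DIR, *parts)

# A script could then open a data file that a user uploaded to the
# resource directory, e.g.:
#   with open(resource_path('my-data', 'cities.csv')) as f:
#       ...
print(resource_path('my-data', 'cities.csv'))
```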

7. Built-in Redis Data Directory

If your DataFlux Func uses the built-in Redis, the data storage location is as follows:

| Environment | Location |
| --- | --- |
| Host | `{installation directory}/redis/` |

8. Built-in MySQL Data Directory

If your DataFlux Func uses the built-in MySQL, the data storage location is as follows:

| Environment | Location |
| --- | --- |
| Host | `{installation directory}/mysql/` |

8.1 Database Table Data

DataFlux Func uses the following database tables:

| Table Name | Data | Remarks |
| --- | --- | --- |
| biz_main_api_auth | API authentication | |
| biz_main_blueprint | Blueprint | |
| biz_main_connector | Connector | |
| biz_main_cron_job | Cron job | |
| biz_main_env_variable | Environment variables | |
| biz_main_file_service | File service | |
| biz_main_func_store | Function storage | |
| biz_main_func | Function | |
| biz_main_func_aou | Func API | |
| biz_main_operation_record | Operation records | |
| biz_main_script_market | Script Market | |
| biz_main_script_publish_history | Script publish history | |
| biz_main_script_recover_point | Script restore point | |
| biz_main_script_set_export_history | Export history | |
| biz_main_script_set_import_history | Import history | |
| biz_main_script_set | Script set | |
| biz_main_script | Script | |
| biz_main_task_record_func | Task records (function) | |
| biz_main_task_record | Task records | |
| wat_main_access_key | OpenAPI access key | |
| wat_main_system_setting | System settings | |
| wat_main_user | User | |
| biz_main_sync_api | Sync API | Legacy table |
| biz_main_async_api | Async API | Legacy table |
| biz_main_auth_link | Auth link | Legacy table; superseded by biz_main_sync_api |
| biz_main_crontab_config | Auto-trigger configuration | Legacy table; superseded by biz_main_cron_job |
| biz_main_batch | Batch | Legacy table; superseded by biz_main_async_api |
| biz_main_batch_task_info | Batch task info | Legacy table; deprecated in the latest version |
| biz_main_crontab_task_info | Auto-trigger task info | Legacy table; deprecated in the latest version |
| biz_main_script_failure | Script failure info | Legacy table; deprecated in the latest version |
| biz_main_script_log | Script log info | Legacy table; deprecated in the latest version |
| biz_main_task_info | Task info | Legacy table; deprecated in the latest version |
| biz_main_task_result_dataflux_func | DataFlux Func task result | Legacy table; deprecated in the latest version |
| biz_rel_func_running_info | Function execution info | Legacy table; deprecated in the latest version |
| wat_main_task_result_example | Example task records | Legacy table; deprecated in the latest version |

Some tables can grow very large

After heavy usage, the task records (function) table biz_main_task_record_func can accumulate tens of GB of task record data.

To disable local function task records, refer to Deployment and Maintenance / System Metrics and Task Records / Disable Local Function Task Records.
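To check which tables are actually consuming space before deciding what to clean up, a standard MySQL query against `information_schema` (not specific to DataFlux Func) can be run in the database:

```sql
-- Approximate on-disk size per table in the current database, largest first
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM   information_schema.tables
WHERE  table_schema = DATABASE()
ORDER  BY size_mb DESC
LIMIT  10;
```

Note that `data_length` and `index_length` are estimates maintained by the storage engine, so the reported sizes are approximate.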