
Deployment and Maintenance / Configuration and Data Files

This article describes where DataFlux Func stores the configuration and data files it needs.

The configuration and data files required by DataFlux Func are saved on the host machine by default and can be viewed there directly.

By default, the installation directory is `/usr/local/dataflux-func`.

1. Installation Directory Configuration

After the initial installation of DataFlux Func, the installation script records the installation directory so that subsequent upgrades are automatically installed to the same path. The record file is located as follows:

| Environment | Location |
| --- | --- |
| Host | `/etc/dataflux-func` |
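As a minimal sketch for maintenance tooling, the recorded path can be read back programmatically; falling back to the documented default when the record file is absent is an assumption, not installer behavior:

```python
from pathlib import Path

# The installer records the chosen installation directory in this file;
# fall back to the documented default when the file does not exist.
RECORD_FILE = Path('/etc/dataflux-func')
DEFAULT_DIR = '/usr/local/dataflux-func'

install_dir = RECORD_FILE.read_text().strip() if RECORD_FILE.exists() else DEFAULT_DIR
print(install_dir)
```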

2. Docker Stack Configuration

By default, the Docker Stack configuration file location is as follows:

| Environment | Location |
| --- | --- |
| Host | `{installation directory}/docker-stack.yaml` |
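For orientation only, a Docker Stack file generally follows Compose v3 syntax. The sketch below is an illustration with assumed values (the service name, image reference, port, and volume mapping are placeholders); always edit the generated file rather than recreating it from this example:

```yaml
# Illustrative sketch only -- names and values are assumptions.
version: '3.1'
services:
  server:
    image: dataflux-func:latest   # actual image name comes from the generated file
    volumes:
      - /usr/local/dataflux-func/data:/data
    ports:
      - '8088:8088'
```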

3. DataFlux Func Configuration

By default, the configuration file location is as follows:

| Environment | Location |
| --- | --- |
| Container | `/data/user-config.yaml` |
| Host | `{installation directory}/data/user-config.yaml` |
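The exact keys in `user-config.yaml` depend on your installation; the fragment below is only an illustrative sketch with assumed field names, not an authoritative schema. Consult the generated file for the real keys:

```yaml
# Field names below are illustrative assumptions.
MYSQL_HOST: mysql
MYSQL_PORT: 3306
MYSQL_USER: root
MYSQL_PASSWORD: '<password>'
MYSQL_DATABASE: dataflux_func
REDIS_HOST: redis
REDIS_PORT: 6379
```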

4. DataFlux Func Logs

By default, the log file location is as follows:

| Environment | Location |
| --- | --- |
| Container | `/data/logs/` |
| Host | `{installation directory}/data/logs/` |

By default, log files are automatically rotated and compressed according to the logrotate configuration. The logrotate configuration file location is as follows:

| Environment | Location |
| --- | --- |
| Host | `/etc/logrotate.d/dataflux-func` |
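For reference, a logrotate configuration for these logs typically contains directives like the following; the shipped configuration may differ, so treat this only as a sketch of what to expect when inspecting the file:

```
# Typical logrotate directives; the actual shipped configuration may differ.
/usr/local/dataflux-func/data/logs/*.log {
    daily
    rotate 7
    compress
    delaycompress
    missingok
    notifempty
    copytruncate
}
```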

5. DataFlux Func Database Backup

By default, the DataFlux Func database is automatically backed up using mysqldump. The backups are stored in the following location:

| Environment | Location |
| --- | --- |
| Container | `/data/sqldump/` |
| Host | `{installation directory}/data/sqldump/` |
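To verify that backups are being produced, a small sketch like the following can list the newest dump files. The `.sql` extension and the default path are assumptions; adjust them to match your installation:

```python
import glob
import os

# Host path under the default installation directory; override via $DUMP_DIR.
DUMP_DIR = os.environ.get('DUMP_DIR', '/usr/local/dataflux-func/data/sqldump')

# List backup files, newest first, to confirm backups are being produced.
dumps = sorted(glob.glob(os.path.join(DUMP_DIR, '*.sql')),
               key=os.path.getmtime, reverse=True)
for path in dumps[:5]:
    print(path)
```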

6. DataFlux Func Resource Directory

By default, files and data generated during the operation of DataFlux Func are saved in the resource directory, and the location is as follows:

| Environment | Location |
| --- | --- |
| Container | `/data/resources/` |
| Host | `{installation directory}/data/resources/` |

The resource file directory may include the following contents:

| Host Location | Description |
| --- | --- |
| `{installation directory}/data/resources/extra-python-packages/` | Python packages installed via the pip tool |
| `{installation directory}/data/resources/user-python-packages/` | Locally used Python packages. Users can upload Python packages themselves and import them in scripts |
| `{installation directory}/data/resources/pre-run-scripts/` | Pre-execution scripts. Users can upload Bash scripts themselves, which run automatically when DataFlux Func restarts. For details, refer to Script Development / Pre-execution Scripts |
| `{installation directory}/data/resources/script-market/` | Temporary files for the Script Market |
| `{installation directory}/data/resources/.downloads/` | Temporary directory for downloads |
| `{installation directory}/data/resources/.uploads/` | Temporary directory for uploads |

Developers and users can also store other resource files in the resource directory and read them from their scripts.
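A minimal sketch of how a script might build paths into the resource directory. The `RESOURCE_DIR` environment override and the example file names are hypothetical; only the container path `/data/resources` comes from this article:

```python
import os

# Inside the container the resource directory is /data/resources; on the
# host it is {installation directory}/data/resources.
RESOURCE_DIR = os.environ.get('RESOURCE_DIR', '/data/resources')

def resource_path(*parts):
    """Build an absolute path to a file under the resource directory."""
    return os.path.join(RESOURCE_DIR, *parts)

# A script could then, for example, open resource_path('lookup', 'rules.json').
```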

7. Built-in Redis Data Directory

If your DataFlux Func uses the built-in Redis, the data location is as follows:

| Environment | Location |
| --- | --- |
| Host | `{installation directory}/redis/` |

8. Built-in MySQL Data Directory

If your DataFlux Func uses the built-in MySQL, the data location is as follows:

| Environment | Location |
| --- | --- |
| Host | `{installation directory}/mysql/` |

8.1 Database Table Data

The following database tables are used in DataFlux Func, and their names and purposes are as follows:

| Table Name | Data | Notes |
| --- | --- | --- |
| `biz_main_api_auth` | API Authentication | |
| `biz_main_async_api` | Async API | |
| `biz_main_blueprint` | Blueprint | |
| `biz_main_connector` | Connector | |
| `biz_main_cron_job` | Cron Job | |
| `biz_main_env_variable` | Environment Variables | |
| `biz_main_file_service` | File Service | |
| `biz_main_func_store` | Function Storage | |
| `biz_main_func` | Function | |
| `biz_main_operation_record` | Operation Records | |
| `biz_main_script_market` | Script Market | |
| `biz_main_script_publish_history` | Script Publish History | |
| `biz_main_script_recover_point` | Script Recovery Point | |
| `biz_main_script_set_export_history` | Export History | |
| `biz_main_script_set_import_history` | Import History | |
| `biz_main_script_set` | Script Set | |
| `biz_main_script` | Script | |
| `biz_main_sync_api` | Sync API | |
| `biz_main_task_record_func` | Task Records (Function) | |
| `biz_main_task_record` | Task Records | |
| `wat_main_access_key` | OpenAPI Access Key | |
| `wat_main_system_setting` | System Settings | |
| `wat_main_user` | User | |
| `biz_main_auth_link` | Auth Link | Legacy table; replaced by `biz_main_sync_api` |
| `biz_main_crontab_config` | Automatic Trigger Configuration | Legacy table; replaced by `biz_main_cron_job` |
| `biz_main_batch` | Batch | Legacy table; replaced by `biz_main_async_api` |
| `biz_main_batch_task_info` | Batch Task Information | Legacy table; deprecated in the latest version |
| `biz_main_crontab_task_info` | Automatic Trigger Task Information | Legacy table; deprecated in the latest version |
| `biz_main_script_failure` | Script Failure Information | Legacy table; deprecated in the latest version |
| `biz_main_script_log` | Script Log Information | Legacy table; deprecated in the latest version |
| `biz_main_task_info` | Task Information | Legacy table; deprecated in the latest version |
| `biz_main_task_result_dataflux_func` | DataFlux Func Task Result | Legacy table; deprecated in the latest version |
| `biz_rel_func_running_info` | Function Execution Information | Legacy table; deprecated in the latest version |
| `wat_main_task_result_example` | Example Task Record | Legacy table; deprecated in the latest version |

Some table data may grow very large: after heavy use, the task record (function) table `biz_main_task_record_func` can accumulate tens of GB of task record data.

Refer to Deployment and Maintenance / System Metrics and Task Records / Disable Local Function Task Records to disable "Local Function Task Records".
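To see which tables are consuming the most space, a query like the following against `information_schema` can help. The database name `dataflux_func` is an assumption; substitute the one from your configuration:

```sql
-- Database name 'dataflux_func' is an assumption; substitute your own.
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.tables
WHERE table_schema = 'dataflux_func'
ORDER BY size_mb DESC
LIMIT 10;
```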