
Deployment and Maintenance / Configuration and Data Files

This article describes where the configuration and data files used by DataFlux Func are stored.

By default, all configuration and data files required to run DataFlux Func are saved on the host machine, where they can be inspected directly.

By default, the installation directory is /usr/local/dataflux-func.

1. Installation Directory Configuration

After DataFlux Func is first installed, the installation script records the installation directory so that subsequent upgrades are installed into the same path automatically. This record is saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Host | /etc/dataflux-func |

2. Docker Stack Configuration

By default, the Docker Stack configuration file is saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Host | {Installation Directory}/docker-stack.yaml |

3. DataFlux Func Configuration

By default, the DataFlux Func configuration file is saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Container | /data/user-config.yaml |
| Host | {Installation Directory}/data/user-config.yaml |

4. DataFlux Func Logs

By default, log files are saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Container | /data/logs/ |
| Host | {Installation Directory}/data/logs/ |

By default, log files are automatically rotated and compressed according to the logrotate configuration, which is located at:

| Environment | Location |
| ----------- | -------- |
| Host | /etc/logrotate.d/dataflux-func |
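For reference, a logrotate policy for such a log directory typically looks like the following. This is an illustrative sketch, not necessarily the configuration shipped with DataFlux Func; the log file name is assumed:

```
/usr/local/dataflux-func/data/logs/dataflux-func.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    copytruncate
}
```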

5. DataFlux Func Database Backup

By default, DataFlux Func automatically backs up its database using mysqldump. Backups are saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Container | /data/sqldump/ |
| Host | {Installation Directory}/data/sqldump/ |

6. DataFlux Func Resource Directory

By default, all files and data generated while DataFlux Func is running are saved under the resource directory, at the following location:

| Environment | Location |
| ----------- | -------- |
| Container | /data/resources/ |
| Host | {Installation Directory}/data/resources/ |

The resource directory may contain the following contents:

| Host Location | Description |
| ------------- | ----------- |
| {Installation Directory}/data/resources/extra-python-packages/ | Python packages installed via the PIP tool |
| {Installation Directory}/data/resources/user-python-packages/ | User's local Python packages; users can upload their own packages and import them in scripts |
| {Installation Directory}/data/resources/pre-run-scripts/ | Pre-execution scripts; users can upload Bash scripts that run automatically when DataFlux Func restarts (see Script Development / Pre-execution Scripts) |
| {Installation Directory}/data/resources/script-market/ | Temporary local files for the Script Market |
| {Installation Directory}/data/resources/.downloads/ | Temporary directory for downloads |
| {Installation Directory}/data/resources/.uploads/ | Temporary directory for uploads |

Developers and users can also store other resource files they need in the resource directory and read them from within scripts.

7. Built-in Redis Data Directory

If your DataFlux Func deployment uses the built-in Redis, its data is saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Host | {Installation Directory}/redis/ |

8. Built-in MySQL Data Directory

If your DataFlux Func deployment uses the built-in MySQL, its data is saved at the following location:

| Environment | Location |
| ----------- | -------- |
| Host | {Installation Directory}/mysql/ |

8.1 Database Table Data

DataFlux Func uses the following database tables. The table names and their purposes are as follows:

| Table Name | Data | Notes |
| ---------- | ---- | ----- |
| biz_main_api_auth | API authentication | |
| biz_main_async_api | Asynchronous APIs | |
| biz_main_blueprint | Blueprints | |
| biz_main_connector | Connectors | |
| biz_main_cron_job | Scheduled tasks | |
| biz_main_env_variable | Environment variables | |
| biz_main_file_service | File services | |
| biz_main_func_store | Function storage | |
| biz_main_func | Functions | |
| biz_main_operation_record | Operation records | |
| biz_main_script_market | Script Markets | |
| biz_main_script_publish_history | Script publish history | |
| biz_main_script_recover_point | Script recovery points | |
| biz_main_script_set_export_history | Export history | |
| biz_main_script_set_import_history | Import history | |
| biz_main_script_set | Script sets | |
| biz_main_script | Scripts | |
| biz_main_sync_api | Synchronous APIs | |
| biz_main_task_record_func | Task records (functions) | |
| biz_main_task_record | Task records | |
| wat_main_access_key | OpenAPI access keys | |
| wat_main_system_setting | System settings | |
| wat_main_user | Users | |
| biz_main_auth_link | Authorization links | Deprecated; replaced by biz_main_sync_api |
| biz_main_crontab_config | Auto-trigger configurations | Deprecated; replaced by biz_main_cron_job |
| biz_main_batch | Batch processing | Deprecated; replaced by biz_main_async_api |
| biz_main_batch_task_info | Batch task information | Deprecated; no longer used in recent versions |
| biz_main_crontab_task_info | Auto-trigger task information | Deprecated; no longer used in recent versions |
| biz_main_script_failure | Script failure information | Deprecated; no longer used in recent versions |
| biz_main_script_log | Script log information | Deprecated; no longer used in recent versions |
| biz_main_task_info | Task information | Deprecated; no longer used in recent versions |
| biz_main_task_result_dataflux_func | DataFlux Func task results | Deprecated; no longer used in recent versions |
| biz_rel_func_running_info | Function execution information | Deprecated; no longer used in recent versions |
| wat_main_task_result_example | Example task records | Deprecated; no longer used in recent versions |

Some table data may become very large

Under heavy usage, the task records (functions) table biz_main_task_record_func can accumulate tens of GB of data.

To disable local function task records, refer to Deployment and Maintenance / System Metrics and Task Records / Disable Local Function Task Records.
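To check how much space biz_main_task_record_func currently occupies, a query against the standard MySQL information_schema can be used (a sketch; the schema name of your deployment's database may differ):

```sql
-- On-disk size of the task record table, in GB.
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024 / 1024, 2) AS size_gb
FROM information_schema.tables
WHERE table_name = 'biz_main_task_record_func';
```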