
Deployment and Maintenance / Backup and Migration

This article describes how to back up and migrate DataFlux Func data.

1. Database Backup

DataFlux Func automatically backs up its database at regular intervals as .sql files, saving them in the DataFlux Func database backup directory.

Backups are performed with mysqldump, so the MySQL user that DataFlux Func connects as must hold sufficient privileges, including:

  • SELECT
  • RELOAD
  • LOCK TABLES
  • REPLICATION CLIENT
  • SHOW VIEW
  • PROCESS
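As a sketch, the privileges listed above could be granted with a statement along these lines (the account 'func'@'%' is a hypothetical placeholder; substitute the MySQL user DataFlux Func actually connects with):

```sql
-- Hypothetical example: grant the privileges required for backups.
-- Replace 'func'@'%' with the actual account used by DataFlux Func.
GRANT SELECT, RELOAD, LOCK TABLES, REPLICATION CLIENT, SHOW VIEW, PROCESS
  ON *.* TO 'func'@'%';
FLUSH PRIVILEGES;
```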

The content of automatic backups includes:

  1. Complete database table structure
  2. All data except logs and operation records

By default, the location where database backup files are saved is as follows:

| Environment | Location |
| ----------- | -------- |
| Inside containers | /data/sqldump/ |
| On the host | {Installation Directory}/data/sqldump/ |

By default, a backup runs every hour on the hour, and files are retained for up to 7 days (168 files in total). File names follow this pattern:

```text
dataflux-func-sqldump-YYYYMMDD-hhmmss.sql
```
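Because the timestamp is embedded in the file name, the newest backup sorts last lexicographically. A minimal sketch for locating it (the latest_backup helper is our own, not part of DataFlux Func):

```shell
# Hypothetical helper: print the newest backup file in a dump directory.
# Works because the YYYYMMDD-hhmmss timestamp sorts lexicographically.
latest_backup() {
  ls "$1"/dataflux-func-sqldump-*.sql 2>/dev/null | sort | tail -n 1
}
```

For example, latest_backup {Installation Directory}/data/sqldump prints the most recent dump on the host.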

2. Exporting / Importing Databases

In addition to the regular database backups included with DataFlux Func, you can also directly use mysqldump to export and import data.

2.1 Exporting Data

The reference command for exporting operations is as follows:

```bash
docker exec {MySQL CONTAINER ID} sh -c 'exec mysqldump --user=root --password="$MYSQL_ROOT_PASSWORD" --hex-blob --default-character-set=utf8mb4 --skip-extended-insert --databases dataflux_func' > {Backup file on HOST}
```
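Before relying on an export, it is worth confirming the dump actually completed: with default options (as in the command above), mysqldump appends a "-- Dump completed" footer only when it finishes successfully. A small sketch (dump_ok is our own helper, not part of DataFlux Func):

```shell
# Hypothetical helper: succeed only if the dump file carries mysqldump's
# "-- Dump completed" footer, which is written at the end of a successful dump.
dump_ok() {
  tail -n 5 "$1" | grep -q -- '-- Dump completed'
}
```

For example, dump_ok {Backup file on HOST} && echo "dump looks complete".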

2.2 Importing Data

The reference command for importing operations is as follows:

```bash
docker exec -i {MySQL CONTAINER ID} sh -c 'exec mysql -uroot -p"$MYSQL_ROOT_PASSWORD"' < {Backup file on HOST}
```

3. Migrating the Database

If, after the initial single-machine verification stage, you need to switch to external database services (such as Alibaba Cloud RDS for MySQL, or a managed Redis), follow the steps below:

When using an external database, ensure that the MySQL version is 5.7 or higher, and the Redis version is 5.0 or higher

DataFlux Func does not support clustered deployment of Redis

3.1 Confirm Environment

Create a database in the external database instance and make sure the following configuration items are set:

  • innodb-large-prefix=on
  • character-set-server=utf8mb4
  • collation-server=utf8mb4_unicode_ci
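These settings can be verified on the external instance before importing, for example with the statements below (note that innodb_large_prefix only exists on MySQL 5.7; it was removed in MySQL 8.0, where its behavior is always enabled):

```sql
-- Check the server-side settings required by DataFlux Func.
SHOW VARIABLES LIKE 'character_set_server';
SHOW VARIABLES LIKE 'collation_server';
SHOW VARIABLES LIKE 'innodb_large_prefix';  -- MySQL 5.7 only; removed in 8.0
```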

3.2 Import Data

Find the most recent MySQL database backup file from the above "Database Backup" section and import it into the external database.

3.3 Modify DataFlux Func Configuration

Find the DataFlux Func configuration file user-config.yaml and add or modify the following fields to match your environment:

```yaml
# MySQL connection configuration
MYSQL_HOST    : '127.0.0.1'    # MySQL connection address
MYSQL_PORT    : 3306           # MySQL port number
MYSQL_USER    : root           # MySQL username
MYSQL_PASSWORD: ''             # MySQL password
MYSQL_DATABASE: dataflux_func  # MySQL database name

# Redis connection configuration
REDIS_HOST    : '127.0.0.1'    # Redis connection address
REDIS_PORT    : 6379           # Redis port number
REDIS_DATABASE: 5              # Redis database number (Suggestion: To avoid conflicts with other applications, do not use the default database 0)
REDIS_PASSWORD: ''             # Redis password (Leave blank if no password)
REDIS_USE_TLS : false          # Redis TLS support (true / false)
```

3.4 Modify Docker Stack Configuration

Find the Docker Stack configuration docker-stack.yaml and remove the MySQL- and Redis-related sections (either delete them or comment them out).
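As a rough sketch, the change might look like the following (the service names and fields here are illustrative; match them against what is actually in your docker-stack.yaml):

```diff
 services:
-  mysql:
-    # Built-in MySQL service definition omitted
-  redis:
-    # Built-in Redis service definition omitted
   server:
     # Other content omitted
```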

3.5 Restart DataFlux Func

Restart the entire DataFlux Func according to Deployment and Maintenance / Upgrade and Restart.

4. Change Installation Directory

In some cases, the disk holding the original DataFlux Func installation may run short of capacity, requiring the installation directory to be moved to a new, larger disk.

In that case, you can change the installation directory with the following steps.

4.1 Shut Down DataFlux Func

  1. Use the docker stack rm dataflux-func command to shut down DataFlux Func (this step may take some time).
  2. Use docker ps to confirm all containers have exited.

4.2 Move Installation Directory

Use the cp -a {Installation Directory} {New Installation Directory} command to copy the entire DataFlux Func installation directory to the new location (-a copies the directory recursively and preserves ownership and permissions; a plain cp without flags would fail on a directory).

4.3 Modify Installation Directory Record

Open /etc/dataflux-func and change the recorded installation directory to the new installation directory.
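Assuming /etc/dataflux-func simply records the installation path as plain text (a sketch only; inspect the file's actual contents first), the update can be scripted. The update_install_record helper is our own, not part of DataFlux Func:

```shell
# Hypothetical helper: rewrite the recorded installation directory in place.
# '#' is used as the sed delimiter because the values contain slashes.
# Arguments: <record file> <old directory> <new directory>
update_install_record() {
  sed -i "s#$2#$3#" "$1"
}
```

For example: update_install_record /etc/dataflux-func /usr/local/dataflux-func /my-workspace/dataflux-func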

4.4 Modify Docker Stack Configuration

Find the Docker Stack Configuration docker-stack.yaml and change all volumes related to the installation directory to the new installation directory, such as:

```diff
 version: '3.1'

 services:
   # Other content omitted
   server:
     image: pubrepo.jiagouyun.com/dataflux-func/dataflux-func:x.y.z
     labels:
       - server
     volumes:
-      - "/usr/local/dataflux-func/data:/data"
+      - "/my-workspace/dataflux-func/data:/data"
     networks:
       - datafluxfunc
       - default
     ports:
       - "8088:8088"
     command: ./run-server.sh

 # Other content omitted
```
4.5 Restart DataFlux Func

Use the docker stack deploy dataflux-func -c {New Installation Directory}/docker-stack.yaml --resolve-image never command to restart all services.

Since the images bundled with the installation package have already been loaded locally, adding the --resolve-image never parameter prevents Docker from performing unnecessary image checks when starting the containers.