
Standalone Deployment

This article describes how to install and deploy the standalone version of DataFlux Func directly on a server.

For installing DataFlux Func using Helm in k8s, please refer to Deployment and Maintenance / Installation and Deployment / Helm Deployment

1. Download the Installation Package

DataFlux Func supports downloading the required installation files as an "installation package", which can be brought into an environment without public network access via USB drives or other mobile devices.

The downloaded "installation package" itself includes an automatic installation script, Docker, etc. Execute it to proceed with the installation (details below).

1.1 One-Command Download

On Linux, macOS, and similar systems, it is recommended to use the official shell command to download the installation package.

Run the following command to download the installation files required by DataFlux Func. The download script automatically selects the x86_64 or aarch64 variant based on the current environment:

Please confirm system requirements and server configuration before installation

```bash
/bin/bash -c "$(curl -fsSL docs.dataflux-func.com/download)"
```

If you need to download an architecture different from the final installation server's architecture, you can add the architecture option to specify it:

```bash
/bin/bash -c "$(curl -fsSL docs.dataflux-func.com/download)" -- --arch=x86_64
```

```bash
/bin/bash -c "$(curl -fsSL docs.dataflux-func.com/download)" -- --arch=aarch64
```
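To check which variant a given machine needs, inspect its CPU architecture; `uname -m` prints the same identifiers the download script uses. A minimal sketch:

```shell
# Print the architecture identifier and the matching download option.
arch="$(uname -m)"   # "x86_64" on Intel/AMD 64-bit, "aarch64" on ARM64v8
echo "download option: --arch=$arch"
```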

After the command execution is complete, all installation files are saved in the automatically created dataflux-func-portable-{architecture}-{version} directory.

  • To install DataFlux Func on a server without public network access, you can download it locally first, then copy the entire directory to the target machine via USB drives or other mobile storage devices, or using tools like scp.
  • To install DataFlux Func on a server that can access the public network, download it directly on the server.
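For the offline case, one convenient way to move the package is to bundle the downloaded directory into a single archive before carrying it over. A sketch (the directory name is an example; substitute the one the download script actually created):

```shell
# Example package directory name; substitute the real one created by the download script.
PKG_DIR="dataflux-func-portable-x86_64-demo"
mkdir -p "$PKG_DIR"   # stands in here for the directory produced by the download step

# Bundle the whole directory into a single archive for USB transfer or scp.
tar -czf "${PKG_DIR}.tar.gz" "$PKG_DIR"
ls -lh "${PKG_DIR}.tar.gz"
```

On the target machine, unpack the archive with `tar -xzf` and continue with the installation steps below.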

1.2 Manual Download

For systems where using shell commands is inconvenient, you can manually download the required installation files.

If manual download is needed, here is the complete file list:

| # | Content | Filename | x86_64 Architecture | aarch64 Architecture |
|---|---------|----------|---------------------|----------------------|
| 1 | Docker binary | docker-24.0.9.tgz | Download | Download |
| 2 | DataFlux Func image | dataflux-func.tar.gz | Download | Download |
| 3 | MySQL/MariaDB image | mysql-5.7.26.tar.gz | Download | Download |
| 4 | Redis image | redis-6.2.20.tar.gz | Download | Download |
| 5 | Docker service config file | docker.service | Download | Download |
| 6 | DataFlux Func installation script | run-portable.sh | Download | Download |
| 7 | Docker Stack config template | docker-stack.example.yaml | Download | Download |
| 8 | Image list | image-list | Download | Download |
| 9 | Version information | version | Download | Download |

After manually downloading all installation files, place them in the same directory.

1.3 Download Options

When executing the download script, you can specify download options to meet personalized needs.

For example:

Note: the download options must be preceded by -- (two hyphens)

```bash
/bin/bash -c "$(curl -fsSL docs.dataflux-func.com/download)" -- --download-dir=func-download
```

Supported download options are as follows:

--arch={architecture} Specify Architecture

The download script defaults to downloading installation package files matching the local machine's architecture. If you need to download installation files for a different architecture, you can specify this parameter.

Available architectures are as follows:

| Architecture | Description | Parameter Example |
|--------------|-------------|-------------------|
| Intel | Intel / AMD 64-bit processors | --arch=x86_64 |
| ARM | ARM64v8 processors | --arch=aarch64 |

--download-dir={download directory} Specify Download Directory

Added in version 2.6.1

This parameter is suitable for automated script deployment of DataFlux Func

The download script will, by default, create/clear the dataflux-func-portable-{architecture}-{version} directory in the current directory and download the installation files there.

If you need to download to a specified directory, you can specify this parameter, e.g.:

```bash
/bin/bash -c "$(curl -fsSL docs.dataflux-func.com/download)" -- --download-dir=func-download
```

2. Execute Installation

Enter the installation package directory downloaded in the previous section and run the one-click installation script run-portable.sh:

Please confirm system requirements and server configuration before installation

DataFlux Func does not support macOS or Windows. Copy the installation package to a Linux system before running the installation

```bash
sudo /bin/bash {installation file directory}/run-portable.sh
```

Using the automatic installation script, DataFlux Func can be installed and started within minutes. The automatic configuration includes:

  • MySQL, Redis, and DataFlux Func (including Server, Worker, Beat) are started
  • All data is created and saved under the /usr/local/dataflux-func/ directory (including MySQL data, Redis data, DataFlux Func configuration, log files, etc.)
  • A random MySQL root password and system Secret are generated and saved in the DataFlux Func configuration file
  • Redis runs without a password
  • MySQL and Redis are not accessible externally

After execution completes, open http://{server IP address or domain}:8088 in a browser to reach the initial configuration page.

If the runtime environment has limited performance, use the docker ps command to confirm that all components have started successfully before accessing the page. Depending on the DataFlux Func version, the container list is one of the following:

Layout with one worker per queue:

  1. dataflux-func_server
  2. dataflux-func_worker-0
  3. dataflux-func_worker-1
  4. dataflux-func_worker-2
  5. dataflux-func_worker-3
  6. dataflux-func_worker-5
  7. dataflux-func_worker-6
  8. dataflux-func_beat
  9. dataflux-func_mysql
  10. dataflux-func_redis

Layout with workers grouped by queue:

  1. dataflux-func_server
  2. dataflux-func_worker-0
  3. dataflux-func_worker-1-6
  4. dataflux-func_worker-7
  5. dataflux-func_worker-8-9
  6. dataflux-func_beat
  7. dataflux-func_mysql
  8. dataflux-func_redis
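The startup check above can be scripted. A sketch that loops over the container names common to both layouts (requires Docker on the installation server; extend the list for your version):

```shell
# Containers present in both layouts; add the worker containers for your version.
expected="dataflux-func_server dataflux-func_beat dataflux-func_mysql dataflux-func_redis"

if command -v docker >/dev/null 2>&1; then
  for name in $expected; do
    # Print each container's status; no output for a name means it has not started yet.
    docker ps --filter "name=$name" --format '{{.Names}}: {{.Status}}'
  done
else
  echo "docker not found; run this on the installation server"
fi
```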

2.1 Installation Options

When executing the installation script, you can specify installation options to meet personalized needs, e.g.:

```bash
sudo /bin/bash run-portable.sh --port=80
```

--mini Install Mini Version

Installation mode for low-configuration environments where resource conservation is needed.

When enabled:

  • Only a single Worker is started, listening to all queues
  • Heavy load tasks are more likely to cause queue blocking and lag
  • System tasks and function tasks share the processing queue and can affect each other
  • System requirements are reduced to:
    • CPU cores >= 1
    • Memory >= 2GB
  • If the built-in MySQL, Redis are not used, system requirements can be further reduced

--port={port number} Specify Listening Port

DataFlux Func uses port 8088 for access by default. If this port is occupied by another program, you can choose another port, e.g., 9000.

--install-dir={installation directory} Specify Installation Directory

When you need to install to a path different from the default /usr/local/dataflux-func, you can specify this parameter.

--no-mysql Disable Built-in MySQL

When you need to use an existing MySQL database, you can specify this parameter to prevent MySQL from starting locally.

After enabling this option, you need to specify the correct MySQL connection information on the configuration page after installation is complete

--no-redis Disable Built-in Redis

When you need to use an existing Redis database, you can specify this parameter to prevent Redis from starting locally.

After enabling this option, you need to specify the correct Redis connection information on the configuration page after installation is complete

--auto-setup Automatically Execute Configuration

Added in version 2.6.0

This parameter is suitable for automated script deployment of DataFlux Func

When enabled, configuration and database initialization are performed automatically, and there will be no configuration page, e.g.:

```bash
sudo /bin/bash run-portable.sh --auto-setup
```

Furthermore, after enabling the --auto-setup option, you can add other --auto-setup-* options to adjust the automatic configuration. Additional options for automatic configuration are as follows:

| Automatic Configuration Extra Option | Default Value | Description |
|--------------------------------------|---------------|-------------|
| --auto-setup-admin-username={username} | admin | Specify administrator username |
| --auto-setup-admin-password={password} | admin | Specify administrator password |
| --auto-setup-ak-secret={AK Secret} | Auto-generated | Automatically create an AK and use the specified value as the AK Secret |
| --auto-setup-ak-id={AK ID} | ak-auto-setup | AK ID used when automatically creating an AK (requires the --auto-setup-ak-secret option) |

For example, to automatically configure and specify the administrator password as AdminPass, the complete command is as follows:

```bash
sudo /bin/bash run-portable.sh --auto-setup --auto-setup-admin-password='AdminPass'
```
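For automated deployments, the options above can be assembled into a single unattended command line. A sketch that only builds and prints the command (the password and port values are placeholders):

```shell
# Placeholder values; substitute real ones in your deployment script.
ADMIN_PASS="AdminPass"
PORT=8088

# Build the full unattended install command from the options in this section.
CMD="sudo /bin/bash run-portable.sh --auto-setup --auto-setup-admin-password=${ADMIN_PASS} --port=${PORT}"
echo "$CMD"
```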

3. Verify Installation

By default, after DataFlux Func installation is complete, it already includes some example scripts.

Perform the following operations in sequence to verify the installation:

  1. Click "Script Editor" in the top navigation bar, then select "Script Library" -> "Examples" -> "Basic Demo" in the left sidebar.
  2. At the top of the script editor on the right, click "Edit" to enter edit mode, select the "hello_world" function, and click the "Execute" button to run it.
  3. If the function's return value appears in the "Script Output" panel at the bottom, the system is working normally.

At this point, the installation verification is complete.

3.1 Service Descriptions

By default, after DataFlux Func starts normally, the following services are running (the service layout differs by DataFlux Func version; see the two tables below):

Queues #7, #8, #9 are dedicated queues for the Data Platform attached version of Func. In the standalone deployment version of Func, they serve as reserved queues.

Layout with one worker per queue:

| Name | Description |
|------|-------------|
| dataflux-func_server | The frontend service of DataFlux Func. Mainly provides the Web interface, API interfaces, etc. |
| dataflux-func_worker-0 | Python worker listening to queues #0, #4, #7, #8, #9. Mainly handles DataFlux Func internal tasks (queue #0) and tasks that mistakenly enter reserved queues. |
| dataflux-func_worker-1 | Python worker listening to queue #1. Mainly handles synchronously executed function APIs. |
| dataflux-func_worker-2 | Python worker listening to queue #2. Mainly handles scheduled tasks. |
| dataflux-func_worker-3 | Python worker listening to queue #3. Mainly handles asynchronously executed function APIs. |
| dataflux-func_worker-5 | Python worker listening to queue #5. Mainly handles debugging tasks for scripts run in the Web interface. |
| dataflux-func_worker-6 | Python worker listening to queue #6. Mainly handles tasks triggered by connector subscription messages. |
| dataflux-func_beat | Trigger for scheduled tasks; only one instance may be enabled globally. |
| dataflux-func_mysql | Built-in MySQL bundled with DataFlux Func. |
| dataflux-func_redis | Built-in Redis bundled with DataFlux Func. |

Layout with workers grouped by queue:

| Name | Description |
|------|-------------|
| dataflux-func_server | The frontend service of DataFlux Func. Mainly provides the Web interface, API interfaces, etc. |
| dataflux-func_worker-0 | Python worker listening to queue #0. Mainly handles DataFlux Func internal tasks. |
| dataflux-func_worker-1-6 | Python worker listening to queues #1, #2, #3, #4, #5, #6. Mainly handles synchronously executed function APIs. |
| dataflux-func_worker-7 | Python worker listening to queue #7. Mainly handles debugging tasks for scripts run in the Web interface. |
| dataflux-func_worker-8-9 | Python worker listening to queues #8, #9. Mainly handles asynchronous tasks (scheduled tasks, etc.). |
| dataflux-func_beat | Trigger for scheduled tasks; only one instance may be enabled globally. |
| dataflux-func_mysql | Built-in MySQL bundled with DataFlux Func. |
| dataflux-func_redis | Built-in Redis bundled with DataFlux Func. |

3.2 Data Storage Locations

DataFlux Func operation requires storing various types of data. The general content and storage locations are as follows:

| Storage | Save Path | Storage Content |
|---------|-----------|-----------------|
| MySQL | {installation directory}/mysql/ | The vast majority of data generated during UI operations, including but not limited to: scripts, connector configurations, environment variables; user information, function APIs, scheduled tasks; operation records, script execution records, import/export records, etc. |
| Redis | {installation directory}/redis/ | Mainly used for caching and queues, including but not limited to: user login information, caches created while scripts run, script execution task queues, Func's own monitoring data, etc. |
| Directory | {installation directory}/data/ | Data that must exist in file form; see the entries below. |
| Directory | {installation directory}/data/resources/ | Resource folder (i.e., the root directory of "File Manager"). |
| Directory | {installation directory}/data/resources/extra-python-packages/ | Third-party packages installed via pip. |
| Directory | {installation directory}/data/resources/script-market/ | Script Market download data. |
| Directory | {installation directory}/data/sqldump/ | Automatic database backups. |
| Directory | {installation directory}/data/logs/ | System logs. |
| File | {installation directory}/data/user-config.yaml | DataFlux Func system configuration. |
| File | /etc/dataflux-func | Records the DataFlux Func installation directory; used to locate the current installation during upgrades. |
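Since all persistent data lives under the installation directory, a cold backup can simply archive the paths in the table above. A sketch assuming the default installation directory:

```shell
# Default installation directory; change this if --install-dir was used.
INSTALL_DIR="/usr/local/dataflux-func"
BACKUP="/tmp/dataflux-func-backup-$(date +%Y%m%d).tar.gz"

if [ -d "$INSTALL_DIR" ]; then
  # Archive MySQL data, Redis data, and the data/ directory in one file.
  tar -czf "$BACKUP" -C "$INSTALL_DIR" mysql redis data
  echo "backup written to $BACKUP"
else
  echo "no installation found at $INSTALL_DIR"
fi
```

Note that archiving MySQL data files while the database is writing may produce an inconsistent copy; the automatic backups under {installation directory}/data/sqldump/ are a safer source for database snapshots.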

4. Using External Databases

When performing a fresh installation of DataFlux Func, if you need to use external databases (MySQL, Redis), you can specify disabling the built-in MySQL and Redis when executing the installation script, e.g.:

```bash
sudo /bin/bash {installation file directory}/run-portable.sh --no-mysql --no-redis
```

After installation is complete, on the first-run interface, click "Show More Configuration" and modify the MySQL and Redis configurations.

For MySQL and Redis configuration requirements, please refer to Deployment and Maintenance / System Requirements

If you need to migrate an existing DataFlux Func database to an external one, please refer to Deployment and Maintenance / Daily Operations / Migrating Databases

5. Nginx Reverse Proxy

You can use Nginx to configure a reverse proxy.

Note that DataFlux Func uses WebSocket technology, so relevant configuration needs to be added in Nginx.

Reference configuration is as follows:

HTTP:

```nginx
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

server {
    listen 80;
    server_name your-domain.com;

    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;
    proxy_http_version 1.1;

    location / {
        proxy_pass http://127.0.0.1:8088;
    }
}
```
HTTPS:

```nginx
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

server {
    listen 443 ssl;
    server_name your-domain.com;

    ssl_certificate     /etc/nginx/ssl/your.pem;
    ssl_certificate_key /etc/nginx/ssl/your.key;

    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;
    proxy_http_version 1.1;

    location / {
        proxy_pass http://127.0.0.1:8088;
    }
}
```
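To confirm the proxy is reachable and accepts WebSocket upgrade requests, a quick check with curl can help. A sketch (your-domain.com is the placeholder from the configuration above, so the request will fail until it points at a real server):

```shell
# Placeholder URL from the reference configuration; replace with your real domain.
URL="http://your-domain.com/"

if command -v curl >/dev/null 2>&1; then
  # Send WebSocket upgrade headers and print the HTTP status code.
  curl -s -o /dev/null -w '%{http_code}\n' \
    -H 'Connection: Upgrade' -H 'Upgrade: websocket' \
    --max-time 5 "$URL" || echo "proxy not reachable from this machine"
else
  echo "curl not found"
fi
```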