Deployment and Maintenance / Single-Machine Deployment
This article describes how to install and deploy the single-machine version of DataFlux Func directly on a server.
For information on using Helm to install DataFlux Func in k8s, please refer to Deployment and Maintenance / Installation Deployment / Helm Deployment
1. Download the Installation Package
DataFlux Func supports downloading all required installation files as an "installation package" that can be transferred via USB drives or other portable storage devices, allowing installation in environments without public network access.
The downloaded "installation package" itself includes the automatic installation script, Docker, etc., and can be executed directly to start the installation (details are provided below).
1.1 One-Command Download
For systems like Linux, macOS, etc., it is recommended to use the official shell command to download the installation package.
Running the following command will automatically download the necessary installation files for DataFlux Func. The download script automatically chooses between the x86_64 and aarch64 architecture versions based on the current environment:
Please confirm system requirements and server configuration before installation
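A minimal sketch of the one-command download is shown below; the script URL is a placeholder (not the official address), so take the actual command from the official DataFlux Func download page.

```bash
# Download the portable installation package for the current machine's architecture.
# NOTE: placeholder URL; replace with the official download script address.
/bin/bash -c "$(curl -fsSL https://example.com/dataflux-func/download-portable.sh)"
```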
GSE Edition vs Old Edition
- For details on the GSE Edition, please refer to Deployment and Maintenance / DataFlux Func GSE Edition
- To download older versions of DataFlux Func such as 1.x, 2.x, 3.x, 5.x, please refer to Change Log / Download Older Versions
If the architecture of the target server differs from the one being downloaded, you can specify the architecture option:
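As an illustration (same placeholder URL as above), the target architecture can be forced with the --arch option described under "Download Options":

```bash
# Force a download for x86_64 targets, regardless of the local architecture.
/bin/bash -c "$(curl -fsSL https://example.com/dataflux-func/download-portable.sh)" -- --arch=x86_64

# Force a download for ARM64 (aarch64) targets.
/bin/bash -c "$(curl -fsSL https://example.com/dataflux-func/download-portable.sh)" -- --arch=aarch64
```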
After executing the command, all installation files are saved in the automatically created dataflux-func-portable-{architecture}-{version} directory.
- If DataFlux Func needs to be installed on a server without public network access, you can first download the package locally, then copy the entire directory to the target machine using a USB drive or a tool such as scp.
- If the server where DataFlux Func will be installed has public network access, you can download the package directly on that server.
1.2 Manual Download
For systems where it is inconvenient to use shell commands, you can manually download the required installation files.
If manual download is needed, here is a list of all the files:
| # | Content | File Name | x86_64 Architecture (GSE Edition) | aarch64 Architecture (GSE Edition) |
|---|---|---|---|---|
| 1 | Docker binary program | docker-24.0.9.tgz | Download | Download |
| 2 | DataFlux Func image | dataflux-func.tar.gz | Download | Download |
| 3 | MySQL/MariaDB image | mysql.tar.gz | Download | Download |
| 4 | Redis image | redis.tar.gz | Download | Download |
| 5 | Docker service configuration | docker.service | Download | Download |
| 6 | DataFlux Func installation script | run-portable.sh | Download | Download |
| 7 | Docker Stack configuration template | docker-stack.example.yaml | Download | Download |
| 8 | Image list | image-list | Download | Download |
| 9 | Version information | version | Download | Download |
| # | Content | File Name | x86_64 Architecture (Original Edition) | aarch64 Architecture (Original Edition) |
|---|---|---|---|---|
| 1 | Docker binary program | docker-24.0.9.tgz | Download | Download |
| 2 | DataFlux Func image | dataflux-func.tar.gz | Download | Download |
| 3 | MySQL/MariaDB image | mysql.tar.gz | Download | Download |
| 4 | Redis image | redis.tar.gz | Download | Download |
| 5 | Docker service configuration | docker.service | Download | Download |
| 6 | DataFlux Func installation script | run-portable.sh | Download | Download |
| 7 | Docker Stack configuration template | docker-stack.example.yaml | Download | Download |
| 8 | Image list | image-list | Download | Download |
| 9 | Version information | version | Download | Download |
GSE Edition vs Old Edition
- For details on the GSE Edition, please refer to Deployment and Maintenance / DataFlux Func GSE Edition
- To download older versions of DataFlux Func such as 1.x, 2.x, 3.x, 5.x, please refer to Change Log / Download Older Versions
After manually downloading all installation files, place them in the same directory.
1.3 Download Options
When executing the download script, you can specify download options to meet personalized needs.
Example:
Note: the download options must be preceded by a -- (two hyphens) separator
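For instance, a sketch passing download options after the -- separator (placeholder URL as above; the directory name portable-files is arbitrary):

```bash
# Pass download options to the script after the "--" separator.
/bin/bash -c "$(curl -fsSL https://example.com/dataflux-func/download-portable.sh)" -- --arch=aarch64 --download-dir=portable-files
```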
Supported download options are as follows:
--arch={architecture}
Specify Architecture
The download script defaults to downloading installation packages matching the local machine's architecture. If you need to download installation files for a different architecture, this parameter can be specified.
Available architectures are as follows:
Architecture | Description | Parameter Example |
---|---|---|
Intel | Intel / AMD 64-bit processor | --arch=x86_64 |
ARM | ARM64v8 processor | --arch=aarch64 |
--download-dir={download directory}
Specify Download Directory
Added in version 2.6.1
This parameter is suitable for automated script deployment of DataFlux Func
The download script by default creates/clears the dataflux-func-portable-{architecture}-{version} directory under the current directory and downloads the installation files there.
If you want to download to a specific directory, this parameter can be specified, for example:
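A sketch using --download-dir (placeholder URL as above; the directory name is arbitrary):

```bash
# Download into a fixed directory instead of the default versioned directory name.
/bin/bash -c "$(curl -fsSL https://example.com/dataflux-func/download-portable.sh)" -- --download-dir=dataflux-func-portable
```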
2. Execute Installation
Enter the installation package directory downloaded earlier and run the one-click installation script run-portable.sh:
Please confirm system requirements and server configuration before installation
DataFlux Func does not support macOS or Windows; please copy it to a Linux system before running the installation
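A minimal sketch, assuming the script is run from inside the installation package directory with root privileges (sudo can be dropped when already running as root):

```bash
# Run the one-click installation script shipped with the installation package.
sudo /bin/bash run-portable.sh
```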
Using the automatic installation script, installation and startup complete within minutes, with the following automatic configuration:
- Runs MySQL, Redis, and DataFlux Func (including Server, Worker, Beat)
- Automatically creates /usr/local/dataflux-func/ and stores all data there (including MySQL data, Redis data, DataFlux Func configuration, log files, etc.)
- Randomly generates the MySQL root user password and the system Secret, and saves them in the DataFlux Func configuration file
- Sets no password for Redis
- Provides no external access to MySQL and Redis
After execution, you can access the initialization interface at http://{server IP address/domain}:8088.
If the performance of the runtime environment is poor, use the docker ps command to confirm that all components have successfully started before accessing (see the list below)
- dataflux-func_server
- dataflux-func_worker-0
- dataflux-func_worker-1
- dataflux-func_worker-2
- dataflux-func_worker-3
- dataflux-func_worker-5
- dataflux-func_worker-6
- dataflux-func_beat
- dataflux-func_mysql
- dataflux-func_redis

- dataflux-func_server
- dataflux-func_worker-0
- dataflux-func_worker-1-6
- dataflux-func_worker-7
- dataflux-func_worker-8-9
- dataflux-func_beat
- dataflux-func_mysql
- dataflux-func_redis
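For reference, a minimal check sketch (depending on how Docker is set up, sudo may not be required):

```bash
# List running containers; all DataFlux Func components listed above should appear.
sudo docker ps
```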
2.1 Installation Options
When executing the installation script, you can specify installation options to meet personalized needs, such as:
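For example, a sketch combining two of the options documented below:

```bash
# Install the resource-saving mini version and listen on port 9000 instead of 8088.
sudo /bin/bash run-portable.sh --mini --port=9000
```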
--mini
Install Mini Version
For low-config environments where resource savings are needed, this installation mode can be used.
After enabling:
- Only a single Worker listening to all queues is started
- Queue blockage and lag are more likely under heavy task load
- System tasks and function tasks share processing queues and affect each other
- System requirements are reduced to:
    - CPU cores >= 1
    - Memory capacity >= 2GB
- If the built-in MySQL and Redis are not used, system requirements can be reduced further
--port={port number}
Specify Listening Port Number
DataFlux Func defaults to using port 8088. If this port is occupied by another program, you can choose another port, such as 9000.
--install-dir={installation directory}
Specify Installation Directory
If you need to install in a path different from the default /usr/local/dataflux-func, this parameter can be specified.
--no-mysql
Disable Built-in MySQL
If you need to use an existing MySQL database, you can specify this parameter to disable starting MySQL on the local machine.
After enabling this option, correct MySQL connection information must be specified in the configuration page after installation.
--no-redis
Disable Built-in Redis
If you need to use an existing Redis database, you can specify this parameter to disable starting Redis on the local machine.
After enabling this option, correct Redis connection information must be specified in the configuration page after installation.
--auto-setup
Automatically Execute Configuration
Added in version 2.6.0
This parameter is suitable for automated script deployment of DataFlux Func
Enabling this will automatically configure and initialize the database, bypassing the configuration page, such as:
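A minimal sketch, under the same assumptions as the installation command above:

```bash
# Install and initialize automatically, skipping the interactive configuration page.
sudo /bin/bash run-portable.sh --auto-setup
```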
Additionally, when the --auto-setup option is enabled, extra --auto-setup-* options can be added to adjust the automatic configuration. The supported extra options are as follows:
| Automatic Configuration Additional Option | Default Value | Description |
|---|---|---|
| --auto-setup-admin-username={username} | admin | Specifies the administrator username |
| --auto-setup-admin-password={password} | admin | Specifies the administrator password |
| --auto-setup-ak-secret={AK Secret} | Auto-generated | Automatically creates an AK and uses the specified value as the AK Secret |
| --auto-setup-ak-id={AK ID} | ak-auto-setup | Specifies the AK ID used when automatically creating the AK (must be used together with the --auto-setup-ak-secret option) |
If you want to automatically configure and specify the administrator password as AdminPass, the complete command would be as follows:
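A sketch of that complete command, under the same assumptions as above:

```bash
# Automatic setup with a custom administrator password.
sudo /bin/bash run-portable.sh --auto-setup --auto-setup-admin-password=AdminPass
```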
3. Verify Installation
By default, DataFlux Func comes pre-installed with some example scripts after installation.
Follow these steps to verify the installation:
- Click "Script Editor" in the top navigation bar, then sequentially select "Script Library" - "Examples" - "Basic Demonstration" from the left panel.
- In the script editor on the right, click "Edit" to enter edit mode, select the hello_world function, and click the "Execute" button to run the function.
- If the function returns its value normally in the "Script Output" section at the bottom, the installation verification is complete.
3.1 Service Descriptions
By default, after DataFlux Func starts up correctly, the following services are running:
Queues #7, #8, #9 are exclusive queues for the TrueWatch attached version of Func, reserved as placeholders in standalone deployments
| Name | Description |
|---|---|
| dataflux-func_server | Front-end service for DataFlux Func. Primarily used to provide Web interfaces, API interfaces, etc. |
| dataflux-func_worker-0 | Python worker unit listening to queues #0, #4, #7, #8, #9. Mainly handles internal tasks of DataFlux Func (queue #0) and other tasks mistakenly placed in reserved queues |
| dataflux-func_worker-1 | Python worker unit listening to queue #1. Mainly handles general-purpose user function tasks |
| dataflux-func_worker-2 | Python worker unit listening to queue #2. Mainly handles automatically triggered user function tasks |
| dataflux-func_worker-3 | Python worker unit listening to queue #3. Mainly handles batch-processing user function tasks |
| dataflux-func_worker-5 | Python worker unit listening to queue #5. Mainly handles debugging tasks for running scripts in the Web interface |
| dataflux-func_worker-6 | Python worker unit listening to queue #6. Mainly handles tasks executed by connector subscriptions |
| dataflux-func_beat | Trigger for automatically triggered tasks, globally only one can be enabled |
| dataflux-func_mysql | Built-in MySQL included with DataFlux Func |
| dataflux-func_redis | Built-in Redis included with DataFlux Func |
| Name | Description |
|---|---|
| dataflux-func_server | Front-end service for DataFlux Func. Primarily used to provide Web interfaces, API interfaces, etc. |
| dataflux-func_worker-0 | Python worker unit listening to queue #0. Mainly handles internal tasks of DataFlux Func |
| dataflux-func_worker-1-6 | Python worker unit listening to queues #1, #2, #3, #4, #5, #6. Mainly handles synchronous APIs (old version: Authorization Links) |
| dataflux-func_worker-7 | Python worker unit listening to queue #7. Mainly handles debugging tasks for running scripts in the Web interface |
| dataflux-func_worker-8-9 | Python worker unit listening to queues #8, #9. Mainly handles asynchronous tasks (Automatic Triggers, Batch Processing) |
| dataflux-func_beat | Trigger for automatically triggered tasks, globally only one can be enabled |
| dataflux-func_mysql | Built-in MySQL included with DataFlux Func |
| dataflux-func_redis | Built-in Redis included with DataFlux Func |
3.2 Data Storage Location
DataFlux Func stores various types of data during operation; the content and storage locations are as follows:

| Storage | Saved Path | Stored Content |
|---|---|---|
| MySQL | {Installation Directory}/mysql/ | Most data generated through UI operations, including but not limited to: 1. Scripts, connector configurations, environment variables 2. User information, synchronous APIs (old version: Authorization Links), scheduled tasks (old version: Automatic Trigger Configurations), asynchronous APIs (old version: Batch Processing) 3. Operation records, script execution records, import/export records, etc. |
| Redis | {Installation Directory}/redis/ | Mainly used for caching and queues, including but not limited to: 1. User login information 2. Various caches created during script execution 3. Script execution task queues 4. Func self-monitoring data, etc. |
| Directory | {Installation Directory}/data/ | Mainly used for data that must exist in file form; see the entries below |
| Directory | {Installation Directory}/data/resources/ | Resource folder (i.e., the root directory of the "File Manager") |
| Directory | {Installation Directory}/data/resources/extra-python-packages/ | Third-party packages installed via the pip tool |
| Directory | {Installation Directory}/data/resources/script-market/ | Data downloaded from the Script Market |
| Directory | {Installation Directory}/data/sqldump/ | Automatic database backups |
| Directory | {Installation Directory}/data/logs/ | System logs |
| File | {Installation Directory}/data/user-config.yaml | DataFlux Func system configuration |
| File | /etc/dataflux-func | Record of the DataFlux Func installation directory, used to locate the current installation during upgrades |
4. Use External Databases
When installing DataFlux Func, if you need to use external databases (MySQL, Redis), you can disable the built-in MySQL and Redis when executing the installation script:
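A sketch of the corresponding command, reusing the options documented above:

```bash
# Install without the built-in MySQL and Redis; external databases are configured afterwards.
sudo /bin/bash run-portable.sh --no-mysql --no-redis
```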
After installation, on the initial setup page, click "Show More Configurations" and modify the MySQL and Redis configurations accordingly.
For configuration requirements regarding MySQL and Redis, please refer to Deployment and Maintenance / System Requirements
If you need to migrate the existing DataFlux Func database to an external one, please refer to Deployment and Maintenance / Daily Maintenance / Migrate Database