Script Development / Upload User Python Modules
In some cases, the Python modules a user needs are not published on PyPI but exist only as files. In such cases, these files can be uploaded to the `user-python-packages/` directory in the Resource Catalog of DataFlux Func and referenced directly in scripts.
1. Upload Python Modules
Users can choose whichever upload method suits their situation; both produce the same result.
Direct Upload to Host
If users have access to the host, they can upload Python files/directories directly to the `{installation directory}/data/resources/user-python-packages/` directory using their preferred SSH tools.
The default installation directory is `/usr/local/dataflux-func`. If unsure, run `cat /etc/dataflux-func` to check.
Upload Using "File Service"
If users do not have access to the host, they can enable "Manage / Experimental Features / Enable File Service" and then upload via "Manage / File Service."
To prevent the directory structure from being disrupted when uploaded zip files are decompressed, the "File Service" always creates a new directory during decompression. Therefore, if the uploaded zip file already contains a top-level directory, an extra directory layer will appear after decompression.
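Before uploading, it can help to inspect the top-level entries of the zip so you know what extra layer to expect after decompression. A minimal sketch, assuming a local archive named `my_pkg.zip` (the file name is just an example):

```python
import zipfile

# List the top-level entries of the archive before uploading it via "File Service".
# If the archive already contains a top-level directory (e.g. my_pkg/), decompression
# will produce an additional wrapping directory around it.
with zipfile.ZipFile('my_pkg.zip') as zf:
    top_level = sorted({name.split('/')[0] for name in zf.namelist()})
    print(top_level)  # e.g. ['my_pkg']
```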
Move the Python module directory to `user-python-packages/xxxxx` as needed.
Navigate to the `user-python-packages/` directory to confirm that the uploaded Python package directory is in place.
2. Use Uploaded Python Modules
Assume the uploaded Python module is a package named `my_pkg`, consisting of two files: `my_pkg/__init__.py` and `my_pkg/hello_world.py`.
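As an illustrative sketch (the file contents below are assumed, not the original example), the package could look like this:

```python
# my_pkg/__init__.py
# Intentionally left (almost) empty; it only marks my_pkg as a package.
```

```python
# my_pkg/hello_world.py
def hello_world():
    return 'Hello, world!'
```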
Then, you can import and use this module directly in a script.
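A minimal sketch of such a script, assuming the `hello_world()` function shown above (the Func name and function names are illustrative):

```python
from my_pkg.hello_world import hello_world

# DFF is injected into the script scope by the DataFlux Func runtime;
# the API title used here is illustrative.
@DFF.API('Hello World Demo')
def run():
    return hello_world()
```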
3. Notes
Be careful not to give uploaded Python modules the same names as other Python modules, and do not delete or rename directories that already exist in `extra-python-packages`.
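For example, a quick way to check whether a candidate name is already taken in the current Python environment (the names below are illustrative):

```python
import importlib.util

# Check whether a top-level module name already resolves to something importable.
for name in ('my_pkg', 'requests'):
    spec = importlib.util.find_spec(name)
    print(name, '-> already taken' if spec else '-> available')
```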