Script Development / Uploading Local Python Modules
In some cases, the required Python modules are not published on PyPI but exist only as local files.
Such files can be uploaded to the "extra-python-packages/" directory under the DataFlux Func resource directory and then referenced directly in scripts.
1. Uploading Python Modules
Users can choose either upload method based on their actual situation; the result is the same.
Direct Upload to Host Machine
If the user has access permissions to the host machine, they can use a familiar SSH/SCP tool to upload the Python files/directories directly to the {installation directory}/data/resources/extra-python-packages/ directory.
The default installation directory is /usr/local/dataflux-func. If unsure, you can check using cat /etc/dataflux-func.
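For reference, the upload from a workstation can be sketched as a single scp command. The host name "user@host" and the local module directory "./my_pkg" are hypothetical placeholders; the sketch only prints the command so it can be reviewed before running.

```shell
# Sketch: upload a local Python module directory to the host machine.
# "user@host" and "./my_pkg" are hypothetical -- adjust to your environment.
# If unsure of the installation directory, check it on the host first:
#   ssh user@host cat /etc/dataflux-func
INSTALL_DIR=/usr/local/dataflux-func
TARGET="${INSTALL_DIR}/data/resources/extra-python-packages/"

# Print the command instead of running it, so it can be reviewed first.
echo scp -r ./my_pkg "user@host:${TARGET}"
```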
Using "File Management" for Upload
If the user does not have access to the host machine, they can enable "Management / Experimental Features / Enable File Management" and then upload through "Management / File Management."
To prevent zip archives uploaded by users from disrupting the directory structure after extraction, "File Management" forcibly creates a new directory when extracting.
Therefore, if the uploaded zip archive itself already contains a top-level directory, an extra level of nesting will appear after extraction.
Based on the actual situation, move the Python module's directory to /extra-python-packages/xxxxx.
Return to the extra-python-packages directory and confirm that the uploaded Python files/directories are in place.
2. Using Uploaded Python Modules
Assume the uploaded Python module is as follows:

my_pkg/__init__.py

```python
# empty file; marks my_pkg as a package
```

my_pkg/hello_world.py

```python
def hello_world():
    return 'Hello, world!'
```

Then, the module can be directly imported and used in the script, like this:

```python
from my_pkg import hello_world

@DFF.API('Test my_pkg')
def test_my_pkg():
    return hello_world.hello_world()
```
3. Precautions
Make sure that self-uploaded Python modules do not conflict with the names of other Python modules, and do not delete or rename directories that already exist in extra-python-packages.
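A quick way to check for a name conflict before uploading is to ask whether the name already resolves to an installed module. The helper below is a hypothetical convenience, not part of DataFlux Func; it uses the standard importlib machinery.

```python
# Sketch: check whether a module name is already taken before uploading,
# to avoid shadowing the standard library or an installed package.
import importlib.util

def name_is_free(module_name):
    """Return True if no installed module already uses this name."""
    return importlib.util.find_spec(module_name) is None

print(name_is_free('json'))    # False: stdlib module, uploading "json" would conflict
print(name_is_free('my_pkg'))  # likely True, unless something by that name is installed
```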