SOFT: Ollama
------------

Software site: https://github.com/ollama/ollama

LICENSE:
--------
MIT
See the software documentation for more information.

Location: /usr/local/bioinfo/src/Ollama

Load binaries and environment:
------------------------------
-> Version 0.20.7
    module load bioinfo/Ollama/0.20.7
    export OLLAMA_MODELS=/path/to/your/working/directory

Add "sleep 10" after "ollama serve" to give the server time to start before executing other commands:
    ollama serve &
    sleep 10

Default port on the compute node: 11434. If this port is already in use, set another one:
    export OLLAMA_HOST="127.0.0.1:"

To run on GPU:
# A GPU is required. To submit on gpuq, see the FAQ: https://bioinfo.genotoul.fr/index.php/faq/job_submission_faq/ -> How to use GPU node

Example directory for use on cluster:
-------------------------------------
CPU example (testing in progress):
    /usr/local/bioinfo/src/Ollama/example_on_cluster
    To submit: sbatch test_Ollama-v0.20.7.sh

GPU example:
    /usr/local/bioinfo/src/Ollama/example_on_cluster/Ollama-v0.20.7_GPU
    To submit: sbatch test_Ollama-v0.20.7_GPU.sh

See the software documentation and our FAQ (https://vm-genoword.toulouse.inrae.fr/FAQ) for more information.
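The steps above (load the module, redirect OLLAMA_MODELS, start the server with a delay, then talk to it) can be sketched as a minimal SLURM batch script. This is a hedged example, not the cluster's own test script: the job name, memory request, model name ("llama3") and the working-directory path are illustrative assumptions you must adapt.

```shell
#!/bin/bash
#SBATCH -J ollama_test        # illustrative job name (assumption)
#SBATCH --mem=16G             # adjust to the model you pull (assumption)

# Load the Ollama module as documented above
module load bioinfo/Ollama/0.20.7

# Store models in your own working directory (replace with a real path)
export OLLAMA_MODELS=/path/to/your/working/directory

# Start the server in the background and give it time to start
ollama serve &
sleep 10

# Pull and query a model; "llama3" is only an example model name
ollama pull llama3
ollama run llama3 "Hello from the cluster"

# Stop the background server when the job is done
kill %1
```

Submit it with `sbatch` as shown in the example directories; it is a job-script fragment and cannot run outside a cluster node with the module system available.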