Ollama is an open source inference engine for running LLM models. [[https://ollama.com]]

To run a local LLM, you need two ingredients: an inference engine (Ollama) and a model for it to run.

AHPCC is making this tool available to our users in a limited fashion so as not to impact the broad array of research tasks making their way through the queues. As such, Ollama is available and can be accessed via the ollama-ahpcc wrapper.

To query a model from a login node:

<code>
ollama-ahpcc run <model>
</code>

You need to determine which LLM model you wish to use for the query. More information about the various models can be seen here: https://ollama.com/library
To see the available models:
<code>
$ ollama-ahpcc list
NAME                  ID              SIZE      MODIFIED
all-minilm:latest     ...
...
</code>
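If a model you want is not already installed, ollama's standard pull subcommand downloads it. This assumes the ollama-ahpcc wrapper forwards pull the same way it forwards run and list; if it does not, ask AHPCC to add the model (the model name below is just an example):

<code>
ollama-ahpcc pull mistral
</code>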
For querying with a single command:

<code>
ollama-ahpcc run <model> <your query>
</code>
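Note that the query is parsed by your shell before it reaches the model, so prompts containing characters like ?, *, or $ are safest wrapped in quotes. A minimal illustration (the prompt text is just an example):

<code>
ollama-ahpcc run llama3 "What is the tallest mountain in North America?"
</code>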
A simple example:
<code>
ollama-ahpcc run llama3 How many states are in the USA?
There are 50 states in the United States of America (USA).
</code>

Models can also be asked for structured output such as JSON:
<code>
ollama-ahpcc run llama3 Please list in JSON format the names, state capitals, and populations of all 50 US states
Here is the list of 50 US states with their names, state capitals, and populations in JSON format:
...
</code>

To save a response to a file, redirect the output:
<code>
ollama-ahpcc run llama3 Please list all 50 US states >> states.txt
</code>
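Since these are ordinary shell commands, queries can also be scripted. A minimal sketch, with illustrative prompts and output file name:

<code>
# Run several prompts in sequence, appending each reply to one file
for q in "Name the 5 largest US states" "Name the 5 smallest US states"; do
    ollama-ahpcc run llama3 "$q" >> answers.txt
done
</code>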
File contents can also be passed as the query using shell command substitution:
<code>
ollama-ahpcc run llama3 "$(cat nas.f90)"
This is a Fortran code file, specifically a main program and subroutine for solving a partial differential equation ...
...
</code>

Other file types can be examined the same way, for example a PDB structure file:
<code>
ollama-ahpcc run llama3 "$(cat <file>.pdb)"
This appears to be a PDB (Protein Data Bank) file, which contains structural information about a protein or
...
  * Chain identifiers (e.g. A, B) indicating which chain a particular residue belongs to
</code>
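A specific instruction can also be combined with file contents in a single prompt. A sketch reusing the Fortran file from the earlier example:

<code>
ollama-ahpcc run llama3 "Please suggest performance optimizations for this code: $(cat nas.f90)"
</code>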
It should be noted that analyzing images and other non-text files requires a multi-modal model such as llava or bakllava, and the results so far seem to be a bit mixed.
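For example, the stock ollama CLI detects an image file path included in the prompt and passes the image to a multi-modal model; assuming the ollama-ahpcc wrapper behaves the same (the image name here is only illustrative):

<code>
ollama-ahpcc run llava "What is in this image? ./molecule_render.png"
</code>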
As we are all still learning how to efficiently and properly utilize LLMs in our daily lives, you may be better served looking on the Ollama GitHub site under "Community Integrations" for tools built on top of Ollama: https://github.com/ollama/ollama