ollama_llm

 To run a local LLM, you need two ingredients: the model itself and the inference engine, the piece of software that runs the model. Conceptually, the inference engine processes the input (a text prompt), feeds it through the neural network of the model, and retrieves the response. Ollama differs from ChatGPT and Gemini in that it runs locally, without accessing the internet for model data; the models are stored locally on Pinnacle.
  
AHPCC is making this tool available to our users in a limited fashion so as not to impact the broad array of research tasks making their way through the queues. As such, Ollama is available and can be accessed as shown.

You can query a model from a login node as follows:
  
 <code>
ollama-ahpcc run <model> <query>
 </code>
  
You need to determine which LLM model you wish to use for the query. More information about the various models is available at https://ollama.com/library
 To see the available models:
  
 <code>
$ ollama-ahpcc list
NAME                            ID           SIZE  MODIFIED
all-minilm:latest               1b226e2802db 45 MB 3 weeks ago
...
 </code>
  
To query a model with a single command:
  
 <code>
ollama-ahpcc run <model> <prompt query>
 </code>
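Note that the prompt is passed as ordinary command-line arguments, so shell metacharacters in a prompt (`$`, quotes, wildcards) should be quoted. A minimal sketch of the quoting behavior, with echo standing in for the ollama-ahpcc call so the expansion is visible without the model:

```shell
# Quoting sketch: echo stands in for "ollama-ahpcc run llama3" so the
# word-splitting and expansion behavior can be seen without the model.
prompt='What does $PATH mean? Explain briefly.'
# Single quotes above keep $PATH literal; double quotes below pass the
# prompt through as one argument.
echo run llama3 "$prompt"
```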
  
  
 <code>
ollama-ahpcc run llama3 How many states are in the USA?

There are 50 states in the United States of America (USA).
 </code>
  
 <code>
ollama-ahpcc run llama3 Please list in JSON format the names, state capitals, and populations of all 50 US states

Here is the list of 50 US states with their names, state capitals, and populations in JSON format:
...
 </code>
  
 <code>
ollama-ahpcc run llama3 Please list all 50 US states >> states.txt
 </code>
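The same redirection works for collecting several answers in one file with `>>`. A sketch of the pattern (the `ask` function below is a hypothetical stand-in for `ollama-ahpcc run llama3`, so the snippet runs anywhere):

```shell
# Append-redirection pattern: collect several query results in one file.
# "ask" is a stand-in for: ollama-ahpcc run llama3 "$@"
ask() { echo "answer to: $*"; }

: > report.txt                                  # start with an empty file
ask Please list all 50 US states      >> report.txt
ask Please list all US state capitals >> report.txt
wc -l report.txt                                # two appended answers
```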
  
  
 <code>
ollama-ahpcc run llama3 "$(cat nas.f90)" Please provide a brief summary of the file

This is a Fortran code file, specifically a main program and subroutine for solving a partial differential
...
 </code>
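The `"$(cat ...)"` above is ordinary shell command substitution: the whole file body is expanded into the prompt before `ollama-ahpcc` runs, so the model sees the source code followed by the question. The mechanics can be seen without the model (echo stands in for the ollama-ahpcc call, and `demo.f90` is a throwaway file created for the sketch):

```shell
# Command-substitution sketch: the file contents become part of the
# argument list, followed by the question. echo stands in for ollama-ahpcc.
printf 'program demo\nend program demo\n' > demo.f90
echo "$(cat demo.f90)" Please provide a brief summary of the file
```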
  
 <code>
ollama-ahpcc run llama3 "$(head -n 1000 3nir.pdb)" Please provide a brief summary of the file

This appears to be a PDB (Protein Data Bank) file, which contains structural information about a protein or
...
 </code>
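The `head -n 1000` above caps how much of the (large) PDB file is substituted into the prompt, presumably to keep the input within what the model can handle. The truncation itself can be verified without the model (`big.txt` is a throwaway stand-in for a large file):

```shell
# head -n inside a command substitution passes only the first N lines
# of a large file into the prompt.
seq 1 5000 > big.txt                      # stand-in for a large PDB file
snippet="$(head -n 1000 big.txt)"
echo "$snippet" | wc -l                   # only the first 1000 lines survive
```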
ollama_llm.1713982798.txt.gz · Last modified: 2024/04/24 18:19 by jpummil