ollama_llm

  
Ollama is an open source inference engine for running large language models (LLMs). [[https://ollama.com]]
To run a local LLM, you need two ingredients: the model itself and the inference engine, the software that runs the model. Conceptually, the inference engine processes the input (a text prompt), feeds it through the model's neural network, and retrieves the response. Unlike ChatGPT and Gemini, Ollama runs locally and does not fetch model data from the internet; the models are stored locally on Pinnacle.
  
AHPCC is making this tool available to our users in a limited fashion so as not to impact the broad array of research tasks making their way through the queues. As such, Ollama is accessed via the ollama-ahpcc command.

To send a query to a model from a login node:
  
<code>
ollama-ahpcc run <model> <query>
</code>
  
First, determine which LLM you wish to use for the query. More information about the various models is available at https://ollama.com/library. To see the models available locally:
  
<code>
$ ollama-ahpcc list
NAME                ID           SIZE  MODIFIED
all-minilm:latest   1b226e2802db 45 MB 3 weeks ago
...
</code>
  
For querying with a single command:

<code>
ollama-ahpcc run <model> <prompt query>
</code>
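Because the prompt is given on the command line, your shell interprets it before Ollama sees it, so prompts containing characters such as quotes, $, or > are safest wrapped in single quotes. A minimal illustration of the difference, using echo as a stand-in for the model call:

```shell
# Unquoted: the shell word-splits the prompt and expands $HOME first.
echo Summarize the files in $HOME

# Single-quoted: the prompt reaches the command exactly as written.
echo 'Summarize the files in $HOME'
```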
  
Here is a brief example of running a simple query, with its result:
  
<code>
ollama-ahpcc run llama3 How many states are in the USA?

There are 50 states in the United States of America (USA).
</code>
  
<code>
ollama-ahpcc run llama3 Please list in JSON format the names, state capitals, and populations of all 50 US states

Here is the list of 50 US states with their names, state capitals, and populations in JSON format:
...
</code>
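The model's JSON is ordinary generated text and is not guaranteed to be well-formed, so it is worth validating before passing it to other tools. A small sketch, with an echo standing in for the ollama-ahpcc call:

```shell
# Pipe (stand-in) model output through Python's JSON validator;
# a non-zero exit status means the text is not valid JSON.
# On Pinnacle the echo would be: ollama-ahpcc run llama3 '...'
echo '{"name": "Alabama", "capital": "Montgomery"}' | python3 -m json.tool
```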
  
<code>
ollama-ahpcc run llama3 Please list all 50 US states >> states.txt
</code>
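Because >> appends rather than overwrites, several queries can be collected into one file across multiple runs. A sketch of the pattern, using echo as a stand-in for the model call (on Pinnacle, swap the echo for ollama-ahpcc run llama3 "$prompt"):

```shell
: > states.txt    # start with an empty file, since >> appends to what exists
for prompt in "Please list all 50 US states" "Please list the US territories"; do
    # Stand-in for: ollama-ahpcc run llama3 "$prompt" >> states.txt
    echo "response to: $prompt" >> states.txt
done
cat states.txt
```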
  
  
<code>
ollama-ahpcc run llama3 "$(cat nas.f90)" Please provide a brief summary of the file

This is a Fortran code file, specifically a main program and subroutine for solving a partial differential
equation (PDE) using a finite difference method. The code appears to be part of a larger numerical simulation
...
</code>
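The "$(cat nas.f90)" construct uses shell command substitution to splice the file's contents into the prompt; the double quotes keep the whole file as a single argument rather than letting the shell split it into words. A self-contained illustration with a throwaway file:

```shell
printf 'line one\nline two\n' > demo.txt

# Quoted substitution: the file arrives as ONE argument, newline intact.
set -- "$(cat demo.txt)"
echo "quoted: $# argument(s)"

# Unquoted substitution: the shell word-splits the contents into FOUR arguments.
set -- $(cat demo.txt)
echo "unquoted: $# argument(s)"
```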
  
Input size for Ollama is somewhat limited (perhaps around 1,000 lines), so it may be necessary to pare down input using other Linux tools. For example, a Protein Data Bank file with 2,200 lines won't fit, but paring the data down to the first 1,000 lines does:
  
<code>
ollama-ahpcc run llama3 "$(head -n 1000 3nir.pdb)" Please provide a brief summary of the file

This appears to be a PDB (Protein Data Bank) file, which contains structural information about a protein or
protein complex. The file is formatted in a specific way and includes information such as:

* Atom coordinates (3D positions) for each atom in the protein
* Atom names and types (e.g. N, CA, C, O)
* Chain identifiers (e.g. A, B) indicating which chain a particular residue belongs to
</code>
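The same head trick generalizes: check a file's length with wc first, then trim to budget. A runnable sketch with a generated stand-in file (the roughly 1,000-line limit is approximate):

```shell
# Stand-in for a large input file (2,200 lines, like the PDB example).
seq 1 2200 | sed 's/^/ATOM line /' > big_input.txt
wc -l < big_input.txt        # 2200 lines: over the (approximate) limit

head -n 1000 big_input.txt > trimmed.txt
wc -l < trimmed.txt          # 1000 lines: small enough to submit
# On Pinnacle: ollama-ahpcc run llama3 "$(cat trimmed.txt)" Please summarize the file
```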
 +
It should be noted that analyzing images and various other file types requires a multi-modal model such as llava or bakllava, and results seem to be a bit mixed.
 +
As we are all still learning how to utilize LLMs efficiently and properly in our daily work, you may be better served by looking on the Ollama GitHub site under "Issues", or signing up for the Ollama Discord server, should you encounter specific problems or have unique use cases you'd like to explore:
 +
https://github.com/ollama/ollama/issues

https://discord.com/invite/ollama
  
  
ollama_llm.1713979124.txt.gz · Last modified: 2024/04/24 17:18 by jpummil