requirements

  - An Avatar, Alt, set up to be an AvaDroid.\\ \\ 
  - A machine running [[https://docs.docker.com/engine/install/|Docker]], which will host Corrade and AvaDroid. The OS on the machine can be any that can run Docker. It is recommended to install [[https://github.com/portainer/portainer|Portainer]] on Docker. The Docker host will need a fixed IP address.\\ \\ 
  - A machine, which can be the same one, running Ollama and a selected LLM. This machine requires an NVIDIA GPU with at least 8GB of VRAM; the more the better, with at least 12GB recommended. For Second Life, I recommend an LLM such as [[https://ollama.com/library/llama3.1|llama3.1]] to start with. Ollama is preferred because it can dynamically load and unload LLMs, allowing the AI engine to be used for other tasks. The Ollama server also needs a fixed IP address. I do not recommend Dockerising Ollama, as it runs well on the host machine, "closer" to the GPU. I recommend Ubuntu 24.04 LTS Server.
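For the Docker host in the list above, Portainer runs as an ordinary container. A minimal sketch, assuming Docker has already been installed from the linked instructions:

```shell
# Sketch: run Portainer CE on an existing Docker host.
# Assumes Docker is already installed per docs.docker.com/engine/install/.
docker volume create portainer_data

docker run -d \
  --name portainer \
  --restart=always \
  -p 9443:9443 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest
```

The Portainer web UI is then reachable at https://<docker-host-ip>:9443, which is one reason the Docker host needs a fixed IP address.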
  
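The Ollama side of the requirements (install on the host rather than in Docker, confirm the GPU, fetch a model) can be sketched as follows; the model name is simply the llama3.1 example from the list:

```shell
# Sketch: set up Ollama directly on the host (not Dockerised),
# assuming an NVIDIA GPU with working drivers.
curl -fsSL https://ollama.com/install.sh | sh   # Ollama's install script
nvidia-smi                                      # confirm the GPU and free VRAM
ollama pull llama3.1                            # download the selected LLM
ollama run llama3.1 "Say hello"                 # quick smoke test
```

Because Ollama loads and unloads models on demand, the same GPU stays available for other tasks when the AvaDroid is idle.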
requirements.1767097641.txt.gz · Last modified: by robyn