
How to use Dockerized PowerfulAI
Use the Dockerized version of PowerfulAI for a much faster and more complete setup of PowerfulAI.
Minimum Requirements
Tip ➤➤ Running PowerfulAI on AWS/GCP/Azure?
➤ You should aim for at least 2GB of RAM. Disk storage is proportional to how much data
➤ you will be storing (documents, vectors, models, etc.). Minimum 10GB recommended.
- docker installed on your machine
- access to an LLM running locally or remotely
Note
➤ PowerfulAI by default uses a built-in vector database powered by LanceDB
➤ PowerfulAI by default embeds text on the instance privately
Recommended way to run dockerized PowerfulAI!
Important!
➤ If you are running another service on localhost such as Chroma, LocalAI, or LMStudio, you will need to use http://host.docker.internal:xxxx to access the service from within the Docker container, as localhost:xxxx will not resolve for the host system.
➤ Requires Docker v18.03+ on Win/Mac and 20.10+ on Linux/Ubuntu for host.docker.internal to resolve!
➤ Linux: add --add-host=host.docker.internal:host-gateway to the docker run command for this to resolve.
➤ eg: a Chroma host URL running on localhost:8000 on the host machine needs to be http://host.docker.internal:8000 when used in PowerfulAI.
Tip ➤➤ It is best to mount the container's storage volume to a folder on your host machine so that you can pull in future updates without deleting your existing data!
Pull in the latest image from Docker. Supports both amd64 and arm64 CPU architectures.
```shell
docker pull moorip/powerfulai
```
Note
➤ --cap-add SYS_ADMIN is a required flag if you want to scrape webpages.
➤ We use PuppeteerJS to scrape website links, and --cap-add SYS_ADMIN lets us use sandboxed Chromium across all runtimes for best security practices.
Mount the storage locally and run PowerfulAI in Docker
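The run command itself is not shown above, so here is a minimal sketch of one that mounts storage on the host. The container paths /app/server/storage and /app/server/.env and the STORAGE_DIR variable are assumptions about the image layout; check the image's own documentation before relying on them.

```shell
# Sketch: run PowerfulAI with host-mounted storage (container paths are assumptions).
# STORAGE_LOCATION holds your documents, vectors, and models between rebuilds.
export STORAGE_LOCATION=$HOME/powerfulai
mkdir -p "$STORAGE_LOCATION"
touch "$STORAGE_LOCATION/.env"
docker run -d -p 3001:3001 \
  --cap-add SYS_ADMIN \
  -v "$STORAGE_LOCATION:/app/server/storage" \
  -v "$STORAGE_LOCATION/.env:/app/server/.env" \
  -e STORAGE_DIR="/app/server/storage" \
  moorip/powerfulai
```

On Linux, also append --add-host=host.docker.internal:host-gateway if the container needs to reach services running on the host, as noted in the Important callout above.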
Go to http://localhost:3001 and you are now using PowerfulAI! All your data and progress will persist between container rebuilds.
How to use the user interface
To access the full application, visit http://localhost:3001 in your browser.
Common questions and fixes
1. Cannot connect to service running on localhost!
If you are in Docker and cannot connect to a service on your host machine that is bound to a local interface or loopback address:
- localhost
- 127.0.0.1
- 0.0.0.0
Important!
➤ On Linux, http://host.docker.internal:xxxx does not work.
➤ Use http://172.17.0.1:xxxx instead to emulate this functionality.
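To confirm that the bridge address actually reaches your host service on Linux, a quick probe from inside the container can help. The heartbeat path below is an assumption based on Chroma's v1 API; substitute whatever endpoint and port your service actually exposes.

```shell
# From inside the container on Linux, probe a host service via the default
# Docker bridge gateway (172.17.0.1 may differ on custom networks).
curl http://172.17.0.1:8000/api/v1/heartbeat
```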
Then, in Docker, you need to replace that localhost part with host.docker.internal. For example, if running Ollama on the host machine, bound to http://127.0.0.1:11434, you should put http://host.docker.internal:11434 into the connection URL in PowerfulAI.
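The rewrite is purely mechanical: only the host part of the URL changes, and the scheme, port, and path stay as-is. As a sketch, the same substitution done with sed, using the Ollama URL from the example above:

```shell
# Replace any loopback host in a connection URL with host.docker.internal.
echo "http://127.0.0.1:11434" \
  | sed -E 's#//(localhost|127\.0\.0\.1|0\.0\.0\.0)#//host.docker.internal#'
# prints: http://host.docker.internal:11434
```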
2. Having issues with Ollama?
If you are getting errors like llama:streaming - could not stream chat. Error: connect ECONNREFUSED 172.17.0.1:11434, then visit this README.
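One common cause of that ECONNREFUSED, assuming Ollama runs directly on the host: Ollama listens only on 127.0.0.1 by default, so the Docker bridge address cannot reach it. Binding it to all interfaces via its documented OLLAMA_HOST environment variable usually resolves this:

```shell
# Make Ollama listen on all interfaces so containers can reach it via
# host.docker.internal (or 172.17.0.1 on Linux).
OLLAMA_HOST=0.0.0.0 ollama serve
```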