# terminal 1: Django dev server on port 8000, exposed through the reserved ngrok domain
python manage.py runserver
ngrok http --url=free-camel-deadly.ngrok-free.app 8000
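Since the Django app is reached through the ngrok domain, the project settings must allow that host. A minimal settings.py sketch, assuming a standard Django 4+ project (only the ngrok domain above comes from this setup; everything else is illustrative):

# settings.py (fragment) -- hosts/origins for the ngrok tunnel
ALLOWED_HOSTS = [
    "127.0.0.1",
    "localhost",
    "free-camel-deadly.ngrok-free.app",  # the reserved ngrok domain above
]

# needed for form posts arriving over HTTPS through the tunnel (Django 4+ checks the scheme)
CSRF_TRUSTED_ORIGINS = ["https://free-camel-deadly.ngrok-free.app"]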
# in a separate terminal
ollama serve
python app.py
python generate.py
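generate.py itself isn't shown here; as a hedged sketch, a script like it could call the local Ollama REST API directly (assumes Ollama is serving on its default port 11434 and the requests package is installed; the file and function names are made up):

# generate_sketch.py -- hypothetical stand-in for generate.py
import requests

def generate(prompt: str, model: str = "blogforge") -> str:
    """Send one non-streaming generation request to the local Ollama server."""
    resp = requests.post(
        "http://127.0.0.1:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    # "blogforge" is the custom model created in the next step
    print(generate("Draft an outline for a blog post about running LLMs locally."))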
# build the custom blogforge model from the Modelfile, then run it
ollama create blogforge -f ./Modelfile
OLLAMA_USE_CUDA=1 ollama run blogforge
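The Modelfile itself isn't included here; a minimal sketch of what one for blogforge might contain, using Ollama's documented Modelfile directives (the base model, parameter value, and system prompt are assumptions):

# Modelfile (hypothetical contents)
FROM llama3.2

# sampling default baked into the custom model
PARAMETER temperature 0.7

# system prompt that turns the base model into a blog-drafting assistant
SYSTEM "You are BlogForge, an assistant that drafts well-structured blog posts."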
Ollama : Source
Install:
curl -fsSL https://ollama.com/install.sh | sh
Run:
ollama serve
ollama run llama3.2
More:
ollama show llama3.2
ollama list
ollama ps
ollama stop llama3.2
ollama run llava "What's in this image? /home/Vicky/Pictures/Screenshot.png"
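The same models can also be called from Python; a minimal sketch using the official ollama Python package (assumes pip install ollama and a running ollama serve; the prompt is just an example):

# chat_sketch.py -- hypothetical example using the ollama Python client
import ollama

response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Suggest three titles for a blog post about self-hosted LLMs."}],
)
print(response["message"]["content"])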
WebUI : Blog
Install Anaconda:
wget https://repo.anaconda.com/archive/Anaconda3-2024.10-1-Linux-x86_64.sh
sha256sum Anaconda3-2024.10-1-Linux-x86_64.sh
bash Anaconda3-2024.10-1-Linux-x86_64.sh
source ~/.bashrc
conda init
Set up web-ui:
git clone https://github.com/browser-use/web-ui.git
cd web-ui/
conda info --envs
conda activate base
pip install -r requirements.txt
playwright install
python webui.py --ip 127.0.0.1 --port 7788
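With everything running, the WebUI is reachable at http://127.0.0.1:7788 and Ollama answers on its default port 11434. A small hypothetical check script to confirm both are up (uses only the requests package; not part of either repo):

# check_services.py -- hypothetical helper
import requests

SERVICES = {
    "Ollama API": "http://127.0.0.1:11434",        # default Ollama port
    "browser-use WebUI": "http://127.0.0.1:7788",  # port passed to webui.py above
}

for name, url in SERVICES.items():
    try:
        r = requests.get(url, timeout=5)
        print(f"{name}: HTTP {r.status_code}")
    except requests.RequestException as exc:
        print(f"{name}: unreachable ({exc})")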