Self-hosted alternative to ChatGPT (and more)
  • I can't get it running with my GPU.

    I get this error:

    parsing /root/secure-ai-tools/docker-compose.yml: yaml: line 19: did not find expected key

    This is my docker-compose.yml:

    services:
      web:
        image: public.ecr.aws/d8f2p0h3/secure-ai-tools:latest
        platform: linux/amd64
        volumes:
          - ./web:/app/volume
        env_file:
          - .env
        environment:
          - INFERENCE_SERVER=http://inference:11434/
        ports:
          - 28669:28669
        command: sh -c "cd /app && sh tools/db-migrate-and-seed.sh ${DATABASE_FILE} && node server.js"
        depends_on:
          - inference

      inference:
        image: ollama/ollama:latest
        volumes:
          - ./inference:/root/.ollama
        deploy:
          resources:
            reservations:
              devices:
                - driver: nvidia
                  count: 'all'
                  capabilities: [gpu]
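    Since the parser complains about line 19, the indentation is presumably off somewhere around the inference service (the nesting under deploy.resources.reservations.devices is easy to get wrong). Assuming Docker Compose v2 is installed, letting Compose validate the file should point at the exact spot:

    # Validate and print the resolved configuration; parse errors report the offending line
    docker compose -f docker-compose.yml config

    # Or only validate, without printing the resolved file
    docker compose -f docker-compose.yml config --quiet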

  • Hi,

    I have a Proxmox server ("A") and another server (let's call it "B") that runs an SMB server on SSDs. Both servers have 10G NICs.

    I want to create an LXC container on A that uses the SMB share on B as its target storage.

    I face several problems:

    - Adding the SMB storage via the GUI does not work (error 500).

    - Adding it via "pvesm add cifs" did work.

    - I can create the LXC on the SMB share, but I cannot start it:

    run_buffer: 322 Script exited with status 5
    lxc_init: 844 Failed to run lxc.hook.pre-start for container "407"
    __lxc_start: 2027 Failed to initialize container "407"
    TASK ERROR: startup for container '407' failed
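    I have not yet pulled an LXC debug log; assuming container ID 407 as above and an arbitrary log path, the next step would probably be something like:

    # Check how the CIFS storage is registered
    cat /etc/pve/storage.cfg
    pvesm status

    # Show the container's configuration (rootfs location, etc.)
    pct config 407

    # Start the container directly via LXC with debug logging to see
    # which pre-start step exits with status 5
    lxc-start -n 407 -F -l DEBUG -o /tmp/lxc-407.log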

    Any ideas how to get it running?
