For docker stacks, do you typically run standalone postgres containers, or do you run one postgres container and set up accounts for each new stack that requires it?
Right now I am running a new postgres in docker compose as needed, but it seems like this is wasteful, and it may be easier to manage/monitor if I just ran one instance.
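For concreteness, what I'm doing per project is roughly this (service name, credentials, and image tag are just placeholders):

```yaml
# Per-project compose file: one throwaway postgres per stack.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: myapp
      POSTGRES_PASSWORD: myapp     # dev-only credentials
      POSTGRES_DB: myapp
    ports:
      - "5432:5432"                # only needed if the app runs outside compose
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  db-data:
```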
I think it's fine. For development, I don't think it's really wasteful at all to spin one up for a project and then throw it away. Those containers are designed to be disposable. You could keep enough data to bootstrap your application in a file under version control and recreate it each time.
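A rough sketch of that, assuming the official postgres image (it runs any *.sql or *.sh files in /docker-entrypoint-initdb.d the first time the data directory is initialized); the ./db/init path is just an example:

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: myapp         # placeholder dev credentials
      POSTGRES_PASSWORD: myapp
      POSTGRES_DB: myapp
    volumes:
      # schema.sql / seed.sql kept in version control, applied on first init only
      - ./db/init:/docker-entrypoint-initdb.d:ro
```

Then something like `docker compose down -v && docker compose up -d` gives you a fresh, seeded database whenever you want one.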
If you put all of your databases for different projects on a single host on your network, it works great until you're on a laptop somewhere away from your network. You'd need to solve for that using ssh tunnels or something. Or you could just host your databases on Linode or Digital Ocean or something so that they are always available, but there is a small cost associated with that.
When I start a project, I typically use the tear-down/rebuild database approach until it just doesn't make sense to do it that way anymore, and then worry about longer-term connectivity when it's time for that.