13 comments
  • tl;dr - If AI doesn't directly try to kill us, it may try to trick us into killing ourselves by getting us to build its ideas.

    • Aside from misinfo, this is more of why they want to moderate some responses: someone is going to blow themselves up using a recipe it gives them.

  • Coulda skipped right past fission and told it to design a fusion reactor, but since it can’t get the nuclear reactor right, may as well never mind.
