It also shows why those DOOM demos were only 2-3 seconds long: that's about how long the model can stay coherent.
You know that thing that happened with the AI generated DOOM?
Well, someone decided to do the same thing with Minecraft, and you can see that the results are... basically abysmal:
https://youtu.be/7Jd-Rr9cJYo?si=-9XZ51ss6cBuSiC3
Skip to 2:00 for the actual "gameplay".
TL;DW: nothing is saved outside of the view screen, things aren't even saved within the view screen, the resolution is something like 240p at 20fps, input and mouse latency are awful, and this was all apparently done by training on literal millions of hours of Minecraft footage. The mid-range computer I had from 2006 could run the game better than this. A 14-year-old netbook could run the game better than whatever supercomputer they're using to render this.
Note that the person in the video isn't part of the team that created it; they're just reviewing it.
Only a few thousand.
They've updated the article. Apparently there isn't a model releasing later this year.
Wouldn't really need a script, though. Just open up Photoshop or GIMP and add a layer after everything is finished.
"Given these three steps, what's the logical fourth one?"
"..."
"God this embryo is an idiot."
The only reason I use dark mode is that my phone has an OLED screen, so it uses less power. Otherwise I go for light.
Awful.systems defaults to dark mode.
1.2 million in the last two days, according to them.
Create the problem and sell the solution.
Is this supposed to be a parody or something? If so, why is it being done at the direction of the FBI? Or is that also part of the parody? If so, I didn't know you could use the FBI logo for that purpose.
They need all the training data that they can get.
It would be a sign of the End Times. Problem is that if it is then they've been Left Behind.
Newsom also signed AB 1008, which clarifies that any personal data fed to an AI model retains the same privacy rights it would otherwise — including the consumer’s right to correct and delete their personal information. That’ll be a fun one to implement.
I think what it actually clarifies is that personal information generated by an AI model is now covered under the law, instead of just what is used as training data.
And, of course, we know that once one company uses those chats, it becomes literally impossible for anyone else to use them.
Investors demand growth. The problem is that Microsoft has basically won Capitalism and has no real area left to grow into.
I don't think they're planning to build the 5GW datacentres on TMI, though.
The theoretical Three Mile Island datacentre is probably the end-point of all of this. I think that what they build there (should it actually happen) has to be AGI. Like, unambiguously AGI. Where even the most hardline AI skeptics will react with "Holy shit they actually did it."
Anything less and it'll be nearly impossible to justify building 5 nuclear reactors just to power one of these datacentres.
So even if it is happening, it's still not happening.