Elon Musk's xAI released its Grok large language model as "open source" over the weekend. The billionaire clearly hopes to set his company at odds with
Grokking is actually a concept in ML: the phenomenon where a model's loss suddenly starts dropping long after the model appears to have overfit. Researchers coined the term; I'll let people decide whether it's aptly named, but Elon likely just took it from there.
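To make the shape of that phenomenon concrete, here's a toy sketch (synthetic curves, not a real training run — the specific functions and thresholds are made up for illustration): training loss bottoms out early, validation loss stays high for thousands of steps, then suddenly drops.

```python
import numpy as np

# Toy illustration of the "grokking" loss pattern (synthetic data, not a
# real training run): training loss falls early, but validation loss stays
# high -- the classic overfitting picture -- until it drops much later.
steps = np.arange(0, 10_000)
train_loss = np.exp(-steps / 300)                       # memorizes quickly
val_loss = 2.0 / (1.0 + np.exp((steps - 7_000) / 200))  # drops only near step 7000

# Step where training loss first looks "converged" (arbitrary threshold).
overfit_step = int(steps[np.argmax(train_loss < 0.05)])
# Step where validation loss finally crosses below 0.5 -- far later.
grok_step = int(steps[np.argmax(val_loss < 0.5)])

print(overfit_step, grok_step)  # validation "groks" thousands of steps after overfitting
```

The gap between the two steps is the whole point: by any early-stopping rule the model looks hopelessly overfit long before generalization actually kicks in.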
And we aren’t just talking about a technical quibble, such as picking a usage license that’s not as open as another (Grok is Apache 2.0, if you’re wondering).
It isn't a "technical quibble" to point out that licences like RAIL-M violate several points of the Open Source Definition.
With AI, this is arguably not possible at all, because the way machine learning models are created involves a largely unknowable process whereby a tremendous amount of training data is distilled into a complex statistical representation the structure of which no human really directed, or even understands.
Such a well-worded description. TechCrunch seems to know their shit.