50 million rendered polygons vs one spicy 4.2MB boi
Maybe it's time we invent JPUs (JSON processing units) to level the playing field.
Well, do you have dedicated JSON hardware?
Everybody gangsta till we invent hardware-accelerated JSON parsing.
Render the JSON as polygons?
That is sometimes the issue when your code editor is a web browser in disguise 😅
There are SIMD-accelerated JSON decoders.
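simdjson is the best-known one. A minimal sketch using its Python bindings (pysimdjson), assuming `pip install pysimdjson` and a hypothetical big.json on disk:

```python
import simdjson

parser = simdjson.Parser()
doc = parser.load("big.json")   # SIMD-accelerated parse of the whole file
print(type(doc))                # lazy Object/Array proxy; index it like a dict or list
```

The speedup comes from scanning the raw bytes with vector instructions and decoding fields lazily, instead of eagerly building a full object tree.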
CPU vs GPU tasks, I suppose.
Would you rather have 100,000 kg of tasty supreme pizza, or 200 kg of steaming manure?
Choose wisely.
I have the same problem with XML too. Notepad++ has a plugin that can format a 50 MB XML file in a few seconds, but my current client won't allow plugins to be installed. So I have to use VS Code, which chokes on anything bigger than what I could format by hand if I were determined.
Someone just needs to make a GPU-accelerated JSON decoder.
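Something close already exists: RAPIDS cuDF can read JSON on the GPU. A rough sketch, assuming a CUDA GPU with RAPIDS installed and a hypothetical newline-delimited records.jsonl:

```python
import cudf  # RAPIDS GPU dataframe library

# Parses the file on the GPU; lines=True expects newline-delimited JSON records
df = cudf.read_json("records.jsonl", lines=True)
print(df.head())
```

Whether that helps an editor is another matter, since the result lands in GPU memory as a dataframe rather than a text buffer.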
Works fine in vim.
Reject MB, embrace MiB.
Rockstar making GTA Online be like: "Computer, here is a 512 MB JSON file, please download it from the server and then do nothing with it."
Let it be known that heat death is not the last event in the universe.
You jest, but I asked for a similar (but much simpler) vector/polygon model, and it generated it.
The obvious solution is parsing JSON with GPUs? Maybe not...
C++ vs JavaScript.
I've never had any problems with 4.2 MB (and bigger) JSON files. Which languages/libraries/editors choke on them?
Given that it's the CPU limiting the parsing of the file, I wonder how a GPU-based editor like Zed would handle it.
I've been wanting to try the editor ever since it was partially open-sourced, but I'm too lazy to get around to doing it.