Great job
You left "sudo" off that last frame.
The script will prompt you.
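For context, the usual pattern (just a sketch, not any particular project's installer) is for the script to check whether it's running as root and only reach for sudo when it isn't:

```sh
# Sketch of the "ask for sudo only when needed" pattern; 'mytool' is a placeholder name.
SUDO=""
if [ "$(id -u)" -ne 0 ]; then
    echo "Need root to write to /usr/local/bin; sudo will prompt for your password..."
    SUDO="sudo"
fi
$SUDO install -m 755 ./mytool /usr/local/bin/mytool
```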
Blackarch be like
Some of those can be good if you want a single command to install on any OS.
And not have to wait for a maintainer to update the package to have the latest version.
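The kind of one-liner being talked about (the URL and package name are made up, purely for illustration):

```sh
# The "works on any OS" route (example.com and 'mytool' are placeholders):
curl -fsSL https://example.com/install.sh | sh

# vs. the two or three distro-specific commands it replaces, e.g. on Debian/Ubuntu:
sudo apt-get update
sudo apt-get install mytool
```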
Bash/Sh on Windows? And what's so bad about 2-3 separate commands anyway?
Gets the job done, but it shouldn't be (and isn't intended to be) used by non-programmer end users.
I'm not mad at small programs or developers who don't have much time to set up a distribution pipeline; they should be praised for their work on the program itself. But different OSes have different places to unpack a program, and that's what makes simple updates possible, so we should respect it for consistency on the user's end. Except on Windows, which is an unspecified mess anyway, so let's just unpack everything raw onto C:\ or into the user directory.
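For the record, "respecting where things go" looks roughly like this on Linux (FHS/XDG conventions; 'mytool' is a placeholder name):

```sh
# Conventional Linux locations for a manually installed tool; 'mytool' is a placeholder.
sudo install -m 755 mytool /usr/local/bin/mytool      # the binary, already on $PATH
mkdir -p "${XDG_CONFIG_HOME:-$HOME/.config}/mytool"   # per-user configuration
```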
How much you wanna bet the "dev" doesn't realise chromium is a dependency, in this scenario?
What do you mean you don't have to restart your terminal software every afternoon when the four windows consume six gigabytes of RAM?
I saw a terminal app a few weeks ago that had AI INTEGRATION of all things.
Genuinely, fuzzy search and autocomplete are a great application of "AI" (machine learning algorithms).
They just need to stop branding it as AI and selling everything they feed the models....
Hopefully that day is soon what with those 1-bit models I've been hearing about. I'd be all for that, but I'll be damned if I'll be putting an OpenAI key into my terminal lol.
Do those really require ML? For an e-commerce site with millions of entries, maybe, but for a CLI I don't see it.
But shouldn't that be a feature of the shell (or a shell extension), not the terminal emulator app?
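It can be. Fuzzy history search, for example, is a tiny shell function if fzf is installed (fzf also ships ready-made Ctrl-R bindings for bash and zsh), so no special terminal app is required. A rough sketch:

```sh
# Fuzzy-search shell history and print the chosen command; assumes fzf is installed.
fh() {
    history | fzf --tac --no-sort | sed 's/^ *[0-9]* *//'
}
```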
Warp.dev! It's the best terminal I've used so far, and the best use of AI as well! A bit of AI help is extremely useful for the thousands of small commands you know exist but rarely use. And it's very well implemented.
I don't understand what the benefit is here over a terminal with good non-LLM autocomplete. I understand that, theoretically, LLMs can produce better autocomplete, but idk if it's really that big of a difference with terminal commands. I guess it's a small shortcut to have the AI right there to ask questions, too. It's good to hear it's well implemented, though.
closed source sadly :/
Which one?
came to mind. It uses web technology to make a terminal. I've never used it, so I have no idea if it works well or not.
is-even
Also curious.
It's about the third newest one in the list now.
Which list bruh?
It's really simple: it's a container containing a virtual OS, which runs a browser and a webserver to run the app. The app connects to several external API services to do its thing.
It's like, really simple!
I'm very scared that this might actually be the case in some apps out there.
If they're running a virtual OS in a container, they're doing it very wrong. Containers and VMs are quite different, even on a Windows host.
It probably was very simple for the kid who wrote it, just import everything and write a couple of lines to use all this stuff that already exists!
Gotta love using a base container image that is far too overkill for what you're trying to run.
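Easy to see for yourself; node is just a common example here, and exact sizes vary by version:

```sh
# Same runtime, very different baggage:
docker pull node:latest      # Debian-based image, roughly a gigabyte uncompressed
docker pull node:20-alpine   # Alpine-based image, a small fraction of that
docker images | grep node    # compare the SIZE column
```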
I mean, isn't the entire point of a container largely a non-functional one compared to good deploy/install scripts? Both are perfectly capable of guaranteeing a predictable, functional environment for the app. The container is just easier to use, harder to accidentally render insecure, and easier to clean up.
All of their benefits are NOT for the app itself.
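The "easier to clean up" part in particular; a throwaway run leaves nothing behind outside of whatever you explicitly mount:

```sh
# --rm removes the container when it exits; anything written inside it is gone too.
docker run --rm alpine:3 sh -c 'echo hello from a disposable environment'
```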
I get to witness the enterprise-services flavor of that, where the company pays software architects who aren't actually coding, and coders who aren't allowed to make architectural decisions.
You have software that speaks HTTP? You need to rewrite it so that you only speak RabbitMQ, and use it for every HTTP request or WebSocket message. Don't worry, we have a team that specializes in translating HTTP to RabbitMQ, so you only have to rewrite the server code; another team will handle the HTTP listener that translates for you.
What's that, you have a non-HTTP protocol? Well, the other team isn't scoped to handle that, so you'll need to convert your listener to RabbitMQ and build a whole separate container just to receive the UDP packets and translate them to RabbitMQ. No "processing" software is allowed to speak anything but RabbitMQ, and network listener containers are only allowed to dumbly receive and forward.
Tech hipsters be like: you had me at container!