i - case insensitive
r - recursive
l - only show filenames
I also like to use:
I (capital i) to skip binary files, if I'm in a folder with heavy images/videos/etc
C 3 to show 3 lines around the matched text
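Putting those together could look something like grep -irI -C 3 "pattern" . (just a sketch; I dropped the l here, since listing only filenames would hide the context lines)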
Fun fact: I use "git bash shell" over Windows' cmd just because of grep
grep -irl "some text that the file would have"
(Obviously only works for text files, but that's enough to cover 90% of cases for me)
Yup, the other comments helped me understand the idea better, I definitely cannot relate :/
Not like you described, but some days I taste so much of the food as I go that when I finish cooking I'm already satisfied (though I would definitely not describe it as "not wanting to eat")
I'll just leave this here: https://www.youtube.com/watch?v=q4rL_Lnt6kA
Bernoulli was a mathematician: https://en.m.wikipedia.org/wiki/Bernoulli_distribution
The probability distribution named after him models binary variables
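For reference, a Bernoulli variable takes the value 1 with probability p and 0 with probability 1 - p, i.e. a single yes/no trial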
I would disagree on the basis that Linux upgrades don't require hardware upgrades (unless you have very low-end hardware that's hanging by a thread already)
For example, I don't remember seeing all this fuss about upgrading when people were moving from 8.1 to 10 (but it could just be me in my bubble)
I didn't play much myself, but I believe one thing can be said: if you don't like keyboard-heavy controls, go for the Steam version
It has a more modern UI and graphics; even though you could get some of those features (or all of them) with mods, you'd need to spend some time setting them up
I didn't know about lazy loading, that's cool!
Then I guess the meme doesn't apply anymore. Though I will state that (from my anecdotal experience) people who can use Pandas' most advanced features* are also comfortable with other data processing frameworks (usually more suitable for large datasets**)
*Anything beyond the standard groupby-apply can be considered advanced, from the places I've been
**I feel the urge to note that 60Mb isn't a large dataset by any means, but I believe that's beside the point
It really depends on the machine that is running the code. Pandas will always have the entire thing loaded in memory, and while 600Mb is not a concern for our modern laptops running a single analysis at a time, it can get really messy if the person is not thinking about hardware limitations
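Just to illustrate what I mean by thinking about hardware limitations: if the file doesn't fit comfortably in RAM, Pandas can read it in chunks instead of loading everything at once (a rough sketch; the file and column names below are made up):

```python
import pandas as pd

# Reading in chunks keeps memory bounded: each chunk is its own small DataFrame,
# so you aggregate incrementally instead of holding the whole file at once.
total_rows = 0
running_sum = 0.0
for chunk in pd.read_csv("big_dataset.csv", chunksize=100_000):  # hypothetical file
    total_rows += len(chunk)
    running_sum += chunk["value"].sum()  # "value" is a made-up column name

print("mean of 'value':", running_sum / total_rows)
```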
This story describes well why I love DF, the random stuff is insanely good
Too bad I abandon any fortress where the slightest incident happens :(
I don't like the announcement being in a meme community, but this does look like one of the few nice LLM applications
Urist's secret