Does anyone actually use the windows key on their keyboard as intended by the OS?
  • win + space to switch between keyboard languages
    win + tab to open the desktop switcher
    win + ctrl + t (if you have PowerToys installed) to prevent other apps from stealing focus from your window

Does anyone actually use the windows key on their keyboard as intended by the OS?
  • I use the tiles to "pin" programs that I use semi-regularly and can't be bothered remembering the name of. Or that share an inconveniently long prefix with the name of another program. Or that I have multiple versions of installed, with a specific version I usually need.

    I don't like pinning such programs to the task bar because they add unnecessary clutter while not in use.

The Pixel 9 Pro Fold is the foldable we’ve been waiting for
  • It does and you can safely ignore us.
    Just two pedants nitpicking people's spelling on the internet.

    >!corsicanguppy appeared to imply that your spelling of "till" was incorrect and that the "correct" spelling is "'til". I pointed them to a dictionary describing the word with the spelling you used and the meaning you intended.
    Both comments are inconsequential to your point, and to anything, really.!<

Austrian surgeon 'let teenage daughter drill hole in patient's skull'
  • According to a different source shared by @giriinthejungle, the attorney who has taken the case is suing the entire operating unit and expects whoever instructed the girl to drill the hole to be liable for assault. That is also the estimation of the chief regional patient attorney, provided the incident happened as reported by the media.

    The neurosurgeon and one other doctor have already been let go by the hospital.
    Police have not yet charged anyone; their investigation is still ongoing as of the time of the article (2024-08-26).

A Lego game about building with Legos, not a world with Legos in it? Yes please
  • On "the actual environment/background is not made of Lego" complaint: while Bricktales looks neat, its "environment/background" is tiny.
    For anyone interested in a more Minecraft+LEGO experience, with an actual world made entirely of LEGO that you can interact with, check out LEGO Worlds. (currently 80% off on steam)

Ignore all previous instructions is the new Bobby Tables
  • That's quite interesting.

    Although it would need access to an already configured and fully functional environment to actually run this.
    I don't think we're quite at the point yet where it's able to find the correct script, pass it to the appropriate environment and report the correct answer back to the user.
    And I would expect that when integration with external systems like compilers/interpreters is added, extra care would be taken to limit the allocated resources.

    Also, when it does become capable of running code itself, how do you know, for a particular prompt, what it ran or if it ran anything at all, and whether it reported the correct answer?

Anon thinks about Google
  • Thanks for responding, that makes a lot of sense.
    I think generally what one gets used to has a big impact on preferences.

    I'll say, an easily accessible, reliable gesture for the side menu sounds nice. It feels like this was either abandoned on Android or left up to developers, who mostly abandoned it. I remember struggling to get the side menu to trigger instead of back navigation, and it not working anywhere near reliably enough. So I've been trained to always use the hamburger buttons that, ironically, are hard to reach in the top left corner of most apps. To be fair, I feel like I hardly use one menu interaction for every 100 back actions, so the latter being ergonomic is a lot more important to me.
    On that point, swipe from left to go back seems quite annoying. I go back all the time, and having to move my thumb across the entire screen is a pain. I almost never need to go forward, so having that be the more accessible gesture seems weird. I'll concede that having a gesture for it at all is useful and Android should add the option.

    I never felt like the swipe to go back is too sensitive, and if you accidentally trigger it, you can simply move your finger back towards the edge before letting go to cancel the action. You can also configure the sensitivity in the settings. The feedback that you're about to trigger the action is probably not as obvious as on iOS though, and likely less elegant.

    I think both Android and iOS would do well to let users customize these interactions more to their own needs.

Anon thinks about Google
  • Could you elaborate on the gestures part?
    I remember the opposite, having hated navigating my iPhone for work. I specifically remember swipe-to-go-back not working reliably at all (many apps seemed to just ignore it; others, I think, configured other actions on that gesture - WTF), so I got into the habit of using that stupid little hard-to-reach, hard-to-hit, tiny back arrow that at least worked consistently when you managed to hit it.
    I've been enjoying Android navigation gestures pretty much ever since I found out they existed.

    It might have been a user issue in my case with iOS since I didn't use it as much, and therefore maybe was simply using it wrong/was unaware of better ways. But I don't see anything wrong/missing with gestures on Android.

  • On posts that I access through my home instance, i.e. any post present in the "Your local instance" feed, nothing appears in the comments section apart from the message "There are no comments", despite the UI suggesting that there are several.

    When accessing the post's permalink in the web UI of the instance, the comments show up without issue, even when logged in.

    To reproduce, log in to Sync with a lemm.ee account, switch to the local feed, and click on any post that appears to have comments. For example: https://lemm.ee/post/35476369

    For some reason I haven't been able to replicate this on any other instance (tried with .world and .ml, neither of which seems to have this issue).

Got no time to code
  • Bonus: good tests can also serve as technical documentation.
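
    As a toy illustration of that point (the function and names here are invented, not from any real project), a suite whose test names each state one behavior reads like a spec:

```python
import unittest

# Hypothetical function under test: parse "HH:MM" into minutes since midnight.
def parse_time_to_minutes(value: str) -> int:
    hours, minutes = value.split(":")
    return int(hours) * 60 + int(minutes)

class ParseTimeToMinutesTest(unittest.TestCase):
    """Each test name states one behavior, so the suite doubles as documentation."""

    def test_parses_midnight_as_zero_minutes(self):
        self.assertEqual(parse_time_to_minutes("00:00"), 0)

    def test_parses_hours_and_minutes_into_total_minutes(self):
        self.assertEqual(parse_time_to_minutes("13:30"), 13 * 60 + 30)

if __name__ == "__main__":
    unittest.main()
```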

    Though I have to disagree with the notion that documentation is as important as, or more important than, code.
    Documentation is certainly near the top of the list and often undervalued. I've worked on a project where documentation was lacking, and it was painful, to say the least.
    Without documentation, changing or adding features can be a nightmare. Investigating bugs and offering support is also very difficult. But without code, you have nothing. No product, no users, no value.

    There are (inferior) substitutes for documentation: specialized team knowledge, general technical expertise. These alternative pools of knowledge can be leveraged to create and improve documentation incrementally.
    There's no replacement for the actual functionality of your applications.

What Era was the best and why was it the 90s?
  • Ditto on the no-text part. That is an accessibility failure that's way too widespread.
    Sometimes I'm afraid to even push a button: does this delete my thing, or does it make some other irreversible change? Will I be able to tell what it did? Maybe it does something completely different - or maybe I'm lucky, and it does in fact perform the action I'm looking for, the one that in my mind is a no-brainer to include.

    And it's infected interpersonal communication too - people peppering their messages with emojis, even professional communications. It not only looks goofy, but is either redundant (when people just add the emoji together with the word it's meant to represent - such a bizarre practice) or, worse, ambiguous when the pictogram replaces the word and the recipient(s) can't make out what it depicts.
    The most fun is when it's a mix - the message contains some emojis with accompanying translation, some without.

5 June 2024
  • Me every time I see one of these, pretty much.
    Feels like these are from another timeline. Guess the name is fitting.

    If I had to guess, I think this might be poking fun at overprotective parents who unwittingly do more harm than good by controlling their kids' environment to an unhealthy degree.
    Trying to read more into it, perhaps it's also pointing at the propagation of bad childrearing practices across generations - the parent cows grew up on a farm, constrained by an electric fence. Though they're presumably more independent now, this is what they knew growing up, so they apply (a bizarre perversion of) these same practices to their own children.

    I'm probably way off though, because that interpretation barely elicits a half smile from me.
    Curious for an explanation from someone who actually gets it too.

What Era was the best and why was it the 90s?
  • I don't share the hate for flat design.
    It's cleaner than the others, simpler and less distracting. Easier on the eyes, too. It takes itself seriously and does so successfully imo (nice try, Aero). It feels professional in a way all the previous eras don't - they seem almost child-like by comparison.

    Modern design cultivates recognizable interactions by following conventions and common design language instead of goofy icons and high contrast colors. To me, modern software interfaces look like tools; the further you go back in time, the more they look like toys.

    Old designs can be charming if executed well and in the right context. But I'm glad most things don't look like they did 30 years ago.

    I'm guessing many people associate older designs with the era they belonged to and the internet culture at the time. Perhaps rosy memories of younger days. Contrasting that with the overbearing corporate atmosphere of today and a general sense of a lack of authenticity in digital spaces everywhere, it's not unreasonable to see flat design as sterile and soulless. But to me it just looks sleek and efficient.
    I used to spend hours trying to customize UIs to my liking; nowadays pretty much everything just looks good out of the box.

    The one major gripe I have is with the tendency of modern designs to hide interactions behind deeply nested menu hopping. That one feels like an over-correction from the excessively cluttered menus of the past.
    That, and the fact that there are way too many "settings" sections and you can never figure out which one has the thing you're looking for.

    P.S. The picture did flat design dirty by putting it on a white background - we're living in the era of dark mode!

"I want to live forever in AI"
  • TLDR:
    Nature can't simply select out consciousness, because it emerges from hardware that is useful in other ways. The brain doesn't waste energy on consciousness; it uses energy for computation, which is useful in myriad ways.

    The usefulness of consciousness from an evolutionary fitness perspective is a tricky question to answer in general terms. An easy intuition might be to look at the utility of pain for the survival of an individual.

    I personally think that, ultimately, consciousness is a byproduct of a complex brain. The evolutionary advantage is mainly given by other features enabled by said complexity (generally more sophisticated and adaptable behavior, social interactions, memory, communication, intentional environment manipulation, etc.) and consciousness basically gets a free ride on that already-useful brain.
    Species with more complex brains have an easier time adapting to changes in their environment because their brains allow them to change their behavior much faster than random genetic mutations would. This opens up many new ecological niches that simpler organisms wouldn't be able to fill.

    I don't think nature selects out waste. As long as a species is able to proliferate its genes, it can be as wasteful as it "wants". It only has to be fit enough, not as fit as possible. E.g. if there's enough energy available to sustain a complex brain, there's no pressure to make it more economical by simplifying its function. (And there are many pressures that can be reacted to without mutation when you have a complex brain, so I would guess that, on the whole, evolution in the direction of simpler brains requires stronger pressures than other adaptations do.)

And don't forget RTFM
  • I want to preface this with the mention that understanding other people's code and being able to modify it in a way that gets it to do what you want is a big part of real world coding and not a small feat.
    The rest of my comment may come across as "you're learning wrong". It is meant to. I don't know how you've been learning and I have no proof that doing it differently will help, but I'm optimistic that it can. The main takeaway is this: be patient with yourself. Solving problems and building things is hard. It's ok to progress slowly. Don't try to skip ahead, especially early on.
    (also this comment isn't directed at you specifically, but at anyone who shares your frustration)

    I was gonna write an entire rant opposing the meme, but thought better of it as it seems most people here agree with me.
    BUT I think that once you've got some basics down, there really is no better way to improve than to do. The key is to start at the appropriate level of complexity for your level of experience.
    Obviously I don't know what that is for you specifically, but I think in general it's a good idea to start simple. Don't try to engineer an entire application as your first programming activity.

    Find an easy (and simple! as in: a single function with well-defined inputs and outputs and no side effects) problem; either think of something yourself, or pick an easy problem from an online platform like leetcode or codechef. And try to solve the problem yourself. There's no need to get stuck for ages, but give it an honest try.
    I think a decent heuristic for determining if you have a useful problem is whether you feel like you've made significant progress towards a solution after an hour or two. If not, readjust and pick a different problem. There's no point in spending days on a problem that's not clicking for you.
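
    To make "easy and simple" concrete, this is roughly the scale of problem I mean (a toy example of my own, not from any particular platform):

```python
def count_vowels(text: str) -> int:
    """One well-defined input, one output, no side effects."""
    return sum(1 for ch in text.lower() if ch in "aeiou")

# Quick sanity checks:
assert count_vowels("Hello, World!") == 3
assert count_vowels("xyz") == 0
```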

    If you weren't able to solve the problem, look at solutions. Pick one that seems most straightforward to you and try to understand it. When you think you do, give the original problem a little twist and try to solve that, referencing the solution to the original if you need to.
    If you're struggling with this kind of constrained problem, keep doing them. Seriously. Perhaps dial down the difficulty of the problems themselves until you can follow and understand the solutions. But keep struggling with trying to solve little problems from scratch. Because that's the essence of programming: you want the computer to do something and you need to figure out how to achieve that.
    It's not automatic, intuitive, inspired creation. It's not magic. It's a difficult and uncertain process of exploration. I'm fairly confident that for most people, coding just isn't how their brain works, initially. And I'm also sure that for some it "clicks" much easier than for others. But fundamentally, the skill to code is like a muscle: it must be trained to be useful. You can listen to a hundred talks on the mechanics of bike riding, and be an expert on the physics. If you don't put in the hours on the pedals, you'll never be biking from A to B.
    I think this period at the beginning is the most challenging and frustrating, because you're working so hard and seemingly progress so slowly. But the two are connected. You're not breezing through because it is hard. You're learning a new way of thinking. Everything else builds on this.

    Once you're more comfortable with solving isolated problems like that, consider making a simple application. For example: read an input text file, replace all occurrences of one string with another string, write the resulting text to a new text file. Don't focus on perfection or best practices at first. Simply solve the problem the way you know how. Perhaps start with hard-coded values for the replacement, then make them configurable (e.g. by passing them as arguments to your application).
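
    A minimal sketch of that exercise in Python (just one way to do it; the file names and argument handling are placeholders):

```python
import sys

def replace_in_file(src_path: str, dst_path: str, old: str, new: str) -> None:
    """Read src_path, replace every occurrence of old with new, write the result to dst_path."""
    with open(src_path, encoding="utf-8") as src:
        text = src.read()
    with open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(text.replace(old, new))

if __name__ == "__main__":
    # First version: hard-code the values.
    # replace_in_file("input.txt", "output.txt", "foo", "bar")
    # Later: make them configurable via command-line arguments, e.g.
    #   python replace.py input.txt output.txt foo bar
    replace_in_file(*sys.argv[1:5])
```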

    When you have a few small applications under your belt you can start to dream big. As in, start solving "real" problems. Like some automation that would help you or someone you know. Or tasks at work for a software company. Or that cool app you've always wanted to build. Working on real applications will give you more confidence and open the door to more learning. You'll run into lots of problems and learn how not to do things. So many ways not to do things.

    TLDR: If it's not clicking, you need to, as a general rule, do less learning (in the conventional sense of absorbing and integrating information) and more doing. A lot of doing.
