I'm sure this has been posted here before, but it's too good not to repost.
I delivered pizza during COVID and most people I worked with couldn't follow simple directions to an address or read a road map. If a destination didn't show up on their cellphone's navigation then they were immediately and hopelessly lost.
If you don't use and exercise your brain, it atrophies and dies. AI is going to turn a lot of people into conscious vegetables.
We need to teach people curiosity. I use my GPS all the time because of construction and stuff but I also look at the route before I leave so that I know where I’m headed on my own, too. Meanwhile I know people who’ve lived in a city for decades and still can’t get around it without help.
We need to teach people curiosity.
This is called being a lifelong learner. Learning something new every week, or even daily, no matter how small, will always improve your life. It keeps your mind active and it adds to your problem solving.
So tell me how using a technology that can summarize topics, create transcripts from meetings, and act like a teacher you can ask questions of prevents this from happening? We need to teach people how to use the tools at hand; pretending they don't exist won't put the genie back in the bottle, it will only further exacerbate the problem. Yes, using gen AI to write your paper for you is a terrible use case. On the other hand, feeding it a research paper and asking it to break the material down into simpler topics so one can build their knowledge, or asking it to help create a bibliography so you can focus on the information at hand instead of trying to remember the syntax for the myriad ways one can cite sources, is extremely useful and helps create lifelong learners.
Summarizing topics is nothing more than Cliffs Notes, and if you got caught using those, you were busted. You needed to do the work and read the whole thing to complete the assignment. Shortcuts mean you lose things that may be important.
Transcripts are fine if you were actually there, but voice-to-text is never perfect. People have accents, and computers mess up words that sound alike, accent or not. People don't always pronounce things correctly.
Asking an LLM to teach you something is never going to work out until the creators specifically feed it valid, true information instead of scraping the internet and people's text messages. And then you'd need to teach it to think like a Human, which it never will.
Feeding it a research paper seems like it might work out, but that deprives you of the chance to problem solve. You need to learn to be organized, take notes in a structured manner, and choose what you believe is the pertinent information in that paper. You participate instead of passively being told what it is. This is a brain-expanding activity. You are engaged; that's how we learn.
I am very pro computer and automation. Computers are there to save us time on tasks that take a lot of time, and on repetitive tasks. Screwing bolts onto tires in a car factory for 8 hours is hard on Humans; robots can do it. But having AI write junk articles that make no sense to fill up websites is a greedy money grab, and it distorts facts. I don't need Google telling me to put glue in my pizza cheese, or to shove my dick in a loaf of bread to see if it's done. And now all the 'AI' owners want to scan every personal thing you have on your phone, computer, and social media, and here in the US, all of our private government data.
Welcome to 1984, run by clowns. No one is putting in the hard work required to make any of the public tools do what is claimed on the label. It's just invasive technology right now that produces less than stellar products and infringes on so many Human Rights in the process.
I'm just going to leave this here. Title is a bit clickbait-y, but it's a good video. https://youtu.be/DNE0sy7mR5g
Absolutely.
Thinking about it, our school systems do prioritize memorizing just enough information to pass a test, and then people just kinda forget it all because they never really got a chance to internalize it. The best teacher I ever had earned that title from me because he took the main curriculum and threw it out, teaching us instead how to be comfortable and confident with the CAD program. When the other class (taught by the moron who wrote the curriculum, even) joined us the semester after, they basically had to be retaught because they retained nothing over the Christmas break, and the rest of us just sat there until they figured it out.
It ends up discouraging "frivolous" learning, demanding we learn not only specific stuff but so much of it that there's no way we can actually absorb it. It's the difference between letting a sponge soak in a bucket and just dipping it in the ocean.
I have this problem with a bunch of new hires. I'll show them another way to do something and they'll ask, "Oh, where was that written down?" I say, just think about what I just did and how it makes sense; it's not written down, this is a neat trick I'm showing you. I swear there is no creativity or critical thinking anymore, just a bunch of automatons who follow protocol to the letter, and the second there is a situation outside those very narrow parameters they just implode. Someone had to figure all of this out at one point and write the protocol in the first place; sometimes there is no step by step guide and you need to exercise judgement and make some decisions on your own.
I have this problem with a bunch of new hires.
sometimes there is no step by step guide and you need to exercise judgement and make some decisions on your own.
They probably think they aren't paid enough to care, so they won't exert mental effort beyond strict requests and step-by-step instructions. And if they don't expect to have a future to look forward to, why worry about progressing in a career?
Because that's how I treat my job. I don't get paid nearly enough to try beyond the absolute bare fucking minimum.
Humans are unlearning how to adapt.
Pretty sure this has been happening for decades. The "problem" (it's not a problem) is navigation systems, not LLMs.
I don't think they were blaming AI for the inability to follow directions without a GPS... They were making an analogy.
The root issue is the same: delegating more and more rational thought to machines.
The root of the issue is that people are tired of seeing little reward for their effort while those at the top rake in all the benefit with little effort of their own. This teaches people that putting in effort amounts to nothing, and that thought process then permeates everything they do. It's not the tools causing the issue; it's our society's inaction on reducing inequality at large, which removes the incentive to be ambitious and eventually creates people with low drive.
Yup. How many phone numbers do you have memorized? If you're from the era before cellphones, you had to memorize numbers or carry a cheat sheet. You probably had anywhere from 10 to 30 numbers memorized. Now people don't even know their spouse's number.