AI chatbots unable to accurately summarise news, BBC finds
-
-
Um… yeah? It’s not supposed to? Let’s ignore how dangerous and foolish it would be to give LLMs admin control of a system. The thing that prevents it from doing that is, well, the LLM has no mechanism to do that. The best it could do is ask you to open a command line and give you some code to put in. It’s kind of like asking Siri to preheat your oven. It doesn’t have access to your oven’s system.
You COULD get a digital-only stove, and the LLM could be changed to let it reach outside itself, but it’s not there yet, and with how much Siri misinterprets things, there would be a lot more fires.
-
Anyone blindly saying a tool is ineffective for every situation that exists in the world is a tool themselves.
-
I noticed that. When I ask it about things that I am knowledgeable about or simply wish to troubleshoot, I often find myself having to correct it. This does make me hesitant to follow the instructions it gives on something I DON'T know much about.
-
I am a creative writer (as in, I write stories and stuff), or at least I used to be. Sometimes when talking to ChatGPT about ideas for writing it can be interesting, but other times it's kind of annoying, since I am more into fine-tuning than having it inundate me with ideas that I don't find particularly interesting.
-
Lame platitude
-
Wrong thread?
-
I'm pretty sure that every user of Apple Intelligence could've told you that. If AI is good at anything, it isn't things that require nuance and factual accuracy.
-
Oh yes. The LLM will lie to you, confidently.
-
No better time to get into self hosting!
-
You're supposed to gatekeep code. There is nothing wrong with gatekeeping things that aren't hobbies.
If someone can't explain every change they're making and why they chose to do it that way, they're getting denied. The bar is low.
-
Exactly. I think this is a good barometer for gauging whether or not you can trust it. Ask it about things you know you're good at or knowledgeable about. If it gives good information, the kind you would give out yourself, then it is probably OK. If it is bullshitting you or making you go 'uhh, no, actually...', then you need to do more old-school research.