The future sucks
-
This post did not contain any content.
-
This post did not contain any content.
Eh, bad take IMO. This is one of the few places where AI shines: you no longer need to go to a recipe website at all. You just ask it for a recipe, it gives you one, and then you can discuss different variants, etc.
-
Eh, bad take IMO. This is one of the few places where AI shines: you no longer need to go to a recipe website at all. You just ask it for a recipe, it gives you one, and then you can discuss different variants, etc.
Like those that use glue to keep cheese from sliding off the pizza.
In case you're not joking, please don't trust this technology with anything that you are putting into your or someone else's body. You're going to have a bad time.
-
Eh, bad take IMO. This is one of the few places where AI shines: you no longer need to go to a recipe website at all. You just ask it for a recipe, it gives you one, and then you can discuss different variants, etc.
I respectfully disagree. While you might be able to get some pointers, I would not trust LLMs with ingredient quantities (given that swapping a number or unit of measure is easy and would go unnoticed).
So while I could understand asking "should I put bell peppers on this dish?", I would never trust its answer to "how much bell pepper should I put in the recipe?" (which I believe is what recipes are about)
-
This post did not contain any content.
There's a Firefox extension that filters out everything irrelevant to the recipe. At this point, as soon as I open the browser, half my resources go into reverting enshittification.
-
Like those that use glue to keep cheese from sliding off the pizza.
In case you're not joking, please don't trust this technology with anything that you are putting into your or someone else's body. You're going to have a bad time.
It's too late buddy
Similarly, a February study from the University of Sydney, which surveyed more than 2,000 adults, reported that nearly six in ten respondents had asked ChatGPT at least one high-risk health question—queries that would typically require professional clinical input.
https://observer.com/2025/05/openai-chatgpt-health-care-use/
Also, please don't go blindly believing all the advice you're given. You obviously don't use glue on a pizza, the same way you don't follow Google Maps through a river or off a pier.
-
I respectfully disagree. While you might be able to get some pointers, I would not trust LLMs with ingredient quantities (given that swapping a number or unit of measure is easy and would go unnoticed).
So while I could understand asking "should I put bell peppers on this dish?", I would never trust its answer to "how much bell pepper should I put in the recipe?" (which I believe is what recipes are about)
I would never trust its answer to “how much bell pepper should I put in the recipe?” (which I believe is what recipes are about)
I mean, to be fair, you're free to click on the links if you want to verify these things, no?
-
In case you’re not joking, please don’t trust this technology with anything that you are putting into your or someone else’s body. You’re going to have a bad time.
It's too late buddy
Similarly, a February study from the University of Sydney, which surveyed more than 2,000 adults, reported that nearly six in ten respondents had asked ChatGPT at least one high-risk health question—queries that would typically require professional clinical input.
https://observer.com/2025/05/openai-chatgpt-health-care-use/
Also, please don't go blindly believing all the advice you're given. You obviously don't use glue on a pizza, the same way you don't follow Google Maps through a river or off a pier.
So in this case you would have to go to another website to find a real recipe anyway.
Just use the glue like a good acolyte!
-
So in this case you would have to go to another website to find a real recipe anyway.
Just use the glue like a good acolyte!
Just use the glue like a good acolyte!
I personally wouldn't, but I'm scared I might be talking to someone who drinks it.
So in this case you would have to go to another website to find a real recipe anyway.
Right, have you used Perplexity at all?
-
Just use the glue like a good acolyte!
I personally wouldn't, but I'm scared I might be talking to someone who drinks it.
So in this case you would have to go to another website to find a real recipe anyway.
Right, have you used Perplexity at all?
I heard Perplexity eats electricity, which makes it even dumber than me.
-
I would never trust its answer to “how much bell pepper should I put in the recipe?” (which I believe is what recipes are about)
I mean, to be fair, you're free to click on the links if you want to verify these things, no?
So since we have to manually verify everything anyway, the LLM just becomes a mere search engine.
This contradicts your entire claim that it was useful in the first place, because we would still have to visit those websites.
-
I would never trust its answer to “how much bell pepper should I put in the recipe?” (which I believe is what recipes are about)
I mean, to be fair, you're free to click on the links if you want to verify these things, no?
Why use the AI in the first place then? Just search for the actual recipe sources from the start.
-
Eh, bad take IMO. This is one of the few places where AI shines: you no longer need to go to a recipe website at all. You just ask it for a recipe, it gives you one, and then you can discuss different variants, etc.
I've seen too many hallucinations specifically with this to even want to try it.
-
So since we have to manually verify everything anyway, the LLM just becomes a mere search engine.
This contradicts your entire claim that it was useful in the first place, because we would still have to visit those websites.
Why do I feel like I'm teaching toddlers how basic AI works
-
I've seen too many hallucinations specifically with this to even want to try it.
What AI were you using? I'm curious (and expecting either a Google AI summary or no response)
-
This post did not contain any content.
Add “cooked.wiki/“ in front of any recipe URL to preserve your sanity
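In other words (a minimal sketch; it assumes cooked.wiki accepts the full original URL appended directly after its domain, as the tip implies, and the example recipe URL is made up):

```python
# Sketch of the cooked.wiki trick: prepend the service's domain to a
# recipe URL to get the de-cluttered view. Assumption: cooked.wiki
# accepts the whole original URL pasted after "cooked.wiki/".
def cooked_wiki_url(recipe_url: str) -> str:
    return "https://cooked.wiki/" + recipe_url

print(cooked_wiki_url("https://example.com/best-banana-bread"))
# https://cooked.wiki/https://example.com/best-banana-bread
```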
-
Add “cooked.wiki/“ in front of any recipe URL to preserve your sanity
Time to make a recipe website that hides the recipe in JavaScript but leaves the story in HTML.
-
Time to make a recipe website that hides the recipe in JavaScript but leaves the story in HTML.
Satan: “Alright, let’s all calm down for just a moment.”
-
Why do I feel like I'm teaching toddlers how basic AI works
Because you're being a pretentious asshole who doesn't understand how AI works yourself, and you can't argue against "it isn't reliable for recipes since it hallucinates". It's either that, or you're the only smart person in this thread. Not sure which.
-
Because you're being a pretentious asshole who doesn't understand how AI works yourself, and you can't argue against "it isn't reliable for recipes since it hallucinates". It's either that, or you're the only smart person in this thread. Not sure which.
it isn't reliable for recipes since it hallucinates
This is how it goes:
- Don't use AI
- See memes about AI getting it wrong
- Believe that AI gets it wrong 100% of the time
You guys are just as bad as Trump supporters:
- Don't have an EV
- See memes about EVs catching fire
- Believe that EVs catch fire all the time
8/10
I'm impressed
https://youtu.be/Ci-Evf8nQH4?t=934
Lemmy users:
Look out, you'll die if you use AI to make some food! Don't even use it for recommendations, ideas, different things you can try, how to do a specific thing, or a slight variant, because you might drink battery acid by mistake!!!1