Still much appreciated, gifted it to some cousins of mine who are the age I was when I read it.
And again, what ChatGPT here had to offer was utterly – absolutely – false.
Like a fluent and practised (but unwise) liar it had contrived an account that fitted only the available information."
A fundamental flaw in modern "AI" is that it has no model of reality against which it checks its (or anyone else's) words. It isn't even lying: it has no concept of the difference between truth and lies, so it cannot be made to stop "lying"; it just produces language that sounds good.
This makes it a passable tool for sales, but an extremely poor tool for anything that requires accuracy of any kind. It's not that it is trying to be accurate and failing (in which case further work might be expected to improve it); it is not attempting to be accurate at all, and indeed the concept of accuracy is nowhere in its architecture. Vocabulary, grammar, syntax, yes; accuracy or truth, no. It will not get better at being truthful as it is worked on; it will only get better at being persuasive.
For the early ones, the reality against which they were checking their words was their training corpus. Now they've also got RAG (retrieval-augmented generation) and search.
In the context of "find a story that fits this description", even the original training corpus was pretty effective. Not perfect, but pretty good… for stuff that was well represented.
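To make the grounding idea concrete, here's a toy sketch of the retrieval step behind RAG. Everything in it is illustrative rather than any particular product's pipeline, and `embed` is a deliberately crude bag-of-words stand-in for a real embedding model:

```python
# Toy sketch of retrieval-augmented generation (RAG): before answering,
# pull the most relevant passages from a corpus and ground the prompt in
# them, so the model has something external to its weights to check against.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Crude bag-of-words stand-in for a real neural embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: list[str], k: int = 3) -> list[str]:
    q = embed(query)
    return sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    # The retrieved passages are what the answer is supposed to be checked against.
    passages = "\n---\n".join(retrieve(query, corpus))
    return f"Answer using only these passages:\n{passages}\n\nQuestion: {query}"
```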
If vocabulary, grammar, and syntax were all Transformer models could do, they would never have been usable as more than a very light form of autocomplete.
Even word2vec got analogies (man is to woman what king is to queen, etc.).
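For anyone who hasn't seen it, the analogy trick is plain vector arithmetic on the embeddings. A minimal sketch using gensim's pretrained word2vec vectors (the dataset name is one of gensim's bundled downloads; the exact neighbour and score vary by model):

```python
# king - man + woman ≈ queen, via vector arithmetic on word embeddings.
import gensim.downloader as api

# Pretrained Google News word2vec vectors; downloads on first use (large).
model = api.load("word2vec-google-news-300")

print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
# -> [('queen', ...)] on most pretrained vector sets
```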
> it will only get better at being persuasive
I kinda agree, unfortunately: it will *also* keep getting better at being persuasive, and this is indeed a bad thing. "LGTM" is easier to test for than "correct" — but to me, that's an example of humans having limits on how well we can model reality, how well we can differentiate between truth and lies, etc.
But the author's absolutely right to warn that it also regularly fails us, and also right to celebrate the folks who are trained specifically in finding this sort of information in ways that the automated tools can't replicate yet.
This has yet to help. When it can find the story, it (so far) hasn't needed to change the details I provided; when it doesn't know, it changes them to something thematically similar and still doesn't find what I wanted (and if I insist on story elements it had changed, it says something along the lines of "no results found", but with more flowery language).
Perhaps somebody with more compute than they know what to do with could try this with Project Gutenberg as a proof of concept.
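A rough sketch of what that proof of concept might look like, assuming a hypothetical local mirror of Gutenberg texts and using sentence-transformers for the embeddings; the model name, chunk size, paths, and query are all my own assumptions:

```python
# Sketch: embed plot-sized chunks of each Project Gutenberg text, then
# search them with a natural-language description of a half-remembered story.
from pathlib import Path
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose model

chunks, sources = [], []
for path in Path("gutenberg/").glob("*.txt"):    # hypothetical local mirror
    text = path.read_text(errors="ignore")
    for i in range(0, len(text), 2000):          # naive fixed-size chunking
        chunks.append(text[i:i + 2000])
        sources.append(path.name)

corpus_emb = model.encode(chunks, convert_to_tensor=True)
query_emb = model.encode(
    "plot description of the half-remembered story goes here",
    convert_to_tensor=True,
)

for hit in util.semantic_search(query_emb, corpus_emb, top_k=5)[0]:
    print(sources[hit["corpus_id"]], round(hit["score"], 3))
```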
I put a short piece of it into Google Books and it found it! I asked ChatGPT about the actual book and it claimed to know it!
It was a book called Blood Knots by Luke Jennings. I bought it, and before I read it I saw the friend who sent me the excerpt, and gave it to him. A year later I saw the same book, shelf-soiled, in an independent store. It was worth the wait; it was a great read.
I also saw David Allen Green (author of the above) ask his question on Bluesky on my first day using it. Somehow I feel part of this story.
I'd love the satisfaction of tracking it down some day just like this person did.
Maybe you clever folk will be more ingenious than I've been.
The story goes like this:
An old king's life is upended one day when a beautiful, mysterious woman appears and says she'll grant youth and her hand in marriage to the man who completes some challenges. The only challenge I remember is that she sets up three cauldrons: one of boiling oil, one of milk, and one of ice-cold water.
The king wants to see it work first, so he points to a random little boy and orders him to jump into the cauldrons or be put to death.
The boy leaps into each cauldron and there's a terrible delay on the last one. The cold water cauldron even freezes over.
The boy breaks out of the last cauldron and has been transformed into a strapping young man.
The king, seeing proof that it works, decides to jump into the cauldrons. However, when he hops out, he's still an old man.
The woman announces that the magic only works once, and she and the stable boy walk away together, arm-in-arm.
...
I've searched for it online a fair bit but I've never found it.
Some details from my memory:
* The cartoon was very short (less than 30 minutes; probably closer to 10 or 15)
* It had no dialog, only sound effects and music.
* A woman's voice narrated it. I can still hear her.
* Now that I'm grown, I see it having a Slavic or Russian aesthetic.
* The woman had black hair and a long white dress.
* The king was very short with a big white beard.
* The boy, when he turns into a man, has pointy boots and shoulder pads. :)
* Probably made between 1975 and 1985
* Part of an anthology (many cartoons on one VHS tape... ours had been run so much that it started to skew and stretch the image)
...In my mind, it's aesthetically very similar to an heirloom that my grandmother made and I assume that's why I've always wanted to find it.
ChatGPT and the intertubes in general haven't been very useful.
Maybe it was in fact produced in Russia or one of the former Warsaw Pact countries? They had their own animation tradition, and some of it reached the West in translation (like the Czech series The Little Mole), but I can easily see how such works would be very obscure to English-language searches.
Sadly, I can't recall any more about it than that, but maybe it'll help that you're not alone. And of course this could be nothing at all related to what you're after.
This doesn't quite fit several of the points you remember (very much in line with the post!), so perhaps it was some other edition of that same story.
EDIT: so, I just plugged your description into ChatGPT and it gave the exact same answer, including an identical timestamp! Weird.
tl;dr: it is actually (literally) "written", just not all in one place; it's scattered across various sources.
https://chatgpt.com/share/678061d1-7920-8000-9b72-0b2d7ea60e...
I gave up trying to get it to fix its answer.
> The second is that nowadays the real problem perhaps is not with Christmas decorations staying up too late, but with them going up too early, and with shops selling Christmas wares and playing Christmas music well before Advent, let alone Christmas