People warn you against regarding neural nets as having anything to do with the actual human brain, but a lot of the stuff my 3yo says does sound like GPT-2 output.
Like, she'll start a long sentence but lose track of the grammar and content by the end, and just start filling in bits that sound roughly appropriate.
example I happened to record today: “Science animals are just animals in movies, Daddy, or animals that don’t really know that their friends are at parks. That’s why they’re science animals.”
"Today is February of the March."
“I don’t want to be your mommy anymore, Mommy, because I DON’T KNOW HOW TO READ AND I DON’T WANT TO LEARN!!”
“I’m making myself look exactly like me!”
“I’m making Latin words — in a pot!”
“California has the evil spirits and monsters.”
It’s developing insights: “‘Jealous’ means you don’t like something, but ‘jelly’ means yum yum, and ‘yum yum’ means you like something!”
“The butter smells like GPS device.”