Thank you for sharing this! It's hilarious! Reminds me of how in movies or TV series actors pretend to knit (it's horrible to watch as a knitter). An LLM likewise imitates knowledge but does not make sense. It's a great experiment anyone can try: pick a topic you know a lot about, ask an LLM to clarify something or answer a question about it, and read the answer carefully. You will notice that it sort of sounds smart and correct, but roughly 10-20% of the information will be incorrect, contradictory, or even stupid. If you are a novice, you might not catch that 10-20% right away and believe the content is good quality, but it's not. I tried it with several subjects I studied in depth and understand very well (both practical things, like crafts, and more abstract things, like social science) - the result is always the same: lots of grammatically correct text that sounds clever and just imitates knowledge.
Thank you! The only thing I would add to this is extrapolation, because we all know some people who can’t connect the dots. If an LLM can be this wildly inaccurate here, where we are technically proficient enough to spot the glaring boners, the smart scout will bring that same level of skepticism to any technical content generated the same way.
Love, love this!!!
Hahahaha now I know where certain political figures get their speeches! What a discombobulated mess!