One thing I struggle with in AI is that the answers it gives always seem plausible, but any time I quiz it on things I understand well, it constantly gets things slightly wrong. Which tells me it is getting everything slightly wrong; I just don’t know enough to notice.
I see the same issue with TV. Anyone who works in a complicated field has felt the sting of watching a TV show fail to accurately represent it, while most people watching just assume that’s how your job works.
This is what I call “confidently wrong”. If you ask it about things you have no clue about, it seems incredibly well-informed and insightful. Ask it something you know deeply, and you’ll easily see it’s just babbling and spouting nonsense — which sure makes you wonder about those earlier statements, doesn’t it?