# Hallucination isn't a bug (but it is a problem)

**Part 2 of our "Buzzword Breakdown" series**

This week, Sky News caught ChatGPT confidently delivering a transcript for a show… that it didn't have access to: https://news.sky.com/video/did-chatgpt-lie-to-sam-coates-about-transcript-for-podcast-13380234

👨 "Has this been uploaded?"
🗨️ Yes.
👨 "Is it synthetic?"
🗨️ No.
👨 "Are you sure?"
🗨️ Yes. (pause) Okay, yes. I made it up.

This wasn't lying. It's what the industry calls a hallucination.

**🤖 What is a hallucination?**

It's when an AI gives you an answer that sounds perfectly reasonable but isn't true. These models don't deliberately mislead. They've simply been trained to produce what *sounds like* the right answer, not what *is* the right answer.

**🧠 Why does this happen?**

Because AI models don't "know" things. They don't have memory or understanding the way humans do. If they don't have access to the right data, *they fill in the blanks with their best guess*.

**✅ So how do we stop it?**

You can't eliminate hallucinations completely, but you can design systems that catch and contain them. Here's how:

**💾 Give the model trusted information to work from**
Pull answers from a live database or approved source material, not just the model's training data.

**🗨️ Ask for the basis of an answer**
Prompts like "What's your source?" or "How confident are you?" encourage more cautious, transparent responses.

**👁️‍🗨️ Ask the model to double-check its own output**
With the right prompt, it can reflect on and revise its initial answer.

**⚖️ Use a second model to review and flag risky content**
Especially useful when accuracy really matters, as in legal, financial, or medical scenarios.

**🤓 Design prompts that reward accuracy, not just fluency**
Even small changes in phrasing can reduce the risk of confident nonsense.

**⚡ Use model capabilities wisely**
For high-stakes tasks, treat the AI as an assistant or idea generator, not the final authority.

At Methodix, this is the kind of design thinking we bring to every AI solution: practical, strategic, and human-aware.

Some hallucinations are obvious. Others? Convincing, but still wrong.

That's why smart AI systems include a safety net: catching the unicorns before they leave the factory. 🦄
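For the technically curious, the "double-check" and "second model" ideas above can be sketched as a small pattern. This is a minimal illustration under assumptions, not any vendor's real API: `generate` and `verify` are hypothetical stand-ins for calls to two models (or two differently prompted calls to the same model), and the prompt wording is illustrative.

```python
def answer_with_safety_net(question, context, generate, verify):
    """Draft an answer grounded only in trusted context, then have a
    reviewer model check it. Unverified answers are flagged, not returned.

    `generate` and `verify` are placeholder callables (prompt -> text);
    in a real system they would wrap calls to your chosen LLMs.
    """
    # Step 1: ground the model in approved source material.
    draft = generate(
        "Answer using ONLY the context below. If the context does not "
        f"contain the answer, say so.\nContext: {context}\nQuestion: {question}"
    )
    # Step 2: ask a second model (or a second pass) to review the draft.
    verdict = verify(
        "Does this answer follow from the context? Reply SUPPORTED or "
        f"UNSUPPORTED.\nContext: {context}\nAnswer: {draft}"
    )
    # Step 3: contain the hallucination instead of passing it on.
    if "UNSUPPORTED" in verdict.upper():
        return {"answer": None, "flagged": True, "draft": draft}
    return {"answer": draft, "flagged": False, "draft": draft}
```

The point of the pattern is the third step: a flagged draft goes to a human or a fallback path rather than straight to the user, which is exactly the "safety net" described above.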