{"id":122423,"date":"2024-11-05T18:27:41","date_gmt":"2024-11-05T11:27:41","guid":{"rendered":"https:\/\/hotvideos24.online\/?p=122423"},"modified":"2024-11-05T18:27:41","modified_gmt":"2024-11-05T11:27:41","slug":"google-claims-ai-first-after-sqlite-security-bug-discovered-the-register","status":"publish","type":"post","link":"https:\/\/hotvideos24.online\/?p=122423","title":{"rendered":"Google claims AI first after SQLite security bug discovered \u2022 The Register"},"content":{"rendered":"<div id=\"body\">\n<p>Google claims one of its AI models is the first of its kind to spot a memory safety vulnerability in the wild \u2013 specifically an exploitable stack buffer underflow in SQLite \u2013 which was then fixed before the buggy code&#8217;s official release.<\/p>\n<p>The Chocolate Factory&#8217;s LLM-based bug-hunting tool, dubbed Big Sleep, is a collaboration between Google&#8217;s Project Zero and DeepMind. The software is said to be an evolution of the earlier Project Naptime, announced in June.<\/p>\n<p>SQLite is an open source database engine, and the stack buffer underflow <a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/project-zero.issues.chromium.org\/issues\/372435124\">vulnerability<\/a> could have allowed an attacker to cause a crash or perhaps even achieve arbitrary code execution. More specifically, the crash or code execution would happen in the SQLite executable (not the library) due to a magic value of -1 accidentally being used at one point as an array index. There is an assert() in the code to catch the use of -1 as an index, but in release builds, this debug-level check would be removed.<\/p>\n<p>Thus, a miscreant could cause a crash or achieve code execution on a victim&#8217;s machine by, perhaps, triggering that bad index bug with a maliciously crafted database shared with that user or through some SQL injection. 
Even the Googlers admit the flaw is non-trivial to exploit, so be aware that the severity of the hole is not really the news here \u2013 it&#8217;s that the web giant believes its AI has scored a first.<\/p>\n<p>We&#8217;re told that fuzzing \u2013 feeding random and\/or carefully crafted data into software to uncover exploitable bugs \u2013 didn&#8217;t find the issue.<\/p>\n<p>The LLM, however, did. According to Google, this is the first time an AI agent has found a previously unknown exploitable memory-safety flaw in widely used real-world software. After Big Sleep clocked the bug in early October, having been told to go through a bunch of commits to the project&#8217;s source code, SQLite&#8217;s developers <a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/sqlite.org\/src\/info\/41d58a014ce89356\">fixed it<\/a> on the same day. Thus the flaw was removed before an official release.<\/p>\n<p>&#8220;We think that this work has tremendous defensive potential,&#8221; the Big Sleep team <a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/googleprojectzero.blogspot.com\/2024\/10\/from-naptime-to-big-sleep.html\">crowed<\/a> in a November 1 write-up. 
&#8220;Fuzzing has helped significantly, but we need an approach that can help defenders to find the bugs that are difficult (or impossible) to find by fuzzing, and we&#8217;re hopeful that AI can narrow this gap.&#8221;<\/p>\n<p>We should note that in October, Seattle-based Protect AI <a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/www.theregister.com\/2024\/10\/20\/python_zero_day_tool\/\">announced<\/a> a free, open source tool that it claimed can find zero-day vulnerabilities in Python codebases with an assist from Anthropic&#8217;s Claude AI model.<\/p>\n<p>This tool is called Vulnhuntr and, according to its developers, it has found more than a dozen zero-day bugs in large, open source Python projects.<\/p>\n<p>The two tools have different purposes, according to Google. 
&#8220;Our assertion in the blog post is that Big Sleep discovered the first unknown exploitable <em>memory-safety issue<\/em> in widely used real-world software,&#8221; a Google spokesperson told <em>The Register<\/em>, with our emphasis added. &#8220;The Python LLM finds different types of bugs that aren&#8217;t related to memory safety.&#8221;<\/p>\n<p>Big Sleep, which is still in the research stage, has thus far used small programs with known vulnerabilities to evaluate its bug-finding prowess. This was its first real-world experiment.<\/p>\n<p>For the test, the team collected several recent commits to the SQLite repository. After manually removing trivial and documentation-only changes, &#8220;we then adjusted the prompt to provide the agent with both the commit message and a diff for the change, and asked the agent to review the current repository (at <a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/sqlite.org\/src\/info\/2f7eab381e167609\">HEAD<\/a>) for related issues that might not have been fixed,&#8221; the team wrote.<\/p>\n<p>The LLM, based on Gemini 1.5 Pro, ultimately found the bug, which was loosely related to changes in the seed commit <a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/sqlite.org\/src\/info\/1976c3f7e1fe77cf\">[1976c3f7]<\/a>. &#8220;This is not uncommon in manual variant analysis, understanding one bug in a codebase often leads a researcher to other problems,&#8221; the Googlers explained.<\/p>\n<p>In the write-up, the Big Sleep team also detailed the &#8220;highlights&#8221; of the steps that the agent took to evaluate the code, find the vulnerability, crash the system, and then produce a root-cause analysis.<\/p>\n<p>&#8220;However, we want to reiterate that these are highly experimental results,&#8221; they wrote. 
&#8220;The position of the Big Sleep team is that at present, it&#8217;s likely that a target-specific fuzzer would be at least as effective (at finding vulnerabilities).&#8221; \u00ae<\/p>\n<\/div>\n<p><a href=\"https:\/\/www.theregister.com\/2024\/11\/05\/google_ai_vulnerability_hunting\/\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google claims one of its AI models is the first of its kind to spot a memory safety vulnerability in the wild \u2013 specifically an exploitable stack buffer underflow in &hellip; <a href=\"https:\/\/hotvideos24.online\/?p=122423\" class=\"more-link\">Read 
More<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[8630],"tags":[],"class_list":["post-122423","post","type-post","status-publish","format-standard","hentry","category-technology","entry"],"_links":{"self":[{"href":"https:\/\/hotvideos24.online\/index.php?rest_route=\/wp\/v2\/posts\/122423","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/hotvideos24.online\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hotvideos24.online\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hotvideos24.online\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/hotvideos24.online\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=122423"}],"version-history":[{"count":0,"href":"https:\/\/hotvideos24.online\/index.php?rest_route=\/wp\/v2\/posts\/122423\/revisions"}],"wp:attachment":[{"href":"https:\/\/hotvideos24.online\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=122423"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hotvideos24.online\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=122423"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hotvideos24.online\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=122423"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}