{"id":6812,"date":"2026-05-14T10:56:04","date_gmt":"2026-05-14T14:56:04","guid":{"rendered":"https:\/\/ionic.io\/blog\/?p=6812"},"modified":"2026-05-14T11:29:34","modified_gmt":"2026-05-14T15:29:34","slug":"capacitor-showcase-localllm","status":"publish","type":"post","link":"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm","title":{"rendered":"Capacitor Showcase &#8211; LocalLLM"},"content":{"rendered":"\n<p>With Capacitor, anything you can build for the web, you can build for mobile. Beyond brochure and CRUD apps, it&#8217;s possible to build advanced applications that rely heavily on native hardware functionality, rich media, and local-first data. Starting this year, the Capacitor Team is embarking on a process to explore the real-world ergonomics of developing Capacitor apps spanning some of the more difficult and technically challenging app genres, such as rich media, background processing, and AI. We will identify the common pain points found during that process, come up with solutions for those issues (which may be bug fixes, Capacitor features, or even new plugins), and document the process here with the community.<\/p>\n\n\n\n<!--more-->\n\n\n\n<p id=\"Capacitor-Showcase:-Oakline-Bank\">In our first installment of Capacitor Showcase, we are introducing <strong>Oakline Bank<\/strong>.<\/p>\n\n\n\n<p>Oakline Bank is a fictional digital-first regional bank built as a Capacitor demo application, showcasing how modern mobile banking experiences can be crafted with a single cross-platform codebase targeting both iOS and Android. 
The app demonstrates real-world patterns you&#8217;d find in production fintech apps: account dashboards, transaction histories, and, fittingly for the current AI craze, an AI-powered in-app assistant called OakBot.<\/p>\n\n\n\n<p>The (fictional) team realized that when a user asks OakBot &#8220;What&#8217;s my account balance?&#8221; or &#8220;Summarize my spending this month,&#8221; answering that question requires sending real financial data to an AI model. With a cloud-based model (OpenAI, Anthropic, Google, etc.), that data leaves the device and travels to a third-party server, which would be a significant concern for any financial institution.<\/p>\n\n\n\n<p>In 2026, there is a way to solve this problem on modern mobile devices &#8211; on-device AI.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"Introducing-Capacitor-LocalLLM\">Introducing Capacitor LocalLLM<\/h3>\n\n\n\n<p><a href=\"https:\/\/github.com\/ionic-team\/capacitor-local-llm\">Capacitor LocalLLM<\/a> is a native Capacitor plugin that brings the power of on-device AI directly to your iOS and Android apps. By using both Apple Intelligence and Android&#8217;s on-device AI frameworks, it gives developers a simple, unified TypeScript API to send prompts with conversation session support, all while respecting user privacy and working completely offline. 
Whether you&#8217;re building a smart assistant, a creative tool, or an offline AI-powered bank chatbot, LocalLLM makes it as straightforward as:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>const response = await LocalLLM.prompt({\n  sessionId: chatSessionId,\n  instructions: instructions,\n  prompt: userMessageText,\n  options: {\n    temperature: 0.7,\n    maximumOutputTokens: 256,\n  },\n});<\/code><\/pre>\n\n\n\n<p>With that, you get a cross-platform prompt interface without needing to worry about the intricacies of local AI models on iOS and Android, or having to research and provide your own custom models.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"Grounding-a-Local-LLM-with-Private-Data:-How-OakBot-Works\">Grounding a Local LLM with Private Data: How OakBot Works<\/h3>\n\n\n\n<p>On-device language models are trained on vast general knowledge, but they have no awareness of anything specific to the user running them. To be useful as a banking assistant, OakBot needs to answer questions like:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><em>&#8220;How much did I spend on dining this month?&#8221;<\/em><\/li>\n\n\n\n<li><em>&#8220;What&#8217;s my current balance?&#8221;<\/em><\/li>\n\n\n\n<li><em>&#8220;Have I paid my Apple Card bill recently?&#8221;<\/em><\/li>\n<\/ul>\n\n\n\n<p>None of that exists inside the model&#8217;s weights. The only way to make it available is to inject it into the context window at inference time: essentially telling the model what it needs to know before it answers. 
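In practice, that injection is just string-building over local state plus one plugin call; here is a rough TypeScript sketch (the Transaction shape and the stand-in LocalLLM object are illustrative assumptions, not the app&#8217;s actual code):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>\/\/ Stand-in for the plugin API shown in this post; a real app would import\n\/\/ LocalLLM from the Capacitor LocalLLM plugin instead.\nconst LocalLLM = {\n  async warmup(opts: { sessionId: string; promptPrefix: string }) {\n    void opts; \/\/ native call in the real plugin\n  },\n};\n\n\/\/ Illustrative shape; the demo app's actual Transaction type may differ.\ninterface Transaction {\n  date: string;\n  merchant: string;\n  category: string;\n  amount: number;\n}\n\n\/\/ Serialize structured records into prose the model can reason over.\nfunction formatTransactionHistory(txns: Transaction[]): string {\n  const lines = txns.map((t) =&gt; {\n    const amount = t.amount &gt;= 0\n      ? `$${t.amount.toFixed(2)}`\n      : `-$${Math.abs(t.amount).toFixed(2)}`;\n    return `  - ${t.date}: ${t.merchant} (${t.category}) - ${amount}`;\n  });\n  return `Recent Transactions:\\n${lines.join('\\n')}`;\n}\n\n\/\/ Inject the serialized context into the session before the chat begins.\nasync function groundSession(sessionId: string, txns: Transaction[]) {\n  await LocalLLM.warmup({\n    sessionId,\n    promptPrefix: `You are OakBot, a helpful banking assistant.\\n\\n${formatTransactionHistory(txns)}`,\n  });\n}<\/code><\/pre>\n\n\n\n<p>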
Here is how:<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li>Fetch the relevant data from the local source (in this case, the app&#8217;s in-memory transaction state loaded from the fake API)<\/li>\n\n\n\n<li>Serialize it into natural language that the model can reason over<\/li>\n\n\n\n<li>Inject it into the session context before the user&#8217;s first message<\/li>\n<\/ol>\n\n\n\n<p>In OakBot specifically, this happens in <code>formatTransactionHistory()<\/code>, which transforms structured Transaction objects into a plain-text block:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>Spending by Category (all time):\n  - Housing: $7,400.00\n  - Groceries: $1,823.45\nRecent Transactions (last 60):\n  - ...\n  - 2026-03-10: Whole Foods Market (Groceries) - -$87.43\n  - ...\nTotal transactions on record: 160\nCurrent balance: $3,241.18<\/code><\/pre>\n\n\n\n<p>That block is then passed to <code>LocalLLM.warmup()<\/code> as the <code>promptPrefix<\/code> &#8211; a special parameter that pre-loads the context into the session before any conversation begins. This is the critical handshake between the app&#8217;s local data and the inference engine, and because it goes through the plugin&#8217;s abstraction layer, it works the same way in code regardless of whether the user is on iOS or Android.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>await LocalLLM.warmup({\n  sessionId: chatSessionId,\n  promptPrefix: systemPromptWithTransactionContext,\n});<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"Why-not-JSON-over-Plain-Text?\">Why not JSON over Plain Text?<\/h3>\n\n\n\n<p>Token efficiency actually favors natural language here. Each JSON object repeats field names for every entry. 
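A quick character count, as a rough proxy for token count, makes the gap concrete (this snippet is just a sanity check, not app code):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>\/\/ The same transaction serialized two ways, as in the comparison below.\nconst asJson = JSON.stringify({\n  date: &quot;2026-03-10&quot;,\n  merchant: &quot;Whole Foods Market&quot;,\n  category: &quot;Groceries&quot;,\n  amount: -87.43,\n});\nconst asText = &quot;- 2026-03-10: Whole Foods Market (Groceries) - -$87.43&quot;;\n\nconsole.log(asJson.length, asText.length); \/\/ 92 vs 54 characters<\/code><\/pre>\n\n\n\n<p>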
Compare:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>{&quot;date&quot;:&quot;2026-03-10&quot;,&quot;merchant&quot;:&quot;Whole Foods Market&quot;,&quot;category&quot;:&quot;Groceries&quot;,&quot;amount&quot;:-87.43}<\/code><\/pre>\n\n\n\n<p>vs:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>- 2026-03-10: Whole Foods Market (Groceries) - -$87.43<\/code><\/pre>\n\n\n\n<p>The natural language line is noticeably more compact. Multiply that across 60 transactions and the difference is meaningful on a tight context budget.<\/p>\n\n\n\n<p>Small on-device models also reason better over natural language. The models running via LocalLLM (Foundation Models, Gemini Nano) are far smaller than cloud models. Larger models like GPT-4 handle JSON structure reliably; smaller on-device models can stumble on it. Natural prose plays to their strengths.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>To experiment with the Oakline Bank demo app yourself, <a href=\"https:\/\/github.com\/ionic-team\/oakline-bank-showcase\">check out the app source here<\/a> and run it on a modern device that supports on-device AI models.<\/p>\n\n\n\n<p>LocalLLM will initially launch as a Capacitor Labs plugin, since the first-party on-device AI story is still in flux, especially on Android. In the meantime, feel free to submit feedback, bug reports, and suggestions for future directions we can take the plugin.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"Community-Mentions\">Community Mentions<\/h2>\n\n\n\n<p>Capacitor LocalLLM isn&#8217;t the only solution in the Capacitor ecosystem for local AI. 
Check out some related efforts from the Capacitor community below:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/github.com\/Cap-go\/capacitor-llm\">GitHub &#8211; Cap-go\/capacitor-llm: Capacitor plugin to run LLM models locally in IOS and Android, support AppleInteligence<\/a><\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"More-Coming-Soon\">More Coming Soon<\/h2>\n\n\n\n<p>Stay tuned for more installments of the Capacitor Showcase series, as we continue to explore what can be built with Capacitor.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>With Capacitor, anything you can build for the web, you can build for mobile. Beyond brochure and CRUD apps, it&#8217;s possible to build advanced applications that rely heavily on native hardware functionality, rich media, and local-first data. Starting this year, the Capacitor Team is embarking on a process to explore the real-world ergonomics [&hellip;]<\/p>\n","protected":false},"author":97,"featured_media":6816,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"publish_to_discourse":"","publish_post_category":"","wpdc_auto_publish_overridden":"","wpdc_topic_tags":"","wpdc_pin_topic":"","wpdc_pin_until":"","discourse_post_id":"","discourse_permalink":"","wpdc_publishing_response":"","wpdc_publishing_error":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1,121],"tags":[274,302,151,301,300,299],"class_list":["post-6812","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-all","category-engineering","tag-ai","tag-apple-intelligence","tag-capacitor","tag-gemini-nano","tag-llm","tag-showcase"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v23.0 (Yoast SEO v23.0) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Capacitor Showcase - LocalLLM - Ionic Blog<\/title>\n<meta name=\"description\" 
content=\"Explore how Capacitor LocalLLM enables on-device AI for iOS and Android apps with a unified TypeScript API. Learn how to build privacy-first mobile experiences using local LLMs, Apple Intelligence, and Gemini Nano\u2014no cloud required.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Capacitor Showcase - LocalLLM\" \/>\n<meta property=\"og:description\" content=\"Explore how Capacitor LocalLLM enables on-device AI for iOS and Android apps with a unified TypeScript API. Learn how to build privacy-first mobile experiences using local LLMs, Apple Intelligence, and Gemini Nano\u2014no cloud required.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\" \/>\n<meta property=\"og:site_name\" content=\"Ionic Blog\" \/>\n<meta property=\"article:published_time\" content=\"2026-05-14T14:56:04+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-05-14T15:29:34+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2026\/05\/cc80a21a-ae0e-4db1-ab66-09853be46f1f.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1774\" \/>\n\t<meta property=\"og:image:height\" content=\"887\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Joseph Pender\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@ionicframework\" \/>\n<meta name=\"twitter:site\" content=\"@ionicframework\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Joseph Pender\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#article\",\"isPartOf\":{\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\"},\"author\":{\"name\":\"Joseph Pender\",\"@id\":\"https:\/\/ionic.io\/blog\/#\/schema\/person\/c5a0825125cbd820e75fa559ab6fb482\"},\"headline\":\"Capacitor Showcase &#8211; LocalLLM\",\"datePublished\":\"2026-05-14T14:56:04+00:00\",\"dateModified\":\"2026-05-14T15:29:34+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\"},\"wordCount\":887,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/ionic.io\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#primaryimage\"},\"thumbnailUrl\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2026\/05\/cc80a21a-ae0e-4db1-ab66-09853be46f1f.png\",\"keywords\":[\"AI\",\"Apple Intelligence\",\"Capacitor\",\"Gemini Nano\",\"LLM\",\"Showcase\"],\"articleSection\":[\"All\",\"Engineering\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\",\"url\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\",\"name\":\"Capacitor Showcase - LocalLLM - Ionic 
Blog\",\"isPartOf\":{\"@id\":\"https:\/\/ionic.io\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#primaryimage\"},\"image\":{\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#primaryimage\"},\"thumbnailUrl\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2026\/05\/cc80a21a-ae0e-4db1-ab66-09853be46f1f.png\",\"datePublished\":\"2026-05-14T14:56:04+00:00\",\"dateModified\":\"2026-05-14T15:29:34+00:00\",\"description\":\"Explore how Capacitor LocalLLM enables on-device AI for iOS and Android apps with a unified TypeScript API. Learn how to build privacy-first mobile experiences using local LLMs, Apple Intelligence, and Gemini Nano\u2014no cloud required.\",\"breadcrumb\":{\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#primaryimage\",\"url\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2026\/05\/cc80a21a-ae0e-4db1-ab66-09853be46f1f.png\",\"contentUrl\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2026\/05\/cc80a21a-ae0e-4db1-ab66-09853be46f1f.png\",\"width\":1774,\"height\":887},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/ionic.io\/blog\/capacitor-showcase-localllm#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/ionic.io\/blog\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Capacitor Showcase &#8211; LocalLLM\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/ionic.io\/blog\/#website\",\"url\":\"https:\/\/ionic.io\/blog\/\",\"name\":\"ionic.io\/blog\",\"description\":\"Build amazing native and progressive web apps with the 
web\",\"publisher\":{\"@id\":\"https:\/\/ionic.io\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/ionic.io\/blog\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/ionic.io\/blog\/#organization\",\"name\":\"Ionic\",\"url\":\"https:\/\/ionic.io\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/ionic.io\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2020\/10\/white-on-color.png\",\"contentUrl\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2020\/10\/white-on-color.png\",\"width\":1920,\"height\":854,\"caption\":\"Ionic\"},\"image\":{\"@id\":\"https:\/\/ionic.io\/blog\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/ionicframework\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/ionic.io\/blog\/#\/schema\/person\/c5a0825125cbd820e75fa559ab6fb482\",\"name\":\"Joseph Pender\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/ionic.io\/blog\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2022\/05\/joey_pender-150x150.jpeg\",\"contentUrl\":\"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2022\/05\/joey_pender-150x150.jpeg\",\"caption\":\"Joseph Pender\"},\"url\":\"https:\/\/ionic.io\/blog\/author\/joey\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","jetpack_sharing_enabled":true,"jetpack_featured_media_url":"https:\/\/ionic.io\/blog\/wp-content\/uploads\/2026\/05\/cc80a21a-ae0e-4db1-ab66-09853be46f1f.png","_links":{"self":[{"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/posts\/6812","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/users\/97"}],"replies":[{"embeddable":true,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/comments?post=6812"}],"version-history":[{"count":6,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/posts\/6812\/revisions"}],"predecessor-version":[{"id":6820,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/posts\/6812\/revisions\/6820"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/media\/6816"}],"wp:attachment":[{"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/media?parent=6812"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/categories?post=6812"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ionic.io\/blog\/wp-json\/wp\/v2\/tags?post=6812"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}