<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[InContext by Stijn Bakker: Quick notes & postits]]></title><description><![CDATA[Quick (and often incoherent) observations and ideas on all sorts of topics. Unstructured notes and observations for my digital garden to develop and flourish over time. On whatever catches my attention.]]></description><link>https://incontext.digital/s/quick-notes-and-postits</link><image><url>https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png</url><title>InContext by Stijn Bakker: Quick notes &amp; postits</title><link>https://incontext.digital/s/quick-notes-and-postits</link></image><generator>Substack</generator><lastBuildDate>Wed, 22 Apr 2026 00:54:20 GMT</lastBuildDate><atom:link href="https://incontext.digital/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Stijn Bakker]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[stijn@incontext.digital]]></webMaster><itunes:owner><itunes:email><![CDATA[stijn@incontext.digital]]></itunes:email><itunes:name><![CDATA[Stijn Bakker]]></itunes:name></itunes:owner><itunes:author><![CDATA[Stijn Bakker]]></itunes:author><googleplay:owner><![CDATA[stijn@incontext.digital]]></googleplay:owner><googleplay:email><![CDATA[stijn@incontext.digital]]></googleplay:email><googleplay:author><![CDATA[Stijn Bakker]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The ladder of engineering craftsmanship]]></title><description><![CDATA[Engineering seniority isn't about writing better 
code. It's about zooming out &#8212; from syntax to systems to strategy. Each rung demands a harder context switch.]]></description><link>https://incontext.digital/p/the-ladder-of-engineering-craftsmanship</link><guid isPermaLink="false">https://incontext.digital/p/the-ladder-of-engineering-craftsmanship</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Thu, 16 Apr 2026 18:01:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!9JuI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>There&#8217;s a ladder to engineering craftsmanship that nobody draws explicitly. But everyone climbs it.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9JuI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9JuI!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 424w, https://substackcdn.com/image/fetch/$s_!9JuI!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 848w, https://substackcdn.com/image/fetch/$s_!9JuI!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 1272w, 
https://substackcdn.com/image/fetch/$s_!9JuI!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9JuI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png" width="826" height="705" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:705,&quot;width&quot;:826,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:49366,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://incontext.digital/i/194100320?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9JuI!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 424w, https://substackcdn.com/image/fetch/$s_!9JuI!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 848w, https://substackcdn.com/image/fetch/$s_!9JuI!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 1272w, 
https://substackcdn.com/image/fetch/$s_!9JuI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fddb6f2db-eff3-4c1a-b1ed-f754a0982810_826x705.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>At the bottom, you&#8217;re in the syntax. The commas, the brackets, the precise implementation of a function. It&#8217;s the craft of making code work.</p><p>One rung up, you&#8217;re designing functions and classes within a domain. 
You understand patterns, you write clean code, you make good local decisions.</p><p>Higher still, you start seeing dependencies. How different modules interact. How changes in one part ripple through another. You stop thinking in files and start thinking in systems.</p><p>Then comes full mastery of an application. Everything in it, every quirk, every critical dependency it relies on. The people at this level are the ones who can debug anything in their domain because they hold the whole picture in their head.</p><p>Keep climbing and the view expands beyond one app. You understand the landscape; how multiple systems depend on each other, how they serve business logic, where the data flows. These are the rare engineers who can parachute into any fire and instantly pinpoint what went wrong.</p><p>Near the top, it turns into strategy. You&#8217;re mapping not just technology but organizational dynamics. Incentives, accountabilities, competing priorities between departments. And you&#8217;re thinking in time; how decisions made today create lock-in, technical debt, or opportunity years from now.</p><p>At the very top, you&#8217;re thinking in markets. Competitors, business models, how the entire enterprise stack serves or undermines strategic positioning.</p><p>Here&#8217;s what makes this ladder exhausting: you never stop needing the lower rungs. A CTO still needs to drop into syntax sometimes. But the context switch between strategy and semicolons is brutal. 
The wider the gap you have to jump, the more it drains you.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tbBM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tbBM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 424w, https://substackcdn.com/image/fetch/$s_!tbBM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 848w, https://substackcdn.com/image/fetch/$s_!tbBM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 1272w, https://substackcdn.com/image/fetch/$s_!tbBM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!tbBM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png" width="779" height="281" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:281,&quot;width&quot;:779,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:68839,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://incontext.digital/i/194100320?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!tbBM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 424w, https://substackcdn.com/image/fetch/$s_!tbBM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 848w, https://substackcdn.com/image/fetch/$s_!tbBM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 1272w, https://substackcdn.com/image/fetch/$s_!tbBM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff40e4b03-05f2-46b4-a3f5-d58aedbc1ed1_779x281.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 
0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The engineers who thrive won&#8217;t be the ones who write the best code. They&#8217;ll be the ones who can hold the most rungs in their head at once.</p><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[The multi-modal interface]]></title><description><![CDATA[Software used to have one interface. Now the best apps support clicking, typing, touching, and talking to AI; all at once. 
The winners will be the ones that let you interact however you want.]]></description><link>https://incontext.digital/p/the-multi-modal-interface</link><guid isPermaLink="false">https://incontext.digital/p/the-multi-modal-interface</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Tue, 14 Apr 2026 18:01:28 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Every piece of software has an interface. That&#8217;s obvious. What&#8217;s changing is how many interfaces it needs.</p><p>We used to think in a single mode. Desktop apps were designed for a mouse. Click buttons, drag sliders, navigate menus. Photoshop is the extreme version of this; hundreds of tiny buttons, panels everywhere, built for precision pointing. Then over the last few years we&#8217;ve seen a de-cluttering of those buttons. The command bar popped up, reachable via <code>cmd+K</code> in apps like Notion, Linear, Raycast. Allowing you to type in the tool you&#8217;d need.</p><p>Then we have mobile. Entirely different in terms of interaction. A touch is different from a click. Buttons got bigger. Interfaces got simpler. Gestures like swiping, pinching, pulling became intuitive and expected. Though maybe tablets made it a bit weird. An iPad with a keyboard and trackpad behaves like a laptop. Flip the keyboard off and it&#8217;s a giant phone. The same device, two completely different interaction models.</p><p>But here&#8217;s where it gets interesting. The oldest interface is making a comeback.</p><p>The command line interface. The original computer interface, before everything went graphical, is growing again. AI coding assistants live in the terminal. 
Apps like Obsidian and Google Workspace have announced first-party CLI support, in order to be operable by AI agents. This developer-first workflow is increasingly text-command-first. And it&#8217;s not just for developers anymore. Apps are exposing programmatic interfaces to a much wider audience.</p><p>APIs have been the backbone of B2B software for years. CommerceTools, the commerce engine, is API-first by design. It doesn&#8217;t even pretend to be a complete application, it&#8217;s a platform developers build on top of. Notion opened up full read-write access through its API. Now MCP servers (Model Context Protocol) add another layer. They&#8217;re essentially abstraction layers on top of REST or GraphQL APIs, designed specifically for AI agents to interact with applications. Your app doesn&#8217;t just serve humans anymore. It serves other software.</p><p>The apps that are going to win are the ones that support all of these modes simultaneously.</p><p>Linear gets this right. It&#8217;s keyboard-first for power users. It has a beautiful click-based GUI for everyone else. It works on mobile with touch. It has a CLI. It has an API. It has MCP support. Every type of user, every type of device, every type of interaction covered.</p><p>That&#8217;s the new bar. Not just a great UI. A great interface. Plural.</p>]]></content:encoded></item><item><title><![CDATA[The impossible business model of LLMs]]></title><description><![CDATA[SaaS companies spend on development, then serve users cheaply. LLMs spend on development AND on every single user interaction. 
That cost structure breaks everything we know about software pricing.]]></description><link>https://incontext.digital/p/the-impossible-business-model-of</link><guid isPermaLink="false">https://incontext.digital/p/the-impossible-business-model-of</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Sun, 12 Apr 2026 18:01:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The economics of LLMs are fundamentally broken. At least by the standards we&#8217;re used to.</p><p>Traditional SaaS has a beautiful cost structure. You invest heavily in development; build the product, ship it, maintain it. But serving users is almost free. A few servers, some bandwidth, done. Whether you have a thousand users or a million, the marginal cost per user is tiny. That&#8217;s why SaaS companies can charge $10 a month and make enormous margins at scale.</p><p>LLMs flip this on its head.</p><p>Development costs are massive, not just building the model, but training it, which requires obscene amounts of compute. Then, unlike SaaS, every single user interaction costs money. Every query, every token, every response. The operational cost doesn&#8217;t flatten at scale. It grows with usage.</p><p>And here&#8217;s the cultural problem: we&#8217;ve been trained to expect fixed pricing for online software. A flat monthly fee, use it as much as you want. Netflix, Spotify, Notion, unlimited usage for a predictable price.</p><p>LLMs can&#8217;t deliver that without losing money on heavy users. The math doesn&#8217;t work. 
The distribution of usage is wildly uneven: a small percentage of users consumes a disproportionate share of tokens, racking up costs that the subscription fee doesn&#8217;t cover.</p><p>OpenAI is trapped in this. They need platform pricing expectations (flat, predictable) but have infrastructure cost structures (variable, per-use). And this isn&#8217;t just an OpenAI problem. It&#8217;s a structural challenge for the entire industry. Until someone figures out how to make per-token costs negligible, or finds a pricing model users will accept, the LLM business model remains a very expensive bet.</p>]]></content:encoded></item><item><title><![CDATA[The feature parity problem]]></title><description><![CDATA[When Sonos or any app ships a v2, users revolt. Features vanish, interfaces change, and suddenly the thing you relied on daily feels stolen. You never really own your software.]]></description><link>https://incontext.digital/p/the-feature-parity-problem</link><guid isPermaLink="false">https://incontext.digital/p/the-feature-parity-problem</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Fri, 10 Apr 2026 18:01:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>Originally published June 2024 on stijnbakker.com</strong></em></p><p>Apps are becoming public property. At least, that&#8217;s how it feels to users.</p><p>When Sonos launched their redesigned app, people hated it. Features disappeared. Familiar UI patterns changed. Something they used every day was suddenly alien. The Dutch public broadcaster did the same, moving from a decent app to a horrible one. Both companies said the same thing: &#8220;It took courage. 
We needed a stable foundation for the future.&#8221;</p><p>Users didn&#8217;t care about foundations. They cared that their thing was broken.</p><p>This raises two problems nobody talks about honestly.</p><p><strong>The first is technical.</strong> We like to say software can be molded over time. Iterate, improve, ship. But sometimes a codebase is so far gone that a complete rebuild is the only option. And rebuilding means starting from scratch. Which means losing features. Achieving feature parity with the old version is brutally hard and expensive, especially when the old version accumulated years of small additions that nobody documented properly.</p><p><strong>The second is cultural.</strong> Designers get bored. Engineers want to work on new things. A product that&#8217;s been stable for years feels stale <em>to the people building it</em>, even if users love it. I suspect the Sonos redesign wasn&#8217;t driven purely by technical necessity. Someone wanted to make something new. The problem is that &#8220;new&#8221; for the builder means &#8220;broken&#8221; for the user.</p><p>And underneath both problems sits a deeper truth: <em>as a user you never really <strong>own</strong> your software.</em></p><p>Every app you depend on can change overnight. An update you didn&#8217;t ask for can remove the feature you relied on most. You have no say, no vote, no recourse. You&#8217;re at the whims of whoever controls the update cycle.</p><p>That&#8217;s the feature parity problem. Not just the technical challenge of rebuilding without losing functionality. But the uncomfortable reality that the software you shape your life around doesn&#8217;t actually belong to you.</p>]]></content:encoded></item><item><title><![CDATA[Opportunity: the AI devops engineer]]></title><description><![CDATA[When everyone can vibe-code their own tools, the bottleneck shifts from features to stability. 
The opportunity is an AI engineer that monitors your platform 24/7 &#8212; and knows when to wake you up.]]></description><link>https://incontext.digital/p/opportunity-the-ai-devops-engineer</link><guid isPermaLink="false">https://incontext.digital/p/opportunity-the-ai-devops-engineer</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Wed, 08 Apr 2026 18:01:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In a world where every business can vibe-code its own software, features stop being the bottleneck. Stability does.</p><p>Building an internal tool is getting trivially easy. Keeping it running reliably is not. Uptime, monitoring, bug triage, incident response, that&#8217;s where the real pain lives. And most small businesses can&#8217;t justify a full-time devops hire for their handful of custom tools.</p><p>Here&#8217;s what I think should exist: an AI devops engineer monitoring your platform 24/7.</p><p>Not a dashboard. Not an alerting system. An actual reasoning agent that catches bugs, analyzes root causes, and creates pull requests to fix them. One that has very clear, very strict permissions, so you always know your data won&#8217;t be touched. One that escalates intelligently, only waking you up at 3 AM when it genuinely matters.</p><p>An AI engineer you can yell at the next morning when it turns out the midnight alert was a stupid typo. One that apologizes, learns, and adjusts its threshold.</p><p>But it goes deeper than firefighting. The real value is in the long view. An AI devops engineer that tracks bugs over time, spots structural patterns, and brainstorms with you: why wasn&#8217;t this caught in testing? Why does this endpoint keep failing? 
What architectural weakness keeps producing the same category of incident?</p><p>Self-healing systems are the dream. Bugs automatically caught, analyzed, fixed. But even short of that dream, there&#8217;s enormous value in a tireless, always-on engineer that handles the grunt work of keeping software alive.</p><p>That&#8217;s a startup waiting to be built.</p>]]></content:encoded></item><item><title><![CDATA[The MVP is dead]]></title><description><![CDATA[Users are spoiled. Feature expectations are sky-high. The minimum viable product isn&#8217;t viable anymore when everyone compares your v1 to someone else&#8217;s v10.]]></description><link>https://incontext.digital/p/the-mvp-is-dead</link><guid isPermaLink="false">https://incontext.digital/p/the-mvp-is-dead</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Mon, 06 Apr 2026 18:01:37 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The minimum viable product is dead.</p><p>Not the concept, the bar. The bar has moved so far that what used to be &#8220;viable&#8221; now feels embarrassing.</p><p>Users are spoiled. And they should be. They use beautifully designed apps every day. Smooth animations. Instant responsiveness. Features that just work. Their reference point isn&#8217;t your competitor&#8217;s MVP, it&#8217;s the best app on their phone.</p><p>Launch something half-baked and they&#8217;ll push right through it. One clunky interaction, one missing feature, one loading spinner too many, gone. They&#8217;re not coming back to check your next release.</p><p>We still have frameworks that make building fast. But meeting modern expectations is harder than ever. Users expect polish from day one. 
They expect the feature set of a mature product from your beta. They expect speed, reliability, and design quality that used to take years to achieve.</p><p>The irony: building is cheaper and faster than ever, but the bar for &#8220;good enough&#8221; has never been higher.</p><p>The MVP was designed for an era where shipping something imperfect was acceptable because users had patience and alternatives were scarce. Neither is true anymore.</p>]]></content:encoded></item><item><title><![CDATA[The underrated art of simplification]]></title><description><![CDATA[We laughed at Trump for needing things dumbed down. But the ability to simplify without losing the essence is one of the most valuable, and rarest, skills there is.]]></description><link>https://incontext.digital/p/the-underrated-art-of-simplification</link><guid isPermaLink="false">https://incontext.digital/p/the-underrated-art-of-simplification</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Sat, 04 Apr 2026 18:01:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>Originally published April 2023 on stijnbakker.com</strong></em></p><p>Trump needed things dumbed down, and everyone laughed.</p><p>But here&#8217;s the thing nobody admits: simplification is an incredibly valuable skill. And almost nobody can do it well.</p><p>The trick isn&#8217;t making things short. It&#8217;s making things simple without losing the essence. Crafting a story that can be followed instantly. Where every part is understood on its own. Where the pieces build toward a conclusion that stands on its own weight.</p><p>That requires two things most people don&#8217;t have simultaneously.</p><p>First, you need to deeply understand your material. 
You can&#8217;t simplify what you don&#8217;t fully grasp. The physicist who explains quantum mechanics in plain language understands it better than the one who hides behind jargon. Simplification isn&#8217;t dumbing down, it&#8217;s distilling.</p><p>Second, you need to understand your audience. What they already know. What they care about. What metaphors will land. What level of abstraction they can comfortably hold in their head.</p><p>Most experts fail at this. They know their domain inside out but can&#8217;t explain it to anyone outside their bubble. They mistake complexity for rigor. They confuse &#8220;thorough&#8221; with &#8220;clear.&#8221;</p><p>The people who can take something genuinely complex and make it genuinely simple, without losing the truth of it, are rare. And valuable. In any field.</p>]]></content:encoded></item><item><title><![CDATA[Three types of AI engineering]]></title><description><![CDATA[&#8220;AI engineering&#8221; means three completely different things depending on who says it.]]></description><link>https://incontext.digital/p/three-types-of-ai-engineering</link><guid isPermaLink="false">https://incontext.digital/p/three-types-of-ai-engineering</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Thu, 02 Apr 2026 18:01:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When someone says &#8220;AI engineering,&#8221; they could mean three completely different things. Let&#8217;s untangle that a bit:</p><p><strong>Type one: building the models.</strong> This is the original &#8216;AI engineering&#8217;. The field formerly known as machine learning, data science, sometimes operations research. Deep research, deep statistics, data pipelines. 
These engineers build the foundational algorithms; a recommendation engine trained on purchasing patterns, a large language model consumed via API. It&#8217;s science-heavy, math-heavy, and requires understanding things most software engineers never touch.</p><p><strong>Type two: engineering with AI as a tool.</strong> This is traditional software engineering, but with AI supercharging the craft. The systems are still deterministic; same input, same output. The engineer uses Claude Code, GitHub Copilot, or ChatGPT the way a previous generation used Stack Overflow. The skill here is judgment. Knowing when the AI&#8217;s suggestion is good and when it&#8217;s garbage. Managing a team of agents. Keeping code quality high when generating code is cheap.</p><p><strong>Type three: building products with AI inside them.</strong> This is the new discipline. Prompt engineering, token optimization, context management, cost control. The art of embedding a <a href="https://incontext.digital/p/stochastic-systems">stochastic system</a>, one that gives different outputs for the same input, inside a deterministic application. You&#8217;re introducing unpredictability into a system designed for predictability. That&#8217;s a fundamentally different engineering challenge.</p>]]></content:encoded></item><item><title><![CDATA[Stochastic systems]]></title><description><![CDATA[Traditional software is deterministic; same input, same output. LLMs are stochastic. We spent decades mastering one paradigm. 
Now we&#8217;re building with its opposite.]]></description><link>https://incontext.digital/p/stochastic-systems</link><guid isPermaLink="false">https://incontext.digital/p/stochastic-systems</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Tue, 31 Mar 2026 18:01:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Traditional software systems are deterministic. Fixed input gives fixed output. That&#8217;s one of the beautiful things about computers. Outcomes are predictable. Bugs can be traced, reproduced, understood with logic, and debugged. The entire system is a stack of logically connected parts.</p><p>We got good at this. Really good.</p><p>We learned where the rubber hits the road, where deterministic systems meet the chaotic real world. Where humans click buttons they shouldn&#8217;t, enter data incorrectly, or unintentionally bypass the system in ways nobody anticipated.</p><p>So we built fault-tolerant systems. Systems designed to handle the unexpected gracefully. Input validation. Error boundaries. Fallback states. An entire engineering discipline built around the principle: the system is predictable, the world isn&#8217;t, and we need to bridge that gap.</p><p>And where users and the physical world (a hard drive failing, for example) used to be the chaotic actors in those systems, we are now adding a third category: Large Language Models.</p><p>LLMs are fundamentally different.</p><p>They are stochastic systems. Systems where the same input can produce different outputs. Not because something went wrong, but by design. 
The randomness is the feature.</p><p>&#8220;Stochastic&#8221; comes from the Greek word &#8220;stokhastikos&#8221;, meaning &#8220;skilled in aiming&#8221;.</p><p>This breaks everything we know about testing, debugging, and reliability. You can&#8217;t reproduce a bug in a stochastic system the way you can in a deterministic one. You can&#8217;t write a test that guarantees a specific output. The system doesn&#8217;t have &#8220;correct&#8221; behavior in the traditional sense; it has a probability distribution of behaviors.</p><p>We spent decades mastering deterministic systems. Now we&#8217;re building products that embed their opposite. Stochastic components inside deterministic shells. Unpredictability wrapped in predictability.</p><p>That&#8217;s not just an engineering challenge. It&#8217;s a philosophical shift in how we think about software.</p>]]></content:encoded></item><item><title><![CDATA[2026 as the year of AI equilibrium]]></title><description><![CDATA[For most of 2025 I took a sabbatical.]]></description><link>https://incontext.digital/p/2026-as-the-year-of-ai-equilibrium</link><guid isPermaLink="false">https://incontext.digital/p/2026-as-the-year-of-ai-equilibrium</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Fri, 02 Jan 2026 08:29:38 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>For most of 2025 I took a sabbatical. Renovated the house. Stepped away from digital agencies, computers, building apps, e-commerce platforms. I kept an eye out, of course.</p><p>I saw a lot of hype about AI. A lot of promise. And a lot of misunderstanding. Talk to any software engineer and they will tell you AI is (still) shit. 
Yet LLMs captured the imagination, particularly of non-technical people. The <a href="https://incontext.digital/p/the-two-platform-shifts-and-why-ai">imagination</a> of abstract thinkers, strategic thinkers. The executives. </p><p>Dreaming of automatically making stuff, without having to learn any code. Of automating anything. Of rapid prototyping. Of halving digital expenses. The end of SaaS lock-in, and the end of being locked in by those annoyingly vague and expensive software engineers and development agencies you depend on to run your digital business.</p><p>Yet LLMs are flatlining in what extra benefits they give us. Microsoft is in AI trouble, OpenAI has declared code red, and thanks to Google, Nvidia no longer has the monopoly on LLM-capable chips. So much investment has been built on this imagination that it will be interesting to see whether we end up in an AI bubble in 2026.</p><p>But in terms of &#8216;regular business&#8217; I believe we will see a return to normal in 2026. Of the &#8216;AI will replace jobs&#8217; hype dying down a little. Of development agencies taking over work, and us continuing as we have always done. </p><p>LLMs will go the route of 3D printing. Revolutionary and a gateway technology for sure, but probably one mostly for the &#8216;backend&#8217; of technology and business. 
Gradual progress.</p>]]></content:encoded></item><item><title><![CDATA[Audio design as an upcoming discipline]]></title><description><![CDATA[When we think of design, we usually think of how a product looks.]]></description><link>https://incontext.digital/p/audio-design-as-an-upcoming-discipline</link><guid isPermaLink="false">https://incontext.digital/p/audio-design-as-an-upcoming-discipline</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Wed, 31 Dec 2025 18:32:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When we think of design, we usually think of how a product looks. Maybe also how a product feels. UI and UX design, respectively. Talk to academically trained designers and they believe design encompasses everything about the product, from the business strategy it supports to the unconscious behavior that a service or button on a website elicits.</p><p>But what about audio design? Audio in product design is usually a byproduct. The sound of a Bluetooth speaker upon connecting. Or the sound of an app notification.</p><p>But I think audio might play an increasingly crucial role as a computer interface. We have Siri, Gemini, AI note takers joining our meetings. We are rumored to be getting an audio assistant from OpenAI later this year.</p><p>It looks like audio might become a second sense, next to the visual screens we are used to interfacing with. Which raises the question: what about audio design? How ought a computer to sound? And how ought those sounds mix with other sounds? The environment we use our computers in, the music playing while we work, the meeting going on at the same time? 
Audio literally joining the conversation also requires a thoughtful audio footprint that blends well into its environment.</p><p>The discipline of <em>sound design</em>, up till now mostly associated with media like games and movies, might offer an interesting foundation to build upon. But I&#8217;ll be on the lookout for the first audio design vacancies, and for audio design as a discipline taking off this year.</p>]]></content:encoded></item><item><title><![CDATA[The copilot guessing game]]></title><description><![CDATA[Every product has a copilot now.]]></description><link>https://incontext.digital/p/the-copilot-guessing-game</link><guid isPermaLink="false">https://incontext.digital/p/the-copilot-guessing-game</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Fri, 18 Jul 2025 08:00:00 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/8237b7c7-467b-421a-b921-fcfa712834bf_1456x1048.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Every product has a copilot now. Most are laughably bad. Worse, every copilot behaves and is formatted a little differently in every app.</p><p>The core of the weirdness, I believe, is this: predictability.</p><p>Click any &#8220;AI Assistant&#8221; button and you&#8217;re rolling the dice. Will it rewrite your entire document? Suggest a single word change? Completely misunderstand what you wanted? Nobody knows.</p><p>This breaks something fundamental about interface design.</p><p>Good UI design is about removing anxiety. When you see a red &#8220;Delete&#8221; button, you know exactly what happens next. The trash icon means trash. The save button saves. Users build mental models based on consistent, predictable behavior.</p><p>AI throws all of that out the window.</p><p>Every AI prompt is a black box. You type something in, cross your fingers, and hope the algorithm interprets your intent correctly. Sometimes it nails it. 
Sometimes it does something completely random. Sometimes it just fails silently.</p><p>We&#8217;ve created interfaces where the primary interaction is guessing.</p><p>This isn&#8217;t just bad UX&#8212;it&#8217;s the opposite of what interfaces should do. Instead of reducing cognitive load, AI features often increase it. Users could easily spend more mental energy trying to craft the perfect prompt than they would just doing the task manually.</p><p>Most &#8220;copilot&#8221; features feel like they were added because everyone else has one, not because they actually improve the user experience. They&#8217;re checkbox features. Marketing bullets. Not tools that genuinely help people get work done.</p><p>The best AI implementations hide their complexity. They work predictably, even if the underlying technology is probabilistic. They feel like magic, not gambling.</p><p>But those are rare.</p><p>Most copilots are just expensive guessing games dressed up as innovation.</p>]]></content:encoded></item><item><title><![CDATA[Why LLMs feel natural but digital systems don't]]></title><description><![CDATA[LLMs work because they speak human.]]></description><link>https://incontext.digital/p/why-llms-feel-natural-but-digital</link><guid isPermaLink="false">https://incontext.digital/p/why-llms-feel-natural-but-digital</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Mon, 14 Jul 2025 08:00:00 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/e46cfd5f-2346-48a8-a01c-c836c1924ffe_1456x1048.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>LLMs work because they speak human.</p><p>That&#8217;s the real innovation. Not the size of the models or the training data, it&#8217;s that they understand natural language the way we actually use it. Messy, contextual, full of exceptions.</p><p>You can tell ChatGPT &#8220;make it more professional but keep the casual tone&#8221; and it gets it. 
Try programming that instruction into traditional software. You&#8217;d spend months defining what &#8220;professional&#8221; means, building taxonomies for &#8220;casual,&#8221; creating exception handlers for edge cases.</p><p>But here&#8217;s the problem: that natural flexibility crashes into digital reality.</p><p>Digital systems are fundamentally rigid. Databases need structured fields. APIs expect specific formats. Workflows demand binary decisions. Yes or no, approved or rejected, category A or category B.</p><p>The world isn&#8217;t binary. It&#8217;s fluid.</p><p>When you tell an LLM &#8220;this customer is frustrated but loyal,&#8221; it understands the nuance. When you try to put that same customer into your CRM, you&#8217;re forced to pick: frustrated OR loyal. The system can&#8217;t handle both. It can&#8217;t capture the contradiction that makes the insight valuable.</p><p>This creates a translation problem that didn&#8217;t exist before.</p><p>Pre-LLM, we accepted that software was clunky. We learned its language. Dropdown menus, mandatory fields, rigid categories. We bent our thinking to fit the system.</p><p>Now LLMs show us what natural interaction feels like. We can think out loud, change our minds mid-sentence, reference context from three conversations ago. The AI gets it.</p><p>But then we hit the wall. The LLM understands perfectly, but it still has to cram that understanding into the same rigid systems we&#8217;ve always had.</p><p>The innovation of natural language interface is real. But it exposes just how fundamentally broken our digital infrastructure is for handling the way humans actually think and work.</p><p>We&#8217;re not just building better AI. 
We&#8217;re discovering that everything else needs to be rebuilt too.</p>]]></content:encoded></item><item><title><![CDATA[The limits of non-linear work]]></title><description><![CDATA[We live in a bubble.]]></description><link>https://incontext.digital/p/the-limits-of-non-linear-work</link><guid isPermaLink="false">https://incontext.digital/p/the-limits-of-non-linear-work</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Thu, 10 Jul 2025 08:00:00 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/bb8595a3-49f7-4990-89fa-3afdc1ea3015_1456x1048.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>We live in a bubble.</p><p>As technologists, we&#8217;re obsessed with non-linear work. Automation. Exponential curves. Making everything faster, more efficient, infinitely scalable.</p><p>But here&#8217;s what we miss: some work will always be linear.</p><p>When my water pipes need replacing, it&#8217;s still four people working for a day. A digging machine might speed things up a bit, but it&#8217;s fundamentally the same job. One day of work.</p><p>Having coffee with a colleague? An hour. Something that can&#8217;t be accelerated. Something you wouldn&#8217;t want to accelerate.</p><p>There&#8217;s a hard limit to what can be digitized.</p><p>Digital is purely administrative. It facilitates. It connects. But it doesn&#8217;t replace the core work that actually matters; the human work, the physical work.</p><p>Think about it: why do we even need all this digital infrastructure? What&#8217;s the real core of digital technology?</p><p>Connection, maybe. But even that has limits.</p><p>We keep trying to optimize everything, to make everything non-linear and scalable. But the most meaningful work remains stubbornly linear. Building something with your hands, solving a real problem for a real person, having an actual conversation.</p><p>And that&#8217;s not a bug. 
It&#8217;s a feature.</p>]]></content:encoded></item><item><title><![CDATA[The text UI paradox]]></title><description><![CDATA[Is text supposedly the UI of the future?]]></description><link>https://incontext.digital/p/the-text-ui-paradox</link><guid isPermaLink="false">https://incontext.digital/p/the-text-ui-paradox</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Mon, 02 Jun 2025 08:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Is text supposedly the UI of the future? Chat interfaces are everywhere. We talk to ChatGPT, Claude, and countless AI assistants through text. The entire AI revolution runs on typing words into boxes.</p><p>So why is mobile absolutely terrible at text?</p><p>Think about it: when you need to write anything substantial, you reach for a laptop. Mobile keyboards are cramped, autocorrect is aggressive and wrong, and thumb-typing longer messages feels like punishment. We&#8217;ve optimized the world&#8217;s most personal computing device for everything except the interface that&#8217;s supposed to define our digital future.</p><p>Voice was meant to solve this. For decades, we&#8217;ve been promised that speaking to our devices would free us from keyboards. Apple launched Siri in 2011. Amazon&#8217;s Alexa arrived in 2014. Google&#8217;s been pushing voice for even longer.</p><p>But voice input still sucks.</p><p>It&#8217;s not private &#8212; you can&#8217;t dictate a sensitive email on the train. It&#8217;s error-prone &#8212; try explaining technical concepts to voice recognition. And it&#8217;s surprisingly low bandwidth. Reading is faster than listening. Typing (when you have a real keyboard) is faster than speaking. 
We all hate voice messages because they waste our time.</p><p>This creates a fundamental tension in computing. The interfaces we&#8217;re now building assume text input. But the devices we carry make text input painful. Voice hasn&#8217;t bridged that gap and likely won&#8217;t anytime soon.</p><p>Maybe this points to something bigger: the future isn&#8217;t actually text-first. Maybe text interfaces are just a bridge&#8212;a way to interact with AI until something better emerges. Or maybe we&#8217;re heading toward a world where serious work happens on devices with real keyboards, while mobile becomes purely consumptive.</p><p>Either way, the contradiction is real. And it suggests our assumptions about the &#8220;text UI future&#8221; might need rethinking.</p>]]></content:encoded></item><item><title><![CDATA[Let&#8217;s build a better ERP]]></title><description><![CDATA[Every business needs (software) tools to run.]]></description><link>https://incontext.digital/p/lets-build-a-better-erp</link><guid isPermaLink="false">https://incontext.digital/p/lets-build-a-better-erp</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Fri, 30 May 2025 08:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Every business needs (software) tools to run. Even the simplest company needs bookkeeping, at a minimum.</p><p>But here&#8217;s my frustration: these tools often end up running the business.</p><p>Companies bend themselves around ERP systems. They hire expensive consultants to jam their processes into rigid data models. 
They spend months of expensive consulting adapting workflows to fit software that was built for someone else&#8217;s idea of how their business should work.</p><p>This architecture (and, let&#8217;s be frank, the ERP business model) makes ERP expensive and inflexible. Worse still, because those ERPs run on architectures from the 90s, they can never adapt to anything modern without once again going through a very painful migration process.</p><p>Rigid ERP makes employees hate their jobs. Or at least their computers. Think about the software you use personally&#8212;your phone, your apps. They adapt to you. They learn your patterns, get out of your way. Now think about the software you use at work. It feels like a punishment. Forms with mandatory fields, workflows that make no sense. Interfaces that look like they&#8217;re straight out of 2004. No wonder <strong><a href="https://www.masterclassnieuwezorg.nl/wp-content/uploads/Fenna-Heyning-Why-doctors-hate-their-computers.pdf">doctors hate their computers</a></strong>. Most have simply given up hope.</p><h2><strong>A new type of ERP</strong></h2><p>Here&#8217;s a different approach: build ERP systems that adapt to businesses, not the other way around. The architecture would mirror Notion&#8212;a core foundation where you can define your own databases, objects, and properties. Everything pluggable. Every bit of customisation you want can be easily built. With plugins and API calls that do not break the core system.</p><p>This would open up three powerful possibilities:</p><p><strong>Maintainable modern UI.</strong> The core team can evolve the interface with current design standards. Think design systems that feel coherent and modern while letting users build their own workflows. Software that bends, not breaks.</p><p><strong>AI-generated plugins.</strong> When the plugin API is simple and well-documented, generative AI can create custom extensions on demand. Need a specific workflow? Generate it. 
Business process changes? Adapt it instantly.</p><p><strong>Malleable data structures.</strong> Like Notion&#8217;s graph of objects, the data model evolves as your business evolves. No expensive consultants. No code changes. Anyone can adjust the structure as processes and insights develop.</p><p>The result? ERP that doesn&#8217;t feel restrictive&#8212;it feels empowering.</p><h2><strong>What this enables</strong></h2><p>Small, specialised development teams could build hyper-personalized ERP systems for local businesses. Developers and designers working directly with users, building and continuously adjusting tools that actually fit.</p><p>Small businesses could finally afford the same quality tooling as large enterprises. No more choosing between expensive, bloated systems and doing everything in spreadsheets.</p><p>The real opportunity isn&#8217;t just better software. It&#8217;s flipping the relationship between businesses and their tools.</p><p>Instead of businesses serving their software, software finally serves businesses.</p>]]></content:encoded></item><item><title><![CDATA[Humanity as a compass]]></title><description><![CDATA[Jony Ive gave an interview.]]></description><link>https://incontext.digital/p/humanity-as-a-compass</link><guid isPermaLink="false">https://incontext.digital/p/humanity-as-a-compass</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Tue, 20 May 2025 09:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Jony Ive gave an interview. And something simple he said struck me as instantly true. <strong><a href="https://youtu.be/wLb9g_8r-mE?si=4_onnGJSlXtHcwAq">&#8220;Humanity as a compass&#8221;</a></strong>.</p><p>What he meant was this. 
When design gets murky, when you get stuck in your head, return to the person. The actual human with the actual problem. Solve for that. Nothing else. Everything else follows.</p><p>This is the cornerstone of good design, of course. We know that. This is not new. But it is good to be reminded every now and again of what truly makes a designer.</p>]]></content:encoded></item><item><title><![CDATA[The evolving craft of venture capital]]></title><description><![CDATA[It seems like there is some buzz in venture capital.]]></description><link>https://incontext.digital/p/the-evolving-craft-of-venture-capital</link><guid isPermaLink="false">https://incontext.digital/p/the-evolving-craft-of-venture-capital</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Tue, 20 May 2025 08:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It seems like there is some <strong><a href="https://www.linkedin.com/posts/harveyesq_240-vcs-becoming-rias-activity-7326624541768667136-JChH/">buzz</a></strong> in venture capital. Are venture capital firms becoming Private Equity firms? Some firms are, like <a href="https://a16z.com/about/">Andreessen Horowitz</a> and <a href="https://www.generalcatalyst.com/stories/unveiling-gc-wealth">General Catalyst</a>. This often involves becoming <a href="https://pitchbook.com/news/articles/venture-industry-sec-ria-investment-flexibility-nvca">Registered Investment Advisers (RIAs)</a>, enabling a more hands-on role in acquiring, building, and managing portfolio companies.</p><p>What does all of that mean? I believe it is a response to a changing landscape, but only for those firms. Nothing substantial is going to change in venture capital. 
Here&#8217;s my logic:</p><p>These firms are jumping on the AI bandwagon, making a strategic choice to be more hands-on. They see a big opportunity in making vertical <strong><a href="https://a16z.com/financial-opportunity-of-ai/">AI companies</a></strong>. And they are willing to make a big bet. These VCs are equipping themselves not just to fund these businesses, but to help forge them. And that means restructuring their setup.</p><p>For founders, this suggests a changing dialogue: an opportunity for capital that comes with deeper, more operational collaboration. It points towards a future where investors are more like co-builders in the long craft of creating sustainable, category-defining companies. This is less a revolution, more a considered evolution in the art of venture.</p>]]></content:encoded></item><item><title><![CDATA[Edge-computing gold-rush]]></title><description><![CDATA[Every business wants AI that works like ChatGPT.]]></description><link>https://incontext.digital/p/edge-computing-gold-rush</link><guid isPermaLink="false">https://incontext.digital/p/edge-computing-gold-rush</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Fri, 09 May 2025 08:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Every business wants AI that works like ChatGPT. But most won&#8217;t send their sensitive data to Google&#8217;s, OpenAI&#8217;s, or Anthropic&#8217;s servers. Nor should they.</p><p>There is an opportunity in that tension.</p><p>Companies are caught between two realities. Their employees expect AI tools that actually help: smart assistants, automated workflows, intelligent data analysis. 
But their legal teams won&#8217;t let corporate secrets flow through third-party APIs.</p><p>The solution isn&#8217;t to give up on AI. It&#8217;s to bring AI in-house.</p><h2><strong>Three paths to private AI</strong></h2><p><strong>The cloud-hosted approach.</strong> Rent private GPU clusters from providers like Hetzner or Fly.io. You get dedicated hardware in a data center, but it&#8217;s still managed infrastructure. Pay monthly, scale as needed, upgrade without buying new boxes. This feels like the sweet spot for most businesses right now&#8212;private enough for lawyers, flexible enough for rapid AI development.</p><p><strong>The on-premise service model.</strong> Buy or rent actual hardware that sits in your office. Think a stack of Mac Studios wired together, but with a maintenance contract. It&#8217;s like the old IBM model from the 1950s&#8212;you get the hardware and the expertise to keep it running. Complete control, zero data leaving your building.</p><p><strong>The prebuilt cluster approach.</strong> Sell businesses a complete package&#8212;computers, GPUs, software, all pre-configured. Plug it in and start training models. No IT department headaches, no linking machines together, no wondering if your setup actually works.</p><h2><strong>What makes this work</strong></h2><p>The real opportunity isn&#8217;t just selling hardware. It&#8217;s making private AI actually usable.</p><p>Most businesses can&#8217;t hire AI engineers. But they might have one or two developers who could fine-tune models if the platform made it simple enough. That&#8217;s the key; build developer tools so good that tweaking AI models becomes easy, a competitive advantage, not a technical nightmare.</p><p>And businesses need proof their data stays safe. Complete telemetry. Zero data loss guarantees. Show them exactly what&#8217;s happening to their information.</p><h2><strong>Why now</strong></h2><p>Smaller AI models keep getting better. 
What required massive server farms last year now runs on expensive-but-manageable local hardware. The gap between cloud AI and on-premise AI is shrinking fast.</p><p>Meanwhile, the business case gets stronger every month. Companies see AI boosting productivity but hate the privacy trade-offs. They want both benefits&#8212;and they&#8217;re willing to pay for infrastructure that delivers them.</p><p>The businesses that figure out private AI first will have a serious edge. And someone needs to build the infrastructure to make that possible.</p>]]></content:encoded></item><item><title><![CDATA[Audiobooks&#8217; third wave]]></title><description><![CDATA[Audiobooks should have exploded by now.]]></description><link>https://incontext.digital/p/audiobooks-third-wave</link><guid isPermaLink="false">https://incontext.digital/p/audiobooks-third-wave</guid><dc:creator><![CDATA[Stijn Bakker]]></dc:creator><pubDate>Sun, 04 May 2025 08:00:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!N7PQ!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F535d714d-99bd-4f4f-bc3e-95dbb4e0954e_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Audiobooks should have exploded by now.</p><p>They&#8217;ve grown steadily&#8212;20% annually in recent years&#8212;but they haven&#8217;t had their podcast moment. That explosive, culture-shifting breakthrough where everyone suddenly gets it.</p><p>The bottleneck has always been narration.</p><p>Most audiobooks sound flat. Lifeless. Text written to be read silently gets read aloud by someone who treats every sentence the same. No rhythm. No emphasis. No soul.</p><p>The good narrators&#8212;the ones who bring personality and energy&#8212;create hits. But hiring them is expensive. Recording is expensive. Direction is expensive. Publishers pick their battles. And there is no clear business model to earn that investment back. 
For the most part, audiobooks are still meant to be bought as a whole, rather than simply streamed like a podcast or a song.</p><p>This creates a vicious cycle. High production costs mean high prices. High prices mean listeners hesitate to try new books. Limited audience means limited budgets for good narration.</p><p>But I&#8217;m hopeful there are two factors changing this completely.</p><p>The first is the business model. We&#8217;re used to paying for Spotify, and Spotify would like you to spend more time in their app. They are rolling out support for audiobooks, giving users an incentive to simply try a book, and publishers an incentive to stream books and get paid for them. It also allows publishers, like podcasters, to ad-support their books, adding another revenue model to their business. Users are now more used to audio thanks to podcasts; I&#8217;m hopeful audiobooks will catch on as well.</p><p>The second is the promise of genAI, which could change the narration game completely. Paradoxically, AI could replace the robotic (albeit human) voices of today&#8217;s audiobooks with much more intonation. Making bland text a lot more fun to listen to. That means every book can suddenly have engaging narration, and production costs drop dramatically. Publishers are freer to experiment. And that means listeners can browse freely without the $15-per-book commitment anxiety.</p><p>This isn&#8217;t just about making audiobooks cheaper. It&#8217;s about making them discoverable. Let&#8217;s give audiobooks their moment to shine.</p>]]></content:encoded></item></channel></rss>